Suitable measurement processes - Valid information
Abstract
In order to obtain valid information about the quality of production systems and of the products manufactured with them in the industrial sector, suitable measuring processes are the central prerequisite. A measuring process comprises all measuring systems and sensors used under real operating conditions. Over the years, various methods have been developed to evaluate the suitability of measuring processes. These methods are described in association and company guidelines as well as in ISO standards. This article outlines the origins of these documents and describes the status quo.
The measured values determined with the measuring systems or sensors must be managed in a structured way and supplemented with descriptive information. To ensure this, a standardized data format is required in which all relevant information can be stored. Such an internationally recognized format does not exist to date. However, the data format AQDEF (Advanced Quality Data Exchange Format) has become generally accepted in the automotive industry and is discussed further in this article. Additionally, the paper gives advice on how to avoid the faulty transfer of information by means of plausibility checks, and on the extent to which relationships between process parameters and quality characteristics can be established by means of meaningful visualization.
In light of the new possibilities now available, typical application cases are described and an outlook on the potential changes brought about by AI technologies is given.
1. Introduction
The amount of stored data grows every day. This is due to the growing number of sensors, the communication between systems, and the processing and storage of data. Data storage in private or public clouds with seemingly infinite capacity also contributes to this. One often hears: "No matter in what format, we will store the data; maybe we will need it again!" This leads to unstructured data storage. Whether this makes sense is often questionable, as finding useful information in such data cemeteries can be tedious and in some cases even impossible. In addition, there is often a high level of mistrust in the information extracted in this way. Only structured ("smart") data can therefore be considered gold. Unstructured stored data can at best be regarded as silver; often it represents little more than data garbage. In order to create structured data storage, certain requirements must be met, which are discussed in more detail in this article.
It follows from this that larger amounts of data, especially if unstructured, do not necessarily provide added value for companies. On the other hand, the negative effects of large-scale data storage on the environment are certain and cannot be ignored. To put this in perspective: 30 Google queries consume about as much energy as preparing a cup of espresso, and the energy needed for 200 queries would suffice to iron a shirt. What is true for Google queries is in essence also true for data retention in the industrial sector. For this reason, only the data relevant for a task should be collected and stored in a structured manner.
In addition to structured data storage, the measurement processes (measuring devices, sensors, etc.) from which the data originate must reflect reality with sufficient accuracy. For this purpose, the measurement uncertainty of the measured value must be known and sufficiently small. Only then can one speak of a valid measured value. In addition, the descriptive data must be assigned to the recorded measured values, together with the conditions under which they were obtained.
For this reason, the topic of measurement uncertainty as the basis for valid measurement results is discussed below. A further section addresses structured data management and its prerequisites.
1.1 Historical development in the last 40 years
When automated measuring technology entered industrial production at the beginning of the 1980s, primarily the relevant quality characteristics were measured in order to assess the quality of the produced product characteristics and to conclude from this, indirectly, the suitability of the machines and production equipment. Once suitability had been proven, the ongoing processes were monitored by means of random checks or, if necessary, 100% checks. This procedure has not changed to this day. In the course of time, more and more measuring systems and complex measuring devices such as coordinate measuring machines (CMM) have been added. Especially the progress in optical measuring technology and scanning methods has led to an increase in the amount of data. Today, more and more features are measured directly on the machine, making the processing machine itself a measuring system. In addition, more and more process parameters and environmental conditions can be recorded.
In the course of time, the steps shown in Figure 1a have proven useful for the evaluation of manufacturing processes. First of all, it is necessary to check, under idealized conditions, whether the specifications from design or development can be produced at all with the existing equipment. For this purpose, statistical methods have been established with which the suitability or capability can be proven. One then speaks of machine or preliminary process capability and describes this by means of capability indices (e.g. Cm, Pp and Cmk, Ppk). If this capability is given, the study is repeated under more realistic conditions and evaluated using the process capability indices (Cp, Cpk). If the process is suitable, it is transferred to continuous process monitoring using control chart techniques. If the suitability of the process cannot be proven, or if instabilities are found during monitoring, the process must be improved. This can be achieved using methods from the field of Design of Experiments (DoE).
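The capability calculations described above can be sketched in a few lines (a minimal illustration, assuming a normally distributed characteristic; whether the result is interpreted as Cm/Cmk, Pp/Ppk or Cp/Cpk depends on the conditions under which the values were collected):

```python
import statistics

def capability_indices(values, lsl, usl):
    """Potential and actual capability of a normally distributed characteristic.

    Cp  = (USL - LSL) / (6 * sigma)                   -- spread vs. tolerance
    Cpk = min(USL - mean, mean - LSL) / (3 * sigma)   -- also penalizes off-center processes
    """
    mean = statistics.fmean(values)
    sigma = statistics.stdev(values)  # sample standard deviation
    cp = (usl - lsl) / (6 * sigma)
    cpk = min(usl - mean, mean - lsl) / (3 * sigma)
    return cp, cpk
```

A perfectly centered process yields Cp equal to Cpk; the further the mean drifts toward one tolerance limit, the smaller Cpk becomes.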
The procedures used are described in ISO standards, in guidelines from associations such as the AIAG
(Automotive Industry Action Group) or the VDA (German Association of the Automotive Industry) or
in a large number of company guidelines.
Fig. 1a: Measurement processes in production Fig. 1b: Supplementary information
Translation of Figure 1:
Fig. 1a: suitable measuring process; design/development; suitability of machine/equipment; process capability; process control (SPC); process optimization
Fig. 1b: descriptive data; quality features; environmental parameters; texts; pictures; process parameters; noises; vibrations; videos
Today, due to the large number of sensors installed in machines, equipment and plants, more and more process parameters can be recorded and stored. In addition, there are sensors for recording environmental conditions; these include not only temperature or humidity, but also noise, vibrations and the like. The aim is to get to know the processes better and to understand their behavior in order to use this knowledge to control the influencing parameters responsible for quality.
An important element here is the descriptive data (see Figure 1b). This mainly refers to header data, which specifies the respective subject matter in more detail and establishes the relationship of the measurement results to other sources of information.
Today, one also tries to describe the real processes in a model-like way with AI (artificial intelligence) technologies. Such a model is then called a digital twin; simulations for optimization can easily be carried out on it.
1.2 Measurement uncertainty and its effect on the suitability of measurement processes
When observing a process on the basis of measurement results, the results scatter due to the various influencing parameters that occur during a manufacturing process. Added to this is the scatter of the measuring process itself. The question therefore arises to what extent the scatter of the measurement process distorts the view of the real process.
Figure 2 shows a sufficiently small scatter of the measurement process, using the example of normally distributed measured values of a characteristic, whereas Figure 3 shows a scatter that is too large.
Fig. 2: Small measurement process scatter Fig. 3: Large measurement process scatter
Translation of Figures 2 & 3: observed process variance; actual process variance; measuring system with small/large scatter
Figure 3 clearly shows that the actual process scatter is greatly distorted by the excessive scatter of the measurement process, so that a correct statement about the process itself is no longer possible. Why do the measured values scatter around the correct values in a measuring process? Measurement processes are also subject to a large number of influencing factors, which are shown in Figure 4.
Fig. 4: Influences on the uncertainty of measured values
(Source: VDA 5 Volume Test process suitability)
Translation of Figure 4 (influence categories on the measurement result):
- Human: qualification, physical constitution, psychic constitution, motivation, discipline, care
- Measurement object: material, form, surface, accessibility
- Evaluation method: mathematical models, linking of measured values, use of computers, statistical methods
- Environment: pressure, temperature, air humidity, contamination, voltage/current, illumination, vibrations
- Measurement method: contactless, tactile, measuring point arrangement, number of measuring points
- Measuring equipment: sensitivity, measuring range, time/costs, stability, resilience, resolution, calibration/adjustment, random measurement errors, undetected systematic measurement errors, set-up uncertainty
- Mounting device: stability, form, location, position
- Normal (reference standard): surface structure, type of normal, shape/position, measurement stability
In practice, there will never be zero measurement uncertainty, so the question arises: "What measurement uncertainty is still acceptable so that the capability indices determined still reflect the specific situation with sufficient accuracy?" Figure 5 shows the extent to which the process capability index Cp is distorted by the measurement uncertainty of the measurement process.
The statement made in this figure applies not only to normally distributed characteristic values, but also to any other distribution model used to describe the behavior of a characteristic. In order to determine the suitability or capability of a measurement process in a way that is as relevant to practice as possible, procedures were described in the early 1990s to determine a so-called GRR value (Gage Repeatability and Reproducibility), which is referred to as the %GRR value when related to a reference (e.g. process variation or tolerance) [2]. The effect of this %GRR value on the capability index to be determined (e.g. the Cp value) is shown in Figure 6.
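The relationship plotted in Figure 6 can be sketched as follows (a minimal illustration, assuming the %GRR value is referenced to the tolerance and that process and measurement scatter are independent, so their variances add):

```python
import math

def observed_cp(true_cp: float, grr_percent: float) -> float:
    """Cp value observed through a measuring process with a given %GRR.

    With %GRR = 100 * 6*sigma_GRR / T (tolerance reference) and independent
    scatter components (sigma_obs^2 = sigma_process^2 + sigma_GRR^2):
        Cp_obs = 1 / sqrt((1/Cp_true)^2 + (%GRR/100)^2)
    """
    return 1.0 / math.sqrt((1.0 / true_cp) ** 2 + (grr_percent / 100.0) ** 2)

# A very capable process seen through increasingly poor measuring processes;
# a %GRR of up to about 10 % is commonly considered suitable, above 30 % unsuitable.
for grr in (10, 30, 50):
    print(f"true Cp = 2.0, %GRR = {grr:2d} %  ->  observed Cp = {observed_cp(2.0, grr):.2f}")
```

The larger the measurement scatter, the lower the observed capability, even though the process itself has not changed.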
Fig. 5: Capability index as a function of the expanded measurement uncertainty U (VDA 5 volume Test Process Suitability [11])
Fig. 6: Cp value as a function of %GRR (MSA Measurement System Analysis [2])
Translation of Figures 5 & 6:
y-axis: measured process capability; range of measuring process capability (method 1: Cp = 1.0 ... 1.67); range of the golden rule; process capability range (method 2/3)
x-axis: observed Cp value; y-axis: real Cp value
Regardless of whether the measuring process is evaluated via the expanded measurement uncertainty U or via the %GRR value, both methods show the influence of the scatter of the measuring process on the assessment of the actual process capability. For this reason, limit values are defined to assess the suitability of a measuring process. If these are exceeded, the measuring process is not suitable and must be improved.
1.3 Change in quality management through certification
With the introduction of the Ford Q-101 quality management system [6] at the beginning of the 1980s, the field of quality assessment changed fundamentally through the involvement of the suppliers. The quality control charts [1] developed by Walter Shewhart in the 1930s were introduced under the name SPC (Statistical Process Control) to monitor and evaluate production. The evaluation of machine, process and product quality is based on the capability indices mentioned above. The implementation of these procedures was continuously monitored by means of customer audits, which quickly led to the spread of these requirements. Qualified suppliers were awarded the Q1 Award.
The introduction and implementation of SPC was driven by the automotive industry in particular. The procedures and requirements described in Q-101 were adopted in modified form by other automotive companies. In the mid-1990s, the AIAG (Automotive Industry Action Group) in the USA replaced Q-101 with QS-9000 (see Figure 7), which also contains the SPC guidelines.
Fig. 7: Transition from Ford's Q101 [6] to AIAG's QS-9000 [3]
In Germany, the VDA (German Association of the Automotive Industry) established a similar system with the volumes VDA 4 and VDA 6.x. In France and Italy, the associations also issued recommendations on how suppliers are to be assessed. As a result, a supplier with transnational customers had to operate several different systems. This is why the IATF (International Automotive Task Force) harmonized the various documents and developed them further on the basis of ISO 9000, with the result that suppliers today only have to set up and operate their quality management system in accordance with IATF 16949 (see Figure 8) [7].
In addition, further restructuring took place in the companies. For example, "quality control", which had been a separate department until the 1980s, was abolished and sole responsibility for quality was assigned to production. At the same time, the so-called "factory self-check" was introduced. Tiresome and unnecessary discussions between the "quality control" and "production" departments about the quality of the products were thus eliminated.
Fig. 8: Origin of the IATF 16949:2016 [7]
Translation of figure 8
Harmonisation and specification of requirements
With the advent of SPC (Statistical Process Control), another requirement regarding quality assessment changed dramatically. Whereas it had previously been considered acceptable to exploit the entire feature specification (tolerance) (see Figure 9a), target-value-oriented thinking suddenly came to the foreground. This can be formulated by the Taguchi loss function, which emphasizes that every deviation from a given or reasonable target value causes costs (see Figure 9b). In order to be able to assess these requirements and thus also comply with them, the capability indices mentioned above were introduced (see Figure 9c).
Fig. 9: a) Good/bad test b) Taguchi function c) Proof of suitability
Translation of Figure 9: Verlust = loss; schlechte Qualität = poor quality; Messwert = measured value; Zielgröße = target value
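The target-value-oriented view of Figure 9b can be written down directly (a minimal sketch; the cost factor k is an illustrative value that in practice would be derived from the actual loss at the tolerance limit):

```python
def taguchi_loss(y: float, target: float, k: float = 1.0) -> float:
    """Quadratic Taguchi loss L(y) = k * (y - target)**2.

    In contrast to the good/bad view of Figure 9a, every deviation from the
    target causes a loss, growing quadratically with the deviation.
    """
    return k * (y - target) ** 2

# Two values inside the same tolerance are no longer "equally good":
print(taguchi_loss(10.01, 10.0, k=100))   # close to target -> small loss
print(taguchi_loss(10.20, 10.0, k=100))   # near tolerance limit -> large loss
```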
The calculated capability indices are compared with characteristic-specific limits. If the limit requirements are not met, the machining process must be improved; otherwise, monitoring of the process by random sampling is not possible. In order to meet these requirements, more and more measuring systems of various kinds have been installed at relevant machining or assembly stages in recent decades. There, the relevant characteristics (SPC characteristics) are measured and evaluated to ensure that the current production meets the requirements.
2. Development of procedures for the verification of the capability of measurement processes
When SPC was introduced, little thought was initially given to the measurement processes used. It was assumed that the measured values largely reflected reality. Later, however, it was sometimes found that the scatter of the measuring process was as great as the scatter of the manufactured parts. Based on the realization "You can only manufacture as accurately as you can measure", the first company guidelines were developed in the early 1990s in which procedures for the evaluation of measuring processes were specified.
In Germany, the Bosch company published a guideline for the evaluation of measuring processes in its Booklet 10 (Capability of Measuring and Test Processes) [10]. Other automobile manufacturers followed with their own guidelines. In the USA, the AIAG published the MSA (Measurement System Analysis) guideline on this topic. Unfortunately, there was no ISO standard at that time. Therefore the guidelines, even though they had the same objective, differed in terminology and partly also in the procedures themselves.
It was not until 1995 that the GUM (Guide to the Expression of Uncertainty in Measurement) [8] was published, which describes very precisely the determination of the expanded measurement uncertainty U using a mathematical model. In practice, however, it has turned out that the GUM can only be applied to a limited extent due to the complexity of the measurement processes. For the accreditation of measuring laboratories according to ISO/IEC 17025:2017 "General requirements for the competence of testing and calibration laboratories", the GUM is still the basis for determining the expanded measurement uncertainty for the respective SI unit.
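In its simplest case, the GUM procedure amounts to combining independent standard uncertainty components by root-sum-square and expanding the result with a coverage factor (a minimal sketch with all sensitivity coefficients set to 1; real uncertainty budgets are usually more involved):

```python
import math

def expanded_uncertainty(components, k: float = 2.0) -> float:
    """U = k * u_c, with combined standard uncertainty u_c = sqrt(sum(u_i^2)).

    k = 2 corresponds to a coverage probability of about 95 % for a
    normally distributed measurand.
    """
    u_c = math.sqrt(sum(u * u for u in components))
    return k * u_c

# e.g. calibration, repeatability and temperature contributions in mm:
print(expanded_uncertainty([0.003, 0.004, 0.000]))
```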
Fig. 10: Relevant guidelines with year of first issue [10], [2], [8], [11], [5]
According to ISO 14253-1 "Geometrical product specifications (GPS) - Inspection by measurement of workpieces and measuring equipment", the measurement uncertainty for the respective measurement process must be determined in order to have the greatest possible clarity when exchanging products between customer and supplier. For this reason, the VDA published VDA Volume 5 (Test Process Suitability) as a simplified version of the GUM in 2003 and issued a completely revised edition [11] in 2010. This volume is currently being revised again.
The ISO standard 22514-7 (Capability of Measurement Processes) [5], which is also based on the GUM, was first published in 2012. Similar to VDA Volume 5, this standard proposes a recipe-like procedure with which the expanded measurement uncertainty of the measurement systems used in production can be estimated practically, even by a mathematical layperson. Figure 10 shows, as examples from the different areas, the current cover sheets of the documents, with the respective year of first publication listed below each.
The individual directives, guidelines and standards have been, and will continue to be, developed on the basis of new findings and requirements. In any case, sufficient methods are available today to prove the capability of measuring processes and to determine their expanded measurement uncertainty.
3. Requirements for structured data management
As mentioned at the beginning, the volume of incoming data is very high due to the measurement technology that has developed over the last decades. On the other hand, there is the necessity to store data in a structured way. The motto here is: "As much as possible, but only as much as necessary!" The prerequisite for this is a uniform data format that is supported by both the installed measuring systems and the programs used to process the data.
But when does a data format become generally accepted? The answer is simple: "When it is used by the majority!" In the past, data formats defined by associations have failed due to a lack of companies that supported them, which consequently meant that there were not enough users of the respective formats. Why was this, and why is it still, the case? Because every software producer usually has his own format, which he does not want to give up as it offers flexibility. Also, using a standard format would mean becoming interchangeable, at least in this area. Admittedly, format definitions of associations or international standardization bodies that are only adapted every few years (in the case of ISO, the International Organization for Standardization, this is five years) are of limited help. For the fast-moving software industry this is too long to react to new possibilities and requirements. Nevertheless, it is helpful if standards are created which at least provide a long-lasting basic framework but also allow individual adaptations.
A data format must not only contain the data itself but must also manage the relationships of the data to the respective circumstances. For example, the characteristics of a product, part or component are not inspected simultaneously but at different measuring stations. Or individual components are built into a finished product. It must therefore be ensured that all measured characteristic values of the different components can be assigned to the final product. As one is particularly interested in the relationship between process parameters, environmental conditions and quality characteristics, a connection must be established between each measured value and the conditions prevailing at the time of measurement.
3.1 Origin of the data format AQDEF
With the introduction of the first commercially offered SPC systems and automated measuring systems in the mid-1980s, this topic quickly came into focus. However, many manufacturers of SPC or measuring systems stored the recorded data in their own file structure, which they kept secret. This made it nearly impossible to extract data from these systems and merge them centrally in a database that could be accessed by other programs. Due to the wide variety of measuring tasks, a company may have several dozen different measuring systems, which inevitably come from different manufacturers. This meant that the desired merging of data and structured data storage was not possible. If the format of a manufacturer was known, the data could be transferred into a specified format by means of individually developed converters, which is still common practice today. However, this is not efficient and has the disadvantage that the converter has to be readjusted whenever either side changes.
Since this situation was not desired by the corporations, several companies in the automotive and supplier industry joined forces with Q-DAS GmbH at the beginning of the 1990s to develop the AQDEF (Advanced Quality Data Exchange Format, see Figure 11) data format and to continuously expand it to meet new requirements. Today, Version 12 of 2015 is valid and can be downloaded from the Q-DAS homepage www.q-das.com [9]. It is thus accessible to all interested parties.
After the first version was approved, the participating companies invited the suppliers of measuring systems and presented the new format. At the same time, support for the data format was made mandatory in the purchasing conditions for new measuring systems. Dissemination was thus guaranteed. Today, several hundred manufacturers of measuring systems support this format.
Fig. 11: Cover sheet (left) of the data format AQDEF with the basic structure
(Kxxxx are key fields that are numbered in ascending order)
Translation of Figure 11: Teiletyp = part type; Merkmale = characteristics; Messwerte = measured values
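The key-field structure indicated in Figure 11 can be illustrated with a small parser (a sketch only; the key numbers used here for part, characteristic and measured value follow the Kxxxx scheme but should be checked against the authoritative format description [9]):

```python
def parse_k_fields(lines):
    """Parse a minimal AQDEF-style transfer file: one 'Kxxxx value' pair per line.

    Illustrative keys (to be verified against the format description [9]):
    K1001 -- part number, K2001 -- characteristic number, K0001 -- measured value.
    """
    records = []
    for raw in lines:
        line = raw.strip()
        if not line:
            continue
        key, _, value = line.partition(" ")
        records.append((key, value.strip()))
    return records

sample = ["K1001 4711", "K2001 1", "K0001 10.012"]
print(parse_k_fields(sample))
```

The key numbers remain stable over format versions, which is what makes the ascending Kxxxx numbering a durable basic framework.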
Initially, not all suppliers of measuring systems supported the data format correctly. As a result, the format founders required certification of the data format interface. This ensured that measuring systems with a certified interface provide the data correctly. In the meantime, some manufacturers have also adopted the format as their own standard and store the recorded data in this format.
The AQDEF data format has thus become a quasi-standard, which has also been adopted outside the automotive industry. Although the flexibility to react quickly to new market requirements must be maintained, the basic framework of the format (see Figure 11, right) is to be published as a Technical Report by ISO in coordination with the AQDEF working group. A corresponding working draft, named ISO 116462 Part 5, already exists but has not yet been published. Such a standard fosters confidence in the market and provides users with investment protection.
3.2 QML-Format
The AQDEF structure is also available based on XML technology and is then referred to as the QML (Quality Markup Language) format. This makes it compliant with World Wide Web standards.
The format itself is defined by the QML schema, which describes the elements that may occur in a valid data file. By validating a data file against this schema file, it is possible to detect and localize errors in the transfer data at an early stage. The disadvantage, however, is the performance loss caused by the validation, which is particularly noticeable with large amounts of data.
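The early error detection described above can be imitated in a simplified way (a sketch only: the element names are invented for illustration, and a real check would validate against the published QML schema (XSD) rather than a hand-maintained list):

```python
import xml.etree.ElementTree as ET

# Invented element names for illustration; the authoritative list is the QML schema.
REQUIRED = ("part", "characteristic", "value")

def check_transfer_file(xml_text: str):
    """Parse the document (malformed XML raises ParseError immediately) and
    report which required elements are missing; an empty list means the file
    passed this simplified plausibility check."""
    root = ET.fromstring(xml_text)
    return [tag for tag in REQUIRED if root.find(f".//{tag}") is None]

good = "<qml><part>4711</part><characteristic>1<value>10.012</value></characteristic></qml>"
print(check_transfer_file(good))   # -> []
```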
3.3 Data format QIF
In the USA, some companies founded the DMSC (Dimensional Metrology Standards Consortium) and developed the XML-based data format QIF (Quality Information Framework). Mitutoyo, the largest manufacturer of measuring systems, is the driving force behind it. However, not many competitors have yet decided to support the format, which has limited its dissemination.
3.4 Plausibility monitoring increases data quality
Validated data requires not only knowledge of the measurement uncertainty, but also the correctness and completeness of the descriptive data and the correct assignment of the data to the respective specified fields. It is easy to check whether all required fields have been filled in; if not, the software informs the user and prevents subsequent steps. It becomes more difficult, however, when dealing with the actual contents of the fields. In any case, free text should be avoided. Assigning fields to catalogues, in which the terms, texts or descriptions available for selection are stored, has proven to be an effective method. These entries are uniquely assigned to a key, so that only this key is selected and stored with the data.
Unique identifiers, such as product, part, component or characteristic numbers, are taken automatically and directly from other systems (e.g. CAD systems). This also applies to commercial information, such as order or batch numbers and inspection plans, which usually come from ERP systems.
As far as the measured values are concerned, it is possible to specify plausibility limits directly, which a measured value must not exceed. Values outside these limits are illogical and indicate measurement errors. The operator is informed of this and can repeat the measurement before the data is saved. This matters because repeated measurements on the same part, saved repeatedly without being marked as incorrect measurements, inevitably lead to data graveyards.
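The limit check described above is simple to state (a minimal sketch; the limits and values are illustrative):

```python
def plausible(value: float, low: float, high: float) -> bool:
    """True if a measured value lies within its plausibility limits.

    Values outside the limits are treated as probable measurement errors:
    the operator is prompted to repeat the measurement before anything is
    saved, or the value is stored flagged as an incorrect measurement.
    """
    return low <= value <= high

# Illustrative limits for a nominal 10 mm characteristic:
print(plausible(10.02, 9.5, 10.5))    # True
print(plausible(100.2, 9.5, 10.5))    # False, e.g. a misplaced decimal point
```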
In particular, if several features are measured automatically on coordinate measuring machines or measuring devices, the probability of incorrect measurements increases with each feature. In order to be able to assess this, it is recommended to perform intermediate checks and to explicitly release the results for storage in the database.
Fig. 12: Temporary storage for subsequent decision making
If the operator cannot decide this himself, the results are stored temporarily until a process owner
makes the assessment and decides whether or not to repeat the measurement on the respective
part (see Figure 12). If predefined limits are complied with, the results are automatically transferred
to the database.
If, due to special requirements such as those frequently imposed in medical technology, the pharmaceutical industry or food production, a measurement once performed may not be deleted, it must be marked accordingly. Such release processes can be supported by workflows. In this way, measurements can be carried out at the various recording stations (measuring stations) and the measured values determined can be stored centrally together with the descriptive information. This creates potential for continuous and efficient quality monitoring. This procedure, as well as the developing data stock, are the basis for many optimization and improvement possibilities.
4. Quality characteristics and process parameters
The above also applies to all other information collected, such as process or environmental parameters as well as text, images or video recordings. It must be possible to assign these to the measured quality characteristic; otherwise this information becomes considerably less useful. As far as the measured values of process or environmental parameters are concerned, their measurement uncertainties must also be known, and their assignment to the quality measurements must be clear. If this is the case, correlations between process parameters and quality characteristics can easily be established.
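Once each quality value is validly assigned to the process parameter values recorded at the same time, such correlations can be quantified, for example with the Pearson coefficient (a minimal sketch with illustrative data):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation between a process parameter (xs) and a quality
    characteristic (ys) measured on the same parts; meaningful only if the
    two series are validly assigned to each other, as described above."""
    mx, my = statistics.fmean(xs), statistics.fmean(ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

# Illustrative: machine temperature vs. measured diameter on five parts
temps = [20.1, 20.4, 20.9, 21.3, 21.8]
diams = [10.001, 10.004, 10.009, 10.012, 10.018]
print(f"r = {pearson_r(temps, diams):.3f}")
```

A strong correlation found this way is the starting point for the process-parameter monitoring discussed in the next section.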
4.1 Combining quality features and process parameters
As Figure 12 shows, there are two opposing views:
- the product view, which is used to assess the quality of the processed characteristics
- the process view, which is used to observe the behavior of the facilities.
If there is an assignment between these two views, the behavior of a process step can also be observed on the basis of the measured values of a characteristic, as shown in Figure 12.
Fig. 12: Product and process view (Source: Renault)
For example, the trend behaviour of a quality feature measured at certain intervals can be used to
infer the wear and tear of a tool, thus eliminating or at least minimizing additional checks on the tool.
Conversely, if the relationships between quality characteristics and process parameters are known and can be modelled, in many cases it is sufficient as a quality assurance measure to monitor only the relevant process parameters, which reduces the measurement effort on the produced part (see Figure 13).
Figure 13: Monitoring of process parameters
Translation of Figure 13: Prozessparameter = process parameters; Qualitätsmerkmale = quality characteristics
Similarly, environmental parameters can also be included. For example, vibrations of the equipment or noises in its vicinity can indicate changing process behavior.
4.2 Central evaluation and visualization
Especially in highly automated production, one has to keep track of many factors simultaneously. If significant deviations of a process from its target specifications are not detected quickly, considerable follow-up costs result, especially in mass production. Although processes can be monitored in the background, it is often helpful to visualize important states in real time in the form of dashboards (see Figure 14) and thus make them clearly accessible. Using predefined queries for various tasks, it is possible to switch quickly between individual dashboards. At the same time, periodic reports such as daily, weekly or monthly reports can be generated, made available to the responsible user and stored. Ad hoc evaluations can also be carried out.
Fig. 14: a) Central visualization b) Overview of capability characteristics
This approach can be seen as a first step towards Process Mining, as it has already been successfully
established in the business sector. In industrial production, the aim is to analyse the data streams for
quality-relevant information and to display the desired information transparently at prominent
points.
5. Optimization and control of processes using AI technologies
Today, the use of AI (artificial intelligence) offers new possibilities whose effects can so far only be guessed at. AI is already used in various fields, ranging from speech and image recognition to translation and medical applications. A selection from the numerous existing application possibilities is shown below. In these cases, neural networks are trained task-specifically by means of deep learning and used efficiently in the concrete application.
The basic prerequisite for training a neural network is valid data, and as much of it as possible, since the quality of the trained network depends heavily on it. Sufficient reference data are also needed to evaluate the quality of the trained network. Today it is still difficult to win users' trust in this new technology, so a valid data stock that can be accessed is essential. In many cases, simulated data can also be used, provided they describe the real situation sufficiently well.
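The role of held-back reference data can be illustrated with a small, self-contained sketch. Here a simple nearest-centroid classifier on synthetic data stands in for a trained neural network, purely to show the evaluation step; the data, labels and split ratio are all assumptions.

```python
# Sketch (synthetic data): evaluating a trained model against reference
# data that was held back from training.

import random

random.seed(0)

# Synthetic "measurement" data: two part classes with different means.
good = [(random.gauss(10.0, 0.2), random.gauss(5.0, 0.2)) for _ in range(100)]
bad = [(random.gauss(11.0, 0.2), random.gauss(6.0, 0.2)) for _ in range(100)]
data = [(p, "good") for p in good] + [(p, "bad") for p in bad]
random.shuffle(data)

# Hold back 25 % of the data as reference data for the evaluation.
split = int(0.75 * len(data))
train, reference = data[:split], data[split:]

def centroid(points):
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# "Training": compute one centroid per class from the training data.
centroids = {
    label: centroid([p for p, lbl in train if lbl == label])
    for label in ("good", "bad")
}

def classify(point):
    """Assign the class whose centroid is nearest to the point."""
    return min(centroids, key=lambda lbl: (point[0] - centroids[lbl][0]) ** 2
                                          + (point[1] - centroids[lbl][1]) ** 2)

# Evaluation on the held-back reference data.
accuracy = sum(classify(p) == lbl for p, lbl in reference) / len(reference)
print(f"accuracy on reference data: {accuracy:.2f}")
```

The point of the sketch is the split: the quality of the model is judged only on data it never saw during training, which is exactly what the reference data mentioned above are for.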
The following are some examples of application areas for AI technologies in quality management:
- Dynamic modification of inspection plans
- SPC evaluation: finding suitable distribution or process models
- Digital twin as a virtual representation of reality
- Error detection during visual inspections
- Dynamic adjustment of maintenance intervals
- Creation of FMEAs
- Risk assessments
- etc.
The success of these new possibilities will depend mainly on the data available for training the neural networks, but also on the expertise of the staff involved. One focus will be proving the quality of such networks in order to create confidence among users.
6. Outlook
New technologies will bring many changes in the near future. We will learn more and more about the respective processes and the interrelations between quality and the relevant influencing factors, attempt to map them virtually, and describe them with models. The more successfully this is accomplished, the greater the impact, possibly even a disruptive one, on the measurement technology applied in production.
However, the changes will only be beneficial if the requirements for quality information described at the beginning of this article are met from both the product and the process viewpoint: "The measurement uncertainty of the measured values must be known, and the descriptive data used to determine the measured values must adequately reflect the real situation. This information, together with the correlations, must be stored in a structured way!" The necessary technology is available. However, companies often still lack a target-oriented approach to implementing it.