AD-A264 365

Defense Nuclear Agency
Alexandria, VA 22310-3398

DNA-TR-92-79

Statistics of Mass Production

R. Larry Williams
Wilson Y. Gateley
Kaman Sciences Corporation
P.O. Box 7463
Colorado Springs, CO 80933-7463


May 1993

Technical Report

CONTRACT No. DNA 001-89-C-0080

Approved for public release; distribution is unlimited.


Destroy this report when it is no longer needed. Do not return to sender.

PLEASE NOTIFY THE DEFENSE NUCLEAR AGENCY, ATTN: CSTI, 6801 TELEGRAPH ROAD, ALEXANDRIA, VA 22310-3398, IF YOUR ADDRESS IS INCORRECT, IF YOU WISH IT DELETED FROM THE DISTRIBUTION LIST, OR IF THE ADDRESSEE IS NO LONGER EMPLOYED BY YOUR ORGANIZATION.



REPORT DOCUMENTATION PAGE

1. AGENCY USE ONLY (Leave blank)
2. REPORT DATE: 930501
3. REPORT TYPE AND DATES COVERED: Technical, 920101 - 920315
4. TITLE AND SUBTITLE: Statistics of Mass Production
5. FUNDING NUMBERS: C - DNA 001-89-C-0080; PE - 62715H; PR - SB, SD, SE;
   WU - DH057780, DH057790, DH057800, DH057810
6. AUTHOR(S): R. Larry Williams and Wilson Y. Gateley
7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES): Kaman Sciences Corporation,
   P.O. Box 7463, Colorado Springs, CO 80933-7463
8. PERFORMING ORGANIZATION REPORT NUMBER: K2IUR
9. SPONSORING/MONITORING AGENCY NAME(S) AND ADDRESS(ES): Defense Nuclear Agency,
   6801 Telegraph Road, Alexandria, VA 22310-3398
10. SPONSORING/MONITORING AGENCY REPORT NUMBER: DNA-TR-92-79
11. SUPPLEMENTARY NOTES: This work was sponsored by the Defense Nuclear Agency under
    RDT&E RMC Codes B7664D SB SC LTHRR PRPD 1950A 25904D, B7664D SB SA LTHRR PRPD
    1950A 25904D, B7664D SB SE LTHKC PRPD 1950A 25904D, B7664D SB SE LTHKT PRPD
    1950A 25904D, B7664D SB SE LTHRR PRPD (continued in Block 11 below)
12a. DISTRIBUTION/AVAILABILITY STATEMENT: Approved for public release; distribution
     is unlimited.
12b. DISTRIBUTION CODE
13. ABSTRACT (Maximum 200 words):
    This paper summarizes the statistical quality control methods and procedures that
    can be employed in mass producing electronic parts (integrated circuits, buffers,
    capacitors, connectors) to reduce variability and ensure performance to specified
    radiation, current, voltage, temperature, shock, and vibration levels. Producing
    such quality parts reduces uncertainties in performance and will aid materially in
    validating the survivability of components, subsystems, and systems to specified
    threats.
14. SUBJECT TERMS: Survivability, Mass Production, Quality Control, Process Control
15. NUMBER OF PAGES: 6
16. PRICE CODE
17. SECURITY CLASSIFICATION OF REPORT: UNCLASSIFIED
18. SECURITY CLASSIFICATION OF THIS PAGE: UNCLASSIFIED
19. SECURITY CLASSIFICATION OF ABSTRACT: UNCLASSIFIED
20. LIMITATION OF ABSTRACT: SAR

NSN 7540-01-280-5500                          Standard Form 298 (Rev 2-89)
                                              Prescribed by ANSI Std. Z39-18


UNCLASSIFIED
SECURITY CLASSIFICATION OF THIS PAGE

CLASSIFIED BY:

N/A since Unclassified.

DECLASSIFY ON:

N/A since Unclassified.

11. SUPPLEMENTARY NOTES (Continued)

1950A 25904D, B7664D SB SD LTHRR PRPD 1950A 25904D, and B7664D SB SE LTHRC PRPD 1950A 25904D.


SUMMARY

Statistical Quality Control (SQC) is a broad term defining procedures that can be employed during production to insure that parts are produced having specified characteristics and performance. In developing complex systems of systems whose performance cannot be tested as entities, methods exist to identify components and subsystems whose failures, degradations, and variations most severely affect system-level performance. Once identified, two methods may be employed to reduce or eliminate the system effects caused by such critical components: (1) design change employing such techniques as redundancy, proliferation, and spatial separation, and (2) component quality improvement.

This paper summarizes the SQC methods and procedures that can be employed in mass producing electronic parts (ICs, SRAMs, buffers, capacitors, connectors) to reduce variability and insure performance to specified radiation, current, voltage, temperature, shock, and vibration levels. Producing such quality parts reduces uncertainties in performance and will aid materially in validating the survivability of components, subsystems, and systems to specified threats.


TABLE OF CONTENTS

Section Page

SUMMARY ......................................................... iii
LIST OF ILLUSTRATIONS ............................................ vi

1  CONTROLLING QUALITY TO INSURE SURVIVABILITY .................... 1

   1.1  TOP LEVEL STATEMENT OF THE PROBLEM ........................ 1
   1.2  STATISTICAL QUALITY CONTROL ............................... 2

2  IDENTIFYING CRITICAL ITEMS AND THE EFFECTS OF UNCERTAINTIES .... 3

   2.1  SYSTEM MODELS FOR ANALYSIS ................................ 3
   2.2  FMEAS AND FMECAS TO IDENTIFY CRITICAL COMPONENTS .......... 3

        2.2.1  GO Fault Finder and Fault Trees .................... 3
        2.2.2  Common Cause Failures .............................. 4

   2.3  BENEFITS OF ANALYSIS ...................................... 4

3  SOME BASIC IDEAS OF SQC ........................................ 5

4  QUALITY CONTROL CHRONOLOGY ..................................... 6

5  THE PROCESS .................................................... 8

   5.1  PROCESS PARAMETERS AND STATISTICS ......................... 9
   5.2  PROCESS PROBABILITY DISTRIBUTIONS ........................ 12
   5.3  DISCRETE DISTRIBUTIONS ................................... 13
   5.4  CONTINUOUS DISTRIBUTIONS ................................. 14
   5.5  TESTING FOR NORMALITY .................................... 17

6  PROCESS CONTROL ............................................... 20

   6.1  COLLECTING DATA .......................................... 20
   6.2  CONTROL CHARTS ........................................... 21
   6.3  CREATING STABILITY ....................................... 22
   6.4  MEETING THE CUSTOMER'S SPECIFICATIONS .................... 24
   6.5  MAINTAINING AND IMPROVING A STABLE PROCESS ............... 26
   6.6  ACCEPTANCE SAMPLING ...................................... 27
   6.7  IN-PROCESS MEASUREMENT AND CONTROL ....................... 28

7  TOTAL QUALITY CONTROL ......................................... 29

8  TAGUCHI METHODS ............................................... 31

9  COMPUTER USE .................................................. 32

10 STANDARDS ..................................................... 33

11 LIST OF REFERENCES ............................................ 35


Section Page

APPENDICES

A GLOSSARY OF ACRONYMS, TERMS, AND DEFINITIONS ............... A-1

B QUALITY CONTROL PERSONALITIES ................................ B-1


LIST OF ILLUSTRATIONS

Figure Page

5-1 GBP statistics ................................................... 10

5-2 X-bar chart .................................................... 11

5-3 Process mean and standard deviation chart ....................... 12

5-4 Uniform distribution (density) function ........................... 15

5-5 Normal distributions ............................................ 16

5-6 Common histogram patterns .................................... 18

5-7 Normal probability plot ......................................... 19

6-1 Some charts of out-of-control processes .......................... 23

6-2 GBP accuracy and precision ...................................... 24


SECTION 1

CONTROLLING QUALITY TO INSURE SURVIVABILITY

1.1 TOP LEVEL STATEMENT OF THE PROBLEM.

The companion paper The Meaning and Utility of Confidence (see References) shows how system models can be created and employed to develop system-level survivability distributions and confidence intervals from quantitative information about piece parts, components, and systems. A major contributor to uncertainty in the survivability of components and piece parts is the performance variability inherent in the production process. The discipline of statistical quality control provides numerous techniques for ferreting out the causes of variation and controlling or eliminating them during the production process. Applying such procedures during mass production of piece parts (e.g., silicon chips, integrated circuits) could provide significant reductions in the overall uncertainties in the survivabilities of all components and subsystems of large complex systems of systems (e.g., GPALS).

Important contributors to survivability uncertainties of components and subsystems are variations in the quality of ubiquitous mass-produced electronic piece parts: microchips, integrated circuits, PC boards, switches, cables, connectors, etc. Instituting statistical quality control techniques and procedures during the production of these parts to reduce variations in the parameters having the most important effects upon performance is a significant way to reduce uncertainties in survivability estimates. Consequently, SQC may be a principal tool in validating survivability at all levels of integration.

A nuclear detonation is probably the most severe conceivable threat against a complex system. Such detonations can produce several types of stresses upon systems: energy coupling, airblast, dynamic loads, thermal radiation, EMP, and electro-optic disturbances. A range of such threats exists, definable by hostile system capabilities and by the locations of detonations relative to specific equipment. Each postulated detonation can be specified as a point environment: an x-ray fluence, spectrum, and time waveform, a total dose, a neutron fluence, a blast overpressure, and an IR irradiance. These can be characterized as prompt (x-rays, prompt gammas, neutrons, and EMP) and persistent (IR, debris gammas, betas) environments. For a system to survive and function during and after being exposed to such environments, provision must be made to withstand their effects.

Electron emission drives currents which may burn out or upset electronics. Debris gammas cause noise spikes in focal plane arrays and may affect signal-to-noise ratios in equipment. Induced currents due to the prompt environment and performance degradation from total-dose radiation are effects that concern system developers. These effects, and those of most other nuclear environments, can be mitigated by good design and shielding. If, in the absence of other survivability measures, the survivability of various vulnerable components to specified environmental levels or current thresholds can be assured by SQC, then it appears feasible that a combination of demonstrated capability and additional hardening and shielding features will insure survivability to all but a limited set of proximate nuclear environments.

A recent paper entitled Statistical Process Control for Qualified Manufacturer's Line (QML) Radiation Hardness Assurance by P. S. Winokur, et al., at Sandia National Laboratories (SAND-90-0368C) states that a quantitative understanding of two relationships is essential to establishing a QML to produce macrocells (1K SRAMs, random logic, buffers) and capacitors. A Qualified Manufacturer's Line is one which has been certified to meet quality standards such that additional sampling and testing of products from that line are not necessary. All parts produced meet required quality levels.

The necessary relationships are the capability to extrapolate the radiation response of an IC in actual threat scenarios from controlled laboratory measurements (e.g., total-dose "rebound" testing of hardened ICs) and the capability to correlate the responses of test structures with the responses of ICs fabricated on the line. As the authors show, threshold-voltage shifts due to oxide-trapped charge (DVot) and interface traps (DVit) caused by irradiation can be measured, extrapolated, and correlated. By identifying the production parameters and IC characteristics that induce the most significant effects in radiation hardness, production processes can be fine-tuned to reduce variations in these important characteristics, produce quality parts, and thus insure radiation hardness to specified levels.

1.2 STATISTICAL QUALITY CONTROL.

Statistical quality control is the application of statistical techniques to control the quality of manufactured products. While certain principal features of SQC -- control charts, acceptance sampling -- have been extant for many decades, newer techniques can also be employed. These include Taguchi methods, design of experiments, and computer analysis software and process control that employ sophisticated orthogonal testing to determine the relationships of product characteristics to process parameters. The specified quality can be built into mass-produced parts by using automated processing of such data to focus on the parameters having the greatest effects on product performance and variability. Properly employed, these tools provide the mechanism to produce quality parts that will insure and validate survivability. Consequently, this paper summarizes and reviews pertinent SQC techniques and shows their application to reduce the product variabilities that affect survivability in nuclear environments.


SECTION 2

IDENTIFYING CRITICAL ITEMS AND THE EFFECTS OF UNCERTAINTIES

2.1 SYSTEM MODELS FOR ANALYSIS.

Early in the conceptual design phase of developing a complex system of systems, system-level models (e.g., system GO models as discussed in The Meaning and Utility of Confidence) should be constructed to the subsystem and component levels. Using such models with performance distributions defined as accurately as possible, sensitivity runs can be made to identify individual components and classes of components whose performance and performance uncertainties most degrade or cause variation in system survivability. Once identified, and degradation and variation quantified, appropriate measures can be taken to mitigate such effects. These include design tradeoffs, redundancy, proliferation, spatial separation, and statistical quality control.

2.2 FMEAS AND FMECAS TO IDENTIFY CRITICAL COMPONENTS.

Another fruitful approach for identifying critical components is to perform Failure Modes and Effects Analyses (FMEAs) and Failure Modes and Effects Criticality Analyses (FMECAs). Such analyses identify which components and which failure modes of these components can cause system failures. These components and failure modes can then be rank ordered by probability of occurrence and steps taken to eliminate failures by design alterations and quality improvements.

FMEAs and FMECAs are systematic and detailed analyses, down to root-cause levels, of component and piece-part failures. Results are usually presented in tabular form. Such studies become even more meaningful if the probabilities of occurrence of the various failure modes can be determined, because, if the probabilities of occurrence of failures in specified environments can be specified, then the quantitative effects upon system performance can be determined and ordered. This provides a prioritized list of the components that warrant design changes or quality improvement, and it quantifies the potential benefits of effecting such changes or improvements.

2.2.1 GO Fault Finder and Fault Trees.

To perform FMEAs and FMECAs, the low-order fault sets causing system failure must be identified. Systematic procedures to identify low-order fault sets (the combinations of failed components that cause system failure) include developing Fault Trees (FT) and using FT software, or developing a system GO model and exercising the GO Fault Finder software to identify the fault sets causing system-level failures. A nice feature of the GO Fault Finder capability is that, once the system model has been developed, no additional modeling is required to employ it for any anomalous system event. The software automatically identifies fault sets for specified system-level failure events. Identifying low-order fault sets using either technique is an integral part of an FMEA or an FMECA to find those components whose failures most affect system performance. Once identified, their possible failure modes and causes are comprehensively examined.


2.2.2 Common Cause Failures.

Of special interest in such studies is the examination of common cause failures -- that is, a common environment or cause that can simultaneously fail numerous components all susceptible to the same stimulus (e.g., temperature, moisture, radiation, excessive currents, etc.).

2.3 BENEFITS OF ANALYSIS.

Performing these types of system and component assessments early in the design process will produce the biggest performance increases for the least cost. An additional benefit is that when the design is finalized, the system models can be updated to provide current performance estimates throughout system life based upon operating experience and data.


SECTION 3

SOME BASIC IDEAS OF SQC

Producing a product involves the transformation of one or more inputs (materials, components, subsystems, etc.) into an output (the product). Mass production involves many repetitions of a particular transformation, and we call this set -- conceptually extended to an infinite number of repetitions -- a process.

The concept of a process is very general and includes everything from an employee marking his timecard to NASA producing a space shuttle. The latter, of course, involves thousands of subprocesses, each of which can be studied by itself.

The four words "quality," "control," "process," and "statistical" are used in various combinations such as "quality control," "process control," "statistical quality control," and "statistical process control" to describe the efforts to deliver an acceptable product to a customer. These four phrases are frequently used interchangeably, and so careful definitions are academic for the most part. However, a phrase involving "process" usually refers to the process which produces a product, whereas if "process" is missing, the reference is usually to the product itself. The word "statistical" suggests that formal statistical methods of one kind or another are involved in the control of the quality of the product.

By using thorough sampling procedures it is possible that acceptable products may be obtained from a low quality process -- this is defect detection. It is generally a costly procedure and represents a last-ditch defense against an inferior production process. Defect prevention is the preferable course of action, and this means designing and controlling the process so that high quality products are consistently created. For this reason, our principal concern in this paper is with process control and, in particular, because of the need for, and efficiency of, statistical methods, statistical process control (SPC).


SECTION 4

QUALITY CONTROL CHRONOLOGY

A brief chronology of some of the high points of manufacturing quality control is given in this section:

Pre-industrial Revolution:
Master-apprentice system; everyone a craftsman; small organizations; close producer-customer contact; careful inspection of products; good quality control.

18th & 19th centuries -- Industrial Revolution:
Factory system; skilled, semi-skilled, and unskilled workers; some quality control by inspection.

1890s:
F. W. Taylor develops his "scientific management system" in which planning is separated from production (the "brains" from the "brawn"). This contributed to large increases in productivity, but at the expense of decreases in human relations and product quality. Inspection departments flourished, and they -- rather than the production departments -- were usually blamed for the faulty products sent to customers.

Early 1900s:
More and more, science and technology replace empiricism in the development of new materials and the designs of new products and production processes. Low quality products are prevalent and quality control remains in the hands of the inspectors.

1924:
Walter A. Shewhart proposed the use of control charts to his superiors at Western Electric. Shewhart's books Economic Control of Quality of Manufactured Product (1931) and Statistical Method from the Viewpoint of Quality Control were the foundations of much current quality control theory and practice.

1930s:
The Dodge-Romig Sampling Inspection Tables were developed by Harold Dodge and Harry Romig at the Bell Telephone Laboratories.

World War II:
Quantity rather than quality was the name of the game. The use of sampling was expanded, and this effort was aided by the use of MIL-STD-105, which was based on the Dodge-Romig tables and which continues to be used in both military and nonmilitary contracts to this day. In addition, a department was set up by the War Production Board to help companies meet quality standards. Two professors of statistics were appointed to lead that department, and they developed a series of eight-day courses based upon probability theory, sampling theory, and the Shewhart control chart. These courses were instrumental in spreading the news about statistical quality control and led to the formation of the American Society for Quality Control in 1946.


Post-World War II:
The pent-up demand for consumer goods encouraged manufacturers to continue to emphasize quantity rather than quality, and quality control efforts generally dwindled.

1950:
W. Edwards Deming spoke to Japan's leading industrialists at the request of JUSE (the Union of Japanese Scientists and Engineers) in their search for ways to rebuild the Japanese manufacturing industry at a high quality level.

1951:
Total Quality Control by Armand V. Feigenbaum was published. In this book, Feigenbaum espoused the concept of quality control in all areas of business from design to sales and emphasized the desirability of defect prevention rather than defect detection.

1957:
Genichi Taguchi's book Design of Experiments Method was published. (See Section 8 herein.)

1982:
W. Edwards Deming's book Out of the Crisis published. This book is an exposition of the author's thoughts on how the transformation of American management can and should take place.

1987:
ISO 9000 series of standards published by the International Standards Organization. (See Section 10 herein.)


SECTION 5

THE PROCESS

Broadly speaking, a process is a transformation which changes something (the process input) into something else (the process output) -- there may, of course, be multiple inputs and outputs.

Although the change is usually thought to be in the form of the input (manufacturing a VCR), it may involve a change in location (sending a letter to your congressman) or a change in time (getting to the church on time). We are involved throughout our lives with processes of all kinds, frequently worry about many of them, and occasionally take positive action to improve some of them. However, the use of statistical methods for measuring, analyzing, improving, and finally controlling them is seldom encountered outside large-scale production facilities.

Generally, the processes of interest are those which are repeatable over time, because that gives us the possibility of improving them so that the cost of operation is reduced, the quality of the output is increased, or, preferably, both.

A simple process example which we will refer to several times below is the following: A golfer -- let's call him "Gus" -- hits a golf ball. The inputs of interest might include the ball, the club used to hit the ball, the weather conditions (precipitation, temperature, wind), and many aspects of Gus himself. The outputs might include how long the ball was in the air, the direction and speed with which it was hit, and where it came to rest. For our example we will let the only output of interest be the distance between the starting point of the ball and the point where it finally stopped. We will call this the Golf Ball Process (GBP).

The ultimate interest in a process is with its output, and this interest usually involves both its cost and its quality (which refers to how well the output compares with some conceptual ideal, the nominal specification).

GBP: Gus wishes to learn how to hit the ball some particular distance and is trying for 100 yards at the moment. Thus the GBP output nominal specification is 100 yards.

The evaluation of an output will be done using either attribute data or variables data. Attribute data is usually based on good/bad (acceptable/unacceptable, go/no-go) information -- that is, does a specific output possess a certain attribute or not? The resulting numerical information for evaluating the process might then be the number or the proportion of good outputs produced over a certain time. Variables data consist of measurements (length, weight, etc.) and consequently provide more information than attribute data. Thus, if the output of a process is a metal plate in which a hole has been drilled, the use of a caliper to measure the diameter of the hole is preferable (at least for process control purposes) to the use of a go/no-go gage.

GBP: The output is evaluated by variables data -- the measured distance from where the ball was when it was hit to where it stopped. (Note that the measuring of this distance is another process in itself; we will assume that any errors made by this process are insignificant compared to those of the GBP.)


The outputs from several repetitions of a process almost always exhibit variation. Thus some drilled holes will be acceptable and some will not be; the golf ball distance will be 101 yards on the first try and 95 on the second. Such variation, besides being the bane of everyone concerned with the process or its outputs, is the major raison d'etre of probability and statistics. This rightly suggests that statistical methods can be valuable tools for studying the performance of a process.

5.1 PROCESS PARAMETERS AND STATISTICS.

The most important numerical measures of the performance of a process are those which define the average of the values of an output characteristic and those which define the variation of those values.

For variables data, averages include the mean (the most commonly used), the median, and the mode, while measures of variation include the variance and the standard deviation (the square root of the variance). These and other process-related measures are called process parameters.

Conceptually, any process may go on forever -- that is, there may be an unlimited (infinite) number of repetitions of the process operation. This suggests that it is impossible to ever know the exact value of these performance measures, except, of course, for theoretical studies in which the values are postulated. However, by sampling the process outputs -- looking at some of them -- over a period of time and computing appropriate sample statistics corresponding to the process parameters, estimates of the parameter values may be obtained. Thus the process mean (denoted by μ) can be estimated by the sample mean (denoted by xbar), the process variance (σ²) can be estimated by the sample variance (s2), and the process standard deviation (σ) by the sample standard deviation (s) or the sample range (r). The formulas for these statistics, given a sample of n measurements of the process output (x1, x2, ..., xn), are:

    xbar = (x1 + x2 + ... + xn) / n
    s2   = [(x1 - xbar)² + (x2 - xbar)² + ... + (xn - xbar)²] / (n - 1)
    s    = √s2
    r    = largest xi - smallest xi

GBP: Gus hits ten balls and measures the start-to-finish distance of each. The ten sample values and the calculation of the sample statistics are shown in Figure 5-1.

If additional samples of size n are taken from the process, we would expect to get for each sample somewhat different sample values and, consequently, different values for the sample statistics. For each statistic we could make a graph (a control chart) by plotting the several values of the statistic versus the sample number (or possibly the time at which the sample was taken).

GBP: Gus hit 15 buckets each containing 10 balls. He calculated the sample mean (xbar) for the balls from each bucket and plotted them in an x-bar chart (Figure 5-2).

How would the graph in Figure 5-2 change if Gus had used buckets containing 40 rather than 10 balls? Probability theory tells the answer: the spread in the resulting xbar values would be cut in half -- that is, the xbar values would tend to fall between 99 and 101 instead of between 98 and 102.


 i     xi    xi - xbar    (xi - xbar)²

 1    102       3.1          9.61
 2    105       6.1         37.21
 3     96      -2.9          8.41
 4    101       2.1          4.41
 5     99       0.1          0.01
 6     98      -0.9          0.81
 7     92      -6.9         47.61
 8    101       2.1          4.41
 9    100       1.1          1.21
10     95      -3.9         15.21

Total 989                  128.90

xbar = 989/10 = 98.9
s2 = 128.90/9 = 14.32
s = √14.32 = 3.78
r = 105 - 92 = 13

Figure 5-1. GBP statistics.

More generally, the spread in the xbar values (and s values as well) is inversely proportional to the square root of the sample size.
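As a quick illustration (a sketch of ours, not part of the original report), the Figure 5-1 calculations can be reproduced in a few lines of Python; the ten distances are the sample values listed above.

```python
# Reproduce the Figure 5-1 sample statistics for Gus's ten shots (yards).
import math

distances = [102, 105, 96, 101, 99, 98, 92, 101, 100, 95]

n = len(distances)
xbar = sum(distances) / n                               # sample mean
s2 = sum((x - xbar) ** 2 for x in distances) / (n - 1)  # sample variance
s = math.sqrt(s2)                                       # sample standard deviation
r = max(distances) - min(distances)                     # sample range

print(f"xbar = {xbar:.1f}, s2 = {s2:.2f}, s = {s:.2f}, r = {r}")
# xbar = 98.9, s2 = 14.32, s = 3.78, r = 13
```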

Even further, probability theory tells us, for example, that about 99% of the xbar values will lie between μ - 3σ/√n and μ + 3σ/√n, which not only says that as n increases the spread in the xbar values decreases towards zero, but also that the value about which the xbar values cluster is μ, the process mean. A similar statement can be made about s -- that is, as n increases, the spread in the values of s will shrink towards zero, and the value about which they cluster will be σ, the process standard deviation.

Now imagine that we were able to take an extremely large sample at frequent intervals (Gus hits a bucket of one million balls every minute). What would the resulting x-bar chart look like? Assuming no distorting influence occurring to the process, we would expect all of the xbar values to be very close to the process mean, and so the plotted points would lie along a straight horizontal line. Ignoring the points but keeping the line, we can think of the chart as representing the process mean itself rather than the sample means -- that is, we would have a Process Mean chart rather than an x-bar chart. Similarly, we can imagine constructing a Process Standard Deviation chart.


Figure 5-2. X-bar chart (the sample mean xbar for each bucket plotted against bucket number).

In both cases we can think of the horizontal axis as representing time. Figure 5-3 shows the resulting charts (combined into one).

The horizontal lines for μ and σ tell us that, at least over the time interval covered by the chart, both μ and σ remained constant, and consequently we say that the process was stable or in-control.

An unstable or out-of-control process is then one characterized by a nonconstant mean and/or standard deviation.

GBP: If Gus became tired, he might tend to hit the balls with less force (which would cause the process mean curve to dip downwards) and, possibly, more erratically (which would cause the process standard deviation curve to rise). Consequently, if both of those effects occurred, the points on a sample s chart would tend to move higher, while those on a sample x-bar chart would move lower and would also spread more because of the increase in σ.

An unstable process is inherently an unpredictable one. Consequently, because of the interest in producing high quality products in the future, the top priority activity is to create and maintain stable processes. Unfortunately, neither the detection of process instability nor its removal once it is detected is a simple or straightforward task.

To start with, there is not a practical sharp boundary between stability and instability -- that is, a few wiggles in a process chart curve are probably of no great consequence for the process output and would be ignored even if they were discovered.

Secondly, finding small instabilities in a process is extremely difficult. Detection is accomplished by searching for unusual events in the behavior of one or more of the sample statistics. However, even a stable process has some inherent variability which causes variation in the values of a sample statistic, and so small changes due to changes in the process can easily be missed.


Figure 5-3. Process mean and standard deviation chart (horizontal lines for μ and σ plotted against time).

As a result of this, a process is considered to be stable unless and until an indication of instability is found.

Process variations are divided into two classifications, the first containing those which are present in a stable process and the second containing those which cause instability. The stable process variations (or causes of such variations) are referred to as common, chance, random, or inherent, while the others are called special, assignable, non-random, or non-inherent. The distinction between common and special causes is not always clear-cut, and one which is common today may be special tomorrow if it becomes prominent after other causes have been removed.

GBP: A spectator coughing and upsetting Gus is a special cause of variation, while minor fluctuations in the wind speed are probably a common cause if Gus is a duffer, but may be special if he is Jack Nicklaus.

5.2 PROCESS PROBABILITY DISTRIBUTIONS.

A process probability distribution is a mathematical function which describes the manner in which the output measurements of a stable process are distributed. In particular -- and of prime importance for the purpose of process control -- it provides a way to calculate the (approximate) proportion of output measurements falling in any specified interval (this is usually stated as the probability that a measurement will fall in the interval).


The word "approximate" is important because the probability distribution of a process can never be known exactly except by assumption or in theoretical studies.

Fortunately, approximations give sufficient information for most purposes.

All of the parameters of a process are implicitly defined by its probability distribution -- that is, if we know the distribution, the values of all of the parameters, as well as the probability behavior of the process, can be found. Because of this, it is preferable to define process stability as an unchanging distribution rather than just one with a constant mean and standard deviation. Changes in the distribution almost always show up as changes in either the mean or standard deviation, and the changes in these are relatively easy to detect.

When discussing a process probability distribution, the term random variable is frequently used as a synonym for "output measurement". The adjective "random" basically means "coming from a distribution according only to its probability law" -- that is, nothing other than the process itself determines the value of a random variable. A set containing one or more values of a random variable is called a random sample.

The manner of defining a probability distribution depends upon whether the distribution is discrete or continuous. A discrete distribution is one for which the random variable is discrete -- that is, takes on only "separated" values such as 1, 2, and 3, but not 1.5 -- and is generally associated with attribute data, such as the number of acceptable items in a sample or the number of telephone calls received per day. A continuous distribution is associated with variables data and is one for which the random variable can take on any value in an interval such as 0 to 100 or even -∞ to ∞ (where ∞ is the infinity symbol), and the interval includes all real numbers.

A distribution function is a formula and, in addition to the usual "dummy variable" (which we will always denote by "x"), will usually contain one or more parameters which permit the formula to define a family of similar distributions which differ by having different specific values for each of the parameters. In most cases, the mean and standard deviation are given by simple formulas involving the parameters.

5.3 DISCRETE DISTRIBUTIONS.

For each possible value of the dummy variable, the formula for a discrete distribution gives the probability that the random variable will have that value. Among the important discrete distributions are the binomial, the Poisson, and the hypergeometric.

The binomial distribution gives the probabilities of obtaining a certain number of "good" items when a sample of size n is taken from a stable process which produces a certain proportion of such items. The number of good items in the sample must, of course, be an integer between zero and n.

GBP: If a "good" shot is one which stops between 98 and 102 yards and, if experience shows that 95% of all shots are good, then the binomial distribution can be used to find, for example, the probability that if a bucket of ten balls is shot, at least eight of them will be good. The answer is 0.988 -- the reader may consult an elementary probability book or the local statistician for computational details.


We also find that the mean number of good shots (over an assumed unlimited number of ten-ball buckets) is 9.5 and the standard deviation is 0.69.

The Poisson distribution gives, for example, the probabilities of the number of events occurring over some time interval when the average rate of occurrence is known. Thus, if an office receives an average of ten telephone calls per hour, the Poisson distribution says that the probability of 15 or more calls in an hour is 0.0835.
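The binomial and Poisson figures quoted above are easy to check numerically. The following sketch (ours, not from the report) uses the scipy.stats library; any comparable statistics package would do.

```python
from scipy import stats

# Binomial: a bucket of n = 10 shots, each "good" with probability 0.95.
n, p = 10, 0.95
print(round(stats.binom.sf(7, n, p), 3))        # P(at least 8 good) = 0.988
print(n * p)                                    # mean number of good shots = 9.5
print(round((n * p * (1 - p)) ** 0.5, 2))       # standard deviation = 0.69

# Poisson: an average of 10 calls per hour; probability of 15 or more calls.
print(round(stats.poisson.sf(14, 10), 4))       # approximately 0.0835
```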

The hypergeometric distribution provides probabilities associated with acceptance sampling schemes.

5.4 CONTINUOUS DISTRIBUTIONS.

A continuous distribution is defined in a manner different from a discrete one. The graph of the distribution function (usually called a density function) is a non-negative curve having the property that the area between the curve, the x-axis, and the vertical lines at two points on the x-axis gives the probability that the random variable lies between those two points. The total area between the curve and the x-axis must be 1.0 because the total probability of any distribution is always 1.0.

The simplest continuous distribution (and of no particular importance to us except as an example) is the uniform distribution defined on the unit interval (between x = 0 and x = 1). Its graph is shown in Figure 5-4. The area under the "curve" between x = 0 and x = 1 is clearly 1.0. The probability that the random variable lies between 0.2 and 0.6 is 0.4 ((0.6 - 0.2) x 1.0).

Unfortunately, probabilities for most continuous distributions are not as simple to find, and, generally, we must rely upon published tables or appropriate computer programs to obtain them.

Because of its overriding importance, the only other continuous distribution we will mention is the normal (or Gaussian). The normal distribution function contains two parameters which happen to be the mean μ and the standard deviation σ. Its graph is the familiar bell-shaped curve which is symmetric about the vertical line at the mean. Figure 5-5 shows the graphs of several normal distributions which illustrate the influence of the parameters μ and σ.

It turns out that probabilities for a normal distribution with any specified values for μ and σ can be obtained from a table for the standard normal, for which μ = 0 and σ = 1. Such a table can be found in almost any probability or statistics book, but we will not discuss its use in this paper.

The reason normal distributions play such an important role lies in the remarkable Central Limit Theorem which, in essence, says that if the total variation (the deviations from the mean) of a stable process is the sum of many small deviations coming from independent random causes, then the process distribution will be approximately normal, and the approximation will improve if the number of causes increases.

GBP: There are undoubtedly many reasons why a golf ball fails to stop at exactly the 100 yard mark: the ball has a small defect or two; the wind has a slight change in direction or speed; the grass on which the ball rolls after landing differs from shot to shot; etc. Consequently, we would expect the process distribution to be approximately normal once the process has been stabilized -- that is, once the major and erratic causes of variation have been eliminated.


Figure 5-4. Uniform distribution (density) function: f(x) = 1 for x between 0 and 1, and 0 elsewhere.


As a result of the Central Limit Theorem and the ease of finding probabilities of a normal distribution, we are able to find approximate values for the probabilities of most stable processes.

The Central Limit Theorem also plays an important role when samples of a process are used because, for example, if a sample of size ten is taken, the number of small variations is ten-fold greater than that of a single observation. Consequently, the probabilistic behavior of the values of the sample mean (which are essentially sums of individual values) is approximately normal. Even if the process distribution itself is far from normal, sample sizes as small as four produce distributions quite close to normal.
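A small simulation makes the point concrete. This sketch is ours (not from the report): it draws samples of size four from a decidedly non-normal (uniform) process and shows that the resulting sample means cluster around the process mean with a spread of about σ/√n.

```python
import random
import statistics

random.seed(1)

def process_output():
    # One output from a stable but non-normal (uniform) process on [0, 1].
    return random.uniform(0, 1)

# 10,000 sample means, each computed from a sample of size n = 4.
means = [statistics.mean(process_output() for _ in range(4)) for _ in range(10000)]

print(round(statistics.mean(means), 3))   # near the process mean of 0.5
print(round(statistics.stdev(means), 3))  # near sigma/sqrt(4) = 0.2887/2, about 0.144
```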

It must be stressed that "approximate" means exactly that, and it is a mistake to believe strongly in approximate probabilities calculated to several decimal places. This caution applies particularly to probabilities associated with the tails of the distribution -- that is, with those points more than two or so standard deviations from the mean.

If a distribution cannot safely be normally approximated, there are other approximations which can be used. One of these involves Chebyshev's inequality and applies to any distribution; another involves the Camp-Meidell inequality and applies to any distribution which has a single mode -- that is, the graph has a single high point.


Figure 5-5. Normal distributions: (a) different means; (b) different standard deviations; (c) different means and different standard deviations.

Table 5-1 shows some statements which can be made about the probability of a random variable lying in an interval centered at its mean, depending upon whether an assumption of normality, a single mode, or nothing is made about the distribution.


Table 5-1. Probabilities of a random variable being in the interval μ - nσ to μ + nσ.

  n     Normal            Unimodal            Any
        Distribution      Distribution        Distribution

  2     0.954             at least 0.889      at least 0.750
  3     0.997             at least 0.951      at least 0.889
  4     0.99994           at least 0.972      at least 0.938
  5     0.9999994         at least 0.982      at least 0.960
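The three columns of Table 5-1 follow directly from the standard normal distribution, the Camp-Meidell bound 1 - 4/(9n²), and Chebyshev's inequality 1 - 1/n². A short sketch (ours, using scipy for the normal tail) reproduces the table:

```python
from scipy import stats

for n in (2, 3, 4, 5):
    normal = 1 - 2 * stats.norm.sf(n)      # exact for a normal distribution
    camp_meidell = 1 - 4 / (9 * n ** 2)    # lower bound for a unimodal distribution
    chebyshev = 1 - 1 / n ** 2             # lower bound for any distribution
    print(n, round(normal, 7), round(camp_meidell, 3), round(chebyshev, 3))
```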

5.5 TESTING FOR NORMALITY.

The determination of whether or not a distribution is approximately normal is subjective for the most part, but can be guided by some graphical tools. In any case, a sample of the process output is needed, and the larger the better (probably in the range from 100 to 1000).

The first -- and relatively crude -- tool is a histogram of the data. This involves dividing the range of the sample values into several subintervals (maybe 8 to 10), counting the number of sample values which lie in each subinterval, and drawing a bar chart as shown in Figure 5-6, where the height of each bar is the number of values in that subinterval. The figure shows several common histogram patterns, of which only the bell-shaped one suggests a normal distribution.

A much better procedure is to arrange the sample values in ascending order and plot them on normal probability graph paper, whose vertical axis has a nonlinear scale constructed so that the points on a plot of a sample from a normal distribution will fall very close to a straight inclined line. Systematic deviations from a straight line indicate non-normality. Figure 5-7 shows a plot which clearly suggests normality. (Computer programs are available to do all of the manipulations of the data and the drawing of the graphs.)
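Both graphical checks are readily automated. The sketch below (ours, not from the report) uses matplotlib and scipy; the `measurements` array is placeholder data standing in for a few hundred sampled output values.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import stats

rng = np.random.default_rng(0)
measurements = rng.normal(loc=700, scale=80, size=300)   # placeholder sample data

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

# Crude check: a histogram with 8 to 10 subintervals.
ax1.hist(measurements, bins=9)
ax1.set_title("Histogram")

# Better check: a normal probability plot; near-straight points suggest normality.
stats.probplot(measurements, dist="norm", plot=ax2)
ax2.set_title("Normal probability plot")

plt.tight_layout()
plt.show()
```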


Figure 5-6. Common histogram patterns: Bell-Shaped, Double-Peaked, Plateau, Comb, Skewed, Truncated, Isolated-Peaked, Edge-Peaked.


Figure 5-7. Normal probability plot (sample values from about 400 to 900 on the horizontal axis, cumulative percent on the nonlinear vertical scale; the plotted points fall close to a straight inclined line).


SECTION 6

PROCESS CONTROL

The purpose of process control is to manipulate the process so that almost all of its products meet or exceed the requirements of the customer (the "almost" is due to the fact that even the best regulated process will occasionally turn out a lemon).

A briefer statement might simply say that the purpose is to produce a high-quality product, but generally the meaning of "quality" is both vague and subjective. However, a reasonably good definition can be obtained by converting the "almost" in the previous paragraph to a number which is the percentage of acceptable products produced. Thus, a process for which 95% of its products are acceptable has a quality of 95% (in the sense that the quality of a product actually refers to the quality of the process which created the product). Process control consists of those procedures which:

• create a stable process.

• adjust a stable process so that a product of an acceptable quality level is produced.

• continue to monitor, adjust, and improve the process so that the product quality is at least maintained and preferably improved.

6.1 COLLECTING DATA.

The first step in the creation of a stable process is to learn as much as possible about the design and operation of the current process and the nature of its outputs. Most of this study will be non-statistical and might involve process flowcharts of various kinds, blueprints, layout diagrams, conversations with people at all levels who are involved, either directly or indirectly, with the process, etc.

The sources and nature of the inputs to the process should be investigated, and, in particular, the quality of the inputs must be determined.

The use of the process products by a customer should be studied, because the customer is the one who, by his use of the product, ultimately defines and determines its quality.

The analysis of the process output involves taking random samples of the output and using them to investigate the probability behavior of the process. This investigation might involve the calculation of sample statistics, constructing process control charts to see if any obvious abnormalities are present, drawing a histogram to see the general shape of the distribution, and, generally, using all graphical or analytical tools which might shed some light on the process and help detect and uncover the causes of instabilities.

Discussions with those workers most closely associated with the process are particularly valuable here because these people can often pinpoint the cause of observed irregularities.


6.2 CONTROL CHARTS.

Control charts are probably the major analytical tool used to create and maintain a stable process. Such charts consist of nothing more than graphs of consecutive values of some sample statistic on which certain horizontal lines are drawn. Charts may be constructed for any sample statistic or even for individual measurements (really just a sample of size 1). Table 6-1 lists most of the common charts.

Table 6-1. Control chart types.

Data Type     Statistic                   Chart Name

Variables     mean                        x-bar
              standard deviation          s
              range                       r
              individual                  x
              individual moving range     moving range

Attribute     proportion defective        p
              number defective            np
              number of defects           c or u

Either        any                         cumulative sum (cusum)

We will not discuss any particular type of chart, but simply note that the principles of constructing and using any of them are similar. Variables charts are the most common and are preferred over attribute charts because a measurement provides more information than a good/bad determination. Attribute charts are usually "one-sided" in the sense that the number of defective items in a sample or the number of defects occurring with a single item can never be less than zero, and zero is always the desired goal.

For variables data at least two charts are used: one for a measure of the sample average (usually an x-bar chart for the sample means) and another for a measure of the sample variability (either an s chart for the sample standard deviations or an r chart for the sample ranges). S charts are preferable to r charts because, in a statistical sense, a sample standard deviation is a better estimator of the process standard deviation than a sample range. The range has been used in the past, at least for small sample sizes (less than ten or so), because of the relative simplicity of calculating a range compared to that for a standard deviation. With the ready and inexpensive availability of electronic calculators and computers to do the arithmetic, the objection to using a standard deviation has much less force today.

The control chart on which the values of a sample statistic are plotted is completed by adding a horizontal line at the best available estimate of the unknown and unknowable mean of the theoretical population of sample statistic values, and two other horizontal lines, one a certain distance above the mean line and the other below. These two lines are called control lines -- upper (UCL) and lower (LCL) -- and are positioned so that almost all possible sample statistic values will fall between them if the process is stable. In most cases the lines are so-called 3-sigma lines, which means that they are placed a distance from the mean line of 3 times the estimated standard deviation of the sample statistic population.


The idea is that if the distribution of the statistic were normal and the exact values of the population mean and standard deviation were known, the probability would be exactly 0.997 (99.7%) that any sample statistic value would fall in the band between the control limits.

Consequently, a value falling outside the band would say that either a very unlikely event occurred (probability = 0.3%) if the process is stable, or an unlikely event has not occurred because the process is out-of-control (not stable) and needs some attention. Unfortunately, except in theoretical studies, we do not know either the population mean or standard deviation, or whether or not the distribution is normal. As a result, the probability associated with the control band is almost certainly not 99.7%.

Table 5-1 above indicates that the probability associated with a 3-sigma band is at least 89% for any distribution and at least 95% if the distribution is unimodal. In short, the probability is almost certainly greater than 0.3% that a statistic value will fall outside the control band, and so we may expect more false alarms than we had hoped for. We might, of course, use 4-sigma limits to reduce the number of such alarms, but this would also increase the risk of not detecting instability. The moral here is that use of a control chart should always be tempered by the wisdom and intuition provided by a well-trained person with considerable process control knowledge and experience.

Some statistics, unlike the sample mean, do not have symmetrical probability distributions, and so the two control limits are set at different distances from the mean line in such a manner that the probability associated with the part of the band above the mean line is the same as that below. (Actually, regardless of the form of the distribution, there may be situations in which unequal probabilities are desired, but such cases are unusual.) Doing all this sounds a bit forbidding, but almost any process control textbook or manual contains tables of theoretically-derived constant coefficients which can be easily used to set the limits for any type of chart.

There are many ways by which a process instability can reveal itself other than a sample statistic value falling outside the control band.

These include a substantial change in the level of the observed values, a drift upwards or downwards, a cyclical pattern, and generally any pattern which has a low probability of occurring if the process is indeed in-control. Figure 6-1 shows some typical charts for out-of-control processes. There are a number of specific rules available, but they should always be applied with some degree of caution.
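
One family of such rules looks for improbable runs. As a sketch (the eight-in-a-row threshold and the data are illustrative assumptions, not rules taken from this report), a run of consecutive points on one side of the center line can be detected like this:

    # Sketch: detect a long run of points on one side of the center line,
    # one common supplementary out-of-control signal.
    def long_run(values, center, run_length=8):
        """Return the index at which a run of `run_length` consecutive points
        on the same side of the center line is completed, or None."""
        side, count = 0, 0
        for i, v in enumerate(values):
            s = 1 if v > center else (-1 if v < center else 0)
            if s != 0 and s == side:
                count += 1
            else:
                side, count = s, (1 if s != 0 else 0)
            if count >= run_length:
                return i
        return None

    # Invented data drifting above a center line of 10.0.
    data = [9.9, 10.1, 9.8, 10.2, 10.1, 10.2, 10.1, 10.3, 10.2, 10.1, 10.2, 10.4]
    print(long_run(data, center=10.0))   # -> 10 (the eighth consecutive high point)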

6.3 CREATING STABILITY.

Once instability is detected or suspected in a process, the question of how to isolate the cause arises. There are no magic tools to use for this, but the possibilities include:

• Create and use a Cause-and-Effect diagram. This is a diagram showing all of the possible causes which can be thought of which might lead to an effect (in this case, the instability). This is usually a group effort and has proved to be quite successful in many cases. A Cause-and-Effect diagram is also called a fishbone or an Ishikawa diagram, the first because of the form it frequently takes as possible causes are broken down into smaller ones, and the second after Kaoru Ishikawa, who developed it in 1943.

Figure 6-1. Some charts of out-of-control processes. (Two example panels, each plotting a statistic against sample number between the UCL and LCL: cycles on a control chart, and a shift in process level.)

"0 Create a Pareto chart for the more promising causes. A Pareto chart is just abar chart in which the height of each bar represents a measure of how muchof the effect is due to the cause associated with that bar (the bars are usuallyplaced in descending order of height). The Pareto principle states that inmany situations almost all of an effect (80% or so) is due to only a few of thepossible causes (20% or so). Thus a Pareto chart, even though the numbersinvolved are crude, may indicate which causes are the most promisingcandidates for attention.

"* Investigate possible process mixtures. The output of a process may involve amixture of two or more subprocesses -- for example, a night shift and a dayshift, several machines or people whose outputs are combined beforesampling occurs, process inputs from different sources, etc. Measuring theoutput due to each component of a mixture may reveal significantdifferences.

The Shewhart Cycle -- renamed the Deming Cycle by the Japanese and commonly referred to as the Plan-Do-Check-Act (PDCA) Cycle -- is a general four-step method which can aid management in stabilizing a process as well as in other quality improvement efforts:

1. Plan: A plan to effect improvement is developed.

2. Do: The plan is carried out, preferably on a small scale.

3. Check: The effects of the plan are observed.

4. Act: If the results are favorable, the plan can be implemented on a large scale; otherwise, go back to step 1.

6.4 MEETING THE CUSTOMER'S SPECIFICATIONS.

The goal of controlling a process is to produce a product which makes the customer happy. Usually what makes the customer happy is expressed in the form of product specifications. In most cases, the specifications for each of the possible measurements of the product are given numerically and define a band of values within which the measurement must fall in order that the product be acceptable (conformable). This band is commonly defined by a nominal (target) value and a tolerance, so that the band extends from "nominal - tolerance" to "nominal + tolerance" (there may be separate tolerances for below and above). The resulting two numbers are called the Lower Specification Limit (LSL) and the Upper Specification Limit (USL), respectively.

The ability of a process to satisfy product specifications depends upon two attributes of its distribution: accuracy and precision. Accuracy pertains to the distance between the process mean μ and the corresponding nominal specification (this distance is called the bias of the process), while precision pertains to the variation of the process about that mean and is measured by the process standard deviation σ (the larger the standard deviation, the smaller the precision). Accuracy and precision are frequently used in a relative manner -- for example, process A is more accurate than process B, or process C is not precise (presumably relative to some other process). The two attributes are independent of each other, precision being an inherent property of the process and accuracy depending upon the process behavior with respect to an external standard.
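
As a small numerical sketch (the nominal value and measurements are invented), accuracy and precision can be estimated from a sample as the bias of the mean and the standard deviation, respectively:

    # Sketch: estimating accuracy (bias) and precision (spread) from a sample.
    import statistics

    nominal = 100.0
    hits = [98.6, 99.4, 98.9, 99.2, 98.7, 99.1, 99.0, 98.8]   # invented data

    mean = statistics.mean(hits)
    bias = mean - nominal            # accuracy: distance of the mean from nominal
    spread = statistics.stdev(hits)  # precision: the smaller, the more precise

    print(f"mean={mean:.2f}  bias={bias:+.2f}  standard deviation={spread:.2f}")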

GBP: Figure 6-2, in which each dot represents one golf ball, shows the results of a sample for each of the four possible combinations of accuracy and precision. The nominal specification is 100 yards, but no tolerance is indicated.

Figure 6-2. GBP accuracy and precision. (Four dot plots of hit distance in yards, one for each precise/accurate combination -- no/no, no/yes, yes/no, and yes/yes -- with the nominal 100-yard value marked.)

It is desirable to have a measure of the ability of a process to satisfy the product specifications. Such a measure (capability index) -- which is meaningful only for a stable process -- should involve both the accuracy and precision of the process and the specification limits. Some commonly used indices include:

"* Cp = (USL- LSL)/ (6o)

"* CPU = (USL-p)/(3a)

"* CPL =(P- LSL)/(3u)

"* Cpk = minimum of CPU and CPL

GBP: Gus wants to have the balls stop within four yards (the tolerance) of the nominal value of 100 yards. Therefore, LSL = 96 and USL = 104. If μ = 99 and σ = 1, then:

Cp = (104 - 96) / 6 = 1.33

CPU = (104 - 99) / 3 = 1.67

CPL = (99 - 96) / 3 = 1.00

Cpk = min(1.67, 1.00) = 1.00
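
A minimal sketch that reproduces the arithmetic above (LSL = 96, USL = 104, μ = 99, σ = 1):

    # Sketch: the capability indices defined above, applied to the GBP numbers.
    def capability(lsl, usl, mu, sigma):
        cp = (usl - lsl) / (6 * sigma)
        cpu = (usl - mu) / (3 * sigma)
        cpl = (mu - lsl) / (3 * sigma)
        return cp, cpu, cpl, min(cpu, cpl)

    cp, cpu, cpl, cpk = capability(lsl=96, usl=104, mu=99, sigma=1)
    print(f"Cp={cp:.2f}  CPU={cpu:.2f}  CPL={cpl:.2f}  Cpk={cpk:.2f}")
    # -> Cp=1.33  CPU=1.67  CPL=1.00  Cpk=1.00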

These indices are not altogether satisfactory because, for them to be meaningful, it is necessary to have a normally distributed process (Cp also requires that the process be unbiased). There is considerable debate in the quality control community about their worth, and care must be taken with their use.

If the Cp of a stable, normally distributed, and unbiased process is equal to unity, then the probability of producing a conformable product is 99.7% -- a defect rate of about three per thousand. If the Cp were 1.67, each specification limit would be 5σ from the process mean, and the defect rate would be only about one per million. A defect rate -- or, equivalently, the percent conformable -- is what the above indices are attempting to express and is a much more understandable measure. In addition, it can be applied to any biased or non-normal distribution as long as the distribution can be adequately estimated.
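
Under the normality and zero-bias assumptions just mentioned, the implied defect rate can be computed from Cp alone; a short sketch (the mapping is a standard normal-tail calculation, not a formula given in this report):

    # Sketch: defect rate implied by Cp for a stable, unbiased, normally
    # distributed process.  The two-sided tail area beyond +/- 3*Cp standard
    # deviations is the expected fraction of nonconforming product.
    import math

    def defects_per_million(cp):
        return 1e6 * math.erfc(3 * cp / math.sqrt(2))

    for cp in (1.00, 1.33, 1.67):
        print(f"Cp={cp:.2f}: about {defects_per_million(cp):,.2f} per million")
    # Cp=1.00 gives roughly 2,700 per million (about three per thousand);
    # Cp=1.67 gives well under one per million.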

In the above discussion it was tacitly assumed that, as far as the customer is concerned, a product whose value falls at or near one of the specification limits is just as good as one falling exactly at the nominal value. Although this is certainly true in many cases, it seems unreasonable that the worth of a product suddenly jumps from totally satisfactory to worthless as its value crosses a specification limit. Taguchi (and others before him) has suggested that the worth of a particular product item be degraded as a function of the distance between the product value and the specification target and, in particular, that a quadratic loss function be used -- that is, that the loss in worth be proportional to the square of that distance. Even though this form of the loss function is not always appropriate, the idea that getting product values closer and closer to the target value is a worthwhile goal for both the producer and the customer can do and has done much to promote high quality in production processes.
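
A minimal sketch of such a quadratic loss function (the scale constant k is an assumption, chosen here so that the loss is one dollar at the edge of a four-yard tolerance):

    # Sketch: Taguchi-style quadratic loss -- loss grows with the square of
    # the distance from the target.  The constant k is an illustrative choice.
    def quadratic_loss(value, target=100.0, k=1.0 / 4.0**2):
        return k * (value - target) ** 2

    for y in (100, 101, 102, 104, 106):
        print(f"value={y:3d} yards: loss = ${quadratic_loss(y):.2f}")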

6.5 MAINTAINING AND IMPROVING A STABLE PROCESS.

Once a stable process has been created, the stability must be maintained and continuing effort made to reduce the common variation. Reducing the common variation is beneficial to both producer and customer, primarily by the reduction or elimination of defective products. Whether a defective item is found by the producer or by the customer, it must be fixed or replaced, and this creates a direct and/or indirect cost to both parties.

The maintenance of the stability of a process uses the same tools as those for the creation of the stability in the first place. Instability might occur for any of several reasons -- wearout of the machinery, changes in personnel, changes in the nature of an input material, complacency setting in after initial stability has been achieved, environmental changes, etc. The possibility of instability returning is lessened if common variation reduction is pursued, because of the required active surveillance of the process.

One of the major pitfalls to be avoided with a stable process is an overzealous attempt to reduce the variability by frequent process adjustments. If the process is truly stable, this action -- called tampering -- is guaranteed to produce just the opposite effect and increase rather than decrease the variability. Such an effect is convincingly demonstrated by the funnel experiment -- popularized by Deming -- in which marbles are dropped one at a time through an elevated funnel onto a flat surface and the stopping locations of the marbles marked. If the location of the funnel remains fixed, the experiment will be a stable process and will produce a well-defined pattern of dispersion of the marked points. If the funnel's position is changed after each marble in an attempt to obtain more uniformity in the stopping positions (a fairly natural impulse), the variation will inevitably increase, possibly without bound, depending on the nature of the tampering rule used.
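
A simulation sketch of the funnel experiment (the tampering rule used here -- move the funnel by the negative of the last error -- is one simple adjustment rule; the numbers are simulated, not taken from this report):

    # Sketch: the funnel experiment.  A stable process left alone is compared
    # with the same process "tampered" with after every drop.
    import random
    import statistics

    random.seed(1)
    noise = [random.gauss(0.0, 1.0) for _ in range(2000)]   # common-cause variation

    untouched = noise                      # rule 1: leave the funnel alone

    tampered, funnel = [], 0.0             # rule 2: compensate for the last miss
    for e in noise:
        landing = funnel + e
        tampered.append(landing)
        funnel -= landing                  # shift the funnel by minus the error

    print("std dev, funnel left alone:", round(statistics.stdev(untouched), 2))
    print("std dev, funnel adjusted:  ", round(statistics.stdev(tampered), 2))
    # Adjusting roughly multiplies the standard deviation by the square root of 2.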

GBP: If Gus has developed a stable process, his attempts to adjust it by hitting the ball softer or harder, changing his stance, and so forth will almost certainly increase rather than decrease the variation in the stopping distance of a ball.

The total common variation of a process is almost always due to several sources. Reduction of that variation first requires identifying some of these sources and then estimating the variation produced by each of them. This program sounds similar to that employed for changing an unstable process to a stable one, but a major difference is that common causes of variation do not produce distinctive patterns in control charts, whereas special causes usually do. Consequently, the identification of common causes and the measurement of their effects usually requires experiments with the process in which one or more of the process variables (input materials, operators, machines, etc.) are altered in a systematic manner, and the results analyzed. Although studying one cause at a time can lead to improvements in many cases, the possibility of interaction between causes, and the desire to obtain optimum results over several causes without excessive experimental costs, point to the need for statistically designed experiments. The father of such experiments was R. A. Fisher, whose work in the 1920s and 1930s laid both the theoretical and practical foundations of these experimental methods. Over the past twenty years or so, the name of Genichi Taguchi has become well-known largely because of his work in this area.

GBP: In his efforts to reduce the variation in his golf game (which has become stable), Gus has identified several possible sources of variation, including golf ball brand, club design, air temperature and humidity, and the color of his socks. Without going into technical detail, we suggest that he visit his neighborhood statistician to help set up a suitable experimental design which will permit Gus to investigate the effects of all of these factors simultaneously. This will minimize the time and cost of the experiment and still produce results which include the effects for both the individual factors and their interactions, consequently guiding Gus towards an optimized game.
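
Without presuming to do the statistician's job, here is a sketch of what the simplest such design -- a two-level full factorial over Gus's factors -- would enumerate; the factor levels are invented, and a real study might well use a fractional design instead:

    # Sketch: enumerating a 2-level full factorial design for four factors.
    from itertools import product

    factors = {
        "ball brand":  ["A", "B"],
        "club design": ["blade", "cavity"],
        "weather":     ["cool/dry", "warm/humid"],
        "sock color":  ["white", "argyle"],
    }

    runs = list(product(*factors.values()))
    print(f"{len(runs)} experimental runs")        # 2**4 = 16 combinations
    for run in runs[:4]:                           # show the first few runs
        print(dict(zip(factors.keys(), run)))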

6.6 ACCEPTANCE SAMPLING.

Goods or services enter an organization from a vendor or are passed along internally from department to department and may be subjected to inspection procedures in an effort to both receive and produce high quality goods and thus reduce total production costs. Three alternatives for inspection exist: (1) no inspection; (2) 100% inspection (inspect everything with the goal of weeding out all defective items); and (3) acceptance sampling (inspect a sample of the goods to determine if the remainder should be accepted, rejected, or screened). Historically, acceptance sampling has been used if the inspection test is destructive or the cost of 100% inspection is excessive.

The purpose of acceptance sampling is to determine the disposition of goods or services (accept, reject, or screen). This is accomplished by selecting that disposition which minimizes the cost of inspection to achieve a desired level of quality (% acceptable). Although acceptance sampling has a long history, its use has come under attack in recent years by Deming and others, the argument being that if the production process is stable, both a sample and the remainder of the sampled group of products are independent random samples of the process and, consequently, the number of defects in the sample indicates nothing about the number in the remainder; the only reasonable sampling procedures, therefore, are either 0% or 100%.

Whether or not this view will prevail remains to be seen, but it is unlikely that acceptance sampling will totally vanish from the scene.

To choose between the two alternatives of 0% and 100%, Deming has proposed using the kp rule. If we let p be the proportion of defects produced by the process, k1 be the cost to initially inspect one item, and k2 be the cost to dismantle, repair, and reassemble a good or service that fails because a defective item was used in its production, then the inspection rules are:

1. If k1/k2 is greater than p, then use 0% inspection. There is little risk associated with a defective item.

2. If k1/k2 is less than p, then use 100% inspection. There is great risk associated with a defective item.

3. If k1/k2 equals p, choose either 0% or 100% inspection, but if there is any serious doubt, for safety's sake choose 100%.

A major problem in applying the kp rule is the difficulty of determining values for k1 and k2, but even rough estimates may be sufficient to choose an appropriate course of action.
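
A minimal sketch of the rule as stated above (the cost figures are invented):

    # Sketch: the kp rule -- compare k1/k2 with the defect proportion p.
    def kp_rule(p, k1, k2):
        ratio = k1 / k2
        if ratio > p:
            return "0% inspection (little risk from a defective item)"
        if ratio < p:
            return "100% inspection (great risk from a defective item)"
        return "break-even: either way, but choose 100% if in serious doubt"

    # Illustrative numbers: 2% defective, $0.50 to inspect, $100 to repair later.
    print(kp_rule(p=0.02, k1=0.50, k2=100.0))      # 0.005 < 0.02 -> 100% inspection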

It is important to note that 0% inspection does not mean that no information is available about the process, because small samples should always be taken to provide data for maintaining control charts.

6.7 IN-PROCESS MEASUREMENT AND CONTROL.

Inspection, including dimensional inspection, has commonly been an activity performed after, rather than during, a manufacturing step or process. In many instances, several steps may be performed before a part is measured. If a part is found to deviate from the manufacturing tolerances, it must be either reworked or rejected at a point where considerable value has been added to it, but in either case, the manufacturing cost is boosted.

Because of increasing competitive pressures, the American industrial community is turning to means of reducing manufacturing costs while improving quality. In addition to approaches such as increased automation and the use of robots, more subtle methods that work hand in hand with automation are being implemented. One such approach is the combination of in-process measurement and control. In-process measurement is significant in that it ultimately allows a manufacturer to achieve a goal of zero scrap, since deviations in the manufacturing process which are measured by sensors can be used in a corrective manner to control the process before tolerances are exceeded. (Note that this is not tampering because the corrections are made to ensure that the current item is satisfactory rather than to hope that the next one will be.) Advances in sensor technology and digital computers and controllers are permitting a dramatic increase in the implementation of these ideas.
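
A toy sketch of the idea (the process model, drift, and numbers are all invented): a sensor reading on each piece drives a correction to the machine setting well before the tolerance is reached.

    # Sketch: in-process measurement driving a corrective adjustment.
    nominal, tolerance = 50.0, 0.5      # target dimension and allowed deviation
    adjustment = 0.0                    # controllable offset applied to later pieces

    raw_drift = [50.05, 50.12, 50.21, 50.33, 50.41]   # simulated tool wear

    for piece, raw in enumerate(raw_drift, start=1):
        measured = raw + adjustment
        deviation = measured - nominal
        status = "OK" if abs(deviation) <= tolerance else "REWORK"
        print(f"piece {piece}: deviation={deviation:+.2f}  {status}")
        if abs(deviation) > 0.3 * tolerance:          # act well inside the limit
            adjustment -= deviation                   # correct subsequent pieces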

SECTION 7

TOTAL QUALITY CONTROL

Total Quality Control (TQC) is essentially a philosophy of production management which espouses the involvement of management at all levels in the task of quality control. Names of prominent people involved in the practice and/or teaching of TQC include Philip Crosby, W. Edwards Deming, Armand Feigenbaum, Kaoru Ishikawa, and Joseph Juran.

As an example of the ideas, the well-known 14 points of Deming (taken from the 1986 edition of his book Out of the Crisis) are:

1. Create constancy of purpose toward improvement of product and service, with the aim to become competitive and to stay in business and to provide jobs.

2. Adopt the new philosophy. We are in a new economic age. Western management must awaken to the challenge, must learn their responsibilities, and take on leadership for change.

3. Cease dependence on inspection to achieve quality. Eliminate the need for inspection on a mass basis by building quality into the product in the first place.

4. End the practice of awarding business on the basis of price tag. Instead, minimize total cost. Move toward a single supplier for any one item, on a long-term relationship of loyalty and trust.

5. Improve constantly and forever the system of production and service, to improve quality and productivity, and thus constantly decrease costs.

6. Institute training on the job.

7. Institute leadership. The aim of supervision should be to help people, machines, and gadgets to do a better job. Supervision of management is in need of overhaul as well as supervision of production workers.

8. Drive out fear, so that everyone may work effectively for the company.

9. Break down barriers between departments. People in research, design, sales, and production must work as a team to foresee problems of production and in use that may be encountered with the product or service.

10. Eliminate slogans, exhortations, and targets for the work force asking for zero defects and new levels of productivity. Such exhortations only create adversarial relationships, as the bulk of the causes of low quality and low productivity belong to the system and thus lie beyond the power of the work force.

11a. Eliminate work standards (quotas) on the factory floor. Substitute leadership.

11b. Eliminate management by objective. Eliminate management by numbers, numerical goals. Substitute leadership.

12a. Remove barriers that rob the hourly worker of his right to pride of workmanship. The responsibility of supervisors must be changed from sheer numbers to quality.

12b. Remove barriers that rob people in management and in engineering of their right to pride of workmanship. This means, inter alia, abolishment of the annual or merit rating and of management by objective.

13. Institute a vigorous program of education and self-improvement.

14. Put everybody in the company to work to accomplish the transformation. The transformation is everybody's job.

SECTION 8

TAGUCHI METHODS

Genichi Taguchi is a Japanese engineer who has been active in the improvement of Japan's industrial products and processes since the late 1940s. He has developed both a philosophy and a methodology for the process of quality improvement that depends heavily upon statistical concepts and tools, especially statistically designed experiments. Many Japanese firms have achieved great success by applying his methods, and it has been reported that thousands of engineers have performed tens of thousands of experiments based on his teaching.

Although he has given seminars in the U.S. since the early 1970s, it wasn't until about 1980 that American companies (including AT&T, Ford, and Xerox) actively began applying his ideas. While much attention has focused on Taguchi's statistical methods, the heart of his message has more to do with his conceptual framework for the process of quality improvement. The following seven points summarize the elements of this message:

1. An important dimension of the quality of a manufactured product is the total loss to society generated by that product.

2. In a competitive economy, continuous quality improvement and cost reduction are necessary for staying in business.

3. A continuous quality improvement program includes incessant reduction in the variation of product performance characteristics about their target values.

4. The customer's loss due to a product's performance variation is often approximately proportional to the square of the distance of the performance characteristic from its target value.

5. The final quality and cost of a manufactured product are determined to a large extent by the engineering designs of the product and its manufacturing process.

6. A product's (or process') performance variation can be reduced by exploiting the nonlinear effects of the product (or process) parameters on the performance characteristics.

7. Statistically designed experiments can be used to identify the settings of product (and process) parameters that reduce performance variation.

Most of these points have been made by others in the past, but Taguchi has been particularly successful in integrating them and demonstrating their effectiveness.

SECTION 9

COMPUTER USE

There are several ways in which computers are involved in process control, and their use is becoming more and more significant as an increasingly large number of commercial computer programs become available. The list below shows in parentheses the number of these programs which are appropriate for each of the listed categories and which were included in the 1991 yearly summary of computer programs in Quality Progress, a publication of the American Society for Quality Control. (Many of the individual programs are included in more than one category):

Calibration (34)
Capability Studies (85)
Data Acquisition and Reporting (71)
Design of Experiments (51)
Gage Repeatability and Reproducibility (39)
Inspection (60)
Management (74)
Measurement (74)
Problem Solving (64)
Quality Assurance for Software Development (11)
Quality Costs (39)
Quality Function Deployment (5)
Reliability (47)
Sampling (49)
Simulation (14)
Statistical Methods (82)
Statistical Process Control (103)
Supplier Quality Assurance (49)
Taguchi Techniques (24)
Training (52)
Other (40)

SECTION 10

STANDARDS

Companies in the United States have long had standards to describe what a quality system should be. Some of these have been regulatory standards such as the Good Manufacturing Practices for the Food and Drug Administration and military standards such as MIL-Q-9858A (quality program requirements), MIL-STD-45662A (creating and maintaining a calibration system for measurement and test equipment), MIL-STD-414 (controlling the fraction of incoming material that does not conform to specifications for variable data), and MIL-STD-105E (lot-by-lot acceptance sampling plans). In recent years, the International Organization for Standardization (ISO) -- a specialized international agency for standardization composed of the national standards bodies of 91 countries -- has developed the ISO 9000 series of international standards.

The standards in the ISO 9000 series were developed by Technical Committee 176 of the ISO and approved in their present form in 1987.

Concurrently, the American Society for Quality Control cooperated in adopting the ISO 9000 series for use in the United States, and this resulted in the Q90 series.

There are five standards in the ISO 9000 series:

The ISO 9000 standard provides some basic definitions and concepts and summarizes how to select and use the other standards in the series.

The ISO 9001, 9002, and 9003 standards are for use in external quality assurance purposes for contractual situations:

ISO 9001 is to be used to ensure conformance to specified requirements during design and development, production, installation, and servicing.

ISO 9002 is used when only production and installation conformance is to be ensured.

ISO 9003 is the least detailed standard and requires only that conformance in final test and inspection be ensured.

ISO 9004 contains guidance on the technical, administrative, and human factors affecting the quality of products and services. This standard is for internal use only and is not to be used in contractual situations. The standard lists the essential elements that make up a quality system in some detail, starting with the responsibilities of management. Whole sections are devoted to each aspect of the quality system: marketing, design, procurement, production, measurement, post-production, materials control, documentation, safety, and use of statistical methods. This standard could be used to evaluate a company's progress toward a fully implemented quality system.

The ISO 9000 series of standards has been adopted by British companies as a tool to establish whether their suppliers have been and are using a quality system that will ensure their ability to meet their product and service requirements. This requires that the supplier be audited and then registered (if it passes the audit) as an ISO 9000 supplier. All other companies will accept this registration with little or no further auditing. Companies wishing to be suppliers to British companies are being required to become registered under the appropriate ISO standard. By 1990, more than 30 U.S. companies were registered by the British Standards Institute or other accredited registration agencies. Starting in 1992, companies in the integrated European Community will require their suppliers to be audited and registered under the European EN 29000 series of standards. These, like the Q90 series, are technically identical to the ISO 9000 series.

SECTION 11

LIST OF REFERENCES

The reference list below is not intended to be exhaustive, but includes those books and articles that have been consulted in the preparation of this paper or have been particularly recommended.

TEXTS:

AT&T, Statistical Quality Control Handbook. Indianapolis, Ind.: AT&T, 1956.

Box, G.E.P., J. S. Hunter, and W. G. Hunter, Statistics for Experimenters. New York: John Wiley & Sons, 1978.

Deming, W. Edwards, Out of the Crisis. Cambridge, Mass.: MIT Center for Advanced Engineering Study, 1986.

Feigenbaum, A. V., Total Quality Control. New York: McGraw-Hill, 1980.

Gitlow, H., S. Gitlow, A. Oppenheim, and R. Oppenheim, Tools and Methods for the Improvement of Quality. Boston, Mass.: Richard D. Irwin, Inc., 1989.

Hart, M. K., and R. F. Hart, Quantitative Methods for Quality and Productivity Improvement. Milwaukee, Wis.: ASQC Quality Press, 1989.

Ishikawa, K., Guide to Quality Control. Tokyo, Japan: Asian Productivity Organization, 1976.

Juran, J., Quality Control Handbook. New York: McGraw-Hill, 1979.

Montgomery, D.C., Introduction to Statistical Quality Control. New York: John Wiley & Sons, 1985.

Murphy, S. D. (editor), In-Process Measurement and Control. New York: Marcel Dekker, Inc., 1990.

Oakland, J. S., Statistical Process Control. New York: John Wiley & Sons, 1988.

Ott, E. R., Process Quality Control. New York: McGraw-Hill, 1975.

Taguchi, G. and Y. Wu, Introduction to Off-Line Quality Control. Nagoya, Japan: Central Japan Quality Control Association, 1980.

Williams, R. L. and Gateley, W. Y., The Meaning and Utility of Confidence. Colorado Springs: Kaman Sciences Corporation, 1992.

Winokur, P. S., et al., Statistical Process Control for QML Radiation Hardness Assurance. Albuquerque: Sandia National Laboratories, 1990.

PERIODICALS:

Journal of Quality Technology (formerly Industrial Quality Control), published by the American Society for Quality Control.

Quality Engineering, published by the American Society for Quality Control.

Quality Progress, published by the American Society for Quality Control.

Technometrics, published jointly by the American Statistical Association and the American Society for Quality Control.

APPENDIX A

GLOSSARY OF ACRONYMS, TERMS, AND DEFINITIONS

Acceptable Quality Level - When a continuing series of lots is considered, a quality level that, for the purposes of sampling inspection, is the limit of a satisfactory process average.

Acceptance Sampling - Inspection of a sample from a lot to decide whether to accept or not accept that lot. There are two types: attributes sampling and variables sampling. In attributes sampling, the presence or absence of a characteristic is noted in each of the units inspected. In variables sampling, the numerical magnitude of a characteristic is measured and recorded for each inspected unit; this involves reference to a continuous scale of some kind.

Acceptance Sampling Plan - A specific plan that indicates the sampling sizes and the associated acceptance or non-acceptance criteria to be used. In attributes sampling, for example, there are single, double, multiple, sequential, chain, and skip-lot sampling plans. In variables sampling, there are single, double, and sequential sampling plans.

Accreditation - Certification by a duly recognized body of the facilities, capability, objectivity, competence, and integrity of an agency, service, or operational group or individual to provide the specific service or operation needed. The Registration Accreditation Board accredits those organizations that register companies to the ISO 9000 series standards.

American Quality Foundation - An independent, nonprofit entity formed under the auspices of the American Society for Quality Control (ASQC). Its mission is to involve business leaders and public policy analysts in strengthening the future of quality. Acting as a research and development function of ASQC, the foundation develops programs to promote and support efforts that improve U.S. business's ability to compete.

American Society for Quality Control - A professional, not-for-profit association that develops, promotes, and applies quality-related information and technology for the private sector, government, and academia. The Society serves more than 96,000 individual and 700 corporate members in the United States and 63 other countries.

Analysis of Means (ANOM) - A statistical procedure for troubleshooting industrial processes and analyzing the results of experimental designs with factors at fixed levels. It provides a graphical display of data. Ellis R. Ott developed the procedure in 1967 because he observed that nonstatisticians had difficulty understanding analysis of variance. Analysis of means is easier for quality practitioners to use because it is an extension of the control chart. In 1973, Edward G. Schilling further extended the concept, enabling analysis of means to be used with non-normal distributions and attributes data where the normal approximation to the binomial distribution does not apply. This is referred to as analysis of means for treatment effects.

Analysis of Variance (ANOVA) - A basic statistical technique for analyzing experimental data. It subdivides the total variation of a data set into meaningful component parts associated with specific sources of variation in order to test a hypothesis on the parameters of the model or to estimate variance components. There are three models: fixed, random, and mixed.

ANOM - Analysis of Means

ANOVA - Analysis of Variance

AOQ - Average Outgoing Quality

AOQL - Average Outgoing Quality Limit

ASQC - American Society for Quality Control

Attribute Data - Go/no-go information. The control charts based on attribute data include percent chart, number of affected units chart, count chart, count-per-unit chart, quality score chart, and demerit chart.

Availability - The ability of a product to be in a state to perform its designated function under stated conditions at a given time. Availability can be expressed by the ratio uptime/(uptime + downtime), uptime being when the product is operative (in active use and in standby state) and downtime being when the product is inoperative (while under repair, awaiting spare parts, etc.).

Average Chart - A control chart in which the subgroup average, X-bar, is used to evaluate the stability of the process level.

Average Outgoing Quality - The expected average quality level of outgoing quality over all possible levels of incoming quality for a given acceptance sampling plan and disposal specification.

Benchmarking - An improvement process in which a company measures its performance against that of best-in-class companies, determines how those companies achieved their performance levels, and uses the information to improve its own performance. The subjects that can be benchmarked include strategies, operations, processes, and procedures.

Big Q, Little Q - A term used to contrast the difference between managing for quality in all business processes and products (big Q) and managing for quality in a limited capacity - traditionally in only factory products and processes (little Q).

Blemish - An imperfection that is severe enough to be noticed but should not cause any real impairment with respect to intended normal or reasonably foreseeable use. (See also defect, imperfection, and nonconformity.)

Block Diagram - A diagram that shows the operation, interrelationships, and interdependencies of components in a system. Boxes, or blocks (hence the name), represent the components; connecting lines between the blocks represent interfaces. There are two types of block diagrams: a functional block diagram, which shows a system's subsystems and lower-level products, their interrelationships, and interfaces with other systems; and a reliability block diagram, which is similar to the functional block diagram except that it is modified to emphasize those aspects influencing reliability.

Brainstorming - A technique that teams use to generate ideas on a particular subject. Each person in the team is asked to think creatively and write down as many ideas as possible. The ideas are not discussed or reviewed until after the brainstorming session.

C Chart - Count Chart

Calibration - The comparison of a measurement instrument or system of unverified accuracy to a measurement instrument or system of a known accuracy to detect any variation from the required performance specification.

Cause-And-Effect Diagram - A tool for analyzing process dispersion. It is also referred to as the Ishikawa diagram, because Kaoru Ishikawa developed it, and the fishbone diagram, because the complete diagram resembles a fish skeleton. The diagram illustrates the main causes and subcauses leading to an effect (symptom). The cause-and-effect diagram is one of the seven tools of quality.

Check Sheet - A simple data-recording device. The check sheet is custom-designed by the user, which allows him or her to readily interpret the results. The check sheet is one of the seven tools of quality. Check sheets are often confused with data sheets and checklists.

Checklist - A tool used to ensure that all important steps or actions in an operation have been taken. Checklists contain items that are important or relevant to an issue or situation. Checklists are often confused with check sheets and data sheets.

Common Causes - Causes of variation that are inherent in a process over time. They affect every outcome of the process and everyone working in the process.

Company Culture - A system of values, beliefs, and behaviors inherent in a company. To optimize business performance, top management must define and create the necessary culture.

Conformance - An affirmative indication or judgment that a product or service has met the requirements of a relevant specification, contract, or regulation.

Continuous Improvement - The ongoing improvement of products, services, or processes through incremental and breakthrough improvements.

Control Chart - A chart with upper and lower control limits on which values of some statistical measure for a series of samples or subgroups are plotted. The chart frequently shows a central line to help detect a trend of plotted values toward either control limit.

Corrective Action - The implementation of solutions resulting in the reduction or elimination of an identified problem.

Cost of Poor Quality - The costs associated with providing poor-quality products or services. There are four categories of costs: internal failure costs (costs associated with defects found before the customer receives the product or service), external failure costs (costs associated with defects found after the customer receives the product or service), appraisal costs (costs incurred to determine the degree of conformance to quality requirements), and prevention costs (costs incurred to keep failure and appraisal costs to a minimum).

Cost of Quality - A term coined by Philip Crosby referring to the cost of poor quality.

Count Chart - A control chart for evaluating the stability of a process in terms of the count of events of a given classification occurring in a sample.

Count-Per-Unit Chart - A control chart for evaluating the stability of a process in terms of the average count of events of a given classification per unit occurring in a sample.

Cp - A widely used process capability index. It is expressed as (upper specification limit - lower specification limit)/(6σ).

Cpk - A widely used process capability index. It is expressed as |μ - nearer specification limit|/(3σ).

Cross Plot - See Scatter Diagram.

Cumulative Sum Control Chart - A control chart on which the plotted value is the cumulative sum of deviations of successive samples from a target value. The ordinate of each plotted point represents the algebraic sum of the previous ordinate and the most recent deviations from the target.

Customer - See External Customer and Internal Customer.

Customer Delight - The result of delivering a product or service that exceeds customer expectations.

Customer Satisfaction - The result of delivering a product or service that meets customer requirements.

Customer-Supplier Partnership - A long-term relationship between a buyer and a supplier characterized by teamwork and mutual confidence. The supplier is considered an extension of the buyer's organization. The partnership is based on several commitments. The buyer provides long-term contracts and uses fewer suppliers. The supplier implements quality assurance processes so that incoming inspection can be minimized. The supplier also helps the buyer reduce costs and improve product and process designs.

D Chart - Demerit Chart

Decision Matrix - A matrix used by teams to evaluate problems or possible solutions. After a matrix is drawn to evaluate possible solutions, for example, the team lists them in the far-left vertical column. Next, the team selects criteria to rate the possible solutions, writing them across the top row. Third, each possible solution is rated on a scale of 1 to 5 for each criterion and the rating recorded in the corresponding grid. Finally, the ratings of all the criteria for each possible solution are added to determine its total score. The total score is then used to help decide which solution deserves the most attention.

Defect - A product's or service's nonfulfillment of an intended requirement or reasonable expectation for use, including safety considerations. There are four classes of defects: Class 1, Very Serious, leads directly to severe injury or catastrophic economic loss; Class 2, Serious, leads directly to significant injury or significant economic loss; Class 3, Major, is related to major problems with respect to intended normal or reasonably foreseeable use; and Class 4, Minor, is related to minor problems with respect to intended normal or reasonably foreseeable use.

Demerit Chart - A control chart for evaluating a process in terms of a demerit (or quality score), i.e., a weighted sum of counts of various classified nonconformities.

Deming Cycle - See Plan-Do-Check-Act Cycle.

Dependability - The degree to which a product is operable and capable of performing its required function at any randomly chosen time during its specified operating time, provided that the product is available at the start of that period. (Nonoperation-related influences are not included.) Dependability can be expressed by the ratio (time available)/(time available + time required).

Design of Experiments - A branch of applied statistics dealing with planning, conducting, analyzing, and interpreting controlled tests to evaluate the factors that control the value of a parameter or group of parameters.

Diagnostic Journey And Remedial Journey - A two-phase investigation used by teams to solve chronic quality problems. In the first phase - the diagnostic journey - the team journeys from the symptom of a chronic problem to its cause. In the second phase - the remedial journey - the team journeys from the cause to its remedy.

Dodge-Romig Sampling Plans - Plans for acceptance sampling developed by Harold F. Dodge and Harry G. Romig. Four sets of tables were published in 1940: single-sampling lot tolerance tables, double-sampling lot tolerance tables, single-sampling average outgoing quality limit tables, and double-sampling average outgoing quality limit tables.

DOE - Design of Experiments

80-20 - A term referring to the Pareto principle, which was first defined by J.M. Juran in 1950. The principle suggests that most effects come from relatively few causes; that is, 80% of the effects come from 20% of the possible causes.

Employee Involvement - A practice within an organization whereby employees regularly participate in making decisions on how their work areas operate, including making suggestions for improvement, planning, goal setting, and monitoring performance.

Empowerment - A condition whereby employees have the authority to make decisions and take action in their work areas without prior approval. For example, an operator can stop a production process if he detects a problem, or a customer service representative can send out a replacement product if a customer calls with a problem.

Experimental Design - A formal plan that details the specifics for conducting an experiment, such as which responses, factors, levels, blocks, treatments, and tools are to be used.

External Customer - A person or organization who receives a product, a service, or information but is not part of the organization supplying it.

Failure Mode Analysis - A procedure to determine which malfunction symptoms appear immediately before or after a failure of a critical parameter in a system. After all the possible causes are listed for each symptom, the product is designed to eliminate the problems.

Failure Mode Effects Analysis - A procedure in which each potential failure mode in every sub-item of an item is analyzed to determine its effect on other sub-items and on the required function of the item.

Failure Mode Effects And Criticality Analysis - A procedure that is performed after a failure mode effects analysis to classify each potential failure effect according to its severity and probability of occurrence.

Fishbone Diagram - See Cause-And-Effect Diagram

Fitness For Use - A term used to indicate that a product or service fits the customer's defined purpose for that product or service.

Flowchart - A graphical representation of the steps in a process. Flowcharts are drawn to better understand processes. The flowchart is one of the seven tools of quality.

FMA - Failure Mode Analysis

FMEA - Failure Mode Effects Analysis

FMECA - Failure Mode Effects And Criticality Analysis

Force Field Analysis - A technique for analyzing the forces that will aid or hinder an organization in reaching an objective. An arrow pointing to an objective is drawn down the middle of a piece of paper. The factors that will aid the objective's achievement (called the driving forces) are listed on the left side of the arrow; the factors that will hinder its achievement (called the restraining forces) are listed on the right side of the arrow.

14 Points - W. Edwards Deming's 14 management practices to help companies increase their quality and productivity: 1) create constancy of purpose for improving products and services, 2) adopt the new philosophy, 3) cease dependence on inspection to achieve quality, 4) end the practice of awarding business on price alone; instead, minimize total cost by working with a single supplier, 5) improve constantly and forever every process for planning, production, and service, 6) institute training on the job, 7) adopt and institute leadership, 8) drive out fear, 9) break down barriers between staff areas, 10) eliminate slogans, exhortations, and targets for the work force, 11) eliminate numerical quotas for the work force and numerical goals for management, 12) remove barriers that rob people of pride of workmanship and eliminate the annual rating or merit system, 13) institute a vigorous program of education and self-improvement for everyone, and 14) put everybody in the company to work to accomplish the transformation.

Funnel Experiment - An experiment that demonstrates the effects of tampering. Marbles are dropped through a funnel in an attempt to hit a flat-surfaced target below. The experiment shows that adjusting a stable process to compensate for an undesirable result or an extraordinarily good result will produce output that is worse than if the process had been left alone.

Gantt Chart - A type of bar chart used in process planning and control to display planned work and finished work in relation to time.

Gauge Repeatability and Reproducibility - The evaluation of a gauging instrument's accuracy by determining whether the measurements taken with it are repeatable (i.e., there is close agreement among a number of consecutive measurements of the output for the same value of the input under the same operating conditions) and reproducible (i.e., there is close agreement among repeated measurements of the output for the same value of input made under the same operating conditions over a period of time).

GDT - Geometric Dimensioning and Tolerancing

Geometric Dimensioning and Tolerancing - A method to minimize production costs by showing the dimension and tolerancing on a drawing while considering the functions or relationships of part features.

GO/NO-GO - State of a unit or product. Two parameters are possible: go (conforms to specification) and no-go (does not conform to specification).

GR&R - Gauge Repeatability And Reproducibility

Histogram - A graphic summary of variation in a set of data. The pictorial nature of the histogram lets people see patterns that are difficult to see in the simple table of numbers. The histogram is one of the seven tools of quality.

Hoshin Planning - Breakthrough planning. A Japanese strategic planning process in which a company develops up to four vision statements that indicate where the company should be in the next five years. Company goals and work plans are developed based on the vision statements. Periodic audits are then conducted to monitor progress.

Imperfection - A quality characteristic's departure from its intended level or state without any association to conformance to specification requirements or to the usability of a product or service.

In-Control Process - A process in which the statistical measure being evaluated is in a state of statistical control (i.e., the variations among the observed sampling results can be attributed to a constant system of chance causes).

Inspection - Measuring, examining, testing, or gauging one or more characteristics of a product or service and comparing the results with specified requirements to determine whether conformity is achieved for each characteristic.

Instant Pudding - A term used to illustrate an obstacle to achieving quality; the supposition that quality and productivity improvement is achieved quickly through an affirmation of faith rather than through sufficient effort and education. W. Edwards Deming used this term - which was initially coined by James Bakken of the Ford Motor Co. - in his book Out of the Crisis.

Internal Customer - The recipient (person or department) of another person's or department's output (product, service, or information) within an organization.

Ishikawa Diagram - See Cause-And-Effect Diagram

ISO 9000 Series Standards - A set of five individual but related international standards on quality management and quality assurance developed to help companies effectively document the quality system elements to be implemented to maintain an efficient quality system. The standards, initially published in 1987, are not specific to any particular industry, product, or service. The standards were developed by the International Organization for Standardization (ISO), a specialized international agency for standardization composed of the national standards bodies of 91 countries.

JIT Manufacturing - Just-In-Time Manufacturing

Just-In-Time Manufacturing - An optimal material requirement planning system for a manufacturing process in which there is little or no manufacturing material inventory on hand at the manufacturing site and little or no incoming inspection.

Kaizen - A Japanese term that means gradual unending improvement by doing little things better and setting and achieving increasingly higher standards. The term was made famous by Masaaki Imai in his book Kaizen: The Key to Japan's Competitive Success.

LCL - Lower Control Limit

Leadership - An essential part of a quality improvement effort. Organization leaders must establish a vision, communicate that vision to those in the organization, and provide the tools and knowledge necessary to accomplish the vision.

Lot - A defined quantity of product accumulated under conditions that are considered uniform for sampling purposes.

Lower Control Limit - Control limit for points below the central line in a control chart.

Maintainability - The probability that a given maintenance action for an item under given usage conditions can be performed within a stated time interval when the maintenance is performed under stated conditions using stated procedures and resources. Maintainability has two categories: serviceability (the ease of conducting scheduled inspections and servicing) and repairability (the ease of restoring service after a failure).

Mean Time Between Failures - The average time interval between failures for repairable product for a defined unit of measure (e.g., operating hours, cycles, miles).

MIL-Q-9858A - A military standard that describes quality program requirements.

MIL-STD-105E - A military standard that describes the sampling procedures and tables for inspection by attributes.

MIL-STD-45662A - A military standard that describes the requirements for creating and maintaining a calibration system for measurement and test equipment.

MTBF - Mean Time Between Failures

Multivariate Control Chart - A control chart for evaluating the stability of a process in terms of the levels of two or more variables or characteristics.

n - Sample size (the number of units in a sample).

NDE - Nondestructive Evaluation

NDT - Nondestructive Testing

Nominal Group Technique - A technique, similar to brainstorming, used by teams to generate ideas on a particular subject. Team members are asked to silently come up with as many ideas as possible, writing them down. Each member is then asked to share one idea, which is recorded. After all the ideas are recorded, they are discussed and prioritized by the group.

Nonconformity - The nonfulfillment of a specified requirement.

Nondestructive Testing and Evaluation - Testing and evaluation methods that do not damage or destroy the product being tested.

np Chart - Number of Affected Units Chart

Number of Affected Units Chart - A control chart for evaluating the stability of a process in terms of the total number of units in a sample in which an event of a given classification occurs.

OC Curve - Operating Characteristic Curve

Operating Characteristic Curve - A graph used to determine the probability of accepting lots as a function of the lot's or process's quality level when using various sampling plans. There are three types: Type A curves, which give the probability of acceptance for an individual lot coming from finite production (will not continue in the future); Type B curves, which give the probability of acceptance for lots coming from a continuous process; and Type C curves, which, for a continuous sampling plan, give the long-run percentage of product accepted during the sampling phase.

Out-Of-Control Process - A process in which the statistical measure being evaluated is not in a state of statistical control (i.e., the variations among the observed sampling results cannot be attributed to a constant system of chance causes).

Out of Spec - A term used to indicate that a unit does not meet a given specification.

p Chart - Percent Chart

Pareto Chart - A graphical tool for ranking causes from most significant to least significant. It is based on the Pareto principle, which was first defined by J.M. Juran in 1950. The principle, named after 19th-century economist Vilfredo Pareto, suggests that most effects come from relatively few causes; that is, 80% of the effects come from 20% of the possible causes. The Pareto chart is one of the seven tools of quality.

PDCA Cycle - Plan-Do-Check-Act Cycle

Percent Chart - A control chart for evaluating the stability of a process in terms of the percent of the total number of units in a sample in which an event of a given classification occurs. The percent chart is also referred to as a proportion chart.

Plan-Do-Check-Act Cycle - A four-step process for quality improvement. In the first step (plan), a plan to effect improvement is developed. In the second step (do), the plan is carried out, preferably on a small scale. In the third step (check), the effects of the plan are observed. In the last step (act), the results are studied to determine what was learned and what can be predicted. The plan-do-check-act cycle is sometimes referred to as the Shewhart cycle (because Walter A. Shewhart discussed the concept in his book Statistical Method From the Viewpoint of Quality Control) and as the Deming cycle (because W. Edwards Deming introduced the concept in Japan; the Japanese subsequently called it the Deming cycle).

Prevention Vs. Detection - A term used to contrast two types of quality activities. Prevention refers to those activities designed to prevent nonconformances in products and services. Detection refers to those activities designed to detect nonconformances already in products and services. Another term used to describe this distinction is "designing in quality vs. inspecting in quality."

Process Capability - A statistical measure of the inherent process variability for a given characteristic. The most widely accepted formula for process capability is 6 sigma.

Process Capability Index - The value of the tolerance specified for the characteristic divided by the process capability. There are several types of process capability indexes, including the widely used Cp and Cpk.
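
A minimal sketch, assuming the conventional formulas Cp = (USL - LSL) / (6 sigma) and Cpk = min(USL - mean, mean - LSL) / (3 sigma); the specification limits and process statistics below are hypothetical, not values from the report.

    # Illustrative sketch (not from the report): the widely used Cp and Cpk
    # indexes.  Cp compares the tolerance to the 6-sigma process capability;
    # Cpk also accounts for how well the process is centered within the limits.
    def cp(usl, lsl, sigma):
        return (usl - lsl) / (6 * sigma)

    def cpk(usl, lsl, mean, sigma):
        return min(usl - mean, mean - lsl) / (3 * sigma)

    # Hypothetical example: limits 10.0 +/- 0.3, process mean 10.05, sigma 0.08.
    print(cp(10.3, 9.7, 0.08))          # 1.25
    print(cpk(10.3, 9.7, 10.05, 0.08))  # about 1.04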

Product Or Service Liability - The obligation of a company to make restitution for loss related to personal injury, property damage, or other harm caused by its product or service.

Q Chart - Quality Score Chart

QFD - Quality Function Deployment

QML - Qualified Manufacturer's Line

Q90 Series - Refers to the ANSI/ASQC Q90 series of standards, which is the Americanized version of the ISO 9000 series standards. The United States adopted the ISO 9000 series as the ANSI/ASQC Q90 series in 1987.

Qualified Manufacturer's Line - A Qualified Manufacturer's Line is one which has been certified to meet quality standards such that additional sampling and testing of products from that line are not necessary. All parts produced meet required quality levels.

Quality - A subjective term for which each person has his or her own definition. In technical usage, quality can have two meanings: 1) the characteristics of a product or service that bear on its ability to satisfy stated or implied needs and 2) a product or service free of deficiencies.

Quality Assurance/Quality Control - Two terms that have many interpretations because of the multiple definitions for the words "assurance" and "control." For example, "assurance" can mean the act of making certain; "control" can mean an evaluation to indicate needed corrective responses, the act of guiding, or the state of a process in which the variability is attributable to a constant system of chance causes. (For a detailed discussion on the multiple definitions, see ANSI/ASQC Standard A3-1987, "Definitions, Symbols, Formulas, and Tables for Control Charts.") One definition of quality assurance is: all the planned and systematic activities implemented within the quality system that can be demonstrated to provide confidence that a product or service will fulfill requirements for quality. One definition for quality control is: the operational techniques and activities used to fulfill requirements for quality. Often, however, "quality assurance" and "quality control" are used interchangeably, referring to the actions performed to ensure the quality of a product, service, or process.

Quality Audit - A systematic, independent examination and review to determine whether quality activities and related results comply with planned arrangements and whether these arrangements are implemented effectively and are suitable to achieve the objectives.

Quality Circles - Quality improvement or self-improvement study groups composed of a small number of employees (10 or fewer) and their supervisor. Quality circles originated in Japan, where they are called quality control circles.

Quality Control - See Quality Assurance/Quality Control

Quality Costs - See Cost of Poor Quality

Quality Engineering - The analysis of a manufacturing system at all stages to maximize the quality of the process itself and the products it produces.

Quality Function Deployment - A structured method in which customer requirements are translated into appropriate technical requirements for each stage of product development and production. The QFD process is often referred to as listening to the voice of the customer.

Quality Loss Function - A parabolic approximation of the quality loss that occurs when a quality characteristic deviates from its target value. The quality loss function is expressed in monetary units: the cost of deviating from the target increases quadratically the farther the quality characteristic moves from the target. The formula used to compute the quality loss function depends on the type of quality characteristic being used. The quality loss function was first introduced in this form by Genichi Taguchi.
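
A minimal sketch of the nominal-is-best form of the loss function, L(y) = k(y - target)^2; the constant k and the example values below are assumptions chosen for illustration, not figures from the report.

    # Illustrative sketch (not from the report): Taguchi-style quality loss for a
    # nominal-is-best characteristic, L(y) = k * (y - target)**2.  The constant k
    # is often set from the loss A incurred at the specification limit,
    # k = A / d**2, where d is the distance from the target to that limit.
    def quality_loss(y, target, k):
        return k * (y - target) ** 2

    # Hypothetical: a $50 loss at a tolerance of +/- 0.5 units gives k = 50 / 0.5**2.
    k = 50 / 0.5 ** 2
    print(quality_loss(10.2, 10.0, k))  # $8 loss for a 0.2-unit deviation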

Quality Score Chart - A control chart for evaluating the stability of a process in terms of a quality score. The quality score is the weighted sum of the count of events of various classifications where each classification is assigned a weight.

Quality Trilogy - A three-pronged approach to managing for quality. The three legs are quality planning (developing the products and processes required to meet customer needs), quality control (meeting product and process goals), and quality improvement (achieving unprecedented levels of performance).

Quincunx - A tool that creates frequency distributions. Beads tumble over numerous horizontal rows of pins, which force the beads to the right or left. After a random journey, the beads drop into vertical slots. After many beads are dropped, a frequency distribution results. In the classroom, quincunxes are often used to simulate a manufacturing process. The quincunx was invented by English scientist Francis Galton in the 1890s.

R Chart - Range Chart

RAB - Registrar Accreditation Board

RAM - Reliability/Availability/Maintainability

Random Sampling - A commonly used sampling technique in which sample units are selected in such a manner that all combinations of n units under consideration have an equal chance of being selected as the sample.

Range Chart - A control chart in which the subgroup range, R, is used to evaluate the stability of the variability within a process.
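
A minimal sketch of how paired X-bar and R chart limits are commonly computed from subgroup data using the tabled factors A2, D3, and D4; the factor values below are the ones usually tabulated for subgroups of five and, like the sample data, are assumptions rather than material from the report.

    # Illustrative sketch (not from the report): control limits for X-bar and R
    # charts.  The factors below are the commonly tabulated values for subgroups
    # of size 5 and are assumptions, not values given in the report.
    A2, D3, D4 = 0.577, 0.0, 2.114

    def xbar_r_limits(subgroups):
        xbars = [sum(g) / len(g) for g in subgroups]
        ranges = [max(g) - min(g) for g in subgroups]
        grand_mean = sum(xbars) / len(xbars)
        r_bar = sum(ranges) / len(ranges)
        return {
            "xbar": (grand_mean - A2 * r_bar, grand_mean + A2 * r_bar),
            "range": (D3 * r_bar, D4 * r_bar),
        }

    # Hypothetical subgroups of five measurements each.
    data = [[9.9, 10.1, 10.0, 10.2, 9.8], [10.0, 10.3, 9.9, 10.1, 10.0]]
    print(xbar_r_limits(data))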

Red Bead Experiment - An experiment developed by W. Edwards Deming to illustrate that it is impossible to put employees in rank order of performance for the coming year based on their performance during the past year because performance differences must be attributed to the system, not to employees. Four thousand red and white beads (20% red) in a jar and six people are needed for the experiment. The participants' goal is to produce white beads, because the customer will not accept red beads. One person begins by stirring the beads and then, blindfolded, selects a sample of 50 beads. That person hands the jar to the next person, who repeats the process, and so on. When everyone has his or her sample, the number of red beads for each is counted. The limits of variation between employees that can be attributed to the system are calculated. Everyone will fall within the calculated limits of variation that could arise from the system. The calculations will show that there is no evidence one person will be a better performer than another in the future. The experiment shows that it would be a waste of management's time to try to find out why, say, John produced four red beads and Jane produced 15; instead, management should improve the system, making it possible for everyone to produce more white beads.
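
As an illustration of the "limits of variation" calculation mentioned above, the sketch below applies np-chart-style limits to a hypothetical set of red-bead counts; the counts and the pooling approach are assumptions for demonstration, not data from the report.

    # Illustrative sketch (not from the report): limits of variation attributable
    # to the system, computed as np-chart control limits from the pooled results
    # of all participants.  The counts below are hypothetical.
    from math import sqrt

    def np_chart_limits(red_counts, sample_size=50):
        p_bar = sum(red_counts) / (len(red_counts) * sample_size)
        center = sample_size * p_bar
        spread = 3 * sqrt(sample_size * p_bar * (1 - p_bar))
        return max(center - spread, 0), center + spread

    print(np_chart_limits([9, 11, 4, 15, 10, 8]))  # roughly 1 to 18 red beads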

Registrar Accreditation Board - A board that evaluates the competency and reliability of registrars (organizations that assess and register companies to the appropriate ISO 9000 series standards). The Registrar Accreditation Board, formed in 1989 by ASQC, is governed by a board of directors from industry, academia, and quality management consulting firms.

Registration To Standards - A process in which an accredited, independent third-party organization conducts an on-site audit of a company's operations against the requirements of the standard to which the company wants to be registered. Upon successful completion of the audit, the company receives a certificate indicating that it has met the standard requirements.

Regression Analysis - A statistical technique for determining the best mathematical expression describing the functional relationship between one response and one or more independent variables.
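
A minimal sketch of the simplest case, ordinary least squares with one independent variable; the data are hypothetical.

    # Illustrative sketch (not from the report): ordinary least-squares fit of a
    # straight line relating one response y to one independent variable x.
    def least_squares(x, y):
        n = len(x)
        x_mean, y_mean = sum(x) / n, sum(y) / n
        slope = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
                 / sum((xi - x_mean) ** 2 for xi in x))
        intercept = y_mean - slope * x_mean
        return slope, intercept

    # Hypothetical data that follow roughly y = 2x + 1.
    print(least_squares([1, 2, 3, 4], [3.1, 4.9, 7.2, 8.8]))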

Reliability - The probability of a product performing its intended function under stated conditions without failure for a given period of time.
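
Purely as an illustration, the sketch below assumes the common constant-failure-rate (exponential) model, R(t) = exp(-t / MTBF); the report itself does not specify a reliability model, so this choice and the numbers are assumptions.

    # Illustrative sketch (not from the report): reliability under an assumed
    # constant-failure-rate (exponential) model -- R(t) is the probability of
    # surviving to time t without failure.
    from math import exp

    def reliability(t, mtbf):
        return exp(-t / mtbf)

    # Hypothetical: probability of running 100 hours failure-free with a 1,500-hour MTBF.
    print(reliability(100, 1500))  # about 0.936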

Right The First Time - A term used to convey the concept that it is beneficial and more cost-effective to take the necessary steps up front to ensure a product or service meets its requirements than to provide a product or service that will need rework or not meet customers' needs. In other words, an organization would engage in defect prevention rather than defect detection.

Robustness - The condition of a product or process design that remains relatively stable, with a minimum of variation, even though factors that influence operations or usage, such as environment and wear, are constantly changing.

s Chart - Sample Standard Deviation Chart

Sample Standard Deviation Chart - A control chart in which the subgroup standard deviation, s, is used to evaluate the stability of the variability within a process.

Scatter Diagram - A graphical technique to analyze the relationship between two variables. Two sets of data are plotted on a graph, with the y axis being used for the variable to be predicted and the x axis being used for the variable to make the prediction. The graph will show possible relationships (although two variables might appear to be related, they might not be; those who know most about the variables must make that evaluation). The scatter diagram is one of the seven tools of quality.

Seven Tools of Quality - Tools that help organizations understand their processes in order to improve them. The tools are the cause-and-effect diagram, check sheet, control chart, flowchart, histogram, Pareto chart, and scatter diagram.

Shewhart Cycle - See Plan-Do-Check-Act Cycle

Signal-To-Noise Ratio - A mathematical equation that indicates the magnitude of an experimental effect above the effect of experimental error due to chance fluctuations.
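
The sketch below shows two Taguchi-style signal-to-noise formulas (smaller-is-better and larger-is-better) that are commonly used alongside this definition; the specific formulas and data are assumptions offered for illustration, not the report's own.

    # Illustrative sketch (not from the report): common Taguchi-style
    # signal-to-noise ratios, expressed in decibels.
    from math import log10

    def sn_smaller_is_better(y):
        return -10 * log10(sum(v ** 2 for v in y) / len(y))

    def sn_larger_is_better(y):
        return -10 * log10(sum(1 / v ** 2 for v in y) / len(y))

    # Hypothetical measurement sets.
    print(sn_smaller_is_better([0.12, 0.15, 0.09]))
    print(sn_larger_is_better([45.0, 52.0, 48.0]))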

Six-Sigma Quality - A term used generally to indicate that a process is well-controlled, i.e., within ±3 sigma of the centerline in a control chart. The term is usually associated with Motorola, which named one of its key operational initiatives "Six Sigma Quality."

S/N Ratio - Signal-To-Noise Ratio

SPC - Statistical Process Control

Special Causes - Causes of variation that arise because of special circumstances. They are not an inherent part of a process. Special causes are also referred to as assignable causes.

Specification - A document that states the requirements to which a given product or service must conform.

SQC - Statistical Quality Control

Statistical Process Control - The application of statistical techniques to control a process. Often the term "statistical quality control" is used interchangeably with "statistical process control."

Statistical Quality Control - The application of statistical techniques to control quality. Often the term "statistical process control" is used interchangeably with "statistical quality control," although statistical quality control includes acceptance sampling as well as statistical process control.

Structural Variation - Variation caused by regular, systematic changes in output,such as seasonal patterns and long-term trends.

Supplier Quality Assurance - Confidence that a supplier's product or service will fulfill its customers' needs. This confidence is achieved by creating a relationship between the customer and supplier that ensures the product will be fit for use with minimal corrective action and inspection. According to J.M. Juran, there are nine primary activities needed: 1) define product and program quality requirements, 2) evaluate alternative suppliers, 3) select suppliers, 4) conduct joint quality planning, 5) cooperate with the supplier during the execution of the contract, 6) obtain proof of conformance to requirements, 7) certify qualified suppliers, 8) conduct quality improvement programs as required, and 9) create and use supplier quality ratings.

Taguchi Methods - The American Supplier Institute's trademarked term for the quality engineering methodology developed by Genichi Taguchi. In this engineering approach to quality control, Taguchi calls for off-line quality control, on-line quality control, and a system of experimental design to improve quality and reduce costs.

Tampering - Action taken to compensate for variation within the control limits of a stable system. Tampering increases rather than decreases variation, as evidenced in the funnel experiment.

Top-Management Commitment - Participation of the highest-level officials in their organization's quality improvement efforts. Their participation includes establishing and serving on a quality committee, establishing quality policies and goals, deploying those goals to lower levels of the organization, providing the resources and training that the lower levels need to achieve the goals, participating in quality improvement teams, reviewing progress organization-wide, recognizing those who have performed well, and revising the current reward system to reflect the importance of achieving the quality goals.

Total Quality Management - A term initially coined in 1985 by the Naval Air Systems Command to describe its Japanese-style management approach to quality improvement. Since then, total quality management (TQM) has taken on many meanings. Simply put, TQM is a management approach to long-term success through customer satisfaction. TQM is based on the participation of all members of an organization in improving processes, products, services, and the culture in which they work. TQM benefits all organization members and society. The methods for implementing this approach are found in the teachings of such quality leaders as Philip B. Crosby, W. Edwards Deming, Armand V. Feigenbaum, Kaoru Ishikawa, and J.M. Juran.

TQM - Total Quality Management

Trend Control Chart - A control chart in which the deviation of the subgroup average, X-bar, from an expected trend in the process level is used to evaluate the stability of a process.

Type I Error - An incorrect decision to reject something (such as a statisticalhypothesis or a lot of products) when it is acceptable.

Type II Error - An incorrect decision to accept something when it is unacceptable.

u Chart - Count Per Unit Chart

UCL - Upper Control Limit

Upper Control Limit - Control limit for points above the central line in a control chart.

Value-Adding Process - Those activities that transform an input into a customer-usable output. The customer can be internal or external to the organization.

Variables Data - Measurement information. Control charts based on variables data include the average (X-bar) chart, range (R) chart, and sample standard deviation (s) chart.

Variation - A change in data, a characteristic, or a function that is caused by one of four factors: special causes, common causes, tampering, or structural variation.

Vital Few, Useful Many - A term used by J.M. Juran to describe his use of the Pareto principle, which he first defined in 1950. (The principle was used much earlier in economics and inventory control methodologies.) The principle suggests that most effects come from relatively few causes; that is, 80% of the effects come from 20% of the possible causes. The 20% of the possible causes are referred to as the "vital few"; the remaining causes are referred to as the "useful many." When Juran first defined this principle, he referred to the remaining causes as the "trivial many," but realizing that no problems are trivial in quality assurance, he changed it to "useful many."

World-Class Quality - A term used to indicate a standard of excellence: the best of the best.

x-bar Chart - Average Chart

Zero Defects - A performance standard developed by Philip B. Crosby to address a dual attitude in the workplace; people are willing to accept imperfection in some areas, while, in other areas, they expect the number of defects to be zero. This dual attitude had developed because of the conditioning that people are human and humans make mistakes. However, the zero defects methodology states that, if people commit themselves to watching details and avoiding errors, they can move closer to the goal of zero defects. The performance standard that must be set is "zero defects," not "close enough."

APPENDIX B

QUALITY CONTROL PERSONALITIES

Brumbaugh, Martin A. (Deceased) - The founder and first editor of Industrial Quality Control magazine. A former professor of statistics at the University of Buffalo, Brumbaugh's writings on applied statistics were regularly published. Brumbaugh was instrumental in getting two separate quality organizations - the Federated Societies and the Society for Quality Control - merged into one national organization: ASQC.

Collier, Simon (Deceased) - An ASQC president who led the Society during a critical growth period in 1952-53. His term was marked by numerous milestone events, including a membership increase of 22% and the formation of 11 new sections and the first divisions. Collier was a chemist who began his career at the National Bureau of Standards (now the National Institute of Standards and Technology). Later he worked at Johns-Manville Corporation, where he produced a quality training film used by more than 300 companies.

Crosby, Philip - The founder and chairman of the board of Career IV, an executive management consulting firm. Crosby also founded Philip Crosby Associates Inc. and the Quality College. He has authored many books, including Quality Is Free, Quality Without Tears, Let's Talk Quality, and Leading: The Art of Becoming an Executive. Crosby originated the zero defects concept.

Deming, W. Edwards - A prominent consultant, teacher, and author on the subject of quality. After sharing his expertise in statistical quality control to help the U.S. war effort during World War II, the War Department sent Deming to Japan in 1946 to help that nation recover from its wartime losses. Deming has published more than 200 works, including the well-known books Quality, Productivity, and Competitive Position and Out of the Crisis.

Dodge, Harold F. (Deceased) - An ASQC founder. His work with acceptance sampling plans scientifically standardized inspection operations and provided controllable risks. Although he usually is remembered for the Dodge-Romig sampling plans he developed with Harry G. Romig, Dodge also helped develop other basic acceptance sampling concepts (e.g., consumer's risk, producer's risk, average outgoing quality level) and several acceptance sampling schemes.

Edwards, George D. (Deceased) - First president of ASQC. Edwards was noted for his administrative skills in forming and preserving the Society. He was the head of the inspection engineering department and the director of quality assurance at Bell Telephone Laboratories. He also served as a consultant to the Army Ordnance Department and the War Production Board during World War II.

Feigenbaum, Armand V. - The founder and president of General Systems Co., an international engineering company that designs and implements total quality systems. Feigenbaum originated the concept of total quality control in his book Total Quality Control, which was published in 1951. The book has been translated into many languages, including Japanese, Chinese, French, and Spanish.

Grant, Eugene L. - Professor Emeritus of economics of engineering at Stanford University. Grant was part of a small team of professors assigned during World War II to introduce statistical quality control concepts to improve manufacturing production. He has written many textbooks, including Principles of Engineering Economy and Statistical Quality Control, editions of which he co-authored with W. Grant Ireson and Richard S. Leavenworth.

Ishikawa, Kaoru (Deceased) - A pioneer in quality control activities in Japan. In 1943, he developed the cause-and-effect diagram. Ishikawa published many works, including What Is Total Quality Control? The Japanese Way, Quality Control Circles at Work, and Guide to Quality Control. He was a member of the quality control research group of the Union of Japanese Scientists and Engineers while also working as an assistant professor at the University of Tokyo.

Juran, Joseph M. - The chairman emeritus of the Juran Institute. Since 1924, Juran has pursued a varied career in management as an engineer, executive, government administrator, university professor, labor arbitrator, corporate director, and consultant. Specializing in managing for quality, he has authored hundreds of papers and 12 books, including Juran's Quality Control Handbook, Quality Planning and Analysis (with F.M. Gryna), and Juran on Leadership for Quality.

Ott, Ellis R. (Deceased) - An educator who devoted his career to providing U.S. industry with statistical quality control professionals. In 1946, Ott became the chairman of the mathematics department at Rutgers University's University College with one condition: that he could also consult on and teach quality control. His influence led the university to establish the Rutgers Statistics Center. Ott developed the analysis of means procedure and published many papers.

Romig, Harry G. (Deceased) - An ASQC founder who was most widely known for his contributions in sampling. At AT&T Bell Laboratories, Romig and Harold F. Dodge developed the Dodge-Romig sampling tables, operating characteristics for sampling plans, and other fundamentals. Romig alone developed the first sampling plans using variables data and the concept of average outgoing quality limit. Later in his life, Romig was a consultant and taught quality-related courses at several universities.

Shewhart, Walter A. (Deceased) - Referred to as the father of statistical quality control because he brought together the disciplines of statistics, engineering, and economics. He described the basic principles of this new discipline in his book Economic Control of Quality of Manufactured Product. Shewhart, ASQC's first honorary member, was best known for creating the control chart. Shewhart worked for Western Electric and AT&T Bell Telephone Laboratories in addition to lecturing and consulting on quality control.

Taguchi, Genichi - The executive director of the American Supplier Institute, the director of the Japan Industrial Technology Institute, and an honorary professor at Nanjing Institute of Technology in China. Taguchi is well known for developing a methodology to improve quality and reduce costs, which, in the United States, is referred to as the Taguchi Methods. He also developed the quality loss function.

Wescott, Mason E. - A professor emeritus at the Rochester Institute of Technology (RIT), Wescott has been teaching mathematics and statistics since 1925. He taught at Northwestern University, Rutgers University, and RIT, where the Wescott Statistics Laboratory was dedicated in his honor in 1984. Wescott succeeded Martin A. Brumbaugh as editor of Industrial Quality Control in 1947, a position he held until 1961.

DISTRIBUTION LIST

DNA-TR-92-79

NATO ATTN: SPSPATTN: SPWE

AAFCE/OOST ATTN: TITLATTN: SONLDR MULLIGAN

DEFENSE TECHNICAL INFORMATION CENTERARTILLERIE KOMMANDO 3 ATTN: DTIC-DE

ATTN: LTC JOACHIM BURTH ATTN: DTIC/FDAB

ATOMIC WEAPONS ESTABLISHMENT, FOULNESS DEPARTMENT OF DEFENSEATTN: DR DARYL LANDEG ATTN: DEP DIR TEST EVAL

SIEMENS PLESSEY DEFENCE SYSTEMS NATIONAL COMMUNICATIONS SYSTEMATTN: HENRY CHESTERS ATTN: NCS-TS A H RAUSCH

TARGETS BRANCH NATIONAL DEFENSE UNIVERSITYATTN: ITTI ATTN: CLASSIFIED LIBRARY

USECOM NATIONAL SECURITY AGENCYATTN: ECJ3-FC ATTN: D/DIR

2 CYS ATTN: ECJ4-LW ATTN: DDIATTN: ECJ5°N ATTN; TECH DOC LIBATTN: ECJ6-TATTN: MAJ FLEMING NET ASSESSMENT

ATTN: DIRECTORDEPARTMENT OF DEFENSE ATTN: DOCUMENT CONTROL

ARMED FORCES STAFF COLLEGE OPERATIONAL TEST & EVALUATIONATTN: C3-JCEWS ATTN: DEP DIR/OPER TEST & EVAL STRAT SYSATTN: JCEWS-C3D ATTN: SCIENCE ADVISOR

ASSISTANT SEC OF DEF (C31) ATTN. DEP DIR RESOURCES & ADMIN

ATTN: J BAIN STRATEGIC & SPACE SYSTEMSATTN: DIRECTOR

ASSISTANT TO THE SECRETARY OF DEFENSE ATTN: DR E SEVIN

ATTN: DR BIRELY ATTN: DR SEIN

ATTN: LTCOL M CRAWFORD ATTN: DR SCHNEITER

DEF RSCH & ENGRG STRATEGIC DEFENSE INITIATIVE ORGANIZATION

ATTN: DIR TEST FACILITIES & RESOURCES ATTN: DA DR GERRY

ATTN: DEP DIR TEST EVAL DEPARTMENT OF THE ARMY

DEFENSE ADVANCED RSCH PROJ AGENCY ARMY RESEARCH LABORATORIESATTN: ASST DIR r-1 FCTRONIC SCIENCES DIV ATTN: TECH LIB

ATTN: DEP DIR RESEARCH ATTN: SLCSM-D COL J DOYLEATTN: DIR DEFENSE SCIENCES OFCATTN: TTO PATRIOT

ATTN: PROJECT MANAGERDEFENSE COMMUNICATION AGENCY

ATTN: JOHN SELISKAR COMMANDER RESEARCH & DEV CENTERATTN: COMMANDER

DEFENSE ELECTRONIC SUPPLY CENTERATTN: DESC-E SATELLITE COMMUNICATIONS

ATTN: PROJECT MANAGERDEFENSE LOGISTICS AGENCY

ATTN: DLA-F U S ARMY AVIATION SYSTEMS CMDATTN: DLA-QE ATTN: PM AIRCRAFT SURVIVABILITY EQUIPATTN: DLA-QESATTN: DLA-SCC U S ARMY BELVOIR RD&E CTRATTN: DLA-SCT ATTN: TECH LIBATTN: DLSMO U S ARMY LABORATORY CMD

DEFENSE NUCLEAR AGENCY ATTN: COMMANDERATTN: DFRA JOAN MA PIERRE U S ARMY MATERIAL COMMANDATTN: RAEEUSARYMTRACO ANATTN: RAEV ATTN: PROJECT MANAGEMENT

ATTN: SPSD

DNA-TR-92-79 (DL CONTINUED)

U S ARMY MISSILE COMMAND NAVAL AVIONICS CENTERATTN: PM/rO ATTN: B/71OFGAHIMER

ATTN: B/713 D PAULSU S ARMY MISSILE COMMAND

ATTN- AMCPM-CC/PM NAVAL POSTGRADUATE SCHOOLATTN: CODE 1424 LIBRARY

U S ARMY MISSILE COMMANDATTN: MAJ R LUSHBOUGH NAVAL RESEARCH LABORATORY

ATTrN: CODE 2627U S ARMY MISSILE COMMAND

ATTN: R LENNING NAVAL SEA SYSTEMS COMMANDATTN: COMMANDING OFFICER

U S ARMY NUCLEAR & CHEMICAL AGENCYATTN: MONA-AD NAVAL SURFACE WARFARE CENTER

ATTN: COMMANDERU S ARMY ORD MISSILE & MUNITIONS

ATTN: ATSK-MS NAVAL WAR COLLEGEATTN: ATSK-XO ATTN: CODE E-111

U S ARMY RESEARCH DEV & ENGRG CTR NAVAL WARFARE ASSESSMENT CENTERATTN: TECH LIB ATTN: DOCUMENT CONTROL

U S ARMY SIGNAL CTR & FT GORDON NAVSEAATTN: ATZH-CDC ATTN: J SATINATTN: ATZH-CDM

OPERATIONAL TEST & EVALUATION FORCEU S ARMY SPACE & STRATEGIC DEFENSE CMD ATTN: COMMANDER

ATTN: CSSD-SA-EVATTN: CSSD-SA-EV R CROWSON SCIENCE & TECHNOLOGY LIBRARYATTN: CSSD-SL ATTN: CODE 202.13ATTN: SFAE-SD-GST-E P BUHRMAN STRATEGIC SYSTEMS PROGRAM

U S ARMY SPACE STRATEGIC DEFENSE CMD ATTN: SP-113 D ELLINGSONATTN: CSSD-CSATTN: CSSD-OP DEPARTMENT OF THE AIR FORCEATTN: CSSD-SA-E AERONAUTICAL SYSTEMS DIVISION

U S ARMY STRATEGIC SPACE & DEFENSE CMD ATTN: ASD/ENACE

ATTN: CSSD-H-LS ATIN: ASD/ENSSSATTN: CSSD-SD-A ATTN: ASD/RWWI

U S ARMY TEST & EVALUATION COMMAND AF SPACE COMMAND

ATTN: AMSTE-TA-F ATTN: LKNIP MAJ S HOFFATTN: AMSTE-TAoF L TELETSKI ATTN: SM-ALC DET 25 P C LARTER

ATTN: DOCE

U S ARMY VULNERABILITY ASSESSMENT LABA'T'N: SLCVA-D AIR FORCE CTR FOR STUDIES & ANALYSISATTN: AFCSA/SAS

USA ELECT WARFARE/SEC SURV & TARGET ACO CTR ATTN: AFSAANSAKIATTN: COMMANDER AIR FORCE INSTITUTE OF TECHNOLOGY/EN

USA SURVIVABILITY MANAGMENT OFFICE ATTN: LTCOL R TUTTLEATTN: AMSLC-VL-NE DR J FEENEY AIR FORCE SPACE COMMAMD (LKI)ATTN: F MANIONATTrN: SLCSM-SE J BRAND ATTN: CAPT WEIDNER

ATTN: LT D BARRONDEPARTMENT OF THE NAVY ATTN: MAJ D ROBINSON

ATTN: STOP 7DEPARTMENT OF THE NAVY

ATTN: DIRECTOR ASSISTANT CHIEF OF STAFFATTN: AF/SAN

GPS NAVSTAR JOINT PROGRAM OFFICEATTN: CHARLES TABBERT BALLISTICS MISSLE ORGANIZATION

ATTN: A F BURKHOLDERNAVAL AIR SYSTEMS COMMAND

ATTN: AIR-5115J G T SIMPSON DEPARTMENT OF THE AIR FORCEATTN: NCGS B HOPKINS

DNA-TR-92-79 (DL CONTINUED)

DEPUTY CHIEF OF STAFF/AF-RD-D APTEK, INCA•TN: AFJRD-D ATTN: T MEAGHER

DEPUTY CHIEF OF STAFF/AF-RD-W BERKELEY RSCH ASSOCIATES, INC

ATTN: AF/SCMCW ATTN: J ORENSAT-1: N PEREIRA

DEPUTY CHIEF OF STAFFIAFRDSDATTN: SAF/AQSD BERKOWITZ ENTERPRISES

ATTN: H M BERKOWITZMILITARY DEPUTY FOR ACQUISITION

ATTN: AQQS LTCOL KEE BOEING AIRCRAFT MARINEATTN: CHUCK WERTZBERGER

PHILLIPS LABORATORYATTN: CAPTIAN LORANG BOOZ ALLEN & HAMILTON INCATrN: NTES LTCOL T BRETZ ATTN: J KEEATTN: PIAWS L CONTRERASATTN: WSP J DEGAN BOOZ-ALLEN & HAMILTON, INCATTN: WSP R PETERKIN ATTN: J M VICE

ROME LABORATORY/SUL HUDSON INSTITUTE, INC

ATrN: COMMANDER ATTN: LIBRARY

SECRETARY OF THE AIR FORCE INSTITUTE FOR DEFENSE ANALYSES

ATTN: AFIRDSL ATTN: CLASSIFIED LIBRARYATTN: DR H DICKENSON

SPACE DIVISION(AFSC) ATTN: DR LESLIE COHEN, STDATTN: SSD/MZS ATTN: E BAUER

ATTN: GRAHAM MCBRYDEUNITED STATES STRATEGIC COMMAND ATTN: IKOHLBERG

ATTN: J 22A ATTN: OED W SHELESKIATTN: J 51 ATTN: R MILLERATTN: J 53ATTN: J 532 JASPER WELCH ASSOCIATESATTN: J 533 ATTN: MAJ GEN J WELCHATTN: J534ATTN: JIC/ODM JAYCORATTN: JIC/ODTD ATTN: E WENAAS

WRIGHT LABORATORY KAMAN SCIENCES CORPATTN: D WAiTS ATTN: LIBRARY

WRIGHT RESEARCH & DEVELOPMENT CENTER KAMAN SCIENCES CORPATTN: COMMANDANT 2 CYS ATTN: R WILLIAMS

2 CYS ATTN: W GATELEYDEPARTMENT OF ENERGY

KAMAN SCIENCES CORPDEPARTMENT OF ENERGY ATTN: DASIAC

ATTN: WALT KELLYKAMAN SCIENCES CORPORATION

DEPARTMENT OF ENERGY ATTN: DASIACATTN: C MEYERS

LOCKHEED AERONAUTICAL SYSTEMSEG&G IDAHO INC ATTN: CENTRAL LIBRARY

ATTN: MS ILF-2LOCKHEED MISSILES & SPACE CO, INC

SANDIA NATIONAL LABORATORIES ATTN: VULNERABILITY & HARDENINGATTN: TECH LIB-PERIODICALS

LOGICON R & D ASSOCIATESDEPARTMENT OF DEFENSE CONTRACTORS ATTN: LIBRARY

AEROSPACE CORP LOGICON R & D ASSOCIATESATTN: LIBRARY ACQUISITION ATTN: DOCUMENT CONTROL

ALFONTE ASSOCIATES M I T LINCOLN LABATTN: WILLIAM ALFONTE ATTN: V SFERRINO

ATTN: C F WILSONANALYTIC SERVICES, INC (ANSER) ATTN: V MISELIS

ATTN: LIBRARY ATTN: R HALL

DNA-TR-92-79 (DL CONTINUED)

MARTIN MARIETTA DENVER AEROSPACE NICHOLS RESEARCH CORPORATIONATTN: RESEARCH LIBRARY ATTN: L GAROZZO

MASSACHUSETTS INST OF TECHNOLOGY RAYTHEON - MSDATTN: J RUINA ATTN: LIBRARY

MCDONNELL DOUGLAS HELICOPTER CO SCIENCE APPLICATIONS INTL CORPATTN: B MOORE ATTN: M ATKINSATTN: K PIERCEATTN: W SIMS TELEDYNE BROWN ENGINEERING

ATTN: P SHELTONMISSION RESEARCH CORP

ATTN: DOCUMENT CONTROL WILLIAMS INTERNATIONAL CORPATTN: TECH INFO CENTER ATTN: C ANDREK
