

European Journal of Operational Research 214 (2011) 616–626


Production, Manufacturing and Logistics

Manufacturing performance measurement and target setting: A data envelopment analysis approach

Sanjay Jain a,*, Konstantinos P. Triantis b,1, Shiyong Liu c

a Department of Decision Sciences, School of Business, The George Washington University, 2201 G Street NW, Washington, DC 20052, USA
b Grado Department of Industrial and Systems Engineering, Virginia Polytechnic Institute and State University, System Performance Laboratory, Falls Church, VA 22043, USA
c Research Institute of Economics and Management, Southwestern University of Finance and Economics, #55 Guanghua Village Avenue, Chengdu, Sichuan 610074, China

Article info

Article history:
Received 31 October 2009
Accepted 16 May 2011
Available online 23 May 2011

Keywords:
Manufacturing
Performance measurement
Target setting
Data envelopment analysis

0377-2217/$ - see front matter © 2011 Elsevier B.V. All rights reserved. doi:10.1016/j.ejor.2011.05.028

This research was supported in part by an internal Virginia Tech grant from the Center for High Performance Manufacturing and in part by the "211 Project Phase III" at the Southwestern University of Finance and Economics.
* Corresponding author. Tel.: +1 202 994 5591; fax: +1 202 994 2736. E-mail address: [email protected] (S. Jain).
1 The paper is also supported in part by the U.S. National Science Foundation, while Dr. Triantis was working at the Foundation. Any opinion, finding, and conclusions and recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of the National Science Foundation.

Abstract

Manufacturing decision makers have to deal with a large number of reports and metrics for evaluating the performance of manufacturing systems. Since the metrics provide different and at times conflicting assessments, it is hard for the manufacturing decision makers to track and improve overall manufacturing system performance. This research presents a data envelopment analysis (DEA) based approach for performance measurement and target setting of manufacturing systems. The approach is applied to two different manufacturing environments. The performance peer groups identified using DEA are utilized to set performance targets and to guide performance improvement efforts. The DEA scores are checked against past process modifications that led to identified performance changes. Limitations of the DEA based approach are presented when considering measures that are influenced by factors outside of the control of the manufacturing decision makers. The potential of a DEA based generic performance measurement approach for manufacturing systems is discussed.

© 2011 Elsevier B.V. All rights reserved.

1. Introduction and context

Most manufacturing executives face three major obstacles as they strive to keep a handle on their operations: information inundation, information isolation, and information indecision. In a nutshell, executives often receive too much information from isolated sources that is devoid of practical guidance for improvement. Further, in order for manufacturers to improve operational performance and, in this context, reduce manufacturing costs, they must have an effective method of measuring and evaluating the performance of their manufacturing processes. This issue of effective measurement is paramount in today's manufacturing companies.

There is little guidance on setting performance improvement targets in manufacturing systems. External benchmarks available through industry trade associations or through consulting organizations are occasionally used for setting performance targets.


Targets based on these benchmarks need to be adjusted to the unique configuration and circumstances of the manufacturing system that is being evaluated, and this is a non-trivial task. Furthermore, defining targets using a set of often conflicting performance indicators, such as combinations of due date performance, inventory levels, quality levels, throughput, cycle time and machine utilization, typically causes confusion. For example, a conflict exists between the objectives of achieving low inventory levels and high machine utilizations if there are long changeovers between manufacturing different product types.

Simulation modeling also provides a potential approach to setting performance targets that take into account the manufacturing system's resource availabilities and stochastic demand, but it generates large amounts of information. Moreover, developing and maintaining current simulation models for a manufacturing system requires high expertise and effort. Therefore, neither of these approaches, i.e., the use of external benchmarks and simulation, solves the information inundation or isolation problems.

Consequently, organizations have typically tried to focus on one or two measures. Such focus does help to improve performance on the selected measures, but at times to the detriment of overall performance. What is needed is an approach that allows the manufacturing enterprise to focus on a small number of measures yet takes into account multiple facets of performance. Further, a mechanism is required for setting realistic targets that take into account the capabilities and changing circumstances of the manufacturing


system. Such a mechanism should ideally not require large effort and expertise for maintenance.

This research presents an approach based on data envelopment analysis (DEA) (Charnes et al., 1978) for manufacturing performance measurement and target setting. The potential of DEA for managerial diagnosis and control was noted years ago (Epstein and Henderson, 1989). DEA can typically be used for relative performance measurement and evaluation, benchmarking, and target setting, and is one of the techniques available for identifying best practices. This paper reports on efforts to apply the DEA based approach to two manufacturing organizations. The main conclusion drawn from our interaction with the two organizations, consistent with many applications of DEA reported in the literature, is that the fundamental value-adding potential of the DEA approach for manufacturing decision making lies in its ability, on the one hand, to simplify the way decision makers are alerted to underperforming manufacturing units and, on the other hand, to point to peers and potential performance improvement targets.

The approach can be used on a continuous (rolling) basis as more data are collected, whereby operational performance can be continuously monitored. It is not inconceivable to think of showing weekly/monthly/yearly performance reports in the manufacturing areas just as one can view statistical process control (SPC) charts (Hoopes and Triantis, 2001). Even though control charts are based on statistical theory whereas the DEA performance reports are based on linear programming, they both point to observations that are out of control in the case of control charts and to underperforming units in the case of DEA. Performance improvement interventions are in both cases found by asking why an observation is in the out-of-control range or why a unit is underperforming. Unlike SPC, DEA provides guidance on why a unit is underperforming through performance improvement targets and comparison with peers. It can also provide a deeper understanding of the manufacturing process structure that has a large impact on the observed process performance over time. Typically, statistical techniques evaluate the stochastic behavior of the production process by studying process and/or product characteristics one at a time. On the other hand, efficiency measurement approaches include as part of their evaluation the entire set of critical product and/or process characteristics simultaneously. Previous research (Hoopes and Triantis, 2001) shows that these two approaches can be used in a complementary manner to identify unusual or extreme production instances, benchmark production occurrences, and evaluate the contribution of individual process and product characteristics to the overall performance of the production process. However, the potential linkage of DEA efficiency scores and performance targets with those derived from statistical process techniques such as control charts, six sigma, etc., is beyond the scope of this paper.

The focus of the work reported here is on supporting performance improvement efforts over time by management in two very different real manufacturing scenarios and on evaluating the ability of DEA to effectively assist decision makers in performance improvement. The DEA based performance measurement approach received positive feedback from decision makers in both of the manufacturing systems. A general approach for the implementation of a DEA based performance system is suggested based on this experience.

This paper adds to the manufacturing performance measurement and decision making body of knowledge by addressing DEA implementation issues. Appropriate input and output specifications are discussed that deal with manufacturing issues such as undesirable outputs, variables that could potentially be defined as both inputs and outputs, feedback mechanisms such as rework, and others. Such combinations of issues that occur in real manufacturing systems have not been extensively addressed in

the literature. Additionally, the specific manufacturing technologies studied in this research have led to the definition of the variables used in the DEA models that can be used and modified by other researchers in the future. It would be appropriate to define reasonable input/output specifications associated with various manufacturing technologies (wafer manufacturing, assembly line manufacturing, etc.) as long as the mapping between the real world and the modeling world is reasonable and can be verified by those operating within these manufacturing environments. These specifications can then be catalogued for future research and implementation. The modeling differences associated with the two manufacturing scenarios are identified to highlight the value of selecting appropriate variables and models for the application of DEA, something that continues to be a research and application challenge. Approaches used for the validation and verification of the models built are described. The implications for decision making based on DEA results for the two scenarios are presented to demonstrate practical relevance. To the extent that all of the above can be generalized, a framework of performance measurement is proposed for manufacturing facilities in general. The framework provides a starting point by highlighting the possibility of end users driving the process of DEA implementation in manufacturing environments.

This section introduced the need for manufacturing performance measurement and target setting and DEA as an approach to meet this need. The next section reviews the relevant literature. The third section presents the two manufacturing scenarios and describes the process of selecting the appropriate conceptual DEA models. Section 4 presents the data and the results from the DEA models, including the performance scores, the grouping of decision making units into peer groups and their utilization for target setting. The fifth section discusses the impact on decision making based on the results and provides a framework for the application of the DEA approach to manufacturing organizations. The last section presents conclusions and future directions for research.

2. DEA based performance measurement systems for manufacturing

Since DEA was first proposed by Charnes et al. (1978), it has been applied in many sectors, including manufacturing and the associated sector of logistics and distribution. The applications in manufacturing span a wide range of issues, for example, evaluating alternatives, aligning with business goals, etc. We include relevant recent efforts here. A modified DEA model was developed by Cook and Green (2004) to identify the core business performance in multi-plant firms. Ertay et al. (2006) used DEA to evaluate layout configurations in manufacturing systems. Liu and Liu (2008) used DEA to compare the relative efficiencies of nine production lines in an electronics assembly environment.

Recent efforts have focused on enhancing DEA formulations to address manufacturing realities. Triantis et al. (2003) used possibility theory as an approach to evaluate the performance of the newspaper preprint insertion manufacturing process. Zeydan and Çolpan (2009) combined TOPSIS (technique for order preference by similarity to ideal solution) for measuring qualitative performance and DEA for measuring quantitative performance to assess 28 job shops engaged in manufacturing and maintenance for the Turkish air force. Wang and Chin (2009) used DEA, enhanced with double frontiers, to evaluate and select advanced manufacturing technology. Chen (2009) visualized a production network comprising multiple interdependent sub-decision making units (SDMUs) and used a network-DEA approach to propose measures that consider the dynamic effects of SDMUs. Our


work differs from these efforts in its motivation of providing a framework that can be applied across a range of manufacturing environments by manufacturing practitioners. This motivation leads us to use the traditional DEA approach, though with a variety of specifications, across two different manufacturing environments (assembly line and wafer manufacturing).

Several of the DEA based research efforts focus on manufacturing performance in the context of business performance and strategies. Talluri et al. (2003) investigated the transmutation of manufacturing performance into business performance in the auto supplier industry. Narasimhan et al. (2004) used a multistage DEA approach to identify the importance of using the different manufacturing flexibilities for achieving tangible firm level performance. Leachman et al. (2005) studied the automobile industry to evaluate a firm's performance relative to its rivals. Düzakın and Düzakın (2007) used DEA with balance sheet level data to analyze the performance of 500 industrial enterprises in Turkey. Saranga (2009) used DEA to identify the inefficiencies in the Indian auto component industry. In contrast to the above efforts, the focus of this paper is on short term inputs and outputs with the aim of assisting manufacturing management with periodic updates on performance measurement and improvement.

Another group of research efforts evaluates the impact of policies and factors on manufacturing performance. Sheu and Peng (2003) analyzed notebook computer producers in Taiwan using DEA to identify the significance of factors such as training time for new employees, mean time to repair, use of design of experiments during engineering, and frequency of engineering changes. Ng and Chang (2003) used DEA to identify a positive relationship between computer personnel and manufacturing output in their sample of manufacturing enterprises. Ertay and Ruan (2005) presented a DEA approach for optimally allocating operators in cellular manufacturing systems. Ross and Ernstberger (2006) employed DEA to evaluate the impact of information technology on productivity in the manufacturing sector. These efforts explore decision issues at the enterprise level and thus differ from the work presented in this paper, with its focus within the realm of manufacturing management.

Additionally, some efforts have focused on comparing the performance of a manufacturing system to others within the same company (e.g., Cricelli and Gastaldi, 2002; Djerdjouri, 2005). DEA has also been used for comparing alternative scheduling approaches within a manufacturing environment (see, for example, Ruiz-Torres and López, 2004). Perhaps the efforts most similar in decision-making scope to the work reported here are those discussed by Talluri et al. (1997) and Girod and Triantis (1999). Talluri et al. (1997) studied the performance of cellular manufacturing using window analysis. Girod and Triantis (1999) proposed approaches to identify the production plans that help improve performance considering the fuzzy nature associated with the measurement of the inputs and outputs. The work reported in this paper is similarly motivated to utilize DEA to help manage and improve manufacturing performance over time.

Manufacturing usually has undesirable outputs such as scrap and pollutants, and hence their consideration is important. Seiford and Zhu (2002) revised the standard DEA model to consider both desirable and undesirable outputs. They used the classification invariance property to show that the standard DEA model can be applied to improve performance by increasing the desirable outputs and decreasing the undesirable outputs. A similar idea is followed in one of the applications reported in this paper.

One of the objectives of this paper is to discuss the implementation issues for a DEA based approach in a manufacturing context. This is consistent with other work in the literature where the implementation issues associated with the use of DEA in a social service organizational setting are discussed (Medina-Borja et al., 2007). Also, the literature mostly presents case studies that appear

unique without providing general guidance for a DEA based performance system implementation in a manufacturing system. This paper aims to provide guidance through the presentation of a generalized framework for such implementations.

3. The conceptual models

DEA was applied to two manufacturing scenarios. One scenario was traditional assembly line manufacturing while the other was an advanced wafer manufacturing operation. The conceptual DEA models developed for each of the two scenarios are described below.

3.1. Assembly line manufacturing

The first manufacturing scenario involved assembly operations of an electro-mechanical product on multiple lines. The operations of this manufacturing plant were divided into various departments. The research specifically focused on the department that was responsible for assembling mechanical and electrical components of the products. The department includes three process areas: mechanical assembly, electrical assembly, and testing. Three kinds of products are produced, referred to as P, Q and R in this paper. Products are built up as they move on an overhead conveyor through the mechanical and electrical assembly areas, stopping at each work station for a fixed cycle time. In the test area, some of the products may be taken off line for prolonged testing while others may go through some routine tests. The department is a part of the overall manufacturing process, with various inputs from external and internal vendors and the outputs going to internal customers.

Examination of the data and discussions with the company personnel led to setting up the conceptual DEA model as shown in Fig. 1. The inputs include resources in the form of labor hours, uptime, material and supply costs. Labor hours refer to the direct labor hours recorded for the week in the department. Uptime identifies the number of hours during the week that the line was available for assembly operations and excludes the time the line was down for preventive or unplanned maintenance and scheduled shift breaks, and in this sense can be considered as a resource. Material cost captures the cost of direct material entering the department and used for production in the week. Supply cost measures the cost of indirect material used by the department for the same week. The outputs were measured as the production quantity for the week for the three products collectively.

The selection of the inputs and outputs was based on an initial analysis of the dataset and on a fundamental understanding of the manufacturing process. The inputs are clearly the primary resources that enter any manufacturing production process, including the materials (material and supply cost), machines (uptime) and men (labor hours). Similarly, production quantities are the primary outputs of a production process and are used as such for the model. The relationships between the input and output measures were also depicted by plots of individual inputs and combined production quantities of all three products over time. The plots are not included due to page length restrictions. The manufacturing organization kept track of the labor hours, uptime, and the production volumes by week and hence these data were easily available. The data on material and supply costs were maintained by the accounting department and took some effort to acquire.
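The kind of exploratory plots described above can be generated directly from the weekly records. The following is a minimal sketch, not the authors' code; the file name assembly_weekly.csv and the column names are hypothetical placeholders for the weekly data described in this section.

```python
# Illustrative sketch only: plot each input and the combined production volume
# over time for visual inspection. File and column names are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

df = pd.read_csv("assembly_weekly.csv")        # one row per week (DMU)
inputs = ["labor_hours", "uptime", "material_cost", "supply_cost"]
output = "production_qty"                      # combined volume of products P, Q and R

fig, axes = plt.subplots(len(inputs) + 1, 1, figsize=(8, 10), sharex=True)
for ax, col in zip(axes, inputs + [output]):
    ax.plot(df["week"], df[col], marker="o")   # weekly time series of the measure
    ax.set_ylabel(col)
axes[-1].set_xlabel("week")
plt.tight_layout()
plt.show()
```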

3.2. Wafer manufacturing

The wafer manufacturing plant used batch production with its layout organized by processes. The batches are formed by a


Fig. 1. Input output concept for assembly line manufacturing.

Fig. 2. Input output concept for wafer manufacturing. (Inputs: labor hours, production lot starts, test lot starts, WIP (beginning); outputs: lot completions, lot steps, scrap.)

2 The formulations for these models are included in the supplement to this paper.
3 The data collected from the manufacturing facility did not include labor hours and uptime inputs specific to each product and hence could not support the DEA formulations where the three products could be treated separately.


number of wafers grouped together into lots by product type. Lots are moved manually in carts through different process areas as required by their routing. The lots may loop through a sequence of process areas several times for imprinting of various layers of circuitry. At each step, wafers may be processed individually or grouped together and processed as a batch. The wafer manufacturing unit utilizes various resources (inputs) during the process and generates completed lots per customer specifications as outputs. Based on the information and input provided by the company personnel, the conceptual DEA model was defined as shown in Fig. 2.

The labor hour factor has the exact same definition as in the assembly line manufacturing case. The organization changed from three 8-hour shifts over 5-day weeks to two 12-hour shifts over 7-day weeks during the observation period. The operators changed from working 5 days per week to alternating between 3-day and 4-day weeks. While the change was quite major for the organization, the effect on labor hours per week was not significant. Each operator went from working 40 hours/week to an average of 42 hours/week. The labor hours input per week, which captured the total hours across all the operators for the week, saw a similarly small change. The production lot starts refers to the number of lots released on the production floor during the week in response to customer orders. The test lot starts refers to the number of lots released on the production floor by engineers to test process parameters for process equipment. The test lots do not get shipped out to any customers, yet they consume resources and contribute to organization goals. The WIP (beginning) input variable is the work-in-process quantity at the beginning of the week. Together the production and test lot starts and the WIP (beginning) capture the material resource input to the production process. This is parallel to the use of material costs as one of the inputs in the assembly line manufacturing model. The lot completions output variable is important since it captures the number of production lots completed and shipped out to customers. The lot steps output variable is the total number of process steps that all the production and engineering lots in the plant moved through during the week. It is generally referred to as lot moves in the semiconductor industry. This measure helps define the amount of work done and is useful for cases where the average cycle time is longer than the period of observation (one week in this case). Scrap defines the number of lots that are scrapped due to quality problems or failed engineering tests.

Similar to the assembly manufacturing case, the inputs are the primary resource inputs for a production process, including material (production and test lot starts and the beginning WIP) and labor (labor hours). Machine time was not included as an input in this case since the company personnel indicated that labor hours were the primary constraint. Also, unlike the assembly line situation where any downtime on a machine brought the whole line down, an interruption on one machine in wafer manufacturing did not directly affect other machines. The outputs in this case go beyond measuring the completed production lots since the lots typically spend longer than one week in the production process. The resources are consumed in moving the lots through successive process steps, and since that is only partially captured by the lot completions measure, lot steps and scrap are used as indicators of the work accomplished.

The initial variable list for consideration included some other factors such as cycle time and On-Time Delivery. Cycle time was not included since it was already represented through a closely correlated variable, WIP. On-Time Delivery was not included as it is a measure that depends on decisions made outside the boundary of the manufacturing system. It is based on commitments made by sales staff on delivery dates that may have been made independent of manufacturing system considerations and status. Though the measure is important from a business point of view, it does not reflect the efficiency of the manufacturing process.
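The close correlation between cycle time and WIP noted above is what Little's law would suggest (this observation is ours, not the paper's): with a roughly stable completion rate,

\[
\overline{\text{WIP}} \;=\; \overline{\text{throughput}} \times \overline{\text{cycle time}}
\quad\Longrightarrow\quad
\overline{\text{cycle time}} \;\approx\; \frac{\overline{\text{WIP}}}{\overline{\text{lot completions per week}}},
\]

so including both variables would largely duplicate information.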

4. Application results

For both cases, the performance of the manufacturing system has been evaluated on a weekly basis using the above identified variables. In DEA terminology, each week has been defined as a decision making unit (DMU), allowing performance comparisons and evaluations on a weekly basis. The first sub-section below describes the data and the models considered while the second sub-section presents the results from the selected DEA model for the respective cases.

4.1. Data and models

Multiple DEA models were explored for the two manufacturing organizations and the most appropriate one selected. Most manufacturing processes exhibit variable returns to scale due to constraints that become effective for different parts of the operating range. Even though both the modeled processes were assessed to be of the variable returns category (Banker et al., 1984), constant returns to scale models (Charnes et al., 1978) were executed for the sake of comparing the efficiency performance scores.
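The paper's exact formulations are given in its supplement; for reference, the textbook output-oriented envelopment form shared by the constant and variable returns to scale models is restated here (our restatement, not a reproduction of the supplement). For DMU o with inputs x_io and outputs y_ro among n weekly DMUs,

\[
\begin{aligned}
\max_{\phi,\,\lambda}\ & \phi\\
\text{s.t.}\quad & \sum_{j=1}^{n}\lambda_j x_{ij} \le x_{io}, \qquad i=1,\dots,m,\\
& \sum_{j=1}^{n}\lambda_j y_{rj} \ge \phi\, y_{ro}, \qquad r=1,\dots,s,\\
& \sum_{j=1}^{n}\lambda_j = 1 \quad (\text{VRS; this convexity constraint is dropped under CRS}),\\
& \lambda_j \ge 0, \qquad j=1,\dots,n.
\end{aligned}
\]

A week is efficient when the optimal \(\phi^{*}=1\) (with zero slacks); \(\phi^{*}>1\) indicates inefficiency, and \(\phi^{*} y_{ro}\) gives the radial output targets of the kind reported later. The input oriented counterparts contract inputs by a factor \(\theta\) instead.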

4.1.1. Data and models for assembly line manufacturing

The data for the selected variables for the assembly line manufacturing scenario were determined from the datasheets that were made available for a preceding eight-month period. The DEA models evaluated for this organization were as follows: (1) constant returns to scale (CRS) – input oriented; (2) CRS – output oriented; (3) variable returns to scale (VRS) – input oriented; (4) VRS – output oriented.2

Each of the four models was executed with three sets of input and output specifications, where different combinations of inputs and outputs provided varying discriminatory ability among the decision making units. The three sets of input and output specifications were as follows: (a) inputs: labor hours, uptime, combined material and supply cost; output: combined production volume for all three products; (b) inputs: labor hours, uptime, combined material and supply cost; outputs: separate production volumes for the three products; (c) inputs: labor hours, uptime, separate material and supply costs for each of the three products; outputs: separate production volumes for the three products.3,4



Both input and output oriented models were used for comparison purposes. The choice of input versus output oriented models depends on the environment. If there is not much flexibility available for controlling the inputs, the decision maker would focus on output oriented models. For example, in this data set, the uptime was largely determined by the effectiveness of the maintenance policies, which were not under the control of the production manager, and hence the objective should focus on getting more outputs for a given level of uptime. Thus, uptime was treated as a non-controllable input variable in the formulation.

The results presented in the next section are based on model 4 with the input output specification defined in set (a) in the list above. The particular model and set of input and output specifications were selected for their ability to discriminate adequately between weekly performances during the observation period.
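As an illustration of how such a model can be computed, the sketch below solves the output-oriented VRS envelopment LP week by week with an off-the-shelf LP solver. It is not the authors' implementation (which used the formulations in the supplement); the data are made-up placeholders, and uptime is simply kept on the input-constraint side with no reduction target read from it, in keeping with its treatment above as a non-controllable input.

```python
# Illustrative sketch only: output-oriented VRS DEA solved with scipy.linprog.
# Rows are weeks (DMUs); the numbers below are hypothetical placeholders.
import numpy as np
from scipy.optimize import linprog

X = np.array([  # inputs: [labor_hours, uptime, material_plus_supply_cost]
    [420., 80., 95.],
    [450., 82., 99.],
    [400., 78., 90.],
    [470., 85., 97.],
    [430., 81., 92.],
    [460., 84., 98.],
])
Y = np.array([  # output: [combined production volume of P, Q and R]
    [980.], [1010.], [940.], [1100.], [1005.], [1020.],
])
n, m = X.shape
s = Y.shape[1]

def vrs_output_efficiency(o):
    """Return (phi, lambdas) for DMU o under an output-oriented VRS model."""
    # Decision vector: [phi, lambda_1, ..., lambda_n]; linprog minimizes, so use -phi.
    c = np.concatenate(([-1.0], np.zeros(n)))
    # Input constraints: sum_j lambda_j * x_ij <= x_io  (uptime is included here,
    # but no reduction target is derived from it since it is non-controllable).
    A_in = np.hstack([np.zeros((m, 1)), X.T])
    b_in = X[o]
    # Output constraints: phi * y_ro - sum_j lambda_j * y_rj <= 0
    A_out = np.hstack([Y[o].reshape(-1, 1), -Y.T])
    b_out = np.zeros(s)
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.concatenate([b_in, b_out])
    # VRS convexity constraint: sum_j lambda_j = 1
    A_eq = np.hstack([[[0.0]], np.ones((1, n))])
    b_eq = [1.0]
    bounds = [(None, None)] + [(0.0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=bounds, method="highs")
    return res.x[0], res.x[1:]

for o in range(n):
    phi, lam = vrs_output_efficiency(o)
    peers = [j + 1 for j in range(n) if lam[j] > 1e-6]
    print(f"week {o + 1}: phi = {phi:.4f}, peers = {peers}")
```

The positive λ weights returned for an inefficient week identify its benchmark (peer) weeks, as reported in the tables that follow.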

4.1.2. Data and models for wafer manufacturing

The data for the selected input and output variables for wafer manufacturing were collected from the datasheets made available by the company for a recent period of 12 months. It was also noted that the wafer manufacturer changed its shift scheduling significantly during the period of analysis. For the first 30 weeks, the manufacturer ran three 8-hour shifts per day over a 5-day week, while during the remaining weeks the facility operated 7 days per week with two 12-hour shifts. The approach used to configure the data sets in this case was hence based on these two modes of operation, unlike that used for assembly line manufacturing. Two configurations were used for this analysis: (a) one set of 51 weeks; (b) two sets: the first 30 weeks and the last 21 weeks after the shift change.

A number of mathematical formulations for DEA were analyzed to determine which of the models were suitable for the specific data set. The use of multiple models also helped cross validate the results obtained from different models. The models evaluated are:4 (1) CRS – input oriented without including undesirable outputs; (2) CRS – output oriented without including undesirable outputs; (3) VRS – input oriented without including undesirable outputs; (4) VRS – output oriented without including undesirable outputs; (5) VRS – hyperbolic efficiency measure with scrap treated as an undesirable output (Färe et al., 2002).5

As the list above indicates, both input and output oriented models were included in the comparison. For example, in this data set, if the production and test lot starts are determined by factors outside the control of the manufacturing manager, then the objective should focus on getting more outputs for the same inputs, for a given level of lot starts. In this case, production and test lot starts are treated as non-controllable input variables in the formulation. On the other hand, if the lot completions goal is fixed by the Sales and Marketing organization and more outputs than planned cannot be sold, the objective should focus on reducing the inputs used to produce the determined set of outputs.

The results presented in the next sub-section are based on model 5 in the list above with configuration (a), that is, one set of 51 weeks of data, since that provided the best discrimination among the weekly data using all the included outputs. In general, one would expect that over time the manufacturing facility experiences changes in technology (Girod and Triantis, 1999). In this case, however, based on feedback from the manufacturing enterprise, there were no investments in capital that led to technological changes. Therefore, we considered that the manufacturing processes during the whole time horizon generated the production possibility set and that efficiency differences were due to engineering, manufacturing and managerial practices.

4 The data collected did not support advanced DEA formulations that have the ability to use decision maker specified weights for inputs and outputs.

5 The hyperbolic DEA formulation is provided in the supplement to this paper.

We used a linear approximation (see the on-line supplement to this paper) to the Hyperbolic Efficiency model and programmed it in MS Excel Solver. This linear approximation to the Hyperbolic Efficiency model also allows for the identification of benchmarks (peers) and can incorporate more than one undesirable output (such as scrap) (Färe et al., 2002). Interestingly, not much additional information was gained from separating out the two subsets identified as configuration (b) with model 5 in the above list. This indicated that the shift change instituted by the manufacturing firm did not affect the efficiency performance considerably. The first four models were not as conceptually complete due to the non-inclusion of scrap, the undesirable output variable.
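For readers unfamiliar with the measure, the hyperbolic efficiency idea of Färe et al. (2002) can be summarized as follows (a textbook-style restatement, not the supplement's formulation): for a week with inputs x_o, desirable outputs y_o^g (lot completions, lot steps) and undesirable output y_o^b (scrap), find the largest \(\lambda \ge 1\) such that \((x_o,\ \lambda y_o^{g},\ \lambda^{-1} y_o^{b})\) remains within the production possibility set. A common way to keep the problem linear is the first-order approximation around \(\lambda = 1\),

\[
\lambda^{-1} \;\approx\; 2-\lambda ,
\]

so that the desirable outputs are expanded by \(\lambda\) while the undesirable output is contracted by \((2-\lambda)\) within the same LP.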

4.2. Verification and validation of DEA results

The results of the models were verified, that is, checked for technical correctness, and validated, that is, checked for their ability to correctly reflect the performance of the real process. The usage of verification and validation terminology here parallels that used in the discrete event simulation literature (see, for example, Law and Kelton, 1999).

All the candidate models listed in Sections 4.1.1 and 4.1.2 were subjected to verification. The technical correctness check included a thorough review of the results by the research team, particularly through the peer group analysis. If a DMU was indicated as being inefficient, the differences in its inputs and outputs from those of its efficient peers were checked to ensure that it was indeed inefficient. The verified models were then screened by the research team for their discriminatory power in terms of being able to identify inefficient performers and their peer groups. The screening led to the selection of a recommended model for each of the two manufacturing organizations, as identified in the preceding sub-sections.
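The peer-comparison check described above amounts to verifying the envelopment constraints at the optimum. A minimal sketch, reusing the hypothetical X, Y arrays and the λ weights from the solver sketch in Section 4.1.1:

```python
import numpy as np

def check_against_peers(o, phi, lam, X, Y, tol=1e-6):
    """Sanity check for an inefficient DMU o: the convex combination of its peers
    must use no more of any input and produce at least phi times each of DMU o's
    outputs, as the output-oriented envelopment constraints require."""
    x_composite = lam @ X      # composite inputs of the peer group
    y_composite = lam @ Y      # composite outputs of the peer group
    assert np.all(x_composite <= X[o] + tol), "peer composite uses more input"
    assert np.all(y_composite >= phi * Y[o] - tol), "peer composite falls short on output"
    return x_composite - X[o], y_composite - phi * Y[o]
```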

The recommended models for each organization were then subjected to validation. The results were validated through reviews with the decision makers from the respective organizations. In particular, the plots of efficiency scores over time were checked against variations in the traditional performance measures used by the company. Historical data for periods with major drops in efficiency were considered to ensure that the company management team would identify those periods as poor performers independent of the DEA results. Some of the periods considered were already known as having achieved poor performance, and this contributed to an initial level of comfort with the DEA results. Other periods were not identified previously as poor performers, but with a comprehensive consideration of the input and output factors the decision makers agreed with the DEA results. Similarly, there were periods that were not considered previously as good performers, but were redeemed after the second look prompted by the DEA results. It was realized that these periods achieved their performances under extenuating circumstances that reduced one or more of the input resources. Peer group information was very useful in the assessment of both good and poor performers.

4.3. Efficiency scores, peers and targets

The primary results from a DEA model consist of the efficiency scores for the DMUs being compared and the peer groups for each of the DMUs to enable comparison with peers in similar circumstances. The efficient peers (benchmarks) can be used subsequently to determine best practices. DEA also provides details on identifying the areas of inefficiency. Target values are determined for each inefficient input or output factor for each DMU.

As mentioned in the introduction, the emphasis of the proposed approach is on the underperforming units. Nevertheless, there are analytical approaches in the literature that are focused on distinguishing among the efficient DMUs (Cooper et al., 2006). Ertay


Fig. 3. Efficiency Scores for assembly line manufacturing (uptime as non-controllable variable, VRS output oriented model).

Table 1. Output performance report for inefficient weeks (DMUs) in assembly line manufacturing (values normalized against the maximum for the respective data).

DMU  Efficiency  Output factor for improvement  Actual value  Target value  Difference (absolute)  Percentage (%)  Peers (benchmark weeks)
1    1.0467      Product (P, Q, R)              0.8670        0.9075        0.0405                 4.67            7, 10, 14, 23
2    1.0268      Product (P, Q, R)              0.9413        0.9665        0.0252                 2.68            4, 10
3    1.0015      Product (P, Q, R)              0.9538        0.9552        0.0014                 0.15            4, 10
5    1.0042      Product (P, Q, R)              0.9506        0.9546        0.0040                 0.42            4, 7, 10, 15
6    1.0481      Product (P, Q, R)              0.9273        0.9719        0.0446                 4.81            4, 10
9    1.0252      Product (P, Q, R)              0.9538        0.9778        0.0240                 2.52            4, 10
11   1.0065      Product (P, Q, R)              0.9732        0.9795        0.0064                 0.65            4, 10
13   1.0069      Product (P, Q, R)              0.9874        0.9942        0.0068                 0.69            10, 17, 19
16   1.0319      Product (P, Q, R)              0.9683        0.9992        0.0309                 3.19            10, 17
18   1.0056      Product (P, Q, R)              0.9831        0.9886        0.0055                 0.56            12, 19
20   1.0674      Product (P, Q, R)              0.9300        0.9927        0.0627                 6.74            9, 17, 19
21   1.0658      Product (P, Q, R)              0.9138        0.9739        0.0601                 6.58            12, 23
24   1.0501      Product (P, Q, R)              0.9422        0.9894        0.0472                 5.01            12, 19
25   1.0008      Product (P, Q, R)              0.9897        0.9905        0.0008                 0.08            12, 19


and Ruan (2005) and Ertay et al. (2006) used cross efficiency (Doyle and Green, 1994) and minimax efficiency, respectively, to compare efficient DMUs. Zhu (1996) used super-efficiency to identify the robustness of efficient DMUs. Management may choose to focus on analyzing differences between efficient DMUs. The decision may be based on the characteristics of the manufacturing process and the performance improvement strategy of the manufacturing enterprise. Furthermore, if the approach is adopted and implemented on the manufacturing floor, the amount of data collected will continuously increase, which also means that fewer units will be found efficient by the DEA model. The results of the DEA models for the two organizations are presented below.

4.3.1. Efficiency scores, peers and targets for assembly line manufacturing

The efficiency scores from the DEA analysis for the assembly line manufacturing scenario, using model formulation 4 with input output specification set (a) defined in Section 4.1.1, are shown in Fig. 3. The figure indicates that 14 out of the 25 weeks were "inefficient"6 since they are above the line representing an efficiency of 1 (note that the point for week 25 is only slightly above the line and hence constitutes the 14th inefficient week).

The DEA model used, being of output orientation, primarily provides information on how best to improve the performance of the output factors. Table 1 provides the list of inefficient weeks and the shortfall in production volumes that resulted in their being inefficient. For example, the table indicates that for week 1 to become efficient it would have to increase the joint production of all three products by 4.67%.
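As a quick arithmetic check (ours), the targets in Table 1 are consistent with a radial output expansion by the efficiency score. For week 1,

\[
1.0467 \times 0.8670 \;\approx\; 0.9075,
\qquad
\frac{0.9075-0.8670}{0.8670}\;\approx\; 4.67\% .
\]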

6 In output oriented models, scores above 1 indicate inefficiency. The inverses of these scores were presented as efficiencies to manufacturing personnel for ease of discussion.

Even though the model used was focused on the output orientation, it can identify cases that require reductions in inputs in addition to increases in outputs to match the efficiency of a DMU's benchmarks/peers. The weeks listed in Table 2 below were inefficient due to both producing lower outputs and using the identified inputs at a higher level than that used by the efficient benchmark weeks (peers). For example, Table 2 indicates that week 6 would have been efficient if it had used 9.46% less cost in addition to producing 4.81% more of the combined volume of the three products, as indicated in Table 1. This information can be used to identify practices and procedures that led to the overuse of resources.

4.3.2. Efficiency scores, peers and targets for wafer manufacturing

The efficiency scores from DEA for the wafer manufacturing scenario, using model formulation 5 (hyperbolic efficiency) and configuration (a) defined in Section 4.1.2, are shown in Fig. 4. The figure indicates that 33 out of the 51 weeks were "inefficient", that is, they did not use the input resources to generate the outputs with the same efficiency as the rest of the weeks (please note that several points appearing to be efficient are actually slightly above the line representing an efficiency of 1). Such information can be used by decision makers to identify the causes of inefficient performance and take actions to mitigate these causes in the future. It can also be used on a rolling basis to help decision makers focus on inefficient performances on an exception basis and devote their energies elsewhere when high efficiency is maintained.

Since the selected model was again focused on the output orientation, the primary result from the analysis concerns the outputs that caused the identified DMUs to be inefficient, as shown in Table 3. For example, Table 3 indicates that for week 27 to be deemed efficient it should have achieved 61.60% more lot steps and 82.09% more lot completions, with 61.60% less generation of scrap.
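Again as a check on our reading of the numbers: the week 27 targets in Table 3 are consistent with the linearized hyperbolic measure sketched in Section 4.1.2, with desirable outputs scaled by \(\phi^{*}\) and scrap by \((2-\phi^{*})\):

\[
0.4986 \times 1.6160 \approx 0.8057 \ \ \text{(lot steps)},
\qquad
0.7248 \times (2-1.6160) \approx 0.2783 \ \ \text{(scrap)} .
\]

The lot completions target (0.6313) exceeds the purely radial value \(0.3467 \times 1.6160 \approx 0.5603\), which we read as an additional (non-radial) slack adjustment for that output.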


Table 2. Input performance report for the inefficient weeks (DMUs) in assembly line manufacturing (values normalized against the maximum for the respective data).

DMU  Efficiency  Input factor for reduction  Actual value  Target value  Difference (absolute)  Percentage (%)  Peers (benchmark weeks)
2    0.9739      Cost (material + supply)    0.970725      0.8989        0.0719                 7.40            4, 10
3    0.9985      Cost (material + supply)    0.914024      0.9075        0.0065                 0.71            4, 10
6    0.9541      Cost (material + supply)    0.988188      0.8947        0.0935                 9.46            4, 10
9    0.9754      Cost (material + supply)    0.905201      0.8902        0.0150                 1.66            4, 10
11   0.9935      Cost (material + supply)    1.000         0.8889        0.1111                 11.11           4, 10
16   0.9691      Cost (material + supply)    0.975624      0.9289        0.0467                 4.79            10, 17
21   0.9383      Labor hours                 0.975624      0.9289        0.0467                 4.79            12, 23
24   0.9523      Labor hours                 0.9942        0.9866        0.0077                 0.77            12, 19
25   0.9992      Labor hours                 0.9983        0.9928        0.0055                 0.55            12, 19

Fig. 4. Efficiency scores for wafer manufacturing using hyperbolic efficiency model.

Table 3. Output performance report for the inefficient weeks (DMUs) for wafer manufacturing (values normalized against the maximum for the respective data; partial table; please see the supplement for the full table).

DMU  Efficiency  Output factor for improvement  Actual value  Target value  Difference (absolute)  Percentage (%)  Peers (benchmark weeks)
2    1.5345      Lot steps                      0.4275        0.6559        0.2285                 53.45           18, 34, 44
2    1.5345      Lot completion                 0.2693        0.4133        0.1440                 53.45           18, 34, 44
2    1.5345      Scrap                          0.5780        0.2691        0.3089                 53.45           18, 34, 44
22   1.7569      Lot steps                      0.4040        0.7098        0.3058                 75.69           34, 38
22   1.7569      Lot completion                 0.4040        0.7098        0.3058                 75.69           34, 38
22   1.7569      Scrap                          0.9450        0.2297        0.7153                 75.69           34, 38
24   1.5219      Lot steps                      0.5375        0.8181        0.2805                 52.19           18, 34, 44
24   1.5219      Lot completion                 0.3295        0.5015        0.1720                 52.19           18, 34, 44
24   1.5219      Scrap                          0.6606        0.3158        0.3447                 52.19           18, 34, 44
27   1.6160      Lot steps                      0.4986        0.8057        0.3071                 61.60           18, 34, 38
27   1.6160      Lot completion                 0.3467        0.6313        0.2846                 82.09           18, 34, 38
27   1.6160      Scrap                          0.7248        0.2783        0.4465                 61.60           18, 34, 38
29   1.4390      Lot steps                      0.5332        0.7673        0.2341                 43.90           18, 34, 38, 44, 45
29   1.4390      Lot completion                 0.3734        0.6394        0.2660                 71.23           18, 34, 38, 44, 45
29   1.4390      Scrap                          0.4312        0.2419        0.1893                 43.90           18, 34, 38, 44, 45


This case also demonstrates the capability of DEA to identify multiple factors to be addressed for achieving efficient performance.

Again, in some instances performance improvement may require both an increase in outputs and reductions in inputs to match the efficiency of a DMU's benchmarks/peers. The weeks listed in Table 4, for instance, were inefficient due to the use of the identified inputs at a higher level in addition to the outputs being at a lower level than those of the efficient benchmark weeks (peers). For example, Table 4 indicates that week 27 would have been efficient if it had used 29.48% fewer test lot starts and 14.33% less WIP in addition to achieving the higher outputs defined in Table 3.

Similar to the case of assembly line manufacturing, DEA was successfully applied to the wafer manufacturing company to demonstrate its use for performance measurement. It distilled performance measurement that was based on the behavior of the plant with respect to five input measures and three output measures into one integrated efficiency value. These efficiency values highlighted 33 weeks where the manufacturing facility behaved inefficiently based on the five input and three output measures. DEA also identified targets for specific inputs and outputs that had to be achieved during these inefficient weeks to match the efficiency of their peers.

5. Impact on decision making

As for any performance measurement and/or improvement technique, the value of DEA can be evaluated based on the decision support it provides. Would the decisions change based on the DEA results compared to the decisions made without the benefit of DEA? The DEA results and recommendations were reviewed with the decision makers at the two partner manufacturing organizations. Overall, the feedback was very positive and is summarized below. Note that this study was focused on the use of DEA for tracking the performance of a manufacturing system over time, and the comments below apply to such a purpose.


Table 4. Input performance report for inefficient weeks (DMUs) for wafer manufacturing (values normalized against the maximum for the respective data; partial table; please see the supplement for the full table).

DMU  Efficiency  Input factor for improvement  Actual value  Target value  Difference  Percentage (%)  Peers (benchmark weeks)
2    0.6517      Labor hours                   0.8844        0.6495        0.2349      26.56           18, 34, 44
2    0.6517      Production lot starts         0.6854        0.5843        0.1011      14.75           18, 34, 44
2    0.6517      Test lot starts               0.7034        0.2815        0.4220      59.98           18, 34, 44
22   0.5692      Labor hours                   0.8844        0.7505        0.1338      15.13           34, 38
22   0.5692      Production lot starts         0.6742        0.6317        0.0425      6.30            34, 38
22   0.5692      Test lot starts               0.5931        0.3738        0.2193      36.97           34, 38
24   0.6571      Labor hours                   0.8844        0.8717        0.0127      1.43            18, 34, 44
24   0.6571      Production lot starts         0.7865        0.6053        0.1812      23.04           18, 34, 44
24   0.6571      Test lot starts               0.6000        0.5381        0.0619      10.32           18, 34, 44
27   0.6188      Test lot starts               0.7655        0.5399        0.2256      29.48           18, 34, 38
27   0.6188      WIP (beginning)               0.9966        0.8537        0.1428      14.33           18, 34, 38


5.1. Decision support provided by DEA

Decision makers in the assembly line scenario were line managers inundated with a large number of graphs and charts on multiple performance outputs including production outputs, downtimes, and quality levels. They found the DEA results helpful for identifying the efficient and non-efficient performers as compared to the multitude of charts and graphs in use.

The decision maker in wafer manufacturing was the plant manager. He found the DEA outputs useful for similar reasons, but looked for ways to incorporate measures such as due date performance. He also expressed interest in including long term decision variables such as capacity investments. These desired measures are discussed among the limitations of using DEA in Section 5.2. The DEA results support decision making in several ways, discussed subsequently.

(a) The primary strength of DEA is to identify the "true" efficient and non-efficient performers from among the seemingly good and poor performers based on the numerous reports used in manufacturing. The designation of "true" good performer is admittedly based on the set of input and output variables used in the analysis together with the specific DEA model.

(b) Another key strength of DEA is the ability to identify peer groups for each period (DMU). Put simply, it identifies the other periods in the short term that had the closest similar circumstances as defined by the set of inputs and outputs used in the efficiency calculation. This allows the decision maker and other personnel to identify the best practices that led to efficient performances under a given set of circumstances. These practices can then be implemented whenever similar circumstances occur in the future. For example, in the wafer manufacturing scenario, DEA may be used to identify ways to achieve a better performance through changes in inputs such as holding back some WIP lots or new lot starts. This capability would require additional analysis using the forecasted data for the future periods.

(c) The ability to identify peer groups can also be utilized to set realistic performance targets for future periods. The analysis of past periods together with peer groups identifies the outputs that should have been achieved for a given set of input measures. The use of past periods as benchmarks to set performance targets provides the manufacturing team with much more confidence that the targets are achievable, as compared to targets based on an arbitrarily set percentage improvement over a certain time period or those based on external benchmarks.

(d) DEA efficiency over time plots can be used akin to the statistical process control charts, calling for the decision makers' attention for non-efficient performance below a certain

threshold. Such use will free up the decision makers from analyzing the performance of periods that may appear to be poor performers based on traditional reports. The time saved can be productively used for strategic decisions rather than fire fighting.

(e) The performance tracking over time provides a good visual display to evaluate the impact of process and management improvement efforts, with associated motivational benefits.

(f) The use of DEA promotes objectivity in decision making. The decisions made based on traditional manufacturing reports may be questioned at times by manufacturing personnel since it is hard to identify the good and poor performance, particularly among seemingly mediocre periods. The information on optimum weights takes away the argument of unfair weights being used that is typically levied against decision maker based weighting schemes. Occasionally a decision maker may be able to crystallize the multiple traditional reports to identify the "true" efficient performers, but she/he may be faced with the unenviable task of explaining the logic to the team. DEA removes this apparent subjectivity.

5.2. DEA limitations

DEA models work best when the input and output measures are clearly related to the production process being analyzed. This results in the exclusion of measures, such as due date performance, that are based on policies external to manufacturing, such as sales promotion efforts.

Another limitation is the inability to treat short term and long term measures together. The performance of the system can be compared across periods using DEA as long as the system is fundamentally the same. Long term measures such as capacity investment can be used as long as the system boundaries are defined to encompass a longer time horizon and, consequently, long term measures. It is hard and confusing to incorporate measures at multiple levels of resolution together, i.e., to mix and match both short and long term measures.

A case in point is the comparison of the DEA measure with one of the measures used by the wafer manufacturing company, shown in Fig. 5. The company made a significant change in shift timings about halfway through the observation period that effectively created a small step jump in its capacity: after week 30 it moved from a 5-day workweek to a 7-day workweek. The company performance measure in the graph shows a significant jump following this change. The single input (labor hours) and single output (lot moves) measure did not reflect the movement of the other inputs, such as the WIP, which went up considerably around that time and was of concern to the plant manager. The DEA efficiency measure showed that the performance became more consistent even when taking the multiple inputs and outputs into account. There was a discrepancy between the two measures just after the change from 5-day week to 7-day week operation, where the traditional measure showed a drop in performance but the DEA measure did not. The DEA measure considers multiple inputs and hence determined that those weeks were efficient, since the lower outputs were generated using less of the inputs. The DEA measure did show that the 5-day week mode was capable of generating efficient performance, though with lower frequency. The DEA model thus did not meet the plant manager’s expectation that all of the lower capacity (5-day week) periods would be significantly inefficient. Note that capacity is represented in the model only by the labor hours input variable for the wafer manufacturing case. If the system boundaries were defined to include the enterprise, then inputs and outputs for the whole organization, rather than just the manufacturing area, would be included in the analysis. In that case, the DEA results would probably have shown a different picture, perhaps more in line with the plant manager’s expectation.

Fig. 5. Effect of a major change in the system on DEA and company measures in the wafer manufacturing scenario.

It should be noted that the DEA benchmarks are internally generated based on the past performance of the system. It is possible to become complacent when efficient performance is achieved against these internal benchmarks. It is also not inconceivable that production floor personnel may collude to ensure that the bar is set low. There is therefore some value in conducting external benchmarking periodically to ensure that the performance achieved is among the leading ones.

The DEA calculations do get involved, and it may be hard to explain the logic to a typical manufacturing supervisor. In terms of guidance for improvements, DEA identifies the input and output factors to be improved. The management team then has to determine the changes that may help achieve the improvements; DEA would not provide the actual management changes.

It is acknowledged that part of the difficulty with the majority of DEA applications is that the number of variables typically requires a large amount of data to complete a reasonable analysis. Usually such data requirements exceed the data available. This leads to two complicating issues, i.e., lack of discrimination among efficient DMUs and inherent bias in the efficiency scores. As the number of observations increases, the approximation of the true technology set and the corresponding frontier improves, and thus the bias, which is the difference between the actual efficiency scores and the estimated ones, decreases. In addition, as the number of input/output variables (the dimension of the space) increases, the Euclidean distance between observations increases. As a result, there are fewer nearby observations that can convey information about the portion of the efficient frontier that is of interest. Moreover, an increase in the number of input and output variables requires more observations (DMUs) for constructing the efficient frontier; otherwise the bias of the estimated efficiency scores increases. In the literature, this situation is referred to as the ‘‘curse of dimensionality’’. Nevertheless, in this application we were restricted by the availability of the data provided by the two manufacturing organizations. Furthermore, we experimented with alternative formulations that required fewer input and output variables without compromising the spirit of the manufacturing input/output technology, but without major improvements in the results obtained.
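A commonly cited rule of thumb (see, e.g., Cooper et al., 2006) is that the number of DMUs should be at least max{m × s, 3(m + s)} for m inputs and s outputs. A simple, hypothetical screen of this kind could be automated as follows; the example numbers are illustrative only:

def enough_dmus(n_dmus, n_inputs, n_outputs):
    """Rule-of-thumb screen for the curse of dimensionality (a heuristic, not a guarantee)."""
    required = max(n_inputs * n_outputs, 3 * (n_inputs + n_outputs))
    return n_dmus >= required, required

# Example: 40 weekly periods with 3 inputs and 2 outputs requires at least 15 periods.
ok, required = enough_dmus(40, 3, 2)
print(ok, required)   # True 15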

Finally, the set-up stage of a DEA based system will temporarily require the plant managers to study and provide input on multiple DEA formulations and results. This is to ensure that the final model selected effectively represents the mapping of the virtual world to the real world and that the results, once validated, can be used to inform decisions. However, once an appropriate model has been selected and implemented, the plant managers will gain substantial time savings by using the comprehensive DEA scores rather than poring over numerous reports. To save additional time for the plant managers, a very small subset of the original models can be shortlisted for their evaluation based on feedback from production personnel such as crew chiefs, engineers and manufacturing workers.

5.3. A framework for DEA application to manufacturing performance measurement

The work reported here showed that DEA can be applied for effective manufacturing performance measurement and target setting. The transition of DEA from research projects and papers to the manufacturing shop floor requires the development of a generalized framework for the purpose. A preliminary proposal for such a framework is provided below.

At the conceptual level, there are two major steps required for the implementation of DEA in a manufacturing scenario: first, the set of appropriate inputs and outputs has to be determined, and second, the right DEA model has to be identified. Both steps get quite involved and require assistance from DEA researchers. A decision framework that guides manufacturing personnel through these steps may reduce the requirement for assistance significantly or limit the assistance to the configuration stage. The framework needs to provide a list of inputs and outputs that are commonly used in the associated manufacturing paradigm. The applicability of such a framework may be improved with industry specific lists of inputs and outputs. Typical resource inputs for manufacturing include labor hours, machine time, and materials, while typical outputs include production volume, scrap, and progress on long lead time items, such as the lot steps considered for the wafer manufacturing scenario. Guidance and validation checks can be utilized to ensure that a cohesive set of inputs and outputs is selected; in particular, the selected inputs should have a correlation/association with the selected outputs.
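The correlation/association check could, for instance, be automated with a simple rank-correlation screen over candidate input/output pairs. The data, the 0.3 cut-off and the use of SciPy below are placeholders for illustration, not a prescription from this study:

import numpy as np
from scipy.stats import spearmanr

# Hypothetical weekly observations for one candidate input and one candidate output.
labor_hours = np.array([120., 110., 130., 115., 125., 118.])
lot_moves = np.array([300., 310., 305., 290., 320., 298.])

rho, p_value = spearmanr(labor_hours, lot_moves)
if abs(rho) < 0.3:   # illustrative cut-off only
    print(f"Weak association (rho = {rho:.2f}); revisit this input/output pairing.")
else:
    print(f"Association looks plausible (rho = {rho:.2f}, p = {p_value:.2f}).")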

The selection of a DEA model is more complex. A number of standard models, such as those used in this project, can be provided as options. They may include models with different combinations of constant and variable returns to scale, input and output orientation, and desirable and undesirable outputs. DEA models can be formulated as single stage, with one set each of inputs and outputs, or as multiple stages, with additional sets of intermediate outputs and inputs. Manufacturing personnel should generally be guided to use the single stage models; multiple stage DEA models would require assistance from DEA experts in most cases.
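To illustrate how these model options differ in practice, the sketch below parameterizes the input-oriented envelopment program by returns to scale (constant versus variable) and applies one common device for an undesirable output such as scrap, in the spirit of Seiford and Zhu (2002). The data, function names and use of SciPy are again assumptions for illustration, not the formulations used in the case studies:

import numpy as np
from scipy.optimize import linprog

def efficiency(X, Y, o, rts="crs"):
    """Input-oriented efficiency of period o; rts='vrs' adds the convexity constraint (BCC model)."""
    n, m = X.shape                       # n periods, m inputs
    s = Y.shape[1]                       # s outputs
    c = np.r_[1.0, np.zeros(n)]          # minimize theta over z = [theta, lambdas]
    A_ub = np.vstack([np.c_[-X[o].reshape(m, 1), X.T],   # input constraints
                      np.c_[np.zeros((s, 1)), -Y.T]])    # output constraints
    b_ub = np.r_[np.zeros(m), -Y[o]]
    A_eq = np.hstack([[[0.0]], np.ones((1, n))]) if rts == "vrs" else None
    b_eq = [1.0] if rts == "vrs" else None
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(None, None)] + [(0, None)] * n)
    return res.x[0]

# Hypothetical weekly data: inputs = labor hours, WIP lots; outputs = lot moves, scrap.
X = np.array([[120., 35.], [110., 30.], [130., 40.], [115., 28.], [125., 33.]])
lot_moves = np.array([300., 310., 305., 290., 320.])
scrap = np.array([12., 8., 15., 10., 9.])
# Undesirable output: negate and translate so that larger values are better; the translated
# model behaves properly only under variable returns to scale (Seiford and Zhu, 2002).
good_scrap = -scrap + scrap.max() + 1.0
Y = np.column_stack([lot_moves, good_scrap])

for rts in ("crs", "vrs"):
    eff = [efficiency(X, Y, o, rts) for o in range(len(X))]
    print(rts, "efficient periods:", sum(e > 0.999 for e in eff), "of", len(X))

Counting how many periods each variant rates as efficient gives a quick sense of the discriminatory power discussed in the next paragraph.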

The user should be allowed to experiment and iterate through different sets of inputs and outputs with different models. The models that best trace back to the underlying manufacturing processes and, to the extent possible, limit the number of variables used should be preferred. This allows for higher discrimination, i.e., one can identify a higher number of inefficient units as part of the analysis. This process should help the decision makers identify the model that generates results that agree with the major trends experienced and that provides explainable causes when they do not.


Fig. 6. A framework for DEA application for manufacturing performance measurement and target setting. (The flowchart comprises the following steps: select applicable inputs and outputs for the subject manufacturing system, drawing on a list of standard inputs and outputs; select the set of applicable DEA models from the available DEA models; validate the inputs and outputs using company data; execute the DEA models with company data; verify, validate and analyze the results from the DEA models; select the DEA model for implementation; and integrate the model with operational data systems for regular use.)


The model thus identified can be selected for use as an ongoing performance measurement tool. Fig. 6 summarizes the proposed decision framework for the generalized application of DEA for performance measurement and target setting in manufacturing.

6. Conclusions and future research

This paper evaluated the use of DEA for performance measurement and target setting in two real manufacturing organizations involved in discrete part production. For each of the two organizations, inputs and outputs of interest to the respective decision makers and coherent with the DEA assumptions were selected. Several DEA models were tested to identify the ones that provided the desired discriminatory power and could be validated against the major trends experienced during the respective observation periods.

The results from the selected models, including the efficiency measures over time and the peer groups, were reviewed with the decision makers. The decision makers found the results useful for their ability to distil a number of measures down to a single efficiency measure with objectively determined weights. They also found value in the identification of peer groups for comparing performance among periods with similar circumstances and for identifying the factors contributing to lower efficiencies. Further discussions indicated strong interest in the potential for target setting customized to the circumstances of a period. The impact of DEA results on decision making has been identified. The capability of setting realistic targets based on the circumstances of a particular period offers significant opportunity for the manufacturing system. One option is to execute the DEA analysis including the future periods with their scheduled levels of inputs and output goals; the analysis would indicate whether the goals are set realistically or are overachieving or underachieving relative to the system capability indicated by past performance. Another possibility is to use additional analysis to generate secondary goal levels given the primary goals. For example, for the wafer manufacturing scenario the decision makers may want to set goals for lot completions for future periods and let the analysis determine the levels of lot steps and scrap for efficient performance. Research is needed to determine the best approach for generating performance targets in such cases. One approach is to iterate with an initial set of output values and use the differences within peer groups, as calculated by DEA, to successively modify and verify the targets needed to achieve efficient production. The initial set of output values may be determined using pattern matching to identify past periods with a similar set of input values.
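As a deliberately simplified illustration of the first option above, a proposed goal can be screened against past periods before the full DEA run with the future period appended. The planned values below are hypothetical, and the dominance check shown is only a coarse stand-in for the frontier comparison described above:

import numpy as np

# Hypothetical past weekly data: inputs = [labor hours, WIP lots]; output = lot moves.
X_hist = np.array([[120., 35.], [110., 30.], [130., 40.], [115., 28.]])
Y_hist = np.array([300., 310., 305., 290.])

x_plan = np.array([118., 32.])   # scheduled input levels for the future period
y_goal = 315.0                   # proposed output goal

# A past period dominates the plan if it met the goal while using no more of any input.
dominating = [t for t in range(len(X_hist))
              if (X_hist[t] <= x_plan).all() and Y_hist[t] >= y_goal]
if dominating:
    print("Goal was already met under no-worse input conditions in periods", dominating,
          "- the target may be underachieving; rerun the DEA model with the future period appended.")
else:
    print("No past period dominates the plan; a full DEA run would show whether the goal"
          " lies on, inside, or beyond the frontier implied by past performance.")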

A preliminary framework for guiding the implementation of a DEA based performance measurement system in manufacturing has been proposed. The framework needs to be developed further by populating the information required, including typical inputs and outputs for discrete manufacturing and applicable single stage DEA models. Further work will be required to incorporate multi-stage DEA models into the framework. Multi-stage modeling in the context of the manufacturing enterprise would be used if the researchers wished to undertake a disaggregate evaluation of manufacturing performance, where the various stages of the manufacturing processes could be mapped into a multi-stage DEA formulation. This could be viewed as a linked multi-stage DEA formulation or, if one considers intermediate inputs and outputs between stages, the network DEA approach may be relevant (Färe and Grosskopf, 2000). Additional research is required to automate the validation of the selected inputs and outputs and the selection of the DEA models. Automating these steps will significantly improve the prospects for the use of DEA in manufacturing.

Other potential research directions can focus on removing the limitations of DEA identified above. Approaches need to be developed for incorporating performance measures, such as due date performance, that are not directly related to the conversion process. One option is to include among the output measures the production volumes for identified part groups that form the critical customer orders. This would allow relating production volumes to the periods when they were due and thus indirectly assessing due date performance. The trade-off between the added resolution of such an indirect measure and the additional effort required to track and analyze the outputs should be evaluated. Similarly, ways to integrate short term and long term performance measures need to be studied.

Appendix A. Supplementary data

Supplementary data associated with this article can be found, in the online version, at doi:10.1016/j.ejor.2011.05.028.

References

Banker, R.D., Charnes, A., Cooper, W.W., 1984. Some models for estimating technical and scale inefficiencies in data envelopment analysis. Management Science 30 (9), 1078–1092.

Charnes, A., Cooper, W.W., Rhodes, E., 1978. Measuring the efficiency of decision making units. European Journal of Operational Research 2 (6), 429–444.

Chen, C.-M., 2009. A network-DEA model with new efficiency measures to incorporate the dynamic effect in production networks. European Journal of Operational Research 194 (3), 687–699.

Cook, W.D., Green, R.H., 2004. Multicomponent efficiency measurement and core business identification in multiplant firms: A DEA model. European Journal of Operational Research 157 (3), 540–551.

Cooper, W.W., Seiford, L.M., Tone, K., 2006. Data Envelopment Analysis: A Comprehensive Text with Models, Applications, References and DEA-Solver Software. Kluwer Academic Publishers, Boston/Dordrecht/London.

Cricelli, L., Gastaldi, M., 2002. Efficiency measurement of factories via data envelopment analysis. Systems Analysis Modeling Simulation 42 (10), 1521–1536.

Djerdjouri, M., 2005. Assessing and benchmarking maintenance performance in a manufacturing facility: A data envelopment analysis approach. INFOR 43 (2), 121–133.


Doyle, J., Green, R., 1994. Efficiency and cross-efficiency in DEA: Derivations, meanings and uses. Journal of the Operational Research Society 45 (5), 567–578.

Düzakın, E., Düzakın, H., 2007. Measuring the performance of manufacturing firms with super slacks based model of data envelopment analysis: An application of 500 major industrial enterprises in Turkey. European Journal of Operational Research 182 (3), 1412–1432.

Epstein, M.K., Henderson, J.C., 1989. Data envelopment analysis for managerial control and diagnosis. Decision Sciences 20 (1), 90–119.

Ertay, T., Ruan, D., 2005. Data envelopment analysis based decision model for optimal operator allocation in CMS. European Journal of Operational Research 164 (3), 800–810.

Ertay, T., Ruan, D., Tuzkaya, U.R., 2006. Integrating data envelopment analysis and analytic hierarchy for the facility layout design in manufacturing systems. Information Sciences 176 (3), 237–262.

Färe, R., Grosskopf, S., 2000. Network DEA. Socio-Economic Planning Sciences 34, 35–49.

Färe, R., Grosskopf, S., Zaim, O., 2002. Hyperbolic efficiency and return to dollar. European Journal of Operational Research 136 (3), 671–679.

Girod, O.A., Triantis, K.P., 1999. The evaluation of productive efficiency using a fuzzy mathematical programming approach: The case of the newspaper preprint insertion process. IEEE Transactions on Engineering Management 46 (4), 429–443.

Hoopes, B., Triantis, K.P., 2001. Efficiency performance, control charts and process improvement: Complementary measurement and evaluation. IEEE Transactions on Engineering Management 48 (2), 239–253.

Law, A., Kelton, W.D., 1999. Simulation Modeling and Analysis, 3rd edition. McGraw-Hill Science/Engineering/Math.

Leachman, C., Pegels, C.C., Seung, K.S., 2005. Manufacturing performance: Evaluation and determinants. International Journal of Operations & Production Management 25 (9), 851–874.

Medina-Borja, A., Pasupathy, K.S., Triantis, K., 2007. Large-scale data envelopment analysis (DEA) implementation: A strategic performance management approach. Journal of the Operational Research Society 58 (8), 1084–1098.

Narasimhan, R., Talluri, S., Das, A., 2004. Exploring flexibility and execution competencies of manufacturing firms. Journal of Operations Management 22 (1), 91–106.

Ng, Y.C., Chang, M.K., 2003. Impact of computerization on firm performance: A case of Shanghai manufacturing enterprises. Journal of the Operational Research Society 54 (10), 1029–1034.

Ross, A., Ernstberger, K.W., 2006. Benchmarking the IT productivity paradox: Recent evidence from the manufacturing sector. Mathematical and Computer Modelling 44 (1–2), 30–42.

Ruiz-Torres, A.J., López, F.J., 2004. Using the FDH formulation of DEA to evaluate a multi-criteria problem in parallel machine scheduling. Computers & Industrial Engineering 47 (2–3), 107–121.

Saranga, H., 2009. The Indian auto component industry – Estimation of operational efficiency and its determinants using DEA. European Journal of Operational Research 196 (2), 707–718.

Seiford, L.M., Zhu, J., 2002. Modeling undesirable factors in efficiency evaluation. European Journal of Operational Research 142 (1), 16–20.

Sheu, D.D., Peng, S.-L., 2003. Assessing manufacturing management performance for notebook computer plants in Taiwan. International Journal of Production Economics 84 (2), 215–228.

Talluri, S., Huq, F., Pinney, W.E., 1997. Application of data envelopment analysis for cell performance evaluation and process improvement in cellular manufacturing. International Journal of Production Research 35 (8), 2157–2170.

Talluri, S., Vickery, S.K., Droge, C.L., 2003. Transmuting performance on manufacturing dimensions into business performance: An exploratory analysis of efficiency using DEA. International Journal of Production Research 41 (10), 2107–2123.

Triantis, K., Sarangi, S., Kuchta, D., 2003. Fuzzy pair-wise dominance and fuzzy indices: An evaluation of productive performance. European Journal of Operational Research 144, 412–428.

Zeydan, M., Çolpan, C., 2009. A new decision support system for performance measurement using combined fuzzy TOPSIS/DEA approach. International Journal of Production Research 47 (15), 4327–4349.

Zhu, J., 1996. Robustness of the efficient DMUs in data envelopment analysis. European Journal of Operational Research 90 (3), 451–460.