
Journal of Operations Management 14 (1996) 3-18


The operating impact of parts commonality *

Asoo J. Vakharia a,*, David A. Parmenter b, Susan M. Sanchez c

a Department of Decision and Information Sciences, College of Business Administration, University of Florida, Gainesville, FL 32611, USA
b College of Business and Public Administration, Governors State University, University Park, IL 60466, USA
c School of Business Administration, University of Missouri - St. Louis, 8001 Natural Bridge Road, St. Louis, MO 63121, USA

Received 1 October 1993; accepted 11 February 1995

Abstract

This paper investigates the impact of between- and within-product parts commonality on the workload of a manufacturing firm using an MRP system. More specifically, we investigate the impact of several operational factors and their interactions with parts commonality. We develop and validate a large simulation of an MRP system and integrate the generation of planned order releases with workload estimation on the shop floor. The results indicate that increasing parts commonality has positive effects in terms of average shop load but does lead to greater variability in loadings as well as increasing system disruption. Further, we also find that the number of work centers significantly impacts the shop floor effects of commonality. Hence, although an increase in parts commonality results in less design effort and increased standardization, the negative effects of increasing commonality often appear at the shop floor level. This points to a need for the effective management of parts commonality by assessing the tradeoff between strategic "benefits" and operational "costs".

1. Introduction

The demands of the modern consumer require that manufacturing firms provide a wider variety of products than ever before. A serious problem that often arises due to this demand for wider variety and more new products is parts proliferation, an increase in the number of distinct part types that a firm must fabricate or purchase in order to manufacture these products. Unless standardization efforts have been institutionalized, a designer is likely to design new parts each time a new product is planned rather than trying

* Corresponding author.
☆ This paper is based upon work supported in part by the National Science Foundation under Grant No. DDM-92-15432.

to use as many existing parts as possible. This increase in the number of part types can lead to unnecessary design effort, as well as serious inefficiencies for manufacturing and many of the manufacturing support functions such as purchasing, inventory control and product costing.

In recent plant visits to three furniture manufacturing companies in North Carolina and Virginia, one of the authors observed first hand the managerial problems being addressed as the companies attempted to implement cellular layouts. All three companies were involved in making office, bedroom and standardized furniture (for restaurants, hotels, etc.). In order to implement cellular layouts and gain setup time/cost savings, all the companies were reducing the number of unique parts

0272-6963/96/$15.00 © 1996 Elsevier Science B.V. All rights reserved
SSDI 0272-6963(95)00033-X


by instituting commonality. For example, in one company the door panels for end tables and other bedroom furniture were being standardized, while in another company the distinct number of railings used in different chairs (used in office/standardized settings) was being reduced. Although this substantially facilitated the production planning process (as fewer distinct parts were being manufactured), it led to serious load imbalances on the shop floor: certain key machines through which the common parts were processed were now overutilized, while other machines through which the eliminated parts were previously processed were underutilized.

This research is motivated by similar shop floor problems which arise when commonality is implemented. Although numerous authors, e.g. (Baker, 1985; Baker et al., 1986), have described the benefits of increased parts commonality for product design, there is little research focusing on its operational impact. This study shows that increased parts commonality can lead to some manufacturing benefits, such as a reduction in average workload due to a decrease in setups, and that shop conditions and the degree of commonality determine commonality's impact on workload variability. In addition, increased commonality can have a major impact on what this study terms system disruption.

More specifically, we investigate the effect of increased parts commonality on the operating characteristics of a manufacturing firm which uses material requirements planning (MRP). A simulation model is used to study the impact of commonality on measures such as average shop load and shop load variability. To gain greater insight into the potential of commonality, its effects are observed under a variety of shop conditions. The remainder of this

Table 1
Overview of the relevant literature

Measure development
  Moscato (1976): Measure useful for distinguishing between frequently used parts and those used only in one or two end items.
  Collier (1981): Cardinal aggregate measure of commonality (DCI) for a given product set.
  Wacker and Trevelen (1986): Refinement of Collier's (1981) measure allows comparisons between data sets. Measures of within-product and between-product commonality are proposed.

Choice of common items
  Dogramaci (1979): An optimal (for a special cost function) and heuristic clustering method can determine which end items should be candidates for redesign to make use of common parts.

Safety stock
  Collier (1982): Aggregate safety stock requirements are an inverse function of DCI. However, as McClain et al. (1984) point out, this particular inverse function holds only when product demands are independent.
  Baker (1985): The rate of increase in safety stock declines as demand correlation increases.
  Baker et al. (1986): Replacing two unique parts with a common part results in smaller safety stock requirements for the common part.
  Gerchak and Henig (1986): Results of Baker et al. (1986) hold for three or more products following any joint demand distribution.
  Gerchak et al. (1988): Results of Baker et al. (1986) hold for more than two products following any joint demand distribution providing the costs of the unique components are equal; safety stocks for all unique parts increase under commonality.

Workload impact
  Collier (1982): Increased commonality results in decreased inventory cost and smaller average workload, but increased workload variability.
  Guerrero (1985): Increased commonality results in lower total inventory cost but more variable work-in-process inventory levels. Demand is lumpy for components at lower levels in the product structure.


paper is structured as follows. In the next section we briefly review the relevant literature. This is followed by a discussion of the research methodology in Section 3. In Section 4 the results of our simulation experiment are discussed and finally, in Section 5, the conclusions and implications of this research are presented. Appendix A provides details on the simulation model and issues related to data analysis.

2. Relevant literature

An overview of the major contributions to the literature on parts commonality is provided in Table 1. The focus of the work varies, including measurement development (Collier, 1981; Moscato, 1976; Wacker and Trevelen, 1986), the choice of common items (Dogramaci, 1979), and impacts on safety stock (Baker, 1985; Baker et al., 1986; Collier, 1982; Gerchak and Henig, 1986; Gerchak et al., 1988). In the context of this research, the two most relevant papers are those by Collier (1982) and Guerrero (1985), which examine the impact of parts commonality on workload. Using simulation, Collier (1982) finds that increased parts commonality leads to decreased inventory cost and decreased average workload, with the decrease in workload due entirely to a reduction in the number of setups. On the negative side, he notes an increase in planned workload variability which results from the larger production lots often seen under commonality. In another simulation study, Guerrero (1985) obtains similar results, finding that increased commonality leads to lower total inventory costs but more highly variable work-in-process inventory levels, particularly at lower levels in the product structure. He suggests that a firm with deep product structures and high parts commonality would likely have to deal with extremely lumpy demand requirements at the lower levels. Although the Collier (1982) and Guerrero (1985) results are both intuitive, the authors make little attempt to estimate the impact of commonality under different operating scenarios. Further, the results are sensitive to assumptions regarding the uncertainty in demand, correlation of end-item demand, etc. (McClain et al., 1984).

The research cited above demonstrates that increased parts commonality may represent a tradeoff between lower average shop load and more highly variable load. Although achieving lower average load would provide an obvious benefit, the accompanying higher load variability might cause serious difficulties, particularly for a firm operating near its capacity limit (see, for example, (Bott and Ritzman, 1983; Harl and Ritzman, 1985; Krajewski et al., 1987)). This paper more fully investigates the nature of this tradeoff. The effects of commonality are measured under different demand patterns, setup costs, and lot sizing methods for a large number of end items. In addition, unlike other studies, the overall plant workload is disaggregated and analyzed on a per work center basis. This allows us to observe any unbalancing of individual work center loads which may be caused by the implementation of commonality.

3. Research methodology

3.1. Experimental design

The experimental design formulated in this study focuses on four operating/shop factors (which reflect the manufacturing environment being modeled) and two primary factors (which reflect the degree of between- and within-product commonality) shown in Table 2. In this section, we discuss the experimental design and the performance measures used. We refer the reader to Appendix A for details of the simulation model and issues related to data analysis.

Two separate experiments are conducted: one using the lot-for-lot (LFL) lot sizing technique and the other using the economic order quantity (EOQ) lot sizing method. We do not include lot sizing method as a factor within an individual experiment since we are not focusing on the value of a particular lot sizing method. However, when the results using these two lot sizing strategies differ substantially, we discuss them in detail. It is important to investigate more than one lot sizing strategy as the logic underlying the strategy used can be expected to have a rather large impact on the outcomes observed. LFL, for instance, will require a setup for every order and, hence, should provide the greatest potential for a reduction in average processing time under parts commonality. LFL also generates lot sizes which mirror any variation in end-item demand. EOQ, in contrast, generally results in fewer and larger lots. This may, on the one hand, lead to higher workload variability. On the other hand, however, EOQ's fixed lot sizes will not react in a mirror image fashion to small short-term demand fluctuations. Thus, EOQ may act to subdue end-item demand variability to a certain extent (Collier, 1981; Bott and Ritzman, 1983).

Table 2
Experimental factors used in the study

Factor #  Factor                                    Treatment levels
1         Variance of end item demand (DEMVAR)      Low, High
2         Correlation of end item demand (DEMCOR)   None, High
3         Setup time/cost (SETUP)                   Low, High
4         Number of work centers (WCCOUNT)          1, 10, 30, 50, 150
5         Within-product commonality (WPCOMM)       None, High
6         Between-product commonality (BPCOMM)      None, High
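The LFL/EOQ contrast can be sketched in a few lines. The demand stream, setup cost and holding cost below are illustrative values chosen for the example, not the study's parameters, and the EOQ ordering logic is a simplified textbook version:

```python
import math

def lfl_orders(net_requirements):
    """Lot-for-lot: one order per period, sized exactly to that period's need,
    so planned orders mirror any variation in demand."""
    return list(net_requirements)

def eoq_orders(net_requirements, setup_cost, holding_cost):
    """EOQ-based lot sizing: release a fixed economic quantity whenever
    projected on-hand inventory cannot cover the period's requirement."""
    demand_rate = sum(net_requirements) / len(net_requirements)
    q = round(math.sqrt(2 * setup_cost * demand_rate / holding_cost))
    orders, on_hand = [], 0
    for req in net_requirements:
        order = 0
        while on_hand < req:      # a large requirement may need several lots
            order += q
            on_hand += q
        orders.append(order)
        on_hand -= req
    return orders

reqs = [90, 110, 95, 105, 100, 100]   # illustrative weekly net requirements
print(lfl_orders(reqs))               # six setups; lots mirror demand exactly
print(eoq_orders(reqs, setup_cost=50, holding_cost=0.1))  # two larger lots
```

With these numbers, EOQ releases 316-unit lots in only two of the six weeks, illustrating why it requires fewer setups but produces lumpier releases than LFL.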

Factors 1 and 2 focus on end-item demands. Factor 1 is "variability of end-item demand" (DEMVAR), which is expected to be one of the main determinants of workload variability but was not examined by either Collier (1981, 1982) or Guerrero (1985). We investigate how demand variability moderates the impact of commonality on workload variability by incorporating a trapezoidal distribution with a mean of 100 for the distribution of demand per end item per week. The standard deviation associated with low demand variability is 4.1776 (demand ranging from 90 to 110) and that associated with high demand variability is 41.776 (demand ranging from 0 to 200).

Factor 2 is "correlation of end-item demand" (DEMCOR), which refers to correlation between the demands for different end items within a given week, not to serial correlation across weeks. As discussed earlier, prior research has noted the importance of positively correlated end-item demand when addressing issues related to parts commonality. Low and high correlation refer to end-item demand correlations of zero and 0.7071, respectively. Mathematically, we generate demand for end item i in week t as follows:

D_it = 100 + (0.5 − U_it)K_1 + (0.5 − U′_it)K_2.   (1)

The parameters K_1 and K_2 are determined based on the trapezoidal distribution parameters and chosen to match the demand means and standard deviations to those specified for Factor 1. Thus, K_1 = 12.17 and K_2 = 7.83 when we investigate low demand variability, while K_1 = 121.68 and K_2 = 78.32 when we assume high demand variability. End-item demand correlation is determined by the uniform random numbers generated. To achieve zero end-item demand correlation, all U_it and U′_it are independent, while use of a common uniform random number (U_it = U_t for all i) produces an end-item correlation (ρ) of 0.7071, computed as ρ = K_1²/(K_1² + K_2²).
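Eq. (1) can be implemented directly. The sketch below uses the study's high-variability parameters (K_1 = 121.68, K_2 = 78.32); the random-number handling (Python's generator with a fixed seed) is our own assumption, since the paper does not specify a generator:

```python
import random

def gen_demand(n_items, n_weeks, k1, k2, correlated, seed=42):
    """Weekly demand per Eq. (1): D_it = 100 + (0.5 - U_it)*K1 + (0.5 - U'_it)*K2.
    Sharing one uniform draw U_t across items each week induces an end-item
    correlation of rho = K1^2 / (K1^2 + K2^2)."""
    rng = random.Random(seed)
    demand = [[0.0] * n_weeks for _ in range(n_items)]
    for t in range(n_weeks):
        u_common = rng.random()                  # U_t, shared when correlated
        for i in range(n_items):
            u1 = u_common if correlated else rng.random()  # U_it
            u2 = rng.random()                              # U'_it, always independent
            demand[i][t] = 100 + (0.5 - u1) * k1 + (0.5 - u2) * k2
    return demand

def corr(x, y):
    """Pearson correlation of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    sx = (sum((a - mx) ** 2 for a in x) / n) ** 0.5
    sy = (sum((b - my) ** 2 for b in y) / n) ** 0.5
    return cov / (sx * sy)

d = gen_demand(2, 20000, 121.68, 78.32, correlated=True)
print(corr(d[0], d[1]))   # approaches K1^2/(K1^2 + K2^2) ≈ 0.7071
```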

Factor 3 is "setup time" (SETUP), which is included for several reasons. It is reasonable to expect that a high setup time would increase average processing requirements as well as increasing the size of the potential savings in average processing available through the use of more common parts. Setup time also may impact workload variability. A high setup time (and corresponding cost) increases the economic lot size when using EOQ, one of the two lot sizing techniques used in this study. These larger lot sizes may result in greater workload variability. If so, the major question of interest is how the impact of commonality on workload variability might be altered by the changing lot sizes. For each part (including end items), "low" setup times (in hours) are generated from a uniform distribution with parameters (0.5, 20). "High" setup times are twice as long as "low" setup times. The values of these parameters are selected such that the ratios of setup to run times (specified in Appendix A) for parts are in line with those in actual industry (Krajewski et al., 1987).

Factor 4 is "number of work centers" (WCCOUNT). This factor, which has been ignored in prior research, recognizes that the actual production is performed at the work center level. Its effect is of practical importance if, for example, an interest in cellular manufacturing prompts a shop to disaggregate machines from a few large work centers into several smaller work centers. Further, by changing the number of work centers in the shop, we are also varying the degree of machine flexibility (i.e., the number of parts processed per work center). This type of flexibility, although ignored in prior research on commonality, has been recognized as a means of dampening workload variability in MRP systems (Bott and Ritzman, 1983; Harl and Ritzman, 1985).

The number of work centers in our study is fixed at five levels: 1, 10, 30, 50 and 150. With a total of 150 parts processed in the total shop, the extreme cases of 1 and 150 work centers imply zero work center specialization (i.e., all parts are processed at a single center or a completely flexible shop) and maximum work center specialization (i.e., each work center processes one part or a completely inflexible shop), respectively. The choices of the other levels of this factor are included to assess different degrees of work center specialization (or flexibility). Given the MRP system results in (Bott and Ritzman, 1983) and (Harl and Ritzman, 1985), we contend that it is at the work center level that measures such as aver- age workload and workload variability have the most managerial significance. Consider, for instance, a simplified situation in which a plant has two work centers with infinite capacity. In one scenario, work center 1 has processing requirements of 100 hours for each of the next two weeks and work center 2 has processing requirements of 50 hours for each week. In another scenario, work center 1 has pro- cessing requirements of 120 hours and 80 hours while work center 2 has processing requirements of 30 hours and 70 hours. If workload is only analyzed on a plant wide basis, both scenarios result in weekly planned workloads of 150 hours in each of the two weeks, implying a very smooth workload. Analyzed at the work center level, however, the second sce- nario obviously results in a great deal of workload variability. Hence, assuming infinite capacity, in- creasing the number of work centers should increase workload variability. Note that in actual practice, some degree of load balancing will naturally occur since work that cannot be scheduled for completion in week t (due to capacity constraints) will in all likelihood be offloaded to a later week depending

upon the availability of capacity in future time peri- ods (assuming, of course, that overtime is not used). In other words, the actual workload across work centers gets somewhat balanced in the presence of finite capacity constraints.
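The two-scenario argument is easy to verify numerically. The sketch below transcribes the second scenario's numbers from the text:

```python
import statistics

# Second scenario: hours per week for each of two work centers
loads = {1: [120, 80], 2: [30, 70]}

# Plant-wide weekly totals look perfectly smooth...
plant_totals = [sum(series[t] for series in loads.values()) for t in range(2)]
print(plant_totals)        # [150, 150]: zero plant-level variability

# ...yet each work center sees a 40-hour week-to-week swing.
for wc, series in loads.items():
    print(wc, round(statistics.stdev(series), 2))   # sample std dev ≈ 28.28 hours
```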

Factors 5 and 6, the primary factors, define the degree of commonality, with Factor 5 denoting within-product commonality (WPCOMM) and Factor 6 denoting between-product commonality (BPCOMM). For both factors the low factor level reflects no commonality. The setting at the high factor level, using the TCCI measure of Wacker and Trevelen (1986), is 0.2308. When both factors are simultaneously at their high levels the overall degree of commonality is 2(0.2308) = 0.4616. The values 0.2308 and 0.4616 represent moderate levels of commonality in comparison to those used by Collier (1982), which range from a low of 0 to a high of 0.6207. The procedures used to institute both types of commonality are illustrated in Fig. 1. Panel A of the figure shows the product structure for end item 1001 under the no commonality condition. Panel B shows the product structure under the high within-product commonality condition: three parts appear twice in the product structure, with 3001, 4001 and 4002 replacing parts 3003, 4005 and 4006. Panel C is an example of how between-product commonality is introduced. For end item i, one level-two parent and its associated components (i.e., three parts) are duplicated in the product structure for end item i + 1 (see "To 1002" in Panel C) while another common level-two parent and its associated components are shared with end item i − 1 (see "From 1010" in Panel C). This pattern continues through all ten end items. The run time, setup time and holding cost for a common part are respecified as the average of the respective parameters for the original part and the part eliminated. Hence, the average workload for the complete shop is not affected by the introduction of commonality.

3.2. Performance measures

The work of Collier (1981) and Guerrero (1985) suggests that increased parts commonality will result in a tradeoff between lower average workload and greater workload variability. A lower average workload could have a tremendous beneficial impact on a

manufacturing firm, particularly a firm operating near its capacity limit, while greater workload variability, conversely, could have a tremendous negative impact (see, for example, the results in (Harl and Ritzman, 1985)). The performance measures used in this study thus consider average workload, workload variability, and overall system disruption. Descriptions and further motivation follow.

[Fig. 1. Within-product and between-product commonality. Panel A: no commonality (end item 1001 with parents 2001-2002, level-two parts 3001-3004, and components 4001-4008). Panel B: within-product commonality. Panel C: between-product commonality, with branches linked "To 1002" and "From 1010".]

Average processing time required per work center per period (PROCAVG)
This measure assesses the average load, where capacity requirements are calculated on a per work center basis rather than on a plant-wide basis. We remark that the "processing time required" component of PROCAVG's definition refers to setup time plus run time. PROCAVG is simply the arithmetic average of processing requirements, including both setup times and run times, averaged across the number of work centers and time periods.

Average standard deviation of work center processing time (PROCSD)
This measure assesses load variability. To compute it, the sample standard deviation is first calculated for each work center based on its processing requirements per period across all time periods. PROCSD is obtained by averaging these values over the number of work centers used.
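Under the assumption that the simulation output is a matrix of hours per work center per week, the two measures reduce to a few lines (the load matrix here is illustrative, not simulation output):

```python
import statistics

def procavg(loads):
    """Arithmetic mean of processing requirements (setup plus run time)
    across all work centers and all periods."""
    return statistics.mean(x for series in loads for x in series)

def procsd(loads):
    """Sample standard deviation per work center across periods,
    then averaged over work centers."""
    return statistics.mean(statistics.stdev(series) for series in loads)

loads = [[120, 80], [30, 70]]          # hours per work center per week
print(procavg(loads))                  # mean load per work center per period
print(procsd(loads))                   # average within-center variability
```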

System disruption (SYSDIS1 and SYSDIS2)
Given that the simulation used in this research assumes infinite capacity, costs which result from changes in work center processing requirements should not be ignored. We use two measures of system disruption: SYSDIS1 and SYSDIS2. SYSDIS1 is computed as

SYSDIS1 = Σ_{i=1}^{m} [(X̄_i − μ_nc)² + s_i²],   (2)

where

m = number of work centers; i = 1, ..., m,
T = number of time periods (weeks); t = 1, ..., T,
μ_nc = expected average processing requirements with no commonality,
X_it = total load on work center i in week t,
X̄_i = average processing requirements for work center i = (Σ_{t=1}^{T} X_it)/T,
s_i² = week-to-week variance of processing requirements of work center i = Σ_{t=1}^{T} (X_it − X̄_i)²/(T − 1).

The values for X̄_i and s_i² are determined for each work center from the data obtained during each simulation run. The value μ_nc is determined from the product structure information. It represents the average processing requirements per week for the average work center. In this study, as will be discussed subsequently, parts are assigned random setup and run times and then randomly assigned to different work centers (in each simulation run, the assignment of parts to work centers is, of course, held constant) such that each work center completely processes an equal number of parts. Thus, with the exception of a few outliers, work centers should have approximately the same workload per week.

SYSDIS1 is based on the "bias squared plus variance" logic of the standard statistical measure mean squared error, but is conceptually a new measure. Its first component, the squared average deviation from the control case of no commonality, represents the system disruption caused by changes in average work center load when commonality is instituted. In general, increased commonality should increase SYSDIS1, since a common part work center's processing requirements will rise above μ_nc and an eliminated part work center's processing requirements will fall below μ_nc. The second component, the time-based variance of each work center's loadings, represents the system disruption caused by week-to-week fluctuations in processing requirements. Any factor which increases this variability will act to increase SYSDIS1. This second component is included because imbalances in work center loadings over time will require increased managerial effort in terms of determining staffing requirements. Thus the new measure attempts to combine both the changes in average workload due to commonality and the fluctuations in workload over time.

Note that as the number of work centers increases, the elimination of distinct parts through the introduction of commonality may actually result in zero loadings at certain work centers. In such a case, s_i² will be zero, but including these work centers in the computation of system disruption is questionable, since by completely eliminating these work centers we could gain the advantages of a reduction in the degree of shop floor control required. Further, if this is done intelligently, we could always target work centers with poor quality output or work centers which require high setup times for such an elimination (i.e., replace parts processed at such work centers by common parts processed at other work centers). Thus, we also propose a second measure of system disruption, SYSDIS2, which is computed as follows:

SYSDIS2 = Σ_{i=1; X̄_i>0}^{m} [(X̄_i − μ⁺_nc)² + s_i²].   (3)

Here, μ⁺_nc is the expected average processing requirement over the non-eliminated workstations (i.e., those with nonzero loadings) under a condition of no commonality.
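A sketch of Eqs. (2) and (3), assuming loads arrive as a list of weekly hour series per work center; μ_nc and μ⁺_nc are supplied as constants here, whereas the study derives them from product structure information:

```python
import statistics

def sysdis(loads, mu_nc, mu_nc_plus):
    """SYSDIS1 (Eq. 2) sums squared bias plus week-to-week sample variance
    over all work centers; SYSDIS2 (Eq. 3) restricts the sum to centers with
    nonzero load and measures bias against mu_nc_plus instead of mu_nc."""
    s1 = sum((statistics.mean(x) - mu_nc) ** 2 + statistics.variance(x)
             for x in loads)
    s2 = sum((statistics.mean(x) - mu_nc_plus) ** 2 + statistics.variance(x)
             for x in loads if statistics.mean(x) > 0)
    return s1, s2

# Illustrative data: commonality emptied work center 3 entirely
loads = [[120, 110, 130], [95, 105, 100], [0, 0, 0]]
s1, s2 = sysdis(loads, mu_nc=100, mu_nc_plus=100)
print(s1, s2)   # the emptied center's squared bias inflates SYSDIS1 only
```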


Holding cost (HOLDING)
Although this variable does not concern capacity requirements, it is included to reexamine a surprising result from Collier (1981), who finds that commonality has no significant impact on holding cost when using EOQ. We remark that in our simulation, holding costs are a function of the product structure level, reflecting value added, but are not explicitly related to the processing time (see Appendix A).

4. Results

We analyze the data using a weighted least squares regression procedure, which results in more efficient estimates when the error variance is not constant (for details, the reader is referred to Appendix A). All the regressions account for a substantial portion of the variation in the performance measures (R² values are between 0.9670 and 0.9999). We feel that this validates the choice of the shop factors in our simulation.

The remainder of this section is organized as follows. First, we discuss in detail the results for the four performance measures PROCAVG, PROCSD, SYSDIS1 and SYSDIS2, pointing out differences (if they exist) due to the lot sizing rule. We remark that for every one of these performance measures, at least four of the fifteen potential two-way interaction terms are statistically significant under both the LFL and the EOQ scenarios. Given the presence of interaction effects, interpretation of the main effects alone is difficult; hence some of the results presented aggregate cells over different factors. However, since our interest is in the effects of commonality, we attempt to separate the results of within- and between-product commonality. Of particular interest are comparisons of the effects of introducing each type of commonality with the control case of not instituting that particular type of commonality. Hence, for illustrating the effects of within-product (between-product) commonality, we compute the mean value of a performance measure with a high level of within-product (between-product) commonality across all other levels of every experimental condition and compare it to the mean value of the performance measure with a low level of within-product (between-product) commonality across all other levels of every experimental condition. Finally, we briefly discuss the results relating to the HOLDING (costs) performance measure under the EOQ lot sizing rule.

4.1. Average processing time required per work center per period

As would be expected, neither DEMVAR nor DEMCOR significantly impacts the average processing time performance measure under either lot sizing rule. Both these variables simply change demand patterns for end items, without affecting the average demand and hence the average loadings in the shop. Under both lot sizing rules, there were four factors with statistically significant main effects on PROCAVG: the two types of commonality, the setup time, and the number of work centers. Further information regarding their effects follows.

Under LFL, regardless of the number of work centers, the introduction of within-product or between-product commonality (aggregated across the DEMVAR, DEMCOR and SETUP factors) reduces PROCAVG by between 2.54% and 2.60% from the control case of no commonality. Introducing both types of commonality thus reduces PROCAVG by between 5.11% and 5.54%. Such decreases can be quite beneficial to a firm which may have been operating close to capacity limitations prior to the introduction of commonality. This result supports those of Collier (1982) and Guerrero (1985), since all argue that there is a significant reduction in average load (assessed in terms of costs or time) under commonality.

In contrast to the LFL results, PROCAVG (aggregated across the DEMVAR, DEMCOR and SETUP factors) increases only slightly when either type of commonality is introduced under EOQ. The magnitudes of the percentage changes in PROCAVG are slightly more variable but substantially lower than those under LFL, ranging from an increase of 0.10% to an increase of 0.22%. Introducing both types of commonality increases PROCAVG by between 0.20% and 0.35%. The small differences seen under commonality can be attributed to the fact that the eliminated parts come from low levels in the product structure. They have low holding costs and large order quantities and are therefore ordered infrequently. Thus the removal of the eliminated parts under commonality does not remove many setups. Furthermore, the number of orders placed and setups required for the common part increases: as the total demand for the common part doubles, the order quantity increases by a factor of √2. This partially offsets the decrease in setups gained for the eliminated part. While statistically significant, these differences appear to be of little practical importance.
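This √2 effect is just the square-root form of the economic order quantity, Q* = sqrt(2DS/h): doubling the demand rate D multiplies Q* (and hence the setup frequency D/Q*) by sqrt(2) rather than 2. A minimal sketch with illustrative numbers (not the simulation's parameters):

```python
import math

def eoq(demand, setup_cost, holding_cost):
    """Classic economic order quantity: Q* = sqrt(2 D S / h)."""
    return math.sqrt(2 * demand * setup_cost / holding_cost)

# Two parts with demand D each are merged into one common part with demand 2D.
q_single = eoq(100, 500, 1.0)   # order quantity before commonality
q_common = eoq(200, 500, 1.0)   # order quantity for the common part

# The order quantity (and the number of setups per period) grows by sqrt(2),
# not 2, which partially offsets the setups saved on the eliminated part.
print(q_common / q_single)      # -> 1.4142...
```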

The substantial impact of the number of work centers on PROCAVG is directly attributable to the fact that WCCOUNT is used as a divisor in the computation of PROCAVG. Thus, the statistically significant main effect is of little practical value. However, the significant interaction between commonality and setup time under both lot sizing scenarios deserves mention. As would be expected, both within-product and between-product commonality provide the potential for greater improvement when setup times are larger. In an actual application, commonality does facilitate a reduction in total setup time, since the number of parts to be handled is smaller. Further, increased commonality should also encourage increased standardization and rationalization of the production process, leading to a reduction in the time needed to perform the setups for the common parts.

4.2. Average standard deviation of work center processing time

The PROCSD performance measure is significantly impacted by all six factors under EOQ, and by all but SETUP under LFL. The different results for the factor SETUP occur because changes in SETUP lead to changes in lot sizes under the EOQ scenario and, hence, increased load imbalances. However, lot sizes under the LFL scenario are not affected by changes in SETUP; thus there is no significant impact of SETUP on PROCSD.

The results for both types of commonality (our primary factors of interest) are summarized in Fig. 2, which shows the percentage changes in the cell means aggregated across the DEMVAR, DEMCOR and SETUP factors for both lot sizing scenarios. In general, the following effects can be observed.

First, note that PROCSD generally increases under the introduction of commonality as the number of work centers increases from 1 to 50, and then decreases. This occurs because of two counter-effects. One is due to the implicit work center specialization: the larger the number of work centers, the smaller the number of parts processed per work center (i.e., there is lower machine flexibility). This tends to increase PROCSD. Conversely, when the institution of commonality results in a zero load for certain work centers, their corresponding load variation is also zero, which serves to decrease PROCSD. The former effect dominates when the work centers initially process more than one part, since the introduction of commonality is less likely to result in centers with zero work loadings. The latter effect dominates in the 150 work center case, where each work center is completely specialized before the introduction of commonality.

Fig. 2. Percentage changes in PROCSD (x-axis: number of work centers, 1 to 150; series: LFL-WP, LFL-BP, EOQ-WP, EOQ-BP).

Second, the higher PROCSD values under the within-product commonality condition as compared to the between-product commonality condition can be explained as follows. In the latter case, the common part replaces two parts which are used for two distinct end items, while in the former case the common part replaces two parts used for the same end item. Given that the demands for distinct end items are not perfectly correlated, the demand variability is higher when within-product commonality is introduced as compared to between-product commonality.

Finally, with the exception of the 150 work center condition, the effect of commonality is surprisingly unimportant under EOQ. The EOQ results demonstrate such tremendous variability, both with and without commonality, that the marginal effect of commonality, although larger in absolute terms than with LFL, seems fairly trivial. If a manager of a firm using EOQ were concerned with excessive processing variability, the best advice might simply be to stop using EOQ! The overall results for commonality under the one work center condition show an increase from 745.00 for no commonality of either kind to 756.49 for total commonality. These results agree with those of Collier (1981), despite being less dramatic.

Clearly, increasing demand variability and end-item demand correlation will increase PROCSD under either lot sizing scenario, and so the statistical significance of these factors in our experiment is not surprising. The effects are more pronounced under LFL, hence only these are discussed in detail. The aggregate cell means for PROCSD as a function of low and high demand variability are shown in Table 3. This reveals that the values of PROCSD for the high variance condition are almost exactly ten times the values of the low variance condition, due to the fact that the standard deviation of demand in the former condition is exactly ten times that of the latter condition. Of more interest is the effect of the number of work centers on the processing time standard deviation. The rate of decline in processing time standard deviation per work center is proportionally less rapid than the rate of increase in the number of work centers. For example, consider the value 288.07, the standard deviation for the one work center condition, and the value 5.27, the average standard deviation per work center when the plant consists of 150 work centers. Noting that (5.27)(150) = 790.50 is greater than 288.07, this suggests that the total variability with which the system has to deal increases as the number of work centers increases. This is not surprising when one considers the well-known smoothing effect of aggregation. The increased variability seen with multiple work centers is the effect of an uneven workload due to disaggregation.

Table 3
PROCSD by end-item demand variability

Number of      Demand variability
work centers   Low        High
1              28.96      288.07
10             3.66       36.53
30             1.62       16.15
50             1.14       11.37
150            0.53       5.27
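The aggregation argument above can be illustrated numerically. The sketch below (our own illustration, not the paper's simulator) generates independent part-level workloads: the standard deviation of the single aggregate center's load grows only as the square root of the number of parts, while the summed per-center standard deviations under full specialization grow linearly, so disaggregation inflates the total variability the system must absorb:

```python
import numpy as np

rng = np.random.default_rng(0)

# 2000 simulated weeks of workload for 150 parts, independent noise per part.
loads = rng.normal(loc=100.0, scale=10.0, size=(2000, 150))

# One work center processes everything: the noise partially cancels (smoothing).
sd_one_center = loads.sum(axis=1).std()

# 150 fully specialized centers: each faces its own part's full variability.
sd_specialized_total = loads.std(axis=0).sum()

# Roughly 10 * sqrt(150) ~ 122 for the aggregate center versus roughly
# 150 * 10 = 1500 in total for the specialized shop.
print(sd_one_center, sd_specialized_total)
```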

As with end-item demand variability, end-item demand correlation also has a significant impact on processing time standard deviation. The aggregated cell means for both conditions of DEMCOR (under LFL) on PROCSD are shown in Table 4. Note that while PROCSD decreases as the number of work centers increases, the decrease is not proportional to increases in the number of work centers. Hence, the greater the number of work centers, the greater is the variability in demand to which the system must respond. Further, the proportional increase in PROCSD under both low and high end-item demand correlation decreases as the number of work centers increases. This can be explained as follows. In order for end-item demand correlation to impact PROCSD, a work center must produce parts which are not only from different end items but also from the same level in the respective product structures (since all parts have the same one-week lead time). This is less likely to happen when there are many work centers, because each of these many work centers is then responsible for producing fewer parts. The presence of any effect at all for the 150 work center condition, where each work center is responsible for only one part, occurs because under commonality the same part may be used in multiple end items.

Table 4
PROCSD by end-item demand correlation

Number of      Demand correlation      Change in
work centers   Low        High         PROCSD (%) a
1              85.52      231.51       +170.71
10             14.71      25.47        +73.15
30             7.68       10.08        +31.25
50             5.76       6.76         +17.36
150            2.84       2.96         +4.22

a Computed as 100(PROCSDhigh − PROCSDlow)/PROCSDlow.

These results for PROCSD agree with and add to those of Collier (1981), who found that commonality had no impact on processing time variability under LFL for the one work center case. For multiple work centers, however, this study shows that commonality can increase or decrease variability in processing requirements, depending on the conditions and type of commonality, even when using LFL. This result is also a function of the fact that, in our experiments, the introduction of commonality leads to a reduction of parts but not to a reduction of work centers. Thus, under the multiple work center scenario, PROCSD will definitely be impacted by the introduction of commonality.

The results showing an increase in PROCSD due to within-product commonality are probably of lesser managerial importance than those due to between-product commonality. The effect observed depends on both sources of demand for the common part generating requirements which must be produced in the same week. In a real-world situation, where lead times are not constant at one week and the common parts do not always appear at the same level in the product structure, the effect of within-product commonality on PROCSD would probably be minimal.

4.3. System disruption

System disruption is measured using two variables, SYSDIS1 and SYSDIS2. All main effects are statistically significant for these measures under both lot sizing scenarios. Fig. 3 shows that commonality has an extremely large impact on system disruption. Note that the two system disruption measures are equal when no work centers are eliminated, as happened for the 1 and 10 work center cases in our simulations. Given the prior discussion of the PROCSD measure, the proportional impact is much smaller under the EOQ scenario than under the LFL scenario.

Fig. 3. Percentage changes in SYSDIS1/SYSDIS2 due to commonality (x-axis: number of work centers, 1 to 150; series: LFL-SYS1, LFL-SYS2, EOQ-SYS1, EOQ-SYS2).

Table 5
Squared deviation as a percentage of SYSDIS1 and SYSDIS2

Number of      Squared deviation percentage
work centers   LFL                    EOQ
               SYSDIS1    SYSDIS2     SYSDIS1    SYSDIS2
1              10.56      10.56       0.06       0.06
10             57.97      57.97       2.63       2.63
30             77.52      73.81       4.16       3.52
50             78.23      76.10       3.55       3.20
150            82.00      70.41       3.87       2.30

These results can be better understood by considering the separate impacts of the average deviation and variance terms in computing SYSDIS1 and SYSDIS2. The contribution of the squared deviation component as a percentage of overall system disruption is given in Table 5. Under LFL, the squared deviation term grows dramatically in importance (to a maximum of 82%) as the number of work centers increases. Under EOQ, however, this term reaches a maximum of only around 4.16% as the number of work centers increases. Hence, there is a significantly greater impact of the average deviation term under the LFL scenario.

A final point is that when the number of work centers increased from 50 to 150 (i.e., where 150 work centers represents a highly specialized shop), a decrease in workload variability (measured in terms of PROCSD) was observed when commonality was instituted. A similar effect is again observed for SYSDIS2 when we institute commonality for the 150 work center shop (for both LFL and EOQ), since this measure is computed without considering work centers with "zero" loadings.

Given these results, under LFL a manufacturing manager's primary task with respect to commonality is the supervision of a series of one-time adjustments to work center capacity, rather than having to worry about permanently managing a long-term increase in processing variability. On the other hand, under EOQ managerial efforts should be focused on reducing setup costs (in order to reduce lot sizes) and developing a flexible work force which can operate multiple work centers.

Each time commonality is introduced, various personnel and equipment might have to be reassigned to alternate work centers. The cost of these reassignments should be considered in making decisions about common parts. Intelligently instituted commonality should minimize disruption and/or maximize the potential benefits. For instance, selecting common parts to replace eliminated parts made at the same work center would minimize disruption. If eliminated parts are to come from other work centers, eliminate parts from expensive and hard-to-manage centers, with the ultimate goal, as suggested by the results for SYSDIS2, being the total elimination of the work center.

4.4. Holding cost

The results for commonality under EOQ are as follows. When within-product commonality is introduced, holding costs decrease by 9.63% from $1447.75 to $1308.26; when between-product commonality is introduced, holding costs drop by 9.76% from $1448.69 to $1307.32. These changes in holding costs are both statistically significant and of practical importance. In both cases, commonality leads to the expected decrease in holding costs. Note also that the type of commonality introduced does not significantly affect the magnitude of the decrease. This suggests that Collier's (1981) insignificant result for holding cost under EOQ was perhaps due to the small sample size used in his experiment. Although it could be argued that these results are a function of our definition of holding costs (i.e., they do not explicitly account for processing times; see Appendix A), similar results are also described by Harl and Ritzman (1985). They found that, in addition to machine flexibility and workload variation, lot sizes also had a significant impact on bottom-line measures such as past-due demand and inventory costs. From an operational perspective, the decline in inventory level itself may be more important than any reduction in actual holding expense. Reduced work in process will in general mean less material handling, shorter queues and smoother flow. This can lead to reduced lead times and better customer service.

5. Conclusions and implications

This paper has investigated the impact of commonality on a manufacturing firm using an MRP system. The major implications of this study are:
- Commonality can lead to a significant decrease in shop load, particularly when using a lot sizing method like LFL which requires frequent setups.
- Commonality can also lead to a significant reduction in holding cost for a firm using EOQ. The decrease in work-in-process should provide much more potential benefit to a manufacturing firm than just a reduction in holding cost.
- On the negative side, commonality may also lead to an increase in load variability under certain conditions. This increase can be quite small if commonality is used judiciously to eliminate entire work centers.
- The results for system disruption also demonstrate two important points. First, commonality has a significant impact on the squared deviation component of system disruption, which constitutes a major portion of system disruption when using LFL. This suggests that care should be taken when instituting commonality to select both the common and eliminated parts such that the expense involved in the required one-time resource reallocation is reasonably low. Second, the increase in the size of the differences between SYSDIS1 and SYSDIS2 seen as the number of work centers approaches 150 reemphasizes the potential benefits of eliminating work centers. The strategic selection of which parts to make common should allow a firm to temper the increase in system disruption caused by the institution of commonality.

Finally, note that this study has not explicitly accounted for other hidden benefits of commonality. For example, the institution of commonality should lead to increases in standardization and, hence, repeatability. This could lead to process improvements (such as automation or the creation of focused cells) which, in turn, should enhance productivity. Additionally, the institution of commonality will in all likelihood dampen forecast errors and the possibility of component unavailability, due to the fact that fewer components need to be forecast and planned for.

The experiment in this paper has been conducted primarily at the planning level, but it points out some directions for future research. Guerrero's (1985) measurement of work-in-process inventory, along with the interesting results obtained through the consideration of multiple work centers, suggests that a more detailed lower-level analysis would be fruitful. Thus, the next step for commonality research could be to investigate other tactical and detailed operational issues on the shop floor.

Appendix A. Details of the simulation model and analysis of results

A.1. Simulation model parameters

A.1.1. Experimental parameters
- Variability of end-item demand. Average demand is 100 units per week. Low and high demand variability are generated using a trapezoidal distribution with standard deviations of 4.1776 and 41.776, respectively.

- Correlation of end-item demand. Refers to the correlation of end-item demands within a week and not across weeks. Low and high demand correlations are 0 and 0.7071, respectively.

- Setup times per part. These are generated as follows: (a) low setup times are generated from a uniform distribution with parameters (0.5, 20); (b) high setup times are generated from a uniform distribution with parameters (1, 40). This translates to an average setup time per part of (a) 10.25 hours per batch for the low setup time condition, and (b) 20.50 hours per batch for the high setup time condition.

- Number of work centers. The five levels of this parameter are 1, 10, 30, 50 and 150.

- Within-product commonality. The two levels of this parameter are 0 and 0.2308 based on the TCCI measure.

- Between-product commonality. The two levels of this parameter are 0 and 0.2308 based on the TCCI measure.

A.1.2. Fixed/randomized factors: product related
- There are ten end products marketed by the firm. We assume that each end item is made to stock.
- Initially, each end item product structure reflects zero commonality (between and within). Hence, Fig. 1 (no commonality) is a product structure for one of the end items. All the other end items have similar product structures (i.e., three levels below the end item, with 2 components required to make each parent). Hence, there are a total of 150 parts (including end items) manufactured in the shop. The usage per part is set at 1 unit and the lead time for manufacturing/purchasing a part is fixed at 1 week.

- There is no safety stock provided for the end products or components at any level in the product structure.

- There is no independent demand for intermediate items at lower BOM levels.

A.1.3. Fixed/randomized factors: shop related
- Each component/end product is completely processed at a single work center (i.e., we assume one-operation routings). No alternative routings are considered.
- The introduction of commonality results in common parts being introduced in the product structures. For each common part introduced, the setup times, run times and holding costs are computed as the average of the part retained and the part eliminated.

- The run times per part (in hours per unit) on a work center are generated from a uniform distribution with parameters (0.05, 2). This translates to an average run time per part of 1.025 hours per unit. Thus, the ratio of average setup time to average run time is 10 or 20, depending upon the level of the setup time experimental factor.

- There are no capacity limitations imposed on the work centers (i.e., we assume infinite capacity in number of hours available per week).

- The simulated shop is assumed to be machine-limited, since a worker assigned to a work center cannot be transferred to any other work center. In other words, there is no worker flexibility.

- For the no-commonality case, the 150 parts processed in the shop are randomly assigned to work centers such that the number of parts per work center is 150 parts (single work center), 15 parts/center (10 work centers), 5 parts/center (30 work centers), 3 parts/center (50 work centers), or 1 part/center (150 work centers). When commonality is introduced, a certain part replaces another part (see Figs. 1 and 2). Hence, the number of parts allocated to each work center is no longer equal. Note that although there is a reduction in the total number of parts when commonality is introduced, there is no simultaneous reduction in the number of work centers.

- During a simulation run, an item is always routed to the same work center to which it is assigned.

A.1.4. Fixed/randomized factors: MRP related
- The system is assumed to be a weekly regenerative system.
- No uncertainties are associated with forecast errors. There are no scrap and yield losses.
- Given that work centers are not capacity constrained (i.e., we assume infinite capacity), all components are assumed to be available when required.
- No safety stock is maintained at any level in the product structure. However, the implementation of EOQ will result in buffer stocks.
- The setup cost per hour is $50.


- Holding costs for each part are a function of the product structure level on which a part appears. The holding cost (in $ per unit per week) for:
1. the lowest-level part (i.e., level 3) is randomly generated from a uniform distribution with parameters (0.1, 1), which implies an average holding cost per level 3 part of $0.55;
2. the level 2 part is randomly generated from a uniform distribution with parameters (1, 2), which implies an average holding cost per level 2 part of $1.50;
3. the level 1 part is randomly generated from a uniform distribution with parameters (2, 4), which implies an average holding cost per level 1 part of $3.00; and
4. the end item is randomly generated from a uniform distribution with parameters (4, 8), which implies an average holding cost per end item of $6.00.
- For each part, the EOQ is determined based on the computed setup cost (setup cost per hour times the number of setup hours) and the holding cost per part per week. Based on the levels of the setup experimental factor and the holding costs described, the average EOQ ranged from (a) 130.70 units (for end items) to 431.70 units (for level 3 items) when setup times were "low", and (b) 184.84 units (for end items) to 610.52 units (for level 3 items) when setup times were "high". Given an average demand per part per week of 100 units, this translates to an average LFL lot size of 1 week's demand, while the average EOQ ranges from (a) 1.31 weeks' demand to 4.32 weeks' demand when setup times were "low", and (b) 1.84 weeks' demand to 6.11 weeks' demand when setup times were "high".
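As a cross-check, the reported average EOQs follow directly from Q* = sqrt(2DS/h), with D = 100 units/week, S = average setup hours × $50/hour, and the average holding costs listed above. A sketch (variable names are ours):

```python
import math

def eoq(demand, setup_cost, holding_cost):
    """Q* = sqrt(2 D S / h)."""
    return math.sqrt(2 * demand * setup_cost / holding_cost)

D = 100.0                                  # average demand per part per week
S_low, S_high = 10.25 * 50, 20.50 * 50     # average setup cost ($), low/high setup times
h_end, h_level3 = 6.00, 0.55               # average holding cost ($/unit/week)

print(round(eoq(D, S_low, h_end), 2))      # end items, low setup: ~130.7
print(round(eoq(D, S_low, h_level3), 2))   # level 3 parts, low setup: ~431.7
print(round(eoq(D, S_high, h_end), 2))     # end items, high setup: ~184.8
print(round(eoq(D, S_high, h_level3), 2))  # level 3 parts, high setup: ~610.5
```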

A.2. Simulation experiments

The length of each simulation run was set to 200 weeks. However, statistical computations of the performance measures were based on 170 weeks: the first ten weeks were eliminated to guard against initialization bias, and the last twenty weeks were eliminated because dependent demand cannot be determined for lower-level items in the last few weeks. After several trial runs it was determined that a sample size of 50 observations per treatment cell would be appropriate.

A.3. Analysis issues

As stated in Section 3.1, this paper involves two separate experiments. In an analysis of variance framework, each experiment involves a balanced 2 × 2 × 2 × 5 × 2 × 2 full factorial design with 160 separate cells. However, preliminary analysis showed that, for all five dependent variables, the homoscedasticity assumption required for analysis of variance is violated. This violation occurred primarily because of the two factors involving end-item demand. To correct for this violation, the data are analyzed using weighted least squares, with the weights chosen so as to equalize the variances. Since the population variances for each cell are not known, each observation is weighted by the inverse of the sample variance for the cell from which it comes. The analysis is performed using the commercial statistical package SPSS.

To apply weighted least squares in an analysis of variance setting, dummy variables are used to represent the six factors. Each of the factors held at two levels (i.e., factors 1, 2, 3, 5 and 6) is coded using X = -1 for observations at the low level and X = 1 for observations at the high level. The coding of factor 4, the work center factor having five levels, requires the use of four dummy variables. Each is coded as X = 1 for an observation from a simulation with k work centers (k = 1, 10, 30 and 50) and X = -1 for an observation from a simulation with 150 work centers. This coding scheme results in the least correlation between the independent variables and their interaction terms.
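The effect coding and the weighted least squares estimator described above can be sketched as follows (using NumPy rather than SPSS; the data and cell weights here are illustrative, not the study's):

```python
import numpy as np

def code_two_level(x, high):
    """Effect-code a two-level factor: +1 at the high level, -1 otherwise."""
    return np.where(np.asarray(x) == high, 1.0, -1.0)

def code_work_centers(wc, levels=(1, 10, 30, 50), base=150):
    """Four effect-coded dummies for the five-level work center factor:
    column j is +1 when WCCOUNT == levels[j], -1 when WCCOUNT == base, else 0."""
    wc = np.asarray(wc)
    cols = [np.where(wc == k, 1.0, np.where(wc == base, -1.0, 0.0))
            for k in levels]
    return np.column_stack(cols)

def wls(X, y, weights):
    """Weighted least squares: solve (X'WX) beta = X'W y with W = diag(weights).
    Weighting each observation by 1 / (cell sample variance) equalizes variances."""
    W = np.diag(weights)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

# Tiny illustrative fit: intercept plus one two-level factor, exact data.
X = np.column_stack([np.ones(4), code_two_level([0, 0, 1, 1], high=1)])
y = X @ np.array([2.0, 0.5])                       # true beta = (2.0, 0.5)
beta = wls(X, y, weights=np.array([1.0, 2.0, 1.0, 2.0]))
print(beta)                                        # recovers (2.0, 0.5)
```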

References

Baker, K.R., 1985. "Safety stocks and component commonality", Journal of Operations Management, vol. 6, no. 1, November, pp. 13-22.

Baker, K.R., M.J. Magazine and H.L.W. Nuttle, 1986. "The effect of commonality on safety stock in a simple inventory model", Management Science, vol. 32, no. 8, August, pp. 982-988.

Bott, K.N. and L.P. Ritzman, 1983. "Irregular workloads with MRP systems: Some causes and consequences", Journal of Operations Management, vol. 3, no. 4, August, pp. 169-182.

Chow, W.W., 1978. Cost Reduction in Product Design, Van Nostrand Reinhold, New York.

Collier, D.A., 1981. "The measurement and benefits of component part commonality", Decision Sciences, vol. 12, no. 1, January, pp. 85-96.

Collier, D.A., 1982. "Aggregate safety stock levels and component part commonality", Management Science, vol. 28, no. 6, June, pp. 753-760.

Dogramaci, A., 1979. "Design of common components considering implications of inventory costs and forecasting", IIE Transactions, vol. 11, no. 2, June, pp. 129-135.

Gerchak, Y. and M. Henig, 1986. "An inventory model with component commonality", OR Letters, vol. 5, no. 3, May, pp. 157-160.

Gerchak, Y., M.J. Magazine and A.B. Gamble, 1988. "Component commonality with service level requirements", Management Science, vol. 34, no. 6, June, pp. 753-760.

Guerrero, H.H., 1985. "The effects of various production strategies on product structures with commonality", Journal of Operations Management, vol. 5, no. 4, August, pp. 395-410.

Harl, J.E. and L.P. Ritzman, 1985. "A heuristic algorithm for capacity sensitive requirements planning", Journal of Operations Management, vol. 5, no. 3, May, pp. 309-326.

Krajewski, L.J., B.E. King, L.P. Ritzman and D.S. Wong, 1987. "Kanban, MRP, and shaping the manufacturing environment", Management Science, vol. 33, no. 1, January, pp. 39-57.

Mather, H., 1988. Competitive Manufacturing, Prentice-Hall, Englewood Cliffs, NJ.

McClain, J.O., W.L. Maxwell, J.A. Muckstadt, L.J. Thomas and E.N. Weiss, 1984. "Comment on 'Aggregate safety stock levels and component part commonality'", Management Science, vol. 30, no. 6, June, pp. 772-773.

Moscato, D.R., 1976. "The application of entropy measure to the analysis of part commonality", International Journal of Production Research, vol. 14, no. 3, March, pp. 401-406.

Wacker, J.G. and M. Treleven, 1986. "Component part standardization: An analysis of commonality sources and indices", Journal of Operations Management, vol. 6, no. 2, February, pp. 219-244.

Wonnacott, R.J. and T.H. Wonnacott, 1981. Regression: A Second Course in Statistics, Wiley, New York.