
The Pennsylvania State University

The Graduate School

Department of Industrial Engineering

DEVELOPMENT AND EVALUATION OF SOFTWARE FOR SYSTEMATIC

BENCHMARKING OF PRODUCT FAMILIES

A Thesis in

Industrial Engineering

by

Rahul Sarnobat

© 2010 Rahul Sarnobat

Submitted in Partial Fulfillment

of the Requirements

for the Degree of

Master of Science

December 2010


The thesis of Rahul Sarnobat was reviewed and approved* by the following:

Timothy W. Simpson

Professor

Department of Industrial & Manufacturing Engineering

Thesis Advisor

Gul E. Kremer

Associate Professor

Department of Industrial & Manufacturing Engineering

Thesis Advisor

M. Jeya Chandra

Professor in Charge of Academic Programs & Graduate Program Coordinator

Department of Industrial & Manufacturing Engineering

*Signatures are on file in the Graduate School.


ABSTRACT

Benchmarking is no longer synonymous with imitation but is considered an excellent means of innovation. Benchmarking individual products against competitors' products or an industry's best standards has been gaining momentum since its inception in the manufacturing industry. Furthermore, many manufacturers are designing product families to make use of the strategic advantages and valuable insights they offer over competitors, including better part reuse, lower redesign effort, and rapid response to changing market requirements. However, the advantages of benchmarking product families are often undermined by the limited capability of current benchmarking tools. Additionally, the process is typically done manually without the aid of software, making it not only time-consuming but also subject to human variability. To address these problems, the Product Family Analysis Toolkit (PFAToolkit) is introduced, which combines several popular benchmarking tools to streamline and standardize the process of product family benchmarking. The features and capabilities offered by the PFAToolkit are discussed, and then its functionality is demonstrated using several examples. The benefits and usability of the PFAToolkit are evaluated through a research study involving families of power tools. The study aimed to assess the efficiency, effectiveness, and satisfaction of the PFAToolkit relative to manual benchmarking of product families. The results of the study show that the PFAToolkit significantly improves the efficiency of the product family benchmarking process, and the results of the satisfaction survey substantiate this claim by indicating that the majority of users were satisfied with the software's capability, features, and ease of use.


TABLE OF CONTENTS

LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS
CHAPTER 1 INTRODUCTION
CHAPTER 2 LITERATURE REVIEW AND BACKGROUND
CHAPTER 3 PRODUCT FAMILY ANALYSIS TOOLKIT: SYSTEM ARCHITECTURE
  3.1 Bill of Materials
  3.2 Design Structure Matrices
  3.3 Clustering
  3.4 Product Line Commonality Index
  3.5 Generational Variety Index
CHAPTER 4 PRODUCT FAMILY ANALYSIS TOOLKIT
  4.1 Bill of Materials
  4.2 Design Structure Matrix
    4.2.1 Part Connections
    4.2.2 Generate Complete Design Structure Matrix
    4.2.3 Cluster Design Structure Matrix
  4.3 Product Line Commonality Index
    4.3.1 Select Product Files
    4.3.2 PCI Analysis
  4.4 Generational Variety Index
CHAPTER 5 RESEARCH STUDY RESULTS AND OBSERVATIONS
  5.1 Summary of Experiment Protocol
  5.2 Impact of Software on Benchmarking Efficiency
  5.3 Impact of Software on Benchmarking Effectiveness
    5.3.1 Impact of BOM Tool on Benchmarking Effectiveness
    5.3.2 Impact of DSM Tool on Benchmarking Effectiveness
    5.3.3 Impact of PCI Tool on Benchmarking Effectiveness
  5.4 Satisfaction with the Software Based on Survey Results
    5.4.1 Evaluation of Ease of Use and Capability of the Software
    5.4.2 Evaluation of Graphical User Interface and Usability of the Software
    5.4.3 Evaluation of BOM, DSM, and PCI Tools of the Software
  5.5 Summary of Overall Observations of the Software
CHAPTER 6 CLOSING REMARKS AND FUTURE WORK
Appendix A: Installation of PFAToolkit Add-in
Appendix B: List of Part Names for Drill, Jig Saw, and Sander, List of Requirements for BOM, List of Part Connections for DSM, and Formula to Calculate PCI
Appendix C: Sample Assessment Table of Correct Details Comparison for Manual Benchmarking
Appendix D: PFAToolkit Satisfaction Survey Form
REFERENCES


LIST OF TABLES

Table 1. GVI Rating System [37]
Table 2. Details of Time Taken by Participants to Perform Benchmarking
Table 3. Results of Mann-Whitney U Test at 95% Confidence Level for Median Times Taken for BOM, DSMs, and PCI
Table 4. Details of Percentage of Correct Material, Manufacturing, and Assembly Details for Manual and Software Benchmarking
Table 5. Results of Mann-Whitney U Tests at 95% Confidence Level for Median Time Taken for Materials, Manufacturing Process, and Assembly
Table 6. Comparison of Parts Added or Deleted for Manual and Software Benchmarking
Table 7. Results of Mann-Whitney U Test for Median Number of Parts Added or Deleted for Manual and Software Benchmarking
Table 8. Comparison of PCI Values Calculated by All Participants
Table 9. Results of Mann-Whitney U Tests at 95% Confidence Level for Difference of PCI Value from "Expert" Value
Table 10. Summary of Adjustments Made to PCI Tables of Each Participant


LIST OF FIGURES

Figure 1. PFAToolkit Architecture
Figure 2. Illustration of PFAToolkit Menu
Figure 3. Illustration of the Form to Record Details of Parts for Bill of Materials
Figure 4. Illustration of Use of Auto Suggest Feature
Figure 5. Illustration of BOM Output in Spreadsheet
Figure 6. Illustration of List of Part Connections Form
Figure 7. Illustration of DSM Output
Figure 8. Illustration of DSM Clustering Input Parameters Form
Figure 9. Illustration of a Sample Clustered DSM Output
Figure 10. Illustration of Selecting Product Files for PCI Analysis
Figure 11. Illustration of Form to Separate Unique and Shared Parts
Figure 12. Illustration of Form for Entering PCI Factors
Figure 13. Illustration of Sample PCI Output Spreadsheet
Figure 14. Illustration of GVI Phase I Form for Customer Requirements
Figure 15. Illustration of GVI Phase I Form for Assigning Priorities to Customer Requirements
Figure 16. Illustration of GVI Phase I Form for Engineering Metrics
Figure 17. Illustration of GVI Phase I Form for Target Values
Figure 18. Illustration of GVI Phase II Form for Selecting Parts
Figure 19. Illustration of GVI Phase II Form for Rating Parts for Redesign Effort
Figure 20. Illustration of GVI Phase I Output Sheet
Figure 21. Illustration of GVI Phase II Output
Figure 22. Comparison Between Manual and Software Benchmarking of Average Times Taken for Benchmarking Analysis
Figure 23. Percentage Comparison of Identical Details for Materials Between Manual and Software Benchmarking
Figure 24. Percentage Comparison of Identical Details for Manufacturing Processes Between Manual and Software Benchmarking
Figure 25. Percentage Comparison of Identical Details for Assembly Between Manual and Software Benchmarking
Figure 26. Overall Satisfaction Levels for "Ease of Use" of Software
Figure 27. Overall Satisfaction Level for "Graphical User Interface" of Software
Figure 28. Overall Satisfaction Level for "BOM" Feature of Software
Figure 29. Overall Satisfaction Level for "DSM" Feature of Software
Figure 30. Overall Satisfaction Level for "PCI" Feature of Software


ACKNOWLEDGEMENTS

I would like to thank Dr. Timothy W. Simpson, Professor of Industrial and Manufacturing

Engineering at Penn State University for his constant support, guidance, and motivation

throughout this research. I would especially like to thank him for his belief and faith in my capabilities, which helped me not only explore new and innovative topics that broadened my perspective on my research but also perform to the best of my abilities. A special thanks to Avanti Jain, who was a collaborator in this research.

I thank my family for giving me encouragement and support throughout my time in graduate school. I also thank my lab members for all the help and support they provided. Their comfort and company were among the prime drivers in completing my master's degree. I would also like to thank my friends Avinash, Dhananjay, and Rajesh, who made my time at Penn State truly enjoyable.

This work was supported by NSF Grant Nos. OCI-0636273 and DUE-0919724. Any opinions, findings, and conclusions or recommendations in this thesis are those of the authors and do not necessarily reflect the views of the National Science Foundation.


CHAPTER 1

INTRODUCTION

Benchmarking techniques are now widely used in many industries to compare product designs, costs, manufacturing processes, packaging, and so on to competitors' products, industry standards, or perceived best practices. Initially popularized by Japanese industry as part of Total Quality Management (TQM), benchmarking has proven valuable to many US corporations [1]. Benchmarking has been defined in many ways [2], each definition trying to

capture the broad meaning of the term. Merriam-Webster's dictionary defines a benchmark as "something that serves as a standard by which others may be measured or judged". More

formally, benchmarking is defined as a “systematic way to identify, understand, and creatively

evolve superior product, services, designs, equipment, processes, and practices to improve an

organization’s real performance” [3]. Thus, depending on the domain where maximum benefits

are realized, companies benchmark products, processes, internal operations, or other aspects that

have room for improvement.

The significant benefits gained from competitive benchmarking are best illustrated by tracing the history of Xerox Inc. from the late sixties to today [4]. In the sixties and seventies, Xerox experienced a loss of market share, primarily due to Japanese competitors. Xerox, the first innovator of photocopiers and a company synonymous with photocopying, was getting beaten at its own game. The Japanese were able to market higher quality products at a lower price, forcing Xerox to reorganize its product development strategy. Xerox regained its lost competitive advantage through extensive "teardowns" of its competitors' products and by benchmarking all aspects of their business against its own products and business


operations. Other instances include AT&T, which found that its new product development cycle time ranged from about 133 to 250 per cent of the time taken by its Japanese competitor [5]. Realizing the vast benefits benchmarking has to offer, more companies are devoting considerable time and effort to benchmarking their products; e.g., General Motors' Vehicle Assessment and Benchmarking Activity Center performs "teardown" analyses of approximately 40 of its competitors' vehicles, which require almost six weeks to complete [6].

Companies design product families and platforms not only to reduce product development time and production costs but also to diversify their product offerings [7]. That said, benchmarking is a time-intensive activity that is susceptible to human error, and appropriate tools, metrics, and methodologies are needed to achieve effective benchmarking. Moreover,

benchmarking families of products greatly multiplies the complexity and variability in the

process as compared to that of an individual product [8]. Product family design involves the

design and development of multiple products that share common parts and modules, or use

similar manufacturing/assembly processes, yet target different market segments through careful

differentiation and delineation of product features [9].

Benchmarking product families and applying the right tools can provide valuable insights into

the platform architecture. For individual products, there exist various benchmarking methods

[10,11,12]; however, for families of products, there are very few approaches that address the

benchmarking needs [8,13], and the scope of existing processes is typically limited only to a few

aspects of product family design, e.g., cost, commonality and variety. Thevenot and Simpson [8]

recently developed a product family benchmarking method (PFbenchmark) based on dissection,

extending their previous work. Their process benchmarks a set of product family design


alternatives with respect to commonality/variety and cost estimates by means of product

dissection. A graphical evaluation method called Product Family Evaluation Graphs (PFEG) has

also been developed to compare product family design alternatives [13]. The PFEG is a two-

dimensional graph of the Variety Index (VI) versus Commonality Index (CI) that helps correlate

the marketing and engineering domains. These existing processes for benchmarking product

families are limited in number and are not very comprehensive for gathering the data needed in

today’s competitive markets. Jain [14] addresses this need by developing a systematic process to

benchmark product families. The benchmarking methodology integrates various tools that provide manufacturing companies with insights into their competitors' products and identify opportunities to improve their own product architectures. The first step focuses on

capturing the details of the products systematically and consistently, which are required for further analysis of the product family. The second step includes a metric to measure the amount of

commonality within the product family, which impacts both the economic viability of the family

and its ability to compete in the marketplace – commonality reduces design and manufacturing

costs but limits the ability to differentiate products in the family. Therefore, being able to assess

the amount of commonality in a family is an important metric for product family design, and it

provides a useful tool for benchmarking competitors’ products; however, it is not the only tool

used for product family benchmarking. The third step includes a tool to qualitatively assess modularity and identify modules and interfaces for future product architectures, which is a concern that many companies face [15]. Lastly, the fourth step focuses on capturing changing

customer needs over a period of time and mapping them back to the product architectures to

quantify redesign efforts and flexibility of the components.


This thesis focuses on the development of a Product Family Analysis Toolkit (PFAToolkit) that

comprises a suite of automated and semi-automated analysis tools to aid product family

benchmarking. The analysis tools integrated into the PFAToolkit include:

(1) Bill of Materials (BOM) to record components’ details

(2) Design Structure Matrix (DSM) to display different types of connections between

components in a matrix form

(3) Clustering algorithm to cluster DSMs to assess modularity and identify modules and

interfaces

(4) Product Line Commonality Index (PCI) to analyze cost effectiveness and commonality

within a product family

(5) Generational Variety Index (GVI) to measure redesign effort for future product family

architecture requirements.

Based on experience with several industry partners, these five tools enable a comprehensive

benchmarking analysis of the products in a systematic and time-efficient manner.

The remainder of this thesis is organized as follows. Chapter 2 provides additional background

on these benchmarking tools and other tools in the same space. In Chapter 3, the proposed

software is introduced along with details of the various tools included in it. In Chapter 4 the

software is implemented, and its features and capabilities are illustrated through examples.

Chapter 5 discusses a research study involving a family of power tools. Results of the survey

conducted in the study are also presented. The thesis concludes with a summary of the software's limitations and future work in Chapter 6.


CHAPTER 2

LITERATURE REVIEW AND BACKGROUND

The industry or competitive benchmarking process includes not only benchmarking of products or services against companies in the same industry but also strategic benchmarking [9]. Pozos [16] defines strategic benchmarking as "Proactive analysis of emerging trends, options in markets, processes, technology and distribution that could affect strategic direction and deployment". So, competitive benchmarking ideally should target both short-term and long-term improvements. Extending this to benchmarking product families, short-term improvements would include benchmarking specific product families (e.g., commonality assessment), whereas long-term improvements would include assessing platform designs based on predicted market trends (e.g., product redesign effort and flexibility). The proposed PFAToolkit consists of tools that address these needs and ensure consistency in data, analysis, and results. A review of existing tools commonly used in benchmarking product families is given below.

Data collection is an integral part of any benchmarking process. In manufacturing industries, one

of the basic tools to capture components' details in a standard format is the Bill of Materials (BOM) [17]. The first vital step when a product family is analyzed is to prepare a BOM for each product, dissecting the products to the component level, labeling each component [18], and listing it in the BOM [19,20]. Typical details included in the BOM are each component's material, mass, quantity, classification, and function [19]. As the complexity and volume of products increase, managing this data becomes increasingly cumbersome. Based on the size of the company, its data

requirements and capital available to invest in the infrastructure, companies manage data


collection and manipulation through various Enterprise Resource Planning (ERP) or Product Data Management (PDM) software. Korpela and Tuominen [21] review external design repository schemas, PDM systems, and other commercially available software and provide insights, including discussions of data consistency analysis and representation conventions.

These front-end and back-end databases not only provide a standardized way of information

representation but also provide design details that support designer activities. Although these

databases manage much more than simple data collection, the use and design of these databases

are still in the development phase. If we limit the scope of data requirements just to product dissection and benchmarking of product families, simple data collection in a Microsoft Excel spreadsheet, with checks for data consistency and redundancy, would provide a reasonable

foundation for benchmarking product families.

Qualitatively assessing modularity is one aspect of product family analysis that helps in analyzing the complexity of the product design. A modular product or sub-assembly has a minimum number of interactions between physical components (or modules), and each physical component performs exactly one function [20]. A Design Structure Matrix (DSM) provides a simple, compact, and visual representation of a complex system that helps assess modularity and supports innovative solutions to decomposition and integration problems [22]. DSMs are excellent tools for systematically representing products with many components, and they can provide a higher level of abstraction of complex products in an easily interpretable visual format. These DSMs can be clustered to identify modules and interfaces. The basic principle is to cluster the DSM to maximize intra-modular connections and minimize inter-modular connections.

Pimmler and Eppinger [23] use a heuristic, specialized macro-based swapping algorithm. The


objective of their clustering algorithm is to minimize the distance of each interaction from the diagonal. Huang and Kusiak [24] use a decomposition approach and a triangulation algorithm to identify and analyze different modules. Kusiak et al. [25] and David et al. [26] identify different types of modules based on a genetic algorithm and minimum description length. A tool specific to clustering DSMs using MATLAB macros has been developed at MIT [27]. Partitioning of DSMs is another way of identifying modules, where elimination of feedback from a DSM is the primary objective. In other words, the DSM is reordered to obtain a lower triangular matrix. Steven Eppinger and his collaborators (http://dsmweb.org/) have developed Excel macros for partitioning and simulation of DSMs. Each of the clustering algorithms discussed above has its own pros and cons, but the primary limitations in selecting an algorithm are the capability of the hardware and software to manage the size of the calculations and the algorithm's public availability. Considering these limitations, the genetic algorithm developed by Rogers et al. [28] performs satisfactorily in clustering a DSM in Microsoft Excel.

Another aspect of assessing product families is the amount of sharing between the products

included. Numerous indices are available to measure the degree of commonality in a product family, which are useful surrogate measures to help reduce costs [29]. Commonality indices measure the degree of sharing in a product family with respect to the number of components, manufacturing processes, assembly processes, component costs, and other areas. As many as six commonality indices have been compared and analyzed in [29]. All commonality indices focus on providing a measure of the trade-off between product commonality and distinctiveness. In contrast to indices that only measure the percentage of components that are common across


a product family, the Product Line Commonality Index (PCI) measures and penalizes the non-

unique components in the family given the product mix [30]. It helps capture the fine balance

between design, manufacturing, and marketing at the product family level.

To assess how well product families evolve over a period of time as customer requirements

change, the Generational Variety Index (GVI) provides an excellent indicator of the amount of

redesign required for a component to meet future market requirements [31]. GVI uses tools from

Quality Function Deployment (QFD), which is best summarized as “A set of planning and

communication routines, quality function deployment focuses and coordinates skills within an

organization, first to design, then to manufacture and market goods that customers want to

purchase and will continue to purchase. The foundation of the house of quality is the belief that

products should be designed to reflect customer’s desires and tastes - so marketing people,

design engineers, and manufacturing staff must work closely together from the time a product is

first conceived” [32]. GVI uses properties intrinsic to QFD and leverages them to predict

product evolution based on expected future customer needs.

Each of these tools addresses specific aspects of product family analysis and benchmarking; however, none of them provides a comprehensive assessment of product families or explicitly addresses product family benchmarking. Additionally, software to facilitate time-efficient and consistent benchmarking has also eluded the community thus far. Commercial tools like ACLARO DFSS, ADePT Design, Complex Problem Solver, Lattix, and Loomeo (http://dsmweb.org/) are software packages focusing on project management and product development related activities in


conjunction with DSMs; however, none of them addresses product family benchmarking, and

they are limited to one or two specific tools. Additionally, each of these software packages is an

individual package, and they do not utilize existing interfaces like Microsoft Excel.

Microsoft Excel gives its users enhanced flexibility in terms of data storage, data manipulation,

and data reporting in a familiar interface. This makes Microsoft Excel a Rapid Application

Development (RAD) tool. In addition, its built-in programming environment requires no additional cost, whereas using individual platforms like Visual Studio would. Given the broad range of technical backgrounds and software resources of prospective practitioners of these analyses, a common software platform is desirable to make the product family benchmarking process cost- and time-effective with fewer errors. The PFAToolkit meets all of these needs by offering a broad and integrated toolset for product family benchmarking, as discussed in the next chapter.


CHAPTER 3

PRODUCT FAMILY ANALYSIS TOOLKIT: SYSTEM ARCHITECTURE

The PFAToolkit is a comprehensive software package that offers multiple tools for product

family benchmarking using Microsoft Excel. The advantages of Microsoft Excel include the capability for customization through macros, compatibility with other data manipulation software, and extensive help resources available online through forums and help libraries. Due to the widespread use of Microsoft Excel, its users also seldom need to worry about converting a spreadsheet to a different format. Data can easily be migrated to high-end databases like Access, SQL Server, Oracle, or MySQL if the number of records grows too large. Many applications use Microsoft SQL Server, Oracle, Microsoft Access, or IBM DB2 databases as back ends; Microsoft Excel is able to perform many of the services that these back-end applications provide.

This chapter reviews the benchmarking method and the tools integrated in the PFAToolkit

software. This method is adopted from the research work of Jain [14]. Figure 1 shows the

architecture of the PFAToolkit depicting the benchmarking tools integrated into the software.

The PFAToolkit includes: BOM, DSM, a clustering algorithm, PCI and GVI. First, users input

the necessary data into the BOM and then that information is extracted and applied to the other

tools, with some additional inputs wherever necessary, as shown in Figure 1. Integration of these product family benchmarking tools into the PFAToolkit is described in detail in the remainder of this chapter.


Figure 1. PFAToolkit Architecture

3.1 Bill of Materials

A comprehensive list of information is recorded through the bill of materials (BOM), which

includes part name, material, manufacturing process, assembly/fastening scheme, quantity, cost,

weight, picture, manufacturing part number, vendor, and other parameters (like length for a

screw). The proposed software provides a standard user input form that prompts users to enter

all the necessary details of one component at a time to ensure that no information is left out.

Some of this information is used later to calculate the PCI of the product family, and hence the correctness of this information is necessary to maintain consistency and repeatability of results.
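To make the data flow concrete, the following is a minimal sketch, in the Visual Basic for Applications used for the toolkit's macros, of how one part's form entries might be appended as a row of the 'Bill of Materials' worksheet. The procedure name and column order are illustrative assumptions, not the PFAToolkit's actual code.

    ' Hypothetical sketch: append one part's details from the input form to
    ' the BOM worksheet (column order is an illustrative assumption).
    Sub AppendBomRow(ws As Worksheet, partNumber As Long, partName As String, _
                     material As String, mfgProcess As String, assembly As String)
        Dim r As Long
        r = ws.Cells(ws.Rows.Count, 1).End(xlUp).Row + 1   ' first empty row
        ws.Cells(r, 1).Value = partNumber
        ws.Cells(r, 2).Value = partName
        ws.Cells(r, 3).Value = material
        ws.Cells(r, 4).Value = mfgProcess
        ws.Cells(r, 5).Value = assembly
    End Sub

Routing all writes through one such procedure is what lets the input form enforce its consistency checks before anything reaches the spreadsheet.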


3.2 Design Structure Matrices

The component-based DSM [33] is a prominently used tool for analyzing product architecture

and modeling four different types of connections between components, namely spatial/physical,

energy, information, and material/mass [23]. Physical connections include the connections

between two components by means of fasteners, welded joints, bolted joints, etc. Energy flow

connections include the connections that relate to electrical energy transfer like electric cables,

wiring, etc. Information flow connections include the connections that indicate any information exchange, for instance, sensors or displays, which get input from one component and produce output that conveys some kind of information. Mass flow connections include the

connections that relate to mass flow like flow of air, water, lubricant, etc.

A component-based DSM is always a square matrix. It is symmetric for physical connections, as

all physical connections are bi-directional. Practically, mass flow, energy flow, and information flow can be either uni-directional or bi-directional, but from a structural standpoint they are

assumed to be bi-directional. The software is built with an input connection form that allows

users to mark all four different types of connections between components. Once all the

connections for all the components are entered, a DSM is generated. The DSM tool in the

software has been programmed to identify the symmetry of connections between the

components, and hence users only need to enter half the connections as compared to completing

the entire DSM manually.


3.3 Clustering

The process of clustering a DSM involves the reordering of rows and columns to group closely

related elements into modules. While overlapping clusters are allowed, “the foremost

objective is to maximize interactions between elements within clusters while minimizing

interactions between clusters” [33]. One of the most widely employed clustering techniques

involves the use of genetic algorithms (GA) [34]. An Excel-based GA macro already exists for

clustering component-based DSMs [28], but its scope is limited to clustering only DSMs with

just one type of connection. This macro is used as a starting point for the development of a more

advanced macro in the software that incorporates physical, electrical, mass-flow, and

information-flow connections in the DSM. This tool is semi-automated since the exact definition of the cluster boundaries remains a matter of user judgment.

From a benchmarking standpoint, common modules within an existing product family can be

readily identified and module interface improvements can be planned based on module structure

boundaries. The proposed software provides users with an input form for GA parameters. For

users’ convenience, default values are displayed in the form, but users can change the maximum

number of generations and population size as desired (e.g., to save time based on available

computational resources).
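To illustrate the clustering objective (an expository sketch, not the genetic algorithm of [28] itself), a fitness function can simply reward connections that fall inside a cluster and penalize those that cross cluster boundaries; a GA then evolves the cluster assignments to maximize this score.

    ' Illustrative fitness for DSM clustering (an expository assumption, not
    ' the GA of [28]): intra-cluster connections raise the score, and
    ' connections that cross cluster boundaries lower it.
    Function ClusterScore(dsm() As Long, clusterOf() As Long) As Long
        Dim i As Long, j As Long, score As Long
        For i = LBound(dsm, 1) To UBound(dsm, 1)
            For j = i + 1 To UBound(dsm, 2)
                If dsm(i, j) <> 0 Then
                    If clusterOf(i) = clusterOf(j) Then
                        score = score + 1   ' intra-cluster connection
                    Else
                        score = score - 1   ' inter-cluster connection
                    End If
                End If
            Next j
        Next i
        ClusterScore = score
    End Function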


3.4 Product Line Commonality Index

PCI, as given by Eq. (1), is programmed into the software as it penalizes only variants and

provides a relative measure of commonality for comparing different product families.

$$\mathrm{PCI} = 100 \times \frac{\displaystyle\sum_{i=1}^{P} n_i\, f_{1i}\, f_{2i}\, f_{3i} \;-\; \sum_{i=1}^{P} \frac{1}{n_i^{2}}}{\displaystyle P\,N \;-\; \sum_{i=1}^{P} \frac{1}{n_i^{2}}} \qquad (1)$$

where:

P = total number of non-differentiating components that can potentially be standardized across models;

N = number of products in the product family;

n_i = number of products in the product family that have component i;

f_{1i} = size and shape factor for component i: the ratio of the greatest number of models that share component i with identical size and shape to the greatest possible number of models that could have shared component i with identical size and shape (n_i);

f_{2i} = materials and manufacturing processes factor for component i: the ratio of the greatest number of models that share component i with identical materials and manufacturing processes to the greatest possible number of models that could have shared component i with identical materials and manufacturing processes (n_i); and

f_{3i} = assembly and fastening schemes factor for component i: the ratio of the greatest number of models that share component i with identical assembly and fastening schemes to the greatest possible number of models that could have shared component i with identical assembly and fastening schemes (n_i).
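For concreteness, Eq. (1) can be evaluated directly once n_i and the three factors are known for every non-differentiating component. The following is a minimal sketch in the Visual Basic used for the toolkit's macros; the function name and array-based interface are assumptions for illustration, not the PFAToolkit's actual code.

    ' Hypothetical sketch of Eq. (1): n(), f1(), f2(), f3() hold one entry per
    ' non-differentiating component, and totalProducts is N.
    Function ComputePCI(n() As Double, f1() As Double, f2() As Double, _
                        f3() As Double, totalProducts As Long) As Double
        Dim i As Long, P As Long
        Dim numer As Double, minTerm As Double
        P = UBound(n) - LBound(n) + 1
        For i = LBound(n) To UBound(n)
            numer = numer + n(i) * f1(i) * f2(i) * f3(i)   ' achieved commonality
            minTerm = minTerm + 1# / (n(i) * n(i))         ' worst case: all factors = 1/n(i)
        Next i
        ComputePCI = 100# * (numer - minTerm) / (P * totalProducts - minTerm)
    End Function

As an illustrative calculation under this form of Eq. (1): with N = 3 and P = 2, if component 1 (n_1 = 3) has all three factors equal to 1 and component 2 (n_2 = 2) has f_{12} = 1 and f_{22} = f_{32} = 0.5, the numerator is 3.5 − (1/9 + 1/4) ≈ 3.139, the denominator is 6 − (1/9 + 1/4) ≈ 5.639, and PCI ≈ 55.7.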


PCI varies continuously between 0 and 100. When PCI = 0, either none of the components are

shared across models, or if they are shared, then their size/shapes, materials/manufacturing

processes, and assembly schemes are all different. When PCI = 100, it indicates that all of the

components are shared across models and that they are of identical size and shape, made using

the same material and manufacturing process, and assembled in the same manner. PCI focuses

on commonality that should exist between products that share common or variant components

rather than on the unique components. It provides a single measure for the entire product family,

but it does not offer insight into the commonality of the individual products within the family.

PCI requires subjective information, which can lead to different results when the analysis is performed by different users. Various measures are incorporated in the proposed software to reduce variations in PCI computation. For instance, material, manufacturing process, and assembly classification schemes are used to design the drop-down menus in the BOM input form to provide the user with easy-to-use standard options [35]. Thevenot and Simpson [36] provide a guideline for understanding the subjectivity that users might introduce into PCI calculations. As users are queried to classify each component in the previously prepared BOM as differentiating or not, there is a very low probability that a desired component would be missing from the analysis. Also, users have the flexibility to edit the values of f1, f2, and f3 at any given time, which adds convenience in case of any corrections.


3.5 Generational Variety Index

GVI is calculated by a seven-step process as defined in [37]. Once the market and desired life of a product platform are defined, a customer survey is conducted to anticipate customers' needs and wants. After all the customer requirements are captured, they are prioritized and compiled in a

QFD matrix to map them to the engineering metrics. The engineering metrics contain the parameters that need to be changed to achieve the customers' requirements. Once this matrix is ready, the expected change in the customer requirements over the platform life is analyzed and categorized as low, medium, or high. This categorization depends on the existing state of the requirement. For instance, if customers want low cost but the product is already offered at a low price compared to its competitors, then the requirement would be given a priority of low or medium but not

high. Prioritizing customer requirements helps to define the area of focus for improvement

and/or redesign. The next step is to prepare a second QFD matrix where the engineering metrics

are mapped to the related components so as to identify which components would need to be

redesigned. For example, if a customer requirement is a power tool that is quiet during operation, then it would be mapped to noise level (in decibels) in the engineering metrics, which is part of the first QFD matrix, and the noise level is then mapped to the related components (motor, casing, etc.), which is part of the second QFD matrix. Once the second QFD matrix is complete, GVI ratings are

assigned to each component for corresponding connections with the engineering metrics. The

ratings are based on a scale of 0-9 as described in Table 1.


Table 1. GVI Rating System [37]

Rating Value    Description
9               Requires major redesign of the component (>50% of initial redesign costs)
6               Requires partial redesign of the component (<50%)
3               Requires numerous simple changes (<30%)
1               Requires few minor changes (<15%)
0               No changes required

After the GVI ratings are assigned, the GVI values are summed for every column that corresponds to a component. Components with high GVI values imply relatively more redesign effort, whereas components with low GVI values indicate very little redesign, and hence there is an opportunity to standardize these components for future requirements. The GVI tool is programmed in the software so that it guides users through all seven steps, thereby streamlining the process. Implementation of GVI and the other tools is described in the next chapter.


CHAPTER 4

PRODUCT FAMILY ANALYSIS TOOLKIT

To demonstrate integration of the proposed tools, we implement the Product Family Analysis

Toolkit (PFAToolkit) as shown in Figure 2. The figure shows the menu bar of the PFAToolkit,

which appears once the Excel PFAToolkit Add-In is installed (see Appendix A). The PFAToolkit is a Microsoft Excel 2007 Add-In, like other commercially available Add-Ins, e.g., SOLVER and DATA ANALYSIS.

Figure 2. Illustration of PFAToolkit Menu

Microsoft Excel is prominently used in industry as well as universities, and therefore a large number of users are familiar and comfortable with it. All tools are programmed as

Microsoft Excel macros using Microsoft Visual Basic. Each of the four features is discussed

in detail in the following sections.


4.1 Bill of Materials

In the PFAToolkit, component details for a BOM are recorded through a user form as shown in

Figure 3. This form ensures consistency of recorded data by minimizing errors that occur

through repetitive work. To reduce user effort, pre-populated drop-down menus are provided for each of the attributes recorded through the user form. Each recorded detail has its own significance and is used in later tools for analysis.

Figure 3. Illustration of the Form to Record Details of Parts for Bill of Materials


The different attributes recorded in the form are explained as follows:

Part Number: This records the part number, starting at 1 and increasing sequentially, to help keep track of the number of parts. An error check ensures that only sequential integer values are entered.

Part Name: This records the part name either by entering it manually or selecting from a

pre-populated drop-down menu of standard part names. This reduces effort and improves

naming consistency across the product family. An error check exists for repeated part names.

Upload Picture: This is used to upload a picture of the part, which aids quick visualization. An error check ensures that a part number is entered before the picture is uploaded.

Auto Suggest: This allows utilization of existing nomenclature for part names, materials, manufacturing processes, and assembly schemes. The list of names is stored in plain text files and then extracted into the drop-down menu for future use. This feature helps maintain consistency in the nomenclature used across the product family and lets drop-down menus be customized according to individual user needs. Figure 4 shows a snapshot of the Auto Suggest feature, and a sketch of how such loading might work appears after this list of attributes.


Figure 4. Illustration of Use of Auto Suggest Feature

Primary Function: This field records the primary function that a part performs. For

example, “power source” would be the primary function for a “battery”. This is very

specific, and hence each user can define a customized convention to follow.

Material: This records the material type of the part. This may be selected from a pre-

populated comprehensive list [30] or entered manually.

Manufacturing Process: This records the manufacturing process used to fabricate the part.

This may be selected from a pre-populated comprehensive list [30] or entered manually.

Assembly: This records the fastening scheme used to assemble the part. This may be

selected from a pre-populated comprehensive list [30] or entered manually.
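As noted under Auto Suggest above, stored nomenclature is read from plain text files into the form's drop-down menus. A minimal sketch of how such loading might look (the control and procedure names are illustrative assumptions, not the PFAToolkit's actual code):

    ' Hypothetical sketch for the Auto Suggest feature: read one stored name
    ' per line from a text file and add it to a drop-down list on the form.
    Sub LoadSuggestions(cbo As MSForms.ComboBox, filePath As String)
        Dim fileNum As Integer, lineText As String
        fileNum = FreeFile
        Open filePath For Input As #fileNum
        Do While Not EOF(fileNum)
            Line Input #fileNum, lineText
            If Len(Trim(lineText)) > 0 Then cbo.AddItem Trim(lineText)
        Loop
        Close #fileNum
    End Sub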


Other attributes, such as subassembly, quantity, weight, cost, and vendor/supplier, can be recorded to provide additional details for a complete description of the product; however, they are not mandatory since they are not used for analysis in later tools. A sample BOM is shown in Figure 5 after data for each part has been entered.

The Excel worksheets are protected, which means they cannot be edited except through the user form for recording part details shown in Figure 3. Once the item number of the part to be edited is entered, the form is automatically pre-populated with the existing part information, which the user can then modify.

Figure 5. Illustration of BOM Output in Spreadsheet


4.2 Design Structure Matrix

The Design Structure Matrix (DSM) provides a matrix-based representation of the connections

within a product. This visualization of the assembly and hierarchy of a product is an important

tool and visual aid used in benchmarking.

4.2.1 Part Connections

To create a DSM based on the BOM, the connections between parts are required. This process is

performed by using the “Enter Part Connections” tab under “Part Connections”, which opens the “List of Connections Form” as shown in Figure 6. The “List of Connections Form” allows the user to enter the different connections corresponding to each part. The manner in which one part is connected to another part is classified into four different flows, namely, physical flow, mass flow,

energy flow, and information flow as discussed in Section 3.2.


Figure 6. Illustration of List of Part Connections Form

The DSM for any product is symmetric, e.g., if a battery is connected to the battery charger, then

it is implied that the battery charger is also connected to the battery. Thus, a DSM for a product

is a mirror image about the diagonal. The software ensures that the DSM is symmetric and

thereby reduces the effort of the user, e.g., if the user specifies that a physical connection exists

between the battery and the battery charger, then the software replicates the inverse connection

by connecting the battery charger to the battery. Thus, repetitive work is reduced, and the correctness of the information is maintained.
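A minimal sketch of the mirroring behavior just described (an illustration, not the toolkit's actual macro) is:

    ' Hypothetical sketch: record a connection between parts i and j in one of
    ' the four connection matrices and mirror it about the diagonal so the
    ' DSM stays symmetric.
    Sub AddSymmetricConnection(dsm() As Long, i As Long, j As Long)
        dsm(i, j) = 1   ' connection entered by the user
        dsm(j, i) = 1   ' inverse connection replicated automatically
    End Sub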


4.2.2 Generate Complete Design Structure Matrix

Once all of the connections have been defined, the next step is to generate the DSM using the ‘Generate Complete Design Structure Matrix’ tab, which generates an unclustered DSM as shown in Figure 7. This includes all four types of connections: physical (black filled cells), mass (red filled cells), energy (green filled cells), and information (blue filled cells). The color-coding helps users visualize the different connections easily. Other options allow users to view the DSM with each connection type isolated by selecting the desired connection under the DSM tab.
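Applying the color scheme cell by cell could look like the following sketch; the helper name is an illustrative assumption, not the PFAToolkit's actual macro.

    ' Hypothetical sketch: fill a DSM cell with the color assigned to its
    ' connection type, following the scheme described above.
    Sub MarkConnection(ws As Worksheet, r As Long, c As Long, flowType As String)
        Dim fillColor As Long
        Select Case flowType
            Case "Physical": fillColor = RGB(0, 0, 0)        ' black
            Case "Mass": fillColor = RGB(255, 0, 0)          ' red
            Case "Energy": fillColor = RGB(0, 128, 0)        ' green
            Case "Information": fillColor = RGB(0, 0, 255)   ' blue
        End Select
        ws.Cells(r, c).Interior.Color = fillColor
    End Sub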

Figure 7. Illustration of DSM Output


4.2.3 Cluster Design Structure Matrix

In the PFAToolkit, the Genetic Algorithm (GA) proposed in [28] is applied to cluster the DSM,

which groups components into modules based on connections, i.e., maximize connections within

a module and minimize connections between modules. The “Cluster DSM” tab accesses the

“DSM Clustering Input Parameter Form” as shown in Figure 8. Two of the six parameters

governing the clustering algorithm displayed in the form can be changed by the user. The

default values displayed are recommended for time-efficient results when clustering the DSM. For the maximum processes parameter [28], the number of parts is extracted from the BOM. The algorithm focuses only on physical connections because they mainly govern the packaging and placement of parts in an assembly of a physical system. Yu et al. [38] have

introduced a clustering algorithm based on the Minimum Description Length (MDL) principle [39] and a simple GA to visualize product architecture; it recognizes bus modules, allows tuning of GA parameters depending on the type of products, and is capable of producing results that match human experts' clustering. The GA used in the PFAToolkit could be enhanced in future versions to incorporate these features.


Figure 8. Illustration of DSM Clustering Input Parameters Form

Once initiated, the clustering algorithm runs for a time that is proportional to the total number of

parts and generates a clustered DSM as shown in Figure 9. Defining modules and their

boundaries is made easier once the DSM is clustered. The exact definition of each module and

cluster boundaries is left for users to determine since their definition is specific and subjective in

nature.


Figure 9. Illustration of a Sample Clustered DSM Output

4.3 Product Line Commonality Index

The next tool enables commonality analysis of the product family. Currently, the software uses

the Product Line Commonality Index (PCI) [30] given its flexibility and focus on non-

differentiating components; however, other commonality metrics such as Percent Commonality

Index (%C), Degree of Commonality Index (DCI), Commonality Index (CI), and so on [29]

could easily be integrated into the software if desired.

Within PCI, commonality is divided into three factors: (1) size and shape (f1), (2) materials and

manufacturing (f2), and (3) assembly/fastening scheme (f3). The values for f1, f2, and f3 are


calculated using the details of each part recorded in the BOM. The user has an option to make

changes manually. The fi values lie in the interval [1/n, 1], where n is the number of products that have the component being analyzed. Once this information is entered for every part, the PCI is computed, and an output summary is generated. The PFAToolkit guides the user through each step for calculating PCI in an accurate and systematic manner as follows.

4.3.1 Select Product Files

The first step for calculating PCI is to select the data files that correspond to each product of the

family to be analyzed. This is done by selecting “Select Product files and start NEW analysis” under the “Product Line Commonality Analysis” tab. This opens a Windows Explorer window as shown in Figure 10.


Figure 10. Illustration of Selecting Product Files for PCI Analysis
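In Excel VBA, a multi-file prompt of this kind can be produced with Application.GetOpenFilename, which returns an array of paths when MultiSelect is enabled (or False if the user cancels). The sketch below is illustrative, not the toolkit's actual code.

    ' Hypothetical sketch: prompt for the product workbooks to analyze.
    Sub SelectProductFiles()
        Dim files As Variant, k As Long
        files = Application.GetOpenFilename( _
            FileFilter:="Excel Files (*.xls*), *.xls*", MultiSelect:=True)
        If Not IsArray(files) Then Exit Sub    ' user cancelled
        For k = LBound(files) To UBound(files)
            Debug.Print files(k)               ' path of each selected product file
        Next k
    End Sub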

The product files should contain a complete ‘Bill of Materials’ worksheet in a standard format as

generated by the PFAToolkit. Once all the product files are selected, the next form opens as shown in Figure 11. This allows users to separate parts that are unique (perform unique functions) from those parts that are shared within the product family. This is repeated for all of the products that are selected for analysis. This step is not automated since the definition of unique parts is user-dependent. Parts can be unique to provide different degrees of functionality, to provide brand differentiation, to accommodate supply chain limitations, and so on. Thus, the

definition of unique parts is left up to the users’ interpretation and judgment.


Figure 11. Illustration of Form to Separate Unique and Shared Parts

4.3.2 PCI Analysis

The next step is to calculate PCI for the product family by selecting the “PCI Analysis” tab under the “Product Line Commonality Index” tab, which opens the “PCI Factors form” as shown in Figure 12. The form displays the values for f1, f2, and f3 calculated from the information recorded in the BOM. The user has the option to edit the values of any of the factors for each part. The values for the f1, f2, and f3 factors must be between 0 and 1, and the software checks for this. The

user goes through each part to complete the PCI analysis. Calculation of commonality using PCI

includes only those parts that are shared between products. If the user has selected parts that

have a unique occurrence in the calculation, then the user is prompted for all such parts. The

choice to include such parts is left to the user since a unique occurrence does not necessarily

imply that it is performing a unique function.


Figure 12. Illustration of Form for Entering PCI Factors

Once the information is entered, the software calculates the PCI value and generates the

output shown in Figure 13. This spreadsheet summarizes the data that has been entered as well

as the values that are used to compute PCI.


Figure 13. Illustration of Sample PCI Output Spreadsheet

4.4 Generational Variety Index

The last step of product family analysis uses the Generational Variety Index (GVI) to measure the extent to which a component (or module) may need to be redesigned to meet the requirements of different market segments. The calculation of GVI is divided into two phases. The first phase is initiated by using the “GVI Analysis” tab under the “Generational Variety Index” tab, which opens the “GVI Phase I” form as shown in Figure 14.


Figure 14. Illustration of GVI Phase I Form for Customer Requirements


This form allows users to enter all the customer requirements one at a time. Once all the customer requirements are entered, users are prompted to divide the customer requirements into low, medium, and high priority. Priorities are assigned through the second “GVI Phase I” form, as shown in Figure 15.

Figure 15. Illustration of GVI Phase I Form for Assigning Priorities to Customer Requirements

Next, users are prompted to enter the engineering metric(s) that impact each of the customer requirements. This is done using the third "GVI Phase I" form, shown in Figure 16.


Figure 16. Illustration of GVI Phase I Form for Engineering Metrics

Next, users are prompted to enter target values for each of the customer requirements. Values for both the current and future markets must also be entered to complete Phase I of GVI. The form for entering the target values is shown in Figure 17.

Figure 17. Illustration of GVI Phase I Form for Target Values


Phase II is initiated once the last step of Phase I, entering the target values, is completed. Users are first prompted to select all of the parts that are affected by changes in the listed customer requirements. This is done using the form shown in Figure 18.

Figure 18. Illustration of GVI Phase II Form for Selecting Parts

Next, users are prompted to enter their redesign effort ratings for each part for the corresponding future change in each engineering metric (see Figure 19). Users can also choose to perform this task by entering the desired values directly in the "GVI Phase II" worksheet.

Figure 19. Illustration of GVI Phase II Form for Rating Parts for Redesign Effort


Lastly, GVI is computed by summing the user-specified ratings in each component column. Note that GVI can be calculated for each individual part in the product family once all of the parts are placed in the BOM. The GVI Phase I and Phase II spreadsheet outputs are shown in Figure 20 and Figure 21, respectively. In GVI Phase II, a pie chart displays the relative distribution of the GVI values of the components to be redesigned, where a higher GVI value corresponds to a larger area of the pie chart.
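For reference, the column roll-up can be illustrated with a minimal sketch, assuming the 9/6/3/1 redesign-effort rating scale of Martin and Ishii [31]; the part names and ratings below are hypothetical, not taken from the case study:

```python
# Each part maps to its column of redesign-effort ratings,
# one rating per (customer requirement, engineering metric) row.
ratings = {
    "Motor":   [9, 3, 1],   # hypothetical ratings
    "Housing": [6, 1, 0],
    "Chuck":   [3, 3, 3],
}

# GVI for a part is the sum of its column of ratings.
gvi = {part: sum(column) for part, column in ratings.items()}
print(gvi)  # {'Motor': 13, 'Housing': 7, 'Chuck': 9}
```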

Figure 20. Illustration of GVI Phase I Output Sheet


Figure 21. Illustration of GVI Phase II Output


CHAPTER 5

RESEARCH STUDY RESULTS AND OBSERVATIONS

To evaluate the benefits of the software implementation, a product family dissection and benchmarking study was conducted. The study involved eight participants benchmarking a product family of three power tools: a drill, a jigsaw, and a sander. The participant population consisted of eight graduate and undergraduate engineering students enrolled at Penn State University. Four volunteers were asked to perform the benchmarking of the product family using the software while the other four performed the benchmarking manually.

The objectives of the case study are to assess the impact of the software on:

1. Efficiency – measured in terms of the time to complete the three tasks:

a. Recording BOM

b. Developing the DSM

c. Calculating the PCI

2. Effectiveness – measured in terms of accuracy in the three tasks:

a. BOM – number of parts catalogued and labeled correctly (manufacturing process,

materials, assembly)

b. DSM – number of correct connections noted

c. PCI – difference of calculated PCI from the “expert” value

3. Satisfaction – measured by analyzing survey responses of participants who performed the

benchmarking analysis using the software


Other features of the software, such as clustering the DSM and the GVI, were not tested in the study. Clustering DSMs manually is not only a tedious task requiring long periods of time, but the resulting DSMs are also difficult to compare quantitatively. Additionally, the techniques used to cluster DSMs require a deeper understanding of the mechanical design of the products, and the results obtained would have been biased by the skill level and expertise of the participants. Similarly, for GVI, the understanding required is not limited to the products but extends to the market requirements and product evolution. Therefore, these tools were not included in the experimental study.

5.1 Summary of Experiment Protocol

As stated earlier, the study involved eight participants benchmarking a family of three power tools: a drill, a jigsaw, and a sander. The participant population comprised seven graduate industrial engineering students and one undergraduate engineering design student enrolled at Penn State University. Each of them had a basic understanding of manufacturing processes, materials, and the principles of mechanical devices.

All participants were first introduced to the concepts of product families, benchmarking analysis, and the benefits of commonality, while the participants using the software were also given a demonstration of the software, including its features and capabilities. Four participants performed the benchmarking analysis manually, using either pencil/pen and paper or simple word-processing/spreadsheet applications to do the calculations and record data. The remaining four participants performed the same benchmarking analysis using the software. The three products, namely the drill, jigsaw, and sander, were disassembled prior to the commencement of the activity, and the parts were labeled and laid out in an organized manner for the participants. Any questions or concerns of the participants were addressed, after which the participants were asked to sign an informed consent form to enable the use of their results in this study.

The steps performed individually by each participant (irrespective of the category assigned) were to:

1. Record the BOM, including part details such as part name, manufacturing process, materials, and assembly.
2. Develop the DSM based on the connections between the parts in their list.
3. Calculate PCI for the product family.
4. Record the time taken for each step in their benchmarking process.

After the conclusion of the benchmarking analysis, the participants who performed the study using the software were asked to fill out a survey evaluating their satisfaction with the software. Instruction sheets, including the list of part names (with individual part images for ease of identification), the list of requirements for the BOM, and the formula to calculate PCI (see Appendix B for sample instructions), were provided for each product. The results of the activity and the survey are discussed in the following sections.

5.2 Impact of Software on Benchmarking Efficiency

This section assesses the time required for manual benchmarking versus benchmarking with the software. The times taken by individual participants to record the BOM, develop the DSM, and calculate PCI, both manually and using the software, are listed in Table 2.


Table 2. Details of Time Taken by Participants to Perform Benchmarking

                    BOM (min)            DSM (min)            PCI (min)
Participant No.     Manual   Software    Manual   Software    Manual   Software
1                   65       65          22       7           31       4
2                   80       58          28       7           22       4
3                   70       60          30       4           25       5
4                   65       57          40       5           25       8

It can be seen from the table that the times taken using the software for benchmarking are consistently lower than those for manual benchmarking. This might be partly because participants using the software used the "Save As" feature of Microsoft Excel, which allowed them to save time on parts that were common between the products. To give a clear comparison between the two modes of analysis, the average times taken are compared in Figure 22. It can be seen that the average times to develop the DSM and calculate PCI using the software are roughly one-fifth of those for manual benchmarking. The times taken to record the BOMs do not differ as dramatically between the two modes of analysis, but the data is captured in a more systematic and consistent format, as discussed in Section 5.3. This allows users to analyze the data in multiple ways using the software's other analytical capabilities.


[Bar chart comparing average benchmarking times (min) for the BOM, DSM, and PCI tasks under manual and software benchmarking]

Figure 22. Comparison Between Manual and Software Benchmarking of Average Times Taken for Benchmarking Analysis

The number of observations for each mode of analysis is four. For a 95% confidence level, the minimum sample size required for a t-test is only two [40], while that for the Mann-Whitney U test is four; however, there are subtle issues with using t-tests on such small samples. For example, a t-test with only three patients per group (A and B) could be highly statistically significant, yet the three patients in group A might have been male and the three in group B female, so that gender, rather than the treatment, may explain the observed difference. The levels of understanding or knowledge of the participants regarding manufacturing processes, materials, and the principles of mechanical devices can be assumed to be similar since, as discussed earlier, the participants were mechanical or industrial engineering students; however, their knowledge levels were not measured. Either of the tests for central tendency could therefore be considered for drawing statistical conclusions. Generally, in cases where the sample size is less than 18, the Mann-Whitney U test is preferred [40]. Hence, considering the limitations mentioned above, the Mann-Whitney U test is used for making statistical conclusions.

A Mann-Whitney U test at 95% confidence level was performed to compare median times taken

for BOM, DSM, and PCI between the two modes of analysis. The results of the tests are

summarized in Table 3.

Table 3. Results of Mann-Whitney U Test at 95% Confidence Level for Median Times Taken for BOM, DSM, and PCI

Parameter                      BOM                  DSM                  PCI
Ho                             Time_manual = Time_software (for each task)
Ha                             Time_manual ≠ Time_software (for each task)
P-value (adjusted for ties)    0.055                0.029                0.028
Remarks                        Not Significant      Significant          Significant
                               at 95% Level         at 95% Level         at 95% Level
Conclusion                     Accept Ho            Accept Ha            Accept Ha

It can be concluded from Table 3 that the differences between the population median times taken for software and manual benchmarking of the DSM and PCI are statistically significant at the 95% confidence level. Hence, there is sufficient evidence to conclude that the DSM and PCI tools substantially improve the efficiency of product family benchmarking analysis. For the BOM, however, the difference between the population median times taken for software and manual benchmarking is not statistically significant at the 95% confidence level. Thus, there is insufficient evidence to conclude that the BOM tool significantly improves the efficiency of product family benchmarking analysis.
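For transparency, the tests in Table 3 can be reproduced from the raw timing data in Table 2. The following is a minimal sketch using SciPy (not the original analysis tool); small differences in the p-values can arise from how ties are handled:

```python
from scipy.stats import mannwhitneyu

# (manual, software) times in minutes from Table 2
times = {
    "BOM": ([65, 80, 70, 65], [65, 58, 60, 57]),
    "DSM": ([22, 28, 30, 40], [7, 7, 4, 5]),
    "PCI": ([31, 22, 25, 25], [4, 4, 5, 8]),
}

for task, (manual, software) in times.items():
    u, p = mannwhitneyu(manual, software, alternative="two-sided")
    print(f"{task}: U = {u}, p = {p:.3f}")  # approx. 0.055, 0.029, 0.028
```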

5.3 Impact of Software on Benchmarking Effectiveness

5.3.1 Impact of BOM Tool on Benchmarking Effectiveness

Out of the four participants performing the benchmarking analysis manually, two opted to record the BOM using pencil and paper while the remaining two used their own Microsoft Excel spreadsheets. A comparison study (see Appendix C for sample calculations) was completed to evaluate the consistency of the part details recorded, including the manufacturing process, materials, and assembly of each product, across all participants. This comparison provides an assessment of the software's effectiveness in terms of improving data accuracy and consistency across users. The four BOM spreadsheets of the participants performing benchmarking using the software were compared for materials, manufacturing process, and assembly, and a similar assessment was completed for the four BOMs from manual benchmarking. The percentages of correct materials, manufacturing process, and assembly information across users are shown in Table 4. To give a clear comparison between the two modes of analysis, the results are plotted in Figure 23, Figure 24, and Figure 25, respectively.


Table 4. Details of Percentage of Correct Material, Manufacturing, and Assembly Details for Manual and Software Benchmarking

                    Manual                                             Software
Participant No.     % Correct    % Correct Manuf.    % Correct         % Correct    % Correct Manuf.    % Correct
                    Materials    Process             Assembly          Materials    Process             Assembly
1                   75%          42%                 44%               83%          55%                 92%
2                   62%          14%                 11%               73%          38%                 80%
3                   75%          26%                 36%               88%          43%                 79%
4                   89%          36%                 30%               65%          31%                 88%

[Bar chart of the percentage of correct materials recorded by each participant (1-4) under manual and software benchmarking]

Figure 23. Percentage Comparison of Identical Details for Materials between Manual and Software Benchmarking


[Bar chart of the percentage of correct manufacturing processes recorded by each participant (1-4) under manual and software benchmarking]

Figure 24. Percentage Comparison of Identical Details for Manufacturing Processes Between Manual and Software Benchmarking

[Bar chart of the percentage of correct assembly details recorded by each participant (1-4) under manual and software benchmarking]

Figure 25. Percentage Comparison of Identical Details for Assembly between Manual and Software Benchmarking


It can be seen from Figure 23, Figure 24, and Figure 25 that the consistency achieved in recording data using the software is slightly higher overall than that of manual benchmarking. Because the users performing benchmarking with the software had the option of choosing materials and manufacturing processes from a list, consistency across them was significantly higher. This might seem an obvious conclusion since the manual users did not have the benefit of choosing from a list of options; however, it was observed in class activities that students are rarely provided with a list of options from which to choose. Thus, in the case of manual benchmarking, the types of materials or manufacturing processes considered are restricted by the users' knowledge and expertise. The software curbs this disparity between users' knowledge levels and minimizes the discrepancies in the data recorded across them.

A Mann-Whitney U test at 95% confidence level was performed to compare the percentage of correct details recorded for materials, manufacturing process, and assembly between the two modes of analysis. The results of the Mann-Whitney U tests are summarized in Table 5.


Table 5. Results of Mann-Whitney U Tests at 95% Confidence Level for Median Percentage of Correct Details for Materials, Manufacturing Process, and Assembly

Parameter      Material                           Manufacturing Process              Assembly
Ho             Correct_manual = Correct_software (for each detail type)
Ha             Correct_manual ≠ Correct_software (for each detail type)
P-value        0.66                               0.19                               0.03
Remarks        Not Significant                    Not Significant                    Significant
               at 95% Level                       at 95% Level                       at 95% Level
Conclusion     Accept Ho                          Accept Ho                          Accept Ha

It can be seen from Table 5 that the difference between the population median percentages of correct assembly details for software and manual benchmarking is statistically significant at the 95% confidence level, while for correct manufacturing process and material details the difference is not statistically significant. Thus, the BOM tool overall does not dramatically impact the effectiveness of the software for product family benchmarking analysis in this study.

The numbers of parts added and deleted for the two modes of analysis are shown in Table 6. It can be seen that the total number of parts added or deleted for software benchmarking was less than half of that for manual benchmarking. Parts were added if the participant did not list them in the BOM and deleted if they were incorrectly listed in the BOM.


Table 6. Comparison of Parts Added or Deleted for Manual and Software Benchmarking

Participant No.     Manual Benchmarking     Software Benchmarking
1                   1                       1
2                   2                       0
3                   2                       0
4                   4                       2

A Mann-Whitney U test was performed to statistically evaluate the difference in the population median number of parts added or deleted between the two modes of analysis. The results are shown in Table 7.

Table 7. Results of Mann-Whitney U Test for Median Number of Parts Added or Deleted for Manual and Software Benchmarking

Parameter                      Description
Ho                             Parts added/deleted_manual = Parts added/deleted_software
Ha                             Parts added/deleted_manual ≠ Parts added/deleted_software
P-value (adjusted for ties)    0.13
Remarks                        Statistically Not Significant at 95% Confidence Level
Conclusion                     Accept Ho

It can be concluded from the table that the difference between the population median numbers of parts added or deleted for software and manual benchmarking is not statistically significant at the 95% confidence level. Thus, there is insufficient evidence to conclude that the BOM tool dramatically impacts the effectiveness of the software in this study.


5.3.2 Impact of DSM Tool on Benchmarking Effectiveness

To evaluate the effectiveness of the DSM tool, the DSMs developed by all of the participants were assessed for the number of correct connections listed. All participants performing the benchmarking manually opted to use their own Microsoft Excel spreadsheets while the others used the benchmarking software. All of the participants recorded the part connections correctly; thus, the DSMs were correctly developed. The only difference among the participants who developed DSMs manually was that each of them used a different method to represent the DSM. One participant chose to develop two separate DSMs, one for physical connections and the other for energy connections, while other participants used numbers (1 = physical, 2 = energy) or color coding (blue = physical, green = energy). The software overcomes these inconsistencies by creating identically formatted DSMs. Thus, the DSM tool impacts the effectiveness of the software by acquiring data in a consistent format, making comparison easier.

5.3.3 Impact of PCI Tool on Benchmarking Effectiveness

To evaluate the effectiveness of the PCI calculation tool, the PCI values calculated by all of the participants were compared to the "expert" PCI value for the family of products. All participants performing the benchmarking manually opted to use their own Microsoft Excel spreadsheets while the others used the benchmarking software. In Table 8, the PCI values of all participants performing benchmarking manually and using the software are compared against the calculated "expert" value of 62.3%.


Table 8. Comparison of PCI Values Calculated by All Participants

                    Manual                                  Software
Participant No.     PCI Value    Difference from            PCI Value    Difference from
                                 "Expert" Value                          "Expert" Value
1                   64.70%       2.40%                      64.80%       2.50%
2                   74.00%       11.70%                     59.80%       -2.50%
3                   61.50%       -0.80%                     60.80%       -1.50%
4                   68.80%       6.50%                      66.10%       3.80%

A Mann-Whitney U test at 95% confidence level was performed to compare the difference of the PCI values from the expert value between the two modes of analysis. The result of the test is summarized in Table 9.

Table 9. Results of Mann-Whitney U Test at 95% Confidence Level for Difference of PCI Value from "Expert" Value

Parameter              Description
Ho                     PCI difference from expert value_manual = PCI difference from expert value_software
Ha                     PCI difference from expert value_manual ≠ PCI difference from expert value_software
P-value (two-tailed)   0.22
Remarks                Statistically Not Significant at 95% Confidence Level
Conclusion             Accept Ho


It can be seen from Table 9 that the difference between the deviations of the PCI values from the "expert" value for software and manual benchmarking is not statistically significant at the 95% confidence level. Thus, there is insufficient evidence to conclude that the PCI calculation tool dramatically impacts the effectiveness of the software in this study. However, these results might be slightly misleading since the PCI values can be similar even if the values of the f factors considered in the analysis are different. For example, the correct values for f1, f2, and f3 were 0.33, 1, and 0.66, respectively. Suppose a participant incorrectly enters the f values as 1, 0.66, and 0.33: the product of the f values in both cases would be the same even though the f values were wrongly assigned in the latter case. To evaluate the PCI values in more detail, the PCI calculation spreadsheets were assessed and compared to the expert value. The total number of parts included in the PCI analysis was 14. Adjustments, including the number of parts considered in the analysis and corrections to f values, were made to each of the PCI spreadsheets to attain values equal to the expert value. The results are shown in Table 10. It can be observed that in the case of manual benchmarking, the numbers of adjustments are higher than those in software benchmarking.

Table 10. Summary of Adjustments Made to PCI Tables of Each Participant

                    Manual Benchmarking                        Software Benchmarking
Participant No.     f Values     Parts Added or                f Values     Parts Added or
                    Adjusted     Deleted from Analysis         Adjusted     Deleted from Analysis
1                   2            -                             1            -
2                   4            2                             1            -
3                   2            1                             2            -
4                   5            2                             3            2


5.4 Satisfaction with the Software Based on Survey Results

A survey (see Appendix D for the survey form) was administered to assess overall satisfaction with the software, the level of satisfaction with individual features of the software, suggestions for improving it, and so on. The survey was completed only by the participants who performed the analysis using the software.

The survey has three sections to comprehensively evaluate each aspect of the software:

1. Evaluation of Ease of Use and Capability of the software
2. Evaluation of GUI and Usability of the software
3. Evaluation of BOM, DSM, and PCI tools

Each of the three sections had approximately ten questions. In total, each of the four participants provided responses to 29 questions. The percentages of the types of responses for each section are discussed in detail below.

5.4.1 Evaluation of Ease of Use and Capability of the Software

This section of the survey assesses the ease with which users learn to use the software and its capabilities. The results are shown in Figure 26. It can be observed that 83% of the responses indicated that users were satisfied with the clarity of the language, the ability to navigate the features, and the ability to find required features. The majority of users were also satisfied with the software's overall capabilities, such as operating speed, recording data, and saving and editing information, but 13% of the total responses indicated that users were undecided about the "ease of use and capability" of the software. Of these undecided responses, 80% concerned whether external applications like Media Player or Internet Explorer ran smoothly while the software was in use. This question was included in the survey because it is common for users to have multiple applications open while performing the analysis, so assessing the impact of these applications on the performance of the software is important. The primary reason for the undecided responses might be that users did not have any other applications running while they were working with the software.

[Pie chart of responses: Strongly Agree 25%, Agree 58%, Can't Say 13%, Disagree 5%]

Figure 26. Overall Satisfaction Levels for "Ease of Use" of Software

5.4.2 Evaluation of Graphical User Interface and Usability of the Software


This section of the survey assesses the Graphical User Interface of the software and also helps to support the hypothesis that using Microsoft Excel reduced the learning curve for the software. As discussed in Chapter 2, Microsoft Excel provides a familiar environment for users and sufficient capabilities at no extra cost. Figure 27 reveals that 72% of the responses positively indicate that the software has a satisfactory GUI. All of the users "Strongly Agreed" that using Microsoft Excel made it easier for them to understand and learn the software compared to a new independent platform. Furthermore, all of the users indicated that the layout of the software features was clear and intuitive, allowing them to find what they needed. The majority of users also indicated that the software could be easily integrated into classroom product dissection activities; however, 75% of the users were unsure about the commercial viability of the software.

[Pie chart of responses: Strongly Agree 36%, Agree 36%, Can't Say 21%, Disagree 7%]

Figure 27. Overall Satisfaction Level for "Graphical User Interface" of Software

The existing version of the software has a relatively "plain and simple" design; creative use of fonts and colors was not the primary focus. By integrating visually appealing backgrounds and colors along with enhanced user interaction (e.g., the flexibility to choose font formats, colors, and so on), the software can be made into a more attractive package. Adding these features will not cost extra in terms of software resources since the platform used to develop the software (VBA) is available at no extra cost.


5.4.3 Evaluation of BOM, DSM, and PCI Tools of the Software

This section of the survey assesses the BOM, DSM, and PCI features of the software. Figure 28 reveals that 75% of the responses were positive, indicating that the software has a satisfactory BOM feature; however, three out of four users disagreed or were not sure that the BOM tool prevented redundant work. This feature was rated the weakest overall and needs to be improved to reduce human effort and streamline the process of recording data.

[Pie chart of responses: Strongly Agree 25%, Agree 50%, Can't Say 19%, Disagree 6%]

Figure 28. Overall Satisfaction Level for "BOM" Feature of Software

The DSM tool was rated one of the best features of the software, along with the PCI tool. Results shown in Figure 29 reveal that 87% of the responses to the DSM tool were favorable, while for PCI, results shown in Figure 30 reveal that 100% of the users agreed that the tool significantly reduced time and avoided redundant work.


[Pie chart of responses: Strongly Agree 31%, Agree 56%, Can't Say 13%]

Figure 29. Overall Satisfaction Level for "DSM" Feature of Software

[Pie chart of responses: Strongly Agree 38%, Agree 62%]

Figure 30. Overall Satisfaction Level for "PCI" Feature of Software


5.5 Summary of Overall Observations of the Software

The DSM and PCI features of the software provide distinct advantages over manual benchmarking since the automation offered significantly reduces human effort and, as a result, human error. These tools also provide a consistent methodology that guides users through the various benchmarking steps. Consider the DSM tool, for example: the user is prompted to record the connections for each part, and since a component DSM is symmetric, the tool automates the symmetrical connections, thereby not only reducing human effort but also avoiding human error. Furthermore, it presents the DSM in a visually engaging way by color coding the different connections, sizing the DSM according to the space constraints of the spreadsheet, formatting the text, and so on. This helps users focus their time and effort on analyzing the DSM for product insights (its core purpose) rather than on developing the DSM and making it look appealing.
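As an illustration of the symmetry automation described above, the following minimal sketch mirrors each one-sided connection into both halves of the matrix (the part names are taken from the sander connection list in Appendix B; the data structure is illustrative, not the toolkit's VBA implementation):

```python
parts = ["Motor Gear", "Motor", "Red Wire", "Black Wire"]
connections = [("Motor Gear", "Motor"), ("Motor", "Red Wire"), ("Motor", "Black Wire")]

idx = {part: k for k, part in enumerate(parts)}
dsm = [[0] * len(parts) for _ in parts]
for a, b in connections:
    dsm[idx[a]][idx[b]] = dsm[idx[b]][idx[a]] = 1  # fill both halves at once

for row in dsm:
    print(row)
```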

Meanwhile, for PCI, the tool extracts the relevant data available from the BOM for the calculations (e.g., materials, manufacturing, and assembly processes to calculate f2 and f3) and prompts the user only for the inputs that require judgment calls (e.g., f1, the size and shape factor). This not only helps reduce the time for PCI calculations significantly (as seen in Figure 22) but also gives users the ability to test the product family with different combinations of products and analyze each scenario easily. These tools reduce the resources and logistics involved in product family benchmarking by reducing the time and effort required of users and, as a result, help ensure consistency of results over time.


In this study, the DSM was developed for a product with only 19 parts. As the number of parts increases, the complexity increases and the task becomes repetitive, thereby increasing not only the time required but also the possibility of errors. Furthermore, the benchmarking analysis for the product family was performed by each participant individually. However, in classroom activities, and especially in industry, benchmarking analysis is performed in teams rather than by individuals due to its tedious nature. The software helps overcome the communication gap between team members by streamlining the benchmarking analysis process (e.g., the same input form is used by all users, resulting in the same level of abstraction of the detail recorded, and the time required decreases as familiarity with the software increases), standardizing the nomenclature used (e.g., drop-down lists provide users with lists of materials, part names, and so on), and guiding the users through the process (the same steps are followed each time the analysis is performed). The impact of these tools on data collection and benchmarking analysis is discussed in the next chapter.


CHAPTER 6

CLOSING REMARKS AND FUTURE WORK

In this thesis, a new Microsoft Excel-based software package is presented to support product family analysis and benchmarking. The software provides an integrated package of various benchmarking tools currently used in industry and by researchers and faculty (for educational purposes). It can be used by companies to benchmark their competitors' product families and/or to analyze their own products. The product family benchmarking capabilities of the software streamline and standardize the workflow involved in utilizing several common product family tools. Apart from the case study discussed in this thesis, the software has been used as part of product dissection activities in undergraduate and graduate courses as well as extensively in testing activities during development. The advantages of the PFAToolkit observed during these activities are manifold: it saves considerable time and reduces human effort by automating the generation of DSMs, the clustering of DSMs, and the calculation of PCI, and it guides the user in a methodical and organized way. This not only helped achieve more accurate and consistent results compared to manual benchmarking analysis but also reduced variability in recording the required data, which can be attributed to the numerous error checks and standard list options, including part nomenclature, materials, and manufacturing processes. Users exploited the "Save As" option of Excel to save time when recording data for parts common between products. It was also observed that as the number of parts increased, the disparity in data consistency, results, and time taken for analysis increased between the two modes of benchmarking analysis (manual and software).


Making decisions on product family designs involves assessing various trade-offs. As the software evolves into a more comprehensive toolkit, it will empower designers to make better decisions by providing a systematic assessment from diverse perspectives. This can be achieved by extending the research and improving the depth of assessment of each of the toolkit's features.

For example, tools like interface matrices [41] and change propagation analysis [42] can provide

more systematic assessment of modularity of the product families. The Coupling Index [31] is

another tool that can address redesign efforts of products along with GVI. Integrating other

commonality metrics such as Percent Commonality Index (%C), Degree of Commonality Index

(DCI), Commonality Index (CI), and so on [29] would provide useful means to redesign product

families using commonality indices [43]. By introducing these features in the toolkit, the user

will have various options to explore based on individual needs. Development of these

capabilities also opens new doors for future work on benchmarking tools, as well as

opportunities for leveraging shared data in other applications. A growing area of research

involves the development of design repositories that catalog component geometry, interface, and

performance characteristics. Data entry standardization is a driving requirement for these benchmarking capabilities, and leveraging the data archived in design repositories, combined with the standardization of metrics, offers potential for increased design automation power. In

addition, the combination with Computer Aided Design (CAD) data offers a more complete

component geometric characterization as well as potential for graphical design representations.

The usability of the tool can be improved by designing more efficient solutions for extracting data from a company's MRP (Material Requirements Planning) system or by linking Microsoft Excel with a database to gather the data necessary for the BOM and PCI analyses. Another limitation is that the GA-based clustering of DSMs takes only physical connections into consideration; future work would involve considering other connection types in the clustering algorithm to provide more complete solutions. Scaling the clustering approach to larger DSMs is also important, as the current GA implementation can only handle 200 components at a time. Other future work includes comprehensive testing and evaluation of the proposed software, including the GVI tool, in an industry setting to determine its impact on competitive commercial benchmarking.


Appendix A

Installation of PFAToolkit Add-in

Following are the steps to install the PFAToolkit Microsoft Excel Add-in:

Step 1: Save ‘PFAToolkitv1.0.xla’ file in a convenient location.

Figure A. PFAToolkitv1.0.xla file icon

Step 2: Open a Microsoft Excel 2007 workbook. Go to the 'Office button' (top left corner) and select 'Excel Options'.

Figure B. Excel Options in Microsoft Excel 2007

Step 3: In the Excel Options window, click on 'Add-Ins' (left panel). At the bottom, a tab called 'Manage' appears; press Go.

Figure C. Excel Options window

Step 4: Browse to the .xla file in the location where it was saved; once OK is pressed, the software will be added under the Add-Ins tab in the top main menu.


Figure A-1. Add-In Installation


Appendix B

List of Part Names for Drill, Jig Saw, and Sander, List of Requirements for BOM, List of

Part Connections for DSM, and Formula to Calculate PCI

List of Part Names and Pictures for Drill

Item No. Part Name Picture

1 Battery Charger

2 Battery

3 Left Clamshell

4 Right Clamshell


5 Transmission Housing

6 Contacts

7 Black Wire

8 Red Wire

9 Switch

10 LED


11 Switch Cover

12 Direction Switch

13 Motor

14 Transmission Cap

15 Thin Planet Gear

16 Ridged UFO Gear


17 Thin Ring Gear

18 Speed Adjustor Housing

19 Small Planet Gear

20 Smooth UFO Gear

21 Washer

22 Thick Planet Gear


23 Thick Ring Gear

24 Ball Bearing

25 Chuck

26 Horizontal Balance

27 Vertical Balance

28 Speed Switch


29 Direction Switch Extender

30 LED Cover

31 Long Nut

32 Screw 1

33 Bolt 1

34 Screw 5


35 Screw 6


List of Part Names and Pictures for Jig Saw

Item No. Part Name

Picture

1 Battery Charger

2 Battery

3 Left Clamshell

4 Right

Clamshell

5 Contacts

6 Red Wire


7 Black Wire

8 Switch

9 PCB

10 LED

11 Motor

12 Connector


13 Gear


14 Arm

15 Bearing

16 Blade Holder

17 Positioning

Switch

18 Blade Positioner


19 Pin

20 Safety

21 Switch Cover

22 Shield

23 Sled

24 Bracket


25 Square Nut


26 Screw 3

27 Screw 4

28 Screw 5

29 Screw 1


List of Part Names and Pictures for Sander

Item No. Part Name

Picture

1 Battery Charger

2 Battery

3 Left

Clamshell

4 Right

Clamshell

5 Switch Cover

6 Gear


7 Circular Bearing

8 Weight


9 Shaft

10 Support

11 Sander

12 Motor Gear


13 Motor

14 Switch

15 Contacts

16 Red Wire

17 Black Wire

18 Screw 1


19 Screw 2


Part Connections for Sander

Item No.  Part Name          Physical Connections                               Energy Connections
1         Battery Charger    Battery                                            Battery
2         Battery            -                                                  -
3         Left Clamshell     Right Clamshell, Switch Cover, Circular            -
                             Bearing, Support, Motor, Contacts, Screw
4         Right Clamshell    Switch Cover, Circular Bearing, Support,           -
                             Motor, Contacts, Screw
5         Switch Cover       Switch                                             -
6         Gear               Shaft, Motor Gear                                  -
7         Circular Bearing   Shaft                                              -
8         Weight             Shaft                                              -
9         Shaft              Support                                            -
10        Support            Sander, Screw 2                                    -
11        Sander             Screw 2                                            -
12        Motor Gear         Motor                                              -
13        Motor              Red Wire, Black Wire                               Red Wire, Black Wire
14        Switch             Red Wire                                           Red Wire
15        Contacts           Red Wire, Black Wire                               -
16        Red Wire           -                                                  -
17        Black Wire         -                                                  -
18        Screw 1            -                                                  -
19        Screw 2            -                                                  -

*NOTE: ALL CONNECTIONS LISTED ARE SYMMETRICAL IN NATURE; e.g., Part No. 12 "Motor Gear" has "Motor" listed as a connection, which also implies that "Motor" is connected to "Motor Gear".


List of Requirements

There are 3 requirements for the case study.

1. Prepare a Bill of Materials for the 3 products. The products have been disassembled and laid out in an organized manner. The Drill, Sander, and Jigsaw are the three products from the Durabilt product family. You will also find a list of part names, along with pictures, for each of the products; please use the same part names. In addition to the part name, all you need to record is the part number, manufacturing process, materials, and assembly type. An example is shown below:

Part No.   Part Name   Manuf. Proc.   Material     Assembly
1          Screw       Machining      Mild Steel   Fastening
2          Battery     Various        Various      Snap

After you complete recording the BOM for all 3 products, record the time taken to complete it.

2. Next is preparing the DSM. Prepare the DSM only for the Sander. The connections between the various parts are given in a sheet. The connections include only the lower triangular matrix; e.g., if the battery is connected to the battery contacts, this implies that the battery contacts are connected to the battery, but in the connections sheet the battery will only be listed as connected to the battery contacts. You have to make sure it is connected the other way around too in the DSM. Only energy and physical connections are included.

After you complete developing the DSM, record the time taken to complete it.


3. Lastly, you will need to calculate PCI for the product family of the 3 products. You will find the formula for PCI along with the other documents.

Once you have completed calculating the PCI, record the time taken for it.

The mode of recording details is up to you: you can use pen/pencil, MS Excel, or MS Word. You can also use a combination of these!


PCI Formula – Instructions

The PCI formula is given by Eq. (1). It provides a relative measure of commonality for comparing different product families.

\[
\mathrm{PCI} = 100 \times \frac{\displaystyle\sum_{i=1}^{P} n_i \, f_{1i} \, f_{2i} \, f_{3i} \;-\; P}{N \times P \;-\; P} \tag{1}
\]

where:

P = total number of non-differentiating components that can potentially be standardized across models;

N = number of products in the product family;

n_i = number of products in the product family that have component i;

f_{1i} = size and shape factor for component i; the ratio of the greatest number of models that share component i with identical size and shape to the greatest possible number of models that could have shared component i with identical size and shape (n_i);

f_{2i} = materials and manufacturing processes factor for component i; the ratio of the greatest number of models that share component i with identical materials and manufacturing processes to the greatest possible number of models that could have shared component i with identical materials and manufacturing processes (n_i); and

f_{3i} = assembly and fastening schemes factor for component i; the ratio of the greatest number of models that share component i with identical assembly and fastening schemes to the greatest possible number of models that could have shared component i with identical assembly and fastening schemes (n_i).
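As a companion to Eq. (1), the calculation can be illustrated with a minimal sketch, assuming the reconstructed form of the equation above and lists of the n_i and f-factor values for the P non-differentiating components (the numbers below are hypothetical, not from the case study):

```python
def pci(n, f1, f2, f3, N):
    """Product Line Commonality Index per Eq. (1) for a family of N products."""
    P = len(n)  # number of non-differentiating components
    cci = sum(ni * a * b * c for ni, a, b, c in zip(n, f1, f2, f3))
    return 100 * (cci - P) / (N * P - P)

# Three products, two components: one fully common, one partially common.
print(round(pci([3, 3], [1.0, 0.33], [1.0, 1.0], [1.0, 0.66], N=3), 1))  # 41.3
```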


Appendix C

Sample Assessment Table of Correct Details Comparison for Manual Benchmarking

1 – Indicates identical data records across users.
e.g., in the figure above, for part 1, the same material was recorded by Participants 1, 2, and 4; Participant 3 recorded a different material, so Participant 3 was designated 0.

0.5 – Indicates identical data records between 2 users.
e.g., in the figure above, for part 12, the same material was recorded by Participants 1 & 2, while a different common material was recorded by Participants 3 & 4.

0 – Indicates non-identical data records across users.
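A minimal sketch of this scoring rule, assuming four participants and one recorded value per participant for a given part (the function name and sample values are hypothetical):

```python
from collections import Counter

def agreement_scores(records):
    """Score each participant's record: 1 if three or more users agree,
    0.5 for pairwise agreement, and 0 otherwise."""
    counts = Counter(records)
    return [1.0 if counts[v] >= 3 else 0.5 if counts[v] == 2 else 0.0
            for v in records]

print(agreement_scores(["ABS", "ABS", "Nylon", "ABS"]))      # [1.0, 1.0, 0.0, 1.0]
print(agreement_scores(["Steel", "Steel", "Zinc", "Zinc"]))  # [0.5, 0.5, 0.5, 0.5]
```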


Appendix D

PFAToolkit Satisfaction Survey Form

 

I. YOUR VIEWS ON "EASE OF USE" OF THE SOFTWARE.                PARTICIPANT NO.: ____

1. Language is easy to understand
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

2. External applications (e.g., Media Player, Internet Explorer, MS Word, and so on) run smoothly while using the software
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

3. Data recording or input method is simple
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

4. Easy to correct/edit mistakes
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

5. Easy to navigate to find necessary features of the software
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

6. Switching between worksheets when working on an application (e.g., Bill of Materials Input Form, Product Line Commonality Index Form) is easy
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

7. Easy to exit any application you are working on
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

8. Easy to resume any application after exiting it
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

9. Software operates at an acceptable speed
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

10. Users can save at regular intervals and re-enter the information at any point
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

II. YOUR VIEWS ON "GRAPHICAL USER INTERFACE and USABILITY" OF THE SOFTWARE.

1. Background and text (fonts) are pleasing and easy to read
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

2. Colors used in the software applications are used in an effective way (e.g., DSM)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

3. Layout is clear and intuitive; learners can always find what they need
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

4. Layout is logical & consistent on all pages
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

5. All aspects of the software can be easily integrated into classroom product dissection activities
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

6. The software is commercially viable
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

7. Using Microsoft Excel as the background platform made it easier to understand and learn the software as compared to a new independent platform
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

III. WRITE DOWN THE APPROXIMATE TIME TAKEN FOR EACH TASK OR ANALYSIS

1. Time taken to record necessary part details through the BILL OF MATERIALS DATA INPUT FORM: ________

2. Time taken to generate DSM: ________

3. Time taken to perform PCI ANALYSIS: ________

IV. PLEASE RATE THE FOLLOWING FOR EACH INDIVIDUAL FEATURE OF THE SOFTWARE.

a. Bill of Materials Input Form

1. Easy to enter data
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

2. Avoids redundancy of work
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

3. Error checks integrated in the form improve data consistency (accuracy of data)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

4. Saves time (as compared to manually recording it using an electronic spreadsheet)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

Suggestions/Remarks:

b. Connections Input Form and Generating DSM Tool

1. Easy to enter data
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

2. Avoids redundancy of work
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

3. Error checks provided improve data consistency (accuracy of data)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

4. Saves time (as compared to manually recording it using an electronic spreadsheet)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

Suggestions/Remarks:

c. Product Line Commonality Index (PCI) Input Form(s)

1. Easy to enter data
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

2. Avoids redundancy of work
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

3. Error checks provided improve data consistency (accuracy of data)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

4. Saves time (as compared to manually recording it using an electronic spreadsheet)
   o Strongly Agree   o Agree   o Can't Say   o Disagree   o Strongly Disagree

Suggestions/Remarks:

REFERENCES

[1] Sarkis, J., 2001, "Benchmarking for Agility," Benchmarking: An International Journal, 8(2), 88-107.

[2] Hieb, R., 1995, Benchmarks and benchmarking: a definitional analysis. Daneva, M. Institute of Information Systems, University of Saarlandes, Saarbrücken, Germany.

[3] Harrington, H. J., 1996, The Complete Benchmarking Implementation Guide: Total Benchmarking Management, New York, McGraw-Hill.

[4] Bogan, C. E. and English, M. J., 1994, Benchmarking for best practices : winning through innovative adaptation, New York, McGraw-Hill.

[5] Sansone, F. P. and Singer, H. M., 1993, "AT&T’s 3-phase plan rings in results," Appliance Manufacture, 41(2), pp. 71-74.

[6] Halman, J. I. M., Hofer, A. P. and Vuuren, W. V., 2005, "Platform-Driven Development of Product Families", Product Platform and Product Family Design: Methods and Applications, Simpson, T. W., Siddique, Z. and Jiao, J., Eds., New York, Springer, pp. 27-47.

[7] Hoffman, C., 2006, "The Teardown Artists," Wired, 14(2), pp. 136-140.

[8] Thevenot, H. J. and Simpson, T. W., 2009, "A Product Dissection-Based Methodology to Benchmark Product Family Design Alternatives," Journal of Mechanical Design, 131(4), pp. 041002-1-041002-9.

[9] Farrell, R. S. and Simpson, T. W., 2008, "A method to improve platform leveraging in a market segmentation grid for an existing product line," Journal of Mechanical Design, 130(3), 031403-1 - 031403-11.

[10] Camp, R. C., 1989, Benchmarking: The Search for Industry Best Practices that Lead to Superior Performance, Milwaukee, WI, ASQC Quality Press.

[11] Bhutta, K. S. and Huq, F., 1999, "Benchmarking – Best Practices: An Integrated Approach," Benchmarking: An International Journal, 6(3), pp. 254-68.

[12] Spendolini, M. J., 1992, The Benchmarking Book, New York, NY, American Management Association Communications (AMACOM).

[13] Ye, X. L., Thevenot, H. J., Alizon, F., Gershenson, J. K., Khadke, K., Simpson, T. W. and Shooter, S. B., 2009, "Using product family evaluation graphs in product family design," International Journal of Production Research, 47(13), pp. 3559-3585.

[14] Jain, A., 2010, Process for systematic product family dissection and benchmarking, Industrial and Manufacturing Engineering. State College, Pennsylvania State University. Master of Science.

[15] Ulrich, K., 1995, "The Role of Product Architecture in the Manufacturing Firm," Research Policy, 24(3), 419-440.

[16] Pozos, A., 1995, Benchmarking: an overview, Benchmarking of Agility Workshop. Fort Worth, TX, Automation and Robotics Research Institute.

[17] Melnyk, S. A. and Christensen, R. T., 2000, Back to Basics: Your guide to manufacturing excellence, Boca Raton, FL, CRC Press.

[18] Otto, K. N. and Wood, K. L., 1998, "Product evolution: A reverse engineering and redesign methodology," Research in Engineering Design-Theory Applications and Concurrent Engineering, 10(4), pp. 226-243.


[19] Whitney, D. E., 2004, Mechanical Assemblies: Their Design, Manufacture, and Role in Product Development, New York, Oxford University Press.

[20] Ulrich, K. and Tung, K., 1991, "Fundamentals of Product Modularity", Proceedings of the 1991 Winter Annual Meeting DE, Atlanta, GA, 73-79.

[21] Korpela, J. and Tuominen, M., 1996, "Benchmarking logistics performance with an application of the analytic hierarchy process," IEEE Transactions on Engineering Management, 43(3), pp. 323-333.

[22] Browning, T. R., 2001, "Applying the design structure matrix to system decomposition and integration problems: A review and new directions," IEEE Transactions on Engineering Management, 48(3), pp. 292-306.

[23] Pimmler, T. U. and Eppinger, S. D., 1994, "Integration Analysis of Product Decompositions," ASME Design Theory and Methodology - DTM ‘94, DE-Vol. 68, pp. 343-351.

[24] Huang, C. C. and Kusiak, A., 1998, "Modularity in design of products and systems," IEEE Transactions on Systems Man and Cybernetics Part a-Systems and Humans, 28(1), 66-77.

[25] Kusiak, A. and Larson, N., 1995, "Decomposition and Representation Methods in Mechanical Design," Journal of Mechanical Design, 117, 17-24.

[26] He, D. W., Kusiak, A. and Tseng, T. L., 1998, "Delayed product differentiation: A design and manufacturing perspective," Computer-Aided Design, 30(2), pp. 105-113.

[27] Thebeau, R. E., 2001, Knowledge Management of System Interfaces and Interactions for Product Development Processes, M.S. Thesis, System Design and Management Program, Massachusetts Institute of Technology, Cambridge, MA.

[28] Rogers, J. L., Korte, J. J. and Bilardo, V. J., 2006, Development of a Genetic Algorithm to Automate Clustering of a Dependency Structure Matrix, NASA Technical Memorandum, TM-2006-21429

[29] Thevenot, H. J. and Simpson, T. W., 2006, "Commonality indices for product family design: a detailed comparison," Journal of Engineering Design, 17(2), pp. 99-119.

[30] Kota, S., Sethuraman, K. and Miller, R., 2000, "A metric for evaluating design commonality in product families," Journal of Mechanical Design, 122(4), pp. 403-410.

[31] Martin, M. V. and Ishii, K., 2002, "Design for variety: developing standardized and modularized product platform architectures," Research in Engineering Design-Theory Applications and Concurrent Engineering, 13(4), pp. 213-235.

[32] Hauser, J. R. and Clausing, D., 1988, "The House of Quality," Harvard Business Review, pp. 63-73.

[33] Browning, T. R., 2001, "Applying the design structure matrix to system decomposition and integration problems: A review and new directions," IEEE Transactions on Engineering Management, 48(3), 292-306.

[34] Fernandez, C. I. G., 1998, Integration Analysis of Product Architecture to Support Effective Team Co-Location, M.S. Thesis Mechanical Engineering, Cambridge, MA, Massachusetts Institute of Technology

[35] Swift, K. G. and Booker, J. D., 1997, Process selection from design to manufacture, London, Arnold.

[36] Thevenot, H. J. and Simpson, T. W., 2007, "Guidelines to minimize variation when estimating product line commonality through product family dissection," Design Studies, 28(2), 175-194.


[37] Martin, M. V. and Ishii, K., 2002, "Design for variety: developing standardized and modularized product platform architectures," Research in Engineering Design-Theory Applications and Concurrent Engineering, 13(4), 213-235.

[38] Yu, T. L., Yassine, A. A. and Goldberg, D. E., 2007, "An information theoretic method for developing modular architectures using genetic algorithms," Research in Engineering Design, 18(2), 91-109.

[39] Rissanen, J., 1978, "Modeling by Shortest Data Description," Automatica, 14(5), 465-471.

[40] Fay, M. P. and Proschan, M. A., 2010, "Wilcoxon-Mann-Whitney or t-test? On assumptions for hypothesis tests and multiple interpretations of decision rules," Statistics Surveys, 4, pp. 1-39. DOI: 10.1214/09-SS051.

[41] Dobberfuhl, A. and Lange, M. W., 2009, "Interfaces per Module: Is There an Ideal Number?," ASME International Design Engineering Technical Conferences & Computers and Information in Engineering Conference, San Diego, CA.

[42] Eckert, C., Clarkson, P. J. and Zanker, W., 2004, "Change and customisation in complex engineering domains," Research in Engineering Design, 15(1), 1-21.

[43] Halman, J. I. M., Hofer, A. P. and Vuuren, W. V., 2006, "Commonality Indices for Assessing Product Families", Product Platform and Product Family Design: Methods and Applications, Simpson, T. W., Siddique, Z. and Jiao, J., Eds., New York, Springer, pp. 107-129.