
Automation in Construction 16 (2007) 392–407, www.elsevier.com/locate/autcon

Product data modeling using GTPPM — A case study

Ghang Lee a,⁎, Rafael Sacks b, Charles Eastman c

a Department of Architectural Engineering, Yonsei University, 134 Sinchon-Dong, Seodaemun-Gu, Seoul 120-749, Republic of Korea
b Civil and Environmental Engineering, Technion Israel Institute of Technology

c College of Architecture/College of Computing, Georgia Institute of Technology

Accepted 13 May 2006

⁎ Corresponding author. Tel.: +82 2 2123 5785; fax: +82 2 365 4668. E-mail address: [email protected] (G. Lee).

doi:10.1016/j.autcon.2006.05.004

Abstract

The Georgia Tech Process to Product Modeling (GTPPM) is a formal process-centric product modeling method. It enables capture of domain-specific information and work processes through process modeling. The method was initially developed in response to the need to integrate multiple use-cases with differing data definitions from different companies. It automates and applies formal methods to aspects of process and product modeling that have traditionally been negotiated by committees. It has, in addition, been deployed in several research projects as an information flow analysis method rather than as a product modeling method. This paper reports a case of deploying GTPPM as a product modeling method, using a purpose-built software tool for its implementation. The case study in the domain of precast concrete construction demonstrates that it is possible to semi-automatically derive a product data model from the collected information through normalization, information integration, and conflict resolution processes.
© 2006 Elsevier B.V. All rights reserved.

Keywords: GTPPM; Product modeling; Process modeling; Precast concrete

1. Introduction

Product (data) modeling initially started in the 1980s as an effort to develop a standard data format for exchanging product information between different software applications. Several international product modeling efforts, such as ISO-STEP and IAI-IFC, have been initiated, and much work has been invested to develop standard product models. (See the Glossary at the end of this paper for acronyms and abbreviations.) The most comprehensive effort in the AEC arena is the IAI's development of the Industry Foundation Classes (IFC) model, which is intended to cover the whole gamut of building and construction. The current IFC model – version 2×2 [12] – has received broad acceptance and has been used in a growing number of projects.

The current product modeling practices can be categorized into two types. The current ISO-STEP product modeling process, from elicitation of domain knowledge, through reconciliation of the meanings of vernacular terms, proposal and refinement of appropriate data structures, and ‘democratic’ approval of product model candidates, has largely been performed by task groups (e.g.,


TC 184 SC/4 workgroups) supported by product modeling professionals. The resultant model, called an Application Reference Model, is later integrated into other standard ISO-STEP models, called Integrated Resources and Application Protocols. As a result, current product modeling practice is very time-consuming and laborious. It commonly takes about 5 to 10 years to generate and validate one product model, including the committee review process to approve the developed model as a standard. The IAI is taking a slightly different approach. Since its inauguration in August 1994, the IAI focused on developing a framework for the IFC based on ISO-STEP models until the first commercial-level model (IFC 1.5.1) was released 4 years later, in July 1998. Since then, the IFC extension models have been developed as addenda to the main framework model. In general, it takes a shorter time to develop IFC extension models than to develop ISO-STEP models because IFC extension models are often defined as extended subtype structures of the IFC kernel layer, and that by only a small number of product modelers. Nevertheless, the development of extension models still takes almost 3 years on average (Table 1). The impact of long durations for product model development is that product model development often lags behind development of the very applications that generate the data whose transfer the models are intended to facilitate. This issue will grow as product models become more commonly used. While development durations for any product model or product model extension are naturally dependent on budgeting and scope issues, it appears that extension models with complex geometry information (e.g., the IFC ST-series in Table 1) require longer durations than other extension models. The process can be made more efficient, and its results more reliable, than it is at present.

Table 1
Development durations for IFC extension models (source: Ref. [29])

IFC extension model | Development period | Development duration (years:months)
HVAC performance validation [BS-7] | January 1998–May 2003 | 5:4
HVAC modeling and simulation [BS-8] | June 2001–May 2003 | 3:0
Network IFC: IFC for cable networks in buildings [BS-9] | January 2002–May 2003 | 1:4
Code compliance support [CS-4] | May 2001–May 2003 | 2:1
Electrical installations in buildings [EL-1] | April 2002–May 2003 | 1:2
Costs, accounts and financial elements [FM-8] | March 2001–May 2003 | 2:3
Material selection, specification and procurement [PM-3] | 2001–May 2003 | 2:0
Steel frame constructions [ST-1] | October 1998–May 2003 | 4:7
Reinforced concrete structures and foundation structures [ST-2] | January 1997–May 2003 | 6:4
Precast concrete construction [ST-3] | 2001–May 2003 | 2:0
Structural analysis model and steel constructions [ST-4] | August 2000–May 2003 | 2:8
IFC drafting extension [XM-4] | April 2001–April 2003 | 2:0
Average | | 2:9

A new process-centric product modeling method called the Georgia Tech Process to Product Modeling (GTPPM) method was developed with these issues in mind. The strategy was to develop an effective method that started by allowing each company to define their current or planned processes in detail. This research effort included the development of both formal analysis methods for converting the raw information flow data into semantic structures, and their conversion into product model constructs. It also involved conceptual and user interface development to facilitate domain expert entry of relevant, correct and complete information used in the activities being modeled. The logic and rules governing the method's function have been documented previously [7,17,21].

GTPPM has been deployed and used across a spectrum of industrial and academic applications. It has been deployed iteratively in the Precast Concrete Software Consortium (PCSC)¹ project between 2001 and 2004 [27] and in research projects at Purdue [3], Carnegie Mellon [9,28], Georgia Tech [27] and the Technion [25]. Sacks et al. [27] collected fourteen information flow models developed by North American precast concrete producers using GTPPM and analyzed and reported various aspects of the information flows of the North American precast concrete fabrication process. The study reviews the information flow of different project delivery types (e.g., design build, subcontract, piece supply) in terms of the level of detail (the number of activities to the number of information flows) and the Design Structure Matrix [23], which was automatically extracted from GTPPM. In none of these, however, was GTPPM used to the full extent of its capabilities as a product modeling tool, as it was in the case study presented here.

¹ The PCSC is a consortium of major precast concrete producers in Canada and the US formed in 2001. Its goals are to fully automate and integrate engineering, production, and construction operations, to gain productivity, and ultimately to increase market share. As the means to achieve these goals, the PCSC chose to develop an intelligent 3D parametric CAD system and a Precast Concrete Product Model (PCPM) to enable data exchange between diverse systems used through the lifecycle of precast concrete pieces.

Song et al. [28] used GTPPM to represent the pipe fabrication and material tracking processes as part of the FIATECH Smart Chips project. Ergen et al. [9] attempted to formally represent precast supply chains as component-based information flow using the concept of “material flow” embedded in GTPPM. Navon and Sacks [25] captured and represented the control processes of a large company with sophisticated information systems using GTPPM to study the issues in Automated Project Performance Control (APPC). However, in these projects, GTPPM was deployed as a high-level information flow and process analysis method, rather than a product modeling method (as it was initially conceived).

In this paper, we describe a case study of deploying GTPPM as a product modeling method and discuss its advantages as such. This case study was conducted with three precast concrete fabricators in the US from 2003 to 2004 as part of an effort to develop a preliminary product data model for supporting data exchange between three different processes. Section 2 introduces the general structure and characteristics of GTPPM. Section 3 describes how GTPPM was implemented as a software application. Sections 4.1 and 4.2 present two aspects of the Requirements Collection and Modeling (RCM) method. Section 4.3 details how a GTPPM model can be represented in an SQL format for comparison with the database structures of legacy systems that are targets for integration. Section 4.4 provides empirical data on the degree of modeling effort required for the process, and Section 4.5 summarizes the Logical Product Modeling (LPM) step.

2. A standard product modeling method and GTPPM

The current international standard for the product modeling process [13] is composed of three steps, which are respectively represented by three model types: the Application Activity Model (AAM), the Application Reference Model (ARM), and the Application Interpreted Model (AIM). The AAM defines the use-cases of information in terms of processes and information flows. The ARM is a data model that is defined in a context of certain information use-cases, without reference to related but external engineering domains or common information concepts used in multiple domains. The ARM is restructured as the AIM product model by integrating the common information concepts, called Integrated Resources, and the relations to external part models. The AIM is the final data model integrated with other standard Parts.


The AAM modeling phase is the first step of product modeling. It is equivalent to the requirements collection and modeling phase (RCM) (or, more generally, process modeling) of a general data modeling process [8]. There are several process modeling methods such as the Flowchart, UML, and DFDs. IDEF0 is the most commonly used in product modeling and is endorsed for use by ISO-10303, the STEP standard [13]. In addition to the basic process model semantics (e.g., Activity, Flow, Decision, etc.), IDEF0 allows modelers to describe Input Information, Control, Output Information, and Mechanism (ICOM), but only as short phrases. In standard practice, the ARM modeling phase develops a full list of information items required for the processes defined in the AAM; this is called the logical product modeling phase (LPM). EXPRESS-G and EXPRESS are the logical product modeling languages [14]. Current practice focuses on the development of an AAM that reflects a general, common process followed industry-wide. As a result, AAMs are high level and do not reflect detailed workflows. In an ideal situation, however, multiple AAMs that represent various targeted information use-cases and workflows would be developed in order to define a standard product model that can support various information use-cases defined within multiple companies, at a more detailed level. However, since IDEF0 and other process modeling methods do not provide a mechanism to elicit individual input and output information items from a certain information-use context, AAM modeling becomes merely a task to understand a general industry practice and to define the overall scope of a product model. We are not aware of any standard product model that is developed based on more than one AAM.

Furthermore, the process of collecting domain terms and definitions from domain experts and translating them into a product model has been informal. The validity and the completeness of the collected information items could not be formally evaluated until the resultant product model is implemented as a translator. Ironically, detailed and various information exchange scenarios between different software applications and required subset information need to be specified prior to the development of a translator or to the creation of views, which should have been done in the process of developing a standard product model. A project to compensate for this problem in IFC, which was initiated in a slightly different context, is the Information Delivery Manual (IDM) project — an effort to define different “data views (conformance classes)” and information delivery protocols for different processes [30,31]. GTPPM is an effort to provide a formal method to define and reuse information requirements throughout the Requirements Collection and Modeling (RCM) phase, the Logical Product Modeling (LPM) phase, and subsequent product-model deployment phases. GTPPM assumes the following processes:

a. Requirements collection and modeling phase (RCM): Domain experts specify various information use-cases (flow) and specific information items used in each use-case with help from product modeling experts through this phase.

b. Logical product modeling phase (LPM): GTPPM definesrules to resolve conflicts between collected information items

and restructure them into an integrated product model. A preliminary product model can be derived automatically from the collected information items using these rules. Product modeling experts elaborate and finalize the Application Reference Model (ARM). This model is then integrated with Integrated Resources to generate the AIM.

The sub-sections below briefly describe the general architecture and the characteristics of GTPPM. The detailed descriptions of the rules, examples, and tutorials can be found in other resources [17,18]. Elaborated and modified rules are to be published in Refs. [19,20,22].

2.1. Requirements collection and modeling phase

RCM is a graphical requirements-collection-and-modeling method for capture of information in the context of its use. It is similar to other requirements-collection-and-modeling methods for data modeling in that it is also based on a general process modeling concept, but differs from others in that GTPPM provides users with the logic and mechanisms to define specific information items required by each activity. An RCM model consists of two parts: process modeling and specification of product information.

2.1.1. Process modeling components

Each process model represents a sequence of activities and information flows between them. GTPPM defines the activity as a logical step of processing information. It receives input information and yields output information:

Output := Activity(Input)

In this respect, an activity is interchangeable with a system or a function. For example, a flowchart may represent a business process or an algorithm. A sequence of activities (i.e., receiving and producing information) naturally forms an information flow.

The GTPPM tool includes thirteen process modeling components (Fig. 1). The difference between the GTPPM-RCM module and other process modeling methods is not in the process modeling components themselves, but in the relations between the process modeling components and specific information items. For example, a static information source (e.g., building codes, industry standards) cannot be defined within a process and can carry only pre-defined information items, whereas a dynamic information repository (e.g., a database management system) can store and return any information item generated or used in a process. Another example is that high-level activities do not carry any definition of specific information items, but work only as a grouping mechanism for detailed activities. In order to avoid any conflict in information definitions between high-level activities and their constituent detailed activities, all information items are stored and carried in detailed activities alone.
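As a rough illustration of these distinctions, the sketch below reduces an activity, a static information source, and a dynamic information repository to the information items they carry. It is our own simplification for exposition, not part of the GTPPM tool, and the class and item names are assumptions.

```python
from dataclasses import dataclass, field

# Illustrative sketch (ours, not the GTPPM tool): an activity carries explicit
# input/output information items; a static source carries only pre-defined
# items; a dynamic repository stores anything that flows into it.

@dataclass
class Activity:                       # Output := Activity(Input)
    name: str
    inputs: set[str] = field(default_factory=set)
    outputs: set[str] = field(default_factory=set)

@dataclass
class StaticSource:                   # e.g., building codes, industry standards
    name: str
    items: frozenset[str] = frozenset()   # fixed, cannot grow within a process

@dataclass
class DynamicRepository:              # e.g., a database management system
    name: str
    items: set[str] = field(default_factory=set)

    def store(self, produced: set[str]) -> None:
        self.items |= produced        # accepts any item generated in the process

if __name__ == "__main__":
    estimate = Activity("Prepare estimate", inputs={"project name"}, outputs={"contract value"})
    erp = DynamicRepository("ERP database")
    erp.store(estimate.outputs)
    print(erp.items)                  # {'contract value'}
```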

The collected RCM process models can be considered the targeted information use-cases for the Universe of Discourse (UoD). Different users (or companies, or applications) may use information in different ways. GTPPM (RCM) encourages domain experts to generate a process model based on their current workflow, or their envisioned future workflow. It does not require (and indeed discourages) compromise in process modeling by requiring alignment with any ‘typical’ or ‘industry-standard’ workflow or process.

Fig. 1. An example of a GTPPM–RCM model.

Table 2
Mapping between company-specific terms in VIIs and information constructs (ICs)

Company I (VIIs) | Company II (VIIs) | Information Constructs (ICs)
Site name | Construction site name | SITE{name}
Site address | Construction site location | SITE{address}
Estimated weight | Load | PIECE+LOADS{weight, unit}
Piece mark | Mark number | PIECE{piece mark}
Serial number | Control number | PIECE{control number}

2.1.2. Specification of product information

In GTPPM, product information can be specified in two ways — in unrestricted local terms or in a machine-readable format. The terms defined using unrestricted local terms are called vernacular information items (VIIs) and the information items specified in a formal manner are called information constructs (ICs). The terms used to describe the same information often differ from organization to organization, from company to company, and local terminology may vary or conflict. For this reason, the method allows domain experts to specify information used by each activity in local terms, using VIIs first, and then map them to ICs to support automation of the analysis process. If domain experts are able to work directly in terms of information constructs, the VII specification process can be omitted.

ICs are defined in a machine-processable format following simple syntactic rules [19]. Modelers can specify the information used by each activity in a consistent and analyzable way using ICs. In order to avoid any semantic ambiguities, information constructs are specified using the terms predefined in an information menu. (If necessary, more terms can be added later.) An information menu is a collection of tokens that are used in a universe of discourse (UoD), with a classification structure similar to the parts of speech in a language. The classification structure restricts the ways in which tokens can be strung together in constructing information items. An example of an IC is “project+non_residential_building*parking_deck{project_code}”. The plus symbol (+) denotes the association (“associated with”) or decomposition (“part of”) relations, the asterisk symbol (*) denotes the specialization (“type of”) relation, and attributes are enclosed in curly brackets. The attributes always belong to the last entity in the concatenation. (See Ref. [19] for more details on the notation and the rules.)
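To make the notation concrete, the following minimal parsing sketch (our own illustration, not part of the GTPPM tool or of the grammar in Ref. [19]) splits an IC string into its entity chain, the relation preceding each entity, and the attributes of the last entity.

```python
import re

# Minimal sketch: parse an IC of the form "entity+entity*entity{attr1, attr2}".
IC_PATTERN = re.compile(r"^(?P<path>[^{}]+)(\{(?P<attrs>[^{}]*)\})?$")

def parse_ic(ic: str):
    """Return (entity_chain, attributes) for one IC string.

    entity_chain is a list of (relation, entity) tuples, where relation is
    '+' (association / "part of"), '*' (specialization / "type of"),
    or None for the first entity. Attributes belong to the last entity.
    """
    match = IC_PATTERN.match(ic.strip())
    if not match:
        raise ValueError(f"Not a well-formed IC: {ic!r}")

    path = match.group("path")
    attrs = [a.strip() for a in (match.group("attrs") or "").split(",") if a.strip()]

    # Split the entity path while keeping the '+' / '*' separators.
    tokens = re.split(r"([+*])", path)
    chain = [(None, tokens[0].strip())]
    for relation, entity in zip(tokens[1::2], tokens[2::2]):
        chain.append((relation, entity.strip()))
    return chain, attrs

if __name__ == "__main__":
    chain, attrs = parse_ic("project+non_residential_building*parking_deck{project_code}")
    print(chain)   # [(None, 'project'), ('+', 'non_residential_building'), ('*', 'parking_deck')]
    print(attrs)   # ['project_code']
```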

Examples of VIIs and ICs are provided in Table 2. Any two given companies (denoted Company I and Company II in Table 2) may use different terms within their organizations. If domain experts are unfamiliar with the structure of ICs, they can start defining information items required for their processes using VIIs. Later, either domain experts or product modelers can map them to ICs.

2.2. Logical product modeling phase

The ICs collected through the RCM phase are analyzed, integrated, and converted into a product model through the Logical Product Modeling (LPM) phase. LPM is an algorithmic process to derive a product model from collected information constructs. It is composed of two main steps:

• Extraction and integration of information constructs (ICs) from multiple RCM models.

• Normalization of collected information constructs into a formal product data model.

Fig. 2. GTPPM system configuration.

Since GTPPM encourages the domain experts to define all possible different information use-cases and information constructs required by them, some of the information constructs collected in the LPM phase may have conflicting definitions. The LPM process resolves such conflicts between information constructs and integrates the collected information constructs into a single well-formed product data model. The normalization and conflict resolution rules are defined as Design Patterns [1,11]. The current LPM process is composed of twelve design patterns, which are similar to the normal forms in relational databases [2,4,5]. They are similar in that both seek to eliminate anomalies and redundancies in a data structure and to incrementally define a well-formed data model. The major difference is that the LPM integration and normalization process is a schema-level normalization, whereas relational database normalization is an instance-level normalization. A simple example of a design pattern is that, if Entity A is a subtype of Entity B and has Attribute N in one information construct, and another information construct defines that Entity B also has Attribute N as its attribute, Attribute N should be removed from Entity A since it can inherit Attribute N from Entity B when the two information constructs are integrated. Details on the LPM normalization design patterns can be found in Ref. [20]. The last step, harmonizing the LPM with the Integrated Resources and the related external constructs in other data models, is undertaken by the domain experts, using conventional methods.
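The following sketch illustrates the single design-pattern example above (removing a subtype attribute that is already declared by a supertype). It is our own illustration, not the tool's implementation of the twelve patterns, and the entity and attribute names are assumptions.

```python
# Illustrative sketch of the attribute pull-up example: if a subtype declares an
# attribute that a supertype also declares, drop it from the subtype so that it
# is inherited instead. Entity/attribute names are examples only.

def normalize_redundant_attributes(entities: dict[str, dict]) -> dict[str, dict]:
    """entities maps entity name -> {'supertype': name or None, 'attrs': set of attributes}."""
    for name, entity in entities.items():
        supertype = entity["supertype"]
        # Walk up the supertype chain and remove attributes already declared above.
        while supertype is not None:
            entity["attrs"] -= entities[supertype]["attrs"]
            supertype = entities[supertype]["supertype"]
    return entities

if __name__ == "__main__":
    schema = {
        "piece":    {"supertype": None,    "attrs": {"piece_mark", "weight"}},
        # One RCM model also attached 'weight' directly to the subtype 'spandrel':
        "spandrel": {"supertype": "piece", "attrs": {"weight", "geometry_3d"}},
    }
    print(normalize_redundant_attributes(schema)["spandrel"]["attrs"])  # {'geometry_3d'}
```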

3. GTPPM implementation

Product modeling is a collaborative effort between domain experts and product modeling experts (mediators). GTPPM structures and formalizes the collaboration between the domain experts and product modeling experts and automates the product model development procedures. To realize these capabilities, a GTPPM software tool has been implemented as an MS Visio® add-on using the Visio graphic engine and Application Programming Interface (API) (Fig. 2). The GTPPM shapes listed in Fig. 1 were modeled and defined as a Visio Stencil, and then their behaviors and functions were implemented using the Visio API. Examples of behavior include automated shape and identifier creation and update. Examples of functions include syntax checkers, the information flow consistency checker, and information collectors. An Excel file was used as a data repository for the information menu and the vernacular data dictionary, and was linked to the GTPPM RCM module through the Microsoft Component Object Model (COM) interface. The GTPPM LPM module, which was also implemented using the Visio API, includes various export functions and information integration and normalization functions. It can export the activity names, the context, and information associated with them as an Excel file, and can also integrate and normalize information collected from the RCM model into a single integrated EXPRESS model.

The GTPPM method is unique in that it enables process modeling with explicit detailed information flows; this is also the key feature of the software tool that distinguishes it from other graphical process modeling interfaces. Not only do users place symbols representing activities, information flows and controls, but the system prompts for detailed information item lists (drawn from the information menu for the domain) for each activity and flow. It also checks for consistency in the information flows, as will be explained in Section 4.2.

The tool is designed to support alternative modeling approaches, as illustrated in Fig. 3(a) to (c). Information items can be defined by domain experts or by product modeling experts. Information required by each task (or activity) can be defined as ICs, as shown in Fig. 3(a); if domain experts prefer to specify information items using their local terms, then they may use VIIs, and the procedure is as shown in Fig. 3(b). In the latter case, the VIIs are eventually mapped to ICs. Finally, GTPPM analyzes and derives a product model from the collected ICs, in the LPM procedure shown in Fig. 3(c). ICs can be collected from different process models and can be integrated into a single model; conflicts between ICs are resolved in the LPM phase.

4. A test case

GTPPM was deployed in a test-case product-modeling project. In order to capture different types of information use-cases, the management processes (i.e., estimating, bidding, production, and shipping) of two precast producers, Company A and Company B, and a precast concrete “designing/drafting” process of Company C, were modeled using the GTPPM–RCM module. Companies A and B were chosen because they had advanced database management systems for managing estimation, production, and shipping information, which could be compared with a product model generated through the GTPPM method. Company C was chosen because it had well-developed guidelines for designing precast concrete pieces. Based on the guidelines, the designing/drafting processes for double tees (Fig. 4) and exterior columns were modeled.

The RCM models of the three companies were developed by the process modeling experts (the authors) based on the interviews with the management-level personnel of each company. Later, the models generated were reviewed by the domain experts at the three companies. Based on the review comments, the models were then elaborated. The information items were first defined as vernacular information items (VIIs), based on the interviews and various types of forms provided by the three companies, and then mapped to information constructs (ICs) later. The following sections describe the modeling process in detail with examples, test-case results, and lessons learned.

Fig. 3. Two possible GTPPM (RCM) modeling procedures (using ICs alone and using VIIs) and the GTPPM (LPM) procedure. a) the Requirements Collection and Modeling (RCM) process using Information Constructs (ICs). b) the Requirements Collection and Modeling (RCM) process using Vernacular Information Items (VIIs). c) the Logical Product Modeling (LPM) process.

4.1. Mapping between Vernacular Information Items (VIIs) and Information Constructs (ICs)

Fig. 4. A stack of double tees.

This section describes how VIIs and ICs were created and mapped in detail. Once the process models were developed, the groups of information items that are transferred from one activity to another were modeled as information sets (Fig. 5), based on standard company reports required by the end of certain activities (e.g., job summary sheet, turnover meeting check list, piece tag, and packing slip). The information set is GTPPM's mechanism to group and name collections of information items. The sets were first defined with vernacular information items (VIIs). Examples of specified information sets and their items are as follows (a ‘takeoff list’ is a report of the quantity of products and subcomponents):

PROJECT INFORMATION SHEET {; project name; location; report date; purchaser; address; city_state_zip; project size; job#; contract value; taxes; status; type; sold as; detailed project requirements; Sales Rep; estimator;}

PACKING SLIP {; address; city_state_zip; job#; truck number; trailer number; truck driver; payment method; po#; piece mark; piece qty; piece description; comments; contents packaged by; contents checked by; contents received by; delivered date;}

PIECE TAG {; bar code; piece weight; piece mark;}

TAKEOFF LIST {; project name; location; job#; product type id; product element id; product name; product qty; product size; product u/m; estimator; estimate no; area code; distance between the project site and the plant; piece mark; piece depth; piece width; piece unit length; piece weight; load name; total loads; total # of pieces; piece qty;}

Fig. 5. Examples of information sets (company names are mosaicked).


The VIIs were then mapped to ICs using the Information Item Mapper (Fig. 6), in consultation with the domain experts. Fig. 6 shows the user interface of the Information Item Mapper. The left-hand window panes list tokens by their relations (e.g., the specialization relation, the association relation, and attributes). Tokens in specialization relations inherit attributes from their parents. Users can form information constructs (ICs) by navigating through the lists, and then mapping the resulting ICs to VIIs in the right-hand window (labeled ‘Mapped Information’).

VIIs and ICs were generally mapped one to one. However, several VIIs and ICs were mapped one to many or many to one. (This implies that some VIIs and ICs are in fact in a many-to-many relation.) VIIs that were synonyms were mapped to single ICs. Some of the VIIs actually included several pieces of information, and had to be mapped to several ICs. An example of the latter is galvanized embed order status. In order to keep track of the order status of a product or a part in terms of data management, we need to know specifically which item has been ordered, what the purchase order identifier is, and so on. However, when such information is maintained in a paper format, it is recorded informally and freely as one long note. To clarify the precise meaning, the data recorded in the VII galvanized embed order status was mapped to four distinct ICs as follows:

PIECE+MATERIAL*HARDWARE{;type;};
PIECE+MATERIAL*HARDWARE{;id;};
PIECE+MATERIAL*HARDWARE+PURCHASE_ORDER{;status;};
PIECE+MATERIAL*HARDWARE+PURCHASE_ORDER{;id;}

Fig. 6. Mapping ambiguous terms based on the descriptions.

Some VIIs may have different meanings to different people, depending on their familiarity with the domain. The VII rebar schedule is a good example. A rebar schedule is not a time-based activity schedule for making or placing rebars; instead it denotes a listing of rebars for production, including diameters, lengths and shape designations, often, but not always, including an abstract 2D representation of the bent rebar shapes. In the mapping process, ambiguous VIIs such as rebar schedule were mapped to ICs based on the definitions, data types, examples, references, and synonyms of the VIIs (the right side of Fig. 6).

VIIs specified in information sets were automatically converted to ICs according to the mapped relations between VIIs and ICs. The input and output information items of activities were specified using information sets as the targets of information production. The consistency of the information flows was then checked using the automated routines available in the tool.
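A minimal sketch of how such a one-to-many mapping table might be applied programmatically is shown below. The entries are taken from the examples above, but the helper function and the unmapped-item handling are our assumptions, not the behavior of the Information Item Mapper.

```python
# Illustrative sketch only: a one-to-many VII-to-IC mapping table and a helper
# that converts an information set (a named list of VIIs) into ICs. A real
# mapping would come from the project's vernacular data dictionary.

VII_TO_IC = {
    "piece mark": ["PIECE{piece mark}"],
    "estimated weight": ["PIECE+LOADS{weight, unit}"],
    "galvanized embed order status": [
        "PIECE+MATERIAL*HARDWARE{;type;}",
        "PIECE+MATERIAL*HARDWARE{;id;}",
        "PIECE+MATERIAL*HARDWARE+PURCHASE_ORDER{;status;}",
        "PIECE+MATERIAL*HARDWARE+PURCHASE_ORDER{;id;}",
    ],
}

def convert_information_set(name: str, viis: list[str]) -> dict[str, list[str]]:
    """Return the ICs for every VII in an information set; unmapped VIIs are flagged."""
    result = {}
    for vii in viis:
        result[vii] = VII_TO_IC.get(vii.lower(), ["<unmapped VII>"])
    return result

if __name__ == "__main__":
    print(convert_information_set("PIECE TAG", ["bar code", "piece weight", "piece mark"]))
```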

4.2. Ensuring logical consistency in information process models

As mentioned in the Introduction, GTPPM was first used by fourteen PCSC member companies for analyzing their sales, design, engineering, and production processes [27]. Analysis of these models revealed logical inconsistencies in many of the information flows. For example, information items modeled as generated in a source activity were never used in any subsequent activity; items reported to be received in a consecutive activity were neither imported into nor generated by the source activity. These issues motivated development and implementation of a rigorous method to validate the consistency of the information flows defined within process models. The dynamic consistency checking method validates the consistency of information flow as the flows are entered by the domain experts, based on the availability of information: i.e., if input information required for an activity is not provided by the upstream activities, the information flow is inconsistent. The dynamic consistency checking method defines the relationships between different information types (i.e., input, output, generated information, and remaining information) as rules (see Ref. [21] for more details on the rules). In the test case project, the dynamic consistency checking method was used to check the validity of collected information.

Fig. 7. The activity information window.

Fig. 7 shows the GTPPM Activity Information Window. Activity “Schedule Pieces to Fabrication Areas” receives input information and returns output information. The goal of dynamic consistency checking is to eliminate two types of inconsistency indicators — the unavailable- or the not-provided-information items. The unavailable information items are the input information items that are not provided by upstream activities:

UnavailableInformation ≡ ∪ {x | x ∈ input(A)} − ∪ {y | y ∈ output(up(A))}

where

x, y        information items
up(A)       upstream activities of an activity A
output(A)   output information of an activity A
input(A)    input information of an activity A

New information items can be generated as a result of an activity, but cannot be generated when they are only transferred from one activity to another. Such illogicality can be corrected by taking one of the following remedies or other approaches, depending on the cause.

– Add new output information items to one of the upstream activities.
– Remove the unavailable information items from the input item list of the current activity.
– Add a new activity that can provide the unavailable information items.
– Add or redirect information flows.

The not-provided-information items are the information items that are required by downstream activities, but have not been provided by any upstream activities. The not-provided information is formally defined as follows:

Not-providedInformation ≡ ∪ {x | x ∈ input(dn(A))} − ∪ {y | y ∈ output(up(dn(A)))}

where

dn(A)   downstream activities of an activity A

Similar to the unavailable information problem, the not-provided information items can be eliminated either by editing the information lists of relevant activities; adding or removing activities; or by adding, deleting, or redirecting information flows.
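A minimal sketch of the unavailable-information check, under the simplifying assumption that each activity lists its inputs, outputs, and upstream activities, is given below. It reflects our reading of the definition above, not the tool's actual implementation, and the activity and item names are illustrative.

```python
from dataclasses import dataclass, field

# Illustrative sketch: an item is "unavailable" for an activity if the activity
# requires it as input but no upstream activity provides it as output.

@dataclass
class Activity:
    name: str
    inputs: set[str] = field(default_factory=set)
    outputs: set[str] = field(default_factory=set)
    upstream: list["Activity"] = field(default_factory=list)

def unavailable_information(activity: Activity) -> set[str]:
    """input(A) minus the union of output(up(A)) over all upstream activities."""
    provided: set[str] = set().union(*(up.outputs for up in activity.upstream))
    return activity.inputs - provided

if __name__ == "__main__":
    takeoff = Activity("Prepare takeoff list", outputs={"piece mark", "piece qty"})
    schedule = Activity("Schedule pieces to fabrication areas",
                        inputs={"piece mark", "piece qty", "mold availability"},
                        upstream=[takeoff])
    print(unavailable_information(schedule))  # {'mold availability'} -> needs a remedy
```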

The same logic has been implemented in the Consistency Checker (Fig. 8), which can detect and mark any inconsistent or empty activity in an RCM model. This batch checking function can be effectively used, especially when a product modeling expert needs to check the validity of RCM models collected from non-product-modeling experts (i.e., domain experts).

Fig. 8. The consistency checker.

By deploying the dynamic consistency checking method, missing or illogical information items were detected and corrected in the three collected RCM models. Note, however, that the dynamic consistency checking method can be rendered ineffective if a modeler repeatedly copies one information definition list from activity to activity without rigorously compiling the information items one by one, because there will not be any logical inconsistency between the resulting identical information inputs and outputs.

4.3. Practicality of an automatically generated model

The Company A and Company B information constructs were normalized into two separate preliminary product models in EXPRESS. In order to compare the results with the data structures of Company A's current database management system, the information constructs collected from Company A's model were also normalized into an SQL schema. For this process, an SQL code generation module was developed and used to show referential relationships between TABLEs. Fig. 9 shows the SQL table structure generated from the Company A model with referential relations.

Fig. 9. An SQL table structure of the Company A model with referential relations.

Company A's ERP system was running multiple database management systems, which were composed of several commercial and custom-built database management systems. Company A was using an MS Access®-based estimation system, two Oracle®-based production scheduling, shipping, inventory, and purchase management systems, a legacy accounting/costing system, an engineering/drawing management system, and a human resource/payroll system. However, only limited sets of information could be exchanged between the different database management systems. The company was in the process of developing a central database that would integrate the dispersed databases and that could acquire geometric information and bills of materials (BOMs) directly from an advanced 3D CAD system.

Direct and quantitative comparison between the automatically generated data model and the data schemas of Company A's ERP system was limited because of the fundamental differences between them and also because of the differences between the terms used in the two data schemas. For example, the automatically generated model was designed as one large schema while Company A's existing database system was a distributed set of schemas. The automatically generated data model was based on an object-oriented modeling approach (i.e., EXPRESS) whereas Company A's systems were relational databases using SQL. In order to flatten the inheritance structure of the object-oriented model, the automatically generated EXPRESS model was translated into an SQL model based on one [32] of several mapping methods from EXPRESS (object-oriented database) models to relational databases [10,16,24,26,32,33]. The automatically generated SQL model included thirty-seven TABLEs. Still, the fundamental terms and structural problems remained.
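As a rough illustration of such a flattening, the sketch below maps a tiny EXPRESS-like hierarchy to SQL tables, copying inherited attributes down into each table and expressing associations as foreign keys. This is a simplified strategy of our own, not the mapping method of Ref. [32], and the entity and attribute names are examples only.

```python
# Illustrative sketch: one table per entity, inherited attributes copied down,
# associations rendered as foreign keys, STRING-like columns by default.

ENTITIES = {
    # name: (supertype, own attributes, associations to other entities)
    "piece":    (None,    ["piece_mark", "control_number"], ["project"]),
    "spandrel": ("piece", ["geometry_3d"],                  []),
    "project":  (None,    ["project_code", "name"],         []),
}

def all_attributes(name: str) -> list[str]:
    """Collect own plus inherited attributes by walking up the supertype chain."""
    supertype, attrs, _ = ENTITIES[name]
    return (all_attributes(supertype) if supertype else []) + attrs

def to_sql(name: str) -> str:
    _, _, assocs = ENTITIES[name]
    columns = [f"{name}_id INTEGER PRIMARY KEY"]
    columns += [f"{attr} VARCHAR(255)" for attr in all_attributes(name)]
    columns += [f"{ref}_id INTEGER REFERENCES {ref}({ref}_id)" for ref in assocs]
    return f"CREATE TABLE {name} (\n  " + ",\n  ".join(columns) + "\n);"

if __name__ == "__main__":
    for entity in ENTITIES:
        print(to_sql(entity))
```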

Thus the differences between the property set structures and the terms prohibited the authors from comparing the two different schemas directly and quantitatively. Each TABLE and its attributes were reviewed by the authors and the IT manager at Company A. In terms of the existence of TABLEs and the attributes associated with them, the following sixteen TABLEs were categorized as over-defined, nineteen TABLEs as closely defined, and only two were categorized as under-defined:

1. Over-defined: TABLEs that include more information than Company A's current data models: ASSEMBLY, BIDDING, BOM (bill of materials), BUILDING_CODE, CONSTRAINTS, DIMENSIONS, ENGINEERING, EQUIPMENT, ERECTION, GEOMETRY (2D, 3D), MOLD, QC_CHECK, SHIPPING, SURFACE_TREATMENT, TRUCK_LOADS.

2. Closely Adequate: TABLEs that define information at the same level as Company A's current data models: DESIGN REQUIREMENTS, DOCUMENTATION, DRAWING, ERECTION_DRAWING, ESTIMATION, HARDWARE, HARDWARE_LIST, LABOR, MATERIAL, PIECE, PIECE_DRAWING, PIECE_LIST, PRODUCTION_AND_HANDLING, POUR, PRESTRESSING, PROJECT, REINFORCEMENT, SCHEDULE, SITE.

3. Under-defined: TABLEs that lack necessary information: BATCH (mix recipe), CONCRETE (mix recipe).

The over-defined TABLEs include additional attributes and entities that were not managed by Company A at that time, but that they wished to manage in the near term. Examples include 3D geometry and engineering information, equipment information, quality control and constraint check information, and additional shipping and BOM information. The two TABLEs related to the concrete mix process were under-defined because the concrete mix process was defined with little detail in the requirements collection model. This shows the sensitivity of GTPPM to its requirements collection model; it can only create a product model based on the use-cases that are specified in the requirements collection process.

4.4. Modeling effort required

One of the goals of the GTPPM method is to reduce the degree of human effort required for product modeling. Empirical evidence of the impact the method has on the requirements collection and the logical product modeling phases was recorded through development of the process models for Companies A, B, and C, in the previous sections, and compilation of a sample unified product model described in Section 4.5 below.

Fig. 10. A part of a double tee modeling process.

Table 3
The statistics of model components

Process model components | Company A | Company B | Company C
Activities: Internal detail | 98 | 96 | 55
Activities: External detail | 13 | 29 | 7
Activities: Internal high level | 9 | 7 | 4
Activities: External high level | 13 | 29 | 7
Activities: Total (nA) | 133 | 161 | 73
Flows: Information flow (nF) | 210 | 179 | 160
Flows: Feedback flow | 14 | 9 | 3
Flows: Material flow | 70 | 64 | 0
Other: Dynamic repository | 10 | 14 | 21
Other: Static information source | 6 | 10 | 2
Other: Continue | 84 | 42 | 18
Information items: Information sets | 24 | 6 | 0
Information items: VIIs (non-distinctive) | 192 | 186 | 0
Information items: ICs (distinctive) | 135 | 231 | 85
Degree of information dependence (nF/nA) | 1.58 | 1.11 | 2.19

In contrast to the models of Companies A and B, which focused on management and administrative procedures, the process model prepared at Company C captured precast concrete engineering procedures. The procedures for designing and drafting prestressed double tee elements (Fig. 4) and exterior columns were selected and modeled. Engineering processes are more difficult to model than administrative processes because of the high degree of domain expertise they include and also because of their complexity; even domain experts with more than 10 years of experience find it difficult to describe engineering and design processes in a systematic way. In this case, the information items of each activity were defined directly without using information sets. As before, they were first defined as vernacular information items (VIIs) and then mapped to information constructs (ICs).

The major difference between a business management process and a designing/drafting process in terms of information flow is that information flow in the designing/drafting process is accumulative: i.e., a model of a precast concrete structure behaves as a data repository. As soon as a designer adds a shape or text to a precast concrete model or to a drawing, they represent certain information. Also, such design information affects not only the activities immediately following it, but also many other activities that appear later in the process. Therefore, the information transfer can be modeled using dynamic (information) repositories. In this case study, precast concrete pieces were modeled using dynamic repositories, a concept similar to a database, allowing dynamic inputs and outputs, as shown in Fig. 10. In Fig. 10, “DT model” and “Drawings from clients” are examples of dynamic repositories. They receive and redistribute collections of information to other activities.

Table 4
Modeling durations (hours)

Task | Company A | Company B | Company C
Process modeling | 24 | 24 | 6
Capturing detailed information flows using VIIs | 12.5 | 3 | 2
Mapping VIIs to ICs | 7.5 | 2 | 2
Total | 44 | 29 | 10


Another difference between the previous Company A and Company B models and the Company C model is that the Company C model included specific types of products. Since the Company A model focused on the management process, types of pieces were defined by generic information such as product name or piece-mark, whereas, in the Company C designing/drafting model, types of pieces were defined specifically as spandrel, pc_column, floor_piece, etc. In order to design a piece, designers need to know which type of piece (at an object level) is connected to which other type of piece. For the same reason, although the Company C model dealt with only two product types (double tees and spandrels), the definitions of adjacent pieces and connections were also captured.

The GTPPM process models collected from Companies A, B and C contained 135, 231, and 85 distinctive information constructs respectively (Table 3). In terms of the number of process components, there was no significant difference between the Company A and Company B models; the models included 133 and 161 activities and 210 and 179 information flows respectively. The Company C model was smaller than the two previous models in terms of both the number of process components and the number of distinctive information items, because it only dealt with a small portion of the design and engineering process. It included 73 activities and 160 flows. The degree of information dependence in each model, which is the ratio of the number of information flows (nF) to the number of activities (nA) [27], was 1.58, 1.11, and 2.19 respectively.
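The ratios follow directly from the flow and activity counts in Table 3, as the short check below confirms.

```python
# Degree of information dependence nF/nA, using the values reported in Table 3.
models = {"Company A": (210, 133), "Company B": (179, 161), "Company C": (160, 73)}
for company, (n_flows, n_activities) in models.items():
    print(f"{company}: {n_flows / n_activities:.2f}")  # 1.58, 1.11, 2.19
```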

Table 4 shows the modeling hours recorded for the three models. The RCM modeling process took 44, 29, and 10 h respectively, whereas development of Company A's ERP system had occupied commercial database developers for several months. The difference between 44 hours and several months is significant given that the resultant model was very close to the data model of the working ERP system. Although this study did not focus on productivity, this result could be strengthened in subsequent studies by conducting controlled experiments focusing on the time reduction issue.

4.5. Logical product modeling and refinement

Information constructs collected from the above three models were integrated as one model through the LPM process. This phase is automated, and does not require user input. The integrated model included 129 entities; for reference and comparison, CIS/2 LPM 6 [6] has 731 entities and PCC-IFC Version 0.9 [15] has 413 entities. The syntax of the automatically generated integrated product model was validated using two syntax checking tools (those embedded in the commercial tool EXPRESS Data Management (EDM®) Supervisor Version 4.5 and in the shareware Expresso Version 3.1.4). The integrated model was then implemented as a physical database management system using the EDM Server.

Fig. 11 illustrates an EXPRESS-G model of the integrated piece definitions from the Company A, Company B, and Company C models (‘dt’ in the model represents the double tee entity). The EXPRESS-G model was automatically generated by importing the integrated model into STEP Tools® ‘as-is’ without any refinement or modification. As shown in Fig. 11, a product model automatically derived from various process models using GTPPM is not complete and requires further refinement. This is not due to any logical problem in GTPPM, but due to its sensitivity to specified processes. For example, in Fig. 11, among three subtypes (‘spandrel’, ‘pc_column’, and ‘floor_piece’) of ‘piece’, only ‘spandrel’ has 3-dimensional geometry information. It is not an error and does not mean that ‘pc_column’ or ‘floor_piece’ will not have 3-dimensional geometry. It simply means that, in the information constructs collected from the three test cases described above, only ‘spandrel’ had a case where it was associated with 3-dimensional geometry information. This may seem unreasonable; however, it can be traced to the fact that the test cases were taken from companies using 2-dimensional CAD systems [27]. In such cases, if a modeler identifies missing information that has not been captured through the GTPPM process, such information should be added to the final model during the refinement process.

Furthermore, the current version of the GTPPM tool was implemented assuming that entity data types would be elaborated in the product-model refinement phase, and so sets STRING as the default data type. The WHERE clauses and the cardinalities of entities are also assumed to be defined in the refinement phase. For example, the direct association relations between ‘dt’ and two connection types (dap and chord in Fig. 11) may be refined by WHERE clauses in the manual modification process, and other modifications can also be made. Since these issues are usually dealt with in the last phase of product modeling, they may not be critical. However, a mechanism to capture domain rules and translate them into WHERE clauses is a challenging topic and is worthy of the future attention of researchers in information modeling.

Fig. 11. Automatically generated PIECE definitions.

5. Conclusions

As the variety of AEC software applications increases and their nature becomes more diverse, and as the adoption of advanced information technology in real projects grows, the need for standard product modeling to enable data exchange between different applications increases. However, traditional standard product modeling practice involves a long and iterative review process. GTPPM is an effort to systematically collect rich information input from domain experts at the start of the process, in a form that can be structured and used to automate product model generation downstream, shortening the duration required for complete product model development. It is also an effort to logically structure and formalize the process of generating a

product model, transforming it from an intuitive art to an engineering procedure. An additional benefit is that the method allows participating companies to track how their corporate procedures have been embedded into the final product model.

GTPPM has been implemented and improved through several test cases with the North American Precast Concrete Software Consortium (PCSC). It has also been used in a number of other research projects. Through these and other projects, GTPPM has had to address not only embedded logical and formal capabilities, but also user-oriented and organizational realities, which led to multiple refinements and enhancements. The case studies have shown that its application is feasible. Early indications are that it fulfils the goal of reducing product modeling duration, although no firm conclusions can be drawn regarding the quality of its output vis-à-vis that of a more traditional product modeling approach. It is, however, more rigorous and more easily automated.

In the development of the GTPPM method and the software tool that supports its implementation, we found that both logical and user-oriented capabilities were needed and interdependent. One could not be completely defined before the other. Working with a sizable number of organizations allowed iterative cycles of development, deployment and refinement. Iterative refinement is a necessary component of such research and its validation; it is of value both as a method to be followed by others and as a means to define the logic behind the decisions made. We believe such an empirical process is necessary for knowledge capture research in general.

In addition to automating information capture for new product models, the GTPPM method can also be used for the update and extension of existing product models. Furthermore, its potential applications may not be limited to product modeling. For example, the systematic, integrated collection of process and information flow data, and its semantic structuring into syntactic units, as realized in GTPPM, can also be applied to systematically capturing information requirements such as for the Information Delivery Manual (IDM), to the development of other knowledge-rich systems, such as enterprise resource planning systems, to process reengineering, and in support of knowledge elicitation for knowledge-based system development.

Appendix A. Glossary

APPC: Automated Project Performance Control

CIS/2: CIMSteel (Computer-Integrated Manufacturing for Construction Steelwork) Integration Standards Release 2

DFD: Data Flow Diagram(s)

GTPPM: Georgia Tech Process to Product Modeling (Method); a product modeling method that was developed to expedite a product modeling process by providing a logical integration mechanism between the requirements collection and modeling phase and the logical modeling phase
– IC: Information Construct, a concatenation of tokens
– VII: Vernacular Information Items

IAI IFC
– IAI: International Alliance for Interoperability
– IFC: Industry Foundation Classes

IDEF: Integration Definition of Function Modeling
– ICOM: Input, Control, Output, and Mechanism

ISO STEP
– ISO: International Organization for Standardization
– STEP: STandard for the Exchange of Product model data

ISO STEP model types
– AAM: Application Activity Model
– AIM: Application Interpreted Model
– ARM: Application Reference Model

LPM: Logical Product Modeling

RCM: Requirements Collection and Modeling

SQL: Structured Query Language (a query and transaction specification language for the relational data model)

UML: Unified Modeling Language(s)

UoD: Universe of Discourse

References

[1] C. Alexander, S. Ishikawa, M. Silverstein, M. Jacobson, I. Fiksdahl-King, S. Angel, A Pattern Language: Towns, Buildings, Construction, Oxford University Press, New York, 1977.
[2] P.A. Bernstein, J.R. Swenson, D. Tsichritzis, A unified approach to functional dependencies and relations, in: W.F. King (Ed.), Proceedings of the 1975 ACM SIGMOD International Conference on Management of Data, ACM, San Jose, California, 1975, pp. 237–245.
[3] D. Castro-Lacouture, B2B e-Work Intranet Solution Design for Rebar Supply Interactions, Ph.D. thesis, School of Civil Engineering, Purdue, 2003.
[4] E.F. Codd, Extending the data base relational model to capture more meaning, ACM Transactions on Database Systems (TODS) 4 (4) (1979) 397–434.
[5] E.F. Codd, Further normalization of the data base relational model, in: R. Rustin (Ed.), Data Base Systems, vol. 6, Prentice-Hall, Englewood Cliffs, N.J., 1972, pp. 33–64.
[6] A. Crowley, CIMSteel Integration Standards Release 2 (CIS/2), http://www.cis2.org/ (2003, Last Accessed: 2005).
[7] C.M. Eastman, G. Lee, R. Sacks, A new formal and analytical approach to modeling engineering project information processes, in: K. Agger, P. Christiansson, R. Howard (Eds.), CIB W78, vol. 2, Aarhus School of Architecture, Aarhus, Denmark, 2002, pp. 125–132.
[8] R. Elmasri, S. Navathe, Fundamentals of Database Systems, Fourth ed., Addison Wesley Longman, Inc., Reading, MA, 2004.
[9] E. Ergen, B. Akinci, R. Sacks, Formalization and automation of effective tracking and locating of precast components in a storage yard, EIA-9: E-activities and intelligent support in design and the built environment, 9th EuropIA International Conference, Istanbul, Turkey, 2003, pp. 31–36.
[10] J. Fong, Translating object-oriented database transactions into relational transactions, Information and Software Technology 44 (2002) 41–51.
[11] E. Gamma, R. Helm, R. Johnson, J. Vlissides, Design Patterns: Elements of Reusable Object-Oriented Software, Addison Wesley, 1994.
[12] IAI, Industry Foundation Classes IFC2x Edition 2, http://www.iai-international.org/Model/R2x2_add1/index.html (2003, Last Accessed: 2006).
[13] ISO TC 184/SC 4, ISO 10303-1:1994 Industrial automation systems and integration - Product data representation and exchange - Part 1: Overview and fundamental principles, International Organization for Standardization, 1994.
[14] ISO TC 184/SC 4, ISO 10303-11:1994 Industrial automation systems and integration - Product data representation and exchange - Part 11: Description methods: The EXPRESS language reference manual, International Organization for Standardization, 1994.
[15] K. Karstila, A. Laitakari, M. Nyholm, P. Jalonen, V. Artoma, T. Hemio, K. Seren, Ifc2x PCC v09 Schema in EXPRESS, PCC-IFC Project Team, IAI Forum Finland, 2002.
[16] K.H. Law, T. Barsalou, G. Wiederhold, Management of complex structural engineering objects in a relational framework, Engineering with Computers, vol. 6, Springer-Verlag, New York, 1990, pp. 81–92.
[17] G. Lee, GTPPM Official Website, http://dcom.arch.gatech.edu/glee/gtppm (2002, Last Accessed: 2005).
[18] G. Lee, A New and Formal Process to Product Modeling (PPM) Method and its Application to the Precast Concrete Industry, Ph.D. dissertation, College of Architecture, Georgia Institute of Technology, 2004.
[19] G. Lee, C.M. Eastman, R. Sacks, Grammatical rules for specifying product information to support automated product data modeling, Advanced Engineering Informatics 20 (2006) 155–170.
[20] G. Lee, C.M. Eastman, R. Sacks, Twelve Design Patterns for Integrating and Normalizing Product Model Schemas, Computer-Aided Civil and Infrastructure Engineering (CACAIE) (in press).
[21] G. Lee, R. Sacks, C.M. Eastman, Dynamic information consistency checking in the requirements analysis phase of data modeling (Keynote), in: Z. Turk, R. Scherer (Eds.), eWork and eBusiness in Architecture, Engineering and Construction — European Conference for Process and Product Modeling (ECPPM), A.A. Balkema, Slovenia, 2002, pp. 285–291.
[22] G. Lee, R. Sacks, C.M. Eastman, Eliciting Information for Product Modeling using Process Modeling, Data and Knowledge Engineering (in review).
[23] MIT, UIUC, The Design Structure Matrix Website, http://www.dsmweb.org/ (2003, Last Accessed: 2006).
[24] S. Monk, J.A. Mariani, B. Elgalal, H. Campbell, Migration from relational to object-oriented databases, Information and Software Technology 38 (1996) 467–475.
[25] R. Navon, R. Sacks, Status and Research Agenda of Automated Project Performance Control (APPC), ASCE Journal of Construction Engineering and Management (in review).
[26] J.W. Rahayu, E. Chang, T.S. Dillon, D. Taniar, A methodology for transforming inheritance relationships in an object-oriented conceptual model to relational tables, Information and Software Technology 42 (2000) 571–592.
[27] R. Sacks, C.M. Eastman, G. Lee, Process model perspectives on management and engineering procedures in the North American precast/prestressed concrete industry, ASCE Journal of Construction Engineering and Management 130 (2) (2004) 206–215.
[28] J. Song, C. Haas, C. Caldas, E. Ergen, B. Akinci, C.R. Wood, J. Wadephul, FIATECH Smart Chips Project: Field Trials of RFID Technology for Tracking Fabricated Pipe - Phase II, FIATECH, 2004.
[29] VTT, IAI IFC Model Development, http://ce.vtt.fi/iaiIFCprojects/ (2004, Last Accessed: 2005).
[30] J. Wix, Information Delivery Manual (IDM): Using IFC to Build SMART — The IDM Project Official Website, http://www.iai.no/idm/learningpackage/idm_index.htm (2005, Last Accessed: 2005).
[31] J. Wix, Information Delivery Manual (IDM): Enabling Information Exchange in AEC/FM Business Processes, Jeffrey Wix Consulting Ltd, UK, 2005, p. 33.
[32] S.-J. You, D. Yang, C.M. Eastman, Relational DB implementation of STEP based product model, CIB World Building Congress 2004, Toronto, Ontario, Canada, 2004.
[33] X. Zhang, J. Fong, Translating update operations from relational to object-oriented databases, Information and Software Technology 42 (2000) 197–210.