

Work Design and Integration Glitches in Globally Distributed Technology Projects

Anant Mishra

School of Business, George Mason University, Fairfax, VA 22030

Email: [email protected], Phone: (703) 993-1771

Kingshuk K. Sinha

Carlson School of Management, University of Minnesota, Minneapolis MN 55455

Email: [email protected], Phone: (612) 624-7058

Forthcoming in Production and Operations Management Journal

Abstract

In a technology project, project integration represents the pooling together of completed, interdependent

task modules to form a physical product or software delivering a desired functionality. This study

develops and tests a conceptual framework that examines the interrelationships between the elements of

work design, project integration challenges, and project performance. We identify two distinct elements

of work design in technology projects: (i) the type of project organization based on whether a technology

project spans a firm boundary (Domestic-Outsourcing) or a country boundary (Offshore-Insourcing) or

both boundaries (Offshore-Outsourcing) or no boundaries (Domestic-Insourcing), and (ii) the joint

coordination practices among key stakeholders in a technology project—namely, Onsite Ratio and Joint-

Task Ownership. Next, we measure the effectiveness of project integration using integration glitches that

capture the incompatibility among interdependent task modules during project integration. Based on

analysis of data from 830 technology projects, the results highlight the differential effects of distributed

project organizations on integration glitches. Specifically, we find that project organizations that span

both firm and country boundaries (Offshore-Outsourcing) experience significantly higher levels of

integration glitches compared to domestic project organizations (Domestic-Outsourcing and Domestic-

Insourcing). The results further indicate that the relationship between project organization type and

integration glitches is moderated by the extent of joint coordination practices in a project. That is,

managers can actively lower integration glitches by increasing the levels of onsite ratio and by promoting

higher levels of joint-task ownership, particularly in project organization types that span both firm and

country boundaries (Offshore-Outsourcing). Finally, the results demonstrate the practical significance of

studying integration glitches by highlighting their significant negative effect on project performance.

Keywords: Global Sourcing, Work Design, Integration Glitches, Project Management, Offshoring,

Outsourcing


1. Introduction

Increasing globalization and growing competitive pressures are driving firms to unbundle their value

chain activities and execute them in organizational structures that span firm and/or country boundaries

(Hayes et al. 2005, Sinha and Van de Ven 2005). This trend is evident in the execution of technology

projects (i.e., physical product development and information technology projects) (Anderson and Parker

2013). From the perspective of work design—defined as “a system of arrangements and procedures for

organizing work” (Sinha and Van de Ven 2005, p.390)—a technology project can be viewed as a work

system that is partitioned into many interdependent and interconnected task modules. Each task module

corresponds to an individual component of a physical product or software that must fit together to form

a coherent “whole” delivering a desired functionality (Sosa et al. 2004, Baldwin and Clark 2000). We

refer to this process of pooling together completed, interdependent task modules in a technology project

as project integration (Prencipe et al. 2003). Project integration not only represents an important phase in

the execution of a technology project, it has also been referred to as an essential function of project

execution (Heath and Staudenmayer 2000).

Motivated by the need to manage project integration, prior studies have frequently emphasized the

importance of aligning the organizational structure of a technology project with its coordination

requirements (e.g., Srikanth and Puranam 2011, Herbsleb et al. 2000). Generally speaking, in a project

where task modules share multiple, ill-defined interdependencies with one another, higher levels of

coordination are necessary for project integration; such a project should therefore be organized within the

firm boundary (i.e., make). In contrast, in a project where task modules share a limited number of well-

defined interdependencies with one another, lower levels of coordination are required for project

integration and such a project can be executed in an organizational structure that spans the firm boundary

(i.e., buy).

Notwithstanding the above theoretical prescriptions, in practice, technology projects continue to face

significant challenges during project integration. For example, our discussions with managers in a Fortune

500 US-based firm that worked with several offshore vendor firms in the execution of a software

development project highlighted numerous instances of project integration challenges. In one instance, a

manager reported that the offshore vendor firm personnel were constantly involved in rewriting the code

for larger chunks of the project due to errors discovered during project integration. Subsequent


investigation revealed significant mismatches in the interfaces between two key project components, the

compiler and the emulator, that resulted in severe data-transfer issues. In another instance, a manager

found that the vendor personnel had limited understanding of the functioning of a specific hardware

component and how it would interface with a software component they were developing. During project

integration, considerable redesign of the interfaces of the software component took place before the two

components could fit and function together as a system. Table 1 below reports additional examples of project integration challenges from practice. The disconnect between the

extant literature and the practice of project integration can be attributed to three important gaps in the

literature.

Insert Table 1 about here

First, prior studies often assume that interdependencies among task modules are identified when

project sourcing decisions are taken (Cummings et al. 2009). In practice, as project execution proceeds,

existing interdependencies among task modules may often evolve and new interdependencies may emerge

due to changes in project requirements (Sosa et al. 2004). Further, the bounded rationality of managers

may prevent all interdependencies from being fully understood in the initial stages of a project (Ethiraj

and Levinthal 2004, Simon 1962). As such an understanding evolves, the organizational structure that

was chosen at the beginning of a project may not be effective in facilitating project integration. Toward

this end, Larsen et al. (2013) argue that offshoring of complex, knowledge-intensive activities often

involves unanticipated design/specification costs and coordination costs that arise from insufficient

understanding of interdependencies among the activities. They conceptualize and measure unanticipated

costs in terms of cost-estimation errors or budget overrun. Such a measure, however, does not shed light

on the immediate first-order effects of sourcing decisions or capture the underlying mechanisms during

project execution (e.g., project integration challenges) that result in estimation errors.

Second, even if a project can be partitioned into multiple task modules with limited

interdependencies, project integration still necessitates the exchange of information relating to the

functional and technical attributes of the integrated system (Thompson 1967). In other words, effective

partitioning of a project is not synonymous with effective integration of task modules. Despite this

distinction, the study of project integration has received significantly less attention in prior studies,

compared to the study of partitioning of a project (Williams 2005, Heath and Staudenmayer 2000).


Third, firm boundary decision frameworks such as make vs. buy or insourcing vs. outsourcing do not

appropriately reflect the organizational structures that have emerged from project work spanning country

boundaries (Metters 2008). While project organizational structures differ from one another in terms of

whether they span firm and/or country boundaries, there is a limited understanding of the role of these

different boundaries in affecting project integration, and more importantly, a limited understanding of the

work design elements that can address project integration challenges (Lee et al. 2007).

Our study attempts to address the above gaps in the extant literature. We develop and test a

conceptual framework that examines the interrelationship between the elements of work design, project

integration challenges, and project performance. Specifically, we identify two distinct elements of work

design in practice: (i) the type of project organization based on whether a technology project spans firm

and/or country boundaries (Metters 2008, Tanriverdi et al. 2007)—namely, Domestic-Insourcing (intra-

country, intra-firm), Domestic-Outsourcing (intra-country, inter-firm), Offshore-Insourcing (inter-

country, intra-firm), and Offshore-Outsourcing (inter-country, inter-firm), and (ii) the joint coordination

practices that attempt to bridge together the key stakeholders in a technology project (Tenhiälä and

Salvador 2014)—as reflected in Onsite Ratio (i.e., the proportion of the project tasks that are carried out

by the project team at the project client’s location), and Joint-Task Ownership (i.e., the extent to which

project team members work together and exhibit collective ownership of the project tasks).

Next, we capture project integration challenges in a technology project using Integration Glitches,

which measures the extent of incompatibilities between interdependent task modules at their interfaces

during project integration. Such incompatibilities may arise due to: an adverse effect of one task module on

another; sub-functions of individual task modules not producing the desired combined function; or

escalation of imprecision in task modules to unacceptable levels (Tenhiälä and Salvador 2014, Sosa et al.

2004). Finally, building upon arguments from information processing theory (IPT) (Thompson 1967,

Galbraith 1973), we examine how integration glitches are driven by work design elements, and how such

glitches affect project performance.

The empirical analysis is conducted using data from 830 technology projects. The results highlight

the differential effects of distributed project organizations on project integration, with project

organizations spanning both firm and country boundaries (Offshore-Outsourcing) experiencing

significantly higher levels of integration glitches compared to domestic project organizations (Domestic-


Outsourcing and Domestic-Insourcing). The results further indicate that the relationship between project

organization type and integration glitches is moderated by the extent of joint coordination practices in a

project. Thus, while the type of project organization may typically remain fixed during the course of a

project, managers can strive to actively lower the level of integration glitches in Offshore-Outsourcing

project organizations by increasing the levels of onsite ratio and by promoting higher levels of joint-task

ownership in the project. Finally, accounting for the endogenous nature of integration glitches in our

analysis, the results demonstrate the practical significance of studying integration glitches by highlighting

their significant negative effect on project performance.

The remainder of the paper is organized as follows. In §2, we present the theoretical underpinnings of

this study. In §3, we draw on and synthesize theoretical arguments to develop the study hypotheses. In §4,

we present the research design, discuss our data collection effort, and describe construct measurement. In §5, we

present the model specifications and the estimation results. Finally, in §6, we discuss the contributions of

the study to theory and practice, identify limitations and suggest directions for future research.

2. Theoretical Background

2.1. The Dynamics of Project Integration: A Coordination Perspective

As discussed earlier, project integration involves the pooling together of completed, interdependent

task modules in a technology project to form an overarching deliverable producing a desired functionality

(Prencipe et al. 2003). This coordination process requires the exchange of architectural knowledge

between a project client and the project team. Architectural knowledge refers to the understanding of the

ways in which task modules would fit and function together to form a coherent whole (Henderson and

Clark 1990). Such knowledge is embedded in the problem solving domains of a project client and the

project team since the functional specifications of task modules provided by the project client need to be

mapped onto the technical specifications provided by the project team (Baldwin and Clark 2000).

Specifically, the project client has a detailed understanding of the business and functional requirements of

a project and its role within the organization’s processes. Such expertise is derived from the project

client’s unique organizational routines, internal practices, and trade-secrets (Ethiraj et al. 2005). The

project team, in contrast, has the technical expertise (e.g., hardware development, programming) and the

project management expertise (e.g., estimating effort, sequencing task modules) for executing the project

(Stock and Tatikonda 2008). Project integration, therefore, requires “vertical” coordination—i.e.,


coordination of expertise between a project client and the project team—to identify interdependencies

among task modules and reduce incompatibilities.

In addition, project integration also necessitates “lateral” coordination—i.e., coordination between the

project team members involved in integrating interdependent project tasks. That is, the project team

members also need to exchange technical and functional knowledge relating to the project tasks to carry

out project integration. Often, the project client may be called upon to address any gaps in the functional

knowledge between project team members, and bridge such gaps through vertical coordination with the

project team members. Thus, a coordination loop is formed wherein project team members working on

interdependent task modules not only exchange information with each other directly, but also indirectly

through the project client. Given the substantive scale and scope of coordination required for project

integration, the potential for integration glitches remains a distinct possibility. Surprisingly though, the

notion of integration glitches in technology projects remains largely under-researched in the prior literature

(Anderson et al. 2007, Williams 2005).

Two notable exceptions to this characterization exist: Hoopes and Postrel’s (1999) anecdotal study of

product development “glitches” and Espinosa’s (2002) anecdotal study of large-scale software

development. Hoopes and Postrel (1999) define a product development “glitch” as an “unsatisfactory

result on a multi-agent project that is directly caused or allowed by a lack of interfunctional or

interspecialty knowledge about problem constraints” (p. 843). Such unsatisfactory results include a broad

set of outcomes such as “design flaws, quality problems (bugs), development cost overruns, late delivery,

or incomplete release” (p. 848). While these unsatisfactory results may or may not be associated with

project integration only, they provide a starting point for conceptualizing and measuring glitches during

project execution. In that regard, Espinosa (2002, p. 38), in the context of software development, comes

closer to the notion of integration glitches by referring to such glitches as technical coordination

challenges that arise “when technical dependencies among software parts are not effectively managed

(e.g., redundant code, incompatible interfaces…).” A limited number of studies in the software

engineering domain have also examined interface defects across task modules (e.g., Boehm and Turner

2003, Fenton and Ohlsson 2000). Much of our understanding here is based on case studies and interviews

with practitioners, with the result that empirical operationalization of interface defects has not received

attention beyond measures of “defect counts.” Such count measures do not reflect the severity of the


integration challenges that occur in technology projects.

Our study extends Hoopes and Postrel’s (1999) notion of product development “glitches” and Espinosa’s

(2002) notion of technical coordination challenges in software development to develop a detailed, multi-

item measure of integration glitches that takes into account the extent of functional reliabilities, functional

inconsistencies, the degree of rework, and the overall difficulty associated with integration of task

modules in a technology project. Further, in contrast to the count measures used in the software engineering literature, measuring the intensity of integration glitches with a multi-item scale reduces the potential for measurement error that arises when respondents are asked to recall the exact number of integration glitches. A secondary benefit of

conceptualizing and operationalizing integration glitches in such a manner is that it extends the scope of

applicability of integration glitches to a variety of technology projects.

2.2. A Nuanced Conceptualization of Project Organizational Structure

Project sourcing decisions have often been linked with firm boundary decisions that are traditionally

characterized as make vs. buy or insourcing vs. outsourcing (Hayes et al. 2005). However, with increasing

geographical distribution of technology project work within and across country boundaries, the

organizational structures for such projects have become increasingly complex (Anderson and Parker

2013, Sinha and Van de Ven 2005). For example, if a client firm chooses the make/insourcing option for

executing a technology project, it can do so by using a project team located in-house or within one of its

divisions in a different country. On the other hand, if the client firm chooses the buy/outsourcing option,

it can contract with either a domestic or an international vendor firm for executing the project. As

organizational structures for technology projects differ in the extent to which they span firm and/or

country boundaries, the coordination challenges can vary considerably. We know little about how the

boundaries differ from each other with respect to influencing coordination in technology projects

(Narayanan et al. 2011, Agerfalk et al. 2009, Levina and Vaast 2008). Further, empirical research

examining the effects of spanning both firm and country boundaries jointly on coordination—as we do in this study—is sparse, its growth stunted by the difficulty of collecting data on such project

organizational settings (Lee et al. 2007).

In this study, we use a two-by-two classification scheme to represent the different types of project

organization commonly used in practice (Metters 2008, Tanriverdi et al. 2007, Eppinger and Chitkara


2006). As Figure 1 below indicates, one axis of the classification scheme represents the distribution of

technology projects within and across country boundaries, while the other axis represents the distribution

of technology projects within and across firm boundaries. Four distinct types of project organization

emerge from this classification scheme, as defined below.

Insert Figure 1 about here

Domestic-Insourcing: In this type of project organization, a project client and the project team are a

part of the same firm and are located in the same country. The project client is frequently referred to

as an “internal client” and the project team as an “internal team.” Note that a Domestic-Insourcing

project organization does not necessarily represent collocation of a project client and the project team

as each side may be located across different divisions or cities (Bardhan et al. 2013). An illustrative

example would be John Deere’s manufacturing unit based in Moline, Illinois, which is engaged in a

number of embedded systems development projects with its division in Fargo, North Dakota.

Domestic-Outsourcing: In this type of project organization, a project client and the project team

each belong to a different firm (i.e., a client firm and the vendor firm, respectively), but are located in

the same country. An illustrative example is the U.S. firm Lucent Technologies, a client firm,

contracting with Borland Inc., a vendor firm also based in the U.S., to develop testing equipment.

Offshore-Insourcing: In this type of project organization, a project client and the project team are a

part of the same firm but are located in distinct organizational units across different countries. An

illustrative example is the Microsoft corporate R&D group based in Redmond, Washington

collaborating with Microsoft’s India Development Center on new software development.

Offshore-Outsourcing: In this type of project organization, a project client and the project team each

belong to a different firm (i.e., a client firm and the vendor firm, respectively) and are located in different countries. An illustrative example is Aviva, a U.K.-based client firm that is a leading

provider of insurance products, contracting with Tata Consultancy Services, a vendor firm based in

India, for the development of a partner management system application.
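The four definitions above reduce to a simple decision rule over the two boundary dimensions of the classification scheme. The following minimal sketch (in Python; the function and parameter names are ours, introduced purely for illustration) makes the mapping explicit:

```python
def project_organization_type(spans_firm_boundary: bool, spans_country_boundary: bool) -> str:
    """Map the two boundary dimensions of the 2x2 classification scheme
    to one of the four project organization types."""
    if spans_firm_boundary and spans_country_boundary:
        return "Offshore-Outsourcing"   # inter-country, inter-firm
    if spans_country_boundary:
        return "Offshore-Insourcing"    # inter-country, intra-firm
    if spans_firm_boundary:
        return "Domestic-Outsourcing"   # intra-country, inter-firm
    return "Domestic-Insourcing"        # intra-country, intra-firm
```

For example, a client firm contracting with a vendor firm located in a different country spans both boundaries and is classified as Offshore-Outsourcing.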

3. Hypotheses Development

3.1. Project Organization Type and Integration Glitches

The type of project organization plays a central role in affecting the extent to which coordination

between a project client and the project team is carried out effectively. Proponents of the information

processing theory (IPT) (Tushman and Nadler 1978, Galbraith 1973) point out that coordination

challenges in an environment are often dependent upon the information processing capabilities of the

environment. In a technology project, such capabilities are determined by its project organization type.

The distribution of project tasks across firm and/or country boundaries, by moving from a Domestic-Insourcing project organization, where the project client and the project team are organizationally and

geographically proximate, toward a distributed project organization (Domestic Outsourcing, Offshore-


Insourcing, or Offshore-Outsourcing), undermines the information processing capabilities in the project

environment and increases the challenges associated with the coordination of architectural knowledge

(Anderson and Parker 2013, Metters 2008). However, each type of distributed project organization poses

its own challenges for coordination of architectural knowledge.

Specifically, in a Domestic-Outsourcing project organization that spans a firm boundary, a project

client and the project team are likely to differ from each other in terms of their organizational context as

well as the business processes and operating procedures for performing tasks. Such differences can create

significant barriers in the processing of relevant information for project integration. Further, the

coordination process in such project organizations is also hampered due to the fact that members of each

side have a lower understanding of how to collaborate with each other—i.e., how to relate one’s ideas to others’ ideas, and how to coordinate one’s actions with other members (Majchrzak et al. 2005)1.

This can not only result in unproductive interactions between a project client and the project team, but

also prevent each side from accurately estimating the impact of certain dependencies, thereby increasing

the potential for integration glitches. Finally, a project client is likely to be vulnerable to loss of valuable

“private” organizational knowledge in Domestic-Outsourcing project organizations and, therefore, place

stricter controls on the project team’s access to such knowledge (Levina and Vaast 2008). This may filter

out certain aspects of the functional and technical requirements necessary for integration of task modules.

In contrast, in an Offshore-Insourcing project organization that spans a country boundary, differences

in national culture between a project client and the project team may make communication difficult and

prone to misinterpretation (Hahn and Bunyaratavej 2010). Such differences may manifest through

differences in language, work habits, and assumptions between a project client and the project team

(Metters 2008), all of which can create misunderstandings during the exchange of architectural

knowledge. Each side will need to expend significant effort to learn about the cultural inclinations and the

localized context of the other side that can influence problem interpretation (Hinds and Cramton 2008).

Unlike Domestic-Outsourcing project organizations, however, since the project client and the project

team in Offshore-Insourcing project organizations belong to the same firm, coordination challenges

arising from differences in organizational context and business processes may be less salient in such

1 Such an understanding can be low even if the client firm and the vendor firm have prior working experience with

each other. This is because the divisions (or internal departments) associated with inter-firm transactions, as well as the individual employees involved, may differ across projects.


project organizations. Notwithstanding these inherent advantages of Offshore-Insourcing project

organization, greater control and ownership of a project client over the project team as well as culturally

distinct perceptions of authority relations and work norms between the two sides can make it difficult for

each side to develop a common ground for day-to-day coordination during project integration (Metiu

2006). Finally, coordination of architectural knowledge is more likely to be asynchronous with reduced

levels of spontaneous communication and significant coordination delays when a project client and the

project team are distributed across country boundaries (Cummings et al. 2009).

The above arguments, taken together, imply that coordination of architectural knowledge required for

project integration is likely to be incomplete and less effective in Domestic-Outsourcing and Offshore-

Insourcing project organizations, compared to Domestic-Insourcing project organizations. Given the

distinct nature of the coordination challenges arising from spanning firm boundary and country boundary,

we anticipate that Offshore-Outsourcing project organizations which span both these boundaries are likely

to experience greater levels of coordination challenges during project integration, and correspondingly

higher levels of integration glitches, compared to either Domestic-Outsourcing or Offshore-Insourcing

project organizations. In this regard, Cummings et al. (2009) indicate that, as the number of boundaries spanned by members in a globally distributed team increases, so does the cognitive burden placed on

members to account for differences in both cultural and organizational contexts. Similarly, Kirkman and

Mathieu (2005) argue that information asymmetries among members of a group increase as the number of

boundaries spanned by group members increases. We, therefore, posit the following hypotheses.

Hypothesis 1a (H1a): Integration glitches are greater in a technology project that spans a

firm boundary (Domestic-Outsourcing) or a country boundary (Offshore-Insourcing)

compared to a project that does not span either boundary (Domestic-Insourcing).

Hypothesis 1b (H1b): Integration glitches are greater in a technology project that spans both

firm and country boundaries (Offshore-Outsourcing) compared to a project that spans only a

firm boundary (Domestic-Outsourcing) or a country boundary (Offshore-Insourcing).

3.2. The Role of Onsite Ratio and Joint-Task Ownership

The classification scheme for the type of project organization in Figure 1 does not necessarily imply

that the project tasks are entirely carried out at the project team’s location. Prior studies have recognized

the notion that, for a given type of project organization, a project client may strive to maintain “fluidity”

in the organizational structure by varying the extent to which project tasks are distributed across their


location and the project team’s location (e.g., Ramasubbu et al. 2008, Rottman and Lacity 2008, O’Leary

and Cummings 2007). For instance, Rottman and Lacity (2008) note that a part of the project may be

executed at the project client location by physically collocating some members of the project team with

the project client. Ramasubbu et al. (2008) use the term “task dispersion” in the context of Offshore-

Outsourcing to represent the extent to which technology project tasks are distributed between a project

client and the project team location. Similarly, O’Leary and Cummings (2007) refer to the notion of

multi-site distribution of tasks as the spatial distribution of tasks, while Kirkman and Mathieu (2005) treat the "proportion of co-located team members" as an important dimension of team virtuality. Building

upon this literature, we capture the differences for a given type of project organization by using the

measure, Onsite Ratio. This measure represents the proportion of the project tasks (out of a total of 100%)

that are carried out by the project team at the project client location.2

While the distribution of project tasks across firm and country boundaries complicates the process of

coordination of architectural knowledge between a project client and the project team, we anticipate such

coordination challenges to decrease with increasing levels of onsite ratio (Ramasubbu et al. 2008). First,

greater physical embeddedness of a project team at the project client location not only allows each side to develop an understanding of the contextual challenges associated with boundary spanning, but also enables the vertical coordination of architectural knowledge (i.e., know-why). Such contextual understanding and vertical coordination, in turn, help each side to identify the steps necessary to resolve such challenges (i.e., know-how) before they escalate into integration glitches (Hinds and Cramton 2008, Tucker et al. 2007). While the project client develops a better understanding of the inner workings of the task modules, the project team's awareness of the local context associated with the project client also

increases, thereby enabling a common ground for the coordination of architectural knowledge (Srikanth

and Puranam 2011). Toward this end, prior research on organizational learning has highlighted the

benefits of situated learning in distributed work settings for resolving and troubleshooting task-related

problems (Tyre and von Hippel 1997, Majchrzak et al. 2005).

Second, as more task modules are executed at the project client location, requirement changes by the

project client or emerging interdependencies across task modules are more readily communicated

2 The characterization of onsite ratio does not necessarily apply to distributed project organizations only. Even

Domestic-Insourcing project organizations can vary widely in terms of onsite ratio. That is, a project client and the

project team may be located across different divisions and/or different cities (Bardhan et al. 2013, Boh et al. 2007).


to the project team. This allows project team members who are located at the project client location to

augment their architectural knowledge of the project and also relay such knowledge to other team

members that may be located across firm and/or country boundaries, thereby reducing the potential for

integration glitches. Third, higher levels of onsite ratio also enable the project client and the project team to locate where relevant functional and technical expertise resides among project team members, and whom to contact when a gap in architectural knowledge arises (Espinosa et al. 2007). Such an understanding is

particularly relevant in distributed project organizations as it enables the timely resolution of gaps in

architectural knowledge between a project client and the project team, and reduces the potential for

integration glitches. Based on the above arguments, we posit the following hypotheses.

Hypothesis 2a (H2a): Onsite ratio is negatively associated with integration glitches in a

technology project.

Hypothesis 2b (H2b): Onsite ratio moderates the positive association between the type

of project organization and integration glitches in a technology project such that this

positive association becomes weaker as onsite ratio increases.

Joint-task ownership refers to the extent to which members of the project team (i) assume collective

ownership of the project tasks, and (ii) are involved jointly in the execution of the project tasks. Like onsite ratio, joint-task ownership forms an important component in the study of work design. From the

standpoint of work design literature, the focus on joint-task ownership is consistent with the notion of

lateral coordination in manufacturing—i.e., should internal departments/suppliers design each component

of a product independently from one another, or should they be jointly responsible for a number of

components (Adler 1995). In the context of our study, where a single project team is involved in the

execution of a technology project, the project team can be considered as a collection of suppliers (i.e.,

team members) that are responsible for carrying out the tasks within a project.

Extending the notion of lateral work design to a technology project setting, Clark and Fujimoto

(1991) and Womack et al. (1990), in the study of the world automobile industry, found that successful

projects often tended to have engineers exercise their capabilities over a broad range of activities and

assume responsibility for the successful execution of multiple activities. Similarly, Hackman and

colleagues (Hackman and Oldham 1975, Hackman 1987, Hackman and Wageman 1995) have found

that collective responsibility of team members for multiple tasks allows the team to understand each task

and its contribution to the overall project, the relative sequencing of the tasks in the project (i.e., the


predecessor relationships in project management parlance), and how information flows across the

boundaries of individual tasks. Thus, at its very core, the notion of joint-task ownership has direct

implications for the effective management of information across task-module interfaces.

In addition to effectively managing the information flow across interdependent task modules, joint-

task ownership increases the likelihood that team members in distributed project organizations have a

more unified understanding of the technical standards, tools and the best practices that would be applied

during the integration of task modules. Consider, for example, the case of integration glitches in the

Boeing 787 Dreamliner project where differences in the version of the design software (i.e., a project

tool) between team members distributed across firm and country boundaries resulted in significant

integration glitches. Similarly, in a study of software project management, Likoebe et al. (2009, p. 359)

note that collective ownership of the coding tasks in a project often "ensures that they [developers] are

exposed to all aspects of the code (Pikkarainen et al., 2007), which gives developers a better

understanding of the software architecture and the dependencies that exist among various units of code.”

Such an enhanced understanding of their coding tasks as well as tasks performed by others ensures that

developers can make effective adjustments to software code and address errors in the code. Further, new

errors are less likely to be inadvertently introduced into the software code (Fitzgerald et al. 2006, Likoebe

et al. 2009). Based on the above arguments, we posit that joint-task ownership is not only likely to reduce

integration glitches by bridging coordination gaps between project team members across all types of

project organization, but is likely to be more relevant in bridging such gaps when technology projects

span a firm boundary, or a country boundary, or both.

Hypothesis 3a (H3a): Joint-task ownership is negatively associated with integration

glitches in a technology project.

Hypothesis 3b (H3b): Joint-task ownership moderates the positive association between

project organization type and integration glitches in a technology project such that this

positive association becomes weaker as joint-task ownership increases.

3.3. Impact of Integration Glitches on Project Performance

Resolving integration glitches in a project often requires a project client and the project team to revisit

their understanding of the task modules and make appropriate changes to address the incompatibilities at

the interfaces. Changes at the interfaces of task modules are often difficult to implement since they


require extensive levels of vertical and lateral coordination in a project to understand system-level

performance requirements (Mitchell and Nault 2007). Regardless of the difficulty of implementing these

changes, the extent of rework required on task modules and their interfaces increases as integration glitches increase. Project rework is often recursive in nature; that is, the discovery and resolution of rework frequently surface additional sources of rework (Cooper 1993). The recursive nature of rework is likely to be

salient when integration glitches occur. Due to the complex and often network-like structure of

interdependencies across task modules (Langlois and Robertson 1992), modifications at the design

interfaces of a single task module may initiate additional design changes that ripple across the interfaces

of several task modules and require more rework (Hoegl et al. 2004, also see Itanium chip development

example in Table 1). This can negatively impact project performance as it not only leads to wasteful

expenditure of project resources in resolving rework issues, but also diverts the project from its original path, resulting in sub-optimal levels of technical performance of the deliverable and quality issues

(Mitchell and Nault 2007). Therefore, we posit the following hypothesis.

Hypothesis 4 (H4): Integration glitches are negatively associated with project

performance in a technology project.

The hypotheses stated above are integrated into a conceptual framework (Figure 2) depicting the interrelationships between work design elements, integration glitches, and project performance.

Insert Figure 2 about here

4. Research Design

4.1. Data Collection and Sample Description

To empirically analyze the above conceptual framework, we collected primary data by designing and

implementing a web-based questionnaire. Prior to the design of the questionnaire, we conducted grounded

fieldwork by interviewing managers from a Fortune 500 technology firm (located in the midwestern United States) who were involved in Offshore-Insourcing and Offshore-Outsourcing project organizations with

project teams located in India. The qualitative insights developed through these interviews along with a

comprehensive review of the literature helped us to better understand the dynamics of distributed project

organizations and develop a structured questionnaire for collecting data.

The data was collected during a six-month period between February and July 2007. An initial version

of the questionnaire was developed and put through a multi-stage refinement process involving, first, a


peer review stage. In this stage, the questionnaire was evaluated by a panel of five experts (three

academics and two industry professionals) knowledgeable about the topic of global sourcing. The panel

assessed the content validity and clarity of the items and provided specific comments on revising items

that were prone to misinterpretation. Following revisions, a pilot test was initiated by sending the

questionnaire to members of two professional project management associations: PMHUB and a local

chapter of Project Management Institute (PMI).3 The pilot test generated 50 responses, which were

thoroughly reviewed to gauge specific aspects of the data collection process such as item non-response,

dropouts, and the time taken to answer the questionnaire. Modifications were subsequently made to the

content and the organization of the questionnaire to improve the accuracy and the number of responses.

Finally, following the pilot test, an e-mail invitation clearly explaining the purpose of the study and

requesting participation was sent to members of the new product development (NPD) and information

systems (IS) groups within PMI. To minimize recall bias, we asked respondents to provide their responses

on a recently completed technology project that they were most familiar with, and not necessarily a

successful or a failed project. As an incentive for participation, all respondents who provided their e-mail

address were offered a comprehensive summary of the survey results. In addition, reminder e-mails were

sent to each group after the initial invite to encourage participation. For the PMI-NPD group, we obtained

a total of 155 usable responses representing a response rate of approximately 13%. For the PMI-IS group,

we obtained a total of 675 usable responses representing a response rate of approximately 6%. A

MANOVA test comparing the means for the key continuous variables in our analysis—integration

glitches, onsite ratio, joint-task ownership, project performance—across the two groups did not indicate

significant differences (F = 1.37, p = 0.24). Further, the Kolmogorov-Smirnov test for the equality of

distribution functions for each variable was statistically insignificant (median p-value = 0.63), indicating

that the samples were drawn from the same overall distribution. Therefore, we combined the responses

from the two groups to create a total sample of 830 technology projects for conducting the analysis.
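For illustration, the distributional check underlying the pooling decision above can be sketched with a two-sample Kolmogorov-Smirnov statistic. The group sizes mirror the 155 (PMI-NPD) and 675 (PMI-IS) usable responses, but the simulated values are hypothetical, not the study's data.

```python
import numpy as np

def ks_two_sample(x, y):
    """Two-sample KS statistic: D = max |ECDF_x(t) - ECDF_y(t)| over observed points."""
    x, y = np.sort(np.asarray(x, float)), np.sort(np.asarray(y, float))
    grid = np.concatenate([x, y])
    cdf_x = np.searchsorted(x, grid, side="right") / x.size
    cdf_y = np.searchsorted(y, grid, side="right") / y.size
    return np.max(np.abs(cdf_x - cdf_y))

def ks_critical(n, m, alpha=0.05):
    """Large-sample critical value; c(0.05) is approximately 1.36."""
    return 1.36 * np.sqrt((n + m) / (n * m))

rng = np.random.default_rng(0)
npd = rng.normal(3.0, 1.0, 155)  # a key variable, PMI-NPD group (simulated)
is_ = rng.normal(3.0, 1.0, 675)  # same variable, PMI-IS group (simulated)
d = ks_two_sample(npd, is_)
# Failing to reject (d below the critical value) supports pooling the groups.
```

A D statistic below `ks_critical(155, 675)` for each key variable would be consistent with the insignificant test results reported above.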

A number of formal checks were carried out to determine the appropriateness of the respondent

profile for answering the survey. An examination of respondent experience levels indicated that

3 Both PMHUB (www.pmhub.net) and PMI (www.pmi.org) are professional associations that provide a platform

for industry professionals to share ideas, attend seminars/workshops, and increase professional exposure. Prior

studies that have collected project level data from similar sources include: Narasimhan et al. 2013 (Data Source:

PMI and International Association of Outsourcing Professionals), Bendoly et al. 2010 and Lee and Xia 2010 (Data

Source: PMI), Pavlou and El Sawy 2006 and Tatikonda and Rosenthal 2000 (Data Source – PDMA).


respondents had high levels of experience in the domain of project management. Specifically, respondents

had an average of 21.2 years of professional experience, out of which 11.5 years were spent in a project

management role. Further, the respondents occupied key positions of responsibility within the project—

approximately 72% were project managers, while the remaining 28% were senior level managers (e.g.,

project sponsor, program manager) or held specialist roles (e.g., technical lead, business analyst). These

numbers suggest that respondents were knowledgeable about their project, thereby increasing our

confidence in the quality and the accuracy of the data.

To check for non-response bias, we used the extrapolation method proposed by Armstrong and Overton (1977). Using this method, the sample was first classified into groups of early and late respondents. Early respondents were those who provided their responses after the first contact, while late respondents were those who provided their responses following reminders. A series of

statistical t-tests were performed for the key continuous variables in our analysis across the respondent

groups, the underlying assumption here being that late respondents were more likely to be similar to non-

respondents. Results from these tests indicated no pattern of differences across the groups, minimizing

concerns of non-response bias.4
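The early-versus-late comparison described above reduces to a series of two-sample t-tests. A minimal sketch using a Welch t-statistic is shown below; the group labels and data are hypothetical, not the study's.

```python
import numpy as np

def welch_t(a, b):
    """Welch two-sample t-statistic and approximate degrees of freedom."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    va, vb = a.var(ddof=1) / a.size, b.var(ddof=1) / b.size
    t = (a.mean() - b.mean()) / np.sqrt(va + vb)
    df = (va + vb) ** 2 / (va**2 / (a.size - 1) + vb**2 / (b.size - 1))
    return t, df

rng = np.random.default_rng(1)
early = rng.normal(4.5, 1.2, 400)  # responses after the first invitation (simulated)
late = rng.normal(4.5, 1.2, 430)   # responses after reminders, a proxy for non-respondents
t, df = welch_t(early, late)
# |t| well below ~1.96 suggests no early/late difference on this variable.
```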

4.2. Measures

We used multi-item scales based on our review of existing studies to measure the key variables in the

study. The measurement items are listed in Table A1 in the Appendix. To reduce the potential for

common method variance (CMV) that can occur due to the use of a single respondent for both the

independent and the dependent variables in the study (Podsakoff et al. 2003), we used different scale

formats for the key variables (i.e., a 7-point scale for project performance, a 5-point scale for integration

glitches and joint-task ownership, a % scale for onsite ratio and a nominal categorical scale for project

organization type). We discuss the key variables in greater detail below. An in-depth discussion of

additional corrective steps taken to minimize CMV follows in §5.3.

Project Performance: Consistent with prior studies (e.g., Stock and Tatikonda 2008, Hoegl et al.

4 In addition, to check for recall bias, we asked respondents to indicate when their specific project had been

completed. A majority (i.e., 72%) indicated that the projects were completed within the past 6 months. Analysis

using t-tests to identify potential variation in key variables across two categories (category 1 – where projects were

completed < 6 months and category 2 where projects were completed > 6 months) did not indicate any significant

differences, thereby, suggesting that recall duration was not associated with any systematic trends in variable means.


2004), project performance was assessed as the average of a seven-point Likert scale (Cronbach's α = 0.90) that evaluates a project with respect to the following performance dimensions: adherence to

schedule, budget, quality, technical performance, and overall satisfaction. The use of multiple dimensions

to represent project performance reflects the notion that technology projects have a multiplicity of

objectives that need to be taken into account to obtain a holistic assessment of project outcomes. A closer

look at the project performance measure did not indicate any skewness (skewness value = 0.0006).

Further, the average reported project performance on a seven point Likert scale was 4.54 and

approximately 47% of the projects had a value of 4 or below on this scale. These results alleviated

concerns that the sample was representative of successful projects only.
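The reliability coefficient reported for each multi-item scale is Cronbach's alpha, which can be computed directly from an item-response matrix. The sketch below uses simulated responses (rows are projects, columns are items), not the study's data.

```python
import numpy as np

def cronbach_alpha(items):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of the total score)."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(2)
common = rng.normal(0, 1, (200, 1))             # shared construct score (simulated)
items = common + rng.normal(0, 0.5, (200, 5))   # five correlated items, as for performance
alpha = cronbach_alpha(items)                    # high alpha expected for correlated items
```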

Integration Glitches: Integration glitches were assessed as the average of four items (Cronbach's α =

0.89) based on a five-point Likert scale. The items for this scale captured: (i) the functional reliability of

the task modules; (ii) the functional inconsistencies between the task modules; (iii) the overall difficulty

encountered in integrating the task modules; and (iv) the degree of rework required to integrate the task

modules. These items were first derived from a review of relevant literatures on product development

(e.g., Hoopes and Postrel 1999, Sosa et al. 2004, Gokpinar et al. 2010), software development (e.g.,

Espinosa 2002), and software engineering (e.g., Boehm and Turner 2003, Fenton and Ohlsson 2000), and

were further refined based on feedback from subject matter experts during the peer review process.

Project Organization Type: The measure for project organization type is based on the classification

scheme depicted in Figure 1. Respondents were asked to select the one among the four project organization types (Domestic-Insourcing, Domestic-Outsourcing, Offshore-Insourcing, and Offshore-Outsourcing) that best fits their project. To ensure clarity, a brief definition was provided for each project organization type in

the survey (see Table A2 in the Appendix). Given the categorical nature of this variable, we represented

Domestic-Insourcing as the base category and included three indicator variables (i.e., Domestic-

Outsourcing, Offshore-Insourcing, and Offshore-Outsourcing) in the analysis. Across the total sample of

830 technology projects, the distribution of the project organization types was as follows: 54.7% (454

projects) were Domestic-Insourcing, 20.2% (168 projects) were Domestic-Outsourcing, 8.6% (71

projects) were Offshore-Insourcing, and the remaining 16.5% (137 projects) were Offshore-Outsourcing.5

5 We realize that some technology projects may not necessarily identify with one specific type of project organization, but may be associated with multiple types of project organization during the course of a project. We received a very small fraction (<1%) of our responses relating to such a scenario. We dropped those responses from our study sample and focused on the significant majority that identify one specific type of project organization.


Onsite Ratio: Onsite ratio represents the extent of spatial distribution of project tasks across a project

client and the project team location in a technology project. Consistent with prior literature (Ramasubbu

et al. 2008, Kirkman and Mathieu 2005), onsite ratio is measured on a percent (%) scale with values

ranging from 0 to 100, where a given value represents the percent (%) of the project tasks that were

carried out by the project team at the project client location. Higher values on this scale, thus, represent

higher levels of onsite ratio. As Table 2 below indicates, the mean levels of onsite ratio are highest for

Domestic-Insourcing project organizations (70.69%) followed by Domestic-Outsourcing (59.32%),

Offshore-Insourcing (44.61%), and Offshore-Outsourcing (42.45%).

Insert Table 2 about here

Joint-Task Ownership: Joint-task ownership refers to the extent to which members of the project team

(i) assume collective ownership of the project tasks, and (ii) are involved jointly in the execution of the

project tasks. The above component dimensions of joint-task ownership are consistent with much of the

prior literature on job design (Hackman and Oldham 1975) and the software project management

literature (Likoebe et al. 2009). We measured joint-task ownership using a two item scale (Cronbach’s α

= 0.61) that captures each of the above dimensions, and averaged responses across the items.

To examine the psychometric properties of the multi-item scales representing the independent and

dependent variables, we performed Confirmatory Factor Analysis (CFA). In assessing the overall fit of

the measurement model, we estimated the normed chi-square statistic (χ2/df), RMSEA, and fit indices

such as Normed Fit Index (NFI) and Relative Fit Index (RFI). The results of CFA indicated a high level

of fit with the data (χ2/df = 2.16, RMSEA = 0.04, NFI = 0.98 and RFI = 0.98) (Kline 1998, Schumacker

and Lomax 2004). Next, we assessed the reliability and validity of the constructs in the measurement

model. The composite reliabilities for all constructs were greater than or equal to the recommended cutoff

value of 0.60 (Fornell and Larcker 1981)—0.62 for joint-task ownership, 0.88 for integration glitches, and

0.89 for project performance. Convergent validity was assessed in two ways: (i) by examining the path coefficients from the latent constructs to their corresponding manifest indicators (Anderson and Gerbing 1988), and (ii) by comparing the measurement model discussed earlier (where correlations between

constructs were estimated) to one where the correlations between constructs were constrained to zero (Gatignon et al. 2002). All the path coefficients were significant (p < 0.01), with values ranging from 0.40 to 0.91 and exceeding 10 times their standard errors (Anderson and Gerbing 1988). Further, a

significant improvement in fit was observed for the unconstrained measurement model relative to the constrained model. Taken together, the above findings provide strong evidence of convergent validity

among the constructs. We also followed a similar measurement model approach to check for discriminant

validity between the constructs by comparing the unconstrained measurement model to one where the

correlations between constructs were constrained to unity. A test of the difference in model fit across the two models was statistically significant (p < 0.01), highlighting the discriminant validity of the constructs.
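The composite reliability figures above follow the Fornell-Larcker formulation: the squared sum of standardized loadings divided by that quantity plus the summed indicator error variances. A minimal sketch is given below; the loadings are illustrative values, not the study's estimated parameters.

```python
import numpy as np

def composite_reliability(loadings):
    """Fornell-Larcker composite reliability from standardized loadings."""
    lam = np.asarray(loadings, float)
    errors = 1.0 - lam**2              # error variance of each standardized indicator
    return lam.sum()**2 / (lam.sum()**2 + errors.sum())

# Hypothetical loadings for a four-item scale such as integration glitches
cr_glitches = composite_reliability([0.85, 0.82, 0.80, 0.78])
```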

Control Variables: We controlled for a number of external factors that create heterogeneity among

the technology projects in our sample and are likely to influence project integration and the overall

performance of a project. These include:

Team Size: As project team size increases, so do the information processing and coordination

requirements of a project (Mitchell and Nault 2007). We, therefore, included the natural log of

project team size [ln(Team Size)] in the analysis.

Project Budget: Project budget represents the total budgetary allocation for a project and is

measured as an ordinal variable (1 = <$50,000, 2 = $50,000-$249,999, 3 = $250,000-$499,999, 4

= $500,000-$1 Million, and 5 = >$1 Million).

Requirements Uncertainty: Changes in project requirements during the course of the project may

introduce changes in the functional specification of task modules. We measure requirements

uncertainty as the average of a four-item scale (Cronbach’s α = 0.85) (shown in the Appendix)

that captures the nature and extent of dynamism associated with the project requirements

(Nidumolu 1995).

Project Type: The deliverables from the projects in our sample fell into three categories:

Hardware, Software, and IT Infrastructure (e.g., enterprise IT infrastructure development projects). We controlled for such variations across projects by creating two dummy

variables (Hardware and Software), and entered them into our analysis.

Industry Type: We controlled for heterogeneity in industry type by including dummy variables for

industries with high representation in the sample: Information Technology, Banking, Insurance,

Healthcare and Manufacturing.

Past Experience: We controlled for the past experience (Past Experience) of the project team in

handling similar projects (Ethiraj et al. 2005). This variable represents the average of a four-item

scale (Cronbach’s α = 0.75) (shown in the Appendix) which assesses the past experience of a

project team in working on similar projects (in terms of project organization type, scope/size, and

project client requirements).


Respondent Heterogeneity: We also controlled for heterogeneity among respondents using

dummy variables that represent the respondent’s role in the project (Project Manager, Senior

Manager, or others), the respondent’s affiliation with the project (Project Client, Project Team,

or otherwise), and the number of years of project management experience of the respondent

[ln(Project Management Experience)].

Project Client Location: A majority of the projects in our sample had project clients that were

based in North America; hence, we controlled for project client location by using a dummy

variable: North America.

5. Analysis and Results

5.1. Model Specification

Table A2 in the Appendix presents the descriptive statistics and the pairwise correlations for all the

variables in the analysis. To test the hypothesized relationships, the empirical analysis was carried out in

two steps. In the first step, we use an ordinary least squares (OLS) procedure with robust standard errors

to examine the role of work design elements on integration glitches. In the second step, we examine the

relationship between integration glitches and project performance. Based on the conceptual framework,

integration glitches represent an endogenous explanatory variable in our analysis. This presents a

methodological challenge in the estimation of the second step model.6 While the OLS procedure can lead

to biased estimates in the presence of an endogenous explanatory variable, the two-stage least squares

(2SLS) procedure to correct for endogeneity of a continuous explanatory variable does not include a

correction for unobserved heterogeneity associated with different levels of the variable (Mooi and Ghosh

2010). Therefore, we use a control function approach (Cameron and Trivedi 2005, Heckman and Robb

1985, Garen 1984) to correct for endogeneity of integration glitches.

Using this procedure, the model in the second step is augmented by including residuals, ê, from the

first step estimation along with the interaction term, ê × Integration Glitches. While the residuals correct

for endogeneity of integration glitches, the interaction term corrects for unobserved heterogeneity

associated with different levels of the integration glitches. To minimize multi-collinearity concerns in the

second step model that may arise from the inclusion of the residual variables (Arabsheibani and Marin

2001), we center Integration Glitches to construct interaction terms with the residuals. The residual

variables, ê and ê × Integration Glitches, are combined into a single variable, SCORE, using principal

6 Further, as a robustness check discussed on p.26, we carried out additional analysis treating project organization

type as an endogenous variable and obtained consistent results.


component analysis (Kutner et al. 2005). The model in the second step is then estimated by using a

generalized least squares (GLS) procedure with robust errors. Detailed information on the key equations

and the assumptions underlying the control function approach are provided in the Appendix.
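The two-step control function procedure described above can be sketched in a simplified form: step 1 regresses the endogenous variable (integration glitches) on exogenous controls plus instruments; step 2 augments the outcome model with the residuals ê and ê × (mean-centered) glitches, combined into a single SCORE via the first principal component. All data below are simulated and the specification is deliberately stripped down; this is not the study's estimation code (and the second step here uses OLS rather than GLS with robust errors).

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
z = rng.normal(size=(n, 2))        # instruments (e.g., duration, module complexity)
x = rng.normal(size=(n, 1))        # an exogenous control
u = rng.normal(size=n)             # unobserved factor driving endogeneity
glitches = z @ [0.7, 0.5] + 0.3 * x[:, 0] + u + rng.normal(size=n)
perf = 2.0 - 0.8 * glitches + 0.4 * x[:, 0] + u + rng.normal(size=n)

def ols_resid(y, X):
    """OLS with intercept; returns residuals and coefficient vector."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return y - X1 @ beta, beta

# Step 1: first-stage regression of glitches on controls and instruments
e_hat, _ = ols_resid(glitches, np.column_stack([x, z]))

# Combine e_hat and e_hat x centered glitches into SCORE (first principal component)
g_c = glitches - glitches.mean()
R = np.column_stack([e_hat, e_hat * g_c])
R_std = (R - R.mean(0)) / R.std(0, ddof=1)
_, _, Vt = np.linalg.svd(R_std, full_matrices=False)
score = R_std @ Vt[0]

# Step 2: outcome model augmented with SCORE to correct for endogeneity
_, beta2 = ols_resid(perf, np.column_stack([glitches, x, score]))
glitches_effect = beta2[1]         # coefficient on glitches after the correction
```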

Note that for an unbiased estimation and empirical identification of the second step model, it is

necessary to use valid instrumental variables in the first step model (Cameron and Trivedi 2005). We

included the following set of instrumental variables in the first step analysis:

Architectural Uncertainty – measured as the average of a three-item scale (Cronbach’s α = 0.76)

(shown in the Appendix) that captures the degree of difficulty in dividing a project into task

modules, the clarity in defining the interdependencies between the modules, and the ease in

defining the interdependencies across task modules (Baldwin and Clark 2000).

Task Module Complexity – measured as a two-item formative scale (shown in the Appendix) that

captures the extent of task modules in a project (Xia and Lee 2005, Wood 1986).

Development – measured as a binary variable with one category representing development

projects and the other category representing maintenance/testing projects.

Project Duration – measured as a continuous variable that captures project duration in months.

The choice of these instrumental variables was driven by both conceptual and methodological

reasons. From a conceptual standpoint, both Architectural Uncertainty and Task Module Complexity

represent the characteristics of task module interdependencies in a project. Prior studies have noted that

an incomplete understanding of the task-module interdependencies may result in inconsistencies across

task modules during integration and result in significant rework (Sosa et al. 2004, Anderson et al. 2007).

Similarly, from a coordination standpoint, an increase in the number of task modules increases the

potential number of design interfaces to be integrated, thereby increasing the possibility for integration

glitches to occur (Herbsleb et al. 2000). We also included the variable Development as an instrument because development projects are more likely to involve a greater extent of new and

unknown interdependencies across task modules compared to maintenance/testing projects where task

module interdependencies are more likely to be established and better understood. Finally, Loch et al.

(2006, p. 52) note that projects of longer duration are “commonly plagued by fundamentally unforeseeable

events and/or unknown interactions among different parts of the project.” Hence, we include Project

Duration as an instrument in the first step model.


From a methodological standpoint, we carried out a number of tests to check the validity of the

chosen instruments, and to empirically confirm the need to correct for the endogeneity of integration

glitches in the analysis. First, we conducted the Hansen J statistic test for over-identification (Kennedy

2008). This test is based on the observation that the error term in the second step model should be

uncorrelated with the set of exogenous variables in the second step model if the instruments are truly

exogenous. A failure to reject the null hypothesis in this test provides evidence in favor of the exogeneity

assumption of the instruments. To that end, the Hansen J statistic for over-identification is statistically

insignificant (χ2 = 3.46, p=0.33). Second, the F-statistic testing the null hypothesis that the slopes of all

instruments are zero is 23.72; this exceeds the recommended cutoff value of 10 in prior studies (e.g.,

Kennedy 2008, Cameron and Trivedi 2005), highlighting the relevance and sufficient predictive power of

the instruments. Third, the Kleibergen-Paap statistic rejects the null hypothesis (χ2 = 17.23, p<0.01) that

the second step model is underidentified. Taken together, the above tests provide methodological support

for the inclusion of these instruments in the first step model. Finally, using these instruments, the

endogeneity of integration glitches was confirmed empirically using the Durbin-Wu-Hausman test

(Maddala 2001) which rejected the exogeneity assumption (χ2 = 6.47, p<0.01).
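The mechanics of two of these diagnostics can be sketched on synthetic data (a minimal illustration, not the study's actual estimation; the data-generating process and variable names are assumed for the example): the first-stage F-statistic for instrument relevance, and the Durbin-Wu-Hausman test implemented via the control-function route.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 830  # matches the study's sample size; the data here are synthetic

# A latent shock shared by the regressor and the outcome creates endogeneity
u = rng.normal(size=n)
z = rng.normal(size=(n, 2))                              # excluded instruments
x = z @ np.array([0.6, 0.4]) + u + rng.normal(size=n)    # endogenous regressor
y = 1.0 - 0.4 * x + u + rng.normal(size=n)               # outcome

def ols(X, y):
    """Return OLS coefficients and residuals."""
    b, *_ = np.linalg.lstsq(X, y, rcond=None)
    return b, y - X @ b

const = np.ones((n, 1))

# Instrument relevance: F-test that all first-stage instrument slopes are zero
X1 = np.hstack([const, z])
_, e1 = ols(X1, x)                     # unrestricted first stage
_, e0 = ols(const, x)                  # restricted model: intercept only
q, k = z.shape[1], X1.shape[1]
F = ((e0 @ e0 - e1 @ e1) / q) / (e1 @ e1 / (n - k))

# Durbin-Wu-Hausman via the control-function route: add the first-stage
# residual to the structural equation and test whether its slope is zero
X2 = np.hstack([const, x[:, None], e1[:, None]])
b2, e2 = ols(X2, y)
sigma2 = e2 @ e2 / (n - X2.shape[1])
se = np.sqrt(sigma2 * np.linalg.inv(X2.T @ X2)[2, 2])
t_resid = b2[2] / se                   # a large |t| rejects exogeneity

print(f"first-stage F = {F:.1f}, DWH t-statistic = {t_resid:.2f}")
```

With instruments this strong, F far exceeds the cutoff of 10 and the residual's t-statistic rejects exogeneity, mirroring the pattern of results reported above.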

5.2. Model Estimation Results

The estimation results are shown in Table 3. To minimize multicollinearity concerns in the first step

model, we used standardized values for the continuous variables in the interaction terms. The highest VIF

value for each model was well below the suggested cutoff value of 10 and the average VIF was less than

2, suggesting that multicollinearity was not a major concern in the analysis (Kennedy 2008). We also

found no other concerns with respect to the regression assumptions of normality and heteroskedasticity.
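A VIF check of this kind can be reproduced in a few lines (a sketch on synthetic predictors; the study's actual design matrix is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 830  # synthetic illustration at the study's sample size

# Three standardized predictors, the first two moderately correlated
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
x3 = rng.normal(size=n)
X = np.column_stack([x1, x2, x3])

def vif(X, j):
    """VIF_j = 1 / (1 - R^2_j), where R^2_j comes from regressing
    column j on the remaining columns (plus an intercept)."""
    y = X[:, j]
    Z = np.column_stack([np.ones(len(y)), np.delete(X, j, axis=1)])
    b, *_ = np.linalg.lstsq(Z, y, rcond=None)
    e = y - Z @ b
    r2 = 1.0 - (e @ e) / ((y - y.mean()) @ (y - y.mean()))
    return 1.0 / (1.0 - r2)

vifs = [vif(X, j) for j in range(X.shape[1])]
print([round(v, 2) for v in vifs])  # all well below the cutoff of 10
```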

As Table 3 indicates, the model in the first step is estimated hierarchically, starting with the baseline

model in column (1) which includes the control variables and the instrumental variables, followed by

column (2) which includes the main effects associated with the work design variables, and column (3)

which includes the hypothesized interaction effects.

Insert Table 3 about here

We first examine Hypothesis 1a, which posits that higher levels of integration glitches are associated with

project organizations that span either a firm boundary (Domestic-Outsourcing) or a country boundary

(Offshore-Insourcing), compared to those that do not span either boundary (Domestic-Insourcing). The


estimation results in column (2) show positive but statistically insignificant coefficients of Domestic-

Outsourcing and Offshore-Insourcing on integration glitches. Hypothesis 1a is, therefore, not supported in

our analysis.

Hypothesis 1b posits that project organizations that span both firm and country boundaries (Offshore-

Outsourcing) are associated with higher levels of integration glitches compared to project organizations

that span either a firm boundary (Domestic-Outsourcing) or a country boundary (Offshore-Insourcing). At

the outset, the sign and statistical significance of the coefficient of Offshore-Outsourcing (βOffshore-Outsourcing =

0.23, p < .05) provide strong evidence of significantly higher levels of integration glitches in an Offshore-

Outsourcing project organization compared to a Domestic-Insourcing project organization. Further, a

comparison of the magnitudes of the coefficient estimates between Offshore-Outsourcing and Domestic-

Outsourcing also indicates a significant positive difference (βOffshore-Outsourcing – βDomestic-Outsourcing = 0.30, p <

.05). However, no significant differences are observed in the comparison of the magnitudes of coefficient

estimates between Offshore-Outsourcing and Offshore-Insourcing (βOffshore-Outsourcing – βOffshore-Insourcing =

0.04, p > .10). Hypothesis 1b is, therefore, partially supported in that spanning both firm and country

boundaries is associated with a significant increase in integration glitches compared to spanning only the

firm boundary, but not compared to spanning only the country boundary. Collectively, the results of

testing Hypothesis 1a and Hypothesis 1b highlight the differential effects of distributed project

organizations on integration glitches and emphasize the point that integration glitches are more likely to

manifest in technology projects when they span both firm and country boundaries (Offshore-Outsourcing)

compared to domestic project organizations (Domestic-Outsourcing and Domestic-Insourcing).

Hypotheses 2a and 2b examine the direct effect of onsite ratio on integration glitches, and the

moderating effects of onsite ratio on the relationship between project organization type and integration

glitches, respectively. The estimation results in column (3) indicate a significant negative coefficient for

the main effect of onsite ratio (βOnsite Ratio = -0.12, p < .01). In essence, as the proportion of project tasks

carried out by a project team at the project client location increases, integration glitches decrease.

Hypothesis 2a is, therefore, supported. Further, in column (4), the coefficient of the moderation effect of

onsite ratio on the relationship between project organization type and integration glitches is negative and

statistically significant in the case of Offshore-Outsourcing project organization (βOnsite Ratio × Offshore-

Outsourcing = -0.22, p < .05). No other moderation effect is significant. Hypothesis 2b is, therefore, partially


supported. Figure 3a below plots the results associated with the above moderation effect.
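As a back-of-envelope reading of this moderation, the implied Offshore-Outsourcing gap in integration glitches at different onsite-ratio levels can be computed from the reported point estimates (illustrative only; the estimates come from different columns of Table 3 and sampling error is ignored):

```python
# Implied Offshore-Outsourcing gap in integration glitches, combining the
# reported main effect (0.23) with the interaction (-0.22); z is the onsite
# ratio in standard-deviation units, since the variable was standardized.
def offshore_outsourcing_gap(z):
    return 0.23 - 0.22 * z

for z in (-1.0, 0.0, 1.0):
    print(f"onsite ratio z = {z:+.0f}: gap = {offshore_outsourcing_gap(z):.2f}")
```

At one standard deviation above the mean onsite ratio the implied gap nearly vanishes, which is the pattern the moderation plot conveys.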

Hypotheses 3a and 3b examine the direct effect of joint-task ownership on integration glitches, and

the moderating effects of joint-task ownership on the relationship between project organization type and

integration glitches, respectively. The estimation results in column (2) indicate a significant negative

coefficient for the main effect of joint-task ownership (βJoint-Task Ownership = -0.19, p < .01), indicating that,

as the extent of joint-task ownership in a project increases, integration glitches decrease. Hypothesis 3a is, therefore, supported. Further, in

column (3), the coefficient of the moderation effect of joint-task ownership on the relationship between

project organization type and integration glitches is negative and statistically significant in the case of

Offshore-Outsourcing project organization (βJoint-Task Ownership × Offshore-Outsourcing = -0.23, p < .01). However, as

in the case of onsite ratio, no other moderation effects are significant. Hypothesis 3b is, therefore,

partially supported. Figure 3b below plots the results associated with the above moderation effect.

Insert Figures 3a and 3b about here

Finally, the model in the second step is estimated hierarchically using a GLS estimation

procedure. This procedure involves first estimating a baseline model with control variables and work

design variables in column (5), followed by the addition of the integration glitches variable and the correction

for endogeneity in column (6). The estimation results in column (6) show a statistically significant

negative coefficient for integration glitches (γIntegration Glitches = -0.40, p< .01) indicating that increasing

levels of integration glitches are associated with a decrease in project performance.

5.3. Robustness Checks

Impact of Common Method Variance (CMV): We undertook a number of steps to minimize CMV in

our study. During the survey design, we focused on reducing CMV by: (i) maintaining anonymity of

respondents and assuring them of confidentiality, which increased their willingness to provide accurate

responses; (ii) ironing out vague concepts and ambiguities in measurement items through rigorous pilot

testing; and (iii) using measurement scales with different formats for the key variables in our analysis

(Podsakoff et al. 2003). From a statistical standpoint, given the factual nature of the project organization

type and onsite ratio variables, as well as the non-linear nature of some of the hypothesized relationships in

our study, CMV is less likely to bias our results (Siemsen et al. 2010). That is, the respondents are less

likely to (i) make attributions between a nominal categorical variable (project organization type) or a percentage


measure (onsite ratio) and a continuous variable (integration glitches), and (ii) predict variations in the

magnitude and the direction of the moderating effects. Nonetheless, to detect CMV, we carried out the

marker variable test. The basis for this test is that if a manifest variable exists that is theoretically

uncorrelated with the other manifest variables in the data set, then the smallest positive value of the

correlation between this variable and the other variables is a reasonable proxy for CMV (Lindell and

Whitney 2001, Malhotra et al. 2006). Given that post-hoc identification of such a variable can “capitalize

on chance,” Lindell and Whitney (2001, p. 116) further suggest using the second smallest positive

correlation in the data set as a conservative estimate for CMV. Using this correlation (ρ = 0.01, p>0.1),

we computed the corrected correlations between all items in our dataset and their t-statistics. A comparison

of all inter-item correlations before and after correcting for CMV revealed little or no change in their

significance levels, indicating that CMV was not a threat to the validity of the results.
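The marker-variable correction itself is a one-line formula; a minimal sketch follows, where the marker correlation (r_M = 0.01) and sample size are taken from the study but the observed inter-item correlation of 0.30 is a hypothetical input:

```python
import math

def cmv_adjusted_correlation(r_obs, r_marker, n):
    """Lindell and Whitney (2001) marker-variable correction: partial the
    marker correlation r_M out of an observed correlation, then test the
    adjusted value with n - 3 degrees of freedom."""
    r_adj = (r_obs - r_marker) / (1.0 - r_marker)
    t = r_adj * math.sqrt((n - 3) / (1.0 - r_adj ** 2))
    return r_adj, t

# Hypothetical observed inter-item correlation of 0.30
r_adj, t = cmv_adjusted_correlation(r_obs=0.30, r_marker=0.01, n=830)
print(round(r_adj, 3), round(t, 2))
```

Because the marker correlation is so small here, corrected correlations barely move, which is exactly the comparison reported above.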

Endogeneity of Project Organization Type: The choice of project organization type represents an

endogenous variable that may potentially impact the results in the model predicting integration glitches.

We further note that the type of project organization represents a treatment whose specific effect is

being observed in the model. The traditional Heckman two-step endogeneity correction approach for

categorical variables is not suitable here since it does not allow us to observe the specific effects of

project organization type on integration glitches. Economists have proposed an alternative approach—the

endogenous treatment effect model (Cameron and Trivedi 2005)—which specifically corrects for the

endogeneity of the treatment while allowing us to observe its effect on the dependent variable of interest.

Such an approach, however, requires the treatment effect variable to be a binary variable (instead of a

multinomial categorical variable as in our study). Given that our main analysis results highlighted the

significant effects of offshore project organizations (i.e., Offshore-Insourcing and Offshore-Outsourcing)

on integration glitches, we constructed a binary categorical variable, Intercountry, which is coded as 1 for

offshore project organizations and 0 for domestic project organizations. We identified two instruments—

Project Type (Hardware, Software, Others) and Average Work Experience (average work experience of

the project team members in years) that may affect Intercountry directly and may be related to integration

glitches only through this variable. The type of project may represent the level of physical asset specificity or

information asset specificity and may affect the choice of whether a project is carried out across country

boundaries or not. Similarly, the use of team members with greater project experience may represent the


level of skill requirements or challenge in a project and may affect whether a project is carried out in

offshore project organizations or not. Table A3 in the Appendix presents the results from the analysis of

the endogenous treatment effects model. Consistent with the main analysis results, the results from this

model highlight the significant positive effect of Intercountry on integration glitches, suggesting that

distributed project organizations that span country boundaries are associated with higher levels of

integration glitches compared to domestic project organizations.
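The two-step logic of such a binary-treatment correction can be sketched on synthetic data (an assumption-laden illustration, not the estimator behind Table A3: the data-generating process is invented, and the second-step standard errors would need further correction in practice):

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(2)
n = 830  # synthetic data at the study's sample size

# Binary treatment (an "Intercountry" analog) driven by an instrument w,
# with errors correlated across the treatment and outcome equations
w = rng.normal(size=n)                               # instrument
eps = rng.normal(size=n)
d = (0.8 * w + eps > 0).astype(float)                # treatment
y = 1.0 + 0.5 * d + 0.6 * eps + rng.normal(size=n)   # outcome; true effect 0.5

# Step 1: probit of the treatment on the instrument, fit by maximum likelihood
def neg_loglik(b):
    p = stats.norm.cdf(b[0] + b[1] * w).clip(1e-10, 1 - 1e-10)
    return -np.sum(d * np.log(p) + (1 - d) * np.log(1 - p))

b_hat = optimize.minimize(neg_loglik, x0=np.zeros(2)).x
idx = b_hat[0] + b_hat[1] * w

# Step 2: augment the outcome regression with the generalized residual
# (inverse Mills ratio terms), which absorbs the error correlation
phi, Phi = stats.norm.pdf(idx), stats.norm.cdf(idx)
lam = np.where(d == 1, phi / Phi, -phi / (1 - Phi))
X = np.column_stack([np.ones(n), d, lam])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)

# Naive OLS for comparison: biased upward by the error correlation
naive, *_ = np.linalg.lstsq(np.column_stack([np.ones(n), d]), y, rcond=None)
print(f"naive effect = {naive[1]:.2f}, corrected effect = {beta[1]:.2f}")
```

The corrected estimate recovers the treatment effect far more closely than the naive regression, which conflates the treatment with the correlated error.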

Controlling for the Effects of “Nearshoring” projects: Of the 178 (out of 208) Offshore-Insourcing

and Offshore-Outsourcing projects for which we had data on country location, 31 projects (17%) involved

nearshoring transactions (e.g., US-Canada, Eastern Europe-Western Europe, and South-East Asia-Middle

East) while the remaining 147 projects (83%) involved transactions across geographically and culturally

distant locations (e.g., North America-Asia, Europe-Asia). A concern may arise that our analysis does not

differentiate nearshoring projects from those involving geographically and culturally distant countries.

The results of additional analyses conducted by: (i) excluding the 31 nearshoring projects from the total sample, and (ii)

including an indicator variable (1 = Nearshored, 0 otherwise) were consistent with those estimated with

the full sample, thereby highlighting the robustness of the results.

6. Discussion

The need for understanding project integration challenges in technology projects and, more

specifically, in the context of project organizational structures has been emphasized in prior literature as

firms continue to unbundle their value chain operations (e.g., Anderson and Parker 2013, Hayes et al.

2005). In an effort to address these needs, the goal we set forth was to develop and test a conceptual

framework that examines how work design elements in a technology project impact project integration

and, in turn, project performance. The results of our empirical analysis suggest that the type of project

organization plays a significant role in determining the extent of integration glitches in a project. We find

support for the differential effects of distributed project organizations on project integration, with project

organizations spanning both firm and country boundaries (Offshore-Outsourcing) experiencing

significantly higher levels of integration glitches compared to domestic project organizations (Domestic-

Outsourcing and Domestic-Insourcing). Additionally, toward minimizing integration glitches in

technology projects, particularly in those that span firm and country boundaries (Offshore-Outsourcing),


our results highlight the benefits of “adjusting” the organizational structure of a project by increasing its

onsite ratio, and by increasing the levels of joint-task ownership in the project. This is an actionable

insight that is consequential, both from a theoretical and a practical standpoint. Finally, delving into the

performance consequences, our results unequivocally demonstrate the strong negative impact of

integration glitches on project performance, thereby highlighting the importance of addressing such

glitches in technology projects.

6.1. Contributions to Theory

In light of the above findings, our study makes a three-fold contribution to the operations strategy

literature. First, our study develops and empirically analyzes a conceptual framework that sheds light on

the understudied topic of project integration in this literature. While project integration has been

recognized as a key phase in the execution of a technology project (Anderson et al. 2007), and the

organizational structure decisions have received significant attention in the operations strategy literature

(e.g., Iyer et al. 2013, Narayanan et al. 2011), we know little about how organizational structure decisions

influence project integration and, in turn, project performance. Our study bridges this gap by developing

an overarching conceptual framework that synthesizes these two streams of literature and examines the

interrelationship between the elements of work design, integration glitches, and project performance.

Further, the significant performance consequences of integration glitches in our analysis underscore the

importance of managing architectural knowledge in technology projects. Often, the time-sensitive, high-

pressure environment that characterizes technology project execution can significantly constrain the

resources available for developing and coordinating architectural knowledge. While the strategic

implications of architectural knowledge for firm-level competitive advantage are widely acknowledged

(Henderson and Clark 1990), our study highlights the micro-level effects of such knowledge in improving

project integration and, in turn, improving project performance.

Second, our study responds to the calls from researchers for using a more nuanced conceptual

exposition of project organizational structure, especially with technology projects increasingly spanning

firm and/or country boundaries (Agerfalk et al. 2009, Metters 2008, O’Leary and Cummings 2007). Past

studies on project organizational structures have examined the impact of firm and country boundaries on

the coordination process separately (Narayanan et al. 2011, Levina and Vaast 2008). However, little is


known about how the different types of boundaries “stack up” with respect to each other in terms of the

coordination challenges in distributed projects (Lee et al. 2007). Further, empirical studies examining

coordination challenges in project organizational structures that span country boundaries have often

tended to view Offshore-Insourcing and Offshore-Outsourcing project organizations similarly, both

theoretically and empirically. By using a classification scheme to delineate boundary spanning in

technology project organizations across firm and country boundaries – as depicted in Figure 1 – our study

provides a more refined understanding of the differential effects of project organizational structures on

coordination than can be gleaned from the extant literature.

The third contribution of this study is in the conceptualization and measurement of integration

glitches in a technology project. Much of our understanding of project integration is derived from detailed

case studies and interviews of product development practitioners documented in prior studies, with the

result that empirical operationalization of integration glitches has not received attention beyond measures

of “fault” or “bug” counts. Such count measures are not reflective of the severity or the intensity of the

integration challenges that occur in technology projects. Further, as noted previously, a focus on count

measures also heightens the potential for measurement error when respondents are

asked to recollect specific values reflecting the number of integration glitches. In this study, we build

upon Hoopes and Postrel's (1999) conceptualization of product development “glitches” and Espinosa's

(2002) notion of technical coordination challenges to develop a fine-grained, multi-item measure of

integration glitches that takes into account the extent of functional reliabilities, functional inconsistencies,

the degree of rework, and the overall difficulty associated with integration of task modules in a

technology project. The grounding of the measurement items in the extant literature ensures content

validity of the proposed scale. The other scale properties (i.e., convergent and discriminant validities,

reliability) are established using a large multi-industry, multi-country sample of technology projects,

thereby demonstrating the robustness of the multi-item measure of integration glitches.

6.2. Contributions to Practice

From a practical standpoint, the findings of this study provide valuable insights for managers.

Specifically, the finding that integration glitches are particularly salient in Offshore-Outsourcing project

organizations (compared to Domestic-Outsourcing and Domestic-Insourcing project organizations),


emphasizes the need for managers to frame contingency plans for addressing integration glitches in such

project organizations. One specific recommendation here would be to staff Offshore-Outsourcing project

organizations with technical personnel who not only have prior work experience in the project functional

domain, but also in multicultural work environments, and make them responsible for the coordination of

architectural knowledge between the project client and the project team. Such a recommendation is

consistent with the notion of “supply chain integrators” highlighted in Parker and Anderson's (2002) study,

where such personnel were involved in mediating, negotiating, coordinating, and translating architectural

knowledge between the focal firm and its suppliers for ensuring successful product integration.7

Additionally, our conceptualization of the work design elements in a technology project emphasizes

to managers that decisions relating to such elements are not necessarily static in the project, but can be

adjusted appropriately during project execution to address project integration challenges. In particular,

while decisions related to the organizational structure of a technology project are often framed at the top

management level and are less prone to adaptation and change in the short-run (Novak and Eppinger

2001, Wheelwright and Clark 1992), managers can actively facilitate the exchange of architectural

knowledge between a project client and the project team within a chosen organizational structure by

enhancing the levels of onsite ratio and joint-task ownership in a project (Srikanth and Puranam 2011).

6.3. Limitations and Future Research

Our study has the following limitations. First, the use of a single respondent for collecting project

data is a limitation. Collecting data from multiple respondents is often feasible when data collection is

restricted to projects within a limited set of firms. In this study, the empirical analysis required data from

projects across the different types of project organizations, all of which are seldom found within one or

few firms. Nonetheless, the procedural steps undertaken in designing the survey, the non-linear nature of

the hypotheses, and the empirical results from the marker variable technique minimize concerns of single-

respondent bias in our study. Second, the cross-sectional nature of the data collected limits our ability to

infer causality among the key variables of interest in our study. It is unlikely, however, that the causal

7 Similarly, Staudenmayer et al. (2005) found that firms that managed interdependencies better typically employed

a “relationship manager” who possessed a broad spectrum of skills—consisting of both technical skills (e.g.,

engineering design, programming languages, and systems engineering) and business skills (e.g., project

management, costing, and business case evaluation)—to facilitate communication between the functional experts on

the project client side and the technical experts on the project team side.


ordering of the key variables would be different from what is posited since decisions related to work

design elements temporally precede the actual execution phase when project integration takes place

(Dibbern et al. 2008). Further, project integration, by definition, temporally precedes the completion of a

project after which its performance is evaluated (Ethiraj and Levinthal 2004). Third, given the use of

perceptual measures for integration glitches and project performance, it is conceivable that objective

measures for such variables could further enhance the validity of our study findings.

While addressing the above limitations would be logical extensions to this study, the study findings

also present avenues for future research. In particular, while integration glitches create rework, they often

require a project client and the project team to revisit the functional and technical specifications of the

project and evaluate their feasibility. In some instances, integration glitches may actually prove to be a

“window of opportunity” for a project client and the project team to take decisions that can benefit both

sides. Hence, a natural direction for future research would be to identify project execution factors or the

specific conditions that would moderate the negative relationship between integration glitches and project

performance. We encourage scholars to continue to pursue this exciting line of inquiry.

Acknowledgments

The authors thank M. Johnny Rungtusanatham, the Senior Editor and the three reviewers for their

constructive comments and helpful suggestions that have significantly improved the paper. The authors

are also grateful to Andrea Prencipe, Phanish Puranam, Michael Jacobides, Manuel Sosa, Samer Faraj,

Tyson Browning, Cheryl Druehl, and Jim Lenz for their insightful comments and constructive

suggestions on earlier versions of the paper. Special thanks go to the Project Management Institute’s

Information Systems group, the New Product Development group, Kimberly Johnson, Mary Walker, and

Paul Krebs for their assistance in data collection.


References

Adler, P. S. 1995. Interdepartmental interdependence and coordination: The case of the design/manufacturing interface. Organization Science 6(2) 147-167.

Ågerfalk P. J., B. Fitzgerald, S. A. Slaughter. 2009. Flexible and distributed information systems development: State of the art and research challenges. Information Systems Research 20(3) 317–328.

Anderson, J. C., D. W. Gerbing. 1988. Structural equation modeling in practice: A review and recommended two-step approach. Psychological Bulletin 103 411-423.

Anderson, E., A. Davis-Blake, N. Joglekar, S. Erzurumlu, G. G. Parker. 2007. The impact of outsourcing on product integration. In C. Loch, S. Kavadias (eds.), Handbook of New Product Development. Butterworth-Heinemann, Oxford, U.K., 259-290.

Anderson, E., G. G. Parker. 2013. Integration in global knowledge networks. Production & Operations Management 22(6) 1446-1463.

Arabsheibani, G. R., A. Marin. 2001. Self-selectivity bias with a continuous variable: Potential pitfall in a common procedure. Applied Economics 33 1903-1910.

Ariane 5 Accident Report. http://sunnyday.mit.edu/accidents/Ariane5accidentreport.html

Armstrong, J. S., T. S. Overton. 1977. Estimating nonresponse bias in mail surveys. Journal of Marketing Research 14(3) 396-402.

Baldwin, C. Y., K. B. Clark. 2000. Design Rules, Vol. 1: The Power of Modularity. MIT Press, Cambridge, MA.

Bardhan, I., V. V. Krishnan, S. Lin. 2013. Team dispersion, information technology, and project performance. Production & Operations Management 22(6) 1478-1493.

Boehm, B., R. Turner. 2003. Balancing Agility and Discipline: A Guide for the Perplexed. Addison-Wesley, Boston, MA.

Boh, W. F., Y. Ren, S. Kiesler, R. Bussjaeger. 2007. Expertise and collaboration in the geographically dispersed organization. Organization Science 18(4) 595-612.

Cameron, C. A., P. K. Trivedi. 2005. Microeconometrics: Methods and Applications. Cambridge University Press, Cambridge.

Clark, K. B., T. Fujimoto. 1991. Product Development Performance: Strategy, Organization and Management in the World Auto Industry. Harvard Business School Press, Boston, MA.

Cooper, K. G. 1993. The rework cycle: Why projects are mismanaged. PM Network (February) 5-7.

Cummings, J. N., J. A. Espinosa, C. K. Pickering. 2009. Crossing spatial and temporal boundaries in globally distributed projects: A relational model of coordination delay. Information Systems Research 20(3) 420-439.

De Marco, T., T. Lister. 2003. Waltzing with Bears: Managing Risk on Software Projects. Dorset House Publishing, New York, NY.

Dibbern, J., J. Winkler, A. Heinzl. 2008. Explaining variations in client extra costs between software projects offshored to India. MIS Quarterly 32(2) 333-366.

Eppinger, S. D., A. R. Chitkara. 2006. The new practice of global product development. MIT Sloan Management Review 47(4) 22–30.

Espinosa, J. A. 2002. Shared Mental Models and Coordination in Large-Scale Distributed Software Development. Doctoral dissertation, Carnegie Mellon University.

Espinosa, J. A., S. A. Slaughter, R. E. Kraut, J. D. Herbsleb. 2007. Familiarity, complexity, and team performance in geographically distributed software development. Organization Science 18(4) 613-630.

Ethiraj, S. K., D. Levinthal. 2004. Bounded rationality and the search for organizational architecture: An evolutionary perspective on the design of organizations and their evolvability. Administrative Science Quarterly 49(3) 404-437.

Ethiraj, S. K., P. Kale, M. S. Krishnan, J. V. Singh. 2005. Where do capabilities come from and how do they matter? A study in the software services industry. Strategic Management Journal 26(1) 25-45.

Fenton, N. E., N. Ohlsson. 2000. Quantitative analysis of faults and failures in a complex software system. IEEE Transactions on Software Engineering 26(8) 797-814.

Fitzgerald, B., G. Hartnett, K. Conboy. 2006. Customising agile methods to software practices at Intel Shannon. European Journal of Information Systems 15(2) 200–213.

Fornell, C., D. F. Larcker. 1981. Evaluating structural equation models with unobservable variables and measurement error. Journal of Marketing Research 18 39–50.

Galbraith, J. R. 1973. Designing Complex Organizations. Addison-Wesley Longman Publishing Co.

Garen, J. 1984. The returns to schooling: A selectivity bias approach with a continuous choice variable. Econometrica 52(5) 1199-1218.

Gatignon, H., M. L. Tushman, W. Smith. 2002. A structural approach to assessing innovation: Construct development of innovation locus, type, and characteristics. Management Science 48(9) 1103-1122.


Gokpinar, B., W. J. Hopp, S. M. R. Iravani. 2010. The impact of misalignment of organizational structure and product architecture on quality in complex product development. Management Science 56(3) 468-484.

Hackman, J. R., R. Wageman. 1995. Total quality management: Empirical, conceptual, and practical issues. Administrative Science Quarterly 40 309-342.

Hackman, J. R., G. R. Oldham. 1975. Development of the job diagnostic survey. Journal of Applied Psychology 60 159-170.

Hackman, J. R. 1987. The design of work teams. In J. Lorsch (ed.), Handbook of Organizational Behavior. Prentice-Hall, Englewood Cliffs, NJ.

Hahn, E. D., K. Bunyaratavej. 2010. Services cultural alignment in offshoring: The impact of cultural dimensions on offshoring location choices. Journal of Operations Management 28(3) 186-193.

Hamilton, D. P. 2001. Intel gambles it can move beyond the PC with new microprocessor. The Wall Street Journal (May 29) 1.

Handley, S. M., W. C. Benton Jr. 2013. The influence of task- and location-specific complexity on the control and coordination costs in global outsourcing relationships. Journal of Operations Management 31(3) 109-128.

Hayes, R., G. Pisano, D. Upton, S. Wheelwright. 2005. Pursuing the Competitive Edge: Operations, Strategy, Technology. John Wiley and Sons, New Jersey.

Heath, C., N. Staudenmayer. 2000. Coordination neglect: How lay theories of organizing complicate coordination in organizations. Research in Organizational Behavior 22 153-192.

Heckman, J. J., R. Robb. 1985. Alternative methods for estimating the impact of interventions. In J. J. Heckman and B. Singer (eds.), Longitudinal Analysis of Labor Market Data. Cambridge University Press, Cambridge.

Henderson, R. M., K. B. Clark. 1990. Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms. Administrative Science Quarterly 35(1) 9-30.

Herbsleb, J. D., A. Mockus, T. A. Finholt, R. E. Grinter. 2000. Distance, dependencies, and delay in a global collaboration. Proceedings of the 2000 ACM Conference on CSCW, Philadelphia, PA.

Hinds, P. J., C. D. Cramton. 2008. Situated knowing who: Why site visits matter in global work. Presented at the 2007 Academy of Management Meeting, Philadelphia, PA.

Hoegl, M., K. Weinkauf, H. G. Gemuenden. 2004. Interteam coordination, project commitment, and teamwork in multiteam R&D projects: A longitudinal study. Organization Science 15(1) 38-55.

Hoopes, D. G., S. Postrel. 1999. Shared knowledge, “glitches,” and product development performance. Strategic Management Journal 20(9) 837-865.

Iyer, A., H. L. Lee, A. Roth. 2013. Introduction to special issue on POM research on emerging markets. Production and Operations Management 22(3) 233-235.

Kennedy, P. 2008. A Guide to Modern Econometrics. Blackwell Publishing, Oxford.

Kirkman, B. L., J. E. Mathieu. 2005. The dimensions and antecedents of team virtuality. Journal of Management 31(5) 700-718.

Kline, R. B. 1998. Principles and Practice of Structural Equation Modeling. Guilford, New York.

Kutner, M. H., C. J. Nachtsheim, J. Neter, W. Li. 2005. Applied Linear Statistical Models. McGraw-Hill/Irwin, Boston, MA.

Langlois, R. N., P. L. Robertson. 1992. Networks and innovation in a modular system: Lessons from the microcomputer and stereo component industries. Research Policy 21(4) 297-313.

Larsen, M. M., S. Manning, T. Pedersen. 2013. Uncovering the hidden costs of offshoring: The interplay of complexity, organizational design, and experience. Strategic Management Journal 34 533-552.

Lee, G., W. DeLone, J. A. Espinosa. 2007. Ambidexterity and global IS project success: A theoretical model. Proceedings of the 40th Annual Hawaii International Conference on System Sciences (HICSS '07), 44-44. Retrieved from http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=4076458

Levina, N., E. Vaast. 2008. Innovating or doing as told? Status differences and overlapping boundaries in offshore collaboration. MIS Quarterly 33(2) 307-332.

Lindell, M. K., D. J. Whitney. 2001. Accounting for common method variance in cross-sectional research designs. Journal of Applied Psychology 86(1) 114-121.

Loch, C. H., A. DeMeyer, M. T. Pich. 2006. Managing the Unknown: A New Approach to Managing High Uncertainty and Risk in Projects. John Wiley and Sons, Hoboken, NJ.

Maddala, G. S. 2001. Introduction to Econometrics. Wiley, New York.

Majchrzak, A., A. Malhotra, R. John. 2005. Perceived individual collaboration know-how development through information technology-enabled contextualization: Evidence from distributed teams. Information Systems Research 16(1) 9-27.

Malhotra, N. K., S. S. Kim, A. Patil. 2006. Common method variance in IS research: A comparison of alternative

-33-

approaches and a reanalysis of past research. Management Science 52(12) 1865–1883. Maruping, L. M., X. Zhang, V. Venkatesh. 2009. Role of collective ownership and coding standards in coordinating

expertise in software project teams. European Journal of Information Systems 18(4) 355-371. Metiu, A. 2006. Owning the code: Status closure in distributed groups. Organization Science 17(4) 418-435. Metters R. 2008. A typology of offshoring and outsourcing in electronically transmitted services. Journal of

Operations Management 26(2) 198–211. Mitchell, V., B. Nault. 2007. Cooperative planning, uncertainty and managerial control in concurrent design.

Management Science 53(3) 375-389. Mooi, E. A., M. Ghosh. 2010. Contract specificity and its performance implications. Journal of Marketing 74 105-120. Narayanan, S., S. Balasubramanian , J. M. Swaminathan. 2011. Managing outsourced software projects: An analysis

of project performance and customer satisfaction. Production & Operations Management 20(4) 508–521. Nidumolu, S. 1995. The effect of coordination and uncertainty on software project performance: Residual

performance risk as an intervening variable. Information Systems Research 6(3) 191-219. Novak, S., S. Eppinger. 2001. Sourcing by design: Product complexity and the supply chain. Management Science

47(1) 189–204. O'Leary, M.B., J.N. Cummings. 2007. The spatial, temporal, and configurational characteristics of geographic

dispersion in teams. MIS Quarterly 31(3) 433-452. Parker, G.G., E.G. Anderson. 2002. From buyer to integrator. The transformation of the supply-chain manager in the

vertically disintegrating firm. Production and Operations Management 11(1) 75-91. PCWorld 2011. http://www.pcworld.com/article/239641/lawson_software_customer_embroiled_in_erp_project_lawsuit.html Pikkarainen, M., X. Wang, K. Conboy. 2007. Agile practices in use from an innovation assimilation perspective: A

multiple case study. In Proceedings of the 28th International Conference on Information Systems, pp 1–17, Association for Information Systems Press, Montreal, Canada.

Podsakoff, P. M., S. B. MacKenzie, J. Y. Lee, N. P. Podsakoff. 2003. Common method biases in behavioral research: A critical review of the literature and recommended remedies. Journal of Applied Psychology 88(5) 879-903.

Prencipe A., A. Davies, M. Hobday.2003. The Business of System Integration. Oxford University Press,UK. Ramasubbu, N., S. Mithas, M. S. Krishnan, C. F. Kemerer. 2008. Work dispersion, process based learning, and

offshore software development performance. MIS Quarterly 32(2) 437-458. Rottman, J. W., M. C. Lacity. 2008. A US client learning from outsourcing IT work offshore. Information Systems

Frontier 10(2) 259-275. Schumacker, R. E., R.G. Lomax. A Beginner's Guide to Structural Equation Modeling. Lawrence Erlbaum

Associates, Inc., Mahwah, NJ, 1996. Siemsen, E., A. M. Roth, P. Oliviera. 2010. Common method bias in regression models with linear, quadratic, and

interaction effects. Organizational Research Methods 13(3) 456-476. Simon H. A. 1962. The architecture of complexity. Proceedings of American Philosophical Society 106(6) 467-482. Sinha K. K., A. H. Van de Ven. 2005. Designing work within and between organizations. Organization Science

16(4) 389-408. Sosa, M. E., S. D. Eppinger, C. M. Rowles. 2004. The misalignment of product architecture and organizational

structure in complex product development. Management Science 50(12) 1674-1689. Srikanth, K., P. Puranam. 2011. Integrating distributed work: Comparing task design, communication, and tacit

design mechanisms. Strategic Management Journal 32 849-875. Stuckenbruck, L. C. 1988. Integration: The essential function of project management. In Project Management

Handbook, (eds) D. I. Cleland, W. R. King, Van Nostrand Reinhold, New York Project Management Handbook, p. 56-81.

Stock, G.N., M. V. Tatikonda. 2008. The joint influence of technology uncertainty and interorganizational

interaction on external technology integration success. Journal of Operations Management 26 65-80. Tanriverdi, H., P. Konana, L. Ge. 2007. The choice of sourcing mechanisms for business processes. Information

Systems Research 18(3) 280-299. Tenhiala, A., F. Salvador. 2014. Looking inside glitch mitigation capability: The effect of intraorganizational

communication channels. Decision Sciences 45(3) 437-466. Thompson, J. D. 1967. Organizations in Action: Social Science Bases of Administrative Theory. McGraw-Hill, New

York. Tucker, A. I. Nembhard, A. Edmondson. 2007. Implementing new practices: An empirical study of organizational

learning in hospital intensive care units. Management Science 53(6) 894-907. Tushman, M. L., D. A. Nadler. 1978. Information processing as an integrating concept in organizational design.

Academy of Management Review 3 613-624. Tyre, M. J., E. von Hippel. 1997. The situated nature of adaptive learning in organizations. Organization Science

-34-

8(1) 71–83. Wheelwright, S. C. K. B. Clark.1992. Revolutionizing product development, quantum leaps in speed, efficiency, and

quality. The Free Press. New York. Williams, T. 2005. Assessing and moving on from the dominant project management discourse in the light of

project overruns. IEEE Transactions in Engineering Management 52(4) 497-508. Wood, R.E. 1986. Task Complexity: Definition of the construct. Organizational Behavior and Human Decision

Processes 37(1) 60-82. Womack, J.P., D.T. Jones, D. Roos. 1990. The Machine that Changed the World. Maxwell Macmillan International,

New York, NY.

Table 1: Examples of Integration Glitches in Technology Projects

Itanium Chip Development: Partnering with Hewlett-Packard, Intel developed the Itanium microprocessor to dramatically enhance processing power beyond the existing x86 microprocessor. During development, the project team, comprising more than 500 members, was divided into smaller teams, each working on a specific task module. Initially, the teams proceeded with their work without fully understanding how the task modules interfaced with one another. The subsequent integration of the task modules revealed that the assembly was bigger than the die in which it was supposed to be placed. Follow-up rework led to changes in a specific subset of task modules that rippled across the work of several hundred team members, not only causing technical performance issues but also delaying the project by more than 18 months (Source: Hamilton 2001).

ERP Implementation at CareSource Management Group: Healthcare plan administrator CareSource Management Group sued Lawson Software, claiming that an ERP system from the company had not been able to get beyond the testing phase and was not the fully integrated deliverable that Lawson was contracted for. The system consisted of two task modules: a human resource (HR) module and a financial application (FA) module. During project integration, severe data-transfer issues occurred between the HR and FA modules, resulting in significant rework and halting the project's progress (Source: PCWorld 2011).

Arianespace Software Development: The maiden flight of the Ariane 5, launched by the French commercial space transportation company Arianespace, crashed 40 seconds after takeoff, resulting in a loss of nearly $500 million for the company. Initial investigation revealed that the crash resulted from a software error at the interface of two task modules: the Inertial Reference Software (IRS) and the Flight Software (FS), which integrate to perform computations and adjust the trajectory of the launcher. Detailed investigation revealed that the IRS and the FS had been tested independently, but not in an integrated manner, during development, which resulted in an integration glitch during launcher takeoff (Source: DeMarco and Lister 2003, Ariane 5 Accident Report).

Table 2: Distribution of Onsite Ratio across Project Organization Types

Project Organization Type    Number of Projects    Mean     Std. Dev.
Domestic-Insourcing          454                   70.69    36.01
Domestic-Outsourcing         168                   59.32    34.29
Offshore-Insourcing           71                   44.61    29.31
Offshore-Outsourcing         137                   42.45    24.42


Table 3: Model Estimation Results

Independent Variables
First Step: OLS Regression Model; Dependent Variable: Integration Glitches [Models (1)-(3)]
Second Step: GLS Regression Model; Dependent Variable: Project Performance [Models (4)-(5)]

(1) (2) (3) (4) (5)

(Constant) 3.04 (0.35)*** 2.88 (0.35)*** 2.85 (0.38)*** 4.30 (0.36)*** 4.55 (0.36)***

ln(Team Size) 0.11 (0.05)** 0.09 (0.05) 0.09 (0.06) -0.09 (0.07) -0.07 (0.06)

Project Budget -0.06 (0.04) -0.03 (0.04) -0.04 (0.04) 0.06 (0.04)* 0.07 (0.04)*

Requirements Uncertainty 0.15 (0.04)*** 0.14 (0.04)*** 0.14 (0.04)** -0.03 (0.05) 0.01 (0.04)

Hardware 0.26 (0.13)* 0.11 (0.14) 0.09 (0.15)** -0.33 (0.16)** -0.30 (0.15)**

Software 0.20 (0.12) 0.10 (0.13) 0.11 (0.14)*** -0.33 (0.15)** -0.33 (0.14)**

Past Experience -0.26 (0.04)*** -0.22 (0.05)*** -0.22 (0.05)*** 0.20 (0.06)** 0.13 (0.06)**

Project Manager -0.12 (0.06) -0.08 (0.12) -0.07 (0.13)** 0.33 (0.14)*** 0.33 (0.13)***

Senior Manager -0.07 (0.11) 0.03 (0.16) 0.03 (0.16)** 0.03 (0.19)** 0.09 (0.16)**

Project Client 0.23 (0.14) 0.18 (0.14) 0.17 (0.14) -0.20 (0.18) -0.16 (0.16)

Project Team -0.29 (0.13)** -0.23 (0.13) -0.23 (0.13)*** -0.01 (0.16) -0.12 (0.15)

ln(Proj. Mgmt. Exp.) -0.04 (0.06) -0.06 (0.06)*** -0.07 (0.06) -0.13 (0.08)*** -0.20 (0.07)***

Development 0.07 (0.09) 0.07 (0.09) 0.08 (0.09)

ln(Duration) 0.19 (0.06)*** 0.17 (0.07)*** 0.18 (0.07)***

Task Module Complexity -0.01 (0.05) 0.00 (0.05) 0.00 (0.05)

Architectural Uncertainty -0.12 (0.04)*** -0.13 (0.04)*** -0.13 (0.04)***

Domestic-Outsourcing -0.07 (0.10) -0.05 (0.10) -0.12 (0.12) -0.13 (0.11)

Offshore-Insourcing 0.19 (0.17) 0.33 (0.20) -0.44 (0.18) -0.21 (0.22)

Offshore-Outsourcing 0.23 (0.13)** 0.08 (0.14) -0.56 (0.16)*** -0.49 (0.14)***

Onsite Ratio -0.12 (0.04)*** -0.13 (0.04)*** 0.36 (0.05)*** 0.29 (0.05)***

Joint Task Ownership -0.19 (0.04)*** -0.17 (0.05)*** 0.06 (0.04) 0.03 (0.04)

Domestic-Outsourcing×Onsite Ratio 0.04 (0.09)

Offshore-Insourcing×Onsite Ratio 0.20 (0.17)

Offshore-Outsourcing×Onsite Ratio -0.22 (0.13)**

Domestic-Outsourcing×Joint Task Ownership 0.09 (0.11)

Offshore-Insourcing×Joint Task Ownership 0.06 (0.17)

Offshore-Outsourcing×Joint Task Ownership -0.23 (0.10)***

Integration Glitches -0.40 (0.07)***

SCORE -0.09 (0.07)

R2 0.18 0.25 0.27 0.26 0.44

Δ R2 - 0.07*** 0.02*** - 0.18***

F 9.44*** 9.95*** 8.30*** 8.23*** 20.67***

N 725 671 671 687 671

*p < 0.10, **p < 0.05, ***p < 0.01; robust standard errors are in parentheses. Two-tailed tests are used for control variables, where no directionality is hypothesized, and one-tailed tests for hypothesized relationships.

Note: To conserve space and improve the readability of the above table, dummy variables for industry (Information Technology, Banking, Insurance, Healthcare, and Manufacturing) and project client location (North America) are omitted from the table. These variables are, however, included as controls in the analysis.


Table 4: Summary of Results

Hypothesized Relationships / Results

H1A: Boundary Spanning across Firm Boundaries or Country Boundaries → Integration Glitches: Not Supported

H1B: Boundary Spanning across Firm Boundaries and Country Boundaries → Integration Glitches: Partially Supported

H2A: Onsite Ratio → Integration Glitches: Supported

H2B: Moderating Effects of Onsite Ratio on Boundary Spanning → Integration Glitches: Partially Supported

H3A: Joint-Task Ownership → Integration Glitches: Supported

H3B: Moderating Effects of Joint-Task Ownership on Boundary Spanning → Integration Glitches: Partially Supported

H4: Integration Glitches → Project Performance: Supported

Figure 1: A Classification Scheme for the Types of Project Organization

[Figure 1 depicts a 2×2 classification grid:]

                 Intra-Firm             Inter-Firm
Intra-Country    Domestic-Insourcing    Domestic-Outsourcing
Inter-Country    Offshore-Insourcing    Offshore-Outsourcing

Figure 2: The Conceptual Framework linking the various elements of Work Design, Integration Glitches, and Project Performance


Figure 3a: Moderating Effects of Onsite Ratio Figure 3b: Moderating Effects of Joint-Task Ownership

Appendix Table A1: Measurement Items in the Survey Questionnaire

Project Performance [Please rate the success of this project relative to its goals: 1= Significantly Worse; 2 = Worse; 3 =Somewhat Worse; 4 = About Same; 5 = Somewhat Better; 6 = Better; 7 =Significantly Better]

• Adherence to schedule

• Adherence to budget

• Adherence to quality

• Technical performance

• Overall satisfaction

Project Organization Type [Select one of the choices which best reflects the organization of the project]

• Domestic-Insourcing: Firm assigns project tasks to an in-house project team

• Domestic-Outsourcing: Firm contracts project tasks to a vendor firm in the same country

• Offshore-Insourcing: Firm assigns project tasks to its division in a different country

• Offshore-Outsourcing: Firm contracts project tasks to a vendor firm in a different country

Onsite Ratio [Show % on a 0-100 scale]

What % of the project tasks were carried out by the project team at the project client site? _______

To what extent do you agree or disagree with the following statements about the project (1 = Strongly Disagree; 2 = Somewhat Disagree; 3 = Neutral; 4 = Somewhat Agree; 5 = Strongly Agree)

Integration Glitches • Task modules did not function reliably when they were first integrated

• Functionalities of the task modules were misaligned when they were first integrated

• Major difficulties were encountered during the integration of the task modules

• Significant rework had to be done to the task modules to improve integration

Joint-Task Ownership • Team members were assigned project tasks in pairs

• Team members pursued the practice of collective ownership of the project

Requirements Uncertainty • Client firm requirements fluctuated significantly at the start of the project

• Client firm requirements fluctuated significantly midway into the project

• Client firm requirements changed continuously throughout the project

• Client firm requirements remained stable throughout the project†


Past Experience • Team members had worked on similar projects in the past

• The project manager had past experience of managing projects of similar scope/size

• Team members had dealt with user firm requirements of similar type in past projects

• The project manager had past experience of working in a similar project organization

Task Module Complexity • The task modules in the project were highly interdependent

• The project consisted of a large number of task modules

Architectural Uncertainty† • The project could be easily divided into task modules

• Interdependencies across task modules were clearly defined

• It was easy to define the interdependence among task modules in the project

Table A2: Descriptive Statistics and Correlation Matrix

Variables 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22

1 ln(Team Size) 1.0

2 Project Budget .56 1.0

3 Requirements Uncertainty .10 .17 1.0

4 Hardware .03 .11 .01 1.0

5 Software -.07 -.13 .00 -.76 1.0

6 Past Experience -.07 -.14 -.12 -.05 .07 1.0

7 ln(Proj. Mgmt. Exp.) .09 .13 .01 -.08 .09 .06 1.0

8 Project Manager -.08 -.06 -.02 -.04 .04 .03 -.02 1.0

9 Senior .01 .02 -.04 .06 -.01 -.04 .14 -.67 1.0

10 Client .08 .01 -.17 .04 .02 -.04 .00 -.09 .12 1.0

11 Vendor -.16 -.10 .17 -.02 .03 .02 -.07 .12 -.11 -.75 1.0

12 Task Module Complexity .29 .29 .17 .05 -.06 .04 .04 -.05 .00 -.01 -.03 1.0

13 ln (Duration) .44 .59 .21 .13 -.13 -.21 .07 -.04 .02 .02 -.04 .22 1.0

14 Development -.04 -.01 -.14 .08 -.30 -.02 -.06 .02 -.02 .02 -.04 -.05 -.03 1.0

15 Architectural Uncertainty -.13 -.12 -.01 -.02 -.02 .10 .01 .05 -.03 -.02 .00 .04 -.14 .02 1.0

16 Domestic-Outsourcing -.04 .08 .01 -.11 .02 -.03 -.02 .01 -.04 .12 -.16 .00 .01 .03 .08 1.0

17 Offshore-Insourcing .02 -.03 -.06 .05 .00 .02 .02 .06 .00 .17 -.14 -.03 .01 .02 .01 -.15 1.0

18 Offshore-Outsourcing .19 .03 -.02 .03 .03 -.06 .01 -.04 -.01 .29 -.28 .01 .01 -.06 -.02 -.23 -.14 1.0

19 Onsite Ratio -.01 .01 .00 -.03 .00 .00 -.01 -.05 .05 -.17 .10 .05 -.04 .03 -.02 -.03 -.14 -.23 1.0

20 Joint-Task Ownership .01 .03 -.02 -.04 .00 .17 .01 .07 -.03 -.03 .07 .06 -.01 -.04 .02 -.05 -.09 -.04 .05 1.0

21 Integration Glitches .19 .13 .15 .05 -.01 -.26 -.02 -.09 .05 .21 -.21 .04 .22 .00 -.16 -.03 .08 .19 -.17 -.24 1.0

22 Project Performance -.09 .01 -.06 -.02 -.05 .22 -.03 .15 -.08 -.20 .16 .05 -.09 .04 .01 .00 -.10 -.21 .13 .35 -.51 1.0

Mean 2.80 3.51 2.99 0.26 0.60 3.75 2.28 0.72 0.13 0.30 0.57 3.76 2.37 1.29 2.42 0.20 0.09 0.17 0.00 2.74 2.71 4.35

Std. Dev 0.93 1.39 1.07 0.44 0.49 0.87 0.63 0.45 0.34 0.46 0.50 0.84 0.78 0.46 0.85 0.40 0.28 0.37 1.13 0.97 1.09 1.27

Minimum 0.69 1.00 1.00 0.00 0.00 1.00 0.00 0.00 0.00 0.00 0.00 1.00 -0.69 1.00 1.00 0.00 0.00 0.00 -1.96 1.00 1.00 1.00

Maximum 6.40 5.00 5.00 1.00 1.00 5.00 3.69 1.00 1.00 1.00 1.00 5.00 4.79 2.00 5.00 1.00 1.00 1.00 1.23 5.00 5.00 7.00

N 830 745 830 830 830 830 830 830 830 830 830 830 830 802 830 830 830 830 830 707 830 830

|ρ| ≥ 0.06 significant at 0.10 level, |ρ| ≥ 0.07 significant at 0.05 level, |ρ| ≥ 0.09 significant at 0.01 level

Note: As in the case of Table 3, to conserve space and increase the readability of the correlation matrix, we dropped dummy variables relating to industry (Information Technology, Banking, Insurance, Healthcare and Manufacturing) and project client location (North America) from the matrix.

† represents reverse coded items


Table A3: Results from the Endogenous Treatment Effects Model

Dependent Variable: Integration Glitches
(Constant) 2.69 (0.37)***
Joint Task Ownership -0.19 (0.04)***
Onsite Ratio -0.12 (0.03)***
Intercountry 0.79 (0.31)***

Dependent Variable: Intercountry
(Constant) -0.31 (0.23)
Hardware 0.61 (0.20)***
Software 0.50 (0.19)***
Average Work Experience -0.22 (0.04)***

-Log-Likelihood 1268.55
Wald Chi-Square 197.29***
Sample Size 667

*p < 0.10, **p < 0.05, ***p < 0.01. Note: Control variables were included in the analysis but are not shown above.

The Control Function Approach

Let Y1 denote the response variable, Y2 the endogenous explanatory variable, Z1 a vector of exogenous variables, and U1 the disturbance term in predicting Y1. Y1 is therefore represented as follows:

Y1 = Z1*δ1 + α1*Y2 + U1 ------- (1)

Further, let the endogenous variable Y2 be predicted from Z2, a vector of exogenous predictors that includes instrumental variables, and let V2 denote the disturbance term in predicting Y2. We represent Y2 as follows:

Y2 = Z2*π2 + V2 ------- (2)

In the above scenario, endogeneity of Y2 arises only when U1 is correlated with V2. Representing U1 as a

function of V2, we have

U1 = ρ1V2 + e1 ------- (3)

Substituting (3) into (1) gives the following equation

Y1 = Z1*δ1 + α1*Y2 + ρ1V2 + e1 ------- (4)

where V2 is now a predictor and e1 is uncorrelated with Y2 and V2. Since V2 is not directly observed, we replace it in (4) with its estimate, V̂2, the OLS residuals from the first-stage regression of Y2 on Z2:

Y1 = Z1*δ1 + α1*Y2 + ρ1*V̂2 + error ------- (5)

The new error term in (5) depends on the sampling error in π2. As prior studies (Wooldridge 2002, Heckman and Robb 1985) note, standard results on two-step estimation indicate that OLS estimation of (5) will be consistent. This estimation is referred to as control-function estimation, since the inclusion of the residuals controls for the endogeneity of Y2 in (1). Garen (1984) provides a practical extension of this approach wherein the effect of the endogenous explanatory variable Y2 on Y1 is heterogeneous. This implies that the disturbance term, error, is likely to vary with the level of Y2. To parse out this heterogeneous effect in (5), the residual term V̂2 is multiplied by Y2 and included in (5). The revised model specification shown in (6) is then estimated using a generalized least squares (GLS) approach with robust standard errors to account for heteroskedasticity.

Y1 = Z1*δ1 + α1*Y2 + ρ1*V̂2 + ρ2*Y2·V̂2 + error ------- (6)
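The two-step estimation in (2), (5), and (6) can be sketched with simulated data. The data-generating process, coefficient values, and variable names below are illustrative assumptions for the sketch, not quantities taken from the study; the point is only to show that the residual-inclusion step recovers α1 when naive OLS does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Illustrative exogenous variables: z1 enters both equations,
# 'instrument' enters only the first stage (part of Z2).
z1 = rng.normal(size=n)
instrument = rng.normal(size=n)

# Correlated disturbances make Y2 endogenous in (1): U1 = 0.5*V2 + e1
v2 = rng.normal(size=n)
u1 = 0.5 * v2 + rng.normal(size=n)

# Structural equations (assumed coefficients):
# (2) Y2 = Z2*pi2 + V2, and (1) Y1 = Z1*delta1 + alpha1*Y2 + U1
alpha1_true = 0.7
y2 = 1.0 * z1 + 2.0 * instrument + v2
y1 = 1.5 * z1 + alpha1_true * y2 + u1

def ols(X, y):
    """Least-squares coefficients and residuals."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta, y - X @ beta

ones = np.ones(n)

# Naive OLS of (1): biased upward because Y2 is correlated with U1.
beta_naive, _ = ols(np.column_stack([ones, z1, y2]), y1)

# Step 1: regress Y2 on Z2 and keep the residuals V2_hat, as in (2).
_, v2_hat = ols(np.column_stack([ones, z1, instrument]), y2)

# Step 2: include V2_hat as a control, as in (5); Garen's extension (6)
# also includes the interaction Y2*V2_hat to capture heterogeneity.
X_cf = np.column_stack([ones, z1, y2, v2_hat, y2 * v2_hat])
beta_cf, _ = ols(X_cf, y1)
alpha1_cf = beta_cf[2]  # control-function estimate of alpha1
```

With this setup, `beta_naive[2]` overstates α1 (the bias is roughly cov(U1, residualized Y2)/var(residualized Y2)), while `alpha1_cf` is close to the assumed value of 0.7. A full implementation would also correct the second-step standard errors for the sampling error in V̂2, as the appendix notes.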