

Running head: DISTRICT LEADERSHIP FOR COMPUTER DATA SYSTEMS

District Leadership for Computer Data Systems:

Technical, Social, and Organizational Challenges in Implementation

Vincent Cho, Ph.D.

Department of Educational Administration and Higher Education

Boston College

Jeffrey C. Wayman, Ph.D.

Wayman Services, LLC

The authors would like to thank The Spencer Foundation for funding the project from which this

article comes. We also wish to extend special thanks and admiration to all of the educators in

our three study districts, not only for their assistance in our project, but for their commitment to

education.

Paper prepared for the 2013 UCEA Convention, Indianapolis, IN. Please address all inquiries to

[email protected].


Introduction

Leading a school district is hard work. District leaders face a multitude of concerns to

juggle and decisions to make. For example, Corcoran, Fuhrman, and Belcher (2001) provide one

glimpse into these challenges. They describe how district leaders struggle not only with

addressing immediate concerns (e.g., what work to do and how to coordinate it), but also matters

around long-term success (e.g., handling leadership turnover, being responsive to feedback).

Although the logistical issues (e.g., space, budget, labor contracts) involved in district change

may seem obvious to practitioners (Palandra, 2010), the work of district leadership is not simply

technical. Rather, many challenges are “people issues.” For example, district leaders may need to

create legitimacy in the minds of others around change, develop a new common language around

work practices, and rethink processes for supporting educators’ professional growth (Stein &

D’Amico, 2002; Wayman, Jimerson, & Cho, 2012).

Meanwhile, test-based accountability policies have increased the pressures on educators to

be “data-driven” in their decision making (Booher-Jennings, 2005; Valli & Buese, 2007; Wills &

Sandholtz, 2009). Amid these pressures, computer data systems have promised to assuage some of the

burdens associated with meeting these demands. Such systems serve as conduits for how schools

gather, analyze, and distribute a wealth of data about students (Tucker, 2010; Wayman, Cho, &

Richards, 2010). Accordingly, data systems have become a standard component in supporting

data use (Burch & Hayes, 2009; Hamilton et al., 2009; Marsh, 2012). Enthusiasm for investing in

such systems has grown not only in school districts, but at the state and federal levels (Ash,

2013).


Further complicating matters, ensuring the use of a data system might not be as easy as

purchasing one. Although access to these systems has increased, simply affording access has not

necessarily resulted in system use (Means, Padilla, DeBarger, & Bakia, 2009). Data systems can

be rejected, used for unexpected purposes, or used for only a narrow subset of possible

applications (Cho & Wayman, in press). Indeed, the “people issues” associated with technology

implementation may catch some central office leaders off guard. For example, Brooks (2011)

argues that district leaders may confuse technological advancement with educational progress,

thus clouding their sense for whether teaching and learning are really improving. Research

outside of education suggests that managing technology implementation involves not only

technical work, but also consideration of the social and organizational factors affecting use

(Leonardi, 2009b; Orlikowski & Barley, 2001).

Educational scholars, however, have yet to chart out the ways in which central offices

might navigate such disparate demands upon their work. Gaps in knowledge exist not only

around the key issues that district leaders might contend with during implementation, but also

around the strategies they might employ toward supporting system use. Accordingly, the purpose

of this comparative case study is to explore the work of central office members in implementing

computer data systems. We were guided by three research questions. First, what do central office

members hope to gain from data system implementation? Second, what social or organizational

considerations influence central office implementation work? Third, what strategies do central

offices employ toward data system implementation?


Conceptual Framework

We drew upon a sensemaking framework in order to better understand the work of

central office members in implementing computer data systems. This framework contributed to

our data collection and analysis in two ways. First, sensemaking helped to ground our views on

problem solving in organizations. Second, sensemaking sensitized us to specific issues that could

be at play in technology implementation. Although this view has much traction in the

information systems (IS) and management information systems (MIS) fields, educational

scholars have been slower to conceptualize technology in this way.

Sensemaking and Problem Solving

Sensemaking affords entry into researching a variety of issues in educational

organizations. At the heart of this theoretical lens is the notion that interpretive processes shape what

people do: what people see and think about the world influences their decisions about how

to act in that world. In this view, people and organizations can be likened to information

processors, constantly assessing and acting upon a host of signals about themselves, each other,

and their environments (Lichtenstein, 2006; McDaniel, 2004). Although such signals are

everywhere, individual people and organizations have bounded understandings about

their worlds (Levinthal & Warglien, 1999; Rivkin & Siggelkow, 2002). Each sees and is able to

process only a small, subjective slice of what is out there. Thus, differences in access to

information, training, and practical experience can influence how well people solve problems

(Brown & Duguid, 1991; Carlile, 2002; Weick, 1993).

Accordingly, sensemaking draws attention to a variety of social and organizational issues

influencing education. For example, Weick (1976) highlights how the notion of loose coupling

could shed light on how innovative ideas spread, how educational organizations adapt to new


circumstances, and how professional autonomy influences teacher work. More recently, Dorner

(2011) describes how factors such as time, place, and family circumstances can influence how

immigrant parents make decisions about enrolling their children in bilingual education. Others

have described how policy outcomes can be influenced by interpretive processes in central

offices (Datnow, 2006; Spillane, Reiser, & Reimer, 2002), by interactions among building-level

educators (Coburn, 2001; Palmer & Snodgrass Rangel, 2011; Young, 2006), by relationships

between central office and school personnel (Honig & Venkateswaran, 2012), or by relationships

with outside organizations (Honig, 2006).

Implications for Technology Implementation

A sensemaking framework can help to illuminate specific issues about central office

technology implementation. These include: (a) how central offices approach the work of

planning and implementation and (b) how central offices conceptualize technology itself.

Approaches to planning and implementation. As a way to understand central office

work, sensemaking highlights that district leaders’ interpretive frames have bearing on what they

do in practice. As Weick (1993) observes, how people make sense of phenomena influences

what they come to define as problems to handle. In this way, leaders may make all the “right

decisions” for the problems they see, but still go wrong because they have misidentified the real

problems at stake. Accordingly, a sensemaking framework calls attention not only to what

actions leaders take, but also what worldviews and information led to those actions.

There are many ways in which district leaders might orient to the work of planning and

implementation. The conventional view of organizations is that they are rationally organized.

The most familiar examples of this view are Scott and Davis’s (2007) rational systems perspective

and Bolman and Deal’s (2008) structural frame. Influenced heavily by the work of Frederick


Taylor and Max Weber, rational systems perspectives emphasize bureaucratic structures,

aligning procedures to goals, and articulating the division of labor. In other words, they assume

that people in organizations act rationally—that when they know the “right” thing to do, they

will do it. Thus, the machine becomes a useful metaphor for understanding organizations

(Morgan, 1986). Organizations are seen as becoming more efficient via the forethought and

planning of leaders.

Rational systems perspectives rely on carefully drawn plans to determine how work is

to be done, a reliance that can be seen as both a source of strength and a limitation. Rational systems

perspectives have proven effective for understanding work in environments with predictable

challenges (e.g., production lines), but less so for contexts prioritizing workers’ professional

judgment (Davenport & Prusak, 1998; Pfeffer & Sutton, 2000). Organizations do not perform

well simply because of leaders’ and workers’ capacities to rationalize procedures. Bolman and

Deal’s (2008) three other frames (i.e. human resources, political, symbolic) provide examples

from the popular literature of alternative views for understanding organizational success. Open

systems perspectives (Scott & Davis, 2007) and the literature on sensemaking (e.g., Weick,

1976) provide another set of alternative views. On the whole, alternative ways to think about

organizations emphasize bottom-up “people issues” such as organizational culture, political

dynamics, or values and attitudes about work and the workplace (March & Olsen, 1984; Morgan,

1986; Pfeffer, 1997). In other words, although non-rational perspectives do not necessarily

negate planning, they do recognize that plans in their objective, ideal forms do not always

capture all real world demands.

The upshot for district leaders is that they have options when orienting toward the work

of planning and implementation. Although some might lean toward rational, structural


approaches to planning, others might see implementation as a co-constructed endeavor (Datnow,

2006). For example, planners who are concerned about the limits of conventional planning

approaches might formalize approaches to collect feedback about their work (Eisenhardt, 1990;

Thomas, Sussman, & Henderson, 2001). For one, this would allow leaders to address

problems in real time, even those problems they have never before encountered or could not

have predicted (Levinthal & Warglien, 1999; McDaniel, 2004; Weick & Roberts, 1993). For

another, it would improve the nature of the solutions applied to those problems.

Sometimes the “right” thing to do is not obvious. Problems, however, become more intelligible

(and remedies more robust) when people talk to each other about what they see (Brown &

Duguid, 1991; Carlile, 2002; Edmondson, 2003).

Conceptualizing technology. How district leaders conceptualize technology

also factors into their implementation of computer data systems. Although educational scholars

have recognized that interpretive processes shape how educational data are used (Coburn, Honig,

& Stein, 2009; Hamilton et al., 2009; Young, 2006), few have thought about computer data

systems in this way (for an exception, see Cho & Wayman, in press). In light of the foundational

role data systems can play in educators’ access to and analysis of data (Tucker, 2010; Wayman et

al., 2010), this blind spot is especially glaring.

As with planning and implementation, district leaders have choices when it comes to how

to orient toward technology. Different orientations come along with different assumptions about

how to approach the work. For example, technologically deterministic perspectives assume that

technologies are tools. Their simple presence is assumed to drive change in and of itself (Brooks,

2011). In this way, veils are cast over how (or even if) a technology is actually used in practice

(Orlikowski & Barley, 2001; Orlikowski & Iacono, 2001). Thus, central offices that are


technologically deterministic in their approach to data systems might merely focus their work

on technical or logistical matters, such as adopting the “right tool” for the job. Because

technologically deterministic views assume that tools are predestined to accomplish their

designers’ intents, they also assume that implementation work stops once people have access to

the tool (Leonardi, 2009a). In this way, technological determinism can be seen as another

manifestation of rational systems thinking. If organizations are seen as big machines, then

machines are assumed to seamlessly plug into or replace activities in those organizations (Barley,

1990).

On the flipside, technologies can also be understood as socially constructed; organizing

can be seen as a matter of sensemaking. In other words, what a technology is or is good for may, at

least in part, be the result of what people have come to believe about the technology (Cho, Ro, &

Littenberg-Tobias, 2013; Leonardi, 2012). In this way, local histories, narratives, and work

interests help to shape the life a technology takes on in the workplace (Leonardi, 2009b). The same

technology could be understood and used in vastly different ways (Cho & Wayman, in press;

Pinch & Bijker, 1984). Thus, a central office aiming to resist technological determinism might

approach implementation as not simply a technical, but also a social endeavor. For example,

professional development might consequently address not only the technical side of using a data

system, but also impressions about the importance of a system to work or work relationships.

Similarly, a central office might be careful to think about a data system not only in its

rationalized, ideal state, but also as a technology whose fit into the local context is yet to be

understood. Such a central office might gather feedback about how technologies are actually

being used or otherwise adapt its implementation strategies over time.


Methods

This comparative case study (Merriam, 2009; Yin, 2009) draws upon data from a larger

study of three school districts. This larger study aims to uncover what patterns might be

generalized about districts and their efforts around data use. Accordingly, the districts in this

study varied in size, demographics, and academic achievement. Below, we describe our methods

of data collection and analysis for the present paper.

Data collection was aimed at capturing how central offices saw and approached data

system implementation. Because successful implementation involved how campus

administrators and teachers used and perceived this work, they were also included in data

collection. The bulk of data collection took place over an 11-month period from March 2010 to

January 2011. This provided a sense for implementation as it transpired over time. Data sources

included interviews (i.e. individual interviews and focus groups) and observations. In total, there

were 82 participants in either individual interviews or focus groups. Generally, central office

members participated in interviews individually and campus-level educators participated in focus

groups. Semi-structured protocols helped to provide conversations with a focused, but flexible

agenda (Merriam, 2009; Miles & Huberman, 1994). Lines of inquiry included: districts’ efforts

around data use, districts’ efforts around data systems, and the contributions of particular data

systems to work.

Observational data helped to provide a first-hand sense for how districts were introducing

and dealing with computer data systems. The venues included trainings for computer data

systems, leadership events (e.g., principals’ meetings), and meetings of central office planners.

At each field experience, comprehensive sets of jottings were collected that were later

expanded and knit into more detailed field notes (Emerson, Fretz, & Shaw, 1995). Fifteen


observational sessions were conducted, each lasting from one to several hours. Data

analysis followed the recommendations of Miles and Huberman (1994). Initial analysis began

with a start list of codes, which stabilized over time.

Findings

We drew upon interview and observational data from three school districts to explore the

work of central offices in implementing computer data systems. First, we describe what central

office members hoped to gain from data system implementation. Second, we describe the social

and organizational factors influencing their work. Third, we describe some of the specific

strategies they took toward implementation.

Hopes from Data Systems

One way to understand central offices’ implementation work is to examine the hopes and

aims behind such efforts. Doing so provides not only a clearer picture of what technologies are at

stake during implementation, but also what significance such technologies have to the people

leading their implementation. Our findings highlight that district leaders oriented toward data

systems in two ways. On one hand, we found that central office members drew upon the history

of past implementation efforts as a resource for understanding present systems. On the

other hand, we also found that this history was often linked to the larger ecology of technologies

in the district. Other technologies helped to frame what features central office members came to

see as valuable about present systems. This pattern held true even though one district eventually

chose to abandon efforts to pursue data system implementation.

Together, these findings demonstrate that computer data systems are not simply “tools”

or “instruments” devoid of context. Instead, context helped to shape what characteristics became

important to central offices.


Musial technology context. The centerpiece of Musial’s efforts around data use was the

Front End system. The history of computerization in Musial played a role in district leaders’ clear

enthusiasm for the system. Front End did something in Musial that was unprecedented: it

provided teachers with direct access to student data. In previous years, site licensing issues had

negated the potential for direct, across-the-board access to data. Thus, teachers had had to rely on

other staff to generate reports from the various district systems. In fact, the actual "data systems"

before Front End were email and Excel spreadsheets. Staff emailed reports to teachers

periodically with the spreadsheets attached. Each sheet contained a plethora of data points for all

the hundreds of students in a given school; much of the analysis and sorting was left up to

teachers.

Not surprisingly, Musial leaders came to feel that these practices were inefficient. Thus,

the goal behind Front End became to ensure that teachers had access to the “right data” in timely,

user-friendly, and interpretable ways. Representing an in-house collaborative effort between the

technology department and the accountability department, Front End was designed to provide a

single interface into data from various computer systems at once. For example, central office

members were especially positive about how Front End offered access to attendance, tardies,

discipline, TAKS scores, and district benchmark scores. Users were able to view these data

points at the classroom level, as well as the individual student level. In turn, these data could be

exported as PDFs or Excel spreadsheets, depending on the actual system “behind” Front End. In

this way, central office members understood Front End’s particular features as advancing the

district toward achieving district-wide goals. As one central office member explained, “[Front

End] gives folks an opportunity to really be aware of who their students are, relative to the ways

that the accountability system looks at students and student performance.”
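Though our data do not extend to Front End’s source code, the design participants described (a single interface querying several backend systems at once and merging the results into one view) resembles a classic facade architecture. The Python sketch below is purely illustrative: the backend classes, field names, and values are our own hypothetical stand-ins, not Musial’s actual systems.

    # Illustrative sketch of a Front End-style facade; the backends and
    # fields here are hypothetical, not the district's implementation.

    class AttendanceSystem:
        def lookup(self, student_id):
            # Stands in for a query against a separate vendor database.
            return {"absences": 3, "tardies": 1}

    class AssessmentSystem:
        def lookup(self, student_id):
            # Stands in for state test and district benchmark stores.
            return {"state_test_score": 2150, "benchmark_pct": 78}

    class FrontEndFacade:
        """Single point of access that fans out to each backend system."""

        def __init__(self, backends):
            self.backends = backends

        def student_profile(self, student_id):
            # Merge each backend's slice of data into one combined view.
            profile = {"student_id": student_id}
            for backend in self.backends:
                profile.update(backend.lookup(student_id))
            return profile

    facade = FrontEndFacade([AttendanceSystem(), AssessmentSystem()])
    print(facade.student_profile("S-001"))

The appeal of such a design, as participants described it, is that users face one interface while exports (e.g., to PDF or Excel) can still be delegated to whichever system actually holds the data.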


Gibson technology context. The Gibson district was in the midst of significant change.

On one hand, it was transitioning from an older assessment system to a newer one called

Flightpath. On the other hand, it was also attempting to implement an entirely new kind of

system: Dashboard Central. Dashboard Central was intended to serve as a data warehouse, thus

integrating a variety of data from disparate systems.

History (or the lack of history) also played a role in how Gibson central office members

understood Flightpath and Dashboard Central. For Gibson leaders, Flightpath had a clear sense

of history and trajectory surrounding it. In general, the Flightpath interface was not new to the

district. Flightpath had previously been used for the purposes of teacher appraisal, and only the

functions relating to student assessment data were new. District leaders felt that Flightpath gave

them what they wanted, which was data about student expectations (SEs). Central office

members reported especially liking that Flightpath reported SE data for individual students and

classrooms, as well as recommendations about how to group students for additional support or

tutoring according to SEs.

In contrast, the Dashboard Central system was new to the district and in its first year of

implementation. To some extent, this lack of history gave it some novelty. Specifically, central office members saw

Dashboard Central as a way to integrate the district’s various disparate systems (i.e. Flightpath,

other assessment systems, student information systems, and human resources systems). Although

implementation troubles meant that central office members had not had much opportunity to test

out the system in practice, they were excited about the idea of Dashboard Central. One central

office member described Dashboard Central as more than just “up or down on a test.” Others

suggested that the system would improve the “safety net” and educators’ insights into the “whole

child.” Another central office member summed up:


Without Dashboard Central, we were still looking at little bits in isolation and having to

generate all this information from all these different venues, which took a lot of time just

to assess what a kid needs, what’s been done for them, what’s working, what’s not

working. Now you can sit a group of people down and use Dashboard Central to pull all

of the information and see it right there in one place.

Although perceptions about Dashboard Central did not negate the importance of SEs, it

was the system’s promised ability to tap into the broader range of technologies in the district that

district leaders found exciting. For example, Dashboard Central promised the ability to integrate

human resources (i.e. teacher) data with other systems. One central office leader explained this

contribution in terms of the potential to correlate teacher professional development data with

student achievement data. Another central office leader liked that Dashboard Central was web-

based, and thus allowed principals to access data via mobile device. He envisioned principals

conducting classroom observations, but using mobile devices to get a sense for the teacher’s

work records (i.e. attendance) or details about the particular students that happened to be in the

classroom at that moment.

Finally, it should be noted that although district leaders were attracted to Dashboard

Central’s novel uses, some had trouble envisioning what the system might mean for teachers'

everyday work. Even for the central office members who had helped select the system, there was

ambiguity about how people ought to be using the system in practice. For example, one

lamented, “We had a vague idea that we needed something that would answer some questions,

but we didn’t know what the questions were.” When another was asked about the purpose of the

system, the response was, “I haven’t been able to get an answer [about Dashboard Central’s

purpose].” Murkiness around the system’s purpose was not lost on campus-level educators. For


example, several campus administrators reported being unimpressed that Dashboard Central only

offered student attendance and state test information. They felt that they were already getting

these data via Flightpath.

Boyer technology context. At the opening of data collection, Boyer district leaders were

strongly interested in obtaining an integrated data system. As will be described in later sections,

these plans were eventually abandoned and other supports for data use were put in place. The

seeds of this decision, however, can still be traced to district leaders’ sensemaking about data

systems. In particular, district leaders were content with what they were getting from their data

systems, and dissatisfied only with their lack of integration.

For example, central office members saw the Flightpath and PM Plus systems as

contributing well to data use. Indeed, they saw the systems as balancing each other’s

strengths and weaknesses. To Boyer district leaders, Flightpath handled standardized test data.

As one Boyer district leader explained, Flightpath was “very good for [state test] data and for the

benchmarks we give.” Notably, assessments in Flightpath were delivered only intermittently.

In contrast, PM Plus was an assessment system offering short, biweekly "learning

probes" to monitor student performance. Accordingly, central office members described this

system as offering a “really good, comprehensive look at the child.” Another described how the

system was good for “exactly identifying” students’ weaknesses in specific areas (e.g., phonics,

reading fluency, math computation, math applications). For example, one central office member

explained:

You can see patterns in yearly trends. You can look back at a timeframe and intervention,

then see in a trend line if it really worked. There are lots of pieces of information and

quality data for support teams to plan around.


An added bonus to PM Plus was its reporting of data in relation to national norms. One

central office member reported that teachers found this data “reassuring and interesting,” because

these data showed many Boyer students to be exceptional.

If central office members had a complaint about their data systems, it was their lack of

integration. For example, one central office member called Boyer’s systems a “hodge podge,

where there is data, but there is nothing that really consolidates it or that correlates that

information.” Another central office member stated, “What’s missing is a complete picture of the

kid in one place... They must pull, pull, pull.” Relatedly, several central office members admitted that

it could be hard to pull data because it was hard to remember how to log in or use so many

systems at once. In this way, district leaders saw obtaining an integrated data system as a way to

make life a little easier. Unlike the other districts, Boyer did not associate any new system with

gains such as improved access or analyses. The promise of any new system was imagined only

as a reduction of burdens at work.

Social and Organizational Considerations

Setting aside technological considerations, another way to understand central office

implementation work is to examine the social and organizational forces around system

implementation. Each district had a unique set of “people issues” (e.g., histories, bureaucratic

structures, norms) that helped set parameters around how central offices went about their jobs.

From a sensemaking perspective, these real world considerations helped to orient district leaders

around how to go about their work. Of the three districts, Musial was the most mindful of such

issues, consciously working to address them. In contrast, Gibson leaders tended to take such

issues as a given part of central office life, without working to address them. Meanwhile, the


Boyer district seemed to lack the social momentum that might have carried a data system

initiative forward.

Social and organizational terrain in Musial. The Musial district was very aware that

social and organizational forces could influence data use in the district. Its attention to these

forces seemed to lay some of the groundwork for implementing Front End. In particular, district

leaders worked hard to create a district culture oriented toward state achievement data. At the

same time, district leaders took steps to support data use by bridging department boundaries.

Although both sets of activities were primarily bureaucratic in nature, this did not preclude their

potential to influence how a system such as Front End might be perceived.

The drive to promote achievement data. Musial central office members saw improving

the use of achievement data as a foundational dimension of their everyday work. In Musial, state

test results were the main measure of achievement. Throughout our interviews and observations,

Musial central office members were adamant that state testing was the major vehicle by which

the district would accomplish its formal mission of serving “all students.” Speaking about the

importance of state testing, however, one central office member lamented, “There are just a lot of

folks that continue to struggle with the idea that all students can learn at high levels.”

Accordingly, schools failing to “move” their scores were described as “needing a culture

change.”

District leaders tended to see these values and beliefs as something that could be affected

positively via bureaucratic consequences and authority. As one central office member explained,

“You can’t cause an attitude to come into being, but you can address behaviors.” In such

instances, the school improvement office was assigned to work with those schools. This work was

described thus: “until a campus understands that [state test achievement is] what’s expected…


and if they don’t achieve it, then there are going to be all kinds of people in there trying to help

them, and to fix them, and to make them do things.” Those activities included direct involvement

at schools around things like hiring and firing, as well as classroom observation and supervision.

In short, life in the Musial central office was characterized by a drive to promote the role of

achievement data in schools. Thus, when it later became time to implement Front End, one

central office member described the climate as “you will use this.”

Bridging department boundaries. The Musial central office made conscious efforts to

evaluate the ways in which bureaucratic structures were aligned toward supporting data use. As

one district leader put it, “Bureaucracy done properly can help you think, offer control points,

and do post mortems.”

This attention to formal organizational structures around data use had been sparked by

out-of-district tours that had taken place a few years previously. The purpose of these tours had

been to gather knowledge from outside of the district about effective ways to support student

achievement. Accordingly, a team was sent to districts throughout the state that had student

demographics similar to Musial’s, but that were outperforming Musial on the state accountability test.

Among other things, these visits brought home to Musial the importance of supporting data use.

In the words of one central office leader, “You know, I think that was one of the first events

where all of a sudden we had a driving force… to say ok, well, this data is important.”

Additionally, the out-of-district tours had impressed upon district leaders the pitfalls of

fragmentation among central office departments. One central office member recalled how one of

the model districts in particular had worked to reduce such divisions. He elaborated, “We heard

other people saying, ‘Okay, you can’t have a rift between what the campuses are actually asking

for and what curriculum is saying they need.’ They have to be in agreement with one another.”


Accordingly, Musial redesigned its bureaucratic structures toward better supporting data use by

emulating some of the things it had seen during the tours.

This culminated in a new job position within its school improvement office: associate

director for data use. Previously, technology work and expertise had been housed within the

technology department. This new position, however, would serve the whole district and be

geared toward student achievement. Further, the district formally advertised for someone with

expertise in three areas: instruction, data use, and computer data systems. One central office

leader described the new job role as “straddling the assessment, accountability, and information

processing aspects of district work.”

By all accounts, Musial got what it was looking for and more. Susan (a pseudonym), the new associate

director for data use, was commonly described by campus administrators and other central office

members as a valuable addition to the district. At the level of formal bureaucracy, Susan was

described as being adept at navigating department boundaries and running “ideas up the chain”

of command. In the words of one central office leader:

Most of the data processing and assessment or accountability folks in this district are not

educators. There’s a kind of schism between what principals might think they want, but

not be able to do, and what a person over in [another department] might interpret that

report or electronic platform to look like. So we tried to get a job that would be able to

interface between those two different, unrelated divisions. That was really the goal.

In her own words, Susan described the importance of boundary spanning:

You need to be an educator. You need to have worked in the classroom, and you need to

understand clearly the connection between instruction, assessment, and how data drives



instruction. You need to be able to work well with teachers and principals. You need to

have communication skills. You need to be able to talk to the technical people and be

able to say this is what our teachers need… Above all, you have to be an advocate for

teachers, because all the research tells us that teachers impact student achievement the

most. And you’ve got to take all the technical jargon off the table, and you’ve got to take

all the political jargon off the table, and you’ve got to take the ‘who gets credit for

this initiative’ off the table. I don’t care who gets credit, I just want teachers and

principals using data.

It was also evident, however, that Susan’s role in the Musial district was not simply

formal. Rather, she also represented a change in the social sphere around computer data systems.

For example, central office members commonly noted her interpersonal skills when it came to

talking about Front End and data use. Although Susan was not officially under its authority, the

technology department saw her as “lending a face” to its work. Throughout the Musial

district, she was referred to as the “go-to person” when it came to data use. Indeed, one campus

administrator described Susan as “somebody that can talk the talk on a teacher level and an

administrative level—what does it mean, how to look at it.”

Social and organizational terrain in Gibson. The Gibson district experienced

significant difficulties when it came to implementing Dashboard Central. One of the seeds for

these challenges could be traced to the organizational structure of the Gibson central office. Another

of the seeds could be traced to technologically deterministic assumptions about data system

implementation.

Department “kingdoms” and boundaries. A commonly used phrase in the Gibson

central office was “keys to the kingdom.” It was used to describe who had authority and


oversight over different projects. Central office work around Flightpath and Dashboard Central

was treated as separate realms, each overseen by a separate central office department.

Specifically, the curriculum and instruction department oversaw Flightpath, while the technology

department oversaw Dashboard Central. Although both departments were under the supervision

of the deputy superintendent, the implementation work was kept separate in practice.

Geographically, the technology department offices were even located in the auxiliary services

building, while the curriculum and instruction department was in the main administrative

building.

From a rational systems perspective, such formal divisions of labor seemed quite

reasonable. After all, the kinds of work that Flightpath was geared toward were already

established within the curriculum and instruction department’s duties. In contrast, Dashboard

Central would draw from disparate systems from throughout the district, not all of them related

to instruction (e.g., human resources). Technological savvy was the technology department’s

jurisdiction.

Unforeseen, however, were the “people problems” that came along with this bureaucratic

arrangement. This was especially evident when it came to work with Dashboard Central. For

example, the technology department was unprepared to oversee the instructional side of

Dashboard Central use. Traditionally, such authority flowed through assistant superintendents,

not auxiliary services such as technology. One technology department member wished that

instructional experts were more involved in “decisions like what reports are important and what

do you want teachers to get out of this... I’m not a teacher. I think Dashboard Central is

important. It’s cool, but it might not be anything that anybody really needs.” Indeed, this view

was echoed elsewhere as well. In the words of another central office member, “Don’t you find it


odd that [the curriculum and instruction people] don’t know a lot about Dashboard Central?”

Once work around Dashboard Central got under way, the relative isolation of the technology

department and its work took its toll.

Feedback about “deployment.” Throughout our interviews and observations in Gibson,

the word for implementation was “deployment.” Indeed, the Gibson central office tended to see

implementation work as stopping once people had access to the tool. This technologically

deterministic view on implementation was not without drawbacks.

For example, it tended to mask the need for feedback about whether the system was being

used successfully. Lack of such feedback was most evident among higher ranking district

leaders. For example, although the reality was that campus educators did not have access to

Dashboard Central, one assistant superintendent considered Dashboard Central “deployment” to

be over and done. Further probed, this district leader admitted that not having personally

observed principals or teachers actually using the system. This was not considered worrisome

because “most of the time when I’m working with the principals they are presenting [their goals

and accomplishments] to me. They are not really [observed] at the point where they are digging,

but they tell me that it is very helpful.” Despite the fact that Dashboard Central deployment had

begun back in the summer, it was not until the end of the fall semester that district leaders

commonly realized that Dashboard Central might need to be “redeployed.” At that time, one

central office member reported:

I think we’re having to take a re-start and say, “Okay you’ve had a tool; you’ve had a

chance to play with it if you wanted to. Maybe we’ll do some more learning this spring.”

And then we’ll set, maybe collaboratively with the principals, some expectations for

usage in the future.


To be fair to the Gibson central office, our sense was that district leaders were generally

in touch with other happenings in the district, but that Dashboard Central had fallen into a blind

spot. For example, one central office member contrasted the district’s approach to Dashboard Central with its approach to

a recent math and technology education initiative. This initiative had included weekly campus

visits and conversations with educators. “We would walk the classrooms, observe, talk to kids if

the lesson lent itself to that. We’d talk to teachers then debrief at the end of each day about what

we saw.” Similarly, members of the curriculum and instruction department reported regular

contacts with campus-level educators, including around Flightpath. Campus educators were

using Flightpath without difficulty.

Social and organizational terrain in Boyer. In the end, the Boyer district decided not to

invest in an integrated data system. From a sensemaking perspective, two details help to frame

the lack of momentum around carrying a data system initiative forward. The first involved the

relatively high levels of achievement among students in the Boyer district. The district and its

schools regularly held the highest state accountability ranking (“exemplary”). Throughout the

Boyer central office, people took pride in this achievement. One central office member reported

that the very bottom quarter of students in Boyer still placed in the 50th percentile on nationally

norm-referenced tests. As one central office member summed up, Boyer students were “literally off

the charts smart.”

The second detail involved recent changes in school accountability policy in the state.

Although Boyer’s overall passing rates were outstanding, the new policies would require a

stronger focus on demographic subgroups. Specifically, a handful of “economically

disadvantaged” students would need to score “commended” (the highest level) on the state test in

order to preserve the district’s exemplary rating. As one district leader worded it, “Now we have


to take a closer look at our lower socioeconomic students. We have to have 25 percent of them

reach the commended level in order for a campus or the district to be considered exemplary.”

According to another central office member, Boyer was already quite close to that goal. He

reported that the actual number of students needing to jump to the commended level was

approximately 20.

In short, these details help to frame the relative lack of pressure experienced by Boyer district

leaders toward adopting a data system. Although central office members had initially felt that an

integrated data system might help their students, they were not prepared for the financial costs of

such a system. During a cabinet-level meeting, a district leader explained that he had inquired with a

vendor about pricing, but the system was “unbelievably expensive.” In fact, a second district leader

exclaimed, “Damn!” upon hearing the price tag. A third district leader followed up by explaining

that having an integrated data system was an “absolutely great idea” but so “horribly expensive”

that the “emotional feeling and reality of the moment was, ‘we don’t have the money to do

that.’” In the end, although central office members could imagine some benefits to an integrated

data system, they had difficulty imagining whether the returns on that investment were worth the

dollars.

Strategies toward Implementation

In the previous sections, we have described the technical, social, and organizational

considerations that informed central office approaches to data system implementation. Below,

we describe the activities of that work. The implementation strategies employed by central

offices in this regard are revealing of sensemaking at two levels. At one level, they help to

show how central office members may draw upon a variety of signals in their environment

toward arriving at implementation decisions. For example, although the Boyer district did not


fund a computerized system, it did decide to fund instructional coaches as an alternative strategy

for data integration.

At a more sophisticated level, central offices’ activities also reveal the ways in which

sensemaking may have cumulative effects on data system implementation. For

example, Musial’s early mindfulness about norms and bureaucratic structures seemed to pay

dividends for its later work with Front End. On the flipside, Gibson’s relative inattention to such

matters seemed to create a snowball of troubles with Dashboard Central.

Musial implementation strategies. Musial’s early attention to norms and bureaucratic

structures seemed to set their later work up on good footing. Just as the district was comfortable

adapting bureaucratic structures early on, the approach to Front End was also designed to adjust

based upon real world feedback. Similarly, just as norms around data were a recognized target

for district work, some trainings around Front End seemed attentive to social expectations around

the system.

A long-term, adaptive plan. Of the study districts, Musial did the most work to situate its

data system within a long-term plan for data use. From a sensemaking perspective, the Musial

central office’s approach to planning was also most sensitive to the importance of adaptation and

feedback.

Continuing in her boundary spanning role, Susan, Musial’s associate director for data

use, collaborated with the director of the technology department to develop and gain approval for

a long-term plan for data use. Although the technology department had been working piecemeal

on some of the basic technologies in Front End, central office members reported that it was

Susan who finally brought focus and momentum to this work. The district plan had three phases,

to be adapted as implementation progressed. Phases 1 and 2 were observable during the time of


data collection. Phase 3, which involved purchasing an additional technology for managing

curricular content, did not take place during data collection.

Phase 1 of the plan was to expand access to data via Front End in spring 2010. One

central office member explained, “We don’t want them to focus on ‘where do I get the data,’ nor

on ‘how do I get the data.’ We want them focusing on how do I use the data. But the first step is

getting it.” During Front End’s development, Susan took several steps to improve upon the

technology department’s work. This included conducting user groups among teachers and

campus administrators. Indeed, one central office member characterized Front End as having

been “generated from the teachers’ desires.” During user groups, Susan emphasized that users’

participation helped ensure that what they “defined as most critical” to their work would become

reports or functions in Front End.

The work conducted during Phase 1 is notable in several ways. For example, it helps to

suggest that bureaucratic changes, such as creating a new job role, can spill over into changes in

overall process. Although the technology department had attempted to collect similar types of

feedback in the past, they described how well Susan could “translate” among the various

worldviews, sharpening teacher feedback or explaining technical limitations to teachers. In this

sense, creating the ability to hire someone like Susan helped to address “people problems”

around technical work. In turn, the district was better able to adapt to real world demands. In the

end, the Musial central office was unanimously satisfied with Front End’s design. In the words

of one central office member, “we hit it on the money.”

Musial’s inclination toward adaptation was also evident as Front End implementation

progressed. For example, the user group continued to meet even after Front End had been

introduced to teachers. In this way, the group provided feedback not only about Front End’s


initial design, but also about how that design might be adapted over time. Similarly, Susan

regularly ran and distributed reports about Front End usage rates. These reports were distributed

to other central office leaders. Although work around Phase 2 was just beginning at the close of

data collection, the trend seemed to continue. Phase 2 involved providing administrators with

Front End access, as well as providing a “beefier” assortment of data via the incorporation of

printer-scanner technologies. To this end, printer-scanners were piloted at select campuses. Before

expanding to the entire district, feedback about these technologies was to be collected online.
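Our data describe the usage reports mentioned above only at a high level. As a rough sketch of what such reporting might involve, the Python below tallies logins per campus from an access log; the log structure, campus names, and roster counts are assumed for illustration, not drawn from Musial’s actual reports.

    # Hypothetical sketch of a system usage report; the log format and
    # the figures are invented for illustration.
    from collections import Counter

    access_log = [
        {"campus": "Lincoln ES", "user": "t01"},
        {"campus": "Lincoln ES", "user": "t02"},
        {"campus": "Lincoln ES", "user": "t01"},
        {"campus": "Carver MS", "user": "t11"},
    ]
    staff_counts = {"Lincoln ES": 40, "Carver MS": 55}  # assumed rosters

    total_logins = Counter(entry["campus"] for entry in access_log)
    for campus, staff in staff_counts.items():
        users = {e["user"] for e in access_log if e["campus"] == campus}
        print(f"{campus}: {len(users)}/{staff} staff logged in "
              f"({len(users) / staff:.0%}); {total_logins[campus]} total logins")

Even a simple tally like this gives central office leaders the kind of real-world feedback about actual use that, as we saw in Gibson’s case, a “deployment” stance tends to mask.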

Sensemaking around Front End. Musial’s strategy for introducing teachers to Front End

had two sides. The two sides help to demonstrate how even in a single district, ideas can be

mixed when it comes to how to conceptualize about technology. The primary strategy for

introducing teachers to Front End was an “online training module” (i.e. PowerPoint text). In line

with rational systems perspectives, this strategy would seem to assume that if people are told the

"right way" to use a tool, then the tool will be used according to designers' intents.

Our interviews with teachers suggested that this strategy may have inadequately

addressed the need for shared sensemaking around Front End. Teachers reported that they did

much of the learning about Front End informally and not via the modules. In particular, several

teachers reported that word-of-mouth (e.g., calling “someone in from the hallway” for help) was

how they’d learned to use Front End. This was even evidenced in one focus group, when one

teacher described Front End’s sorting functions, and the others began to express surprise or to

ask questions about how to use them. Overall, teachers reported liking the idea of Front End, but

we found that they used only a small share of the functionalities available (see also Cho &

Wayman, in press).


Musial’s secondary strategy for introducing teachers to Front End was more attentive to

the social side to data systems. Specifically, one-time trainings of campus “go-to people” were

led by Susan and representatives of the technology department. Although these trainings were

offered to only a small subset of teachers in the district, they provide interesting examples for

how a central office might shape sensemaking around the technology.

These trainings were attentive to developing a sense of importance and legitimacy around

Front End. We observed that they followed a similar pattern. Susan would call the group to order

by launching into an overview of Front End and its purpose. A commonly used slide for this

purpose showed an image of Musial’s strategic planning brochure. Subsequently, she would

highlight district goals relevant to data use (e.g., “closing achievement gaps”). Next, she would

mention that Musial principals were surveyed about campus data needs, and that the “number

one” issue was assessment reporting. Accordingly, she would explain that Front End was

intended to address that need by providing teachers with richer, direct access to data. In these

presentations, she would describe the role of the user group in Front End’s design.

These trainings also helped to demonstrate some of the potential role of central office

members in shaping how teachers make sense of the data system. For example, Susan invested

significant energies toward creating a positive atmosphere around the system. At one training, she seemed to

win over an audience by naming “laughing and learning” as her first key goal. In fact, Susan

worked hard to stoke excitement around Front End, even when others were doing the presenting.

For example, when the technology department described a favorite feature, Susan would

applaud, directly asking the audience to join in. At other times, she would ask them to raise their

“thumbs up” in approval or interject with a connection to her own classroom experiences. At one

training, she requested that the presenters move on to a certain feature that “really gets those


oohs and ahhs” and the “Price is Right effect.” If a feature was well-received, she would often

give credit to the technology department or user group for their hard work in developing the

feature. At the close of trainings, Susan would show a slide of enthusiastic quotes and praise

about Front End developed from the feedback of other users. After waiting quietly while the

audience read (or sometimes chuckled), Susan would recite her favorite one. Finally, Susan

would invite feedback from the audience, telling them that their suggestions would be taken to

the user group.

Although these trainings served a small subset of teachers, Susan’s efforts seemed to

create palpable results. Reactions to certain features included wows, murmurs, and nods. One

teacher was so enthusiastic that she interrupted with, “Shut up! You just saved me hours of

cumulative folder digging!” Similar comments were commonly overheard among the audience

throughout trainings. When Susan picked up on these comments, she would repeat them for the

group to also hear. Although some of the reactions might be attributed to how glad some teachers

might have felt to finally have access to data, it was also clear that Susan was interested in

amplifying those sentiments. Commenting on Susan’s success at shaping sensemaking, one

technology department member described how different it was from technical work:

It’s coordinating, packaging, and putting out a unified message. The [technology

department] is not the one to be doing that. We’re the data jocks. We should be listening

to our users’ needs, but the coordination and roll out of Front End has been served by our

school improvement office.

Gibson implementation decisions. The troubles experienced in Gibson around

Dashboard Central help to demonstrate the ways in which technical problems can be intertwined


with social or organizational ones. Because the district was unprepared for such matters, troubles seemed to

snowball.

Dashboard Central’s (stalled) timeline. As reported above, the district took a

technologically deterministic stance toward Dashboard Central’s "deployment." Gibson did not

take formal stock of its success until after the summer and fall semester had passed. During this

time, technical, social, and organizational challenges seemed to play off one another.

District leaders had initially envisioned a tight timeline for deployment. These plans,

however, were inflexible to local demands. Part of this was due to funding. Some of the impetus

behind Dashboard Central had come from the unexpected availability of federal stimulus dollars

in spring 2010. To seize the opportunity, district leaders worked quickly to select and purchase

the system. To make the most of the investment, educators would need to be using the system in

the subsequent school year (fall 2010). Similarly, district leaders felt that the best time to train

educators would be before the school year got into full swing. As one explained:

I don’t want to second guess what we did... but we had the sense (and I still think this is

true to an extent) that if you don’t start something at the beginning of the school year in

school districts, it’s hard for it to get traction.

This tight timeline, however, failed to take organizational capacity into account. If initial

deployment was to take place in fall 2010, much of the preparatory work would need to be done

in a single summer. As it happened, however, the technology department director had just left the

district. Gibson was without a director that summer—one would not be hired until the fall. Thus,

Gibson was shorthanded during a critical time. For much of the summer, only one person,

Edith (a pseudonym), was left doing both the project management and the technical work around Dashboard Central.


Although the technology department reported to Gibson’s deputy superintendent, this was not

the same as having someone “in-house” who could advocate for department needs. In Edith’s

own words:

My boss was trying to keep so many balls bouncing, it was a difficult time. If somebody

had been in place, I could have probably gone to them and said, “I need you to deal with

the vendor. They are not being responsive. We’re not meeting this deadline, et cetera.”

Compounding matters, some of the technical work associated with Dashboard Central

was more difficult than expected. Initially, the Gibson central office had envisioned all technical

work to be done within the summer of 2010. Even by the fall, however, the technology

department was still working to integrate student data into Dashboard Central. Non-student data

(e.g., human resources, teacher-made assessments) had not even been touched. In some

instances, technical challenges were caused by district norms and procedures. For example, it

was difficult to ensure that Dashboard Central had valid data in it. One central office member

explained that part of the problem stemmed from district inconsistencies around recording data.

Campuses, for instance, had historically determined the coding of student discipline data. Without uniform number codes across all schools for unique infractions (e.g., all schools recording “assault” via “8644”), discipline data would be unintelligible when integrated.
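To make the integration problem concrete, the following minimal sketch (in Python, with invented campus names and local codes; only the “8644” code comes from the example above) illustrates how per-campus crosswalks might translate idiosyncratic discipline codes into a single district-wide scheme:

    # Hypothetical sketch: harmonizing campus-specific discipline codes.
    # Each campus historically chose its own codes, so integration requires
    # a per-campus crosswalk into one district-wide scheme.

    # Invented crosswalks for illustration: local code -> district code.
    CAMPUS_CROSSWALKS = {
        "North HS": {"A-12": "8644",   # assault
                     "T-03": "8210"},  # hypothetical code for tardiness
        "South MS": {"77": "8644",
                     "15": "8210"},
    }

    def to_district_code(campus: str, local_code: str) -> str:
        """Translate a campus's local discipline code into the district code.

        Raises KeyError for unmapped codes, flagging records that need manual
        review rather than silently polluting the integrated data.
        """
        return CAMPUS_CROSSWALKS[campus][local_code]

    # The same infraction, recorded differently at two campuses, becomes
    # comparable only after translation.
    records = [("North HS", "A-12"), ("South MS", "77")]
    print([to_district_code(campus, code) for campus, code in records])  # ['8644', '8644']

Building and maintaining such crosswalks for every campus and every data type is exactly the kind of unglamorous work that Gibson’s tradition of campus-level coding made necessary.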

At the same time, technical challenges took their toll on the people and relationships in

central office. For example, problems in Dashboard Central's automatic update function meant

that Edith spent part of her day updating the system manually (once a day, when possible).

Further, because of the data validation problems, access to certain query functions was still

limited. This became a point of contention. On one side, the technology department was

concerned that opening access too soon might cause more harm than good—for example, if


people read too much into invalid reports. Nonetheless, some central office members wanted

access sooner. One was especially vocal about wanting to experiment with the system. She felt that the technology department was being overly protective: “I don’t need it perfect… We can’t destroy

anything.”

Working with people at Dashboard Central. The Gibson district's dealings with the

Dashboard Central vendor shed additional light on how technological problems can be

intertwined with social and organizational ones. For example, without a technology department

director, the basic work of “calling the vendor and staying on their case” had fallen by the

wayside. This vacancy left the district without someone devoted to spanning the boundaries

between central office and vendor or, in some cases, to advocating for Gibson's interests.

For example, district leaders reported that their contact person at Dashboard Central had taken on

new districts to support. They felt that this put Gibson on the “back burner” when it came to

service. Gibson had trouble getting phone calls returned and getting its contact person to show up at meetings.

This was at the same time that Gibson was beginning to realize that some of the promises

around Dashboard Central might have been exaggerated. For example, a team of district leaders

had selected Dashboard Central because of its relative cost and promised implementation

timeline. The impression was that Dashboard Central had enough experience and prefabricated

work to minimize technical responsibilities on Gibson's part. But such was not the case. Some of

the system’s most interesting functions during product demonstrations turned out not to be

standard, but rather the results of custom building. The amount of technical work required of the

district had also been underestimated. As one central office member lamented:


The other [vendors] were like okay, we’re going to sit down with you and talk about what

your data looks like. And then we’re going to build the cubes. It would have taken a lot

longer—probably. But looking back, maybe not. Hindsight is always 20/20.

Citing the relative success that another local district later had with implementing Dashboard

Central, another district leader lamented that Gibson might have served as the “guinea pig” for

the vendor’s later work in the state.

Even when there were no technical problems with Dashboard Central, the Gibson district

had trouble getting answers about how to use the system. For example, Dashboard Central came

pre-loaded with over 250 unique reporting functions, but it was left to central office to decide

which of these functions to give to users. This task proved nearly impossible, because the vendor

had no guide explaining what all the functions even were – let alone how they might be used.

One central office member wished that someone from Dashboard Central would provide a “valid

reason” explaining how and why certain reports looked the way they did. She reported that when

the vendor was asked for explanations of particular reports, the response was, “Well, that’s what

it is.” In effect, even when it functioned as designed, much of Dashboard Central’s output remained unintelligible.

Training. Trainings for Flightpath and for Dashboard Central were managed by the

respective departments overseeing them. Regardless of the system at hand, campus-level

educators described how district technology trainings were missed opportunities for

sensemaking. Teachers, especially, wanted to build skills more deeply and over time. Echoing

the central office’s technologically deterministic view, what teachers got were

one-time training events. One teacher characterized them as the “here it is, now go” approach to

training. Another teacher described technology trainings as a "cognitive dump." Others lamented


that trainers typically told them how to log onto a system and that a system would be beneficial,

but that the trainers would then “walk away” without offering additional help.

Trainings for Dashboard Central were a particular sore spot among campus

administrators. Some of the missteps around training for the system can be traced to the social and organizational terrain of Gibson. For example, because the technology

department was an auxiliary to campuses and other offices, it was unprepared to mandate who

should attend training or how concepts from training should be applied in practice. Confusion

ensued among the campus administrators. At one school, administrators turned the tables on the

interviewer, asking whether they were supposed to be using the system, as well as whether they

were now supposed to be trainers in a “train-the-trainer” model. Similarly, one principal was

frustrated that he had been mandated to attend another training on the same day as the Dashboard Central training. He was unclear about where to turn for the knowledge he had missed, or whether he would face any consequences for having missed the training. Contrary to the central office belief that late

summer is a good time for training, he felt that he was already too busy preparing for the school

year.

Again, relationships with Dashboard Central proved disappointing. Representatives from

the vendor were supposed to have led parts of the training, but they canceled at the last minute.

Nonetheless, Gibson pressed on. As one central office member explained, by then “the train had

left the station and there was no stopping it.” What training participants did see of the system was thus limited to basics about logging in and customizing the look of their home pages.

Information about how to use the data or next steps for campuses was not provided. Without

expectations or support around how to use the system, one campus administrator confided that

she was forgetting how to use Dashboard Central and wished she had never gone.


Boyer’s instructional coaches. Although the Boyer district did not decide to invest in a

data system, it did fund a different strategy for supporting data use: instructional coaches. As

one Boyer district leader put it, instructional coaches served as the “human resource solution” to

the problem of access to data. As a bonus, they also offered direct, collegial interaction and

support to classroom teachers. He explained:

If say we eliminated those [instructional coaches] and used that money to pay for just a

single data system, they may be able to get data, but we have no idea if they’re using it—

or even what they are using it for. So we’ve kind of erred on the side of “let’s go with a

human resource solution to solving the problem.” In a perfect world we would have both.

We would have the coaches there to help them figure it out, but we would also have [an

integrated computer data system].

In practice, the roles and responsibilities of coaches were varied. Potential responsibilities

included: training for using or interpreting data; help in designing local assessments; assistance

with entering data or other “paperwork”; and helping teachers to plan instruction. For example,

central office members described working with coaches to deliver a one-time data use training.

This training involved developing a sense for the “whole student” by focusing on various

printouts of individual-level data. In the absence of an integrated data system, preparing for this

training involved logging in, isolating students relevant to the teachers, querying the system for

particular reports, and then repeating the process for additional systems. With the printouts in

hand, central office leaders and coaches then had conversations with teachers about individual

students. One central office member explained:


We make it a point to say it isn’t one piece of data that will tell you a child needs help or

needs [intervention]. The whole point was to dig deeper and look at the whole picture of

how they’re doing in the classroom. Not to just look at PM Plus or [state test scores].
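To suggest how labor-intensive this preparation could be, the following minimal sketch (in Python; the system names and records are invented, and only PM Plus is mentioned in our data) imagines assembling one “whole student” printout from several unintegrated systems:

    # Hypothetical sketch of the manual "whole student" workflow described
    # above: each unintegrated system must be queried separately, once per
    # student, and the results stitched together by hand.

    from typing import Dict, List

    # Stand-ins for separate, unintegrated systems, each with its own records.
    SYSTEMS = {
        "attendance":  {"s001": "94% present", "s002": "88% present"},
        "reading":     {"s001": "PM Plus level 14", "s002": "PM Plus level 9"},
        "state_tests": {"s001": "Math: meets", "s002": "Math: approaches"},
    }

    def whole_student_report(student_ids: List[str]) -> Dict[str, List[str]]:
        """Mimic the log-in / isolate / query / repeat cycle: one query per
        system per student, merged into a single per-student printout."""
        reports: Dict[str, List[str]] = {sid: [] for sid in student_ids}
        for system_name, records in SYSTEMS.items():
            for sid in student_ids:
                reports[sid].append(f"{system_name}: {records.get(sid, 'no record')}")
        return reports

    # One printout per student, assembled from three separate systems.
    for sid, lines in whole_student_report(["s001", "s002"]).items():
        print(sid, lines)

An integrated data system would collapse these repeated queries into a single lookup, which is part of what such systems promise; in its absence, Boyer’s coaches and central office staff supplied the integration by hand.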

Overall, how coaches went about their duties was influenced by how individuals made

sense of their context and local demands. Part of this may have been complicated by the fact that

the program was only in its second year. This placed the Boyer central office, as well as the

coaches, on a learning curve. One central office leader described how this played out when it

came to training for data use:

We had trained [the coaches] on what we had done the year before. They had also been a

part of it on their own campuses previously. The expectation was that they would follow

through with that. That collided with the concept of them developing relationships and

having time to do it. It really never materialized. That was our fault in thinking we had

them and that they could do it. I think next year they could.

Although coaches were positive about their work overall, one coach admitted that it could be a

challenge balancing central office and school expectations. Specifically, she confided that she

sometimes had trouble asking teachers to comply with the policies or requirements she didn’t

personally support. Ironically, even when teachers were positive about their coaches, they were

lukewarm about coaches’ attempts to “stand back” and give teachers greater independence. In

effect, teachers felt that extra work was being given to them and that the help to which they’d

become accustomed was being purposefully withheld.

Setting such bumps aside, campus-level educators in Boyer were positive about their

coaches. Principals were especially enthusiastic on a number of levels. One principal trumpeted

the caliber of the coaches:


They are very well respected. They are knowledgeable. They are talented. At first, there

were some [doubts about data use trainings], but now the mindset is that we don’t know

what we would do without them.

Similarly, principals were enthusiastic about coaches’ specialized expertise. This made

supporting data use easier. Different coaches specialized in math and reading instructional

strategies. As a result, one principal described how this freed her up to focus on teacher- or school-level data. Thus, principals saw coaches as a way to boost their overall leadership capacity. One

principal explained:

I can’t be the expert in every single thing, there is no way. I would be kidding myself and

I would be kidding the staff. I mean, I know enough about each one to—to be dangerous

so to speak, to be in the loop. But I can’t be everywhere. I can’t know everything. To be

effective, it’s a collaboration.

Discussion and Conclusion

In the preceding passages, we have described three very different approaches to computer

data systems. Our study contributes practical knowledge by helping to map out for other central

offices the kinds of challenges that they might experience in implementation work. These

challenges around technology were not simply technical, but also tied to how central offices

oriented toward adaptation, feedback, and bureaucratic structures. Without such a map, some districts might not know what they're getting themselves into.

Our study also contributes to theoretical knowledge by demonstrating the ways in which

central offices make sense of technical, social, and organizational issues around computer data

systems. Further, it adds to research in this area by highlighting how orientations in the early

planning phases of initiatives can have larger repercussions over time. In this way, we have


attempted to show how work with technologies is situated in time and place. This study also

provides grounds for larger questions related to practice and theory. For example:

What are the other practical challenges in data system implementation? Much of the

light shed on potential challenges came from the Gibson district. The Musial district developed its

own system in-house, while Boyer did not implement a new computerized system. Accordingly,

it is worth questioning how central offices are to weigh the consequences of working with a

vendor versus working in-house. Although Gibson experienced troubles with Dashboard Central,

it did not have those troubles with Flightpath. Similarly, Musial's success with in-house

technology development might not be generalizable. Vendors do have significant expertise and

resources that districts do not. At the same time, this study touched upon the importance of

finances to data system decisions, and future studies might attend to how funding influences

decisions about these technologies.

What is the "value" of a computer data system? We have suggested that the

exceptional levels of student achievement in the Boyer district made it easier to measure the

cost of a data system in terms of dollars only. This district seemed to be satisfied enough with the

technologies it had and so funded instructional coaches instead. For other districts, the cost of not having a data system might be measured in different terms: gains in student achievement or state accountability performance, for example. Depending on the unique functionalities of the system

at hand, one district’s extravagance might be another district's necessity. This puts caveats around overly general statements about the importance of data systems, underscoring the need for

local conversations about their value.

Further, it raises questions about the potential of one technology to substitute seamlessly

for another, or about the interchangeability of people and technologies. Whereas a common


story is that people are being replaced by computers, the Boyer district found reasons to replace

computers with people.

How can (or should) central offices shape expectations about data systems? Susan’s

training sessions provided an interesting example of how training approaches can attend to

sensemaking about data systems. Musial’s online training modules and Gibson’s trainings

provided examples of what life is like without such investment. Our sense is that central offices

can do more to develop relationships among users when it comes to data systems and their use in

practice. Given that central offices shape how others view the importance of data and the use of data systems, it may be important for district leaders and communities to reflect on what they really want

from these technologies. Although Musial’s emphasis on accountability achievement seemed to

lay some groundwork for the system's acceptance, this approach might not suit everyone.


References

Ash, K. (2013, March 14). Fragmented data systems a barrier to better schools, experts say.

Education Week, 32(25), 26.

Barley, S. R. (1990). The alignment of technology and structure through roles and networks.

Administrative Science Quarterly, 35, 61–103.

Bolman, L. G., & Deal, T. E. (2008). Reframing organizations: Artistry, choice and leadership

(4th ed.). San Francisco, CA: Jossey-Bass.

Booher-Jennings, J. (2005). Below the bubble: “Educational triage” and the Texas accountability

system. American Educational Research Journal, 42(2), 231–268.

Brooks, C. (2011). Locating leadership: The blind spot in Alberta’s technology policy discourse.

Education Policy Analysis Archives, 19(26). Retrieved from

http://www.eric.ed.gov/ERICWebPortal/detail?accno=EJ956002

Brown, J. S., & Duguid, P. (1991). Organizational learning and communities-of-practice:

Toward a unified view of working, learning, and innovation. Organization Science, 2(1),

40–57.

Burch, P., & Hayes, T. (2009). The role of private firms in data-based decision making. In T. J.

Kowalski & T. J. Lasley II (Eds.), Handbook of Data-Based Decision Making in

Education (pp. 54–71). New York, NY: Routledge.

Carlile, P. R. (2002). A pragmatic view of knowledge and boundaries: Boundary objects in new

product development. Organization Science, 13(4), 442–455.

Cho, V., Ro, J., & Littenberg-Tobias, J. (2013). What Twitter will and will not do: Theorizing

about teachers’ online professional communities. Learning Landscapes, 6(2), 45–61.


Coburn, C. E. (2001). Collective sensemaking about reading: How teachers mediate reading

policy in their professional communities. Educational Evaluation and Policy Analysis,

23(2), 145–170. doi:10.3102/01623737023002145

Coburn, C. E., Honig, M. I., & Stein, M. K. (2009). What’s the evidence on districts’ use of

evidence? In J. D. Bransford, D. J. Stipek, N. J. Vye, L. M. Gomez, & D. Lam (Eds.), The role of research in educational improvement (pp. 67–88). Cambridge, MA: Harvard Education

Press.

Corcoran, T., Fuhrman, S. H., & Belcher, C. L. (2001). The district role in instructional

improvement. Phi Delta Kappan, 83(1), 78–84.

Datnow, A. (2006). Connections in the policy chain: The “co-construction” of implementation in

comprehensive school reform. In M. I. Honig (Ed.), New directions in education policy

implementation: Confronting complexity (pp. 105–123). Albany, NY: State University of

New York Press.

Davenport, T. H., & Prusak, L. (1998). Working knowledge: How organizations manage what

they know. Boston, MA: Harvard Business School Press.

Dorner, L. M. (2011). The life course and sense-making: Immigrant families’ journeys toward

understanding educational policies and choosing bilingual programs. American

Educational Research Journal. doi:10.3102/0002831211415089

Edmondson, A. C. (2003). Speaking up in the operating room: How team leaders promote

learning in interdisciplinary action teams. Journal of Management Studies, 40(6), 1419–

1452. doi:10.1111/1467-6486.00386

Eisenhardt, K. M. (1990). Speed and strategic choice: How managers accelerate decision

making. California Management Review, 32(3), 39.


Hamilton, L., Halverson, R., Jackson, S. S., Mandinach, E., Supovitz, J. A., & Wayman, J. C.

(2009). Using student achievement data to support instructional decision making

(Practice guide No. NCEE 2009-4067). National Center for Education Evaluation and

Regional Assistance, Institute of Education Sciences, U.S. Department of Education.

Retrieved from http://www.eric.ed.gov/ERICWebPortal/detail?accno=ED506645

Honig, M. I. (2006). Street-level bureaucracy revisited: Frontline district central-office

administrators as boundary spanners in educational policy implementation. Educational

Evaluation and Policy Analysis, 28(4), 357–383. doi:10.3102/01623737028004357

Honig, M. I., & Venkateswaran, N. (2012). School–central office relationships in evidence use:

Understanding evidence use as a systems problem. American Journal of Education,

118(2), 199–222.

Leonardi, P. M. (2009a). Crossing the implementation line: The mutual constitution of

technology and organizing across development and use activities. Communication

Theory, 19, 278–310.

Leonardi, P. M. (2009b). Why do people reject new technologies and stymie organizational

changes of which they are in favor? Exploring misalignments between social interactions

and materiality. Human Communication Research, 35, 407–441.

Leonardi, P. M. (2012). Materiality, sociomateriality, and socio-technical systems: What do

these terms mean? How are they related? Do we need them? In P. M. Leonardi, B. A.

Nardi, & J. Kallinikos (Eds.), Materiality and Organizing: Social Interaction in a

Technological World (pp. 25–48). Oxford: Oxford University Press. Retrieved from

http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2129878


Levinthal, D. A., & Warglien, M. (1999). Landscape design: Designing for local action in

complex worlds. Organization Science, 10(3), 342–357.

Lichtenstein, B. B. (2006). Complexity leadership theory: An interactive perspective on leading

in complex adaptive systems. E:CO, 8(4), 2–12.

March, J. G., & Olsen, J. P. (1984). The new institutionalism: Organizational factors in political

life. The American Political Science Review, 78(3), 734–749.

Marsh, J. A. (2012). Interventions promoting educators’ use of data: Research insights and gaps.

Teachers College Record, 114(November), 1–48.

McDaniel, R. R. (2004). Chaos and complexity in a bioterrorism future. Advances in Health

Care Management, 4, 119–139.

Means, B., Padilla, C., DeBarger, A., & Bakia, M. (2009). Implementing data-informed decision

making in schools: Teacher access, supports and use. Washington, DC: U.S. Department

of Education, Office of Planning, Evaluation and Policy Development.

Morgan, G. (1986). Images of organization. Newbury Park, CA: SAGE.

Orlikowski, W. J., & Barley, S. R. (2001). Technology and institutions: What can research on

information technology and research on organizations learn from each other? MIS

Quarterly, 25(2), 145–165.

Orlikowski, W. J., & Iacono, C. S. (2001). Research commentary: Desperately seeking the “IT”

in IT research - A call to theorizing the IT artifact. Information Systems Research, 12(2),

121–135.

Palandra, M. (2010). The role of instructional supervision in district-wide reform. International

Journal of Leadership in Education, 13(2), 221–234.


Palmer, D., & Snodgrass Rangel, V. (2011). High stakes accountability and policy

implementation: Teacher decision making in bilingual classrooms in Texas. Educational Policy, 25(4), 614–647. doi:10.1177/0895904810374848

Pfeffer, J. (1997). New directions for organization theory: Problems and prospects. New York, NY: Oxford University Press.

Pfeffer, J., & Sutton, R. I. (2000). The knowing-doing gap. Boston, MA: Harvard Business

School Press.

Pinch, T. J., & Bijker, W. E. (1984). The social construction of facts and artefacts: Or how the

sociology of science and the sociology of technology might benefit each other. Social

Studies of Science, 14, 399–441.

Rivkin, J. W., & Siggelkow, N. (2002). Organizational sticking points on NK landscapes.

Complexity, 7(5), 31–43.

Scott, W. R., & Davis, G. F. (2007). Organizations and organizing: Rational, natural, and open

systems perspectives. Upper Saddle River, NJ: Pearson Education.

Spillane, J. P., Reiser, B. J., & Reimer, T. (2002). Policy implementation and cognition:

Reframing and refocusing implementation research. Review of Educational Research,

72(3), 387–431. doi:10.3102/00346543072003387

Stein, M. K., & D’Amico, L. (2002). Inquiry at the crossroads of policy and learning: A study of

a district-wide literacy initiative. Teachers College Record, 104(7), 1313–1344.

Thomas, J. B., Sussman, S. W., & Henderson, J. C. (2001). Understanding “strategic learning”:

Linking organizational learning, knowledge management, and sensemaking.

Organization Science, 12(3), 331–345.


Tucker, B. (2010). Putting data into practice: Lessons from New York City. Washington, DC:

Education Sector.

Valli, L., & Buese, D. (2007). The changing roles of teachers in an era of high-stakes

accountability. American Educational Research Journal, 44(3), 519–558.

Wayman, J. C., Cho, V., & Richards, M. P. (2010). Student data systems and their use for

educational improvement. In P. L. Peterson, E. Baker, & B. McGraw (Eds.),

International Encyclopedia of Education (Vol. 8, pp. 14–20). Oxford: Elsevier.

Wayman, J. C., Jimerson, J. B., & Cho, V. (2012). Organizational considerations in establishing

the Data-Informed District. School Effectiveness and School Improvement, 23(2), 159–

178.

Weick, K. E. (1976). Educational organizations as loosely coupled systems. Administrative

Science Quarterly, 21(1), 1–19.

Weick, K. E. (1993). The collapse of sensemaking in organizations: The Mann Gulch disaster. Administrative Science Quarterly, 38(4), 628–652.

Weick, K. E., & Roberts, K. H. (1993). Collective mind in organizations: Heedful interrelating

on flight decks. Administrative Science Quarterly, 38(3), 357–381.

Wills, J. S., & Sandholtz, J. H. (2009). Constrained professionalism: Dilemmas of teaching in the

face of test-based accountability. Teachers College Record, 111(4), 1065–1114.

Young, V. M. (2006). Teachers’ use of data: Loose coupling, agenda setting, and team norms.

American Journal of Education, 112(4), 521–548.