
Evaluating MIS Effectiveness

Evaluating Information System Effectiveness -- Part I: Comparing Evaluation Approaches

By: Scott Hamilton
    Norman L. Chervany

Abstract

While the use and costs of Management Information Systems (MIS) have become highly visible, little attention has been paid to assessing and communicating system effectiveness. Evaluation of system effectiveness is difficult due to its multidimensionality, its quantitative and qualitative aspects, and the multiple, and often conflicting, evaluator viewpoints.

This article provides an overview of what system effectiveness means and how it should be measured. It is the first of two articles to appear in consecutive issues of the MIS Quarterly. Starting with a definition of system effectiveness, this article examines evaluation of system effectiveness in terms of a conceptual hierarchy of system objectives. The hierarchy is used to discuss problems in, and recommendations for, evaluating system effectiveness, and to compare MIS evaluation approaches. The second article characterizes and compares the evaluator viewpoints on system effectiveness for decision makers in several functional groups involved in MIS implementation -- user, MIS, internal audit, and management. The second article recommends several MIS evaluation approaches for incorporating multiple dimensions and multiple evaluator viewpoints into evaluations of information system effectiveness.

Keywords: Management information systems, MIS, evaluation

ACM Categories: 1.3, 2.10, 2.40, 3.50

Introduction

Evaluation of Management Information Systems (MIS) is an integral part of the management control process. It was highlighted in the SHARE study [8] recommendations for improving MIS management and making the value of an information system evident to the enterprise. MIS effectiveness, defined by the accomplishment of objectives, is of concern not only to the management function, but also to user, developer, and internal audit personnel involved in MIS implementation. Yet, few organizations have an organized process for evaluating MIS effectiveness.

The purpose of this article, the first of two parts to be published consecutively, is to provide an overview of approaches for evaluating MIS effectiveness. The definition of system effectiveness is first considered, and assessments of system effectiveness are discussed in terms of a conceptual hierarchy of system objectives. The conceptual hierarchy is used to discuss problems in, and recommendations for, assessing system effectiveness. Applications of the conceptual hierarchy are illustrated for evaluating a manufacturing MIS and for comparing MIS evaluation approaches.

Definition of System Effectiveness

Two general views can be taken concerning what system effectiveness means and how it should be measured: the goal-centered view and the system-resource view.¹

1. In the goal-centered view, the way to assess system effectiveness is first to determine the task objectives of the system, or of the organizational units utilizing the system, and then to develop criterion measures to assess how well the objectives are being achieved. Effectiveness is determined by comparing performance to objectives. An example of the goal-centered view of system effectiveness would be to compare actual costs and benefits to budgeted costs and benefits.

¹For more detailed explanations concerning the goal-centered view and the system-resource view, see [4, 22, 29].

MIS Quarterly/September 1981 55

2. In the system-resource view, system effectiveness is determined by attainment of a normative state, e.g., standards for "good" practices. Effectiveness is conceptualized in terms of resource viability rather than in terms of specific task objectives. For example, system effectiveness in terms of human resources might be indicated by the nature of communication and conflict between MIS and user personnel, user participation in system development, or user job satisfaction. In terms of technological resources, system effectiveness might be indicated by the quality of the system or service levels. The system-resource model recognizes that systems fulfill other functions and have other consequences besides accomplishment of official objectives, and that these need to be considered in assessing system effectiveness.

In assessing system effectiveness, the evaluation approach would depend in part on which of these two views is considered. In practice, the two views should converge. In order to explain the success, or lack of success, in meeting objectives, the system resources need to be investigated.

The distinction between the two views is similar to the distinction drawn between "summative" and "formative" evaluation approaches in the evaluation research literature [28]. Summative evaluation determines whether the system has accomplished objectives. Formative evaluation assesses the quality of the system and related support. The distinction between summative and formative evaluation approaches is analogous to the evaluation of ends versus means, or outcomes versus process. Formative evaluation approaches provide information throughout the implementation process to help improve the means, or process, to accomplish objectives and aid interpretation of summative evaluation results. Summative evaluation approaches provide information on the system outcomes, or ends, to support decisions to continue, adopt, or terminate the system. Both formative and summative evaluation approaches are typically used in providing evaluative information on system effectiveness.
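As a concrete illustration of the goal-centered comparison of actual to budgeted costs and benefits, here is a minimal sketch in Python; the function name and all figures are hypothetical, not from the article.

```python
# Minimal sketch of a goal-centered evaluation: compare actual
# performance against budgeted objectives. All names and figures
# here are hypothetical, for illustration only.

def variance_ratio(budgeted: float, actual: float) -> float:
    """Deviation from budget as a fraction of the budgeted figure."""
    return (actual - budgeted) / budgeted

# Budgeted vs. actual costs and benefits for an information system.
cost_variance = variance_ratio(budgeted=100_000, actual=112_000)
benefit_variance = variance_ratio(budgeted=80_000, actual=72_000)

print(f"cost variance:    {cost_variance:+.0%}")     # costs ran 12% over budget
print(f"benefit variance: {benefit_variance:+.0%}")  # benefits fell 10% short
```

A summative evaluation in this spirit compares such variances at project completion; a formative one would track them throughout implementation.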

Evaluating System Effectiveness

Evaluation of system effectiveness will be discussed in terms of objectives typically considered in implementing an MIS. A conceptual hierarchy of objectives is proposed and used to compare MIS evaluation approaches. To illustrate use of the conceptual hierarchy, the objectives and performance measures that might be used to evaluate the effectiveness of an MRP system are characterized.

A conceptual hierarchy of system objectives

System objectives broadly define the goals of the MIS and embody the hierarchy of objectives for the organization, running the gamut from a single strategic statement which is quite conceptual, to detailed operational goals for the individual MIS development project [20]. Typically, the requirements definition or design specification for the information system is an operational description of the system objectives and constitutes a reference point for MIS development and operations personnel. A conceptual hierarchy of objectives is depicted in Figure 1. Similar conceptual hierarchies have been proposed by Greenberg, et al. [12], Kriebel, et al. [16], and Ginzberg [11].

One of the primary objectives of the MIS function is to develop and operate/maintain information systems that will enhance the organization’s ability to accomplish its objectives. Accomplishment of this objective can be evaluated from two perspectives for a specific information system:

1. The efficiency with which the MIS development and operations processes utilize assigned resources (staff, machines, materials, money) to provide the information system to the user.

2. The effectiveness of the users, or the users’ organizational unit, using the information system in accomplishing their organizational mission.

Figure 1. A Conceptual Hierarchy of System Objectives

[Figure 1 is a diagram relating the two perspectives. On the efficiency-oriented side, Resource Investment (Level 3) provides Production Capability (Level 2), which provides Resource Consumption (Level 1) by the MIS Development and MIS Operations resources, which in turn provide the Information System (Level 0). On the effectiveness-oriented side, the Information System affects the Information Provided and Support Provided (Level 1), which influence the Use Process and User Performance (Level 2), which affects Organizational Performance (Level 3) and the external environment.*]

*System objectives concerning effects on the external environment are considered as organizational performance objectives (Level 3).

The efficiency-oriented perspective is reflected in the left-hand side of Figure 1 for the MIS development and operations processes. The MIS development process, by the selection and application of organizational resources, yields the information system, which is then supported by the MIS operations process. Objectives for the MIS development and operations processes might be stated at four levels:

Level 0: The requirements definition for the information system.

Level 1: The resource consumption necessary to provide the information system.

Level 2: The production capability or capacity of the resources.

Level 3: The level of investment in resources.

The effectiveness-oriented perspective is reflected in the right side of Figure 1. The information provided by, and the support provided for, the information system influence user decision making processes and user organizational performance, which in turn affect organizational performance and possibly the external environment. System effectiveness might be assessed at three levels of objectives:

Level 1: The information provided by the information system, and the support provided by the MIS function to the users of the system.

Level 2: The use of the information system and the effect on user organizational processes and performance.

Level 3: The effect of the information system on organizational performance.
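For illustration only, the two perspectives and their levels of objectives can be sketched as a simple data structure; the level numbers and labels follow the text, but the dictionary representation and function are our own, not the article’s.

```python
# A sketch of the conceptual hierarchy of system objectives (Figure 1)
# as plain data. Level numbers and labels follow the text; the
# dictionary representation is ours, for illustration.

HIERARCHY = {
    "efficiency-oriented": {
        0: "Requirements definition for the information system",
        1: "Resource consumption to provide the information system",
        2: "Production capability or capacity of the resources",
        3: "Level of investment in resources",
    },
    "effectiveness-oriented": {
        1: "Information and support provided to users of the system",
        2: "Use of the system and effect on user organizational processes",
        3: "Effect of the system on organizational performance",
    },
}

def objectives_at(level: int) -> list[str]:
    """Objectives stated at a given level, across both perspectives."""
    return [desc
            for perspective in HIERARCHY.values()
            for lvl, desc in perspective.items()
            if lvl == level]

# Levels 1-3 appear in both perspectives; Level 0 only on the efficiency side.
print(objectives_at(3))
```

One design point this makes explicit: the two perspectives share level numbers but not level meanings, so any evaluation report built on the hierarchy must carry the perspective label alongside the level.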

System effectiveness is ideally assessed in terms of the information system’s contributions to accomplishment of organizational objectives, i.e., its effect on organizational performance (Level 3).² For example, organizational objectives and performance measures might be expressed in sales revenues, customer satisfaction, and profit contributions. These effects do not follow directly and immediately, but rather result from use of the information system and changes in organizational processes. System objectives in terms of the use process and user organization performance (Level 2) reflect system effects on these organizational processes, including changes in decision makers, changes in the decision making process, and changes in user organizational performance. However, system objectives are typically stated in terms of the requirements definition which specifies the information to be provided by the information system (Level 1). These objectives, such as improved information timeliness, content, or form, affect organizational performance only through the use process (Level 2).

²Organizational performance objectives may include considerations of effects on the external environment. For example, manufacturers typically include product safety (e.g., consumer injuries from automobile defects) within organizational performance objectives, and even employ information systems to track current owners and safety records.

The accomplishment of MIS objectives can be assessed by performance measures. Figures 2 and 3 illustrate a variety of MIS objectives and corresponding performance measures for the efficiency-oriented and effectiveness-oriented perspectives, respectively, that reflect those mentioned in the literature. For example, Ein-Dor and Segev’s [9] summarization of system objectives, and Norton and Rau’s [23] and Smith’s [31] illustrations of performance measures, can be mapped into Figures 2 and 3.

An example for a manufacturing MIS

A manufacturing MIS is used to illustrate an application of the conceptual hierarchy of system objectives and corresponding performance measures. Few firms have effective measures for determining how well the manufacturing function performs. Since the Manufacturing Resource Planning (MRP) system provides substantial support to manufacturing managers in labor and materials management, examples will be drawn for an MRP system.³

An MRP system consists of several subsystems, including bill of materials, routings, costing, inventory status, purchasing, master production schedule, material planning, capacity planning, and shop floor control. Implementation of the various subsystems is evolutionary and represents a development cycle for manufacturing MIS [3]. The MRP system primarily serves users in the manufacturing area, but also supports decision makers in marketing, finance, and other functional groups.

³For further information on MRP, Anderson, et al. [2] summarize current MRP practices. For further information on evaluation of a manufacturing MIS, Richardson and Gordon [27] present measures of manufacturing performance.

Figure 2. Four Levels of Efficiency-Oriented Objectives and Performance Measures for the MIS Development and Operations Processes

Level 0 -- Information system:
  MIS development process:
    Technical quality -- compliance to systems development standards for program design, database design, testing, etc.
    Controls quality -- compliance to applications control standards
    Documentation quality -- compliance to documentation standards
  MIS operations processes:
    Technical quality -- compliance to design specification
    Controls quality -- compliance test for adequacy and completeness of controls
    Documentation quality -- compliance to standards

Level 1 -- Resource consumption:
  MIS development process:
    Development budget -- budget variance
    Scheduled completion -- schedule compliance
    User participation -- amount and type of involvement
  MIS operations processes:
    Operations budget -- budget variance
    Scheduled run times -- actual run times; percent reruns
    Estimated computer resource units required -- actual resource units utilized

Level 2 -- Production capability:
  MIS development process:
    Available man-hours -- chargeable man-hours; productivity rate; percent overtime
  MIS operations processes:
    Available computer capacity (throughput) -- percent uptime; response time; backlog; percent utilization; actual throughput
    Job description -- job satisfaction; job performance

Level 3 -- Resource investment:
  MIS development process:
    MIS personnel training -- training expenditures
  MIS operations processes:
    Capital investment (hardware) -- capital expenditures

Figure 3. Three Levels of Effectiveness-Oriented Objectives and Performance Measures for the MIS Use Process

Level 1 -- Information and support provided:
  Improve time of presentation -- data currency (reporting interval plus processing delay); delivery schedule (offline); response time (online); turnaround time (requests)
  Improve information content (quality) -- data: accuracy, scope, level of aggregation, time horizon, reliability, security, completeness; model quality: technical validity, organizational validity
  Improve information content (quantity) -- data: access to new/more data; model: computational power, library facilities
  Improve presentation form -- system interface: flexibility, simplicity, ease of use, responsiveness, convenience, availability, etc.; format: graphical, tabular, colors, documentation, etc.
  Improve user support -- amount of user training; amount and quality of user guides; quality of MIS-user relationship; amount of user control over MIS services

Level 2 -- Use process and user performance:
  Improved decision maker(s) -- understanding of problem; extent of common information; degree of cooperation and consensus; change in attitudes: toward job, toward MIS, toward the MIS approach, toward confidence in decision
  Improved decision making process -- explicitness of goals/objectives; consideration of constraints, alternatives; comprehensiveness of analysis; quantification of action consequences; more informed use of MIS; length of time to make decisions

Level 3 -- Organizational performance:
  Improved user organizational performance via:
    Reduced information processing costs -- automate manual calculation/analysis; automate data handling/collection/correction; cost displacement (people, equipment); improved procedures
    Improved asset utilization -- reduced inventory levels/turnaround; reduced number of backorders
    Financial objectives -- sales revenue; profit contribution; return on investment
    Customer objectives -- customer satisfaction; regulatory agency compliance
    Organizational development objectives -- morale

Examples of objectives of an MRP system and corresponding performance measures are presented in Figure 4. The effectiveness-oriented objectives for the system reflect the primary users’ concerns, e.g., improved user awareness through education, improved data accuracy, improved inventory control and production scheduling, improved coordination between functional groups, and improved customer satisfaction. A fundamental requirement for evolutionary implementation of an MRP system, and for accomplishing Level 2 and Level 3 objectives, is an ongoing, continually updated program of user education, to increase user understanding and awareness. The efficiency-oriented measures listed in Figure 4 reflect concerns of the MIS function, e.g., improved MIS schedule completion, MIS capacity, and awareness of MIS personnel.
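Some of the MRP performance measures in Figure 4, such as delivery promises met and inventory turns, are simple ratios that could be computed mechanically from operating data. A minimal sketch, with all input figures invented for illustration:

```python
# Hypothetical computation of two Figure 4 performance measures for an
# MRP system. All input figures are invented for illustration.

# Delivery promises met: fraction of orders shipped by the promised date.
orders_promised = 200
orders_shipped_on_time = 184
delivery_promises_met = orders_shipped_on_time / orders_promised

# Inventory turns: annual cost of goods sold over average inventory value
# (a standard definition; the article lists the measure without a formula).
annual_cost_of_goods_sold = 4_800_000
average_inventory_value = 600_000
inventory_turns = annual_cost_of_goods_sold / average_inventory_value

print(f"delivery promises met: {delivery_promises_met:.0%}")  # 92%
print(f"inventory turns: {inventory_turns:.1f} per year")     # 8.0
```

Measures like these address the Level 2 use-process objectives; the qualitative measures in Figure 4 (e.g., user awareness, coordination) do not reduce to ratios this way.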

Problems in evaluating system effectiveness

Evaluating system effectiveness in meaningful terms has been one of the most difficult aspects of the MIS implementation process. The problems encountered in evaluating system effectiveness can be briefly summarized.

1. Objectives and measures of accomplishment are often inadequately defined initially.

Objectives and measures of accomplishment are often not defined adequately at the outset of an MIS implementation effort [15]. The initial specification of objectives concerning the requirements definition for (Level 0) and information provided by (Level 1) the information system is often incomplete [7]. Measures for objectives concerning user performance (Level 2) and organizational performance (Level 3) are typically not quantified, especially for mandated development projects [11]. Furthermore, the stated or manifest objectives frequently do not represent the real objectives since underlying aims of involved personnel go unstated [1].

2. Efficiency-oriented and easily quantified objectives and measures are typically employed.

Efficiency-oriented and easily quantified objectives and measures are typically employed while effectiveness-oriented and qualitative objectives and measures are ignored. This stems from a focus on resource consumption objectives and the MIS design specification, and the pressures of project justification to focus on tangible quantitative costs and benefits. Intangible, qualitative impacts of the information system tend to be ignored except when quantifiable benefits are insufficient to justify system development [11]. In many cases, measures of intangible, qualitative effects of systems are not available.

3. Objectives and measures used to evaluate the system are not the same as those defined initially.

The dynamic nature of the MIS implementation process suggests that evolutionary changes in objectives will occur because of learning by users and MIS development personnel and changes in the environment. Evolving user needs greatly influence requirements for maintenance [18]. As mentioned earlier, the initial lack of mutually agreed upon objectives and measures also implies that those used to evaluate the system will be different.

4. Individual perceptions may differ on what the objectives and measures are.

When realistic mutual agreement among the participants is not initially established concerning MIS objectives and measures, different evaluator viewpoints may arise when assessing system effectiveness. Mutual agreement may be hampered because relevant parties, or evaluator viewpoints, are not represented during initial objective setting, underlying aims of involved personnel go unstated, and assessments of system effectiveness predominantly reflect the interests of a single evaluator viewpoint.

Figure 4. Examples of Objectives and Performance Measures for a Manufacturing MIS: MRP

Resource investment:
  Improved awareness of MIS personnel -- amount and type of MIS training
Production capability:
  Increased MIS capacity -- number of transaction inputs handled per unit time
Resource consumption:
  Improved MIS schedule completion -- input cutoff met; number of report reruns; report deliveries on schedule
Information system:
  Improved database integration -- amount of data redundancy; number of times data entered
Information and support provided:
  Improved data accuracy -- accuracy of bill of materials, routings, and costing
  Improved user awareness -- amount and type of user training; quality of user guides; number and type of user complaints
Use process and user performance:
  Improved inventory control and production scheduling -- delivery promises met; delivery lead time; inventory turns; cost estimating accuracy; number of expediters; number of split orders
  Improved coordination -- agreement between production, marketing, and finance (business plan, marketing plan, production plan); class of MRP user (A, B, C, or D)
Organizational performance:
  Improved customer satisfaction -- number and type of customer satisfaction complaints; percent of profit goals realized; percent of orders received vs. sales forecast; percent of production rate vs. capacity plan

In summary, realistic mutual agreement concerning the definition of appropriate objectives and measures of accomplishment is typically not reached at the outset by relevant parties. This makes evaluation of system effectiveness difficult.

Recommendations for evaluating system effectiveness

Several recommendations are offered to improve the ability to evaluate system effectiveness.

1. Define and/or derive appropriate system objectives and measures.

Before undertaking system development, definition of system objectives and measures is necessary. For reasons stated earlier, however, the initial "official" or "documented" statement of objectives may be inadequate for evaluating system effectiveness. It may be necessary to derive system objectives and appropriate measures by using the "operative" objectives which are reflected in the tasks and activities performed within the using organization [4].

These organizational processes, or tasks and activities performed in the using organization, represent an intermediate level of system effect (Level 2 objectives). Since the ultimate effects of an MIS on organizational performance may require a long time to be realized and may not be direct and immediate, and since the value of information is only realized in its use, a focus on definition of Level 2 objectives and measures is recommended.

Surrogate measures of the utility of the MIS in supporting organizational processes have been recommended for assessing system effectiveness, including user satisfaction [25, 30], information satisfaction [6], and system utilization [19].

2. Enlarge the range of performance being evaluated.

The intangible qualitative effects of information systems on organizational processes (Level 2 objectives) and organizational performance (Level 3 objectives) are often more significant for assessing system effectiveness. Since objectives and measures are typically efficiency-oriented and easily quantified, a need exists to enlarge the range of performance being evaluated to include Level 2 effectiveness-oriented objectives.

3. Recognize the dynamic nature of the MIS implementation process.

The implementation of an MIS is viewed as a planned organizational change which will modify a user’s work system to improve its functioning. This view explicitly emphasizes the importance of considering effects of technical change on the user organizational processes and the dynamic nature of MIS implementation. System effectiveness is explicitly conceptualized in terms of the achievement of objectives and the "institutionalization of change," e.g., ongoing user training, use of MIS services, and MIS support.

4. Account for differing evaluator viewpoints.

The establishment of realistic mutually agreed upon objectives and measures at the outset of system development is prescribed. Differing viewpoints need to be considered not only in initially establishing these objectives and measures, but also in assessing system effectiveness. The literature generally emphasizes the importance of the primary user viewpoint [19], but as Langefors [17] noted, "it is not enough that primary users are highly satisfied, since other relevant people may think that more important information should have been used" in evaluating system effectiveness.

Comparison of Evaluation Approaches

Several MIS evaluation approaches currently employed to assess system effectiveness can be compared by mapping them into the conceptual hierarchy. While many different approaches have been suggested [5], a survey of current practices indicates the following approaches are frequently employed in MIS organizations [14]. The scope of each evaluation approach is depicted in Figure 5 in terms of the objectives being evaluated and summarized below.

1. Quality Assurance Review

Quality assurance reviews, or technical reviews, focus on assessing the information system’s technical quality, e.g., comparison to standards and operations acceptance procedures. Technical reviews are performed by MIS development/operations personnel or a separate quality assurance group within the MIS function.

2. Compliance Audits

Compliance audits or application control reviews focus on assessing the adequacy and completeness of controls for system inputs, outputs, processing, security, and access. Compliance audits are typically performed by an autonomous internal audit function.

3. Budget Performance Review

Evaluations of MIS budget performance focus on compliance with a predetermined budget expenditure level for the MIS development or operations process. Evaluations of user budget performance focus on MIS resource consumption by the user. Both may be supported by a chargeback mechanism.

4. MIS Personnel Productivity Measurement

The production capability of MIS personnel is typically assessed in terms of productivity. Examples of productivity measures include lines of code per unit time for programmer (development) personnel and keystrokes per unit time for data entry (operations) personnel.

5. Computer Performance Evaluation

The production capability of the computer hardware is typically assessed in terms of performance efficiencies and bottlenecks that limit production. For example, computer performance evaluation measurements are made on percent uptime, actual throughput, and I-O channel utilization.

6. Service Level Monitoring

Service level monitoring focuses on assessing the information and support provided to the user based on the terms established between MIS and user personnel. Assessments of the information provided include turnaround times, response times, and error rates. Assessments of the support provided include the time required to respond to user problems and requests for changes.
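In practice, service level monitoring amounts to comparing measured values against the terms agreed between MIS and user personnel. In this sketch, the metric names and thresholds are hypothetical stand-ins for whatever terms the two parties have established:

```python
# Hypothetical agreed service levels (each value is an agreed maximum).
AGREED_LEVELS = {
    "batch_turnaround_hours": 4.0,
    "online_response_seconds": 3.0,
    "error_rate_percent": 1.0,
}

def check_service_levels(measured: dict) -> dict:
    """For each metric, report whether the measured value meets the agreed maximum."""
    return {name: measured[name] <= limit for name, limit in AGREED_LEVELS.items()}

status = check_service_levels({
    "batch_turnaround_hours": 3.5,   # met
    "online_response_seconds": 4.2,  # missed
    "error_rate_percent": 0.6,       # met
})
```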

7. User Attitude Survey

User attitude surveys, through questionnaires and/or interviews, focus on assessing the user's perceptions of the information and support provided by the MIS function. User attitude surveys typically assess such aspects as the quality of reports, timeliness, quality of service, and MIS-user communication.
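Tabulating such a survey is straightforward once responses are collected; the sketch below uses hypothetical five-point ratings for the aspects named above:

```python
from statistics import mean

# Hypothetical responses (1 = poor, 5 = excellent), four respondents per aspect.
responses = {
    "quality of reports": [4, 5, 3, 4],
    "timeliness": [2, 3, 2, 3],
    "quality of service": [4, 4, 5, 4],
    "MIS-user communication": [3, 4, 3, 3],
}

average_rating = {aspect: mean(scores) for aspect, scores in responses.items()}
weakest_aspect = min(average_rating, key=average_rating.get)  # lowest-rated aspect
```

The per-aspect averages point the MIS function toward the areas users perceive as weakest.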


[Figure 5 is a matrix mapping the means to measure accomplishment of objectives — Quality Assurance Review, Compliance Audit, Budget Performance Review, MIS Personnel Productivity Measurement, Computer Performance Evaluation, Service Level Monitoring, User Attitude Survey, Post Installation Review, and Cost/Benefit Analysis — against the hierarchy of objectives: Resource Investment, Production Capability, Resource Consumption, Information System, Information and Support Provided, Use Process and User Performance, and Organizational Performance.]

Figure 5. Comparison of Evaluation Approaches



8. Post Installation Review

The focus of a Post Installation Review (PIR) is often on assessing whether the system meets the requirements definition, i.e., "does the system do what it is designed to do?" However, the scope of the PIR may include a post hoc review of the development and operations processes, an assessment of the information and support provided, an analysis of the actual use process, and a cost/benefit analysis of the system effects on user performance.

9. Cost/Benefit Analysis

Cost/benefit analysis quantifies the system's effect on organizational performance in terms of dollars, e.g., direct cost savings and tangible financial benefits. Cost/benefit analysis is often used in capital budgeting to assess the return on investment.
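As a quantitative illustration of such an analysis, the sketch below computes a simple return on investment and payback period from hypothetical dollar figures; the article prescribes no particular formula, so both the function and the amounts are assumptions:

```python
def simple_roi(annual_benefits: float, annual_costs: float, investment: float) -> float:
    """Annual net benefit expressed as a fraction of the initial investment."""
    return (annual_benefits - annual_costs) / investment

investment = 400_000.0        # hypothetical one-time system development cost
annual_benefits = 250_000.0   # e.g., direct cost savings per year
annual_costs = 130_000.0      # e.g., annual operating costs

roi = simple_roi(annual_benefits, annual_costs, investment)
payback_years = investment / (annual_benefits - annual_costs)
```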

Evaluation approaches were previously categorized as being either summative or formative, and the two approaches can be compared using the conceptual hierarchy. Summative evaluation determines whether the system has achieved desired "outcomes" or "end result" objectives and focuses on assessing the accomplishment of Level 2 and Level 3 effectiveness-oriented objectives. Formative evaluation assesses the development "process" or "means" undertaken to accomplish objectives and focuses on efficiency-oriented objectives and Level 1 effectiveness-oriented objectives. Meals [21], O'Brien [24], and Varanelli [33] discuss the summative and formative approaches to information system evaluation.

The MIS evaluation approaches provide different means to measure accomplishment of system objectives. The means for measuring system effectiveness can be characterized as subjective or objective. Price notes that measurement of organizational effects can be based on subjective data (e.g., perceptions of individuals) or objective data (e.g., observable behavior) [26].

Utilizing the distinction between (1) summative and formative evaluation approaches, and (2) objective and subjective measures, various MIS evaluation approaches can be generally classified as shown in Figure 6. When observable behaviors are evaluated, the approach is categorized as an objective means to assess accomplishment of objectives. The user attitude survey is a formal approach to quantify subjective perceptions of system effectiveness. Informal approaches to obtaining perceptions of system effectiveness include day-to-day communication with users through (1) personal face-to-face discussions, (2) telephone calls, (3) group or committee meetings, and (4) written reports, letters, or memos, including system change requests.

When tangible benefits are difficult to measure for cost/benefit analysis, summative MIS evaluation approaches often focus on objective measures of the use process, e.g., on changes to the decision process. Information economics provides one technique to quantify the utility associated with system use [24], and utilization has also been suggested to measure changes in the decision process [32]. Rather than directly measuring system benefits, the users' perception of system benefits has been advocated for summative evaluation [10, 15]. The user perception of system adequacy obtained in user attitude surveys has also been advocated as both a formative and a summative evaluation approach. In most of the formative evaluation approaches, expert judgments are employed, e.g., in quality assurance reviews, compliance audits, and post installation reviews.

Conclusion

Discussions of information system effectiveness or system success, and the studies attempting to measure this construct, frequently focus on the question of what performance measure to use. Authorities in the field have advocated, or condemned, measuring changes in the value of information provided, changes in surrogate measures of user satisfaction, and changes in organizational performance. However, it is initially useful to consider several prior and perhaps more important questions:

1. What and whose purpose will theevaluation serve?




                                 Means to Measure Accomplishment of Objectives

Evaluation Approach              Objective                          Subjective
-------------------------------  ---------------------------------  --------------------------
Summative                        Cost/Benefit Analysis              User Attitude Survey
(focus on outcomes or ends)      Use Process Analysis                 (perception of system
                                   (utilization, information           benefits)
                                   economics)

Formative                        Quality Assurance Review           User Attitude Survey
(focus on process or means)      Compliance Audit                     (perceptions of system
                                 Budget Performance Review            effectiveness)
                                 MIS Personnel Productivity
                                   Measurement
                                 Computer Performance Evaluation
                                 Post Installation Review
                                 Service Level Monitoring

Figure 6. Classification of Evaluation Approaches

2. What are the task objectives of the system or organizational units utilizing the system?

Answers to the first question guide the nature and extent of evaluation approaches. Answers to the second question form the basis for assessing system effectiveness.

In evaluating information systems, a hierarchy of system objectives needs to be considered, as shown in Figure 1, that recognizes the multidimensional nature of system effectiveness. Performance measures to assess the accomplishment of objectives stem from the definition of task objectives, as illustrated in Figures 2 and 3.

Assessments of system effectiveness in meaningful terms are frequently hampered by system objectives and performance measures which have been inadequately defined, which tend to be efficiency oriented and easily quantified, and which continually evolve. In many cases, it may be necessary to derive appropriate objectives and measures, as illustrated in Figure 4 for a Material Requirements Planning system. The actual assessment of system effectiveness may employ a variety of evaluation approaches, as shown in Figure 5. The evaluation approaches may utilize objective or subjective means to measure accomplishment of objectives, as shown in Figure 6. The subjective assessments especially need to account for differing perceptions of system objectives, as well as of the accomplishment of objectives.

While the formal evaluation approaches may provide objective measures, informal approaches to gaining perceptions of system effectiveness are necessary and helpful for calibrating the credibility of information on MIS evaluation. As one MIS executive noted, "I talk to people in structured and unstructured situations up and down the line organization . . . to develop some kind of credibility check on the regular information flow" [13]. Management control of MIS will require multiple evaluation approaches to satisfy evaluative information requirements.

The subsequent article in the next issue of the MIS Quarterly will present a comparison of evaluator viewpoints and recommend ways to incorporate multiple viewpoints into evaluation approaches.




References

[1] Alter, S.L. "A Study of Computer Aided Decision Making in Organizations," Unpublished Ph.D. Dissertation, MIT, Cambridge, Massachusetts, 1975.

[2] Anderson, J.C., Schroeder, R.G., Tupy, S.E., and White, E.M. "MRP: A Study of Implementation and Practice," APICS Monograph, forthcoming 1981.

[3] Appleton, D. "A Manufacturing Systems Cookbook: Part 2," Datamation, Volume 25, Number 3, June 1979, pp. 132-140.

[4] Campbell, J.P. "On the Nature of Organization Effectiveness," New Perspectives on Organizational Effectiveness, J.P. Campbell, P. Goodman, and J.M. Pennings, eds., Jossey-Bass, Boston, Massachusetts, 1977, pp. 13-55.

[5] Carlson, E.D. "Evaluating the Impact of Information Systems," Management Informatics, Volume 3, March 1974, pp. 57-67.

[6] Cheney, H. "Measuring MIS Project Success," Proceedings of the 9th Annual AIDS Conference, Chicago, Illinois, October 25-27, 1977, pp. 141-143.

[7] Dickson, G.W. and Powers, R.F. "MIS Project Management: Myths, Opinions, and Reality," California Management Review, Volume 7, Spring 1973, pp. 147-156.

[8] Dolotta, T.A., Bernstein, M.I., Dickson, R.S., Jr., France, N.A., Rosenblatt, B.A., Smith, D.M., and Steel, T.B., Jr. Data Processing in 1980-1985, John Wiley and Sons, New York, New York, 1976.

[9] Ein-Dor, P. and Segev, E. "Strategic Planning for Management Information Systems," Management Science, Volume 24, Number 15, November 1978, pp. 1631-1641.

[10] Gallagher, C.A. "Perceptions of the Value of a Management Science Implementation," Academy of Management Journal, Volume 17, Number 1, March 1974, pp. 46-55.

[11] Ginzberg, M.J. "Improving MIS Project Selection," Research Paper #135A, Graduate School of Business, Columbia University, New York, New York, August 1978.

[12] Greenberg, H.D., Stewart, R.J.S., Hanes, L.F., Kriebel, C.H., and Debons, A. "Productivity Measurement Systems for Administrative Services: Computing and Information Services," Proceedings of the Grantee's Conference on Research on Productivity Measurement Systems for Administrative Services, Tempe, Arizona, November 15-16, 1976.

[13] Halbrecht, H.Z. Interview with R.E. McDonald, MIS Quarterly, Volume 1, Number 2, June 1977, pp. 7-11.

[14] Hamilton, S. "A Survey of Data Processing Post Installation Evaluation Practices," MISRC WP# 80-06, University of Minnesota, Minneapolis, Minnesota, February 1980.

[15] Keen, P.G.W. and Scott Morton, M.S. Decision Support Systems: An Organizational Perspective, Addison-Wesley, Reading, Massachusetts, 1978.

[16] Kriebel, C.H., Raviv, A., and Zia, H. "An Economics Approach to Modeling the Productivity of Information Systems," Technical Report NSF APR75-20546/76/TR2R, Carnegie-Mellon University, Pittsburgh, Pennsylvania, July 1977.

[17] Langefors, B. "Discussion of 'Determining Management Information Needs: A Comparison of Methods,'" MIS Quarterly, Volume 1, Number 4, December 1977.

[18] Lientz, B.P., Swanson, E.B., and Tompkins, G.E. "Characteristics of Application Software Maintenance," Communications of the ACM, Volume 21, Number 6, June 1978, pp. 466-471.

[19] Lucas, H.C., Jr. "Performance and the Use of an Information System," Management Science, Volume 21, Number 4, April 1975, pp. 908-919.

[20] McLean, E.R. and Soden, J. Strategic Planning for MIS, John Wiley and Sons, New York, New York, 1977.

[21] Meals, D.W. "Systems Evaluation," Journal of Systems Management, Volume 16, Number 7, July 1977, pp. 6-9.

[22] Molnar, J.J. and Rogers, D.L. "Organizational Effectiveness: An Empirical Comparison of the Goal and System Resource Approaches," The Sociological Quarterly, Volume 17, Number 2, Summer 1976, pp. 401-413.

[23] Norton, D.P. and Rau, K.G. A Guide to EDP Performance Management, QED Information Sciences, Wellesley, Massachusetts, 1978.

[24] O'Brien, J.F. "Methodology for Assessing the Impact of Computing and Information Systems on Users," Technical Report APR-20546/77/TR6, Westinghouse R&D Center, Pittsburgh, Pennsylvania, August 1977.

[25] Pearson, S.W. "Measurement of Computer User Satisfaction," Unpublished Ph.D. Dissertation, Arizona State University, Tempe, Arizona, 1977.

[26] Price, J.L. "The Study of Organizational Effectiveness," The Sociological Quarterly, Volume 13, Number 4, Winter 1972, pp. 3-15.

[27] Richardson, P.R. and Gordon, J.R.M. "Measuring Total Manufacturing Performance," Sloan Management Review, Volume 9, Number 2, Winter 1980, pp. 47-58.

[28] Scriven, M. "The Methodology of Evaluation: Formative and Summative Evaluation," Evaluating Action Programs, C.H. Weiss, ed., Allyn and Bacon, Boston, Massachusetts, 1972.

[29] Seashore, S.E. and Yuchtman, E. "Factorial Analysis of Organizational Performance," Administrative Science Quarterly, Volume 12, Number 2, December 1967, pp. 377-395.

[30] Seward, H.H. "Evaluating Information Systems," The Information Systems Handbook, F.W. McFarlan and R.L. Nolan, eds., Dow Jones Irwin, Homewood, Illinois, 1975, pp. 132-153.

[31] Smith, G.L., Jr. "Participatory Approaches to the Development of Productivity Measures," Proceedings of the 9th Annual SMIS Conference, September 1977, pp. 101-110.

[32] Stabell, C.B. "Individual Differences in Management Decision Making Processes: A Study of Conversational Computer Usage," Ph.D. Dissertation, MIT, Cambridge, Massachusetts, 1974.

[33] Varanelli, A. "An Organizational Approach to the Evaluation of Computing Center Effectiveness," Unpublished Paper, Department of Management, Pace University, New York, New York, 1978.

About the Authors

Scott Hamilton is a senior project manager at COMSERV Corporation, a producer of information systems for the manufacturing industry. He was previously an MIS manager and information analyst, and most recently was an instructor at the University of Minnesota's School of Management while completing his Ph.D. in MIS. Scott received a B.S. in industrial engineering from the University of Wisconsin and an MBA from Arizona State University. He has published articles in several journals, including Management Science and the MIS Quarterly, and his research interests include MIS planning and control, system design and implementation, and manufacturing information systems.

Norman L. Chervany is Professor and Chairman of the Management Sciences Department of the School of Management at the University of Minnesota. He is currently serving as President of the American Institute for Decision Sciences. He holds an MBA and DBA degree from Indiana University. He has consulted extensively for private business and state and local government. He has published articles in MIS Quarterly, Decision Sciences, Management Science, Journal of Financial and Quantitative Analysis, Interfaces, and The American Statistician. His research interests focus upon the issues affecting the successful implementation of information decision systems.
