Participants’ Case Studies • Class of 2000

PROMOTING OUTCOME-BASED MANAGEMENT SYSTEMS


Following a definition of outcome-based management, San Francisco’s interest in outcome and performance measurement is discussed. The history of outcome-based management in San Mateo County is also described, including the 1993 decision by the Human Services Agency to begin to develop outcome and performance measures for the Budget Book, tracked quarterly.

In 2000, the San Mateo Board of Supervisors linked performance measures to budget decision making by 2002–03. This will give the HSA the opportunity to recapture the meaning behind measures, to involve staff agency-wide, and to increase staff’s technical competence.

A number of future concerns are identified, including:

• Development of adequate data systems and data integrity

• Ability of decision-makers on the Board of Supervisors to interpret data correctly and to fully understand the measurement process

• Amount of time and resources for the County Manager’s Office to effectively implement the OBM process and make it work

• Effect on legally mandated agency services not included in the community’s vision

Implications for the San Francisco Department of Human Services are also discussed:

1. DHS should ensure the work and learning laid down by Strategic Planning groups be maintained and expanded upon for future outcome and performance measure development.

2. Initiatives developed in the Strategic Plan should include outcome and performance measures.

3. DHS should commit to using measures as a tool to improve program performance.

4. DHS should begin working with Program staff as soon as possible to get buy-in.

5. DHS should invest time and resources into developing quality training, data systems, and staffing to support ongoing performance measurement.

6. DHS must understand this process takes time.


PROMOTING OUTCOME-BASED MANAGEMENT SYSTEMS
Megan Rosenberg*

EXECUTIVE SUMMARY

*Megan Rosenberg is the Training and Operations Officer for the Adult Services Program of the City and County of San Francisco Department of Human Services.


TOWARDS MANAGING FOR RESULTS

One of the most challenging aspects of Public Human Service provision is defining and measuring a program’s success. The promotion of safe, self-sufficient, healthy communities cannot be quantified as easily as the maintenance of roads or the fighting of fires. Given the incredible complexities of the people we serve and the obstacles they face, we create service programs to combat serious social issues, hoping the services will have the intended effect.

This may not be enough anymore. The call for government accountability is getting stronger as we recognize the loss of public trust in government and the fact that we can no longer afford to maintain the status quo. To show community members, elected officials, administrators, and program staff that we have results, not just good intentions, we need to be able to answer critical questions such as: What kind of impact would we like our services to have on our client population? Do these services work? Why or why not? Are we succeeding in providing services in the most effective possible way? What can we do to improve programs so that they accomplish the desired goals? All too often, the information needed to answer these questions does not exist.

The concept of managing for results through outcome and performance measurement1 was developed to address these issues, and hinges upon the following key elements:

1. Agreement on a set of outcomes and indicators in terms of which the program will be assessed and managed.

2. Development of systems for assessing program performance in terms of the outcome objectives.

3. Use of program operations and outcome information to achieve improved program performance.

4. Communication of performance and results to policy levels and to the public.

The concept is not new, and has had several incarnations over the decades: Planning-Programming-Budgeting Systems in the 1960s, Zero-Based Budgeting in the 1970s, and Management by Objectives in the 1980s. Now called Outcome-Based Management (OBM), its theoretical benefits are manifold: it could increase accountability as well as results, clarifying the connection between government spending and purposes people understand; it could give us a concrete way to judge our successes and failures; it could provide a way for agencies to sustain progress on problems over time,


1 Common and consistent use of terminology is critical when discussing Outcome-Based Management. The following terms will be used throughout this paper:
Outcomes are the big-picture goals we want for our children, adults, families and communities. (aka Results; what we want)
Indicators are measures that tell us if we have achieved the outcomes we want. (How do we know we have achieved what we wanted?)
Performance Measures are measures that can be systematically tracked to assess if the elements of our strategies to achieve outcomes are performing as well as possible. (Proof on an operations level of progress toward predetermined goals)
Input Measures are measures of financial and non-financial resources that are applied in providing services.
Output Measures are measures of the quantity of services provided or the quantity of a service that meets a certain quality requirement.
Outcome Measures are measures of the results that occur, at least in part, because of the services provided.

instead of having to wave in the political wind; and it could promote the decentralization of control from State to local governments by providing a way for counties to prove they know how best to spend tax dollars.

In 1993 the Government Performance and Results Act (GPRA) was passed for federal government programs, which mandated strategic planning and performance measurement in the national government. Several states have followed suit, including Texas and Florida. As of yet the State of California has not adopted an OBM plan, but some California counties—including San Mateo and San Francisco—are taking the first steps toward using Outcome-Based Management.

SAN FRANCISCO’S INTEREST IN OUTCOME AND PERFORMANCE MEASUREMENT

Historically, San Francisco’s Department of Human Services (DHS) has developed and used a limited number of performance measures to meet the requirements of the Mayor’s Budget formats as well as for internal tracking purposes. Efforts on the part of the Office of Planning and Budget to develop more extensive and meaningful outcome and performance measures for internal use were met with concern and resistance from program staff, many of whom felt they should not be held accountable for things beyond their control. The efforts were set aside.

However, in January 2000, the Board of Supervisors passed the “San Francisco Performance and Review Ordinance of 1999”, containing much of the language used in the GPRA. The ordinance requires all City Departments to submit annual “Departmental Efficiency Plans”, which will include outcome and performance measurement components, by 2003. These plans may also include performance budgets, which present the varying levels of outcome-related performance that would result from different budgeted amounts for major functions and operations of the department. The ordinance will oblige DHS to embark upon a concerted effort to develop, track, and use measures at unprecedented levels.

Additionally, DHS is currently forming its first strategic plan, and wishes to incorporate performance measures into the process so the success of the plan’s initiatives may be evaluated.

Since the San Mateo County Human Services Agency (HSA) has been using outcome and performance measures in budgeting and strategic planning for some time and is currently working with its Board of Supervisors to link performance measurement to resource allocation, its experience provides an invaluable case study for San Francisco in how to develop and use outcome and performance measures for decision making.

HISTORY OF OUTCOME-BASED MANAGEMENT IN SAN MATEO COUNTY

In 1993, the San Mateo HSA—along with all other county departments—began developing extensive outcomes and performance measures for each budget unit, to be submitted with the following year’s Budget Book. These measures were developed by all levels of staff, led by the Agency’s Quality Management Coordinator. While there was no formal linkage of the measures to resource allocations at this time, the measures were tracked and reported to the County Manager’s Office (CMO) quarterly, in order to show the level of success each agency division was having in achieving its outcomes. The measures were also intended to be used as a primary management planning tool for decision making: a staff member from HSA Central Administration was to work with the Management Analysts and Program Managers in each division to review the data, monitor achievements, identify variables interfering with progress, and make recommendations for program improvements on a quarterly basis.

BASSC Executive Development Program

The process of developing outcomes and performance measures for the Budget Book had tangible positive effects: It afforded agency divisions an opportunity to reexamine and renew their mission statements, provided managers with a concrete oversight mechanism, and initiated the extensive task of educating staff about outcome and performance measurement.

As time progressed, however, obstacles impeding the HSA’s ability to use the measures to effectively improve service provision became apparent. First of all, many measures were developed in rushed circumstances, and are now difficult or impossible to track because no systems are in place to provide the necessary data. There is a lack of a data management infrastructure sufficient to support the requirements of extensive ongoing performance measurement. Databases have been created without recording their contents or the methodologies used to update and query the data, so that staff turnover has led to the loss of the ability to interpret historical data and gather new data. This causes confusion about how the measures are calculated and how the data is generated year after year.

Secondly, at the time the measures were developed, Central Administration had to insist upon the process due to program staff resistance, and many program staff have not bought in to outcome and performance measurement. Intensifying the lack of buy-in is the fact that performance measures are not always program generated but are often mandated by Administration and the CMO, according to political currents. Consequently, many program staff do not feel the measures are valid or an effective way to show success, do not place priority on getting data to Central Administration coordinators on time, and do not find meaning in the numbers they generate. Clearly, the link between meaning and the measures has to be reestablished if the HSA wants to use outcome and performance measures as an effective tool for planning and program improvement.

In January 2000, an opportunity to reestablish that link presented itself: Leaping ahead of most State and County governments, members of the San Mateo County Board of Supervisors, following the national trend towards OBM, decided to use outcome and performance measures as an aid to make “apolitical” resource allocation decisions. They decided to take outcome and performance measurement one step further, initiating a process that will lead to formal linkage between performance measures and budget decision making for all county agencies by 2002. Alcohol and Other Drug Services (AOD) was selected as the HSA’s OBM Phase 1 Pilot Program for FY 00-01. OBM is planned to go County-wide and Agency-wide over the course of two to three years.

The formal linkage between outcome and performance measurement and resource allocations places unprecedented weight upon the quality of the measures’ development, as well as on the correct interpretation of the data. Because funding will now be riding on results, usage of the measures to improve service provision becomes essential. The HSA is seizing this opportunity to recapture the meanings behind measures, to more thoroughly


involve and educate staff agency-wide in outcome and performance measurement, and to improve staff’s technical competence in generating and gathering data as well as in using it to more effectively provide services.

USING A RESULTS-ACCOUNTABILITY FRAMEWORK ON DIFFERENT LEVELS IN SAN MATEO

An important concept to keep in mind about OBM—or managing through a results-accountability framework—is that the framework can be used at many levels; it can be used on a macro level for large, all-encompassing community outcomes, as well as on a micro level, to show progress towards more specific program outcomes. The San Mateo HSA has several planning projects happening simultaneously, all of which use a results-accountability framework. It is critical that the different levels of outcomes be aligned for all these projects.

Big Picture Community Goals

An excellent example of measuring outcomes on a macro level is Children in Our Community: A Report on Their Health and Well-Being, a project managed by HSA in partnership with community-based organizations and other county agencies. The report was released in January 2000, identifying six outcomes the community must strive to achieve (e.g. Children are safe, Children are healthy, etc.) and presenting indicators showing “How We Are Doing” in relation to the outcomes. Because the outcomes go beyond the scope of what the HSA can accomplish on its own, the development of such outcomes acts to highlight connections that exist between different agencies, and therefore should lead to increased coordination.

Strategic Planning for the HSA and its Service Providers

The HSA’s “Year 2000 Strategic Plan for San Mateo Human Service Providers” is another example of how the Agency partners with other organizations to achieve agreed-upon outcomes. This Strategic Plan is not so much a guide for internal operations as it is an overarching community plan emphasizing collaboration. The plan reflects the results-accountability framework, and identifies three major community outcomes (Adults in San Mateo are Self-Sufficient; San Mateo County Families are Strong and Able to Support all Family Members’ Growth and Development; and San Mateo County is a Healthy Community), which are all aligned with the identified outcomes of the County at large as well as the Children’s Report. The plan also points to indicators, strategic directions, and measurable action steps for each outcome. The HSA will not be solely responsible for tracking the indicators and action step measures; the Strategic Plan suggests that the HSA and its community partners take on that responsibility.

Outcome-Based Management and Budgeting

As mentioned above, the HSA is in the process of implementing its OBM pilot project with AOD services. This process takes the results-accountability framework to the micro level—linking performance measures to specific program operations—and its outcomes are still linked to the Strategic Plan and Children’s Report outcomes as well.

OBM IMPLEMENTATION PROCESS AT THE SAN MATEO HSA

A second important concept to keep in mind about OBM is the fact that it takes large amounts of time,


effort, and commitment to implement successfully. The HSA has invested in a high-quality implementation process, which includes complex phases of Preparation, Work, Checking, and Tracking.

In the “Preparation” phase, the majority of the internal implementation planning was accomplished. The AOD program was selected as the pilot, largely because it is a separate budget unit and its services are provided by contracted agencies. An OBM steering committee was formed, comprised of key executive staff, the Research Manager, the AOD Contract Director, and AOD staff. To ensure that staff at all levels had an adequate understanding of OBM and how to develop and use outcome and performance measures, the steering committee worked with consultants selected by the County to plan staff training. An AOD Provider Advisory Committee was formed to give the Steering Committee feedback on the OBM process.

The “Work” phase entailed the HSA Research Manager leading the AOD Program staff through an educational brainstorming procedure set up by the consultants, with the goal of completing a series of OBM templates for the CMO (see Attachment A). The AOD program staff came up with an Outcome Statement, Values, Vision, and Priorities for the program. They also identified internal Strengths and Limitations, as well as External Opportunities and Threats (the SLOT assessment). Finally, they worked to develop Program Performance Measures including Input Measures, Efficiency and Service Quality Measures, Output Measures, and Outcome Measures.

After the AOD Program staff completed the templates, the entire procedure was repeated with the AOD Service Providers, who revised and added to the work the AOD Program staff had done. The final templates handed in to the CMO are the result of a true collaborative effort on the part of AOD Providers, AOD program staff, and the HSA Research Manager.

Most of the measures developed will be tracked for internal improvement purposes. However, selected “Headline” Measures will be included in the budget. Once a baseline is established, these measures will be given targets and worked into the budget on a separate template. Another template, the “Story Behind Our Performance”, will delineate factors affecting performance. Also included will be “What We Will Be Doing to Improve Performance Over the Next Two Years,” a list of action steps to address priority issues.

With the OBM templates completed, the “Checking” phase began, in which the Research Manager and AOD staff ascertained data system capability for each of the measures. They also checked on program operations to make sure they were aligned with the measures and results. When they were sure the measures, data systems, and operations were all linked, the OBM Measures and Budget were turned in to the CMO.

The last phase in OBM implementation is “Tracking”, in which systems are developed to track the measures, to work with providers and provide them with technical assistance, and to address follow-up issues. This is where the agency will move beyond simply using the measures as an oversight mechanism to using them as a planning tool to improve service provision. The HSA has worked very hard to get to this phase, and the success of its OBM effort hinges upon taking this next step.


FUTURE CONCERNS

OBM in San Mateo remains a work in progress, and the HSA staff continue to have concerns about what the new management system will mean for the Agency. A major concern continues to be the development of adequate data systems and the maintenance of data integrity. For data to be accurate and honest, the agency will need to be even more clear about the responsibilities and roles of staff in performance measurement. Not only does the Agency need a data system capable of recording the data, but it will need to have staff trained in how to use the system to report data, who have bought in to the measurement process and who have the time to use the system’s reporting function. The Agency will also need to have program staff who are responsible for maintaining data and performing analysis: staff who are knowledgeable about how to keep data warehoused and refreshed, and what is needed to get the information they are asked for.

Management must also be responsible for data integrity. They must be committed to building the data management infrastructure necessary for ongoing performance measurement, and remain committed to the OBM process despite policy changes and service innovations. Management must also resist the temptation to change measures in an effort to reflect positive results.

A second concern about OBM is the ability of decision-makers on the Board of Supervisors to interpret data correctly and to fully understand the performance measurement process. Human Service programs are necessarily complex, and what looks like poor results for one program may in fact be success, according to population or modality. Will the BOS be able to understand these distinctions? HSA staff are concerned about how the decision-makers will use the information based on their interpretations.

A third concern is whether the County Manager’s Office will allow agencies enough time and resources to effectively implement the OBM process. If Outcome-Based Budgeting is done for every separate budget unit, the amount of paper necessary to complete the budget will be staggering. The County Budget as a whole could take up an entire bookshelf. The process is also expensive in agency and provider staff time, and it is unclear who will pay for it. Time needs to be allowed to evaluate the pilot program’s process before OBM goes Agency- and County-wide.

Other concerns include what effect OBM will have on legally mandated Agency services which have not been included in the Community’s vision, and the fact that no one knows yet how OBM will really work in the HSA.

IMPLICATIONS FOR THE SAN FRANCISCO DEPARTMENT OF HUMAN SERVICES

San Mateo County and the City and County of San Francisco have many differences in terms of size, population, and government structure. San Mateo County has already come up with its own set of County-wide outcomes and is poised to link performance measures to resource allocation, whereas San Francisco has not developed explicit City-wide outcomes, and has just begun to consider having its departments measure performance on an extensive level. However, because the San Francisco Performance and Review Ordinance of 1999 will require DHS to implement Outcome and Performance Measures similar to those of San Mateo by 2003, the San Francisco DHS can learn much from the HSA’s example.


Through the current Strategic Planning Process, DHS is re-examining and redefining its mission, a critical step in developing outcomes. DHS should ensure the work and learning laid down by the Strategic Planning groups be maintained and expanded upon for future outcome and performance measure development. Initiatives developed in the Strategic Plan should include outcomes and performance measures so that their success can be shown in a concrete fashion.

To benefit from performance measurement, DHS executive staff will need from the outset to commit to treating measures as a tool to improve program performance, not simply as an oversight mechanism. They should also begin working as soon as possible to get buy-in from Program Managers and line staff on the Outcome and Performance Measurement process, so they know what is in store for them, are able to see the benefits of the process, and have time to adjust to a new way of doing business.

As the San Mateo HSA’s experience illustrates, when the time comes to develop the measures for the efficiency plan, it is critical that DHS invest time and resources into developing quality training, data systems, and staffing to support ongoing performance measurement. Quick fixes do not apply to the realm of OBM.

Most of all, DHS needs to remember that the movement towards Outcome-Based Management takes time: time for development, time to establish a baseline, time for tracking, time for understanding. It is the investment of time that is necessary to make OBM a reality, and not just another management fad.


Attachment A
