
Measurement and Data Use for Community Health Services

Steve Hodgins, Dena Javadi, and Henry Perry

30 September 2013


Key Points

• Routine measurement and data use for community health services are poorly developed in most countries.

• Such information systems, if well-developed and used, can strengthen community health services, including the performance of community health workers (CHWs).

• Sometimes additional information will need to be collected beyond what is normally reported on routine monthly report forms.

• Descriptions of some of the findings from monitoring and evaluation of large-scale CHW programs in Pakistan and India are provided.


INTRODUCTION

Measurement for a program is analogous to the senses for the human body: it is an essential function, required for program effectiveness. It tells us what is happening, alerting people to areas that need attention.

Data are used at different points in the development and implementation of program activities. They are used to define or characterize a problem that may call for some new action. For example, we may be relying on health facility-based care for immunization. Reviewing routinely generated data on service volume, we may determine that population coverage is very low. That could prompt decisions to develop new modes of service delivery that more adequately meet the needs of the population (possibly including a new use of CHWs). Data from special studies or periodic population surveys can also help us better understand problems that may need new approaches.

Another important use of data is evaluation. Typically, formal evaluations are a donor requirement associated with major program initiatives. Most often they are done as one-offs, sometimes finding very disappointing performance and resulting in decisions to make major changes in direction. Related to evaluations is the measurement and analytic work associated with pilot activities. From such findings, decisions can be made about whether or not to proceed with scale-up and about which aspects of a demonstration activity need to be modified.

Finally, returning to the analogy with the senses and the human body, measurement can serve a critical function by informing, on an ongoing basis, what is happening and where adjustments need to be made. As important as the other uses of measurement are, this chapter focuses primarily on the ongoing collection and use of data related to community health services for the purpose of continually improving performance and impact. This process requires appropriate documentation and information management systems, and it requires equipping CHWs and other health workers with the skills for appropriate documentation, data management, and data use to improve services.
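To make the coverage example above concrete, the following is a minimal sketch of the kind of calculation involved. The dose counts, catchment population, and infant share are hypothetical, not figures from any actual program.

```python
# Minimal illustrative sketch (all figures are hypothetical): using routinely
# reported service-volume data to estimate immunization coverage, the kind of
# calculation that can flag a coverage problem and prompt a change in strategy.

# DPT3 doses reported on monthly facility report forms over one year
monthly_dpt3_doses = [38, 41, 35, 29, 44, 40, 37, 33, 39, 42, 36, 31]

# Assumed catchment population and share of infants in the target age group
catchment_population = 25_000
infant_share = 0.035  # assumption: roughly 3.5% of the population are infants
target_infants = round(catchment_population * infant_share)

annual_doses = sum(monthly_dpt3_doses)
coverage = annual_doses / target_infants

print(f"DPT3 doses reported this year: {annual_doses}")
print(f"Estimated target infants:      {target_infants}")
print(f"Estimated DPT3 coverage:       {coverage:.0%}")
# A result of roughly 51% would support a decision to add outreach or CHW delivery.
```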

KEY QUESTIONS

• What does monitoring and evaluation consist of?

• What are the steps in developing a monitoring and evaluation system?

• What are methods for routine monitoring and performance management?

• When is routine measurement not enough?

WHAT DOES MONITORING AND EVALUATION CONSIST OF?

On this ongoing function, Lant Pritchett et al.1 supplement the conventional concepts of “monitoring” and “evaluation” with the idea of “structured experiential learning,” adding an additional “e” to M&E to get MeE. They point out that, typically, (1) “evaluation” is done infrequently and by some external entity, and (2) ongoing “monitoring,” of the usual kind, is done as an administrative reporting function. What is needed is more than this; they call for “structured experiential learning,” by which they mean rigorous, real-time tracking of important aspects of program performance by implementers, with tight feedback loops and continuous attention to addressing performance problems.

In the case of many tasks or program activities conducted by CHWs, even routine (administrative) monitoring may be missing, because health management information systems (HMISs) commonly fail to capture services provided more peripherally than at the health facility level. The fact that a function gets measured and reported does not necessarily mean that it will get meaningful attention to ensure performance, but if it is not monitored at all there is little likelihood of effective performance management.


This need for monitoring is true of all services, but it is particularly important for community health services, which often are not adequately monitored. Too frequently, they are also scaled up and implemented over long periods without being subject to rigorous evaluation. Unfortunately, there have been few examples of rigorous, large-scale evaluations of community-based services.*

Some community-based services are frequently captured in routine monitoring or health information systems. For example, in many settings outreach strategies and the use of health auxiliaries are important for the delivery of immunization services. Where this is the case, these services are generally reflected in routine health information systems, though not necessarily disaggregated from health facility-based provision. For many other programs, however, services provided by CHWs are frequently not captured at all. For example, CHWs may be depot holders for oral rehydration solution (ORS), condoms, or oral contraceptives, yet distribution of these commodities by CHWs may not be captured in the HMIS. Health education and community mobilization functions are typically not captured at all or, if documented and reported, the information may not be interpretable. For example, a report on the number of community education sessions conducted per month by a CHW does not provide any useful information about the quality of the sessions, the number of attendees, and so forth.

There is a general principle that what gets measured gets attention. Health facility and program managers at all levels are only empowered to actively and effectively manage the performance of their services and programs if they have a good idea of how things are actually going. That requires selection of meaningful indicators, appropriate ongoing measurement, review of the collected information, and actions taken in response to the information collected. So, although we acknowledge the importance of traditional “monitoring” and “evaluation,” what is particularly important here is the “e” in MeE, i.e., what Pritchett et al. refer to as “experiential learning”: ongoing monitoring and measurement to track and manage services so as to improve performance. As we have seen, immunization is a program area that, in many settings, is managed in a way that can serve as a model for other areas of community-level service delivery.

WHAT ARE THE STEPS IN DEVELOPING A MONITORING AND EVALUATION SYSTEM?

If a program activity is judged to be important, managers need to have a good idea of how it is actually doing. Frequent informal checking can certainly help a lot, but supplementing it with more systematic monitoring, using indicators and information systems, is generally necessary. There are several common problems, calling for specific responses, as shown in Figure 1.

* A notable exception is the multi-country evaluation of community health services by Bryce et al.2


Figure 1. Steps in developing a functional information system for community health services

WHAT ARE METHODS FOR ROUTINE MONITORING AND PERFORMANCE MANAGEMENT?

Virtually all primary health care services have routine health information systems, consisting of registers, forms, and reports. They may also include standardized case records, patient-held cards or records, and more specialized information sub-systems, for example, for health facility-level supply chain management. Some systems have provision for capturing services delivered at outreach sessions or at the household level, with dedicated registers or forms used at that level. In most instances, however, there are no institutionalized provisions for processing and using the information collected through these various tools, other than extracting certain items for submission in monthly or quarterly reports.

Although there may be integrated information systems that consolidate across all or some programs or services at the primary health care level, the more usual situation is a multiplicity of documentation tools associated with different programs. This multiplicity imposes a documentation and reporting burden on health workers and CHWs (to the extent that they, too, are obliged to record such information). It also contributes to problems of data quality and completeness and further reduces the likelihood of active data use for quality improvement.

There are other possible data sources for routine monitoring and performance management. These can include documentation arising from supervisory contacts, individual patient records, and material generated from institutionalized death audit processes.
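As a purely illustrative sketch of the routine extraction step described above, the following shows how a few items might be tallied from a simplified CHW household-visit register into a monthly report. The register fields and report items are hypothetical, not taken from any particular country's forms.

```python
# Illustrative sketch only (hypothetical register fields and report items):
# extracting a few items from a CHW household-visit register for a monthly report.

from datetime import date

# Simplified register: one row per household visit recorded by a CHW
register = [
    {"date": date(2013, 9, 3),  "ors_given": 1, "newborn_visit": False},
    {"date": date(2013, 9, 12), "ors_given": 0, "newborn_visit": True},
    {"date": date(2013, 9, 25), "ors_given": 2, "newborn_visit": False},
    {"date": date(2013, 10, 2), "ors_given": 1, "newborn_visit": True},
]

def monthly_report(register, year, month):
    """Tally selected register items for one reporting month."""
    rows = [r for r in register if r["date"].year == year and r["date"].month == month]
    return {
        "households_visited": len(rows),
        "ors_sachets_distributed": sum(r["ors_given"] for r in rows),
        "newborns_visited": sum(r["newborn_visit"] for r in rows),
    }

print(monthly_report(register, 2013, 9))
# {'households_visited': 3, 'ors_sachets_distributed': 3, 'newborns_visited': 1}
```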

WHEN IS ROUTINE MEASUREMENT NOT ENOUGH?

As already discussed, measurement related to health program performance has several important functions. As a basis for developing strategy, understanding the current situation helps in prioritizing, directing design decisions, and determining the resources needed.


But as services are delivered over time, there are also important dimensions of performance, and of the drivers or determinants of performance, that cannot be readily measured through the means normally available for routine monitoring. For example, an important target of many CHW programs is change in specific household practices. If part of a CHW’s role is to promote exclusive breastfeeding at the household and community levels, the most important measure of effectiveness is what is actually happening with breastfeeding rates. Normally, that cannot be measured in any way other than by a representative household-level survey of the whole population. Similarly, if an important focus of CHW work is to promote appropriate care seeking for danger signs, we will not be in a position to properly measure this practice based only on public sector service delivery statistics.

There can also be important drivers or determinants of performance that program managers need to judge accurately. For example, the morale or motivation of health workers (including CHWs) can be a very important factor influencing their performance, but routine monitoring tools do not provide any insight into such factors. For important aspects of program performance that do not lend themselves to routine measurement, periodic surveys and special studies can provide valuable information, although normally we have to be satisfied with getting this information far less frequently than we would like. Certain CHW programs, such as Pakistan’s lady health worker (LHW) program and Nepal’s female community health volunteer (FCHV) program, have had the benefit of periodic surveys looking specifically at those programs. These surveys have provided invaluable information on what the CHWs are doing and on factors influencing performance.

EXAMPLES OF MEASUREMENT AND DATA USE IN SPECIFIC CHW PROGRAMS

Two recent examples of evaluations of large-scale CHW programs are of Pakistan’s LHW program (established in 1994, with 100,000 LHWs at present) and India’s accredited social health activist (ASHA) program (established in 2006, with 820,000 ASHAs at present).

Box 1. An example from India of regional-level monitoring

In India’s ASHA program, ASHA supervisors monitor performance and report on it. The reports on ASHA functionality involve recording whether ASHAs are completing such tasks as (1) visiting newborns within the first day (for newborns born at home), (2) attending immunization camps, (3) visiting households to discuss nutrition, and (4) acting as directly observed treatment, short-course (DOTS) providers (directly administering TB medication).3 These reports are submitted to the block community mobilizer on a monthly basis and assessed quarterly to determine what percentage of ASHAs are functional. The results are then submitted to the district coordinator, who grades each block in the district on ASHA functionality. Finally, the monitoring data are consolidated at the state level, and each district is graded.

Similar to the ASHA example, monitoring systems for anganwadi workers (AWWs) in India start at the village level; data flow to the child development project officer at the block level and finally to the Ministry of Women and Child Development at the state and national levels. Data are reviewed at each level and applied in decision-making. The monitoring and information system is currently being updated and revised so that each state adheres to a common national standard. The revision process involves creating online monitoring software for use at the block level, new information registers, and new reporting formats from the village to the state level.4
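As an illustration only, and not a representation of the program’s actual software, the quarterly roll-up of ASHA functionality described in Box 1 might be sketched as follows. The functionality rule (at least 6 of 10 tracked tasks) and the block grading cut-offs are assumptions made for the example.

```python
# Illustrative sketch only, not the program's actual system: rolling up
# hypothetical monthly ASHA task reports into a quarterly percent-functional
# figure per block and grading each block. The functionality rule and grade
# cut-offs below are assumptions.

from collections import defaultdict

# Each record: (block, ASHA id, number of tracked tasks completed that month)
monthly_reports = [
    ("Block A", "ASHA-001", 8), ("Block A", "ASHA-002", 4),
    ("Block A", "ASHA-001", 7), ("Block A", "ASHA-002", 5),
    ("Block B", "ASHA-101", 9), ("Block B", "ASHA-102", 9),
]

FUNCTIONAL_THRESHOLD = 6  # assumed cut-off for counting an ASHA as functional

def percent_functional_by_block(reports):
    """Average each ASHA's monthly task counts over the quarter, then compute
    the share of ASHAs in each block meeting the functionality threshold."""
    per_asha = defaultdict(list)
    for block, asha_id, tasks_done in reports:
        per_asha[(block, asha_id)].append(tasks_done)

    counts = defaultdict(lambda: {"functional": 0, "total": 0})
    for (block, _asha_id), task_counts in per_asha.items():
        quarterly_avg = sum(task_counts) / len(task_counts)
        counts[block]["total"] += 1
        if quarterly_avg >= FUNCTIONAL_THRESHOLD:
            counts[block]["functional"] += 1
    return {b: 100 * c["functional"] / c["total"] for b, c in counts.items()}

def grade(pct):
    """Assumed grading scale, for illustration only."""
    return "A" if pct >= 80 else ("B" if pct >= 50 else "C")

for block, pct in sorted(percent_functional_by_block(monthly_reports).items()):
    print(f"{block}: {pct:.0f}% of ASHAs functional, grade {grade(pct)}")
# Block A: 50% of ASHAs functional, grade B
# Block B: 100% of ASHAs functional, grade A
```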

In 2009, Oxford Policy Management conducted an external evaluation of the LHW program in Pakistan,5-9 covering the period from 2003 to 2008.


It included a survey of a nationally representative sample of households and a similarly representative sample of LHWs. Separate interviews were conducted with LHW supervisors, selected medical staff, and community groups. The goals of the evaluation were to examine the level of performance of LHWs and its determinants and to measure the quality and coverage of services, including coverage among the poor. In addition, reviews of management, organizational systems, program expenditures, and unit costs were carried out, resulting in 11 reports, including reports for each province of the country.

Among the many findings of the evaluation was that the LHW program had effectively managed its expansion from 70,000 to nearly 100,000 LHWs without undermining its impact, although serious problems were still encountered with supplies, equipment, and clinical referral services. The evaluation also identified that 25% of LHWs exhibited low levels of service, including working outside of their catchment areas for other organizations and charging for services (which is prohibited). In addition, it found that high turnover in management positions impeded program performance. Coverage in the most disadvantaged areas of the country is still incomplete, so the program will need to expand further.

In India, in 2009 and 2010, the National Health Systems Resource Centre, a technical support institution affiliated with the National Rural Health Mission, conducted an assessment to determine what components of the ASHA program work, where, under what circumstances, and to what extent.3 The evaluation was done in three phases: Phase 1 was a qualitative study and a review of secondary information; Phase 2 was a structured questionnaire administered to a sample of stakeholders in two districts of each of the eight states selected for inclusion in the evaluation (in each district, 100 ASHA workers, 600 beneficiaries, 25 auxiliary nurse midwives [ANMs], 100 AWWs, and 100 community leaders representing the Panchayati Raj institutions). The resulting 207-page report included findings and recommendations specific to each state. The overall recommendations arising from the evaluation included, among others, the following:

• Improve ASHA worker skills in counseling and interpersonal behavior change in areas related to nutrition, care in pregnancy (including recognition of maternal complications and referral), home-based care of the newborn, prevention and management of illness in the young child, and prevention of communicable diseases and promotion of good health practices.

• Improve the quality of supervisory support and improve the drug kit refill process.

• Reinforce efforts to ensure that every household in the village is reached by an ASHA worker.

• Focus on advocacy for the program so that policymakers and others in the health system understand the potential impact of the program on saving maternal, newborn, and child lives.

• Establish a system to monitor ASHA functionality at the block, district, and state levels.

• Clarify and build synergy among the overlapping roles of AWWs, ANMs, and ASHA workers.

• Establish improved processes for replacement of ASHA workers and create opportunities for career advancement for qualified ASHA workers. This process will require competency-based training and certification of ASHA workers.

• Promote ASHA worker motivation by achieving a good balance between appropriate remuneration and a spirit of volunteerism and desire to benefit the community.†

† The report’s statement was: “We need to forge a way forward that builds on the ASHA’s own reiterations of community service, as being her main motivating element and the concept of volunteerism and activism. At the same time we also need to ensure that we do not become exploitative of her service, and that we respect the need to value her service and compensate her adequately for her time.” [p. 127]


CONCLUSIONS

Measurement and use of data at the local level can strengthen community health services and the performance of CHW programs. In addition, well-designed national CHW program evaluations conducted at 5- to 10-year intervals can serve to guide national program strengthening. For CHW programs to remain relevant and effective, and to maintain the political and governmental support needed for their long-term sustainability, well-developed monitoring and evaluation activities will be essential.


Useful Resources

Segone M, ed. Country-led Monitoring and Evaluation Systems: Better Evidence, Better Policies, Better Development Results. UNICEF Regional Office for CEE/CIS. Available online at: http://www.ceecis.org/remf/Country-ledMEsystems.pdf.

Lippeveld T, Sauerborn R, and Bodart C. 2000. Design and Implementation of Health Information Systems. World Health Organization: Geneva, Switzerland.


References

1. Pritchett L, Samji S, and Hammer J. 2013. It's All About MeE: Using Structured Experiential Learning ("e") to Crawl the Design Space. Harvard University, John F. Kennedy School of Government: Cambridge, MA. Available online at: http://ideas.repec.org/p/ecl/harjfk/rwp13-012.html.

2. Bryce J et al. 2010. The Accelerated Child Survival and Development programme in west Africa: a retrospective evaluation. Lancet 375(9714): 572-82.

3. National Health Systems Resource Centre. 2011. ASHA: Which Way Forward? Evaluation of ASHA Programme. New Delhi, India: National Rural Health Mission, National Health Systems Resource Centre. Available online at: http://nhsrcindia.org/download.php?downloadname=pdf_files/resources_thematic/Community_Participation/NHSRC_Contribution/ASHA_Which_way_forward_-_Evalaution_of_ASHA_Programme_Report_NHSRC_417.pdf.

4. Ministry of Women and Child Development, Government of India. 2012. Integrated Child Development Services (ICDS) Scheme. Accessed on August 15, 2013 at: http://wcd.nic.in/icds.htm.

5. Oxford Policy Management. 2009. External Evaluation of the National Programme for Family Planning and Primary Health Care: Lady Health Worker Programme. Lady Health Worker Study of Socio-Economic Benefits and Experiences. Canadian International Development Agency. Available online at: http://www.opml.co.uk/sites/opml/files/LHW_Qualitative%20Report_1.pdf.

6. Oxford Policy Management. 2009. External Evaluation of the National Programme for Family Planning and Primary Health Care: Lady Health Worker Programme. Systems Review. Canadian International Development Agency. Available online at: http://www.opml.co.uk/sites/opml/files/LHW_%20Systems%20Review_1.pdf.

7. Oxford Policy Management. 2009. External Evaluation of the National Programme for Family Planning and Primary Health Care: Lady Health Worker Programme. Summary of Results. Canadian International Development Agency. Available online at: http://www.opml.co.uk/sites/opml/files/Lady%20Health%20Worker%20Programme%20-%204th%20Evaluation%20-%20Summary%20of%20Results_0.pdf.

8. Oxford Policy Management. 2009. Fourth External Evaluation for the National Programme for Family Planning and Primary Health Care: Lady Health Worker Programme. Quantitative Survey Report. Canadian International Development Agency. Available online at: http://www.opml.co.uk/sites/opml/files/LHW%20Quantitative%20Report_0.pdf.

9. Oxford Policy Management. 2009. External Evaluation of the National Programme for Family Planning and Primary Health Care: Lady Health Worker Programme. Management Review. Canadian International Development Agency. Available online at: http://www.opml.co.uk/sites/opml/files/LHW_%20Management%20Review_1.pdf.
