Measuring the impact of communication: The case of the EPRC M&E Tool

Measuring Communications Impact at EPRC


DESCRIPTION

Presenter: Elizabeth Birabwa. Podcast: http://bit.ly/1jUBny1. Elizabeth Birabwa, programme manager at the Economic Policy Research Centre (EPRC) in Kampala, Uganda, presents EPRC's tools for measuring communications. Elizabeth has over 14 years of experience in advocacy, communications, media relations and information management. She holds a bachelor's degree in Mass Communication from Makerere University and a master's degree in Library and Information…


Page 1: Measuring Communications Impact at EPRC

Measuring the impact of communication: The case of the EPRC M&E Tool


Page 2: Measuring Communications Impact at EPRC


Outline
- About EPRC
- Results Framework at EPRC
- Why Measure Communication
- What we Measure
- How we Measure
- Challenges of Impact Measurement
- Lessons Learned

Page 3: Measuring Communications Impact at EPRC


About EPRC
- EPRC is a Uganda-based policy think tank
- Mission: to foster sustainable growth and development of the Ugandan economy by advancing the role of research in policy processes
- Conducts research and policy analysis, provides policy advice, and engages in policy outreach and engagement
- Has a four (4) year strategic plan that guides its programmes, and
- a Policy Engagement & Communication (PEC) Strategy

Page 4: Measuring Communications Impact at EPRC


EPRC Results Management Framework
- To improve performance and implementation of the Strategic Plan, EPRC adopted a results-management culture that focuses on outcomes and impact
- EPRC has developed an in-house M&E data collection tool for tracking and reporting on results
- The M&E function and tool are administered by the Information and Dissemination Unit
- Indicators for measuring communication are integrated in the tool

Page 5: Measuring Communications Impact at EPRC


Why Measure Communication?
To ensure that our information products and services remain of the highest quality and reach our target audiences in the most effective manner.

Specifically, to:
- Have well-crafted information products and services
- Improve management of product development, production and distribution
- Ensure we are reaching the intended audiences, in the right way and at the right time
- Assess the effect or impact of our products and services
- Increase use of the products and services and, in turn, improve the uptake of research in policy processes

Page 6: Measuring Communications Impact at EPRC


What we Measure
- Reach: the extent to which information is distributed, redistributed and referred to, i.e. breadth and saturation
- Usefulness: the quality of products and services, i.e. whether they are appropriate, applicable and practical
- Use: what is done with the knowledge gained from our information products and/or services

Page 7: Measuring Communications Impact at EPRC


How we Measure
- A standard for monitoring whether products and services meet the requirements that make them effective and useful to policy makers
- A monitoring tool used to monitor outputs and outcomes
- Ten communication-related indicators incorporated within the tool
- For each indicator, we rely on specific tactics for collecting information, as discussed on the following slides (a minimal sketch of how such records might be kept follows below)
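
The slides do not show the internals of EPRC's in-house M&E tool, so the following is only a minimal sketch of how per-indicator monitoring records of the kind described on the next slides might be kept. All names and fields here are assumptions for illustration, not the actual tool.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record types; the actual EPRC tool's structure is not shown in the slides.
@dataclass
class Indicator:
    number: int          # 1-10, as listed on the following slides
    name: str            # e.g. "Copies of a product distributed to existing lists"
    data_captured: str   # what is recorded for this indicator
    data_source: str     # e.g. "EPRC contacts database", "web server log files"

@dataclass
class Observation:
    indicator: Indicator
    when: date
    value: float         # e.g. number of copies sent, downloads, media mentions
    notes: str = ""

observations: list[Observation] = []

def record(indicator: Indicator, value: float, when: date, notes: str = "") -> None:
    """Append one monitoring entry, ready for periodic reporting on results."""
    observations.append(Observation(indicator, when, value, notes))
```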

Page 8: Measuring Communications Impact at EPRC


Indicators: Reach

Indicator 1: # of copies of a product distributed to existing lists, in hard and/or electronic form
  Data captured: # of copies sent; # of people on the mailing list
  Data source: EPRC contacts database
  Significance: helps to reach many people with our information products

Indicator 2: # of copies of a product distributed through additional channels, e.g. trainings, workshops or meetings
  Data captured: # of copies, venue and date of distribution
  Data source: administrative record, in the form of a distribution form
  Significance: helps track distribution that occurs in tandem with related events, which increases the chance of the information being understood and applied, and helps build demand for the products

Page 9: Measuring Communications Impact at EPRC


Indicators: Reach & Demand

Indicator 3: # of products distributed in response to orders/user-initiated requests
  Data captured: # of phone orders; # of email requests; # of interpersonal requests
  Data source: admin records/form kept by the Knowledge Management Specialist
  Significance: helps to measure demand, and thus indirectly gauges perceived value, appropriateness and quality

Indicator 4: # of file downloads in a time period, i.e. Internet users' transfer of content from the EPRC website to their own storage medium
  Data captured: web server log files
  Data source: web-analysis software (Google Analytics) run by the IT Specialists
  Significance: helps to know which information products and topics are used most on the website, and which countries or regions use the website most (a rough log-counting sketch follows after this slide)
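
EPRC relies on Google Analytics for indicator 4; since the slide also lists web server log files as the captured data, here is a minimal, hypothetical sketch of counting downloads directly from an Apache/Nginx-style access log. The log format, file name and .pdf filter are assumptions, not EPRC's actual setup.

```python
import re
from collections import Counter

# Matches request lines in a common "combined" access-log format (an assumption).
LOG_LINE = re.compile(r'"GET (?P<path>\S+) HTTP/\d\.\d" (?P<status>\d{3})')

def count_downloads(log_path: str, extensions=(".pdf",)) -> Counter:
    """Count successful (HTTP 200) downloads per file path in a web server log."""
    downloads = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            m = LOG_LINE.search(line)
            if not m or m.group("status") != "200":
                continue
            path = m.group("path").split("?")[0]   # drop query strings
            if path.lower().endswith(extensions):
                downloads[path] += 1
    return downloads

# Example (hypothetical file name):
# print(count_downloads("access.log").most_common(10))
```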

Page 10: Measuring Communications Impact at EPRC


Indicators: Reach & Demand

Indicator 5: Media mentions of EPRC events, staff and products
  Data captured: media outlets; # and description of articles and radio/TV broadcasts; audience reach of the media outlets
  Data source: admin records/form kept by the Knowledge Management Specialist
  Significance: helps to measure demand, and thus indirectly gauges perceived value, appropriateness and quality (a minimal tallying sketch follows after this slide)

Indicator 6: # of instances that products are selected for inclusion in a library or online resource
  Data captured: # and type of publications selected; # and type of library or information centre/online resource
  Data source: admin records/form kept by the Knowledge Management Specialist
  Significance: captures reach and is also a proxy measure of quality, since librarians ask for what they believe is beneficial to their clients/users
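
The admin form behind indicators 5 and 6 is not shown in the slides. As an illustration only, the sketch below tallies media mentions and cumulative audience reach per outlet from a hypothetical CSV export of such records; the column names are assumptions.

```python
import csv
from collections import defaultdict

def summarise_mentions(csv_path: str) -> dict:
    """Tally mentions and cumulative audience reach per outlet from a CSV export
    with hypothetical columns "outlet" and "audience_reach"."""
    mentions: dict[str, int] = defaultdict(int)
    reach: dict[str, int] = defaultdict(int)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            outlet = row["outlet"]
            mentions[outlet] += 1
            reach[outlet] += int(row.get("audience_reach") or 0)
    return {o: {"mentions": mentions[o], "audience_reach": reach[o]} for o in mentions}
```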

Page 11: Measuring Communications Impact at EPRC


Indicators: Reach & Quality

Indicator 7: # of policy engagements organized, e.g. workshops and conferences to share and/or discuss emerging policy issues or to disseminate research findings
  Data captured: # of events held; # of participants by gender and type of sector, e.g. NGO, public, private, media or donors
  Data source: participants' registration form
  Significance: captures reach as well as demand and feedback on events

Page 12: Measuring Communications Impact at EPRC


Indicators: Usefulness

Indicator 8: Percentage of users who are satisfied with a product or service
  Data captured: qualitative data on user likes, dislikes and attitudes, using scales, e.g. how strongly they agree or disagree with statements on frequency, subject categories and technical quality
  Data source: feedback forms distributed with the product; the Knowledge Management Specialist receives the feedback via email
  Significance: the ideal would be other forms of surveys (online or telephone surveys, or interviews with users), but these require time to develop tools and analyse data (a minimal sketch of computing the percentage from scale responses follows after this slide)

Indicator 9: Percentage of users who rate the content of a product or service as useful
  Data captured: qualitative data on the relevance and practical applicability of the content, e.g. "Was the topic covered in the product interesting and useful to you?"
  Data source: feedback forms distributed with the product
  Significance: ideally user surveys would be most appropriate, but they are not done for the same reasons as above
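
Indicators 8 and 9 are percentages computed from scale responses on the feedback forms. The exact form is not shown, so this sketch simply assumes answers coded on a 1-5 agreement scale and counts the share at or above "agree".

```python
def percent_satisfied(responses: list[int], threshold: int = 4) -> float:
    """Share of respondents scoring at or above the 'agree' threshold (assumed 1-5 coding)."""
    if not responses:
        return 0.0
    satisfied = sum(1 for r in responses if r >= threshold)
    return 100.0 * satisfied / len(responses)

# Example: 7 of 10 respondents agree or strongly agree -> 70.0
# print(percent_satisfied([5, 4, 4, 3, 5, 2, 4, 5, 3, 4]))
```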

Page 13: Measuring Communications Impact at EPRC


Indicators: Use

Indicator 10: # of users using an information product or service to inform policy
  Data captured: # of policy recommendations provided to clients; # of recommendations used by clients; evidence showing how the recommendations have been used
  Data source: mainly informal (unsolicited) feedback on the various policy recommendations, received via email, phone or in person; review of copies of policies, guidelines or protocols referencing or incorporating information from products or services
  Significance: the challenge with getting this information is that users may not recall which particular information source they used, and information may be used but not referenced (a rough text-search sketch follows after this slide)
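
The slide describes indicator 10 as a largely manual review of policies, guidelines and protocols for references to EPRC products. As a rough aid only, the sketch below searches plain-text copies of such documents for known product titles; the directory layout and file format are assumptions.

```python
from pathlib import Path

def find_references(policy_dir: str, product_titles: list[str]) -> dict[str, list[str]]:
    """Return, for each product title, the policy documents whose text mentions it."""
    hits: dict[str, list[str]] = {title: [] for title in product_titles}
    for doc in Path(policy_dir).glob("*.txt"):   # assumes documents already converted to .txt
        text = doc.read_text(encoding="utf-8", errors="replace").lower()
        for title in product_titles:
            if title.lower() in text:
                hits[title].append(doc.name)
    return hits
```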

Page 14: Measuring Communications Impact at EPRC


Challenges
- Harmonization: different projects bring different data requirements and data sources
- Consistency: how to capture information systematically, in ways that are straightforward, and on a regular basis
- Measurement: outcomes from different information products and services vary and lead to a wide range of impact and influence, most of which is intangible and very hard to measure
- Relationship building: not everyone within the department or organization may be motivated to embed M&E within their work
- Time: the process of collecting data and information, as well as analysing it, may span a long period before it reflects the impact of communication products and services
- Tools and the data collected keep changing, e.g. the advent of social media and web-based resources has brought new ways of sharing information and thus of monitoring it

Page 15: Measuring Communications Impact at EPRC


Lessons learned

No one tool suits all communication M&E requirements; various tools are needed to capture specific communication products and services

The tool design requires full participation of all parties that will be involved in using it

Continuous capacity building in tool use is needed to facilitate regular capturing of data

Page 16: Measuring Communications Impact at EPRC

Thanks, and good luck as you measure the impact of your communication
