
Jisc Advance BCE Evidencing Outcomes Benefits & Outcomes (Handout)

Handout to accompany presentation by Jay Dempster


JISC Guidance for Evaluating BCE Support Activities

Demonstrating EVIDENCE of outcomes & impact

Services are often asked to report on aims, methodologies and lessons learned, and to demonstrate the outcomes and impact of their activities and initiatives – in other words, to show ‘value for money’.

This is certainly a requirement for formal evaluation of your service. But capturing data from your day-to-day activities offers a continuous “product development cycle” that is immensely beneficial as a means of reflective practice. This is important not only for ongoing engagement with your clients – consulting on whether their needs are being met – but also for later reporting, because it means the evidence is readily at hand.

Taking this approach may mean being somewhat more proactive in seeking out and collating a rich variety of different types of evidence to back up statements or claims about impact. But the pay-offs are worth it, particularly when it comes to justifying your existence – an inevitable necessity with a dwindling resource base.

The purpose here is, in particular, to provide evidence of impact. However, it can be unclear to practitioners what is meant by ‘evidence’ in the context of BCE-related activity. This guide aims to clarify the term and provide some examples against some typical kinds of outcomes, from hard quantitative data to softer qualitative and illustrative evidence.

First, let’s consider who the evidence is aimed at. This is where evaluation should feed into dissemination, and vice versa, both being based upon an acute awareness of one’s audience (clients or stakeholders) and leading to an appropriate and effective “message to market match”.

- Who are your different stakeholders and what are they most likely to be interested in?
- What questions or concerns might they have?
- What form of advice or guidance is most likely to suit their needs?

And from that starting point, consider what ‘evidence’ is most likely to demonstrate impact, and indeed how well you’ve hit your target aims.

Second, let’s look at the nature of the evidence itself.

'Evidence' is anything that helps to confirm or illustrate an outcome (see Examples below). Evidence may be arrived at through formal evaluation processes – when it is likely to be supported by data collected and analysed for that purpose – but also through reflection, data traces gathered during set up, implementation and follow up of an activity or product, as part of a collective 'story' of your service’s deliverables and outcomes.

Creative forms of ‘rich media’ evidence, such as vision boards, multimedia, talking heads, screenshots and images, have an important role to play in communicating outcomes. However, the type of evidence needs to match the type of message or the claim being made. It needs to be right for the purpose, occasion and audience it is used for (e.g. formal report or presentation, informal consultations, website, blog, newsletter or published paper).

For example, if we are claiming that particular service activities generated a significant number of new contacts and partnership opportunities, we need quantitative data, numbers and examples. If we are saying that awareness of BCE has been raised (an outcome that cuts across all BCE projects & services), the data will tend to be more qualitative, analysed from interviews, questionnaires and/or surveys, and possibly include quotes from clients & other stakeholders to illustrate how their attitudes, roles or strategies/policies may have changed.
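As a small illustration of the quantitative side of such a claim, a simple contacts log can be tallied to produce the numbers needed. This is a hypothetical sketch – the log entries and field layout are invented for illustration, not part of the guidance:

```python
from collections import Counter
from datetime import date

# Hypothetical log of contacts made through service activities.
# Each entry: (date, organisation, activity, became_partner)
contacts = [
    (date(2010, 3, 2), "Acme College", "workshop", True),
    (date(2010, 3, 2), "Borchester Uni", "workshop", False),
    (date(2010, 5, 11), "Civic Trust", "online forum", True),
]

# Headline figures to back a quantitative claim about impact.
new_contacts = len(contacts)
new_partnerships = sum(1 for c in contacts if c[3])
by_activity = Counter(c[2] for c in contacts)

print(f"New contacts: {new_contacts}")
print(f"New partnerships: {new_partnerships}")
print(f"Contacts by activity: {dict(by_activity)}")
```

Even a spreadsheet kept to this shape (date, organisation, activity, outcome) makes the quantitative evidence available on demand rather than reconstructed at reporting time.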

1. An illustrative ‘Activity System’ as a basis for evaluating a service’s ‘product development cycle’

Evidence generated by complex processes – such as, for BCEct, the intersection of new technologies with different organisational agendas – is often itself complex and difficult to generalise. This is fine, provided the complexity and the limits of generalisability are acknowledged.

Indeed, you may seek to use theoretical frameworks to model the dynamics of supporting BCE activity and draw out the specific contextual factors at play, e.g. Engeström’s activity theory as used in BCEct (see below) or Wenger’s communities of practice.

Engeström’s activity system triangles:

This framework is not an evaluation methodology as such, but the activity system fits the context of JISC services & BCE projects very closely: it helps to investigate how the diverse objectives of subjects (institutions and business & community partners) are taken forward through the medium of shared tools & services, in the context of community/institutional strategies for engagement (rules, division of labour).

How might you interpret the dynamics of your stakeholder context and where/how your service might best engage and contribute?




Taking the different points in turn gives us a set of evaluation questions that could form the basis for gathering data around a product development cycle.

Objects
- What were the objectives of the activity/product? How did they fit with clients’ (or other stakeholders’) strategic goals?
- What have the subjects (participants, end users) said about why they got involved in the activity/use of the product?
- What have the subjects said about what they wanted from engaging with your service?

Subjects
- Who was involved as a participant in the activity/use of the product? (Give roles and departments/organisations rather than names.)
- Who else had an interest in the collaboration (stakeholders)?

Tools/resources
- What tool(s) or resource(s) were used by the participants?
- How fit-for-purpose or effective did they feel the tool/resource was?
- What issues, if any, arose in implementing the ideas/resource(s) provided?
- What guidance would you give other institutions on using the tool(s)/resource?
- BCEct specifically would include here questions around ‘choosing and using collaborative tools’ (examples available if appropriate to your project).

Roles/division of labour

- How (much) did different participants engage with you?
- How comfortable were different participants with their role in implementing the approaches?
- How has use of the approaches/tool(s) changed roles and modes/levels/types of participation?
- How have attitudes towards BCE changed, if at all?
- How have participants’ different motivations played out? Have they changed?

Community
- How were relevant communities engaged, developed, facilitated, funded (if relevant) and sustained as a result of using your services/products?
- What indicators are there of any changes in community (institutional) practice and/or policy around BCE? Are these changes sustainable?

Outcomes
- What happened as a result of clients using your services/products?
- What engagement activities or partnerships came about (that might not have happened, or not in that way/at that time)?
- What evidence is there of achieving objectives and furthering strategic goals?
- What changes have been observed – in practice, process, attitude, level of engagement?
- What benefits have stakeholders identified from using BCE approaches as a result of working with your service?
- What indirect benefits can be inferred or observed?
- Were there any unexpected outcomes?
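The evaluation questions above amount to a structured checklist, so evidence can be logged against the activity-system headings as it is gathered. The following is a hypothetical sketch of such a log – the record fields and example entries are invented for illustration:

```python
# Hypothetical evidence log keyed by activity-system heading.
# Each record pairs an evaluation question with the evidence gathered for it.
evidence_log = {
    "Objects": [
        {"question": "What were the objectives of the activity?",
         "evidence": "Project brief; client strategy excerpt"},
    ],
    "Tools/resources": [
        {"question": "How fit-for-purpose was the tool?",
         "evidence": "Feedback questionnaire quotes; screenshots"},
    ],
    "Outcomes": [
        {"question": "What partnerships came about?",
         "evidence": "Press release; 'talking head' video clip"},
    ],
}

# A simple gap check: which headings still lack any evidence?
headings = ["Objects", "Subjects", "Tools/resources",
            "Roles/division of labour", "Community", "Outcomes"]
gaps = [h for h in headings if not evidence_log.get(h)]
print("Headings still needing evidence:", gaps)
```

Reviewing the gap list during the product development cycle, rather than at reporting time, keeps evidence-gathering continuous rather than retrospective.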




2. Examples of Evidence

Examples of the kind of evidence you might provide for different outcomes are given below. The list is meant to be illustrative and not exhaustive.

Outcome (benefits, direct/indirect/unintended, success/impact indicators) with example evidence (gathered from clients or at events):

Outcome: successful implementation of … (a tangible deliverable)
Evidence: feedback, screenshots, guidance notes

Outcome: successful collaborative activities with partners
Evidence: reports, recordings (podcast, excerpt), photos (screenshots if online)

Outcome: identification and follow-up of new BCE opportunities; new partnerships
Evidence: reports, articles/press releases, transcripts, quotations; ‘talking heads’, audio and video clips, especially with new or potential partners

Outcome: improved awareness and perception of BCE within the institutions involved
Evidence: feedback questionnaires (include quotes); ‘talking heads’, audio and video clips; outcomes of e.g. interviews, surveys, questionnaires, focus groups

Outcome: improved skills and expertise
Evidence: surveys, email follow-ups (include quotes); workshop programme and feedback from participants

Outcome: more strategic approach to BCE
Evidence: new institutional policies, strategies, mission statements (excerpts); internal emails indicating new practice or approach

Outcome: sustainable approach to the use of any online tools
Evidence: evidence of use being cascaded to other areas of the institution; changes in IT policy; evidence of guidelines being adopted




Consider your own service in terms of aims, outcomes and evaluation resources (what, who, where, how, when) to decide what will be manageable for you.

From your list, prioritise a selection that would lend maximum credibility, with your different internal and external audiences, to your narrative about the benefits and impact of your activities.

Outcome (tangible outputs, benefits, direct/indirect/unintended, success/impact indicators):

Example evidence (existing or new data gathered from clients/stakeholders; specify what, who, how, where and when as far as possible):

© Belanda Consulting 2010 Jay Dempster & Helen Beetham