Evaluation guide for funders: How to work with providers to develop useful evaluation


March 2016

The Social Policy Evaluation and Research Unit’s (Superu’s) purpose is to increase the use of evidence by people across the social sector so that they can make better decisions – about funding, policies or services – to improve the lives of New Zealanders, New Zealand communities, families and whānau.

The Using Evidence for Impact project takes a big-picture approach and aims to inspire all those working in the New Zealand social sector to use evidence in decision-making. 

The objectives behind the project are to drive:

  • greater accessibility to evidence
  • greater transparency of evidence
  • capability development and good practice in using evidence.

Who is this guide for? 

This guide is for funders who work with non-government organisation (NGO) providers of social programmes. It will help you to understand what evaluation is, how it can be used, and how to evaluate using a collaborative approach. 

What is evaluation? 

Evaluation is the systematic determination of value. Evaluation of a programme asks:

  • what changes have been caused by it? 
  • how valuable are those changes? 

Evaluation may look at different things, including: 

  • how, and how well, an initiative has been delivered (process evaluation)
  • the extent to which the initiative contributed to the achievement of target outcomes and unintended outcomes (outcome or impact evaluation)
  • value for money, cost-effectiveness, or cost-benefit (economic evaluation). 

Evaluation is not the same as performance monitoring, but it draws on performance monitoring data. Monitoring measures changes over time in selected indicators; evaluation adds to this by investigating why changes have happened (or not happened), the extent to which changes are attributable to the initiative, and the value of those changes. For example, monitoring might show that participants' attendance rose over a year; evaluation asks whether the programme caused that rise, and how much the change is worth.

Why is evaluation important for funders? 

Evaluation can be used to understand what works and what does not, and what difference the funding has made. Done well, evaluation can help funders to: 

  • be accountable 
  • develop better partnerships with providers 
  • improve funding practices 
  • understand how and where to invest for the best results 
  • demonstrate achievements and share lessons 
  • promote continuous improvement among providers 
  • improve outcomes for individuals, families, whānau, and communities. 

Funders have a critical role in developing good evaluation practice because they can incorporate a requirement for evaluation into funding conditions. 

This guide promotes a collaborative approach, where funders and providers work together to develop an evaluation that will fulfil both organisations’ needs for learning and accountability. 


01_ The funder-provider relationship in monitoring and evaluation

This guide assumes the following respective roles of providers and funders: 

  • Providers monitor and evaluate their own programmes.
  • Funders guide providers towards better evaluation, and ensure evaluation findings will be useful for both providers and funders.

Sometimes funders carry out their own evaluations of programmes they have funded, but this guide does not address that situation. 

There are many different approaches to evaluation. Some are better than others, some are more or less appropriate for different situations, and some are more or less resource-intensive. This guide does not promote one particular approach. Instead it describes overarching principles to guide your interactions with providers, and provides links to online resources that you and your providers can use to explore different evaluation methods in more depth.

Section 1.1 describes the main ways in which funders can work with providers on monitoring and evaluation. Section 1.2 describes three principles (and associated actions) to guide collaborative practice with providers on evaluation. Section 1.3 links to further resources on funder practices in evaluation. Section 2 links to several online ‘how to evaluate’ resources that you can steer providers towards, and that you can use to develop your own understanding of evaluation. 


1.1. Different approaches

Overseas work has identified five main ways in which funders approach evaluation. These are summarised below.

Simple approaches

Simple measures are reported, such as:

  • amount of money given
  • number of likely beneficiaries
  • intended results.

Advantages:

  • Straightforward and easy.
  • Low cost.

Disadvantages:

  • The focus on intended results doesn’t tell you what actually happened.
  • Little opportunity to learn about what worked and what did not.

External evaluation

An external evaluation of a programme or a suite of funded programmes is commissioned.

Advantages:

  • Done well, it can tell the story of funding impact, and promote learning.
  • Specific skills and expertise can be brought in.
  • Helpful if the evaluation needs to be “at arm’s length” from programme staff.

Disadvantages:

  • Makes it easy to ignore monitoring and evaluation until an external evaluator is hired. But ‘tacking on’ an evaluation at the end of a programme can produce disappointing results if key information was not collected during the programme.
  • If an evaluation is ‘tacked on’, there will be little opportunity for real-time learning as the programme develops.
  • Less able to develop evaluation capability among staff.
  • Takes time for external evaluators to learn about the programme and its context, adding cost.

Funder-led systems

Funders ask all providers to use standard measurement tools so that data can be easily aggregated across programmes.

Advantages:

  • Provides comparability across programmes.
  • Can facilitate efficient data gathering (e.g. direct entry into a central database).
  • Can help to build providers’ measurement capabilities.

Disadvantages:

  • Using the wrong measures can waste everyone’s time.
  • Can lead to measuring things that are easily added together, rather than more complex outcomes or stories of change.
  • Can create extra work for providers who already have good systems in place or who report to multiple funders.
  • Data can be inaccurate if providers have to force it into inappropriate categories.

Synthesis by the funder

Providers set, measure and report on results in a way that makes sense for them. Funders then extract evidence from providers’ reports, classifying and analysing it against their own framework.

Advantages:

  • Allows providers to gather the data that is most appropriate for the programme.
  • Providers can analyse and report evidence in the way that works for them.
  • Funders can demonstrate to providers that their reports are useful.
  • Funders can choose when they carry out the synthesis, and which programmes will be grouped in the reporting.

Disadvantages:

  • Funding staff need to be trained and supported in assessing reports and developing a synthesis.
  • Requires providers’ evaluation and reporting to be of sufficient quality. Funding support for improvement may be needed.
  • Takes time and trial and error to get it right.
  • Only works when providers are doing similar things, are working to generate similar outcomes, and report in similar ways.

Co-design

Funders work with providers to create a common measurement approach.

Advantages:

  • A shared approach is created, but does not feel imposed on providers because they co-created it.
  • As well as being useful for evaluation, the framework can be used to explain what the sub-sector is about and the kinds of outcomes it aims to achieve.

Disadvantages:

  • Takes time to develop the framework, and this can be difficult to fit within a funding cycle.
  • If the framework is shared across several providers, they need to be working in a similar field.
  • As well as shared measures, shared methods for collecting the data may need to be developed, to avoid receiving data in different formats and of varying quality.


1.2. Working collaboratively with providers

When working with providers to help them plan and implement evaluation, we recommend using a co-design approach if possible. But even if co-design is not feasible, working with providers collaboratively will generally get the best results. We recommend applying these three principles:

  • Work with respective independent values – clarify funder and provider values, and find ways to meet both organisations’ needs for reporting on what was achieved.
  • Proportionality – use an approach that is appropriate for the information needs and is achievable with available resources.
  • Foster learning and accountability; move beyond compliance – take an approach that will help funders and providers to learn and adapt, and to share information about what worked and what did not.

Each principle is described in more detail below, along with suggestions for actions to put them into practice. 

Work with respective independent values 

Funders and providers are driven by their own values and accountabilities. NGO providers of social services are accountable not just to the funders of their services, but also to their communities. Funding relationships should be based on mutual recognition of funders’ and providers’ respective values, roles, and responsibilities. 

Actions 

  • At the earliest stage of planning the evaluation, develop clarity and a mutual understanding of the funder’s and provider’s aims with respect to:
    • the difference that they want the programme to make 
    • the purpose of evaluating the programme. 
  • Work together to develop an evaluation approach that both the funder and the provider can use to find out whether the programme has made the difference they wanted it to make. 

  • Develop a timetable for the evaluation that will work for the funder and the provider. It should fit with the funder’s funding cycle and with the provider’s other reporting responsibilities. 

Proportionality 

Proportionality has implications for the time and money needed for evaluation. The chosen approach must be able to support meaningful conclusions, and be achievable with available resources.

Actions

  • Scale the evaluation appropriately: 
    • The evaluation cannot and should not measure everything. Explore a range of methods, and choose an approach that suits the purpose of the evaluation and measures the desired outcomes. Make sure that any evaluation activity is justified by the need for the evidence it will gather.

    • If there is already a lot of evidence suggesting that this type of programme works well (e.g. from overseas), you may limit the evaluation to checking that implementation went as planned, and investigating any unique aspects of your situation. If there is little existing evidence, evaluation to build the evidence base is appropriate.

  • Make the evaluation as efficient as possible: 

    • Where possible, use data collection methods that can be integrated into everyday programme activities, using existing systems. This is usually cheaper, more systematic, and better for organisational learning.

    • Collaborate with others and use shared measurement approaches if they are relevant and can add efficiency. Find out what other organisations working in a similar area measure, and consider measuring the same things. 

  • Ensure that there is adequate resource: 

    • Encourage providers to cost evaluation into their funding applications. 

    • If it is an innovative programme without a pre-existing evidence base, more extensive evaluation is appropriate. This may require allocation of additional resources for evaluation. 

    • Make sure that the necessary infrastructure and capacity are present to support the evaluation. If they are not, either scale expectations down, or allocate resources to the development of infrastructure and capacity. 

Foster learning and accountability, move beyond compliance

Monitoring and evaluation are often treated as compliance activities, where boxes are ticked, but very little learning occurs. The purpose of evaluation should be to generate findings that funders, providers, and (in some cases) the wider community can learn from. A transparent and flexible approach helps to create a climate of learning and accountability.

Actions

  • Encourage reporting of positive, negative, neutral and unintended outcomes:

    • Seek to understand what contributed to success, or to situations where the programme did not work as planned.

    • Recognise unintended outcomes and investigate their value.

    • Encourage providers to describe any problems encountered, what circumstances led to problems, and what actions they took in response. Where appropriate, use this to improve funder and provider activities.

    • Take time when assessing negative results to understand what failed and why. Look at how good components of the programme might be sustained, or develop an exit strategy to transition out of the programme without negatively affecting beneficiaries.

  • As well as evaluating what difference a programme made, where possible the evaluation should examine what worked and what did not, for whom, and how, so as to help providers and funders learn and adapt.

  • Share and use evaluation findings:

    • Identify who may benefit from the learning and share the findings with them. Funding agencies may be able to use the findings to inform strategy and policy and to help prioritise investment. Other providers may be able to learn how to improve their activities.

    • Consider how feedback will be provided on evaluation findings.

    • Be open to and seek feedback and critique from providers. It is important to show that the funder is a learning organisation too.

    • Consider interactive sharing and feedback methods, such as workshops, to discuss evaluation findings and to identify actions for funders and providers.


1.3 Further reading

The following references provide further information about the themes described in sections 1.1 and 1.2. 

Different approaches (Section 1.1)

  • How funders in Scotland measure their own impact – Evaluation Scotland’s description of five approaches that funders can take in measuring their impact. (inspiringimpact)

Working collaboratively with providers (Section 1.2)

  • Funders’ principles and drivers of good impact practice – UK Funders for Impact Working Group publication that aims to help funders think about their impact, and how they can support the organisations they fund to evaluate impact. (inspiringimpact)

  • Does your money make a difference – UK Charities Evaluation Service booklet that describes good practice in monitoring and evaluation by funders. It is aimed at anyone who commissions services from, or gives grants to, voluntary and community sector organisations. (www.ces-vol.org.uk)

  • Four Essentials for Evaluation – US Grantmakers for Effective Organisations publication that aims to provide grantmakers with ideas, insights and examples to help them develop and strengthen evaluation. (www.geofunders.org)

  • Learning Together: Collaborative Inquiry among Grantmakers and Grantees – Guidance from Grantcraft (USA) on collaborative enquiry. Provides further information about collaborative and co-design approaches to evaluation. (www.grantcraft.org)

  • Evaluation Standards for Aotearoa New Zealand – Ethical standards for people commissioning, using, participating in, or conducting evaluations. (www.superu.govt.nz)


02_ Evaluation guidance for NGO providers of social programmes

There are many websites and publications that give advice on evaluating social programmes. The list below links to online resources that offer high-quality, entry-level guidance on how to evaluate. You can explore these resources to develop your own understanding of evaluation, and you can also recommend them to providers.

We consider all of these resources to be useful, but they differ. We encourage you to review several and to think about what approach may work best in your situation.

Resources are listed alphabetically, not in order of priority.

Better Evaluation (betterevaluation.org)

Better Evaluation is the most comprehensive website on evaluation, with up-to-date guidance on best-practice evaluation design and methods. You can get an overview of how to evaluate by going to this page and following the links, or you can try following the ‘rainbow framework’: http://betterevaluation.org/plan

You can also use the site like an encyclopaedia, to find further information on different evaluation methods or sub-topics. Type your search term into the search box at the top right.

Evaluation Support Scotland – Evaluation (www.evaluationsupportscotland.org.uk)

Guides, tools and resources, published by Evaluation Support Scotland, intended to help the charitable sector to carry out and learn from evaluation. The main topics covered are: setting outcomes, collecting information, analysing and reporting, and learning from findings. Each topic links to several support guides, tools, and case studies.

Evaluation planning for funding applicants (www.superu.govt.nz)

Resource developed by Superu that provides guidance for funding applicants on how to plan evaluation for funding applications. Summarises the steps to take, and provides links to further resources.

Field Guide for Evaluation (betterevaluation.org)

Field Guide for Evaluation, published by Pact, a US international development agency. While targeted at international development, it is also relevant to evaluation in the social sector. Practical, accessibly written, and includes exercises and worksheets. It is more comprehensive and technical than some other resources, and is suitable for people who want to understand evaluation in greater depth.

The Program Manager’s Guide to Evaluation (www.acf.hhs.gov)

Published by the Administration for Children and Families (USA), this guide steps through planning, managing, reporting on, and learning from evaluation. Provides practical guidance for programme managers, including examples and worksheets. Has particularly good coverage of ways to integrate evaluation into routine programme operations.

Project Evaluation Guide for Nonprofit Organizations (sectorsource.ca)

Evaluation guide published by Imagine Canada. User-friendly, entry-level resource that steps readers through planning and implementing an evaluation, analysing and interpreting data, and communicating results. Contains examples, templates, tipsheets, and checklists.

What Works (Aotearoa New Zealand) (http://whatworks.org.nz)

Website developed by Community Research, providing guidance for community organisations on how to evaluate their activities. As well as content similar to resources produced overseas, this site has New Zealand-specific content on Kaupapa Māori evaluation, New Zealand evaluation case studies, and links to other Aotearoa resources.
