Impact report

Evidence for learning

In last year’s Yearbook we reported on our ‘impact map’, showing the outcomes of work funded between 2007–12. We produced the map by retrospectively coding the outcomes of individual projects and grouping them to provide a big picture of the difference made by PHF grantees and initiatives. We have found this helpful in several ways. It illustrated the limitations of trying to assess impact from halfway through a strategic period, rather than building it in at the beginning. It also pointed to the drawbacks of the broad strategic aims PHF has been working to when it comes to building sufficient evidence from which to learn about impact in any particular area.

As we develop our new strategy, we are focusing on how to help grantees get more useful data and use it to improve the effectiveness of their work. To become a ‘learning organisation’ – making good use of evidence to inform decisions – is a considerable challenge for most organisations (PHF included). Part of meeting the challenge is recognising the difficulties of evaluating complex activity, and therefore understanding the capacity needed in terms of skills, resources, commitment and culture.

These issues are not ours alone. We have been working with others – grantees, partners and other funders – whose strategies are different but who often face fundamentally the same question: how to secure and use evidence to get the most from their resources. This year we supported Inspiring Impact, and joined its funders’ group, convened by the Association of Charitable Foundations (ACF), to develop principles for foundations to complement the Code of Good Impact Practice for third-sector organisations. When I spoke at the launch of the funders’ principles in June 2013, there was clearly momentum behind making progress in this area. The challenge now is to integrate this thinking fully into our work.

Grantee and applicant feedback

In summer 2013 we received the results of independent surveys of grantees and unsuccessful applicants, conducted by the Center for Effective Philanthropy (CEP). The surveys provide feedback on a range of questions concerning how helpful we are in supporting grantees to achieve the public benefit they aspire to, covering our processes, communications and accessibility, and the amount and type of funding and non-monetary support we provide.

This was the second time we commissioned the Grantee Perception Report (GPR), making PHF the first UK foundation to follow up on earlier results (from 2009) to see how things had changed. As we reported on our website, we saw some progress and some disappointments. The new findings provide a baseline for PHF as we develop modes of working to implement our new strategy. We will seek grantees’ perceptions of these in due course.

This was also the first time that a UK foundation had commissioned CEP to carry out a survey of unsuccessful applicants. As a direct result of findings from the Applicant Perception Report (APR), we introduced a new feedback system for UK applicants whose requests were declined at the first stage of our two-stage process. We hope in time this will help to strengthen the quality of applications, as the APR also told us that 87 per cent of declined applicants would consider applying again.

Fitter for Purpose

This year we continued to evaluate Fitter for Purpose, a pilot programme of support for organisational development, begun in 2012. This had its origins in the 2009 GPR, where grantees called for more non-monetary support.

Most of the 28 grantees in the pilot have now completed the work with consultants appointed through the National Council for Voluntary Organisations (NCVO), who helped them to focus on what their organisations needed to strengthen them through times of austerity and turbulence. Already we can see the value of grantees and consultants investing time in the initial diagnosis and scoping stage of the work. Often, approaches to the most pressing problems, such as fundraising or impact assessment, need to start with fundamental issues relating to strategy or governance. Initial feedback from grantees has been positive, but the impact of the work will take time to show. We will continue to monitor progress at six and 18 months after each consultancy has finished.

An Evaluation Roundtable

During the year we worked with the Institute for Voluntary Action Research (IVAR) and the US Center for Evaluation Innovation on the UK’s first Evaluation Roundtable. This approach to collaborative learning about evaluation was developed for US foundations and involves working on a ‘teaching case’ – the story of a real evaluation carried out by a foundation. Our evaluation of Learning Away, an Education and Learning Special Initiative, was used as the teaching case for representatives of more than 20 foundations who came together for the Roundtable gathering in March.

Learning Away illustrates the many challenges experienced by foundations in designing and evaluating programmes to tackle complex problems, especially those involving many stakeholders with different roles, information needs and perspectives. Insightful discussions at the Roundtable focused on the key decision points in the life of an evaluation and how to feed these into strategic thinking for the foundation. We hope the Roundtable process will run again, with other foundations sharing teaching cases.

A PHF grants framework

Like many of PHF’s Special Initiatives, Learning Away, which began in 2009 and runs until the end of 2014/15, has evolved through phases, from initial exploration of the issues, through the development and testing of different interventions, to the sharing of findings and advocacy for wider change. These phases illustrate the different purposes of the work we fund, whether through Special Initiatives or Open Grants. We have developed a framework of different types of grant to help us think about how the purpose of the work determines the type of evidence grantees need in order to manage the work and decide how to take it on to the next phase.

The framework identifies four broad purposes of PHF-funded work, as shown in the diagram above.

For some time, PHF has asked applicants to describe the intended outcomes of their work and how they would measure them. Once an application is approved, we discuss and agree these outcomes and targets. Considering which category of the framework a grant falls into can guide these decisions and help to produce the sort of data needed to inform choices about whether and how to move the work on to the next stage, or purpose.

Learning or accountability?

This application of a grants framework is one illustration of how we intend to use evidence and evaluation within the new strategy for PHF – to help us and grantees collect better quality, more useful evidence that can inform next steps and future strategy. We recognise that, especially in times of austerity and public-sector commissioning, evaluation can be seen primarily as a means of accountability and of demonstrating value. We hope instead to become more adept at using evidence to learn and to improve effectiveness. Sharing information, and being transparent about our evidence and decisions, will be an essential part of this.

Jane Steele
Head of Impact and Evaluation