Five Key Lessons in Impact Evaluation

I recently came across this gem: a report by the International Initiative for Impact Evaluation (3ie) titled ‘Behind the scenes: Managing and conducting large scale impact evaluations in Colombia‘. The discussion is current – the paper is dated December 2011 – and it shows that Colombia’s substantial investment in impact evaluation (IE) is paying real dividends.

The authors, Briceño, Cuesta and Attanasio, focus on development projects in Colombia, but I think much of their guidance applies equally to the Student Services sector.

I particularly like this concluding section, “Five Key Lessons”.

Five Key Lessons

1. Invest in the preparation of good terms of reference (TOR).

I agree: it is essential to think hard about exactly what you want to find out before you start designing an evaluation.

2. Decide on the best methodological approach to address the evaluation questions.

AMOSSHE’s Value & Impact Bank is a resource that may help you here. I have collated evaluation materials used by UK HEIs which can give you an insight into how others have approached improving the impact of their services. The AMOSSHE Value & Impact Bank is a member’s resource and you can find it here:

3. Ensure evaluation quality.

You may want to meet with professional researchers in your institution to discuss research methods – a resource on your doorstep.

4. To foster evaluation ownership, lay out the incentives for dialogue between the involved parties.

To put it another way: don’t be afraid to tackle the questions “What’s in it for me?” and “What’s in it for you?” head-on with your team – it makes for a more positive process and outcome.

5. Conduct quality dissemination.

In other words, share your findings. A small note in the departmental newsletter will likely not do justice to your efforts in evaluation. Be creative about the best ways to share what you have learned.

You can download the full report for free here.*

*The conditions for successful IE highlighted in bold on pp. 6–8 are worth looking at – they very much echo the implementation issues that our pilot Value & Impact HEIs experienced.

Who’s leading the way on student well-being?

I recently read this from the New Economics Foundation – Leading the way on well-being

The article highlights that

“As of today, the UK is the proud owner of the biggest official well-being data set for any single country within a single year.”

This made me wonder: is there anything like this for services that support students – that is, well-being-specific data?

The author went on to suggest that:

“This data is providing a unique opportunity to really understand what determines well-being and how it changes over time. This information will allow policy-makers and those whose role it is to hold government to account to identify policies that will be beneficial to well-being and avoid those that will be harmful to it.”

I think this is relevant to services that support students too. Why? Because:

  1. We already know that higher education is one of the most data-rich sectors in the UK; and
  2. We know that student well-being is central to what these services exist to support.

What I’m suggesting is that we may already have strong data – sources like the Higher Education Statistics Agency (HESA) and the Destinations of Leavers from Higher Education (DLHE) survey – that we can begin to use right now to improve services that support student well-being.

And if we don’t yet, we have everything we need to begin collecting that data. We just need to be braver about collecting it and more confident about its relevance – to “lift the lid”, as it were.
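As a concrete (and heavily simplified) illustration, here is a minimal Python sketch of the kind of first step I mean: joining service-usage records to outcome records and comparing groups. The file names and column names below are invented for illustration – they are not a real HESA or DLHE schema – and a raw comparison like this is descriptive, not causal: students who use a service differ from those who don’t, which is exactly why the methodological care discussed in the 3ie post above matters.

    # A minimal sketch, assuming two hypothetical per-student extracts.
    # File names and column names are invented for illustration.
    import pandas as pd

    usage = pd.read_csv("service_usage.csv")        # student_id, used_counselling (True/False)
    outcomes = pd.read_csv("student_outcomes.csv")  # student_id, continued_study (True/False)

    # Keep only students appearing in both extracts.
    merged = usage.merge(outcomes, on="student_id", how="inner")

    # Continuation rate for service users vs. non-users (descriptive only).
    print(merged.groupby("used_counselling")["continued_study"].mean())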

Here’s the plug: I’m currently building a bank of resources for evaluating the value and impact of services that support students. It will be a collection of tried and tested resources that higher education institutions have used to evaluate their services, and it goes live to AMOSSHE members on 21 March 2012.

Here’s a teaser screenshot of the Value & Impact Bank’s front page:


AMOSSHE’s repository – Categories

Below are the categories we are using to organise all the submissions we’ve received for AMOSSHE’s Value & Impact Repository. After gathering feedback, the most popular option was to organise them by departmental area.

I’m excited because this means our members will be able to get to the tools they need as fast as possible.

I’d really welcome your comments and feedback! Please feel free to comment below or email me with “VIP Categories Feedback” in the subject line.

Response to NCVO

I came across this last Wednesday:

The National Council for Voluntary Organisations: Here’s the Ultimate Impact Measurement Tool

Richard Piper’s suggestion is striking, namely: the human brain is the ultimate impact measurement tool!

The ultimate impact evaluation tool is the human brain. Every paper-based or online tool out there, every survey, every clever dial or star, every randomised control trial, every focus group, is an attempt to get close to the interpretative capability of the human brain.  And many of these formal evaluation tools are ersatz versions of the brain, poor approximations of the real thing, chalk instead of bread flour, fizzy lager instead of real beer.

A declaration of interest here: I am building a repository of evaluation tools for leaders in Student Services to use to evaluate and improve what they do. Piper’s idea, taken to its logical conclusion, could undermine my very job! Am I feeding people bread made with chalk, and sub-standard lager? Thankfully, Piper suggests he is not anti-tool per se:

I am not suggesting that the use of forms, surveys, dials, etc is necessarily wrong and that we should stop using them to evaluate impact. But I do feel that sometimes these tools are poor substitutes for our powerful, perceiving and interpreting brains. And I certainly think we should be using common sense ways of getting evidence, based on the perceptual and interpretative abilities of the brains at our disposal.

Good point, well made. The danger with tools is that we switch off and over-rely on them. But I’m going to make my own counter-argument:

Evaluation is a craft. As craftsmen we are refining what we do.

Evaluation is a science. What makes it so is our commitment to refining our knowledge and tools.

Can it be both? My own view is that both apply.

Below is my attempt to boil down Piper’s insights to some snippets that I think all of us working in evaluation can usefully reflect on:

  1. Trust your staff (he refers to volunteers). “You can unlock [their] tacit knowledge by simply talking to your [staff] face-to-face, in an unstructured or semi-structured interview, asking some open questions”, i.e. using “the crucial phrase ‘do you think’”. Piper suggests, “This gives them the licence to give you their interpretation, and also makes them feel their view is valued.”
  2. Limit the evaluation burden. “[O]nly […] ask them when you genuinely need new information and learning […] In most cases, it’s absolutely fine to demonstrate the value of a service once, until either the service or the context changes.”
  3. Account for our inherent bias. “Assume they are honest and self-reflective and able to give a balanced view. Secondly, talk to more than one and compare the results.” (See the small worked example after this list.)
  4. Know when to stop. “The trick is to get enough evidence to tell you things you really didn’t know – and really need to know – to help you make the decisions.”
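Piper’s point 3 is qualitative, but if you do code up interview judgements (say, yes/no answers to the same questions put to several staff), a quick agreement statistic is one way to “compare the results”. Below is a minimal Python sketch using scikit-learn’s Cohen’s kappa; the ratings are invented purely for illustration.

    # A minimal sketch: agreement between two interviewers' coded
    # judgements on the same eight items. Ratings are invented.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [1, 1, 0, 1, 0, 1, 1, 0]  # interviewer A (1 = yes, 0 = no)
    rater_b = [1, 0, 0, 1, 0, 1, 1, 1]  # interviewer B, same items

    # Kappa corrects raw agreement for chance: 1 = perfect, 0 = chance level.
    print(f"Cohen's kappa: {cohen_kappa_score(rater_a, rater_b):.2f}")

Low agreement between interviewers is a prompt to talk to more people or revisit the questions, which is exactly the spirit of Piper’s advice.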

Why measuring impact remains an elusive goal

I recently came across this article from the PhilanTopic blog, which has some really interesting thoughts on impact in the context of grant applications in the not-for-profit sector – thoughts that I think apply equally to student services. To that end I’ve pulled out the key ideas for you; my credit to John Colburn for these. To begin:

“[I]t is clear that the advancement of knowledge and understanding occurs because a handful of practitioners persevered against the broader culture of practice and what can reasonably be ‘known’ in order to elicit whole new understandings.”

I totally agree! We must be brave in the learning process. He goes on to suggest:

[W]e should try to identify and avoid repeating the same mistakes that have yielded such limited results to date. Here are a few ways we can be smarter….

I think these four points are great!

a) Let’s separate accountability and compliance […] – from impact assessment. The conflation of the two results in over-elaborate monitoring tools that distract from the impact-assessment process. […]

b) Let’s agree that simpler and inexact processes help move the ball on impact assessment. Often, we let the perfect be the enemy of the possible in impact assessment. […]

He suggests that we use “horse sense”, which I read as a kind of intuition. It’s worth remembering that intuition is really the distillation of our accumulated experience – a point borne out by research on expert judgement.

c) Let’s agree that sharing our results — successes and failures — makes us all smarter. This means we need to begin to develop a common framework for describing our work, our goals, and our results. This allows [us] to learn from one another and avoid repeating each other’s mistakes. The boring and un-glamorous work of [evaluation tools] is an essential building block for developing and sharing knowledge.

The evaluation tools repository I’m working on will help Student Services managers in the UK to share information on best practice.

d) Let’s agree that there is complexity in impact assessment, but not let that stand in the way of seeking universal truths. Yes, context matters. […] And, yes, there are bound to be varying levels of quality in formulating and implementing impact assessment. Still, I am convinced that there are underlying commonalities to our work that allow us to learn from one another and begin to build a body of knowledge of what works and what doesn’t.

Some real wisdom, I think, despite the slightly different context – and worth bearing in mind when approaching the difficult topic of measuring impact.

You can read the whole article here.

An Introduction

Let me introduce myself: My name’s David Stoker and I’m Value and Impact Project Officer at AMOSSHE, the Student Services Organisation.

You can find out more about AMOSSHE’s Value and Impact Project here.

The Value and Impact Project is focused on improving our understanding and evaluation of student services. Its aims are to:

  • find meaningful ways to measure and demonstrate the impact and value of services in Higher Education Institutions;
  • produce and disseminate powerful tools to measure the value and impact of services, and to improve them further.

I am blogging to:

  • Highlight good work;
  • Stimulate debate;
  • Connect with others working in the Student Services sector and beyond.

All views expressed here about other projects are my own. I would love to chat with readers about value and impact via this blog!