Reviewing humanitarian evidence

Lisa Walmsley

The Humanitarian Evidence Programme is delighted to announce the launch of its series of eight systematic reviews.

Over the last few years we’ve been working with teams of researchers, practitioners and consultants from academic institutions and NGOs to map out the existing evidence, critically appraise it and synthesise the results in response to key questions in eight practice areas.

Each of the independently peer-reviewed reports is accompanied by an evidence brief. Each brief summarises the review’s main findings and methodology, highlights gaps in the evidence base and lists the full set of references to the source material. The second page of each brief contains a table or diagram that provides an at-a-glance summary of the evidence and evidence gaps.

Why are these important? How are they useful?

The reviews and briefs aim to provide policymakers and practitioners with a synthesis of the best and most relevant evidence in response to key questions using a published and clearly set out methodology. They allow us all to take stock of what we already know and highlight gaps in terms of the quality and quantity of evidence available.

We are only just starting to assess the use, usefulness and usability of the products in practice and would welcome feedback. We’ve had some very encouraging initial responses, including from practitioners in Oxfam’s own Global Humanitarian Team (GHT). The protection team, for example, is looking at how the findings of the mental health and psycho-social review might be taken forward in its policy review; in Harare, the WASH review has been used to help inform the programme design, implementation and monitoring of household water treatment distributions following the recent typhoid outbreak; and we’re currently summarising the gender-related findings from each review to see what these might reveal for our own programming.

The monitoring, evaluation, accountability and learning (MEAL) team will also be reviewing findings across the board. Given that so much of the humanitarian evidence base draws on programme reports and evaluations, we’ll be looking at the findings and recommendations on these (summarised below) to see if we can commission, manage and sign off our own reports and evaluations in such a way as to ensure they qualify as robust evidence.

Top 10 recommendations for programme reports and evaluations  

  1. Report when and where the project under evaluation took place.
  2. State when and where data collection took place.
  3. Describe the type of data collection and instrument used (survey? interviews?).
  4. Clarify who collected the data (programme staff? an external evaluator?).
  5. Describe the sampling strategy (how were populations identified and recruited?).
  6. Provide sample data (how many respondents participated in the evaluation or study?).
  7. Discuss any limitations or biases that may have affected the results.
  8. Include data on the cost-effectiveness of different interventions.
  9. Include data on the implementation opportunities and challenges of different interventions.
  10. Collect and document sex- and age-disaggregated data.

What next?

One of the great things about the Humanitarian Evidence Programme has been the opportunity to bring together producers and users of evidence in open dialogue. It has allowed us to take stock of what we know, who knows what, how and where knowledge is documented and shared, whose voices are included and excluded, whose evidence counts and where all this sits in the context of humanitarian experience, opinion and need. Further reflections are captured in Roxanne Krystalli’s blog for ALNAP.

Please do get in touch if you’d like to be part of this ongoing conversation or give us any feedback.

You can learn more about the programme and access all our documentation and publications online via Oxfam Policy and Practice, Feinstein and DFID R4D.

Download the reports here