Duncan Green reflects on the recent ‘Evidence for Influencing’ conference, organised by the Oxfam Research Network and hosted by Kontakt der Kontinenten in the Netherlands.
This week I attended an ‘Evidence for Influencing’ conference in the Netherlands. A couple of Oxfam colleagues had started planning it as a small event, and then found such interest in the topic that it mushroomed to 150 people over 2 days, roughly divided between Oxfammers and others (NGOs, media, academia).
My overall impression was that campaigners, academics and governments are all struggling with the link between evidence and influencing. It’s a bit like teenage sex – everyone thinks that everyone else is doing it with great expertise and having a fine old time, and that it’s just you that has no idea what you’re doing…..
I was asked to give some broad comments, many of them drawing on previous posts on this blog. Sorry if that means you’ve heard them before, but as I recently told UNICEF bloggers, a powerpoint is just a blog waiting to be written, so I thought I’d follow my own advice. Anyway, according to Fidel Castro, ‘repetition is a revolutionary virtue’.
First point, what do we mean by evidence? The organizers defined it as ‘data or information presented in support of an assertion’. So research at the ‘hard’ end of the spectrum (lots of data, RCTs and the rest) is just one form of evidence: experiences (whether of researchers, or of communities and individuals) are another, as are narratives. The plural of ‘anecdote’ may not be ‘data’, as Claire Melamed says, but it is certainly evidence.
Which kinds of evidence should NGOs be stressing? Over recent decades, we have moved from relying almost entirely on moral suasion (we call it ‘framing’ these days), backed by stories but little data (‘we have a moral duty to help poor countries’), to today’s blend of morality and measurement. Overall, that has to be a good thing, but there are downsides too: if the measurement people become too professionalized and insulated from the rest of their organizations, they risk becoming delinked from thinking about power, politics and influence, and start to believe in the power of ‘pure’ research. Some panels at the conference were already starting to resemble the ritualised academic dance of hypothesis, methodology, findings. (Don’t get me started on panels.)
That worries me, because if NGOs start to behave like traditional academics they could disappear from view, eclipsed by the sheer scale of academia. A recent paper found that there are 200,000 full-time academics in the UK alone. In contrast, the full-time equivalents of Oxfam’s researchers on issues of development, poverty etc. number fewer than 100 worldwide. So we need to spot the niches we can usefully fill – the over-used ‘USP’ (Unique Selling Point) of marketing.
For me the USP of aid NGOs comes down to authenticity, connectedness and being able to produce clear, convincing narratives and powerful, human stories. For all sorts of reasons, academia often fails miserably at one or more of these. So the kind of evidence we generate should build on our USP: work extra hard to find, test and polish powerful narratives, and/or make sure what we say is rooted solidly in the experience of our programmes or the lives of real people; give far more attention to ‘bearing witness’ – ensuring that the voices of people, communities, local leaders and organizations, and even our own frontline staff acquire greater, less mediated prominence in the way we talk to the public. This would be a challenge to NGO people keen to ensure a single, coherent message goes out to the public, but it could really help shift the balance of power in who shapes the conversation about development issues. Anyone fancy setting up an NGO to do this – maybe a ‘Hear Directly’ to sit alongside ‘Give Directly’?
That attention to reality on the ground also reflects a shift to systems thinking – we should spot the positive deviants that the system has already thrown up and help to record, understand and spread them; run diary projects to uncover the real lives of poor communities; behave more like anthropologists investigating what is actually happening out there in all its messy glory, rather than economists seeking to confirm their ‘priors’; escape from the tyranny of the project and ‘it’s all about us’.
Part of ‘dancing with the system’ is spotting and responding to the windows of opportunity offered by crises and failures of public policy. Academics struggle to respond to such ‘critical junctures’, perhaps due to being trapped by the conveyor belt of journal papers and research programmes. But to be honest, NGOs often aren’t much better.
The importance of timing goes beyond crises-as-opportunities. Political timetables are more open to evidence and influence at some moments than at others (new leaders, manifestoes and elections as well as after scandals); as issues travel down the ‘policy funnel’ from public discussion to state action, they require different kinds of research at each stage; people as individuals are more open to evidence at some times than at others (when they are young, or in an unfamiliar context or new job).
How can we get better at spotting impending or current opportunities and rapidly marshalling the evidence that is needed? One way is to get away from one of the greatest barriers to ‘evidence for influence’ – the dead hand of supply-side thinking. Research conferences are all about the supply of research – the panels, the tenure cattle market etc. Precious little attention is given to the demand side: when do the people we are trying to influence actually want to hear new evidence? In what form? (Even with government ministers, first-hand experience usually trumps any number of academic papers.) Who do they want to hear it from (the messenger, not the message)?
Bridging supply and demand is all about building networks and relationships, not just plans and products, as Babu Rahman explained so brilliantly recently. Evidence doesn’t speak or influence for itself. Researchers can’t hide behind their desks; they have to get out more (and not just to talk to other researchers).
Others who attended the conference, feel free to add your bit.