Simone Lombardini introduces our new series ‘Real Geek’ (Research, Evaluation and Adaptive Learning – Generating Evidence and an Enthusiasm for Knowledge), blogs for anyone who sees research, measurement and evaluation as essential tools for learning, adaptation and, ultimately, improved impact.

This is the first of a series of blogs in which Oxfam’s technical advisers, researchers and programme colleagues will share questions, learning and experience from the work we do. We work with Oxfam’s colleagues around the world to support the generation and use of rigorous evidence, in order to learn, adapt and improve the impact of programmes. With this blog, we hope to share and explore the technical evaluation tools and research questions we encounter in our everyday work. It is intended for anyone who sees evaluation and research as essential tools for learning and for development work.
There were a lot of interesting debates taking place which never left the four walls of the meeting room
The idea was born out of the realisation that, in our weekly team meetings, there were a lot of interesting debates taking place which never left the four walls of the meeting room. This seemed like a bit of a waste, especially as we know there are other organisations out there grappling with the same issues. So, we wanted to create an informal ‘space’ where we can post our current thinking, and share ideas and questions with colleagues inside and outside the organisation.
As in our work, we expect the audience for these blog posts to be diverse, ranging from academic researchers and professional evaluators to Monitoring, Evaluation and Learning officers and advisers in countries and regions, and practitioners and colleagues in the field. The goals of this blog are to:
- share methodological challenges, questions, reflections, and lessons arising from our work as researchers and technical advisers in Oxfam
- share the methodological experimentation we are currently working on
- promote research or evaluations of which we feel particularly proud
- explore additional research questions and methodological components which may not have made it into official publications
- respond to methodological questions we receive from colleagues
- advertise consultancy opportunities and positions within the team
- and finally, crowdsource unanswered and unsolved questions we are battling with in the team
We hope that, in response to our posts, you will give us input and feedback, share your experiences and ideas, ask us questions, and endorse or challenge our thinking. This should lead to some fruitful and exciting discussions, and maybe some new collaborations.
Next time, we’ll bring you a piece from Jonathan Lain, one of our Global Impact Evaluation Advisers, on conducting power calculations by simulation for propensity score matching estimators.
I hope you will enjoy it!