By James Eaton-Lee and Elizabeth Shaughnessy
The collection and processing of biometric data – data linked to the human body or its behaviour, such as fingerprints or facial recognition – has become significantly more common in humanitarian contexts worldwide over the last five years. Most donors and large INGOs have taken a position on biometric data or have some experience of rolling it out. But this technology is not without risk or controversy, which led Oxfam in 2015 to issue a ‘moratorium’ on using it in its work.
In 2017, Oxfam reviewed its moratorium and commissioned the Engine Room to produce a report on this debate, with specific recommendations about governance and risk. The report recognised the need for better sector-wide understanding of the implications of biometrics. Since then, Oxfam has explored the possibilities – and risks – of biometrics in a number of contexts, seeking to understand what responsible biometrics might look like in practice.
In 2021, we decided to formalise our understanding and practice into a policy – the Oxfam Biometric & Foundational Identity Policy – which attempts to reflect the nuance and balance this complex area requires.
A policy fit for the future
This new policy is intended to do the following:
- Identify clearly what ‘good’ practice looks like – and where the more nuanced red lines lie;
- Provide a practical framework for building safe biometric programming, solutions or systems;
- Enable us to better engage with partners, being transparent and accountable regarding our approach.
Unsurprisingly, GDPR and Oxfam’s existing Responsible Data in Program policy informed the approach of the new policy – which is rights-based, focuses on people and harms, and includes accountability requirements.
But its principles in particular reflect the challenges we have encountered in exploring this area since 2017 – classic issues around how data may be used and with whom it may be shared, but also technical questions about how data is protected and stored, and what biometrics look like in partnership.
It is, in scope and by design, an ethics- and harm-based approach, seeking to align with frameworks like GDPR while reflecting a broader ethical commitment to ‘Do No Harm’.
In focus: the policy’s key principles
The policy has seven key principles:
- Plan, be proportionate and be responsible
- Be accountable to individuals and their community
- Share control with individuals and their communities
- Address risk to individuals and their community
- Address security risk
- Employ responsible biometric practice
- Define relationships with third parties
Importantly, these principles and this policy establish a number of red lines:
- No data collection without a clear biometric use-case, or without knowing where the data will go (I)
- We must offer choice – no data collection without genuine consent – and build in alternatives (e.g. in-kind assistance without identifying individuals) (III)
- We require partners to have and demonstrate good cybersecurity (V)
- Storing biometric data at field level (and other risky practices) is banned (VI)
- Partners must align on the same values (VII)
As part of developing and signing off this policy, we carried out a series of internal consultations, including engagement with a number of country teams that had expressed a need for, or interest in, biometric solutions (particularly for Cash and Voucher Assistance). We did this through informal sessions as well as a structured qualitative survey designed to identify areas of the policy that were unclear, might be challenging to implement, or might be difficult for partners. We also wrote explicit governance and risk assessment requirements into the policy, alongside supplemental material intended to make it easier to use while keeping it clear and well-governed.
Next steps: engaging our partners
We now look forward to the internal and external implementation of this policy. Internally, we are realistic that this is a complex and nuanced area. Our implementation plan includes communication and engagement throughout Oxfam, as well as structured training and capacity building.
But we are particularly aware of the need for better external consultation, planning, and engagement outside Oxfam. We look forward to working with partners who already have biometric deployments, both to learn from them and to understand how our policy may enable us to work better and more safely with them. And we hope that by sharing our policy, we help to further the sector-level conversation about what ‘good’ biometrics look like, and continue to spur research, standards, evidence-gathering, and the collation of good practice. We are also committed to reviewing this policy in 12–24 months, learning, we hope, from the successes of version 1 and the feedback of our partners!
Biometric data collection and processing is gaining momentum in humanitarian contexts, and it’s important that we’re ready for it – as an organisation and sector-wide. New technology in humanitarian and programme spaces can mean exciting potential to reach more people, to facilitate and distribute aid more broadly, to expand access, and to advocate and influence more widely. But it is crucial that we do so in a safe, thoughtful and nuanced way, recognising the challenges that often come with new technology: the additional resources and capacity needed to ensure security and protect people’s rights, the sensitivity of the data collected and the privacy of individuals, and the balance of power between those wanting to use technology and those it would most affect – the people we are working for.
We welcome feedback on this policy, especially if you are a partner or have area expertise. Please get in touch with us at email@example.com.
James Eaton-Lee is Head of Information Security and Data Protection Officer. He runs the specialist team at Oxfam GB providing oversight and support on Privacy, Cyber Security, Data Ethics & Responsible Data – helping colleagues across the Oxfam Confederation to use data, systems, and technology safely, legally, and ethically.
Elizabeth Shaughnessy is a Senior Data Privacy Analyst in Oxfam GB’s Data Protection and Information Security team. Her work includes specialist support for privacy and data ethics as part of Oxfam’s Privacy Management Programme, including Data Protection in Humanitarian, Biometrics, AI/ML and Emerging Tech more broadly.