Use this page to explore imagined-from-life examples which look at the ethical challenges involved in technology and what can go wrong. All the examples are about people trying to do the right thing. Sometimes that isn’t enough.
In a community response to a nationwide issue, a funder sets up a £5m fund to provide support to small charities. The funding focuses on the supply and delivery of food, medicine, menstrual products and emergency hardship grants.
It uses an algorithm to shortlist the ‘top’ applications, searching for keywords like ‘free school meals’ and running a postcode check against factors that include health statistics and deprivation (poverty) measures.
The funder's aim is to see where it could make the most impact.
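For illustration only, here is a minimal Python sketch of the kind of shortlisting rule described above. The keyword weights, postcode lookup and field names are all invented for the example; the point is that these hand-chosen values are exactly where unconscious bias and geographical gaps can creep in unnoticed.

```python
# Illustrative sketch only: a hypothetical keyword-and-postcode scoring rule.
# All weights, postcodes and field names are invented for the example.

KEYWORD_WEIGHTS = {          # hypothetical weights chosen by the fund team
    "free school meals": 3,
    "food bank": 2,
    "emergency hardship": 2,
}

DEPRIVATION_BY_POSTCODE = {  # stand-in for an external deprivation index
    "M14": 0.9,
    "SW1": 0.2,
}

def score_application(text: str, postcode: str) -> float:
    """Score one application: keyword hits boosted by a postcode-based deprivation factor."""
    text = text.lower()
    keyword_score = sum(weight for kw, weight in KEYWORD_WEIGHTS.items() if kw in text)
    # Crude outward-code lookup. Postcodes missing from the index silently
    # score zero on deprivation, however great the local need.
    deprivation = DEPRIVATION_BY_POSTCODE.get(postcode[:3].strip(), 0.0)
    return keyword_score * (1 + deprivation)

# Shortlist the 'top' applications by score.
applications = [
    {"name": "Charity A", "text": "We deliver free school meals to families", "postcode": "M14 5TQ"},
    {"name": "Charity B", "text": "We support refugee families with emergency food", "postcode": "EX4 4QJ"},
]
shortlist = sorted(
    applications,
    key=lambda a: score_application(a["text"], a["postcode"]),
    reverse=True,
)
```

In this toy version, an application that uses different wording, or comes from an area missing from the lookup, scores zero regardless of need, which is one way a fund can turn into a geographical lottery.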
When some applicants hear about the decision-making process, they challenge the outcomes of the award. They believe the algorithm could reflect the unconscious bias of the people who wrote it and turn the fund into a geographical lottery.
Nobody asked the important questions at the right moment.
Funders who are considering using algorithms need to assess the likely implications for equality and fairness. They also need to make sure they communicate clearly to applicants how the algorithm is being used when they open the application process.
A youth organisation is delivering more and more of its services online. It’s using Instagram and other social media channels.
It realises it needs a database to help keep track of the growing response, so it experiments with a CRM system that offers a free pricing tier.
Then it discovers that the free tier allows the company that sold it the system to process, share and sell the information the organisation gathers.
A data protection policy with active risk assessments that everyone has to follow could have made all the difference. NCVO members can use our guide to creating a good data protection policy.
A group wins a grant to improve things for its community in response to covid-19. It's then asked to share the data it has collected.
The charity has permission from the people it's been working with to store their data and use it for its own reporting. The data includes sensitive information about vulnerable people, such as their mental health and the immigration status of refugees. The charity has never asked permission to share the data more widely, even though doing so could help improve covid outcomes and save lives.
It decides to give a quick answer and says yes without asking the participants for permission. It takes only a small step towards anonymising the data.
A data-sharing agreement should have been drawn up to limit what the data could be used for. As there was no time to ask people's permission, the data should have been supplied with more anonymisation or aggregation.
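As a rough illustration of what more anonymisation or aggregation can look like in practice, here is a short Python sketch that drops direct identifiers and shares only aggregate counts, suppressing groups so small they could identify individuals. The field names, categories and threshold are invented for the example.

```python
# Illustrative sketch only: share counts per support need rather than
# row-level records. Field names, categories and threshold are invented.
from collections import Counter

records = [  # hypothetical row-level data the charity holds with consent
    {"name": "A. Person", "postcode": "LS6 2AB", "support_need": "mental health"},
    {"name": "B. Person", "postcode": "LS6 3CD", "support_need": "immigration advice"},
    {"name": "C. Person", "postcode": "LS7 1EF", "support_need": "mental health"},
]

def aggregate_for_sharing(rows, min_count=3):
    """Return only counts per support need, suppressing groups smaller than min_count."""
    counts = Counter(row["support_need"] for row in rows)
    return {need: n for need, n in counts.items() if n >= min_count}

shared = aggregate_for_sharing(records)  # {} here: every group is too small to share safely
```

With only a handful of records, every group falls below the threshold and nothing row-level leaves the organisation.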
If you need help with planning, look for the planning template in this responsible data management training pack from Oxfam.
Organisations with dedicated IT or digital teams, volunteers or staff members can catch problems before they happen using the full Ethical Operating System toolkit.
Last reviewed: 02 March 2021