Technology — when good intentions go wrong

Use this page to explore imagined-from-life examples which look at the ethical challenges involved in technology and what can go wrong. All the examples are about people trying to do the right thing. Sometimes that isn’t enough.

Example: A problem with algorithms

In a community response to a nationwide issue, a funder sets up a £5m fund to provide support to small charities. The funding focuses on the supply and delivery of food, medicine, menstrual products and emergency hardship grants.

The funder uses an algorithm to shortlist the ‘top’ applications. The algorithm scores bids on keywords like ‘free school meals’ and runs a postcode check against factors that include health statistics and deprivation (poverty) measures.

The funder's aim is to see where it could make the most impact.
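
To make the mechanism concrete, here’s a minimal sketch in Python of how a shortlisting score like this could work. Everything in it is invented for illustration, including the keywords, the postcode districts, the weights and the bids. It isn’t the funder’s actual code.

    # A deliberately naive shortlisting score of the kind described above.
    KEYWORDS = {"free school meals", "food bank", "hardship grant"}

    # Hypothetical deprivation weights per postcode district (higher means
    # more deprived). A real funder would use published deprivation indices.
    DEPRIVATION_BY_DISTRICT = {"M14": 0.9, "LS11": 0.8, "GU1": 0.2}

    def score_application(text: str, postcode: str) -> float:
        """One point per keyword match, plus a weight for the applicant's area."""
        text = text.lower()
        keyword_score = sum(1.0 for kw in KEYWORDS if kw in text)
        district = postcode.split()[0]  # 'M14 5TQ' becomes 'M14'
        # Districts missing from the table silently score zero. This is
        # the geographical lottery the applicants describe.
        area_score = DEPRIVATION_BY_DISTRICT.get(district, 0.0)
        return keyword_score + area_score

    # Two bids describing the same need in different words.
    bid_a = "We supply free school meals vouchers to local families."
    bid_b = "We run a breakfast club for children who would otherwise go hungry."
    print(score_application(bid_a, "M14 5TQ"))  # 1.9
    print(score_application(bid_b, "M14 5TQ"))  # 0.9, same need, lower score

The second bid matches no keyword, so it scores lower even though it describes the same need. That is how choices about wording and data quietly become bias.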

When some applicants hear about the decision-making process, they challenge the outcomes of the award. They believe the algorithm could reflect the unconscious bias of the people who wrote it, turning the fund into a geographical lottery.

How did this go wrong?

Nobody asked the important questions at the right moment.

  • Should a funder tell applicants it's using an algorithm before they start taking applications?
  • Should the funder make it clear that location matters?
  • Should keywords and phrases be used at all? What if the bid comes from someone whose first language isn’t English, or from a community that phrases the same problems in a different way?

The consequences

  • People in locations the algorithm would screen out spent time and effort on applications that had little chance of success.
  • The process favoured some applicants over others.
  • The overall impact of the funding may have been reduced, particularly in some communities.

Funders who are considering using algorithms need to assess the likely implications for equality and fairness. They also need to tell applicants clearly, when they open the application process, how the algorithm will be used.

Example: A problem with terms and conditions of software companies

A youth organisation is delivering more and more of its services online. It’s using Instagram and other social media channels.

It realises it needs a database to keep track of the growing response, so it experiments with a CRM system that offers a free pricing tier.

Then it discovers that the free tier allows the company that sold it the system to process, share and sell the information the organisation gathers.

How did this go wrong?

  • The organisation didn’t read the terms and conditions carefully enough. It was rushing to grab something that had no cost.
  • It didn’t realise that the free tier of the CRM system wasn’t covered by the software company’s privacy policy – but the subscription version was.
  • Its data protection policy was a document somebody had saved in a file once. It had no process for risk-assessing every decision, which would have prevented this mistake.

The consequences

  • The organisation had to work out how to tell the young people, and their parents and guardians, what had happened.
  • It had to find out how the young people could get their information removed from the software company's lists.
  • It had to live with the fact that its rush to find free services had put people at risk. It realised that its duty of care mattered online as well as face-to-face.
  • Fortunately, case information about the young people wasn’t included, as the organisation had only just started using the system for social media data.

A data protection policy with active risk assessments that everyone has to follow could have made all the difference. NCVO members can use our guide to creating a good data protection policy.

Example: A problem with deciding to share data

A charity wins a grant to improve things for its community in response to covid-19. It's asked to share data in the following ways.

  • Aggregated reports and demographic information about covid prevalence in different community groups (a formal condition of the grant).
  • An ad-hoc request for historical data about local communities that the charity holds for its own purposes.

The charity has permission from the people it’s been working with to store their data and use it for its own reporting. The data includes sensitive information about vulnerable people: their mental health, and the immigration status of refugees. The charity has never asked permission to share the data more widely. Sharing it could help improve covid outcomes, and save and improve lives.

It decides to give a quick answer and says yes, without asking the participants for permission. It takes only a small step towards anonymising the data.

How did this go wrong?

  • Time pressures and the severity of the crisis meant the charity didn’t think of some of the ways it could’ve reduced risks.
  • It didn’t consider fully anonymising the data, or aggregating it, before sharing.
  • It didn’t get its clients' permission.
  • It didn’t think about how the data might be used in the future.

The consequences

  • The grant funder had access to sensitive data about vulnerable people without their permission and with only partial anonymity.
  • There was no agreement on what the funder could use that data for. So it could be passed on to other organisations, such as the NHS or the Home Office, for reasons not related to covid.

A data-sharing agreement should have been drawn up to limit what the data could be used for. As there was no time to ask people’s permission, the data should have been more fully anonymised or aggregated before it was shared.
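
As an illustration of what ‘more anonymisation or aggregation’ can mean in practice, here’s a minimal sketch in Python of aggregated counts with small-number suppression, a common disclosure-control step. The field names, records and threshold are all made up, and real anonymisation decisions need specialist advice.

    from collections import Counter

    # Hypothetical client records. In reality these would hold far more
    # sensitive detail, such as mental health or immigration status.
    records = [
        {"ward": "North", "covid_positive": True},
        {"ward": "North", "covid_positive": True},
        {"ward": "North", "covid_positive": True},
        {"ward": "North", "covid_positive": False},
        {"ward": "South", "covid_positive": True},
    ]

    SUPPRESS_BELOW = 3  # counts this small could identify individuals

    counts = Counter(r["ward"] for r in records if r["covid_positive"])
    shared = {
        ward: n if n >= SUPPRESS_BELOW else f"<{SUPPRESS_BELOW}"
        for ward, n in counts.items()
    }
    print(shared)  # {'North': 3, 'South': '<3'}

Sharing counts like these, rather than rows about individual people, limits what the funder, or anyone it later passes the data to, can learn about any one person.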

Further reading

If you need help with planning, look for the planning template in this responsible data management training pack from Oxfam.

For organisations with dedicated IT or digital teams, volunteers or staff members, the full Ethical Operating System toolkit can help you catch problems before they happen.

This page was last reviewed for accuracy on 02 March 2021
