Technology drivers
The ever-evolving technology landscape is reshaping how we live, work and interact. It is having a profound impact on our daily lives, our sector and those we seek to support.
In this section, we explore the key trends that will dominate the technological sphere in the year ahead and how voluntary organisations can navigate these advancements.
The artificial intelligence explosion will continue at pace
The UK government broadly defines artificial intelligence (AI) as machines that perform tasks normally performed by human intelligence, especially when the machines learn from data about how to do those tasks.
Google’s chief executive, Sundar Pichai, has said that the "transition we are seeing right now with AI will be the most profound in our lifetimes, far bigger than the shift to mobile or the web before it". AI has been identified as one of the five critical technologies for the UK through to 2030.
Since the launch of ChatGPT in late 2022 – a pioneering generative AI that creates content in response to prompts – AI has moved increasingly into the mainstream. 'AI' was the Collins Dictionary Word of the Year 2023.
From facial recognition software to products used by charities every day, throughout 2024 we can expect AI to be increasingly prevalent in our work and lives.
- AI will increasingly be integrated into our everyday work: In November 2023, Microsoft rolled out Copilot – its own AI technology integrated into the Microsoft Office programs dominant across the world. In December 2023, Google released Gemini, its most capable AI model to date. Expect big shifts in ‘everyday tech’, offering opportunities to use AI to enhance and speed up your work.
- AI will increase interaction online: It’s increasingly the norm for online meetings to include interactive elements like polls and open questions. These systems are now rolling out AI features: Mentimeter will analyse results from participants, and Kahoot will group replies to questions.
- AI will use a wider range of inputs to create better outputs: Commonly used AI works like a chatbot – users enter prompts, and the machine generates a response. In the year ahead, we will likely see ever more sophisticated ‘multimodal’ generative AI – systems that can draw on a much wider range of inputs. By combining text, audio, video and images, they can create ever more original and effective outputs (see the sketch after this list).
- AI ‘hallucinations’ are likely to reduce: When we make requests to generative AI tools, we expect correct answers. However, these tools can sometimes ‘hallucinate’ – producing responses that look plausible but are inaccurate, misleading or nonsensical. As training data improves, feedback from millions of users is incorporated and new techniques are rolled out, analysts expect these hallucinations to reduce.
- More charities are bringing AI into their work: Increasingly, voluntary organisations are experimenting with the use of AI. Charities are starting to use ‘charity chatbots’ to meet the needs of those they seek to serve, and new apps and services are being developed. For example, the VODA app is using AI alongside clinical techniques to provide tailored mental health support for the LGBTQ+ community. The next year will likely see pilots and under-the-radar initiatives launch and grow.
- Campaigners are using AI to enhance their advocacy: Increasingly, campaigners are weighing up the risks and benefits of AI within social change. For example, Witness has outlined how generative AI could be used in human rights advocacy. Positively, it can be used to anonymise individuals and protect identities, or to reconstruct places for advocacy purposes.
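To make ‘multimodal’ more concrete, here is a minimal sketch of a request that combines text and an image in a single prompt, using the OpenAI Python client as one example. The model name, image URL and prompt wording are illustrative assumptions, not recommendations – any comparable multimodal service could be substituted.

```python
# A minimal sketch of a multimodal request: one prompt combining text
# and an image. Assumes the `openai` package is installed and an
# OPENAI_API_KEY environment variable is set.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4o",  # assumption: any model that accepts image inputs
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "Describe this fundraising poster and suggest "
                         "alt text for our website."},
                # Hypothetical image URL for illustration only
                {"type": "image_url",
                 "image_url": {"url": "https://example.org/poster.jpg"}},
            ],
        }
    ],
)

print(response.choices[0].message.content)
```

The same pattern extends to audio and video on services that support them: the request carries several kinds of input, and the model draws on all of them to generate its output.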
Stay ahead
- For practical advice on getting started with AI, read this NPC blog.
- Charity leaders and trustees can use this tailored AI Checklist from digital consultancy Zoe Amar to gain a shared understanding of AI and help make decisions.
- If you’re considering using AI in your services or provision, consider the advice in the UK government guide to using artificial intelligence in the public sector on how to assess, plan and manage it.
Build your skills in using AI
While AI profoundly impacts how we work, many voluntary organisations are unprepared. The most recent Charity Digital Skills report (the annual barometer of digital skills, attitudes and support needs across the sector) found that over three-quarters of charities (78%) agree or strongly agree that AI is relevant to their organisation and could transform how they work.
Yet, almost three-quarters (73%) disagree or strongly disagree that they feel prepared to respond to the opportunities and challenges that AI brings.
Larger charities are more likely to use (or plan to use) AI in their operations, so there is a risk of a growing divide: smaller charities may lack the financial resources or knowledge to benefit, which could deepen inequality within the sector.
Over a third (35%) already use AI for certain tasks. Across the sector, AI is being used to:
- summarise meetings (see the sketch after this list)
- create social media content
- help write board papers
- generate ideas
- support service delivery.
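As an illustration of the first of these tasks, the sketch below shows how a meeting transcript might be summarised with a large language model, using the OpenAI Python client as one example. The model name, prompt wording and file name are illustrative assumptions; a comparable provider or locally hosted model could be substituted, and sensitive transcripts should first be checked against your data protection policies.

```python
# A minimal sketch: summarising a meeting transcript with an LLM.
# Assumes the `openai` package is installed and an OPENAI_API_KEY
# environment variable is set.
from openai import OpenAI

client = OpenAI()

def summarise_meeting(transcript: str) -> str:
    """Return key decisions and action points for a transcript."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption: any capable chat model
        messages=[
            {"role": "system",
             "content": "You summarise charity meeting transcripts into "
                        "key decisions and action points."},
            {"role": "user", "content": transcript},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # Hypothetical transcript file for illustration only
    with open("meeting_transcript.txt") as f:
        print(summarise_meeting(f.read()))
```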
While there are opportunities for charities to use AI to improve their reach and effectiveness, there are essential questions for leaders to ask themselves about how to protect the distinctive value of their organisation.
There is a risk that AI technology is designed in ways that discriminate against the people charities employ, engage and work with. We need to understand the technology and maintain an open dialogue with companies and the government to make sure big technology firms don't overlook how bias can affect their systems.
How we work is just as important as what we do. Voluntary organisations will need to adopt new technology in a way that doesn't erode the value we bring, which often hinges on human relationships.
Stay ahead
- If you choose to use AI, follow this Catalyst guide to how charities can use AI ethically.
- If you plan to build your staff and volunteers’ skills in using AI, consider the Innovate UK BridgeAI ‘AI Skills for Business Competency Framework’ (first draft out for consultation as of November 2023).
Expect light-touch artificial intelligence regulation
In 2023, the UK government issued its AI White Paper, which sets out its approach to ‘implementing a proportionate, future-proof and pro-innovation framework for regulating AI’.
It has set out principles to guide and inform the responsible development and use of AI, for existing regulators to apply. The government's stated aim is for each regulator to use its domain-specific expertise to adapt the principles to the specific areas in which AI is used.
We can expect sector-specific regulators, such as the Charity Commission, to grapple with these requirements in the year ahead. Those regulators with significant impacts on our sector – such as the Information Commissioner's Office, Ofsted and CQC – must also consider their approach.
Individual regulators' approaches will also sit alongside boosts to existing AI bodies. We already have the Office for AI, responsible for overseeing the implementation of the government's AI strategy, and the Centre for Data Ethics and Innovation, which leads the government's work to enable trustworthy innovation using data and AI.
In 2024, they will be joined by a new AI Safety Institute – the first state-backed organisation focused on advanced AI safety for the public interest.
Advances in AI won't be limited to our borders. Global progress will also have a significant impact on our voluntary organisations.
The AI Safety Summit held in November 2023 provided an opportunity for global leaders and stakeholders to discuss the development of AI. We expect legal reforms across the European Union and the USA, which may impact AI development.
Stay ahead
- Learn more about the UK government's approach to AI in this Institute for Government overview of AI regulation.
- Follow the Information Commissioner's Office guidance on ensuring AI follows data protection law.
Manage escalating cyber security risks
The National Cyber Security Centre (NCSC), the UK’s technical authority for cyber security, has warned of an increasing cyber security threat across the country.
More and more, those with intent to cause harm can access cost-effective tools, lowering the barrier to acting against the interests of others. In the year to August 2023, the NCSC saw a 64% increase in the number of cyber attacks reported to it.
Voluntary organisations are at particular risk. The NCSC's 2023 UK Charity Sector Cyber Threat Report highlighted that charities are increasingly being targeted by hostile actors.
Contributing factors include a high proportion of part-time staff and volunteers, many of whom use their own devices, combined with limited resources and technical skills.
Poor cyber security risks others accessing sensitive personal data about those we work with, many of whom may be particularly vulnerable to further fraud. Financial data about donors could also be misused, potentially leading to a loss of donor confidence and support.
The continuing war in Ukraine, and the UK government's – and many charities' – support for Ukraine, also heightens risks. In December 2023, the UK government stated that the Russian Intelligence Services have been targeting high-profile individuals and entities, including voluntary organisations, through cyber operations.
Such risks may be heightened as we enter a year set to be dominated by preparations for a general election. The most recent UK Government Risk Register highlighted the risks of an attack on government assets or democratic processes, including potential interference in elections.
Stay ahead
- Understand the basics of cyber security by reading our five steps to cyber security.
- Learn more about how small charities can manage cyber risks on the National Cyber Security Centre website.
- To be made aware of significant cyber security issues, sign up for the National Cyber Security Centre Early Warning Service.
Act to build safety and security into your apps
Cyber security is not just a risk to charities. It’s also our duty to those we work with online.
It’s estimated that applications (apps) were downloaded 2.2bn times in the UK in 2022. Increasingly, apps from charities form part of this growth.
However, poorly designed or managed apps bring a range of threats relating to malicious use. In October 2023, the government published an updated Code of Practice for app developers with minimum security and privacy requirements. The government has set an expectation that app developers – including charities – implement the Code by June 2024.
Stay ahead
- If you have or are considering developing an app, make sure you follow the Code of Practice for app developers by June 2024.
Beware the growth of misinformation and deepfakes
Deepfake technology is an increasing risk for all types of organisations. It is becoming easier to use AI and deep learning techniques to create fake content showing people doing or saying things they never did.
At a time of significant polarisation, with charities caught up in the political crossfire, there is a risk that bad actors use deepfake technology to damage the reputation of voluntary organisations or their senior leaders.
Charities are also increasingly being impersonated by those with intent to cause harm. Leading debt advice charity StepChange has been working to combat so-called ‘clone firms’ – private, profit-making businesses that pretend to be ‘StepChange’ online.
A Bureau of Investigative Journalism investigation highlighted how predatory businesses create fake websites to trade off the back of charities’ brands and good names.
Make sure you have the right systems, skills and training to protect your organisation and the people you work with:
- Consider the potential for your organisation to be targeted and manage reputational risks effectively.
- Consider whether you should seek trade mark protection for your brand.
- Report deepfakes or misinformation which may relate to your charity. You can often report misinformation on social media platforms, report dubious websites to the National Cyber Security Centre, and report misleading adverts to the Advertising Standards Authority (ASA).
- Set out your expectations in your Code of Conduct for staff or volunteers who share misinformation, and ensure appropriate moderation of your channels – such as WhatsApp groups or social media groups.
Stay ahead
- Learn how tech firms say they are tackling Deepfakes and Synthetic Media in this blog from the tech trade body techUK.