Use this page to understand why you need a plan to manage ethical risks when you use technology.
Everyone who creates digital services, provides activities or builds digital tools must pay attention to the ethical impacts this has on others.
This means thinking about the impact your technology has on everyone it touches.
This is not something that only affects digitally advanced organisations. If you connect with the people you support online in any way, it affects you.
Any digital service can be run in ethical or unethical ways.
We don’t intend to do harm when we’re trying to help people. But good intentions alone don’t prevent it.
Technology doesn’t police itself. Many of the digital services and technologies available for commercial use aren’t inherently good or bad. But how we use them can have a significant impact in creating positive or negative outcomes.
Ask probing questions to help you think about your ethical responsibilities.
There are eight areas of risk. This page focuses on four that are most relevant to charities and other organisations.
Misuse of data is a serious problem. Do you always understand what you’re signing up for?
Very often the people you support won’t have the skills to work out what permission they’re giving on the websites and apps they use. They need your help to understand what you’re doing with their information.
The main ethical message is that asking people to give permission is not enough. You need to help people understand what they’re giving permission for.
Data protection law gives everyone the right to ask to see what you’re doing with any information you hold about them.
Over the next few years people will expect to see this more clearly and have much more control over this information. They’ll want it at the push of a button. They’ll become more and more keen to protect data about themselves and choose how it’s used.
You need to think about how to make it easier to tell people what data you’re using and in what ways. You need to make it easier for them to give you their information safely. You also need to make it easier for them to take that permission away again.
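The principle above, making it easy to see, grant and withdraw permission, can be sketched as a simple consent record. This is a hypothetical illustration, not a real library or a complete data protection solution: every grant and withdrawal is logged, so a person can always see what they currently allow.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    """Tracks what one person has agreed to, and lets them withdraw it."""
    person_id: str
    grants: dict = field(default_factory=dict)   # purpose -> time granted
    history: list = field(default_factory=list)  # audit trail of every change

    def grant(self, purpose: str) -> None:
        now = datetime.now(timezone.utc)
        self.grants[purpose] = now
        self.history.append(("granted", purpose, now))

    def withdraw(self, purpose: str) -> None:
        # Withdrawing must be as easy as granting: one call removes consent.
        if purpose in self.grants:
            del self.grants[purpose]
            self.history.append(("withdrawn", purpose, datetime.now(timezone.utc)))

    def summary(self) -> list:
        """What the person can see: every purpose they currently allow."""
        return sorted(self.grants)

# Example: a person opts in to two things, then changes their mind about one.
record = ConsentRecord("person-123")
record.grant("newsletter")
record.grant("service-updates")
record.withdraw("newsletter")
print(record.summary())  # ['service-updates']
```

The audit trail in `history` matters as much as the current state: it lets you show someone not just what they allow today, but when each permission was given or taken away.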
Machine learning and algorithms are elements of AI (artificial intelligence). Charities, social enterprises and public sector organisations are exploring how they can use this type of AI in their services.
These types of AI can pick up bias from the materials (datasets) that are used to train them. They can make the bias even stronger.
If you’re starting to work in this area you need to ask probing questions. You need to understand where the materials and datasets that train the systems come from. Ask any partner organisations you’re working with on this for their ethical policies and practices.
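One concrete probing question is whether a training dataset over-represents some groups. The toy check below uses only the Python standard library and invented example data; real bias auditing needs far more than head-counting, but an imbalance like this is a warning sign worth raising with partners.

```python
from collections import Counter

# Hypothetical training examples as (case, group) pairs.
# In practice the group label might be gender, region or age band.
training_data = [
    ("application A", "urban"), ("application B", "urban"),
    ("application C", "urban"), ("application D", "urban"),
    ("application E", "rural"),
]

counts = Counter(group for _, group in training_data)
total = sum(counts.values())

for group, n in counts.most_common():
    print(f"{group}: {n}/{total} ({n / total:.0%})")

# A model trained on this data sees four urban examples for every
# rural one, so it may perform worse for rural service users.
```

Running this prints the share of each group, here 80% urban against 20% rural, which is exactly the kind of skew that a model can pick up and amplify.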
The connectivity of the digital world brings risks. Many communications applications log email, text message or call content. GPS tracking can monitor where people are through their phones. Storing sensitive personal information in databases when people need help or support means that information exists as a record.
Depending on the law in each country, police and governments can demand access to these records. In some cases this has led to arrests and torture.
Last reviewed: 02 March 2021