
Reflection and learning about software or tools

You’ve found a tool that seems like the best one for the job. Now is a great time to think about testing the tool with your users, making sure it’s a good fit.

The same rule applies here as to the planning and research process. The bigger and riskier the project, the more time you should spend testing before you roll it out.

Test the tool or software

You can test tools or software in several ways.

  • Live usability tests. These can be face to face or remote. You create tasks for people to do, observe them using the tool to carry the tasks out, and ask them to talk about their experience.
  • Recorded usability tests. You create tasks for people to do. They record themselves doing the tasks, speaking their thoughts aloud as they go.
  • Data gathering. Use analytics or heat-mapping software to find out where people go and what they do in the tool.
  • Feedback surveys. These are usually less helpful than the other methods. If you're short on time, you can use them to find out what people love or hate most.

Here are the types of questions you'll want to explore.

  • Can users complete the tasks you hoped the tool would help with?
  • Are they getting value from the tool? Is it helping with their overall goal or state as well as the specific task?
  • Are as many people using it as you expected?
  • Are people reporting any issues or asking questions?

Don’t forget that your staff are users too.

Depending on how the testing goes, you'll need to choose one of the options below.

  • Roll out the tool or software as it is.
  • Get support to see if you can change (often by configuration) the parts that are causing problems. Then roll out the tool or software.
  • Decide that there's too much to change and test a different tool or piece of software.


As you make those decisions, a reflective mindset will help.

Reflection prompts

  • What were you hoping to achieve by implementing this tool?
  • Did your plans change as you progressed through testing? If so, how?
  • What went well and why? Which aspects of the tool or service did users find most useful?
  • What could have gone better? Which aspects of the tool or service did users struggle with or not engage with?
  • Does the tool meet all your requirements, or are some missing? Can the tool be modified to meet those missing requirements?
  • What advice would you give yourself if you were to go back to the start of the project?
  • What are the two or three key lessons you would share with others?
  • What’s next for you on this project? How comfortable do you feel with moving forward and implementing the tool? Do you have any hesitations about it?
  • Looking forward, what will you have learned from this project a year from now?
  • Are there any lessons for you personally?

This page was last reviewed for accuracy on 02 March 2021
