Effective evaluation is vital to understanding the impact of your work, and pre-COVID-19 you may have been using an evaluation framework that helped you understand the effectiveness of a particular project. We are aware that some evaluation frameworks focus more on the impact of physical engagements with your audiences or communities and, clearly, since March many organisations have been delivering significant programmes of work online. This roadmap has therefore been designed to help you think about how to use data to evaluate the impact of your online work. Even once we are able to gather together in physical spaces, many organisations will continue to reach and engage audiences online, so some of the pointers in this roadmap will remain salient in the long term.
Five Key Steps to Follow:
- Agree your project objectives
- Review available data from digital platforms
- Identify any data gaps
- Develop your framework & collection process
- Review & reflect
1. Agree your project objectives
In order to know what data to collect, you need to agree your objectives with key stakeholders and develop your evaluation plan (‘framework’). You need to do this before you start your activity, because some types of data can be difficult to collect retrospectively.
An evaluation framework is essentially a plan that underpins the evaluation. It describes:
- Your audiences or participants.
- The outcomes you want to bring about.
- What will indicate whether you have achieved these.
- The evidence that can help you prove it.
- The methodologies you will use.
You can add further detail, such as who will be involved, the resources you’ll need and when you will do the data collection.
If you are new to evaluation, you might want to begin by exploring logic models, which can be useful, in particular, for evaluating projects that have wider social, community or wellbeing objectives. It’s important to ask yourselves questions such as:
- Who are the stakeholders of this work? Why is evaluation important to them?
- What difference do we want this project to make? What changes do our funders want to see?
- What will happen if change takes place? What are the indicators of success?
A framework for a community participation programme that aims to engage a particular group in a specific geographic location, for example, will look very different from one for a series of online live streams that is seeking to reach a young audience.
Evidence and examples to explore | gov.uk Introduction to Logic Models; 3Ps Planning Framework for Uncertain Times; Digital SOS mentoring; Digital for Audience Development; Digital Culture Compass; Measuring Online Activity in a Locked-Down World; Case in Point: Art UK; Case in Point: National Gallery; Developing a meaningful digital strategy
2. Review available data from digital platforms
Broadly speaking, we can divide data about online activity into two different types:
- Data from analytics tools, such as Google Analytics, social media and email marketing platforms.
- Data that you will need to gather directly, whether via quantitative or qualitative methods. Quantitative data may come from existing sources, whereas qualitative data will often require new collection to fill the gaps.
Commonly used and readily available analytics tools for quantitative data gathering:
- Google Analytics is an important tool to help you understand digital impact. You should ensure that it is configured correctly, because by default it only measures two things: how a user came to your site (e.g. via Google search, social media and so on) and what pages they visit while they are there. If you want to measure engagement with content such as video, or capture information about other interactions such as downloads, you need to make sure you have Event Tracking set up. If you are new to Google Analytics, try the free learning resources provided by Google or Arts Council England’s Google Analytics for beginners webinar.
- Social media metrics will also likely form a part of your evaluation plan but it is a good idea to think carefully about which metrics are meaningful within the context of your work. If one of the aims of your project is to reach as many people as possible then aggregate ‘reach’ figures may be important but equally, if you are more interested in engaging a particular group of individuals, then engagement rate may be a more meaningful metric to use. You can see all the available metrics for the main social media platforms in a downloadable spreadsheet available here and commentary about the most meaningful metrics here.
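To illustrate why the choice of metric should follow your objectives, the hypothetical Python sketch below computes aggregate reach and per-post engagement rate from a handful of post metrics. The figures and field names are invented for the example; engagement rate is commonly calculated as engagements divided by reach, expressed as a percentage.

```python
# Illustrative only: the figures and field names here are invented.
posts = [
    {"reach": 12000, "engagements": 380},  # engagements: likes, comments, shares
    {"reach": 4500,  "engagements": 310},
    {"reach": 800,   "engagements": 96},
]

# Aggregate reach: relevant if the aim is to reach as many people as possible.
total_reach = sum(p["reach"] for p in posts)

# Engagement rate: relevant if the aim is deeper engagement with a
# particular group rather than maximum reach.
for p in posts:
    p["engagement_rate"] = round(100 * p["engagements"] / p["reach"], 1)

print(f"Total reach: {total_reach}")
for p in posts:
    print(f"reach={p['reach']:>6}  engagement rate={p['engagement_rate']}%")
```

Note that the smallest post here has the highest engagement rate while contributing least to total reach: the same raw numbers can tell very different stories depending on which objective you measure against.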
Challenges to data collection:
There are a number of challenges in collecting quantitative data from digital platforms:
- Metrics can differ across platforms. For example, Twitter and Instagram report ‘Impressions’, but Facebook only does so for its paid ads.
- There are variations in how far back you can go, and with a platform like Instagram you can’t download the insights directly (unless you connect your account to a third-party social monitoring tool such as Hootsuite or Sprout Social).
- There are also occasions where it’s difficult to get the exact information you’d like, such as whether a specific age group is engaging with a particular ring-fenced set of content on Instagram (in this example, you can see age as it pertains to your audience generally, but not tied to individual pieces of content).
While there is no easy fix for these issues, the key is to return to your objectives and be clear how the data you’re collecting relates back to what you’re trying to achieve.
Vital, Useful or Merely Interesting?
When working with data, a good tip is to remember the acronym VUMI: Vital, Useful or Merely Interesting. There are myriad types of data you could be collecting, some of which may be interesting but won’t particularly help you to assess impact. You’re aiming to collect data that fits within the ‘Vital’ and ‘Useful’ categories.
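A minimal Python sketch of the VUMI triage idea: label each candidate metric before you start collecting, then keep only the Vital and Useful ones. The metric names and labels below are invented examples for a hypothetical project, not recommendations.

```python
# Hypothetical triage: these metric names and VUMI labels are invented
# examples, and would differ for every project and set of objectives.
candidate_metrics = {
    "unique visitors to project pages": "Vital",
    "video completion rate":            "Vital",
    "engagement rate per post":         "Useful",
    "newsletter click-through rate":    "Useful",
    "total follower count":             "Merely Interesting",
    "page likes":                       "Merely Interesting",
}

# Only Vital and Useful metrics make it into the collection plan.
to_collect = [metric for metric, label in candidate_metrics.items()
              if label in ("Vital", "Useful")]

for metric in to_collect:
    print(metric)
```

Writing the labels down before collection starts makes it easier to resist gathering data that is merely interesting.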
Evidence and examples to explore | Main social media metrics; Most important social media metrics to track; Digital Culture Compass; Google Analytics for beginners webinar; Setting up Events tracking; Digital SOS mentoring; Measuring Online Activity in a Locked-Down World; Online Community Participation Masterclass; Using Data to Make Your Case Masterclass
3. Identify any data gaps
Digital analytics can tell you a huge amount about whether or not audiences are engaging with your project, but tools like Google Analytics and insights from social media platforms are mainly designed to tell you about what people are doing, not so much who they are or what they think. They do not tell you:
- What their motivation is.
- Their response to the activity or content.
- Anything about longer term impact.
For a comprehensive evaluation of digital activity, then, you will almost certainly need to gather some further primary data, both quantitative and qualitative.
Running surveys to get a broader picture of your audiences:
In order to understand more about who your online audience is, it is a good idea to run a survey. Depending on the nature of your project, you can do any or all of the following:
- Deploying it on your website.
- Sharing it via social media.
- Distributing it via email.
You can use the free digital survey that The Audience Agency has created (to share on your website and via your social media accounts) to give you a number of useful insights into:
- Frequency of engagement.
- Demographic and location information.
- Audience Spectrum profile.
- Overlap with physical attendance.
- Motivation to engage.
Making the most of your survey:
- When deploying a survey on your site, it is essential to put this overtly in front of audiences to obtain good response rates. A URL hidden at the bottom of a page will yield poor results. A popup or banner is much more effective.
- Social media and newsletter lists can provide great response rates but do remember that this may only be representative of a certain type of loyal audience member.
- Surveys can also be a great way to gather a pool of people who you could talk to in more depth at a later point. Ask people to opt in to further research at the end of your survey.
Gathering qualitative data:
Evaluation usually includes some elements of qualitative data, and while these may need to be adapted for an online environment, the principles remain the same. You can still:
- Run focus groups,
- Conduct one-to-one interviews,
- Ask people to undertake task based activities,
- Or even think about how you could incorporate more creative evaluation techniques.
Running online discussion groups:
Discussion groups are likely to be a vital part of your evaluation process. But if you are running them online you will need to plan the sessions carefully, taking into consideration that:
- Sometimes, participants in an online session can be more reluctant to contribute than they would be in an in-person session.
- It is harder to read the non-verbal cues and bring people into the discussion.
There are various ways that you can try to mitigate these potential barriers:
- Consider how you will gently encourage feedback from everyone and be clear in explaining the purpose and format of the session.
- Some online meeting software, e.g. Zoom, will allow you to save the chat. You can therefore ask people to type responses or answers to questions into the chat and then save this for use in your evaluation.
- You may decide that it would be useful to record the session but be mindful of privacy and make it clear beforehand that the session will be recorded.
- For particularly sensitive areas of discussion, for example willingness to pay or donating, it may be easier to conduct sessions one-to-one by video or telephone.
Evidence and examples to explore | Digital Audience Survey; Case Study: English Heritage Online; Getting Creative with Evaluation; Audience Spectrum in the time of COVID-19; Digital SOS mentoring; Digital for Audience Development; Digital Culture Compass; Measuring Online Activity in a Locked-Down World
4. Develop your framework and collection process
Once you have identified your programme/project objectives and considered what data you need to help you understand ‘performance’ against those objectives, you can create your framework.
It is important to note that, in order to ensure that the evaluation is comprehensive, you need not only to create a plan but also identify any processes needed to ensure you are regularly collecting the data.
When deciding upon your collection process, you will need to take into consideration that:
- Some data collection can be automated, or the data remains available for a long period of time, as with Google Analytics.
- With other sources, you may need to download the data regularly, unless you are using a third-party social media metrics tool like Hootsuite or Sprout Social.
- For Instagram, for example, you will need to log in to the app regularly and download the relevant data.
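One low-tech way to keep regularly downloaded exports usable over a long project is to drop each export into a single folder and combine them with a short script. The Python sketch below (the folder name, filenames and column layout are assumptions for the example) merges all CSV exports in a folder into one dataset, tagging each row with the file it came from so gaps in collection are easy to spot.

```python
import csv
from pathlib import Path

def combine_exports(folder, output):
    """Combine all CSV exports in `folder` into one file at `output`,
    adding a `source_file` column so each row can be traced back to
    the export it came from. Assumes every export shares the same
    column headings, as repeated downloads from one platform would."""
    rows = []
    fieldnames = []
    for path in sorted(Path(folder).glob("*.csv")):
        with open(path, newline="") as f:
            reader = csv.DictReader(f)
            if not fieldnames:
                fieldnames = list(reader.fieldnames or []) + ["source_file"]
            for row in reader:
                row["source_file"] = path.name
                rows.append(row)
    with open(output, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(rows)
    return len(rows)
```

As a hypothetical usage example, after each weekly Instagram download you might run `combine_exports("instagram_exports", "combined.csv")` to refresh the single combined dataset.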
Crucially, make sure your plan is clear on who is responsible for collecting the data. This is particularly important if the project you’re evaluating is taking place over a longer period of time. You will want to ensure there are no gaps in data collection, which can be caused by, for example, the team member responsible leaving.
Evidence and examples to explore | Google Analytics for beginners webinar; Hootsuite; Sprout Social; gov.uk Introduction to Logic Models; 3Ps Planning Framework for Uncertain Times; Digital SOS mentoring; Digital for Audience Development; Digital Culture Compass; Measuring Online Activity in a Locked-Down World
5. Review and reflect
One of the benefits of online activity is that it can be easier than with a purely physical activity to get an almost instant response to your work. This may cause you to review and adapt your programme, at which point there are a couple of steps not to skip:
- While it’s a really good idea to be responsive in this way, remember that if you change your strategy significantly you may want to review your evaluation plan accordingly.
- For a longer term assessment of your organisation’s digital maturity, we recommend using the Digital Culture Compass, a tool to help cultural and heritage organisations approach, assess and improve digital activities.
Evidence and examples to explore | Digital Culture Compass; Online Community Participation Masterclass; Using Data to Make Your Case Masterclass; 3Ps Planning Framework for Uncertain Times; Digital SOS mentoring;
For further help using data to understand your online activity and the impact of COVID, see The Audience Agency’s Bounce Forwards COVID-19 Response Hub and explore our Evidence Hub, Webinar Series and Support Programmes.
If you need help evaluating a project, please do get in touch via firstname.lastname@example.org. We are experts in digital metrics and evaluation, so whether you want to develop a meaningful measurement framework to help with your monitoring and reporting, or understand the impact of your online activity on a project basis, we can help.