Evaluation with learning disabled audiences and participants

Corali dancer Veneshia Bailey – Excellent Together project; photo by Jon C Archdeacon
The Digital Culture Network has recently been supporting a group of creative and cultural organisations to find out more about audience data collection and evaluation with learning disabled participants and audiences. We’ve been particularly inspired by a consultation carried out by Jennifer Dyer on behalf of Corali Dance Company, which made a number of practical recommendations and ideas featured throughout this article.
Gathering audience data helps ensure programming is inclusive, accessible and reaching many different types of people. Collecting feedback and evaluating creative and cultural activities allows us to measure and celebrate success, and to learn from and improve things that could be better. Both are typically required when applying and reporting to funders.
But traditional surveying methods often overlook the access needs of disabled individuals. In this article, we’ll explore some of these challenges, and how we can rethink our approach to evaluating and collecting feedback to ensure that everybody’s voice is heard.

Headway Arts
Challenges of traditional surveying with learning disabled audiences
Traditional surveys can create a range of barriers for learning disabled people:
- Sensory barriers: Surveys are often a very visually dependent medium. This can lead to a range of sensory barriers, exacerbated by text-heavy surveys, and complicated question and form design. Using images can improve accessibility, though they may not be suitable for all questions or individuals. In addition, some digital survey platforms are less well-optimised for accessibility and assistive technology like screen-readers.
- Cognitive barriers: Complex language, lengthy questionnaires, and abstract questions can be hard for some people to process. When we create evaluation questions and statements, we don’t always consider that some audiences need to process concepts and articulate ideas in different ways. Some of the things we’re asking may not even be relevant or meaningful for those audiences – questions about impacts, brand values or identity are often guilty of this.
- Non-verbal communication: Standard surveys typically assume verbal or written communication, but this excludes people who are non-verbal or have limited speech.
- Overwhelm and fatigue: Filling in a survey can be cognitively or emotionally draining, particularly for those who are already managing sensory overload or fatigue.
So, for learning disabled participants and audiences, traditional surveys can create additional stress and discomfort for respondents, as well as producing low-quality response data due to unsuitable questions, language and formats. Even worse, there’s a representation problem – only those who can complete the survey will be counted in the results, excluding anyone who didn’t, or couldn’t, give their feedback or data.
Making surveys more accessible
There are ways to make surveys more accessible, even though these solutions aren’t universally applicable.

An example of Easy Read guidance explaining a survey. In some cases, the survey itself can be adapted into Easy Read format.
Here are some suggestions made by organisations who have been working with learning disabled people:
- Easy Read formats: Using simple language and visuals can make surveys more accessible for people with cognitive disabilities. However, templates for this aren’t usually available on digital survey platforms.
- Printed options: Paper surveys can work better for some audiences, especially if digital surveys present barriers.
- Straightforward questions: Some organisations prefer multiple choice questions with simple options and yes/no/not sure responses rather than a numbered scale.
- Emojis and pictorial options: Adding emojis, visual cues, or images can make questions easier to interpret for those who struggle with text-heavy content.
These are good starting points, but the complexity of access needs means that they won’t work for everyone. That’s why alternative approaches are crucial – and we’ll explore some of these shortly.
Evaluating performances versus participatory programmes
It might be helpful to draw a distinction between two different situations for evaluation and data collection, each with their own nuances.
- Performances: Audiences are often larger, and engagement is typically shorter and more passive – audiences are there to watch somebody else perform, rather than to take an active part. This can make it difficult to gather feedback in a meaningful way, especially if surveys aren’t suitable or practical.
- Participatory programmes: These involve smaller groups and (in some cases) longer engagement, making it easier to build trust and gather more nuanced feedback. However, the interventions and impacts that we aim to achieve and measure are often complex, and the time required for in-depth research is substantial.
Quantitative versus qualitative data
When we talk about survey data, we’re usually talking about ‘quantitative’ data, which means standardised, statistical data about many people. Typically, this is the approach used to look at audience or participant demographics to measure inclusivity. But achieving a sufficiently representative and large enough sample size can be a challenge, especially when aiming to include a diverse range of disabled voices.
There is another kind of information, which we call ‘qualitative’ data. Qualitative data includes non-numerical data such as pieces of text, ideas, discussion and comments that are written, spoken or recorded in some other way. Researchers will often use qualitative data to unlock deeper insight, and to understand and evaluate things. Qualitative feedback is typically collected from a smaller number of people than quantitative survey data.

Corali dancer Bethan Kendrick reflects on previous performance work; photo by Jon C Archdeacon
Individual experiences
When working with learning disabled people, it’s important not to lose sight of how different every individual person’s lived experience can be, which is where qualitative approaches can be very effective. It’s vital that we can ensure individual voices are heard and understood. One person’s experience can highlight nuances that might be lost in a broader survey.
Engaging other key people such as family members, teachers, carers, or support workers can also provide additional perspectives and insights. For learning disabled participants, people who are close to them may be able to help add context and understanding to verbal and non-verbal responses and reactions, adding vital nuance and richness to responses that might otherwise be misinterpreted or lost. When those close to an audience member or participant feel confident that research is inclusive and accessible, they’re more likely to help explore feelings and impacts, and encourage participants to stay engaged with research and feedback.
Finding the right approach
There’s no universal solution to balancing different research requirements and the needs of your audience and participants. Instead, it’s about tailoring approaches and being honest about what is and isn’t possible. A smaller (but richer and more inclusive) sample might be more impactful than a larger, less inclusive and representative dataset, but it can be more difficult to find the capacity or time to explore deep qualitative feedback with respondents.
In the section below, we’ll explore a few ways to build in flexibility, incorporate a mixture of more engaging and creative evaluation methods, and involve the audience themselves in shaping how feedback is collected – leading to more meaningful insights.

Corali dancers performing in Super Hot Hot Dog; photo by Jon C Archdeacon
Short-term approaches for performance audiences
For large-scale performances with short engagement windows, capturing feedback can be challenging. Strategies beyond surveys might include:
- Observations: Rather than surveying audiences, you could observe responses to a performance such as laughter, smiling, dance/foot tapping to music. Where it’s not possible to capture feedback directly, organisations like Spare Tyre monitor and record this kind of quantitative data for non-verbal audiences.
- Vox Pops: Capturing immediate, in-the-moment responses is a great way to collect the most authentic feedback – it can be harder to recall performances after time has passed. One way to do this is via short video or audio recordings. Approaches like this could capture all sorts of different responses, not just verbal ones.
- Creative, interactive evaluation: Allowing audiences to engage with feedback through creative means, such as leaving stickers or comments on an interactive display or contributing to a group artwork.
Middle-ground approaches for ongoing participatory programmes
With repeated sessions, it’s possible to explore more in-depth, interactive approaches.
- Giving time and space: Whilst for some groups or events, responses are better captured “in the moment”, sometimes it can be helpful to create some distance between the session and the feedback. This can give participants a bit of time to process what they’ve experienced before reflecting on it, and you can provide verbal and visual reminders of activity to help support answers.
- Props: Using props which have been used in sessions, or printing out pictures from workshops, can help to stimulate reactions and interactions, making feedback exercises more engaging for learning disabled audiences.
- Using body and drama-based games: Techniques involving drama, movement and role-play to facilitate discussion and reflection. Allie Walton-Robson, Creative Director at Headway Arts, says: “In our evaluation process I often encourage physical movement as an equitable means of response. Symbolic gesture as an expression of ideas and emotions helps humans to express, focus and amplify a particular thought. It’s often very spontaneous and therefore very authentic. Our methodology is about being actively open and sensitive to wordless signals, consciously reading body language and openly inviting gestural response.
I’d encourage this form of alternative expression as it is often very authentic and instinctive natural evaluative response. People who won’t answer a direct question are often happy to express their response through physical symbolism, signing or gesture when asked to. For me it’s about taking the time to listen and create ways to understand each other and offer an appropriate means to contribute. Sometimes this can take some time and experimentation to discover together, sometimes it can be very spontaneous. It’s all valuable insight to inform our work.”
These approaches require more time and facilitation, but they can yield richer, more nuanced feedback.

Headway Arts – using movement and physical expression to capture feedback and responses from learning disabled participants.
An example of a more playful and creative approach to facilitating discussion and evaluation was conducted as part of the Transforming Leadership programme by Access All Areas. Developed by Lucy Burke and Lee Phillips, the tool explores the leadership styles of renowned figures such as Greta Thunberg, Nelson Mandela, Jenny Sealey, and Marcus Rashford. Users can explore these leaders’ approaches and their varied leadership skills and attributes, complete with definitions.
Designed as a conversation starter, this tool facilitates discussions on the diverse skills and attributes necessary for effective leadership, and highlights less obvious traits, such as quietness and deliberation. The success of the tool in engaging a range of participants with different skills and barriers shows the value of using engaging and relevant stimulus material, storytelling, and light role-play for participants to reflect on themselves.
In-depth evaluation for long-term engagement
For projects with long-term, ongoing engagement, evaluation can be woven into the programme itself.
- Postcards or diaries: Participants could be supported to share their thoughts by writing or drawing on postcards or contributing to a shared diary.
- Facilitator reflections: Facilitators can feed back on participants’ progress and responses, building up a picture over time. However, it’s important to involve multiple perspectives and try to structure evaluation to avoid bias – one person’s subjective feedback alone may not always show the full picture.
- Listening sheets: These allow researchers and practitioners to gather insight from several people who know the participant well, then use this collective insight to help interpret the participant’s expressions. This method was developed by Jo Grace and Sarah Bell, inspired by the work of Ben Simmons.
Consistency and planning are important to make sure feedback and reflections are captured regularly, and to reduce the risk of any blind spots in your data. For example, Corali Dance Company’s regular classes include informal participant feedback moments and facilitator observations each week, collected using a Google Form, and then a small number of deeper evaluation exercises across the year.
Jennifer Dyer from Corali says: “Key to the whole process is being clear on the impacts we want to have – like building confidence or leadership skills – so that any evaluation exercises speak to these. But we also need to be open to how our dancers might respond to the work and what they want to tell us – not all responses fit into boxes!” This kind of evaluation isn’t just about data; it’s about storytelling, relationship-building, and capturing the impact on a human level.

Corali engagement event at Tate Modern; photo by Jon C Archdeacon
In summary
There are no shortcuts when it comes to inclusive evaluation. Achieving representation means going beyond surface-level data collection and thinking deeply about who is – and isn’t – able to participate. It means balancing funder requirements with the realities of working with diverse audiences and being willing to adapt as you go.
- The most successful evaluations are those that respect individual needs, listen closely to the stories people want to tell, and embrace flexibility.
- It’s worth taking the time and effort to make your participants feel comfortable, supported and relaxed.
- Building trust and mutual understanding is important. Vulnerable people may be reluctant to share criticism or negative feedback – they may not have a great deal of control or agency over their lives. They might also worry that causing offence could jeopardise future opportunities to participate.
- It’s good to clearly signpost how you will be collecting feedback, especially at performances and events. As well as boosting visibility and response rates, this means audiences won’t be taken by surprise and can prepare.
Tailored, thoughtfully designed feedback methods can bridge the gap between organisations and audiences, ensuring that disabled voices can be part of the conversation.
If you’d like to hear more and have a conversation about how you can approach evaluation and feedback with your audience, please don’t hesitate to book free one-to-one support from the Digital Culture Network.
Grace, J. and Bell, S. (2022) Listening to ‘voiceless’ subjects: gathering feedback to a sensory story from participants with profound intellectual and multiple disabilities. Good Autism Practice, Vol. 23, No. 2, pp. 5–12.
Further support
The Digital Culture Network is here to support you and your organisation. Our Tech Champions can provide free one-to-one support to all creative and cultural organisations who are in receipt of, or eligible for, Arts Council England funding. If you need help or would like to chat with us about any of the advice we have covered above, please get in touch. Sign up for our newsletter below and follow us on LinkedIn and X (Twitter) @ace_dcn for the latest updates.