Understanding what went well – and what didn’t – with your project is key to both organisational learning and developing your work with communities. But how do you get a true picture of how things went for the people involved? Guildhall School of Music & Drama Research Fellow Maia Mackney and Barbican Planning and Evaluation Officer Valentina Orru share some exciting new ideas for evaluation.

Co-created evaluation is the process of exploring the what, who, where, why and how of a project in a way that involves participants and stakeholders throughout.

Why co-create evaluation?

Evaluation can mean different things to different people. It can tell complex stories about the nuances of artistic and personal experiences, as well as show stakeholders and funders the outcomes of a project. In addition to celebrating and capturing what a project did well, it provides a space for learning and reflection. Evaluation can support a culture of transparency for governance models and hold organisations and funders accountable.

In this chapter, we’ll be looking at questions such as: How can the values and processes of work created in partnership become integral to, but also be questioned by, your evaluation? What does a meaningful evaluation of co-creation look like? How can project statistics co-exist with subjective, jointly created research? How can methods tell more complex narratives of change? How can we talk about ‘impact’ in a broader and more radical way – and in language we all understand?

‘Co-creation is constantly having to justify its value, it seems to me – why might that be? Holding space for processes of co-creation can be hugely anxiety-making for those in control of the money and the power, because they are not in control of outcomes. Instead, they are invited to enter into new sets of relationships with stakeholders. This in itself has value.’

– Chris Rolls, Head of Training & Development at 64 Million Artists

Everyday creativity and the process of engaging people in helping to decide what counts as culture, who makes it, and where it is made, are fundamental to today’s co-creation practices in the cultural sector. Funding criteria and cultural policy have shifted to be increasingly focused on the creation of work ‘by’ and ‘with’ communities, rather than ‘for’ them. While the numbers-based evidence required by funders and sector leaders can support the reporting requirements around projects, co-creation – which challenges traditional conceptions of what constitutes an ‘output’ – requires a more nuanced approach. Arts organisations are being asked to prove a social return on investment at the same time as being encouraged to hand power back to the people by putting their experiences at the centre of the process. Sometimes the nuance surrounding these practices gets lost in evaluation, and this is what we hope to challenge in this chapter.

A growing body of research is showing that by planning evaluation methods that place cultural democracy at their heart, the people doing the evaluation have the potential to build a richer picture of the leaps in knowledge that come from artistic work. Indeed, evaluation can hold space for the process of co-creation and collaboration, where outputs are less defined, more relational and social.

Evaluation documentation and tools don’t need to be inaccessible: they can be led by co-creation at every stage. Everyone involved, including community groups and participants, can be invited to share their understanding of the context and project outcomes.

We’ve interviewed four researchers in order to delve into this topic more deeply: Dr Sophie Hope and Henry Mulhall from Birkbeck University; Sarah Cassidy, Head of Inclusion and Learning from The Old Fire Station, Oxford; and Chris Rolls, Head of Training and Development from 64 Million Artists. By looking at case studies of current co-created evaluation methods, we’d like to offer you some things to think about and some practical resources so you can create your own evaluation approach for co-created projects and practices.

In practice: BE PART – a critical, co-created approach to evaluation

Cards on the Table, Blind Dates and Fieldnotes Diaries 

BE PART is a four-year Creative Europe-funded project developed by a network of ten partners across Europe and Tunisia, who all share an interest in co-creation practices. The evaluation of BE PART was influenced by Social Art Mapping, a visual approach to representing and understanding relations, structures and connections within an organisation. Led by Dr Sophie Hope and Henry Mulhall, the evaluation reviewed the aims and structure of the project as well as the needs of the people involved.

Here’s how it works: while each organisation runs fieldwork project activities independently, delegation is crucial. Partners meet annually for ‘assemblies’ as well as more regular meetings. Some partners have additional responsibilities, such as being in charge of the budget or evaluation. Personal relationships, experiences, local and international frameworks, and organisational structures are at the centre of Sophie and Henry’s new evaluation methods. Interestingly, each partner responds differently, according to their individual set-ups, budgets, and contexts. 

‘Our methods are around how we zoom in on the nitty-gritty of personal relationships that evolve and develop. How do we understand the relationship between people and organisational structure, and the wider European international policy framework that sits within?’ 

– Dr Sophie Hope and Henry Mulhall 

The evaluation methods and tools used in BE PART include:

  • Cards on the Table

  • Blind Dates 

  • Fieldnotes Diaries

Try them out in your projects/organisations by following the activities below.

The politics of participation 

These methods were developed as part of Sophie and Henry’s ongoing research. Sophie, a self-confessed ‘frustrated evaluator’, is interested in developing alternative, critical approaches to evaluation because she feels evaluation can become a back-patting marketing exercise which frequently leaves no room to discuss uncertainties, failures or issues with funding agendas. With a background in socially-engaged art curation, Sophie has spent the past twenty years experimenting with critical collective approaches that reflect on and question participation and the dynamics behind it. Through these methods, the people involved become the evaluators of the project and have ownership of the process. Indeed, the ethos of co-creation is process over outputs.

Who gets to tell their stories, and on what terms? The politics of who speaks and who listens is key in co-created evaluation, as is the anonymity of the people involved. In order to really understand how effective a project was, evaluation needs to be a safe space where people feel cared for; where they can talk freely about what went wrong, their emotions and personal experiences. As an example of this practice of care, BE PART partners were sent care packages, including pencils, notebooks, stickers, the Cards on the Table game, leaflets, and Blind Date treats.

‘How can co-created methods of evaluation go some way to challenge the “story theft” involved in colonial attitudes to research? Why would anyone want to talk with us in the first place? What are they getting out of it?’

– Dr Sophie Hope and Henry Mulhall 

A mixed-method model

While the qualitative evaluation, whereby participants are asked to describe their experiences on the project, is led by Sophie and Henry, the BE PART partners are responsible for their own monitoring and statistical reporting on the project as a funding requirement. BE PART is an example of where statistics and stories can support each other, where quantitative and qualitative methods can co-exist to reveal the complexity of a project in a way which is not limited to a ‘number crunching exercise’. Sophie described how relational and story-based qualitative and creative methods are more time-consuming and expensive, something she felt needed to be accounted for by funders and in project budgets. 

‘How do we keep the messiness of processes and don’t flatten out the experiences, complex and conflictual experiences that people are having in processes?’ 

– Dr Sophie Hope and Henry Mulhall 

At the same time, it’s an important part of the evaluator/researcher’s role to present data in ways that enable everyone involved to express their level of participation and what they’ve learned. Interesting examples include:

Diagrams: visual representations of the relationships between people and what those relationships might mean. 

Illustrations: these can make research readable and accessible for people.

Fictional stories: fictionalised versions of research/evaluation findings that encourage self-reflection.

Activity – Cards on the Table

Cards on the Table is a ‘reflection in action’ tool for practitioners and organisations which can be applied at any stage of your projects. Using quote cards as stimuli, card players are invited to think and talk about a project beyond the usual reporting parameters. The game supports a culture of constructive honesty, listening and transparency among the co-workers and participants involved. Are you ready to play?

Game rules 

  • Three to eight players recommended. 

  • Timing: multiply the number of players by five minutes, then add ten minutes for dealing cards and concluding (see the worked example below).

  • Choose a dealer from the group – their role will be to facilitate the game and keep time. 

  • The player to the right of the dealer starts by choosing a card from their hand, placing it face up on the table. 

  • Then the dealer draws a Keyword and a Theme card. 

  • The player has two minutes to talk uninterrupted in response to the cards on the table. 

  • The group has three minutes to ask questions, agree, disagree or agree to disagree.
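
As a rough worked example of the timing rule: a game with six players would need around 6 × 5 + 10 = 40 minutes in total.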

For the full game rules, visit the Cards on the Table website (cardsonthetable.org).

Costs and practicalities 

  • Cost of the pack: up to £50 or borrow a pack from a ‘pack holder’ organisation. 

  • Players’ fee to attend a one- to two-hour game: according to available budget.

  • Evaluator/member of staff’s time and fee to attend the game session, listen, analyse and report: to be agreed. 

  • Make sure participants are comfortable with being recorded and with the room where the game takes place.

  • Check out the pack holders’ network at cardsonthetable.org: you can borrow the game from a pack holder near you or decide to become a pack holder yourself.

Activity – Blind Dates

Have you ever thought of a blind date as an evaluation method for your projects? 

Start by nominating people from different parts of your team, organisation, project or partner organisations, who agree to go on a ‘blind date’ with each other online or in person. Then think about a list of questions for the date. Add some extra layers by mixing people from different roles in the project who don’t know each other’s job titles or even names.

Blind Dates Questions 

  • Set your overarching research question – this should be the same question leading the whole evaluation/research process. 

  • Develop a ‘Menu of Questions’ in relation to your project/organisational objectives to lead the conversation. For example, these could be about governance structure. Participants don’t have to stick to the questions, but can use them as a guide.

  • Let the conversation flow naturally and allow themes to emerge; the evaluator is not present during the date.

Costs/Practicalities 

  • Participants’ fee for one hour: according to budget. 

  • Evaluator/staff member’s time and fee to listen to the recordings, analyse and report: to be agreed.

  • Make sure participants are comfortable with being recorded and with the space where the date takes place.

  • This method is suitable for involving participants from different geographical locations.

Activity – Fieldnotes Diaries

Anyone in your organisation or project can be a field diarist and carry out research as an ethnographer on the ground. Diarists are invited to write a diary anonymously about their participation in a project, acting as ‘participant-observers’ of the behind-the-scenes process, attending workshops or meetings for example. The notes can be made in many ways and formats, such as photographs, recordings, emotional responses, writing, descriptions, sketches and diagrams, and in any language.

Costs/Practicalities 

  • Diarists’ fee: a minimum of £1,000 for between four and ten entries (over two years), depending on rates of pay and duration, to be agreed with the Diarist. 

  • Evaluator/staff member’s fee to coordinate, facilitate the process and analyse the diaries: to be agreed. 

  • Diarists should be contracted for their work. 

  • Think about a specific brief, entry points to the process (with access to behind the scenes) and a specific ‘training’ programme for the diarists.

  • Make sure to check in with the Diarists regularly and facilitate opportunities for them to meet and share their experiences. 

  • If participants’ first language is not English, make sure to account for translation costs.

In practice: Old Fire Station, Oxford – the Storytelling Methodology

The Old Fire Station, Oxford, is an arts centre showcasing the best of local, national and international performance work and artwork. Sharing its building with the homelessness charity Crisis has informed its approach to working with communities, with a focus on creating work made with others. In order to tell stakeholders such as funders about the impact of its work, The Old Fire Station historically relied on surveys and in-depth interviews. They expressed frustration with this approach to their external evaluators, Liz Firth and Anne Pirie, who in turn offered an alternative approach based on storytelling and the Most Significant Change methodology. The Most Significant Change approach involves collecting personal stories of change experienced by participants during a project and analysing these stories to explore and draw out common and significant themes. This process prompts discussion around why these changes may have taken place for participants during the project.

‘We found [our previous approach to evaluating the impact of our work] deeply unsatisfactory, we didn’t always believe the numbers we were getting back and the process of asking people to fill in a form we felt undermined our relationship with them and the project itself. Liz and Anne offered an alternative approach, based on storytelling and the Most Significant Change methodology.’

– Old Fire Station, Oxford, webinar 

The Most Significant Change methodology adopted by the Old Fire Station has six key phases:

  • Phase 1: Recruit and train story collectors.

  • Phase 2: Identify and brief storytellers.

  • Phase 3: Storytellers and story collectors have a conversation.

  • Phase 4: The conversation is recorded, transcribed and edited.

  • Phase 5: A story discussion session is held.

  • Phase 6: Stories and learning are shared.

The two most important phases, Story Collection and Story Discussion, contain an element of co-creation, and former participants are often hired as Story Collectors. Participants are invited back, alongside project managers and partners, to the Story Discussion Sessions, where anonymised stories are shared and explored to reveal common learning and themes. Collectors and stakeholders come together at the Discussion Day to ask, ‘What did these stories tell us about change, and how did that change happen?’

‘We stumbled across a way of learning and understanding that is also enjoyable and creative which has now become a crucial part of what we do as an organisation.’ 

– Old Fire Station, Oxford, webinar

The Storytelling Methodology offered Old Fire Station, Oxford, an opportunity to:

  • Gain a better understanding of the very personalised, diverse outcomes that participants experience.

  • Enable people involved in the project in different ways to have a say in defining what impacts they felt.

  • Engage more people in understanding and learning from impact.

  • Take a more holistic approach to understanding impact and how it is achieved.

  • Make the evaluation an enjoyable, creative, and meaningful experience.

  • Use the stories and learning in other ways, such as demonstrating impact to funders, writing proposals, communications work and creative projects.

‘Participants often tell us that one of the most enjoyable moments of engaging with Old Fire Station, Oxford, was telling their stories after the project finished.’

– Sarah Cassidy 

Whether through formal Arts-Based Research (ABR) methods of exploring impact (drawing tasks, poetry writing, video blogs, free-flow diary writing) or through Most Significant Change stories documented by evaluators, more creative additions to current evaluation methods are increasingly being used to challenge assumptions about what constitutes ‘evidence’ of impact in the cultural sector. Alongside developments in qualitative evaluation (such as ethnographic research or critical social psychology), creative methods and Most Significant Change case studies have grown in popularity across a variety of disciplines, including health, psychology and education as well as the field of socially-engaged arts, for their ability to effectively frame the complexity of people’s real-life experiences.

Activity: The Storytelling Methodology

Why not try using this more creative approach with your co-workers or on a project through this story collecting exercise? Familiarise yourself with the guidance for Story Collectors. Have a go at collecting a story in pairs, reflecting on your experience at the organisation/group you’re involved with, and what it’s meant to you. Swap over and then reflect on how it went. What worked well? What would you do differently next time?

Story collecting questions 

  • What was your involvement in the project? 

  • What changed for you as a result of participating? 

  • How did that change happen? 

  • Why was that change important to you personally?

Story collecting principles/guidance 

  • Take time to build a connection and get comfortable. 

  • Remember it’s a conversation, not an interview. 

  • Be genuinely interested and actively listen. 

  • Focus on what’s changed for them personally. 

  • Get into the details, ask questions and be curious. 

  • Don’t assume or interpret. 

  • Go at their pace. 

  • Give the conversation sufficient time. 

  • Be OK with silence. 

  • Don’t worry if it doesn’t sound like a story. 

The Storytelling Methodology: key questions answered

What project would this methodology work particularly well for?

Old Fire Station, Oxford, said this approach works particularly well on longer, more in-depth projects but might be less effective for one-off workshops or shorter projects. 

What’s the cost of including this methodology in my evaluation approach? 

The Old Fire Station budgets £300 per story, which includes payment for the Story Collector and external transcription and story editing. The storyteller is typically unpaid; however, Old Fire Station, Oxford, has worked with some partners who have offered payment to participants who shared their stories.
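
As a rough guide at that rate, an evaluation collecting ten stories would budget around 10 × £300 = £3,000 for collection, transcription and editing, before any other costs such as the discussion session.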

Are there any other costs involved?

The ‘discussion session’, when people get together to talk about the stories and their implications for practice, might incur the cost of a facilitator, plus travel expenses and catering for participants if the session is run face-to-face. Typically, partners and community groups attend the discussion session as part of their commitment to the project, so there are no additional costs for paying attendees. However, if freelance practitioners or participants attend, there may be a cost to pay for their time at the session.

‘One of the biggest considerations around storytelling is capacity: briefing and supporting tellers/collectors, inviting partners to the discussion session (and writing the report/editing stories) above all else requires time.’

– Sarah Cassidy 

Do I use this method in isolation or as part of a mixed-method model? 

The Old Fire Station, Oxford, often uses the Storytelling Methodology to bring to life experiences that are less vivid in the other data collected as part of an evaluation. They don’t suggest this route can always be used to the exclusion of other data, such as tracking data or quantitative surveys. However, they suggest it offers a complementary approach able to explore the nuances and complexities of participants’ experiences on a project.

What sample size does this approach require to maintain rigour and depth of analysis?

Depending on the size of the project and depth of perspective desired, Old Fire Station, Oxford, recommend a minimum of five to fifteen stories that reflect a breadth of perspectives. They once collected 32 stories on a project, but found this sample presented challenges in terms of analysis and increased the cost of the evaluation. 

How important is anonymity, and how do you mitigate the ethical implications around ‘story appropriation’?

Anonymity is an important part of the Storytelling approach. Knowing they aren’t identifiable from their stories can help people speak more openly about their experiences. The editing process is key to ensuring stories remain as anonymous as possible. Sarah Cassidy at Arts at the Old Fire Station, Oxford, stressed that they explain to storytellers prior to participation that anonymity can’t always be guaranteed when stories are familiar to their peers, colleagues or friends, but that stories are written in a way that ensures they are not publicly identifiable. Storytellers always have the final say about whether their story is included for analysis, always have final editing rights, and always sign off their story before it is read by anyone other than the writer.

New visions for the Theory of Change: Provocations

One of the strongest themes that arose in conversation with Chris Rolls from 64 Million Artists was the need to re-envision the tools used to support project planning and evaluation. In particular, Chris commented on the process of designing a project’s Theory of Change for co-created projects. A Theory of Change, sometimes also called a Logic Model, is a document testing assumptions about how a project is supposed to work and the process needed to achieve the desired results. Simply put, a Theory of Change explores the theory assumed to sit behind the change. Chris challenged the linearity of most theories of change (if X… then Y) and said these sorts of tools often capture neither the context behind a programme nor what brings about the change. This missing element, often called the ‘Mechanism of Change’, is a vital but frequently overlooked part of the Theory of Change. Here are some provocations which arose following our conversations with the four interviewees:

What if the Theory of Change was not a linear sequence of causes and effects but a ‘messier’ cycle where project partners can engage at different phases, and where co-creation is embedded at every stage?

And therefore could the social value be assessed throughout the project rather than only at the end?

What if there was social impact in the process of finding resources for the project as well as embedded in its outcomes? 

What if we built capacity among our community partners by helping them to access resources? 

Who holds the processes? How do you open up those processes to make them democratic at every level?

How do you contract with your community members?

Is the decision making ethical and transparent at every stage?