Using digital storytelling to include youth voices in evaluation (case study 2 of 3)

In this case study we talk about how we used digital storytelling in a longitudinal evaluation of a school-based orchestral music program.

This is the second of three case studies I am posting this week. Click here to see an overview of the three case studies.

Key points

  • When combined with a range of other data collection, digital storytelling can add a layer of depth and meaning to evaluation results.

  • Digital storytelling is a good way to connect funders and other evaluation users with the lived experience of program participants.

  • There are resource constraints to implementing digital storytelling. Working around these constraints requires a flexible and open-minded approach.

  • If you want to combine digital storytelling with an evaluation, it is not enough to simply put the digital stories into an evaluation report. The stories should go through a process of meaning-making and synthesis in which program participants analyse them for themes, and this analysis should be reported alongside the digital stories. Most Significant Change [link] can be used to do this.

Background 

The program being evaluated was a three-year music program at a primary school with a very high proportion of students from disadvantaged communities.

The overall purpose of the program was to contribute to increased wellbeing for students living in disadvantaged communities. The idea was that students who participate in group-based music programs experience a range of beneficial outcomes, and that students living in disadvantaged communities are less likely to have access to such programs.

We came into the evaluation at the very end of the program’s life. A range of evaluation activities had already been undertaken. We had a very specific role: we were to synthesise the data that had been collected and prepare a final report to the funder. We also did some final qualitative inquiry with teachers and students to answer questions the current dataset had not addressed.

Evaluation activities included extensive interviews with current and former students of the program, interviews with teachers, teacher journals, and teacher surveys. There had been issues with the data collection: it relied exclusively on qualitative data and did not attempt to triangulate the themes arising in that data with other forms of evidence, such as school grades or psychometric testing. However, the large amount of qualitative data pointed to the program being considered very successful by the music teachers, school staff and students.

Why digital storytelling?

We chose to do a digital storytelling project with students at the school for two reasons:

  1. We wanted to ensure the voices of students were prominently featured in the final evaluation report. We felt that actually watching and listening to students talk about their experience brought an element of meaning that was not present in the traditional evaluation reporting format.
  2. We wanted to use the stories as a way to communicate the evaluation findings to the funders. The stories were seen as a way to make the evaluation report ‘come alive’ and memorable for the funder.

The cost of running a digital storytelling workshop for an evaluation is normally considered quite prohibitive. Digital storytelling also takes a significant amount of time to plan, which can be a major barrier when working with schools that are time and resource poor.

Luckily, we had a volunteer. Zoe Dawkins from Storyscape is a digital storytelling expert who was willing to provide her services free of charge. We were also lucky to have program staff and teachers who were flexible and willing to let the students spend an afternoon working on the digital stories. 

Planning

So, we had our date, we had our digital storytelling expert and we had our students. Now, all we needed was a plan.

In typical school fashion, we were given one week’s notice of a suitable date and so only had a short amount of time to plan the workshop. Another issue was that the school, although extremely generous with their time, could only provide one afternoon for the students to prepare the stories rather than one or two days.

We knew that running a workshop for students out of school hours was unlikely to work: a lot of the students lived in newly arrived communities, and several had family duties such as caring for siblings. Realising it was our only chance to create the digital stories, we agreed to the single afternoon session.

Our next step was to organise a facilitation plan, get staff to volunteer their time, get students to ask their parents to sign consent forms, and organise equipment.

We knew that not all of the 30 students would feel comfortable working on a video story, so we designed three ‘streams’ for the afternoon:

  • The first stream involved making a digital story where students talked about their experiences of the program.
  • The second stream involved making a piece of music and filming a video clip for it.
  • The third stream did not involve video. Instead, students created storyboards that described their experience of the program.

The streams were designed so that students could choose the stream they felt most comfortable being part of.

Running three streams with 30 students in three hours meant we needed a lot of staff. I put out a call to the staff members delivering the program and was very lucky to find four people willing to volunteer their time. Together with Zoe and me, this meant we had two people working on each stream.

One of our biggest concerns was that the students would not bring back their consent forms. The day before the workshop, one of the program staff came to my desk, beaming, and put down 30 signed consent forms. Apparently getting that many consent forms signed on time was unprecedented! 

On the day

We asked the students to choose the stream they wanted to join. We were lucky that the students divided themselves evenly across the streams, with some opting out of the filming and making storyboards instead.

The next three hours were controlled chaos. In this time:

  • Stream one participants wrote a script, developed a storyboard for the script with framing for the camera shots, cast the actors, and directed and filmed the videos.
  • Stream two participants wrote a storyline, storyboarded it, set up the camera shots, and played and filmed a piece of music.
  • Stream three participants worked in pairs to complete their storyboards.

At the end of the day, we couldn’t believe we had pulled it off!

The two videos complemented each other nicely – the first one capturing the social and emotional impact on the students, and the second video showing the music journey.

You can watch the videos by clicking on the links below.

Stream one:

Kevin and Linda become friends

http://vimeo.com/channels/storyscape/86483680

Stream two:

Learning to play the Sacred Heart Stomp

http://vimeo.com/channels/storyscape/86481622

Using digital stories in the final evaluation report

We didn’t think it was enough to include the digital stories in the report with no reference to the evaluation findings. We wanted to make sure that the stories’ messages were integrated into the evaluation report and its findings. This validated the existing evaluation data and showed where the students’ and evaluators’ perspectives aligned and where they differed.

We included a description of the digital stories in the outcomes section of the evaluation report. To provide an additional layer of rigour, we analysed the videos for key themes. We also wanted to ‘validate’ the digital stories by seeing what the teachers thought about them, and whether their experience of the program aligned with the students’ experience.

Drawing on the Most Significant Change technique, we asked the teachers to detail the most significant changes that they observed in the stories. The teachers agreed that the stories matched their own experience regarding the benefits of the program for students. This interpretation from teachers added a layer of depth and meaning to the stories and the existing evaluation data.