At the beginning of each school year, many schools and districts assemble all staff at one location to experience a shared school opening event. In Barrington 220, a PK–12+ school district northwest of Chicago serving over 8,000 students, our back-to-school kickoff has also included student performances such as our high school drumline, choral groups, small instrumental ensembles, and a short performance from our summer musical cast as a finale to our opening program. Our staff members enjoy the performances, and more importantly, the tradition brings students into our annual Opening Day event. In 2024 we took our Opening Day performance tradition to a new level by creating and producing an original mini-musical with students using generative Artificial Intelligence (AI) tools as part of our creative process.
In addition to a rigorous academic program, Barrington 220 is proud to be known for innovation, a strong Fine Arts program, and excellence in other areas. Among several ideas brainstormed in December 2023, the idea of using AI tools to create a customized Opening Day student performance was on our list. While past performances had been well received, the songs we selected had not always matched our Opening Day themes.
Also at this time, OpenAI had recently updated its tools, Google had just released Gemini, and many other AI products were making their debuts. Further, Artificial Intelligence was topping EdTech headlines, with coverage teetering between the "innovative uses" and the "potential negative effects" of AI in school.
One of Barrington 220’s strategies for learning to use AI is to use ChatGPT, Gemini, or other tools in an area where you are already knowledgeable. As an Assistant Superintendent of Technology and Innovation who is a former music teacher and band director, I had used ChatGPT to help me write and record a song. ChatGPT helped me generate lyrics, chord progressions, suggested instrumentation, and guidelines for musical style, while GarageBand’s built-in AI tools contributed to the backing tracks. In another example, our high school theater director had facilitated a program where a student group used ChatGPT to create and perform a one-act play. Outside the arts, several staff members had also used AI tools for everyday tasks such as writing letters of recommendation, drafting emails, and suggesting improvements to day-to-day communications.
With our previous AI experiences in mind, we continued to collaborate with our Director of Fine Arts to further shape our Opening Day student performance. Our idea soon evolved into a student project facilitated by staff to create a short musical using generative AI tools. We knew at this point that we wanted to try several AI tools, but also recognized that AI might not work for some aspects of this project. We also fully acknowledged that this idea was, perhaps, a bit crazy—but innovative. Ultimately, we felt that this kind of experience could potentially teach us about AI’s role in the creative process. We forged ahead.
Our next step was to ask our district Fine Arts Department whether any teachers might be interested in helping with this kind of student-driven creative project involving AI.
The Assistant Superintendent of Technology and Innovation and District Director of Fine, Visual, and Performing Arts introduced this idea at a Fine Arts department meeting in February 2024. Our “top secret” project idea description was intentionally vague, but we told our teachers that the project would involve Arts, Technology, and Artificial Intelligence. We used a Google Form to poll the Fine Arts department members to find who might be interested in participating in this type of project with students. Many creative teachers were interested, with responses ranging from “I would like to learn more” and “I’m very intrigued!” to “Sounds like a fun idea! I’d love to help in any way,” and “I’m all in!”
Among our interested respondents, we identified a core group of staff facilitators and met with them in March 2024. We scheduled our first student work session for the beginning of May, and we asked key high school staff to help us identify students who might be interested in working on the writing of an original musical. We sought students who had some experience as writers, actors, dancers, and/or musicians.
At our first student workshop, approximately 30 students attended. We introduced students to the idea that we would “create and perform an original mini-musical in collaboration with various AI tools, and introduce and/or highlight one or more of the 2024–25 Framework 220 strategic planning focus areas.”
Our strategic planning focus areas included:
Learner Agency & Pathways
Global Awareness
Grade-Level & Program Transitions
Social Media Awareness
We also provided a framework for this project that included the following criteria:
Use a variety of AI tools
10–12 minutes in duration
Involve participants at a variety of grade levels (live and/or recorded)
Include approximately 3 original songs
Highlight one or more Framework 220 (strategic plan) focus areas
Accompany songs with live or AI-performed music
Create visuals using AI-enabled tools
Incorporate the theme “Barrington 220 is The Place To B”
The initial reaction of our student collaborators was mixed. One student captured the feelings of several when he said, “it made me feel a little bit weird. I’ve tried my best to avoid using artificial intelligence in really any capacity for anything. I kind of disagree with its use in music.” Another student later remembered, “the first day I was kind of hesitant because…this is a very weird concept.”
Our high school theater director guided students in several activities throughout our day-long workshop. The group began with a brainstorming session regarding possible plots and characters. As a storyline emerged, ChatGPT was used to create a few different script options. After further refinements and character discussions, ChatGPT was used to continue to develop the dialogue. The storyline we co-created with AI followed the same three characters through their time in elementary, middle, and high school. Along the way, the characters developed through dialogue and music.
During the workshop, many of the ideas that would lead to our working script and lyrics were developed. For example, in one activity, groups focused on the possible archetypes of the three main characters and discussed how they might change over a period of years from elementary through high school. In another activity, groups used generative AI to write and refine a first draft of dialogue and storyline. Each group then reported their progress to the larger group. Eventually, the full group reached a consensus to divide the mini-musical into three scenes, each representing elementary, middle, and high school. Groups then used AI tools to generate lyrics for a song for each of the scenes to represent that point in the story.
Throughout the day, the student collaborators became more and more comfortable with the use of AI as a creative partner for story development and drafting the script. One student reported, “It’s definitely been different from anything else I’ve done before because everybody’s a little unsure about how to do it because there are no instructions to go with it.” Another student realized that AI helped to provide an entry point for more students to be a part of the project: “[AI] was a really helpful tool to make an easier process that anyone can work on—despite your experience.” Another student reflected, “I think we’re doing a good job so far of incorporating ChatGPT, but not letting it take over.”
During the first workshop day, AI helped the group successfully create a story with characters, dialogue, and song lyrics, but the student co-authors noted several problems. Many acknowledged that the musical was too long. Others felt that the dialogue generated by AI didn’t sound real—and especially didn’t match the way students at different grade levels spoke. The entire mini-musical needed to be refined. Further, the songwriting had not yet progressed beyond the initial draft of song lyrics, and choreography could not begin until the songs were finalized. A second workshop session was required.
Two weeks later, a second workshop was held, this time with a subset of students from the first session. The goal of this second half-day session was to further refine the storyline, dialogue, and lyrics. The session began with a table read where dialogue refinements were made. Students noted that the AI-generated dialogue and lyrics were too wordy and, in general, didn’t sound like what students at the specified grade levels would say. Drawing both from personal experiences and observations of younger siblings and acquaintances, students re-wrote dialogue and made many cuts to reduce the time. One student author noted: “we had to change some of the language… The original script didn’t seem fitting to [student] voices because it was written by a robot and not a human.” By the end of the session, another student author reported, “I think we did a good job of working out the areas that maybe weren’t so good and making it sound more human.”
Among those present during the script writing sessions were student songwriters who were also members of a student-led Songwriting Club that had been organized long before this project began. Not only were these student songwriters interested, they were committed to the art of songwriting and had previously collaborated on other projects. The student songwriters made plans to hold dedicated songwriting sessions a few weeks later.
Our small student songwriting group worked for four days at the beginning of the summer along with a teacher facilitator who is also a performer and songwriter. At the beginning of the songwriting process, the student composers sketched the three songs by identifying structures, shaping chord progressions, and refining lyrics. Although the idea of using AI tools for the songwriting process had been discussed, the group preferred to work traditionally on the composition aspects of songwriting. The AI-generated lyrics were continuously revised by the students throughout the sessions. One of the songwriters reported, “we changed up some of the lyrics to sound more human and show more emotion.”
Although the songwriting group did not use AI to write the music, the original lyrics had been mostly created by AI. A songwriter noted: “I think having the lyrics set before ChatGPT gave us a basic idea or concept which helped guide our work instead of having to come up with all the lyrics from scratch…having [a story] that already exists was helpful so we could use that and build on it.”
Another songwriter described the process of writing the music traditionally: “I would describe it as collaborative. A lot of what we’ve done is just playing simple things and then just building on it as a group and creating a great idea together.” As the student songwriters finalized the compositions, they rehearsed each song as a group and began to specify instrumentation (the instruments used to play the different parts).
Watching the songwriters work was fascinating. Each day they brought more instruments, amps, and technology to move the songwriting process forward. By the last day of the sessions, the room was filled with several guitars, keyboard instruments, percussion, and multiple iPhones, iPads, and computers. One of the songwriters assumed the responsibility to write the orchestrations so each instrumentalist would have their own sheet music to play for the recording session the following week.
Since this project included summer rehearsals and a performance just before the beginning of the school year, we made the decision to pre-record the accompaniment soundtracks for the songs. To make it clear that our own students were playing the instruments, we made professional audio and video recordings of our student accompaniment performances. The videos would be played during the performance on the large screens that were available at our venue to allow our student instrumentalists to “virtually” be a part of the performance onscreen, even though not physically present. Further, these recordings allowed for consistent rehearsals throughout the summer and fewer logistical issues.
Student composers contacted student instrumentalists to join the group for a rehearsal and recording day. This step in the process truly underscored the level of community that is shared among our group of Barrington 220 student musicians. The students clearly knew each other well, and through a flurry of texts and calls, the available players were contacted, booked, and confirmed for the scheduled days. Knowing the players who would be performing also helped the student orchestrator write music parts at levels of difficulty that would allow the musicians to learn their parts quickly and perform them well.
The following week, our student performers met for a day-long rehearsal in the band room of one of our middle schools to learn how to play their parts. Vocalists also joined us for the rehearsal and recording process to make “scratch recordings” so cast members of the musical could use them to more easily learn the songs. The day after the rehearsal, our professional audio/video recording engineer set up our room to capture our recordings. In all, we recorded three “takes” of each piece of music. Each of the three songs was recorded with and without vocals. Two of the songs were also performed in a revised instrumental form to be used for scene change music, and the final song was adapted into the curtain call music for the bows at the end.
With the instrumental tracks recorded, the musical’s director cast the eleven characters—an elementary school-age, middle school-age, and high school-age actor for each of the three main characters (nine actors), plus two teacher roles. The remaining 20 performers were part of the ensemble. The cast had just six rehearsals during the summer to learn the songs, choreography, and blocking (the movements and positions of the actors on a stage). One of the student composers assisted with teaching the songs to the actors.
Three members of our student dance company created and taught all of the show’s choreography to the cast members. Similar to the songwriting process, the choreography was approached traditionally and without the aid of AI. Our student choreographers worked with the actors who had various levels of dance experience to create the dances and movements for each of the three songs. One of our younger actors noted, “I think it’s amazing that three students from Barrington High School are completely choreographing it, and I think they’re doing a great job!”
Even while the cast was learning the script and songs, students and the director continued to make revisions to dialogue, lyrics, and songs. More than once, the student actors reminded each other that the words were originally written by AI, and no one seemed to have issues making changes on the fly through consensus. One of the lead actors noted that the director “has given us a lot of freedom with the singing so we can add in harmonies when we want to and change a note to make it fit better.” Due to the frequent changes, the script and lyrics existed only as a shared Google Doc to allow all actors to have the most updated version of the script and lyrics at all times. Another actor noted, “it’s kind of like building a plane while flying.”
Since our performance venue has excellent live audio systems and two very large video screens that flank both sides of the stage, we optimized our production for this environment. Rather than using physical backdrops on stage, we used AI-enhanced digital images to create the three settings on the screens—an elementary school playground, a middle school classroom, and a high school cafeteria. Also, each time social media posts were mentioned in the script, the screens showed the audience the post being discussed.
During the songs, the large screens were used to display videos of the student instrumentalists who had been recorded earlier in the summer and provided the audio accompaniment for the show. The entire production was run from a single MacBook running Apple’s Keynote presentation software that allowed the audio, video, and image cues to be easily triggered. In another effort to simplify the production, the only physical set pieces used were three 20-inch wooden cubes on the stage.
On the day of the performance, our audience included approximately 1,000 staff members from across our 14 district buildings. Each attendee was presented with a mini-program designed in the style of a Playbill that listed all of the cast and creatives who were a part of this production, including a credit for ChatGPT. A student introduced the mini-musical by briefly explaining our creative process, while also relating the importance of the sense of community that was required to make the production possible. Our student introduction included:
“Community is found everywhere in Barrington 220 whether it be through a club, a sport, or through some awesome teacher who creates a sense of belonging. This is a show about how Barrington is a community—and community is built around projects like this. Students were able to express their creativity, collaborate with others, and show their extraordinary talents like writing music, creating choreography and teaching it to others, and performing.”
After the introduction, the all-student cast performed the world premiere of our work, #ThePlaceToB - a mini-musical, to much acclaim.
The audience reactions were unforgettable. Staff members were deeply moved by the musical on several levels: the show’s inclusive message, the phenomenal performances by our students, the imaginative uses of AI, and the creativity behind the collaborative process, to name a few.
The production allowed everyone involved to get a better sense of how Artificial Intelligence can be used as a part of—but not in place of—the creative process. As one student collaborator remarked, “I think a lot of people are worried that AI has too much power, but I think [this project] proves that at the end of the day, we have the power, and we can always change what AI comes up with.”
When AI delivered undesirable results, we either changed the output or rejected the use of AI completely. We found that AI functioned well to help us brainstorm story elements, draft a script and lyrics, and create and refine images. However, it was the extensive revisions by our student writers that made the script believable. In the cases of our songwriters and choreographers, only traditional methods were used. This project truly allowed us to use Artificial Intelligence as a partner in our creative process, but the AI technology never took center stage.
You can watch #ThePlaceToB - a mini-musical, the documentary The Making of #ThePlaceToB - a mini-musical: A Creative Partnership with Artificial Intelligence Tools, and learn more about this production at bit.ly/220aimusical.