Narrative/storytelling Performance Play/Games VR/AR

The Last Dance

The Last Dance is an immersive mixed-reality theatre experience that combines Virtual Reality and real-time motion capture technology. It is an experimental narrative that challenges the boundaries of space and storytelling, inspired by the enduring regrets and unfinished stories in life. Through this experience, the audience member takes on the role of the father in VR, confronting unresolved regrets. In the end, the participant shares "the last dance" with the departed daughter, performed physically by motion capture actress Yixin Wang.

Student

Coco Ma

Advisor

Rosalie Yu

Abstract

The Last Dance is an immersive mixed-reality theatre experience written and directed by Coco Ma, combining Virtual Reality and real-time motion capture technology. It is an experimental narrative inspired by the enduring regrets and unfinished stories in life, challenging the boundaries of space and storytelling. The audience member wears a VR headset and takes on the role of the father in the story, confronting his unresolved regrets and sharing the last dance with his departed daughter, portrayed by actress Yixin Wang. In the final moments of the story, the participant dances physically with the human actress while seeing the daughter in the virtual world through the headset. Unlike conventional theatre performances or VR experiences, "The Last Dance" is a dynamic and unpredictable journey, shaped by the audience's unique emotional responses. Each performance is an open-ended, interactive exploration of grief and loss, seen through the lens of the father's story. Through the innovative use of live motion capture technology, "The Last Dance" creates a seamless fusion of the virtual and the real, blurring the line between them to form a more intimate, engaging, and profound kind of storytelling. This immersive experience is a testament to the limitless potential of the creative arts, inviting audiences to participate in a transformative journey of self-discovery and emotional healing.

Research

The primary objective of The Last Dance is to determine the ideal level of audience participation and to explore the integration of storytelling in VR experiences. The early stages of the project involved exploring different ideas and determining the level of audience involvement. Should there be a live audience? Would it work better as a multiplayer VR experience, or as a theatre performance with both players acting? Mimi asked me about my original intention, which was to use "The Last Dance" to bring closure to feelings of regret. After trying scenarios in which strangers danced together and experimenting with different music genres, I concluded that it was difficult for players to immerse themselves fully without proper guidance and context, so I began writing a script. Attending the SXSW XR exhibition and the GDC conference gave me insights into storytelling techniques in VR and games. I was particularly impressed by "Find Wiillii," a VR immersive theatre experience that allowed audiences to interact and converse with virtual actors. As an improvisational comedian, I valued the spontaneity and personal nature of that experience and sought to preserve the same improvisational quality in my own work. Ultimately, the project settled on one live motion capture performer, one audience member wearing a VR headset, a large projection of that member's first-person perspective, and theatre audience members as spectators. Because of the improvisational element, each performance is unique. I hope this research contributes to the growing body of knowledge on immersive storytelling techniques and informs future advancements in the field.

Technical Details

I used real-time motion capture technology, specifically the OptiTrack system, to capture live data from the actor's movements in Motive; the data was streamed into MotionBuilder and then into Unreal Engine. Unreal Engine, Maya, and Blender were used for world building and character design. The floor lamps on stage used MR16 bulbs driven by an Arduino, which transmitted data to Max/MSP and from there into Unreal Engine, so that the on-stage and in-VR lighting stayed synchronized in real time. For the dancing portion, I experimented with attaching a rigid body to the VR headset so that the OptiTrack cameras could track its position.
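The exact wiring between Max/MSP and Unreal Engine isn't specified above; a common way to bridge the two is OSC over UDP, which Max sends natively and Unreal receives through its OSC plugin. The sketch below is a minimal, stdlib-only illustration of that idea, not the production setup: it hand-encodes an OSC message carrying a lamp brightness value and sends it to a hypothetical Unreal OSC server. The address `/lamp/1/brightness` and port `8000` are assumptions for the example.

```python
import socket
import struct

def osc_message(address: str, value: float) -> bytes:
    """Encode a minimal OSC message with a single float argument.

    OSC strings are null-terminated and padded to a 4-byte boundary;
    floats are sent as big-endian 32-bit values.
    """
    def pad(b: bytes) -> bytes:
        b += b"\x00"
        while len(b) % 4:
            b += b"\x00"
        return b

    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Hypothetical example: push a brightness value (0.0-1.0) for lamp 1
# to an Unreal OSC server listening on localhost:8000.
msg = osc_message("/lamp/1/brightness", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 8000))
sock.close()
```

In practice Max/MSP's `udpsend` object produces equivalent packets, so the same Unreal-side listener works whether the data originates from the Arduino-fed Max patch or from a test script like this one.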