Coldplay and BTS
NBC's The Voice Season Finale Performance
All of it Now
Mixed Reality Production Studio
Coldplay and BTS performed “My Universe” together in mixed reality on NBC’s The Voice, with Coldplay performing in person and the BTS members appearing on stage as augmented reality (AR) holograms: 3D virtual avatars created via volumetric capture and rendered live in Unreal Engine 4.27.1.
How did this project begin?
The initial discussions for this project began in August between Coldplay Creative, Dimension Studio, and All of it Now (AOIN), where conversations involved using real-time AR to bring the BTS band members on stage with Coldplay even if they were unable to physically attend.
From there, conversations pivoted to finding a show where all teams would have sufficient time, access, and support to produce this effect. The Coldplay team selected The Voice because of the experience of the existing production team, as well as the studio environment and infrastructure, which significantly expedited the install process.
What were some technical challenges you encountered?
As with any groundbreaking innovation, there were technical challenges to overcome with the MP4 approach: the current SVF plugin for UE4 could not properly track the MP4 recordings to timecode. This required collaboration between AOIN and Microsoft to rewrite elements of the SVF plugin code so that the BTS performers remained in sync with Coldplay on stage.
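The core of timecode-locked playback can be illustrated with a small sketch. This is not the SVF plugin's actual code; the `Timecode` class and `clip_frame_for_timecode` helper are hypothetical, and the frame rate is an illustrative assumption.

```python
from dataclasses import dataclass

# Hypothetical sketch: map incoming house timecode to a frame index in a
# pre-recorded volumetric clip, so playback stays locked to the live show
# rather than free-running. Names and rates are illustrative assumptions.

@dataclass
class Timecode:
    hours: int
    minutes: int
    seconds: int
    frames: int
    fps: float  # nominal frame rate of the timecode stream

    def to_frames(self) -> int:
        total_seconds = self.hours * 3600 + self.minutes * 60 + self.seconds
        return round(total_seconds * self.fps) + self.frames

def clip_frame_for_timecode(now: Timecode, clip_start: Timecode) -> int:
    """Return the clip frame to display for the current house timecode."""
    offset = now.to_frames() - clip_start.to_frames()
    if offset < 0:
        raise ValueError("clip has not started yet")
    return offset
```

Driving playback from the offset between live timecode and the clip's start point means a dropped frame or stalled decoder re-syncs on the next tick instead of drifting.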
Another challenge with the original volumetric capture recordings was that the BTS performances had been recorded at 24 FPS to match the music video frame rate, while The Voice is produced and broadcast at 29.97 FPS. Blending 24 FPS material into a 29.97 FPS output created some lip-sync issues, which AOIN was able to clean up in post-production.
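The arithmetic behind that mismatch is easy to see in a short sketch. The two rates come from the article; the helper name is a hypothetical illustration of why a naive player must repeat or blend source frames.

```python
# Illustrative sketch of the frame-rate mismatch: a 24 fps recording played
# back in a 29.97 fps broadcast cannot map output frames to whole source
# frames, so the renderer must repeat or blend adjacent frames, which is
# where lip-sync artifacts creep in.

SOURCE_FPS = 24.0
OUTPUT_FPS = 30000 / 1001  # 29.97, the NTSC broadcast rate

def source_frame_for_output_frame(n: int) -> float:
    """Fractional 24 fps source frame that lands on broadcast frame n."""
    return n * SOURCE_FPS / OUTPUT_FPS

# Output frame 5 wants source frame 4.004, i.e. almost entirely source
# frame 4 but not exactly, so a blending player mixes frames 4 and 5.
```

Because 24 does not divide evenly into 30000/1001, nearly every broadcast frame falls between two source frames, and the fractional error shifts continuously rather than repeating in a clean cadence.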
How were you able to effectively collaborate and build the production across so many teams?
The AOIN team received 3D stage assets from The Voice Team, and were able to put the performers inside of a 3D representation of the Voice Stage. This process enabled a crucial round of Previsualization, where the Coldplay creative team could test AR performer configurations, tracking area, transition timing, and camera blocking before even stepping foot inside the studio.
AOIN ingested the 3D stage assets and production camera plot into Unreal, creating a real-world, scale-accurate representation of the seven AR cameras and the performers. This let the Coldplay creative team visualize which performers would be visible in each moment and which cameras could best capture it, and previsualization renders became the best method of communicating these moments to all parties involved.
This previsualization time was crucial in making the best use of the limited time onsite, and also helped unify all technical and creative efforts into a shared deliverable with shared understanding.
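The kind of camera-coverage question answered in previs can be sketched in miniature. The real work was done with the stage assets inside Unreal Engine; the function below, its 2D floor-plan coordinates, and the performer names are all illustrative assumptions.

```python
import math

# Hypothetical previs-style check: given a camera position, aim, and
# horizontal field of view on a 2D stage floor plan, list which performer
# positions fall inside the camera's FOV. Coordinates are illustrative.

def visible_performers(cam_pos, cam_aim_deg, fov_deg, performers):
    """Return names of performers within the camera's horizontal FOV."""
    visible = []
    for name, (x, y) in performers.items():
        angle = math.degrees(math.atan2(y - cam_pos[1], x - cam_pos[0]))
        delta = (angle - cam_aim_deg + 180) % 360 - 180  # signed angle diff
        if abs(delta) <= fov_deg / 2:
            visible.append(name)
    return visible
```

Even a crude check like this answers the planning question the article describes: for a given camera and lens, which AR performers land in frame at each moment.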
How did you pull this project back from the brink of certain disaster?
One hurdle was that The Voice performance slot was reduced to 3:30, whereas the original volumetric capture performances had been recorded at the full four-minute run time. This required shortening the performance without losing any metadata within the encoded files. Because the AR holograms were tracked to timecode, the AOIN team was able to edit the incoming timecode to effectively “skip over” the removed section, hiding the jump with a transition.
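The timecode "skip" described above amounts to a simple remap. This is a hedged sketch, not AOIN's implementation: the function name is hypothetical, timecode is simplified to frame counts, and the cut boundaries are illustrative.

```python
# Hypothetical sketch of the timecode skip: once the live show passes the
# cut point, the timecode driving the holograms jumps forward by the
# length of the removed section, so the longer recording stays in sync
# with the shortened live performance. All values are illustrative.

def remap_timecode(live_frames: int, cut_start: int, cut_length: int) -> int:
    """Map live show timecode (in frames) to clip timecode, skipping a cut."""
    if live_frames < cut_start:
        return live_frames               # before the cut: pass through
    return live_frames + cut_length      # after the cut: jump past the removed section
```

Since the holograms follow whatever timecode they receive, shifting the values at the cut point plays the clip's later section early, and a visual transition at that moment hides the jump on camera.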
All of it Now is a mixed reality production studio specializing in real-time technologies for live and broadcast entertainment. All of it Now has experience integrating advanced tracking technology into real-time content across multiple platforms and applications, including XR, in-camera VFX, and post-production.