
The Madison Beer Immersive Reality Concert Experience

Sony Music rewrites the rules of virtual shows with Unreal Engine. Photo: Hyperreal & Epic Records.

While most concerts are limited by worldly constraints, a virtual concert can be whatever an artist wants it to be, giving them the power to shape fan experiences and realise fantastical concepts at a much higher level than is possible in real life. The Madison Beer Immersive Reality Concert Experience takes this idea and runs with it, turning one piece of content into the type of transmedia campaign that can thrill fans from YouTube to VR.

For all the leeway afforded to them by 3D, the production team still saw value in maintaining a measure of realism.

“When we started with a blank canvas, our creative goal was to construct a virtual concert through photoreal recreations of a real venue and a real artist, but which also layered in enough magic to reimagine the concert experience itself,” said Brad Spahr, Head of Sony Immersive Music Studios.

“You start with things that are totally plausible in a physical setting, because that’s what’s going to make your fans get into it and accept the experience,” said Magnopus Co-Founder, Alex Henning. “Once you’ve got them hooked with that kernel of truth, you start to build on top of that with the fantastical.”

Hyperreal started by capturing Madison’s face and body with two separate arrays of high-resolution camera systems in Los Angeles. The first system formed a capture volume around her face, neck, and shoulders, recording photometric data at the sub-pore level. By capturing the way she moved from every angle, Hyperreal was able to get enough data to construct an ultra-realistic avatar, or “HyperModel,” that steers clear of the Uncanny Valley.

With the help of 200 cameras, Madison’s body, muscles, and shape were then recorded in a range of biomechanical positions to ensure deformation accuracy in Hyperreal’s real-time HyperRig system. After adding Madison’s preferred performance gear, Hyperreal brought the avatar into Unreal Engine to experiment with movement before the live capture session at PlayStation Studios in LA. While this was happening, Magnopus was hard at work on the venue and VFX systems.

After considering a full LiDAR scan, Sony Immersive Music Studios decided to construct the venue from scratch to allow them more control over the lighting. They started with the original CAD files, which were imported into Autodesk Maya and given the full artistic treatment, including all the nuances that make Sony Hall unique.

Magnopus was then able to build upon that with lighting and VFX. “Sony Hall is an intimate venue with a lot of character, detail and beauty, which made it an ideal environment for the experience,” said Spahr. “It is also great for VR, because of the scale. It’s not a giant, cavernous arena or a tiny hole-in-the-wall club,” said Henning. “It’s got almost the perfect amount of dimension.”

Magnopus made use of Unreal Engine’s built-in virtual scouting tools to get their cameras set up so they could test the lighting before diving into the special effects.
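
Magnopus’ scouting was done with Unreal Engine’s VR-based editor tools rather than through code, but the same blocking pass can be sketched with the editor’s Python scripting (available when the Python Editor Script Plugin is enabled). The snippet below is a rough illustration rather than anything from the production: it drops a few cine cameras at made-up vantage points inside a venue level so lighting can be reviewed from each position.

```python
# Illustrative sketch only: the production used Unreal Engine's VR virtual
# scouting tools. This shows a scripted way to block out candidate camera
# positions with the Python Editor Script Plugin enabled. All locations,
# rotations, and labels are hypothetical placeholders, not production data.
import unreal

# Candidate vantage points inside the venue level (units are centimetres).
CANDIDATE_SHOTS = [
    ("FOH_Wide",   unreal.Vector(0.0, -1500.0, 300.0),  unreal.Rotator(0.0, -5.0, 90.0)),
    ("Stage_Left", unreal.Vector(-600.0, -400.0, 180.0), unreal.Rotator(0.0, 0.0, 35.0)),
    ("Balcony",    unreal.Vector(0.0, -2200.0, 700.0),  unreal.Rotator(0.0, -15.0, 90.0)),
]

for label, location, rotation in CANDIDATE_SHOTS:
    # Spawn a cine camera at each candidate position so the lighting can be
    # checked through a proper filmback and focal-length model.
    camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
        unreal.CineCameraActor, location, rotation
    )
    camera.set_actor_label(f"ScoutCam_{label}")
```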

VIRTUAL MUSIC PRODUCTION BENEFITS

Unlike most motion capture shoots, The Madison Beer Immersive Reality Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer was in a mocap suit and head-mounted camera. In Philadelphia, Hyperreal CEO, Remington Scott was directing her in real time, using a VR headset that not only allowed him to view Madison’s avatar face-to-face, live within the virtual Sony Hall, but also to adhere to the COVID-19 restrictions that were keeping them apart.

After the motion capture shoot was completed and the experience was polished, Cameraman and Gauge Theory Creative Managing Director, Tom Glynn was able to build out the shot selections for the final 9.5-minute performance.

“There are moments where you can’t believe this was done in a game engine,” remarked Glynn, surprised at how easy a virtual production shoot could be on a cameraman. In two days, they recorded hundreds of takes, ensuring that they could get any shot they wanted. “If there was one thing about this that was a challenge, it was, ‘I have so many good shots, I don’t know which one to use!’”

Glynn was also able to overcome some of the physical limitations of the real world with a few quick commands using Unreal Engine.

MOMENT MAKERS

Once the footage was filmed, Magnopus added effects that would not only catch the eye, but would be impossible to recreate in real life.

“There’s a sequence where there’s a ring of fire around Madison. There’s a moving experience where raindrops are falling around her. These are things that, due to safety issues, wouldn’t be allowed in a normal concert venue,” said Spahr.

Magnopus created lighting and special effects within Unreal Engine, using real-time ray tracing and the timeline tools in Sequencer, the engine’s built-in multi-track editor, to jump around as they edited different sections of a song. With Pixel Streaming at its disposal, Magnopus was able to overcome the hardware limitations that box artists in.
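
Sequencer can also be driven from the editor’s Python API. As a loose sketch of the kind of timeline setup described above, and not the production’s actual scripts, the following creates a Level Sequence for one section of a song, binds a camera into it, and adds a camera cut track; asset names, paths, and timings are invented for illustration.

```python
# Illustrative sketch only: Sequencer was used interactively on this
# production. Asset names, paths, and durations below are hypothetical.
import unreal

# Create an empty Level Sequence asset to hold one section of the song.
sequence = unreal.AssetToolsHelpers.get_asset_tools().create_asset(
    asset_name="Seq_SongSection_01",
    package_path="/Game/Cinematics",
    asset_class=unreal.LevelSequence,
    factory=unreal.LevelSequenceFactoryNew(),
)

# Spawn a cine camera in the level and bind it into the sequence.
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor, unreal.Vector(0.0, -1500.0, 300.0)
)
camera_binding = sequence.add_possessable(camera)

# A camera cut track tells the sequence which camera to view through;
# here a single 30-second cut section is added as a placeholder.
camera_cut_track = sequence.add_master_track(unreal.MovieSceneCameraCutTrack)
camera_cut_section = camera_cut_track.add_section()
camera_cut_section.set_start_frame_seconds(0.0)
camera_cut_section.set_end_frame_seconds(30.0)

# Pointing the cut section at the camera binding is then done in the
# Sequencer UI (or via the camera binding ID utilities).
```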

“In real time, you’ve always got a render budget and you can’t go over it. It’s always locked to whatever the target device’s power is and you’ve got a certain amount of things you can render, at a certain level of quality, in order to get the screen to refresh at a refresh rate that you need it to,” said Henning. “Being able to exceed that and go well beyond it is appealing for an artist.”
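
Henning’s point can be made concrete with a little arithmetic: the render budget is simply the frame time implied by the target refresh rate, so a 90Hz VR headset leaves roughly 11ms to draw each frame, while a 60Hz screen allows about 16.7ms. The sketch below uses generic example targets, not figures from this production, and notes why Pixel Streaming loosens the constraint.

```python
# Per-frame render budget implied by a target refresh rate:
# budget_ms = 1000 / refresh_rate_hz. Figures are generic examples,
# not production targets.

TARGETS_HZ = {
    "mobile / streamed video": 30,
    "console or desktop": 60,
    "VR headset": 90,
}

for device, hz in TARGETS_HZ.items():
    budget_ms = 1000.0 / hz
    print(f"{device:>24}: {hz:3d} Hz -> {budget_ms:5.2f} ms per frame")

# With Pixel Streaming, the scene is rendered on a server-class GPU and the
# viewer receives encoded video, so the quality ceiling is set by the server
# rather than by the fan's phone or headset.
```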

According to Spahr, there are plenty of opportunities for new shows and new ways to use digital avatars to reimagine music. “Anything an artist can dream up can be brought to life, no matter how fantastical it might be. We don’t have to operate within constraints,” he concluded.

This article originally appeared in issue #265 of TPi, which you can read here.

www.unrealengine.com
