Sony Music’s ‘digital Madison Beer’ sets the virtual concert world on fire

For pop stars, staying in the public eye means constantly standing out—on social, on playlists, on stage, and more.

But in a year when pandemic restrictions became the norm, many artists had to find new ways to reach their fans around the world, sometimes to a jaw-dropping extent.

Among the star-studded livestreams and innovative virtual performances over the last 12 months, Epic Records artist Madison Beer has released what might be the most photorealistic depiction of a musician yet.

The Madison Beer Immersive Reality Concert Experience, a groundbreaking, effects-filled virtual performance that premiered on TikTok LIVE and is now coming broadly to YouTube, VR platforms and more, shows just how far an idea can go when artists set real-time rendering and virtual production loose on their vision.

An ultra-realistic digital avatar of Madison is the centerpiece of a boundary-pushing concert that would be impossible to recreate in real life.

Sony Music Entertainment and Verizon worked with Madison to develop a full-scale recreation of New York’s Sony Hall and present a medley of her hits with all the production value you’d expect from a major artist. Only it’s completely virtual—except for the music and performance driving the experience.

For creatively adventurous artists seeking new and innovative ways to connect with audiences, that can be a good thing.

While most concerts are limited by worldly constraints, a virtual concert can be whatever an artist wants it to be, giving them the power to shape fan experiences and realize fantastical concepts at a much higher level than is possible in real life.

The Madison Beer Immersive Reality Concert Experience takes this idea and runs with it, turning one piece of content into the type of transmedia campaign that can thrill fans from YouTube to VR.

Keeping it real

For all the leeway afforded to them by 3D, the production team—led by Sony Immersive Music Studios, Magnopus, Gauge Theory Creative, and Hyperreal—still saw value in maintaining a measure of realism.

“When we started with a blank canvas, our creative goal was to construct a virtual concert through photoreal recreations of a real venue and a real artist, but which also layered in enough magic to reimagine the concert experience itself,” says Brad Spahr, Head of Sony Immersive Music Studios.

“You start with things that are totally plausible in a physical setting, because that’s what’s going to make your fans get into it and accept the experience,” says Alex Henning, Co-Founder of Magnopus.

“Once you’ve got them hooked with that kernel of truth, you start to build on top of that with the fantastical. And the more you can pull off the former, the more ‘wow’ you get out of the latter.”

For Magnopus, this meant the venue and the VFX packages. For Hyperreal, it meant Madison herself.

Hyperreal started by capturing Madison’s face and body with two separate arrays of high-resolution camera systems in Los Angeles.

The first system produced a volume for her face, neck, and shoulders, as it recorded photometric data at the sub-pore level.

By capturing the way she moved from every angle, Hyperreal was able to get enough data to construct an ultra-realistic avatar, or “HyperModel,” that steers clear of the Uncanny Valley.

With the help of 200 cameras, Madison’s body, muscles, and shape were then recorded in a range of biomechanical positions to ensure deformation accuracy in Hyperreal’s real-time HyperRig system.

After adding Madison’s preferred performance gear—outfit, hairstyle, earrings—Hyperreal brought the avatar into Unreal Engine to experiment with movement before the live capture session at PlayStation Studios in LA.

While this was happening, Magnopus was hard at work on the venue and VFX systems. As with the HyperModel, the goal was to stay as real as possible to ground the event, so that when things like star fields started appearing above Madison, they would seem magical and surprising.

After considering a full LiDAR scan, Sony Immersive Music Studios decided to construct the venue from scratch to allow for more control over the lighting.

They started with the original CAD files, which were imported into Autodesk Maya and given the full artistic treatment, including all the nuances that make Sony Hall unique.

Magnopus was then able to build upon that with lighting and VFX to achieve the overall goal of a reimagined concert experience.
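
That hand-off from Maya into the engine is the kind of step that lends itself to scripting. As a rough sketch only, and assuming the venue is exported as an FBX file (the file path and content folder below are hypothetical, not taken from the production), Unreal Engine’s editor Python API can batch-import the geometry:

```python
import unreal

# Hypothetical paths: batch-import Maya-exported venue geometry into the project.
task = unreal.AssetImportTask()
task.filename = "D:/SonyHall/Exports/SonyHall_Venue.fbx"  # hypothetical FBX export from Maya
task.destination_path = "/Game/SonyHall/Environment"      # hypothetical content folder
task.automated = True   # suppress import dialogs so this can run unattended
task.save = True        # save the imported assets immediately

options = unreal.FbxImportUI()
options.import_mesh = True
options.import_as_skeletal = False   # the venue is static geometry
options.import_materials = True
options.import_textures = True
task.options = options

unreal.AssetToolsHelpers.get_asset_tools().import_asset_tasks([task])
```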

“Sony Hall is an intimate venue with a lot of character, detail, and beauty, which made it an ideal environment for the experience,” says Spahr.

“It is also great for VR, because of the scale. It’s not a giant, cavernous arena or a tiny hole-in-the-wall club,” says Henning. “It’s got almost the perfect amount of dimension.”

Since Unreal Engine would be used throughout the creation process, Magnopus made use of its built-in virtual scouting tools to get cameras set up and test the lighting before diving into the special effects.
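
In practice, virtual scouting is usually done in VR with Unreal Engine’s own toolset, but the underlying idea of planting candidate cameras and looking through them can be sketched with the editor’s Python scripting API. The snippet below is illustrative only; the camera label, position, and rotation are hypothetical:

```python
import unreal

# Spawn a CineCameraActor at a candidate vantage point in the venue level
# (location is in centimeters; values here are hypothetical).
camera = unreal.EditorLevelLibrary.spawn_actor_from_class(
    unreal.CineCameraActor,
    unreal.Vector(0.0, -800.0, 170.0),
    unreal.Rotator(0.0, 0.0, 90.0),
)
camera.set_actor_label("ScoutCam_StageFront")

# Pilot the viewport through the new camera so the lighting can be judged
# from the shot's point of view before any effects work begins.
unreal.EditorLevelLibrary.pilot_level_actor(camera)
```

But first, they needed the performance.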

The benefits of virtual production for music

Unlike most motion capture shoots, where everyone can be together in one place, The Madison Beer Immersive Reality Concert Experience was a remote affair driven by teams across the US. In LA, Madison Beer performed in a mocap suit and head-mounted camera.

In Philadelphia, Hyperreal CEO Remington Scott directed her in real time, using a VR headset that allowed him to view Madison’s avatar face-to-face live within the virtual Sony Hall while adhering to the COVID-19 restrictions that were keeping them apart.

Because Unreal Engine operates in real time, virtual productions can use its remote collaboration tools to stream 3D environments anywhere in the world, completely synced across locations.

This allowed Madison’s performance to be recorded in one continuous take, with no cuts and no edits, which was important for a team that wanted the performance to feel as authentic as possible.

After the motion capture shoot was completed and the experience was polished, cameraman Tom Glynn was able to build out the shot selections for the final 9.5-minute performance.

“There are moments where you can’t believe this was done in a game engine,” says Tom Glynn, Managing Director at Gauge Theory Creative.

“There’s a 3D world with a performance happening, and it’s happening at the same time that I’m moving a camera around.

It’s hard to believe what I was seeing in the viewfinder while I was shooting it. I’m looking at an avatar of Madison Beer and it feels like a real person I’m standing in the room with. It kind of blows my mind.”

In two days, they recorded hundreds of takes, ensuring that they could get any shot they wanted.

“We were hitting the play button, Madison was performing, and Tom was getting his shots in real time. He was instantaneously watching his shots on the monitor directly in front of him.

Then we would stop, readjust, hit play, and the concert would go again and he’d get another shot,” says Spahr. “That real-time feedback was huge.

If there was one thing about this that was a challenge, it was, ‘I have so many good shots, I don’t know which one to use!’ It was an embarrassment of riches.”
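
That “hit play and the concert goes again” loop is easy to picture in engine terms. Assuming the recorded performance lives in a Level Sequence (the asset path below is invented for illustration, not the project’s real content), it can be rewound and replayed from Unreal’s editor Python API:

```python
import unreal

# Hypothetical asset path: the recorded performance stored as a Level Sequence.
sequence = unreal.load_asset("/Game/SonyHall/Sequences/MadisonBeer_Performance")

# Open the sequence in Sequencer, rewind to the top of the show, and play it back
# so the camera operator can line up another pass in real time.
unreal.LevelSequenceEditorBlueprintLibrary.open_level_sequence(sequence)
unreal.LevelSequenceEditorBlueprintLibrary.set_current_time(0)
unreal.LevelSequenceEditorBlueprintLibrary.play()
```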

Glynn was surprised by how easy a virtual production experience could be on a cameraman, especially for a “concert” shoot.

Traditionally, a live performance would require five to twelve cameramen set up in strategic parts of the venue with a variety of tripods, dollies, Steadicams, and so on.

The team would prepare, shoot it once, and get what they got. In this case, Glynn was able to use all the same equipment, including a handheld rig, but film within a virtual world that allowed for quick takes.

Using Unreal Engine, Glynn was also able to overcome some of the physical limitations of the real world with a few quick commands.

For instance, sometimes the shot he wanted was a little above or below Madison’s eyeline. So the team just increased his size by a factor of 1.2 or 1.5 within the environment, and he was suddenly “tall” enough to get it.

Other times, he wanted to keep up with her quick moves without introducing bumpiness into the take. So they increased the translation scale by 1.5–2x until one step equaled seven. Problem solved.
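
Both tricks come down to multiplying the tracked rig’s motion before it drives the virtual camera. A minimal, purely illustrative sketch of the math (the helper function and numbers are hypothetical, not the production code):

```python
# Hypothetical helper: scale a tracked camera operator's motion before applying it
# to the virtual camera, so one physical step covers more virtual distance and the
# operator can be made "taller" without a ladder.

def scale_tracked_motion(position_delta_cm, eye_height_cm,
                         translation_scale=1.5, height_scale=1.2):
    """Return the scaled per-frame movement and eye height for the virtual camera.

    position_delta_cm -- (x, y, z) movement of the physical rig since the last frame
    eye_height_cm     -- the operator's physical eye height
    """
    scaled_delta = tuple(axis * translation_scale for axis in position_delta_cm)
    scaled_height = eye_height_cm * height_scale
    return scaled_delta, scaled_height

# A 70 cm physical step becomes 105 cm of virtual travel at a 1.5x translation scale,
# and a 170 cm eyeline is lifted to 204 cm at a 1.2x height scale.
print(scale_tracked_motion((70.0, 0.0, 0.0), 170.0))
```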

Moment makers

Once the footage was “in the can,” it was up to Magnopus to sweeten it with effects that would not only catch the eye, but also be impossible in real life.

“There’s a sequence where there’s a ring of fire around Madison. There’s a moving experience where raindrops are falling around her.

These are things that, due to safety issues, wouldn’t be allowed in a normal concert venue,” says Spahr. “So we are giving a fan the ability to see a concert in a new way, but then we dial it up, with cosmic star fields.”
