Released on New Year’s Eve, Jean Michel Jarre’s Virtual Reality show, Welcome To The Other Side, is a mind-expanding experience that garnered 900,000 views on YouTube within a week of its release and, according to Sony Music International, over 75 million views across all outlets (Facebook, VRC, Weibo, TikTok). Defying all expectations, his creation swirls high-tech sights and sounds around a reimagined (and virtual) nine-centuries-old Notre Dame cathedral.
Conceived to celebrate the arrival of 2021 in the French capital, the 55-minute show sends animated geometric forms rising and falling within the historic church’s nave, keyboard instruments melting with colour, and brilliant light patterns running up stone columns to play off the stained glass windows, all turning common perceptions on their head so the viewer can come out “on the other side.”
Moving seamlessly with Jarre’s music in Welcome To The Other Side is a multifarious lightshow created by Jvan Morandi of Placing Shadows, who used his ChamSys MagicQ MQ80 console as the starting point to merge show business with game engine workflows. Run along triggered timelines within the Unity game engine, the lightshow is divided into two parts: the interior stage-element and architecture sequences, created via the ChamSys but then run directly by the Unity engine; and a series of exterior architectural sequences, involving lights and lasers as well as camera moves, played back from the lighting desk through Art-Net and software.
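To give a rough sense of what playback “through Art-Net” involves, here is a minimal, illustrative sketch in Python of an Art-Net (ArtDMX) listener that maps incoming DMX channel values to named virtual fixtures. The fixture names and channel assignments are hypothetical, invented for this example; the show’s actual bridge between the desk and the engine is not described in that detail.

```python
# Illustrative sketch only: listen for Art-Net ArtDMX packets and slice the DMX frame
# into per-fixture values, the general kind of bridge that lets a lighting desk drive
# playback in external software. The patch table below is hypothetical.
import socket

ARTNET_PORT = 6454
ARTNET_HEADER = b"Art-Net\x00"
OP_DMX = 0x5000

# Hypothetical patch: fixture name -> (universe, first DMX channel, channel count)
PATCH = {
    "facade_wash_1": (0, 1, 3),    # RGB wash on the facade
    "laser_pan_tilt": (0, 10, 2),  # pan/tilt pair for a laser head
}

def parse_artdmx(packet):
    """Return (universe, dmx_data) for an ArtDMX packet, or None otherwise."""
    if len(packet) < 18 or not packet.startswith(ARTNET_HEADER):
        return None
    opcode = int.from_bytes(packet[8:10], "little")
    if opcode != OP_DMX:
        return None
    universe = int.from_bytes(packet[14:16], "little")  # 15-bit port address
    length = int.from_bytes(packet[16:18], "big")
    return universe, packet[18:18 + length]

def fixture_values(universe, dmx):
    """Yield (fixture_name, channel_values) for fixtures patched in this universe."""
    for name, (uni, start, count) in PATCH.items():
        if uni == universe and len(dmx) >= start - 1 + count:
            yield name, list(dmx[start - 1:start - 1 + count])

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("0.0.0.0", ARTNET_PORT))
    while True:
        packet, _ = sock.recvfrom(2048)
        parsed = parse_artdmx(packet)
        if parsed:
            for name, values in fixture_values(*parsed):
                print(name, values)  # a real bridge would update in-engine parameters here
```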
“We programmed the cues with a ChamSys MQ80 in my studio,” noted Morandi. “All the cue lists come from ChamSys and were translated into a set of Unity animation triggers on a timeline. When I say ‘translated,’ I mean that Victor Pukhov used the visualized lighting cues to create shader animations that were then triggered by custom Unity scripts written by Antony Vitillo.”
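As a rough illustration of that translation step, the sketch below converts an exported cue list into time-sorted trigger data that an engine-side script could read. The CSV columns, JSON shape, and function name are assumptions made for this example, not the actual tooling Pukhov and Vitillo used.

```python
# Minimal sketch, assuming a cue list exported as CSV with columns cue, time_s, target,
# intensity. It produces a JSON list of time-sorted trigger events for a show timeline.
import csv
import json

def cues_to_triggers(csv_path, json_path):
    """Convert exported cue rows into trigger events sorted by time on the timeline."""
    triggers = []
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            triggers.append({
                "time": float(row["time_s"]),         # seconds on the show timeline
                "animation": f"cue_{row['cue']}",     # shader/animation clip to fire
                "target": row["target"],              # e.g. a column or window group
                "intensity": float(row["intensity"])  # 0.0 - 1.0
            })
    triggers.sort(key=lambda t: t["time"])
    with open(json_path, "w") as f:
        json.dump(triggers, f, indent=2)

# Example usage: cues_to_triggers("cuelist_export.csv", "timeline_triggers.json")
```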
Morandi credits his MQ80 with helping this process go smoothly. “The Copy linked features in the ChamSys console were very useful, as was the Off-Set patch,” he said. “With so many camera shots on the outside I needed to dress the shot and fill it differently depending on the situation. Also, the ability to link my desk directly to my software and transfer data between the two, (they patch automatically in ChamSys) was a big help in creating the shots quickly.”
During the show, Jarre played live from TV Studio Gabriel in Paris. He was lit only by a video projector and portrayed in vivid colours and shapes taken directly from the same video content that was UV-mapped onto the inside of the virtual cathedral. Lending another evocative touch to the show was the virtual rendition of Notre Dame’s interior. The 3D model and the game engine programming were optimized by Lapo Germasi and Victor Pukhov of Manifattura Italiana Design. “Once we received the interior of the cathedral, we worked with our studio software and the game engine to find the right looks,” explained Morandi.
When designing the stage set in the middle of the church, Morandi envisioned a “modern version of Stonehenge.” He viewed the circular stage as a reflection of the shape of the big stained-glass window at the front of the cathedral. The stage columns that animated the scene were video-mapped and received streamed content from a cue list on Vimeo.
“The stage lights are actually not real lights but volumetric shaders that were animated by Victor Pukhov to mimic my ChamSys-programmed lighting cues,” said Morandi. “Most importantly, Vincent Masson created the 3D animations that made the show look stunning. He used our 2D content as a starting point and, with a lot of passion and talent, created the 3D versions of it.”
Collaboration was critical to making this VR creation come to fruition. “This project involved a great many very creative people coming together from diverse backgrounds, starting with Jean Michel Jarre whose vision and hard work made it all possible,” said Morandi.
“Credit should also go to Louis Caracciolo from VRroom, our French VR producer, and Antony Vitillo of NTW (the Italian developers who looked after all the scripting and game engine functionality). Jonathan Klahr did an amazing job on the 2D video content mapped onto the interior walls. Stephan and Jeroen from LaserImage of Amsterdam programmed the initial laser sequences. Georgy Molotsdov, Maud Clavier, and David Montagne (global TV broadcast) did a great job filming the show, all in VR.”
Exemplifying the scope of the project, one camera director was in Moscow (Georgy Molotsdov) while another was in Paris (Maud Clavier). Each of them controlled up to eight remote VR cameras and drones. The live gig was filmed entirely in VR within the VRChat platform, an accomplishment that would have been unimaginable not long ago but one that will become commonplace in the not-too-distant future, according to Morandi.
“Once we have tested and solved various technological issues, I see tours in the future travelling with a VR/AR component in the crew,” he said. “Each show will be attended by a real audience as well as VR/AR audiences. It will be just another way to enjoy entertainment.”