Virtual Sets come of age

  • By Adrian Pennington

The virtual production phenomenon goes from strength to strength, with new tools, greater photorealism and lower prices driving demand.

The pandemic swelled demand, but virtual sets were on track to revolutionise production regardless. Pricing for virtual studios has been falling steadily for the past five years, putting them within reach of even the smallest local station’s budget. With increased rendering power, enhanced digitally curated graphics libraries and tools for creating photorealistic virtual graphics, the visual quality of today’s virtual set systems is outstanding. Sophistication and ease of use are increasing all the time, given the technology’s application in remote production of live events, news, sports, education, the enterprise and more.

“The technologies may have been around in broadcast and film for decades but now come with an astonishing level of photorealism,” says Miguel Churruca, Marketing Director, Brainstorm. “Furthermore, the whole industry has realised that virtual production can save the business when travel restrictions and social distancing have to be applied.

“Not only broadcasters but also live events and corporate communications companies, and content creators of many kinds, have used virtual studios. We’ve seen how many broadcasters covered the Olympics using virtual technology, and also how companies dedicated to live production, corporate communications and presentations in general used virtual sets and production to create content and events for their clients.”

Brainstorm recently introduced Edison PRO, “an innovative, template-based playout application that seeks to transform any live, online presentation or conference into an immersive experience through AR and virtual environments,” says Churruca.

“It allows users to enhance their speech and storytelling with real-time 3D graphics and other visual aids, as well as include themselves in the presentation, through an affordable and easy-to-use platform that does not require expensive hardware or a studio setup.

“Benefitting from Brainstorm’s advanced 3D technologies, Edison PRO seeks to democratize the virtual set technology, allowing users to create dazzling presentations starting from a PPT or PDF document, and without requiring specific knowledge of 3D.”

The industry currently uses live-events LED technology and needs to transition to fit-for-purpose cinematic LED technology. Mo-Sys’ Cinematic XR initiative aims to drive change in LED wall technology, output image quality, the reintroduction of established shooting techniques, and smart workflows.

“This is the first product on the market that changes an LED volume from just being an expensive backdrop into something that can be integrated into the storytelling,” the company says. “It enables, for the first time, real interaction between the talent and objects in the virtual scene positioned virtually ‘behind the LED wall’.”

The company is also offering VP Pro XR, described as the first LED wall XR solution purpose-built for cinematic and broadcast use. VP Pro XR “uniquely” minimises the delay inherent in XR volumes, while enabling features such as Cinematic XR Focus, which allows a focus pull between real and virtual elements, and the new NearTime rendering solution.

NearTime meets the key requirement of cast and crew seeing the full effect of the shot on set in real time, and delivers a higher-quality version of the shot, fully automated, in a timescale that matches the practical requirements of the production.

“Winner of the HPA Engineering Excellence 2021 award, this solution is cost-effective, uncompromising in quality and timely without the huge overheads of real-time augmented reality. NearTime draws on the proven Mo-Sys expertise in camera tracking and live compositing, delivering a complete system in partnership with the AWS Media and Entertainment team.” 

Many live event companies have switched to ‘hybrid studio’ concepts. “This is a good way to use virtual sets while maintaining many of the features of a hard set,” explains Nalin Mishra of WASP3D. “These studios combine green screen areas with hard set elements and are a cheap and flexible solution. Current set design aesthetics often require big-ticket items such as LED walls to display large graphics. This look can be easily mimicked using combinations of hard sets and green screens at a much lower cost.”
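The green-screen half of a hybrid set rests on chroma keying: pixels where green clearly dominates are replaced with the virtual background. As a rough illustration only (not WASP3D’s keyer - production keyers generate soft alpha mattes, suppress green spill and handle semi-transparent edges), a minimal hard-key composite might look like:

```python
import numpy as np

def chroma_key(fg, bg, green_margin=40):
    """Composite a green-screen foreground over a virtual background.

    fg, bg: HxWx3 uint8 RGB images of identical shape.
    green_margin: how much the green channel must exceed the other two
    for a pixel to count as backdrop - an assumed, tunable threshold.
    """
    fg = fg.astype(np.int16)  # avoid uint8 wrap-around when subtracting
    r, g, b = fg[..., 0], fg[..., 1], fg[..., 2]
    # A pixel is keyed (transparent) where green clearly dominates.
    key = (g - np.maximum(r, b)) > green_margin
    out = fg.copy()
    out[key] = bg.astype(np.int16)[key]
    return out.astype(np.uint8)
```

A real keyer would derive a fractional matte rather than the hard on/off mask used here, which is why lighting the green screen evenly matters so much on a hybrid set.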

WASP3D recently launched its 3D Virtual Set Production System. “The cost-effective, ready-to-use custom virtual sets and playout suite lets you realize your vision of a studio with a professional look and feel,” says Mishra. The product, priced at $129 per month, comes with a design application and playout engine, inbuilt virtual camera animations, data-driven AR graphics, and NDI integration capabilities for video conferencing.

“Users can explore endless possibilities with a small green screen set-up,” Mishra says. “You can convert any limited size space to an infinite 360-degree hybrid virtual set environment and capture every minute detail like mirror reflections, shadows and cloth movements. You can enhance a channel’s brand value with custom 3D overlays and unique frames, icons, shapes, patterns, illustrations, textures, images and videos (live/stock footage).” 

When Unreal first broke into the TV world as an engine for virtual productions, it represented such a leap forward in graphics quality that many early adopters treated it as a novelty. “Networks would put on these once-in-a-lifetime events with bespoke designs, high-budget external creative and production teams, and lots of fanfare, but then these systems would sit on the shelf for the other 99% of their programming,” says Carol Bettencourt of Chyron. “That’s not going to drive ROI in the big picture. The demand we see from broadcasters is to demystify the virtual workflow, making it into something their designers, producers, and journalists can leverage to engage their audience every day - while still having the horsepower to execute those amazing one-off shows.”

Enter Chyron’s PRIME VSAR, an Unreal-driven engine which gains new features as part of the PRIME Live Platform 4.3 release. “On top of significant technical improvements - such as support for UHD formats, two cameras per engine, and real-time previews - we’re enriching the Chyron toolbox within Unreal for ultimate usability,” says Bettencourt.

“With this release, PRIME VSAR simplifies design processes with templates for virtual studio screens, weather forecasting, elections, and other data-driven graphics. Crucially, all of this plugs into our template-based CAMIO workflow that producers and journalists can use to customize virtual elements for stories in the MOS rundown - just like any everyday news graphic. PRIME VSAR makes it easy to be creative.”
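The template workflow described here separates design from editorial data: designers fix a layout and name its fields once, and producers supply only the story values from the rundown. The sketch below illustrates that split in general terms - the class and field names are hypothetical, not Chyron’s CAMIO API (real MOS payloads are XML-based and far richer):

```python
from dataclasses import dataclass, field

@dataclass
class GraphicTemplate:
    """Hypothetical data-driven graphic template: the design fixes the
    layout and names the editorial fields; the rundown supplies values."""
    name: str
    required: list
    defaults: dict = field(default_factory=dict)

    def fill(self, story_data: dict) -> dict:
        # Fail loudly if the rundown entry lacks a field the design
        # needs, so a broken graphic is caught before air, not on it.
        missing = [k for k in self.required
                   if k not in story_data and k not in self.defaults]
        if missing:
            raise ValueError(f"story is missing fields: {missing}")
        supplied = {k: v for k, v in story_data.items() if k in self.required}
        return {**self.defaults, **supplied}

# An election lower-third: designed once, filled per story.
lower_third = GraphicTemplate(
    name="election_lower_third",
    required=["candidate", "party", "votes"],
    defaults={"party": "Independent"},
)
```

The point of the pattern is that a journalist never touches the 3D scene - they edit a handful of named fields, and the playout engine binds those values into the virtual element.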

Panasonic suggests the significant increase in demand for virtual sets is partly down to advancements in LED technology making systems more accessible.

“Rental and staging companies have been able to develop their own studios thanks to extensive LED panels for backdrops, and access to lighting and camera technology,” says Ollie Newland, Field Marketing Manager. “The low footprint required to build a virtual studio compared to green screen makes them attractive in the broadcast, corporate and higher education markets in particular.

“Virtual studios can be a seamless experience for presenters with relatively little presentation experience, as visible cues from the LED backdrop allow the presenter to be immersed in the environment with ease, instantly making them feel more comfortable in the surroundings.”

Panasonic’s PTZ cameras and robotic systems support the FreeD protocol, which enables use with most AR and VR graphics engines. Its system cameras house a Super 35mm MOS sensor, which counteracts the potential moiré issues that can otherwise be a problem when using LED backdrops. KAIROS, Panasonic’s ‘next-gen’ live production platform, can sit at the centre of a virtual studio, enabling productions to work with a simultaneous combination of video inputs such as ST 2110, NDI, SRT or baseband SDI. “Such connectivity helps support the move towards true remote production,” Newland says. “KAIROS has a Canvas feature which enables it to output to non-traditional formats such as 32:9 for an LED backdrop with ease, as well as standard resolutions such as HD and UHD.”
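FreeD itself is a simple one-way UDP protocol: each ‘type D1’ packet carries a camera ID, pan/tilt/roll, XYZ position, zoom and focus as big-endian fixed-point fields plus a checksum, which is how the graphics engine knows where the real camera is pointing. A sketch of a parser, assuming the commonly published D1 layout (angles in 1/32768 degree, positions in 1/64 mm) - implementations vary, so check the tracking vendor’s documentation:

```python
def _s24(b: bytes) -> int:
    """Signed 24-bit big-endian integer from 3 bytes."""
    v = (b[0] << 16) | (b[1] << 8) | b[2]
    return v - (1 << 24) if v & 0x800000 else v

def _u24(b: bytes) -> int:
    """Unsigned 24-bit big-endian integer from 3 bytes."""
    return (b[0] << 16) | (b[1] << 8) | b[2]

FREED_D1_LEN = 29  # type + id + eight 3-byte fields + 2 spare + checksum

def parse_freed_d1(pkt: bytes) -> dict:
    """Decode one FreeD 'type D1' camera pose packet."""
    if len(pkt) != FREED_D1_LEN or pkt[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    # Checksum: all 29 bytes must sum to 0x40 modulo 256.
    if sum(pkt) % 256 != 0x40:
        raise ValueError("bad FreeD checksum")
    return {
        "camera_id": pkt[1],
        "pan_deg":   _s24(pkt[2:5])   / 32768.0,
        "tilt_deg":  _s24(pkt[5:8])   / 32768.0,
        "roll_deg":  _s24(pkt[8:11])  / 32768.0,
        "x_mm":      _s24(pkt[11:14]) / 64.0,
        "y_mm":      _s24(pkt[14:17]) / 64.0,
        "z_mm":      _s24(pkt[17:20]) / 64.0,
        "zoom":      _u24(pkt[20:23]),
        "focus":     _u24(pkt[23:26]),
    }
```

In practice these packets arrive over UDP at field rate and drive the virtual camera in the engine; delay calibration between the tracked camera and the render is what keeps real and virtual elements locked together.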

The product’s GUI enables teams with varied levels of experience to deliver engaging content or set up complex systems with limited training time. “This makes it an ideal option for implementing virtual studios in corporate environments, where operators can more often than not be employees with limited experience of professional video technology.”

Incorporating a virtual studio into a broadcast workflow “has enabled remote communication on a different level,” says Reckeen President Adrianna Hebisz. “For example, remote speakers can be incorporated into broadcasts more effectively - not just on a screen but within a virtual studio environment - even if the green screen is located in a different part of the world. Additionally, with 3D virtual studio solutions becoming more accessible, even smaller TV studios can afford to transform their broadcasts into advanced and progressive-looking content, with almost limitless possibilities.”

The technology in this area is advancing to include Augmented and Extended Reality solutions “which is bound to keep shaking things up in the industry even further,” she says. 

Reckeen’s latest addition to its AR solution is intended to create a truly impressive and immersive experience for viewers, as well as being the firm’s most advanced product by far: The 3DS PRO X8 is a self-sufficient workstation designed to handle all of the heavy-duty tasks of a 3D production and a broadcast simultaneously. With a capture card capable of four physical 4K inputs at once, a powerful GPU, dual LAN with 10 Gigabit throughput, NDI 5 support, and more, “the platform has been made with truly demanding customers in mind.” 

“While so-called ‘traditional’ rendering engines have always been adding features, the truth is that game engines arrived and changed the technology panorama,” says João Robalo, Product Manager for Graphics, VR & AR at Vantec. “Unreal Engine is now established in a myriad of TV studios and production houses and is becoming the new standard in VR/AR/XR. It allows the creation of virtual worlds which emulate the real world so well that sometimes it’s difficult to tell where one ends and the other begins. The question ‘is this real or virtual?’ is being posed more and more.”

Vantec says it has been working closely with VR/AR companies that employ Unreal as their 3D engine, building its own solutions on top.

“We offer an all-in-one product called Studio2Go, portable equipment which allows users to quickly start a live production, record and livestream,” Robalo explains. “With the possibility of inserting graphic templates, feeding data from external sources and quickly inserting remote interviewers into the production, it truly is agile and effective equipment for productions on the go.”

dock10 is offering a new solution to broadcasters that enables Augmented Reality (AR) characters to be rendered in real time in both traditional and virtual sets.

The solution involves several firsts, including the combination of full-body motion-capture and facial-capture software for more expressive movement, and the real-time rendering of AR characters into virtual sets. The first client to use the technology is BBC Education, with its AR robot character CLOGS for BBC Bitesize Daily.

Richard Wormwell, Head of Production Innovation, says: “Motion-capture technology is the best way to deliver really dynamic movement with AR characters in real-time. Traditionally, motion capture has been a time-consuming process with lots of clean-up work required in post. Our solution combines real-time full body rendering of characters for an ‘as live’ output. This is something that was simply not possible before.”  

Andy Waters, Head of Studios, adds: “It brings cinematic qualities to television and opens up exciting possibilities for productions in almost any genre. It could completely revolutionise entertainment formats and is perfect for prime-time Saturday night television shows.”

The facility is also investing £1m to meet the increasing demand for virtual studios and remote galleries. The dedicated multipurpose remote gallery works with OBs and connects to the dock10 network. 

Waters adds, “We are keen to work with OB providers and the new gallery is designed to complement their service rather than compete with them. Live sporting events including the FA Cup Final have already been successfully delivered using our remote gallery service.” 

If the last couple of years taught us anything, it is that virtual sets and virtual studios are a powerful and versatile production tool, with a world of possibilities to accommodate every kind of production: from one-off events to the biggest networks in the world.

wTVision has been developing virtual solutions and augmented reality systems for decades and can now offer a wider range of options than ever before. Its control applications and rendering engine are now able to integrate with Unreal Engine.

“Our creative team is able to design and implement a completely virtual environment from scratch and has been developing virtual studios for more than 20 years,” the company explains. 

“This is a perfect combination with our own control application, Studio CG, and rendering engine and allows us to build projects for every kind of TV show.  

“One of our biggest advantages is our capacity to integrate with third-party developers and providers, easily adapting to our partners’ pre-existing technology and production status. In the last couple of years, we’ve been developing different solutions for different content providers and each project is completely designed according to our client’s wishes. Issues like space, workflows, remote or on-premises operation, as well as design and implementation are adaptable and built for every project.”