Immersive Real-Time Production

  • By Adrian Pennington

Affordable virtual production techniques and technologies are enabling augmented and mixed reality presentations in broadcast, reports Adrian Pennington…

The pandemic has prevented many creative directors from making content in the traditional way, forcing them to explore alternatives. This has coincided with a coming of age for the next generation of virtual set technology, with increasingly powerful photorealistic real-time rendering. The key drivers towards adoption are space, budget, and participant location.

“Increased virtual set utilisation has taken on two forms,” says Liam Hayter, Senior Solutions Architect, NewTek. “The first is bringing remote participants via chromakey into a virtual environment, especially where presenters and guests alike are unable to travel to a studio. The second is for a set extension from smaller studio spaces, where budget and space are at a premium but the look and feel of a full studio environment is desired.”

He adds, “AR/MR particularly come into their own for remote or at-home audiences, where a live event space can equally be expanded. Whether this is broadcast or streamed is immaterial. Combining camera and robotics tracking data with on-stage or on-set video walls and virtual set extensions provides visually engaging, dynamic content.”

Demand for virtual sets is soaring across all markets, including broadcast, corporate communications, advertising and events. Using LED screens rather than greenscreen enables realistic reflections and refractions that actors and camera operators can easily see and explore.

“Allowing the performer or presenter to see what’s happening around them instead of just a blank screen located behind them, creates a more natural flow to a production,” says Lanz Short, Technical Solutions Manager, disguise. “[LED volumes] also remove the need for chroma keying, which can be challenging when working in a small studio, and people can wear any colour clothing when on set.”

The disguise xR workflow has undergone intensive field testing and refinement in collaboration with creative and technical partners. The recent public release of the software is claimed by disguise to “significantly reduce the barrier of entry to cutting-edge productions, unlocking the power to create any world, from one location.” The r18 release will also debut a cluster rendering feature for delivering photorealistic scenes to large-scale content displays, plus support for the ACES colour management standard. “It unlocks the limitations of a virtual production studio by scaling out real-time content up to an unlimited capacity,” Short says.

White Light has incorporated disguise technology into its SmartStage system, which saw a surge in demand for corporate and education use when the pandemic hit. The company has managed more than 500 hours of live corporate presentations from a SmartStage at the Mermaid centre in Blackfriars since spring 2020.

Technical Solutions Director Andy Hook says, “With this technology you can broadcast to a remote audience, globally, in a more compelling way than traditional video conferencing. Everyone is turning into a broadcaster.”

White Light made a broadcast version for Eurosport, deployed at this summer’s Tokyo Olympics. This features ‘teleportation’, in which athletes are filmed live at the venue and displayed in full-body 3D as if standing next to the presenter in the virtual studio. The technique has also attracted corporates keen to ‘teleport’ CEOs and keynote speakers into the virtual space.

Enabling this are remote-controlled pan-tilt-zoom cameras (and ‘virtual’ PTZ cams). The application of tracking data to PTZ cams looks set to democratise the use of AR for broadcast.

Panasonic’s virtual production studio concept pulls together a number of leading technologies, including its own PTZs. The AW-UE150 PRO PTZ remote camera is claimed to be the first robotic camera on the market to provide Position Data Notification (PTZF). Its suitability for virtual sets is one of several features aimed at robocam applications, alongside a wide angle of view, 4K/60p video and versatile video outputs.

Accurate camera positioning data is imperative for incorporating realistic VR or AR studio sets into live productions. Panasonic says the AW-UE150 supports the FreeD protocol, which outputs PTZ and iris information via serial (RS-422) or IP (UDP) directly to a tracking system. FreeD sends camera positioning data straight from the camera to a virtual production system and is supported by vendors of VR/AR and virtual set solutions including Brainstorm eStudio, The Future Group Pixotope, Vizrt Viz Virtual Studio, Ross Xpression and Zero Density. “It enables productions to incorporate realistic virtual studio sets and elements into their live video workflow without the need for additional sensors or encoders,” Panasonic says.
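For a sense of what that tracking data looks like on the wire, the sketch below listens for FreeD ‘Type D1’ packets over UDP and decodes them into angles, position and lens values. It assumes the commonly documented 29-byte D1 layout; the port number and field scaling are assumptions to be verified against the camera and tracking system documentation.

```python
# Minimal FreeD 'Type D1' listener: a sketch assuming the commonly
# documented 29-byte packet layout. Port and field scaling are
# assumptions; check them against the camera's own documentation.
import socket

FREED_PORT = 40000   # hypothetical; set to the port configured on the camera

def _signed24(b: bytes) -> int:
    """Decode a 24-bit big-endian two's-complement integer."""
    value = int.from_bytes(b, "big")
    return value - (1 << 24) if value & 0x800000 else value

def parse_d1(packet: bytes) -> dict:
    """Unpack one D1 packet into angles (degrees), position (mm) and lens data."""
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a FreeD D1 packet")
    return {
        "camera_id": packet[1],
        "pan":   _signed24(packet[2:5])   / 32768.0,  # degrees
        "tilt":  _signed24(packet[5:8])   / 32768.0,
        "roll":  _signed24(packet[8:11])  / 32768.0,
        "x":     _signed24(packet[11:14]) / 64.0,     # millimetres
        "y":     _signed24(packet[14:17]) / 64.0,
        "z":     _signed24(packet[17:20]) / 64.0,
        "zoom":  int.from_bytes(packet[20:23], "big"),  # raw encoder counts
        "focus": int.from_bytes(packet[23:26], "big"),
        # bytes 26-27 are spare/user data, byte 28 is a checksum (ignored here)
    }

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", FREED_PORT))
while True:
    data, _ = sock.recvfrom(64)
    try:
        print(parse_d1(data))
    except ValueError:
        continue   # ignore malformed or non-D1 traffic
```

A render engine would apply each decoded pose to its virtual camera every frame, which is why FreeD-capable PTZ heads can drive AR graphics without external encoders.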

Many vendors incorporate Unreal Engine into their virtual set solutions, but Zero Density claims to have been the first. That was back in 2016, when ZD released its disruptive product Reality Engine, a real-time node-based compositor, together with Reality Keyer, its proprietary keying technology.

Since then, Reality has transformed broadcast and powered numerous live events, esports, commercials and episodic TV. A flagship user is Turkish culture and arts channel TRT2, which went live with the system in 2019. Switching from one virtual design to another with an entirely different lighting setup takes less than a minute, as each program can be saved as a graph and loaded when needed.

While Unreal Engine is the renderer, Reality’s live production toolset offers photorealism, sophistication and ease of use for virtual studio and AR. The software’s pipeline is designed to achieve the perfect blend of the virtual world with the physical, and is effective at handling tracking data, keying and intuitive control tools, according to ZD.

Vantec, part of the Danmon group, made its name creating camera and object tracking systems under the stYpe brand and now has its own rendering solution. Released in 2019, StypeLand builds on the immense rendering capabilities of Unreal Engine, adding a framework for live work and post production, and is gaining traction among broadcasters and film companies.

In 2020 stYpe introduced GreenKiller, its proprietary chroma keyer, which accounts for much of the buzz around StypeLand. GreenKiller excels at preserving natural shadows, reflections and hair detail. A further major version of GreenKiller arrived in March this year, which according to the company “makes GreenKiller and StypeLand one of the most wanted green screen workflows in the industry.”

It elaborates, “StypeLand is now no longer just a plugin for Unreal for using stYpe products; even clients using other camera tracking products are using StypeLand and GreenKiller in their workflows.” All of the components typically required in a broadcast setup are integral to the StypeLand workflow: camera tracking, AR, VR, LED wall control over nDisplay, set extensions, redundancy engines with disaster recovery, and centralized control of all engines and scenes from a single PC, tablet or mobile phone.
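GreenKiller itself is proprietary, but the general principle behind keyers that hold on to shadows and hair detail is a continuous alpha matte rather than a hard binary cut. The NumPy sketch below is a generic illustration of that idea, not stYpe’s algorithm; the lo/hi thresholds are tuning assumptions.

```python
# Illustrative soft chroma key in NumPy: a generic continuous matte that
# keeps soft shadows and fine edge detail, not stYpe's GreenKiller algorithm.
import numpy as np

def soft_green_key(rgb: np.ndarray, lo: float = 0.05, hi: float = 0.25) -> np.ndarray:
    """Return an alpha matte in [0, 1] for a float RGB image of shape (H, W, 3).

    'Greenness' is how far G exceeds the larger of R and B; the lo/hi
    thresholds ramp alpha smoothly, so shadows and hair strands receive
    partial transparency instead of being clipped to 0 or 1.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    greenness = g - np.maximum(r, b)
    t = np.clip((greenness - lo) / (hi - lo), 0.0, 1.0)
    return 1.0 - t * t * (3.0 - 2.0 * t)   # smoothstep falloff

def suppress_spill(rgb: np.ndarray) -> np.ndarray:
    """Clamp green to the average of red and blue to remove green bounce."""
    out = rgb.copy()
    out[..., 1] = np.minimum(out[..., 1], (out[..., 0] + out[..., 2]) / 2.0)
    return out

# Composite foreground fg over virtual background bg with the soft matte:
# alpha = soft_green_key(fg)
# comp = alpha[..., None] * suppress_spill(fg) + (1 - alpha[..., None]) * bg
```

Production keyers add far more (edge blending, despill refinement, garbage mattes), but the continuous matte is the reason shadows survive the key.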

Brainstorm is a leading manufacturer of real-time 3D graphics and virtual studio solutions. Its InfinitySet provides AR and virtual set applications in combination with technology such as PTZ cameras: InfinitySet can receive video and tracking information from Sony BRC X400 and X1000 cameras and render the virtual scene or AR objects in real time, using photorealistic rendering and Unreal Engine.

Florida-based MIG, for example, has been using a Brainstorm system for live-streamed, remotely produced events over the past year. Its configuration comprises an InfinitySet +Track with Unreal Engine running on an HP Z4 workstation, a camera-tracked jib with Stype RedSpy, and an Ultimatte 12 chroma keyer. A second phase of the installation adds a second InfinitySet workstation, allowing MIG to take the system on the road and produce virtual sets and productions on location.

Brainstorm is also coordinating AdMiRe, a multi-industry R&D project to develop and validate Mixed Reality solutions for TV audiences.

Francisco Ibáñez, R&D Project Manager at Brainstorm, explains, “Currently, TV audiences can only interact with the programmes they are watching through social networks or broadband hybrid TV. AdMiRe will develop a solution to enable home audiences to virtually join the TV show they are watching and interact with presenters and guests in the television studio. The solution will also provide content creators with tools that radically improve the integration of the presenter within hyper-realistic virtual environments and facilitate their interaction with synthetic elements.”

Aximmetry is an all-in-one graphics solution for virtual sets including its own chroma keyer. The Hungarian developer says there’s no need to buy separate modules and extensions for 2D graphics, real-time LED wall control, video wall display, virtual product placement, projection or mixed reality projects since its software includes them all.

“Aximmetry’s highly flexible interactive graphics programming interface enables users to create broadcast-quality content even with just one fixed camera and a gamer’s PC, by constructing interactive scenes and effects using virtual lights, virtual camera movements and AR,” the firm says.

Aximmetry also supports opening and running Unreal Engine projects: the UE4 rendering engine is embedded into Aximmetry’s own user interface.

Content created in Aximmetry can be live-streamed directly to YouTube or Facebook. It also offers solutions for handling real-time audience participation via second screen devices.
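Aximmetry’s encoder is built in, but as a generic illustration of what streaming directly to YouTube involves, the sketch below pushes a program feed over RTMP with FFmpeg. The stream key and input file are placeholders; this is the standard RTMP path, not Aximmetry’s internal mechanism.

```python
# Generic RTMP push to YouTube via FFmpeg (not Aximmetry's internal encoder):
# reads a program feed and sends H.264/AAC over RTMP.
import subprocess

STREAM_KEY = "xxxx-xxxx-xxxx-xxxx"   # placeholder; issued by YouTube Studio
PROGRAM_FEED = "program_out.mp4"     # stand-in for the live program output

subprocess.run([
    "ffmpeg",
    "-re", "-i", PROGRAM_FEED,             # read input at its native frame rate
    "-c:v", "libx264", "-preset", "veryfast",
    "-b:v", "6000k", "-g", "60",           # ~2 s keyframe interval at 30 fps
    "-c:a", "aac", "-b:a", "160k",
    "-f", "flv",
    f"rtmp://a.rtmp.youtube.com/live2/{STREAM_KEY}",
])
```

In practice the input would be the mixer’s live program bus (SDI or NDI capture) rather than a file, but the encode-and-push stage is the same.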

For more complex productions, a Broadcast Edition of the software can integrate any camera tracking device, is capable of receiving depth information, and offers unlimited SDI ports and 4K SDI.

Users include Twenty Studios in Stockholm, HÍR TV in Hungary and central Europe’s Tematic Media Group.

Lisbon-headquartered wTVision is a real-time graphics provider whose virtual solutions add wonder to any live show, whether it be sports, elections, newscasts or entertainment. Its technology can integrate official data into immersive augmented reality graphics, providing relevant information in real time with powerful visual impact.

For the recent parliamentary elections in El Salvador, journalists at Canal 10 were able to show results as they came in with AR graphics, or to explore key data with the help of an interactive touchscreen.

In sports, wTVision joined forces with Mediapro Mexico to deliver tied-to-the-field virtual graphics for the first matches of the Scotiabank Concacaf League in Costa Rica. The company’s AR³ Football software generated the virtual graphics for team and sponsor logos that were broadcast during the live game, integrated into the field.
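wTVision’s AR³ pipeline is proprietary, but the standard way to tie a flat graphic to the pitch is a planar homography: map the graphic’s corners onto four known field points in the camera image. The OpenCV sketch below illustrates the idea; the filenames and pixel coordinates are placeholders, and in a live system the field points would come from camera tracking rather than being hard-coded.

```python
# Generic illustration of tying a graphic to the pitch with a planar
# homography (OpenCV). Filenames and coordinates are placeholders.
import cv2
import numpy as np

logo = cv2.imread("sponsor_logo.png")    # hypothetical graphic asset
frame = cv2.imread("match_frame.png")    # one broadcast frame
h, w = logo.shape[:2]

# Logo corners, and where those corners should land on the field (pixels).
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
dst = np.float32([[420, 610], [880, 590], [930, 700], [380, 730]])

# Warp the logo into the camera's view of the field plane.
H = cv2.getPerspectiveTransform(src, dst)
warped = cv2.warpPerspective(logo, H, (frame.shape[1], frame.shape[0]))

# Blend the warped logo into the frame wherever it has content.
mask = warped.sum(axis=2) > 0
frame[mask] = cv2.addWeighted(frame, 0.3, warped, 0.7, 0)[mask]
cv2.imwrite("frame_with_logo.png", frame)
```

With per-frame camera tracking, the destination points update every frame, which is what keeps the logo ‘stuck’ to the grass as the camera pans and zooms.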

wTVision also designs 360-degree virtual sets, mixing them with different virtual graphics and live videos inputs to create visual impact.

Reckeen makes video production and streaming technologies, with Reckeen 3D Studio its newest product. It is a multi-channel mixer that combines four video sources with computer graphics and media. Each video source can be pre-processed with built-in chroma keyers and then placed in a virtual 3D scene, creating a virtual studio or an advanced composition combining CG and video.

The system works in two independent modes, Reckeen 3D and Reckeen Lite, enabling users to customize the production process.

“The Reckeen 3D Studio package contains all the necessary modules to produce 3D TV content along with all the benefits of today’s real-time generating and editing of 3D graphics,” the company explains. “The main advantage of the 3D package is a possibility to freely operate four independent virtual cameras. You can set the camera at any angle, at any distance from the studio’s objects, while maintaining appropriate positioning of the on-air talent via four independent chromatic keys.”
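Reckeen’s internals are not public, but the underlying camera math for placing a virtual camera at any angle and distance is the standard look-at view matrix. The NumPy sketch below is textbook camera geometry, not Reckeen’s implementation.

```python
# Generic look-at view matrix (right-handed, Y-up): places a virtual camera
# at an arbitrary position facing a target. Textbook math, not Reckeen code.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)) -> np.ndarray:
    """Build a 4x4 world-to-camera matrix for a camera at `eye` facing `target`."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    f = target - eye
    f /= np.linalg.norm(f)                 # forward axis
    s = np.cross(f, up)
    s /= np.linalg.norm(s)                 # right axis
    u = np.cross(s, f)                     # recomputed up axis
    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = s, u, -f
    view[:3, 3] = -view[:3, :3] @ eye      # translate world into camera space
    return view

# Four independent virtual cameras framing the same talent position:
talent = (0.0, 1.6, 0.0)
cams = [look_at(eye, talent) for eye in
        [(3, 1.7, 3), (-3, 1.7, 3), (0, 2.5, 4), (0, 1.6, 2)]]
```

Each of the four mixer channels simply gets its own such matrix, which is why cutting between virtual angles costs nothing compared with moving physical cameras.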

WASP3D offers a broad range of real-time 3D broadcast graphics solutions for virtual sets, esports, elections and news. Its broadcast TV graphics workflow is designed to streamline production and enhance visual quality for publishing across all media platforms. WASP3D points out that multiple virtual cameras, whether static or animated, can be set up within the scene and operated from its software, saving productions the cost of manual camera operation. The software can help convert limited spaces “to an infinite 360-degree HD virtual set environment” by creating “high polygon realistic virtual sets” designed to capture minute details such as mirror reflections, shadows and cloth movement.

Multi-video window simulations can be integrated to add guests from virtual meeting applications like Zoom and Microsoft Teams. Additional live camera inputs are enabled through NDI integration.