Virtual Sets, Studios & Software
- With David Kirk
Contributing Editor David Kirk looks at the latest trends in all things virtual...
Virtual set technology has been a routine element of film and television production ever since 1916, when Nashville-born cinematographer Frank Williams filed a patent application for the traveling matte concept. RCA engineer Albert Goldsmith developed the colour separation overlay technique in the mid 1950s, trademarked by RCA as Chroma-Key and first used in 1957 by NBC. The real advance came later with the application of computer-generated background scenery and, more recently, foreground props. The following summary looks at recent developments in the broadcast sector.
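The colour separation overlay idea referenced above can be sketched in a few lines. The following is a toy illustration only (the function name and threshold are assumptions, not any vendor's implementation): a pixel is treated as background when its green channel dominates red and blue. Production keyers work in other colour spaces and produce soft mattes rather than a hard per-pixel switch.

```python
def chroma_key(fg_pixels, bg_pixels, threshold=60):
    """Composite foreground over background, keying out green-dominant pixels.

    fg_pixels, bg_pixels: lists of (r, g, b) tuples of equal length.
    A pixel is replaced by the background when green exceeds the larger
    of red and blue by more than `threshold` (a hard key, for illustration).
    """
    out = []
    for (r, g, b), bg in zip(fg_pixels, bg_pixels):
        if g - max(r, b) > threshold:   # green-dominant: show background
            out.append(bg)
        else:                           # keep the foreground pixel
            out.append((r, g, b))
    return out

# Usage: a skin-tone presenter pixel survives; a green-screen pixel
# is replaced by the (blue) background.
frame = chroma_key([(200, 180, 160), (20, 240, 30)],
                   [(0, 0, 255), (0, 0, 255)])
# frame == [(200, 180, 160), (0, 0, 255)]
```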
Absen’s PR series LED video wall panels can be used for virtual production studio backdrops and ceilings. The panels are available in a choice of sizes (500 x 500mm or 500 x 1,000mm) and pixel pitches (1.5, 1.9, 2.5, 3.9 and 5.2mm). The PR2.5 covers 99.9% of the DCI-P3 colour gamut and can smoothly display HDR video at up to 251Hz frame rate. With just one vertical side lock and one horizontal side lock, installation and disassembly are more efficient than with conventional panels. Curve blocks can be customized at any angle between 0 and 7.5 degrees.
Aximmetry's All-In-One Virtual Production Platform software allows the creation of real-time 3D virtual studio environments. It includes a node-based editor which can be configured for specific projects, including broadcast production and virtual events. Aximmetry also has its own chroma keying technology, included in all its software licenses. The integral Aximmetry Calibrator is designed to make lens and camera calibration easy even when working with game trackers or PTZ cameras.
Version 6 of Brainstorm’s InfinitySet virtual set system “has been developed to take advantage of the most advanced rendering, lighting and material features which the Epic Games’ Unreal Engine 5.3 render engine provides,” says Héctor Viguer, COO and CTO.
“We demonstrated the system at IBC2023, running on just one workstation including the new NVIDIA RTX 6000 Ada Generation GPU. The demonstration included multi-camera production with XR, one of the most demanding, and demanded, features in virtual production. This can be achieved using existing and common technology, like production switchers. We have also announced a dedicated plugin to control Brompton Tessera. InfinitySet 6 is a free update for all clients with a current support contract.” Brainstorm additionally demonstrated an expanded system based on its Edison Pro, which is designed for use in live or online presentations and videos that include keyed live performers in virtual 3D environments. Available for iPhone and iPad Pro, the app gains new features that make it self-sufficient, including support for 3D objects and NDI inputs, allowing it to display such objects and videos in the captured scene and render them directly.
Chyron Prime VSAR integrates Epic Games’ Unreal Engine graphics in a green screen studio, augmented reality elements on a physical set, and tracked virtual set extensions feeding directly into video walls. It includes a library of 3D text and objects with data integration for dynamic graphics. Each constructed scene and model translates into a template-based asset that producers and journalists can use from their newsroom computer system. Operators can access a visual asset library, customise and modify VSAR scenes, render previews of changes and push a real-time scene update through to playout, all from within the newsroom system interface. Each Prime VSAR processor supports two cameras, offering SDI connectivity, HD and 4K-UHD format support as well as high dynamic range. An additional processor can be assigned to generating real-time SDI previews of each virtual scene.

Disguise’s Porta software works with Epic Games’ Unreal Engine to allow creation of AR and xR graphics as well as more traditional motion graphics such as lower thirds and full screens. All graphics can then be controlled from a single interface. Users can also draw on existing graphics templates.
Reckeen offers turnkey products that are designed to allow a single operator to run multi-camera real-time productions. “Our solutions combine all necessary features in one easy-to-operate workstation,” says Adrianna Hebisz, COO. “Customers can choose between the compact Reckeen 3D Studio or the more advanced 3DS Pro x8. The software combines features which you might expect from a video mixer and broadcasting application, plus fully operational virtual production software which lets you build and edit your own virtual and AR sets easily, adding animations, managing virtual lighting and setting paths for the virtual cameras. Combined with our hardware, the products are plug and play solutions for those who want to make stunning content without breaking the bank. Reckeen XR allows operators to make trackless productions, as well as using PTZ tracking and augmented reality in a single production session. The system is equipped with virtual cameras which can be moved around the whole virtual set at any angle without limitations, imitating a camera crane or a manual operator’s movements. Pan/tilt/zoom functions give a presenter the ability to move around a green screen more freely. A shadow processor generates real-time shadows and reflections of an on-screen presenter, who can also walk around 3D objects located in the virtual set.”
Vizrt Viz Arc 1.9 adds new features to the company’s virtual set and AR graphics system. These include the ability to load and unload graphics and toggle their visibility within a scene during play. Viz Arc combines object tracking and optical character recognition to read, convert and compose data into text graphics. A new Elgato Stream Deck plug-in allows users to map actions from the Viz Arc GUI directly to a Stream Deck console button. The plug-in supports multiple Stream Deck profiles, each with its own 10-page set of buttons and dials.
Viz Arc includes control of Vizrt’s keyer which allows users to immerse the presenter into a virtual environment. The toolset includes a Color Picker to select a sample from the set or location, as well as various fine-tuning adjustments to blend the AR and virtual set backgrounds. Viz Arc’s integration with hardware devices such as Elgato’s Stream Deck and the Loupedeck Live Console allow fine-tuning adjustment. Other capabilities include light-wrapping, a de-noiser and multiple mattes. A calibration feature enables operators to place and present AR graphics quickly.
vMix offers live video production and live streaming software allowing creation, mixing, switching, recording and live streaming of productions on a Windows PC or laptop. It can operate with sources such as cameras, video files, NDI, SRT, virtual sets, text and audio. New features in the vMix 26 release include input effects, stream delay and simultaneous support for horizontal and vertical aspect ratios. vMix can now receive multiple tracks of audio; these are combined and made available to the input’s channel matrix and mixer. On the output side, support has been added to split a single multi-channel track into multiple stereo audio feeds. SRT support has been added to instant replay; CBR support has also been added to SRT.
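The multi-channel split described above can be pictured as breaking one interleaved track into consecutive left/right pairs. This is an assumed sketch of the general technique, not vMix's actual code; the function name and the pairing of adjacent channels are illustrative choices.

```python
def split_to_stereo(samples, num_channels):
    """Split a flat interleaved buffer [ch0, ch1, ..., chN-1, ch0, ...]
    into stereo feeds, each a list of (left, right) sample tuples.
    Channels are paired in order: (0,1), (2,3), and so on."""
    if num_channels % 2:
        raise ValueError("need an even channel count for stereo pairs")
    # Regroup the flat buffer into per-sample frames of N channels.
    frames = [samples[i:i + num_channels]
              for i in range(0, len(samples), num_channels)]
    feeds = []
    for pair in range(num_channels // 2):
        feeds.append([(f[2 * pair], f[2 * pair + 1]) for f in frames])
    return feeds

# Usage: one 4-channel frame yields two stereo feeds of one frame each.
feeds = split_to_stereo([0.1, 0.2, 0.3, 0.4], 4)
# feeds == [[(0.1, 0.2)], [(0.3, 0.4)]]
```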
Wasp3D's Virtual Set Studio system comprises a series of products ranging from the Trackless Virtual Set Studio to the tracked Virtual Set Studio. The system allows the creation of a garbage matte layer over a composite output, masking out any unnecessary physical studio detail beyond the chroma area. This allows producers to simulate very large 3D spaces even when using a small green/blue chroma-key wall. The Tracked Virtual Set Studio can ingest tracking data from various camera sensing devices and lenses into 3D Virtual Set Studio scenes. The Wasp3D Drone Designer has an inbuilt virtual camera which accepts pan, tilt, zoom and focus data from the physical camera and adjusts the perspective to match the 3D virtual environment as the physical camera moves. Virtual Set Studio can also be configured as a Trackless Virtual Set Studio: a live shot of an anchor against a chroma-key screen can be texture-mapped onto a surface, placed within a 3D environment in the Drone Designer and keyed in real time. Multiple virtual cameras can be set up and operated using action-set buttons or salvo buttons.
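The garbage-matte idea described above can be sketched as a mask applied before the chroma decision: anything outside the hand-drawn region is forced to background regardless of colour, so a small green wall can stand in for a much larger space. This is an illustrative sketch under assumed names, not Wasp3D's implementation.

```python
def apply_garbage_matte(fg_pixels, matte, bg_pixels, key_fn):
    """Composite with a garbage matte applied ahead of the chroma key.

    fg_pixels, bg_pixels: lists of (r, g, b) tuples of equal length.
    matte: list of bools, True where physical studio detail outside the
    chroma wall must be masked out regardless of its colour.
    key_fn: predicate deciding whether a remaining pixel is keyed out.
    """
    out = []
    for fg, masked, bg in zip(fg_pixels, matte, bg_pixels):
        if masked:
            out.append(bg)  # outside the chroma wall: always background
        else:
            out.append(bg if key_fn(fg) else fg)  # normal chroma decision
    return out

# Usage: a grey light stand outside the green wall is masked even though
# it is not green; the green pixel inside the wall is keyed normally.
is_green = lambda p: p[1] - max(p[0], p[2]) > 60
out = apply_garbage_matte([(90, 90, 90), (20, 240, 30)],
                          [True, False],
                          [(0, 0, 255), (0, 0, 255)],
                          is_green)
# out == [(0, 0, 255), (0, 0, 255)]
```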
Zero Density describes its Reality 5 as a virtual production platform that has been redeveloped to offer dynamic visuals, data-driven graphics and seamless workflow integration. “With the introduction of our Reality 5, we are tapping into the cutting-edge features of Unreal Engine 5.3,” says Amir Hochfeld, Chief Product Officer.
“Through the convergence of Reality 5, Unreal 5.3 and our Traxis Talent Tracking, our clients can fully harness Unreal-native capabilities, including a collision control mechanism that allows the talent to interact with virtual objects, all while enhancing realism with dynamic shadows and reflections. New architecture lowers RAM usage and boosts render process efficiency by 40%. Total delay from input to output is just five frames. New versions of Unreal Engine are supported within two weeks of release. The Reality Hub system controls all of our graphics platforms, giving operators confidence, flexibility and control. It also saves time and reduces errors by integrating with newsroom systems, studio automation and other data providers to streamline data handling processes.”
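A five-frame pipeline delay means different wall-clock latencies at different frame rates, which is worth keeping in mind when comparing systems. A quick conversion (a trivial sketch; the function name is our own):

```python
def frames_to_ms(frames, fps):
    """Convert a delay expressed in frames to milliseconds at a given rate."""
    return 1000.0 * frames / fps

# Five frames of delay at common broadcast rates:
for fps in (25, 30, 50, 60):
    print(f"5 frames at {fps} fps = {frames_to_ms(5, fps):.1f} ms")
# At 50 fps, for example, five frames is 100.0 ms.
```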