Virtual Sets, Studios & Software

Progress in virtual sets, studios & software continues at a rapid pace, as much in the realm of pricing as in technology - by Contributing Editor David Kirk...

The following summary looks at recent developments from some of the leading vendors in this highly competitive sector.

Absen’s PR series LED video wall panels can be used for virtual production studio backdrops and ceilings. The panels are available in a choice of sizes (500 x 500mm or 500 x 1,000mm) and pixel pitches (1.5, 1.9, 2.5, 3.9 and 5.2mm). The PR2.5 covers 99.9% of the DCI-P3 colour gamut and can smoothly display HDR video at up to 251Hz frame rate. With just one vertical side lock and one horizontal side lock, installation and disassembly are more efficient than with conventional panels. Curve blocks can be customised at any angle between 0 and 7.5 degrees.
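As a rough illustration of what that curve range implies (this is basic chord geometry, not from Absen's documentation, and it assumes the quoted angle is the angle between adjacent panels): 500mm-wide panels joined at the maximum 7.5 degrees approximate a circle whose radius follows directly.

```python
import math

def curved_wall_radius(panel_width_mm: float, angle_deg: float) -> float:
    """Radius of the circle approximated by flat panels of the given
    width joined at the given angle (each panel acts as a chord)."""
    half_angle = math.radians(angle_deg) / 2.0
    return (panel_width_mm / 2.0) / math.sin(half_angle)

# 500mm panels at the maximum 7.5-degree curve:
radius = curved_wall_radius(500, 7.5)   # roughly 3.8m radius
panels_for_full_circle = 360 / 7.5      # 48 panels to close the circle
```

At shallower angles the radius grows quickly, which is why gently curved volume walls need so much floor space.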

Brainstorm’s Version 6 InfinitySet virtual set system “has been developed to take advantage of the most advanced rendering, lighting and material features which the Epic Games’ Unreal Engine 5.3 render engine provides,” says Héctor Viguer, COO and CTO.

“We demonstrated the system at IBC2023, running on just one workstation including the new NVIDIA RTX 6000 Ada Generation GPU. The demonstration included multi-camera production with XR, one of the most demanding, and demanded, features in virtual production. This can be achieved using existing and common technology, like production switchers. We also announced a dedicated plugin to control Brompton Tessera.

InfinitySet 6 is a free update for all clients with a current support contract.” Brainstorm additionally demonstrated an expanded system based on its Edison Pro, which is designed for use in live or online presentations and videos that include keyed live performers in virtual 3D environments. It gains new features that make the application self-sufficient, including support for 3D objects and NDI inputs, allowing it to display such objects and videos directly in the captured scene, rendered in real time.

Chyron’s Prime VSAR integrates Epic Games’ Unreal Engine graphics in a green screen studio, augmented reality elements on a physical set, and tracked virtual set extensions feeding directly into video walls. It includes a library of 3D text and objects with data integration for dynamic graphics. Each constructed scene and model translates into a template-based asset that producers and journalists can use from their newsroom computer system. Operators can access a visual asset library, customise and modify VSAR scenes, render previews of changes and push a real-time scene update through to playout, all from within the newsroom system interface. Each Prime VSAR processor supports two cameras, offering SDI connectivity, HD and 4K-UHD format support as well as high dynamic range. An additional processor can be assigned to generating real-time SDI previews of each virtual scene.

Disguise’s Porta software works with Epic Games’ Unreal Engine to allow creation of AR and xR graphics as well as more traditional motion graphics such as lower thirds and full screens. All graphics can then be controlled from a single interface. Users can also draw on existing graphics templates.

Pixotope has introduced what it claims are industry-first zoom capabilities for its through-the-lens (TTL) camera tracking software. “We’re not known to back down from a challenge,” says Gideon Ferber, Senior VP of Product.

“Adding zoom capabilities to markerless TTL camera tracking was thought to be impossible, until now. With the ability to zoom, users experience the creative freedom to capture more dynamic and engaging shots while still enjoying the same streamlined setup and operation that Pixotope is known for. With the introduction of this capability, we’re opening up new creative possibilities and delivering greater operational efficiencies in the process.

Pixotope Fly enables more dynamic shot composition for augmented reality, extended reality and virtual studio workflows with greater simplicity during live broadcast events, whether they’re indoors, outdoors, or in the studio. This new feature enhances content quality and production efficiency with the ability to zoom in and out of a space without physically having to move the camera closer or farther away.

The TTL tracking uses visual feature points like geological formations, tree patterns or architectural details to dynamically anchor the AR graphics and create a virtual point cloud reference system that adapts seamlessly to changing conditions. This effectively eliminates the need for tracking hardware, such as physical markers, camera and lens encoders, and the associated costs that come with them. As a result, there is a notable reduction in both complexity and cost, increasing overall efficiency and accessibility for broadcast operations of all sizes while providing unparalleled creative freedom with a level of accuracy and flexibility that was previously unattainable.”
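Pixotope's algorithms are proprietary, but the underlying idea of recovering zoom and pan from matched feature points can be sketched generically. The following is an illustrative least-squares fit under a deliberately simplified model (uniform scale plus 2D translation between frames), not Pixotope's actual method:

```python
import numpy as np

def fit_zoom_and_pan(pts_prev: np.ndarray, pts_curr: np.ndarray):
    """Least-squares fit of pts_curr ~ scale * pts_prev + translation.
    Each array is (N, 2): image coordinates of matched feature points.
    The recovered scale behaves like a zoom factor between frames."""
    c_prev, c_curr = pts_prev.mean(axis=0), pts_curr.mean(axis=0)
    p, q = pts_prev - c_prev, pts_curr - c_curr   # centre both point sets
    scale = (p * q).sum() / (p * p).sum()         # optimal uniform scale
    translation = c_curr - scale * c_prev
    return scale, translation

# Synthetic check: feature points zoomed by 1.2x and panned by (5, -3) px
rng = np.random.default_rng(0)
prev = rng.uniform(0, 1000, size=(50, 2))
curr = 1.2 * prev + np.array([5.0, -3.0])
scale, t = fit_zoom_and_pan(prev, curr)
```

A production tracker would additionally estimate rotation and lens distortion and reject mismatched points, but the scale term above is the essence of inferring zoom without a lens encoder.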

Ross Video’s Voyager Trackless Studio virtual graphics tool uses Epic Games’ Unreal rendering engine. “It allows photorealistic environments in a single studio or by just using a green screen,” says Manesh Patel, Manager, Product Management and Business Development, Virtual Graphics. “It is designed for use with a small green screen either in-studio or on location and covers a wide range of workflow types.” HD/UHD/4K production, 12G and IP workflows, HDR and wide colour gamut are supported.

Also offered by Ross Video, Vision[Ai]ry Facial Tracking (Ft) uses video analytics to detect, locate and track the position of faces within the video stream directly from a camera. These facial positions drive the pan, tilt and zoom axes of the robotic camera system to maintain the desired framing of the face or faces in the image, eliminating the need for a camera operator to manually adjust the position of the subject. Version 1.3 is designed to provide users with improved workflows, high-quality tracking and better framing. New features include a multi-channel interface, enhanced tracking capabilities, multi-engine support and an auto-reselect feature. The multi-channel interface view offers a configurable grid for up to six channel previews, with each pane providing access to functions such as tracking mode and subject selection. Auto-Reselect allows users to continue searching in automatic subject selection mode after a subject is lost, with an option to set the time before declaring a subject lost.
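The control loop described here — a detected face position driving pan and tilt — can be illustrated with a generic proportional controller. This is a simplified sketch of the general technique, not Ross's implementation; the gain and target framing values are hypothetical:

```python
def ptz_correction(face_box, frame_w, frame_h, gain=0.1,
                   target=(0.5, 0.4)):
    """Proportional pan/tilt correction from a face bounding box.
    face_box: (x, y, w, h) in pixels. target: desired face centre as
    a fraction of frame size (slightly above centre for headroom)."""
    x, y, w, h = face_box
    face_cx, face_cy = x + w / 2, y + h / 2
    err_x = face_cx / frame_w - target[0]   # +ve: face right of target
    err_y = face_cy / frame_h - target[1]   # +ve: face below target
    return gain * err_x, gain * err_y       # pan, tilt commands

# Face centred at (1200, 500) in a 1920x1080 frame: pan right slightly
pan, tilt = ptz_correction((1150, 450, 100, 100), 1920, 1080)
```

Real systems add smoothing and dead zones so the robotic head does not chase every small head movement, but the error-times-gain structure is the core of keeping a subject framed automatically.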

Stype offers a range of virtual set related products including the RedSpy camera tracking system, the StypeKit bolt-on mechanical tracking kit for camera cranes, and the Follower motion capture system. StypeKit allows existing cranes from various industry-standard brands to be retrofitted and transformed into virtual production cranes. Each kit consists of sensors mounted on the crane itself; no external sensors, markers or reference points are needed. The system bolts onto key points of the crane, allowing use on different cranes within the same day, and all the components fit into carry-on plane luggage. StypeKit is claimed to work with all rendering engines currently available on the market.

Stype camera trackers use lens calibration within the control console. An automatic aiming and focusing feature allows a crane operator to let go of the joystick and just swing the crane arm. Follower is a motion capture system for film and live broadcasts that tracks cameras, objects and talent simultaneously. A Follower setup consists of four or more infrared witness cameras positioned around the studio and up to 1,000 LED beacons that can be tracked. The system tracks the position and orientation of these beacons to enable functions such as camera tracking or to attach graphics in real-time. Follower’s LED beacons emit unique light patterns or solid light so each beacon is unmistakably identified. The system integrates easily with rendering processors from third-party vendors. Follower’s camera tracking functionality is achieved by attaching Stype Spyder modules to each studio camera. Follower Pen allows an operator to draw three-dimensional images in mid-air, in real-time. MiniBeacon allows live interaction with the 3D drawings by using Bluetooth technology to communicate with a phone or tablet.
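The beacon-identification idea — each LED emitting a unique temporal light pattern so it is unmistakably identified — can be sketched generically. This is an illustrative scheme, not Stype's actual coding: observe each blob's on/off state over a window of frames and match the bit sequence against a table of known beacon codes.

```python
def identify_beacon(observed_states, beacon_codes):
    """Match an observed on/off sequence (one 0/1 per frame) against
    known beacon blink codes. Returns the beacon ID, or None if the
    sequence matches no registered beacon."""
    for beacon_id, code in beacon_codes.items():
        if tuple(observed_states) == code:
            return beacon_id
    return None

# Hypothetical 4-frame blink codes for three beacons:
codes = {1: (1, 0, 1, 0), 2: (1, 1, 0, 0), 3: (0, 1, 1, 1)}
which = identify_beacon([1, 1, 0, 0], codes)   # identifies beacon 2
```

With n frames per code window, 2^n distinct IDs are available in principle, which is how a system can distinguish hundreds of beacons using only the witness cameras' video.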

Vizrt has introduced version 5.2 of its Viz Engine 5 graphics platform. “With this latest iteration comes Adaptive Graphics, a revolution for real-time graphics enabling graphic artists to create once and publish multiple times,” says Kavita Taneja-Jhalla, Channel Demand Generation Manager. “It saves time, reduces errors and improves the quality of production. Viz Engine 5 also offers advanced integration with Epic Games’ Unreal Engine 5, making it possible to blend the two render paths into one graphics workflow. This update delivers various upgrades to the Viz Engine Renderer, including material alpha mask support, texture renderer/substance support across scenes, and transparent shadows. In addition, Viz Engine Renderer pipeline support has been added to the Viz Arena image-based AR graphics and virtual advertising solution.

A new ‘superchannel’ enhancement to Viz Multiplay allows use of 16 superchannels (32 subchannels of each type), plus improvements to the transition shader feature. 

Viz Engine 5.2 also adds enhanced video wall support, including for virtual windows and curved video walls. New colour correction capabilities improve set extensions.”

Zero Density’s Reality5 uses the latest version of Epic Games’ Unreal Engine 5 to create hyper-photorealistic virtual environments and graphics in real-time. “Zero Density’s tools blend the physical and virtual worlds to create true-to-life visuals that are indistinguishable from reality,” says Ofir Benovici, Zero Density CEO.

“The complexity of managing multiple tracking signals and lens data can be a pain. Traxis Camera Tracking provides rock-solid performance with continuous recalibration to ensure uninterrupted, real-time tracking. A markerless, stereoscopic multi-talent tracking system, it identifies talents within a 3D virtual environment.”
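Stereoscopic tracking of the kind described ultimately rests on triangulation. As a textbook illustration (standard stereo geometry, not Zero Density's implementation): for a rectified camera pair, a point's depth follows from its disparity via z = f·B/d.

```python
def stereo_depth(focal_px: float, baseline_m: float,
                 x_left: float, x_right: float) -> float:
    """Depth of a point seen by a rectified stereo camera pair.
    focal_px: focal length in pixels; baseline_m: camera separation
    in metres; x_left/x_right: the point's horizontal image
    coordinates in each camera."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("point must have positive disparity")
    return focal_px * baseline_m / disparity

# 1000px focal length, 0.5m baseline, 20px disparity:
depth = stereo_depth(1000, 0.5, 960, 940)   # 25.0 metres
```

The same relationship explains why markerless talent tracking needs accurate calibration: a one-pixel disparity error at long range shifts the computed depth by metres, which is where the continuous recalibration mentioned above earns its keep.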