Practical solutions for synthesising media

  • With Adrian Pennington

Virtual production adoption has accelerated since the pandemic, and it is now a widely used approach that spans not just virtual sets but also LED-based XR and other technologies - by Adrian Pennington

The world of virtual production is constantly changing as existing technology evolves and new solutions hit the market. Using techniques such as game engines and LED volumes, virtual production enables users to tell better stories, grow viewership, increase revenue, and reduce costs.

Ross Video highlights Extended Reality (XR), a virtual production technique built around an LED volume that lets the talent see the environment they are meant to be interacting with. “XR continues to gain traction by delivering immersive and innovative experiences in reality TV shows, music performances and live sports,” says Manesh Patel, Product Management & Business Development for Virtual Graphics.

“Ross Virtual Solution has led the way with a suite of technologies designed to make it easy for production teams to deploy complex workflows. The Voyager render platform allows operators to create stunning virtual environments while simplifying live workflows. When combined with the Lucid Studio control platform, users get the entire workflow under one control umbrella, including LED production, rendering engine, control surface, and tracking. 

“For those looking to break into virtual production for the first time, an all-in-one solution is a great entry point. The Voyager Trackless solution uses the power of the Unreal Engine to provide photorealistic yet easy-to-operate virtual environments.  

“The modern newsroom requires any graphics workflow to integrate seamlessly into the existing MOS environment. Indeed, journalists expect to continue using their existing story-creation tools while graphics elements that enhance the story are added to the final presentation. Ross’s solution allows for that seamless MOS-based integration and lets journalists add virtual or augmented reality elements to their stories with ease. Ross Virtual Solution uses the same workflow components as the XPression motion graphics system, guaranteeing tight integration between CG and AR workflows, which results in quicker and easier adoption by newsroom users.”
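To picture the MOS hook described above, the sketch below builds a message in the general shape of a MOS `mosObj`, by which a graphics device announces an object (here, an AR element) to the newsroom computer system. The IDs, slug, and field values are invented for illustration and are not specific to Ross’s implementation:

```python
import xml.etree.ElementTree as ET

def build_mos_obj(mos_id: str, ncs_id: str, obj_id: str, slug: str) -> str:
    """Build a minimal MOS-style mosObj message announcing a graphics
    object (e.g. an AR element) to the newsroom computer system."""
    mos = ET.Element("mos")
    ET.SubElement(mos, "mosID").text = mos_id      # graphics device ID (illustrative)
    ET.SubElement(mos, "ncsID").text = ncs_id      # newsroom system ID (illustrative)
    obj = ET.SubElement(mos, "mosObj")
    ET.SubElement(obj, "objID").text = obj_id
    ET.SubElement(obj, "objSlug").text = slug      # human-readable name shown to journalists
    ET.SubElement(obj, "objType").text = "STILL"
    return ET.tostring(mos, encoding="unicode")

message = build_mos_obj("GFX.XPN.MOS", "NCS.NEWSROOM", "GFX0001",
                        "Election AR lower third")
print(message)
```

In a real workflow the newsroom system would store this object and let a journalist drop it into a running order, with the graphics device later receiving the story containing it.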

This year disguise addressed a significant challenge curbing the widespread use of virtual sets and real-time graphics in broadcast. Niki Whittle, Head of Broadcast Solutions EMEA, explains: “Even with the multiple benefits that these technologies offer, broadcasters still struggle to integrate virtual sets into their existing broadcast workflows - often needing to hire additional programmers to plan content in advance.

“Off the back of our successful acquisition of Polygon Labs, we launched a suite of new tools to allow for better integration of real-time data and graphics into existing broadcast workflows. 

“These included our Porta 2.0 software together with px hardware, which are engineered to power vanilla Unreal Engine graphics and enable broadcasters to seamlessly create, control, and collaborate on real-time graphics without additional programming.”

With Porta, Whittle explains, team members can easily edit and control dynamic content on the fly, create playlists, and modify Unreal Engine parameters such as light values, as well as trigger disguise’s software timeline cues to synchronise physical and virtual lights. This provides more flexibility, allowing creatives to build templates of Unreal Engine graphics without the need for Blueprints or coding.

Porta and px also give production crews access to a centralized data aggregation, curation, and playout control platform known as Ipsum, which enables integration with third-party APIs, databases and files for real-time graphics.  
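The heart of a data-aggregation-to-graphics platform like the one described is a mapping from feed records to the named parameters of a template. The sketch below shows the idea only; the feed shape, template name, and field names are hypothetical and are not the actual Ipsum API:

```python
import json

def to_graphics_payload(record: dict) -> dict:
    """Map an aggregated data record (e.g. from a sports feed) onto the
    named parameters of a real-time graphics template."""
    return {
        "template": "score_bug",           # hypothetical template name
        "fields": {
            "home_name":  record["home"]["name"],
            "home_score": str(record["home"]["score"]),
            "away_name":  record["away"]["name"],
            "away_score": str(record["away"]["score"]),
        },
    }

# A record as it might arrive from a third-party API (invented data).
feed = json.loads('{"home": {"name": "ARS", "score": 2},'
                  ' "away": {"name": "CHE", "score": 1}}')
payload = to_graphics_payload(feed)
print(payload["fields"]["home_score"])  # "2"
```

The value of centralising this step is that the render engine only ever sees clean, named fields, regardless of which API, database, or file the data originated from.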

“Porta's real-time data-driven graphics workflow significantly accelerates xR and AR broadcast productions.

“Anyone, regardless of skill level, can now work on an extended reality production collaboratively, making everyday Unreal Engine graphics a simple operation for all tiers of production.”

Premier League Productions recently selected AE Live to manage the build and integration of a virtual studio for all its studio programming, supporting the 380 matches it broadcasts live and distributes internationally every year. The new studio had to be flexible enough to support multiple virtual sets to suit different programming requirements and fit within the facility’s existing space at IMG Studios in Stockley Park.

AE Live played an integral role throughout the expansive project, from the concept stage through the build process and final installation to on-air operation.

Thirteen different sets, all housed within a single large-scale, completely virtual ‘Premier League HQ’ were created using Unreal Engine and are output each day using Zero Density’s Reality Engine along with Stype’s RedSpy camera tracking systems.  

Complex 3D AR graphics are generated within the Unreal/Zero Density workflow, made possible by AE Live’s custom plug-ins for Unreal and its own control software, delivering immersive live graphics for talent analysis of stats, formations, tables, and virtual screens.

“We overcame the challenge of incorporating virtual technology in the existing production space to create a multi-functional studio that can be quickly changed to different virtual environments to be able to handle a packed schedule of weekly shows,” explains Scott Marlow, Head of Virtual Studios. “To avoid delays we provided facilities at our campus for development and extensive testing. Following a comprehensive rehearsal schedule, the studio successfully launched this summer bringing impactful, engaging, and stylish coverage and analysis to viewers around the world.” 

Filmmakers are continually searching for tools that considerably reduce the time and cost of production. With the rise of virtual production, assembling a set in the Unreal Engine editor can be a time-consuming exercise. That’s where ARwall says it can make a difference.

The company provides virtual production supervision and software, offering consulting and turnkey solutions spanning creative, hardware, software, and immersive XR innovation.

“For independent filmmakers especially, the software needs to be as user-friendly, efficient, and reliable as possible,” ARwall tells InBroadcast. “Compared to on-location or physical set shooting, time and money can be saved using virtual sets and backdrops, cutting costs across the board or being able to reallocate those resources into other areas.”   

In a bid to streamline workflows for virtual sets, ARwall recently announced further updates to its award-winning ARFX Pro Plugin, which integrates directly with the latest versions of Unreal Engine.

“The plugin consolidates every virtual production setting from the Unreal Editor into a single UI, while adding easy-to-reach controls for settings like global quality presets, colour management, and even custom hotkeys. Nor are these settings limited to Unreal’s editor mode: users can now customise and save all their settings in-game with the virtual scene running, without having to jump out into edit mode.

“These updates will drastically cut down the time it takes to produce quality content and make changes to virtual sets on-the-fly. One update at a time, ARwall strives to innovate and provide the most accessible solutions for filmmakers.” 

Brainstorm has been pioneering the development of the technology, understanding virtual sets as real-time 3D environments in which the talent can be inserted along with other content such as data-driven graphics.

“3D virtual sets go far beyond chroma key video compositing and have been essential in the development of hyper-realistic virtual production,” says Miguel Churruca, Marketing and Communications Director.

Brainstorm has always been an innovator in virtual production. The company has developed cutting-edge technologies such as TrackFree, TeleTransport, HandsTracking, 3D Presenter and more, giving customers the most advanced tools for any kind of virtual production.

“On top of that, Brainstorm is known for adopting the latest trends in this area, and this year demonstrated how combining LED-based XR and chroma sets is a winning combination when it is done properly and easily for the client,” says Churruca. The company’s flagship product InfinitySet is Unreal Engine native, taking advantage of all the possibilities this game engine provides for content creation, including UE5.  

“And for the things Unreal Engine can’t do by itself, or when it requires complex developments, InfinitySet can make it work along with our own render engine eStudio, especially for data-driven graphics and CG applications using Aston, or for better integration with any kind of broadcast workflow.” 

With virtual production solutions springing up like mushrooms after rain, Reckeen says it is proud to remain “the most comprehensive, accessible and easy-to-use product on the market” for virtual production. For a €30,000 budget, the company offers a full PC workstation with capture and audio cards supporting up to four cameras, lifetime licenses for real-time production and broadcasting software, virtual set editors, camera tracking, and a free library of 3D assets. This all-inclusive product is a solution for local TV studios, smaller production houses, schools, universities and corporate settings.

Adrianna Hebisz, President, Reckeen, explains: “With the newest software version 3.0 we strove to provide the ultimate tools for more convenient import of virtual sets, 3D assets and PBR materials, making editing and customising 3D environments effortless and even faster. Compatibility with libraries such as Adobe Substance 3D, CGaxis, Quixel Mixer and Megascans gives users amazing possibilities to build stunning and impressive virtual sets in an easy way.

“In the upcoming update we are enriching the feature set with dynamic virtual lighting, giving users the freedom to modify and animate virtual lights during live production. We are also extending the list of supported Free-D PTZ cameras to Panasonic and Canon models.

“Reckeen continues to deliver high-quality solutions, and is dedicated to providing easier access to virtual production technologies for everyone, regardless of budget and previous experience.”

Virtual production, including virtual set production, is a mainstay of Bluefish444’s hardware technology, its support for third-party equipment and software, and its diverse customer base. The latest of its SDI and IP video I/O interface cards in the KRONOS range deliver the industry’s most requested features: the highest quality at the lowest latency on the most commonly used interfaces for professional video/audio SDI and SMPTE 2110/2022 IP.

Virtual set production, and the wider virtual production landscape including LED volumes, XR and in-camera VFX, are supported by Bluefish444’s hardware, drivers and SDK, along with support for industry-leading third-party tools including Aximmetry, Brainstorm, ClassX, Foundry, Unity, VICON, Vizrt and, most recently, Epic Games’ Unreal Engine. The Unreal Engine support has been developed by Bluefish444 to bring its lowest-latency technology to this growing industry.

Bluefish444 continues to be regarded as a high-quality, low-latency solution for virtual set and virtual production environments, in addition to many other workflows within the professional, broadcast, film, corporate, education, and government sectors.

Evenly lighting large-scale green screens often poses a challenge, forcing a sacrifice in keying quality when the camera moves. Aximmetry’s brand-new 3D clean plate feature, now in its final test phase, meets this challenge. “Users can simply and quickly record multiple images of their green screen, session by session, and Aximmetry combines these into a model to create a virtual map of the green screen,” explains Orsolya Dormon, COO. “During production, Aximmetry creates a perfect clean plate for every angle, achieving superior keying results in real time even with a moving camera.

“We believe that with camera tracking becoming increasingly widespread, this feature will have the utmost importance for a large number of users and is a true game changer in Virtual Production.” 
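The idea underlying a clean plate, keying by difference against a reference image of the empty screen rather than against a single chroma value, can be sketched in a few lines. This is a single-angle toy version with invented pixel values, assuming NumPy; Aximmetry’s 3D clean plate goes further by reconstructing a plate for every camera angle from its virtual map:

```python
import numpy as np

def difference_key(frame: np.ndarray, clean_plate: np.ndarray,
                   threshold: float = 30.0) -> np.ndarray:
    """Return an alpha matte: pixels close to the clean plate are treated
    as background (alpha 0), everything else as foreground (alpha 1)."""
    diff = np.linalg.norm(frame.astype(float) - clean_plate.astype(float), axis=-1)
    return (diff > threshold).astype(float)

# Synthetic example: an unevenly lit green screen as the clean plate.
clean = np.full((4, 4, 3), (40, 180, 60), dtype=np.uint8)
clean[:, 2:] = (55, 200, 75)          # brighter region (uneven lighting)
frame = clean.copy()
frame[1:3, 1:3] = (200, 150, 120)     # "talent" pixels over the screen
alpha = difference_key(frame, clean)
print(alpha)                          # 1.0 only where the talent is
```

Because the brighter region appears identically in the plate and the frame, it keys out cleanly, which is exactly the advantage over a single global chroma threshold when the lighting is uneven.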

Aximmetry has partnered with Bluefish444 to enable the “lowest latency IO” for 4K SDI and SMPTE 2110/2022 IP interfaces for real-time 3D graphics, virtual set, virtual production, and XR workflows utilising the KRONOS K8 and the forthcoming KRONOS Optikos 3G.

Viz Engine 5 solves the riddle of producing and serving graphics across multiple platforms, according to Mark Gederman, Product Marketing Manager at Vizrt. “Adaptive Graphics intelligently automates the deployment of graphics to various output formats simultaneously.

“Our latest iteration of Viz Engine 5 turns pain into possibility. What once demanded too much time and effort, with a high risk of failure, now opens a new world of possibilities for producers.”

He argues that adaptive graphics saves time, effort, and dramatically reduces workflow complexity. “It's a revolution for real-time graphics, as it lets graphic artists create once and publish multiple times - saving time, reducing errors, and improving production quality. Adaptive graphics also ensure a better look and unified identity across all platforms, protecting the most valuable asset of any media provider: their brand. 

“Viz Engine 5 also offers the most advanced integration with Unreal Engine 5, making it possible to unite the two render paths into a single and powerful graphics workflow that is greater than the sum of its component parts. This blend of capabilities provides artists with previously unimaginable creative options; built from the seamlessly interwoven assets from both render blades, in real-time, in the same scene.” 
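The “create once, publish multiple times” idea can be illustrated with a toy layout function that re-places a lower-third-style element for different output formats. The 1920-wide design reference and the anchoring rule are assumptions made for this sketch, not Vizrt’s actual Adaptive Graphics logic:

```python
def adapt_layout(canvas_w: int, canvas_h: int,
                 element_w: float, element_h: float,
                 margin: float = 0.05) -> dict:
    """Place a lower-third-style element for a given output format:
    anchored bottom-left, scaled relative to canvas width."""
    scale = canvas_w / 1920           # 1920-wide design reference (assumption)
    w, h = element_w * scale, element_h * scale
    x = canvas_w * margin             # proportional left margin
    y = canvas_h - h - canvas_h * margin  # proportional bottom margin
    return {"x": x, "y": y, "w": w, "h": h}

# One design, placed for a 16:9 broadcast output and a 9:16 social output.
for name, (cw, ch) in {"16:9 HD": (1920, 1080),
                       "9:16 vertical": (1080, 1920)}.items():
    print(name, adapt_layout(cw, ch, 800, 120))
```

The point of automating this is that the artist designs the element once, and every output format derives its own placement and scale, which is what removes the per-platform rework the quote describes.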

Gederman continues, “Viz Engine 5’s integration with Unreal Engine 5 drives a smooth virtual ecosystem, to achieve a more engaging and immersive experience for audiences.

“The operator-friendly workflow makes it simple for information to be relayed in a concise way within broadcasts.”