Distributed Live Takes Hold
- By Adrian Pennington
Remote production is growing fast, and more and more events are starting to utilize it. However, traditional workflows are likely to persist because cameras and monitors are still needed locally, even if the processing is in the cloud. Productions are also looking to limit their on-site carbon footprint; the industry is therefore leaning toward a more sustainable hybrid model of half remote, half on-site.
“While remote production has gained serious momentum over the last two years, calling it the new norm would be a bit premature,” says Lawo’s Christian Scheck. “Similarly, SDI and OB trucks are going nowhere anytime soon. Nor do they have to, for almost all remote production setups still involve SDI-based devices which can be controlled in the same way as open-standards IP natives. And new OB trucks are still being built.”
Nonetheless, as one of the companies that pioneered IP-based remote production, Lawo is confident that remote production will one day become the norm.
“An IP network usually involves edge devices that take care of signal ingress and egress (gateways),” he says. “They allow users to mix and match SDI solutions with IP-native devices, the only difference being that SDI signals are converted to IP at one edge and back at the other. Basing operators at the production hub, where they control edge devices stationed on-site, cuts travel costs and allows talented operators to produce more shows.”
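The gateway model Scheck describes can be reduced to a simple invariant: the essence passes through unchanged while only the transport changes at each edge. A minimal sketch, with all class and function names invented for illustration:

```python
from dataclasses import dataclass


@dataclass
class Signal:
    payload: str    # the video/audio essence, untouched by the gateways
    transport: str  # "SDI" or "IP"


def ingress_gateway(sig: Signal) -> Signal:
    """Venue edge: encapsulate an SDI signal for the IP network."""
    assert sig.transport == "SDI"
    return Signal(sig.payload, "IP")


def egress_gateway(sig: Signal) -> Signal:
    """Far edge: convert the IP flow back to SDI for legacy equipment."""
    assert sig.transport == "IP"
    return Signal(sig.payload, "SDI")


# Round trip: SDI at the venue -> IP across the WAN -> SDI at the hub.
camera_feed = Signal("camera 1", "SDI")
at_hub = egress_gateway(ingress_gateway(camera_feed))
```

The point of the sketch is that operators at the hub see the same signal they would see on-site; only the wrapping differs in transit.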
Lawo is still proud of its Mix Kitchen and decentralized audio production approach, but its most groundbreaking development so far has been the release of HOME, which is designed to make IP operation both intuitive and secure.
“Many live event productions have requirements to capture the camera feeds into replay and highlight-package creation workflows,” explains Alan Repech, Director of Marketing, Telestream. “If cameras do not have wireless connections back to the home studio or broadcast centre, capturing content requires systems on site. However, the editing and processing can occur remotely, even to the point of being able to edit growing files only a few seconds delayed from the action. This means creating highlight packages and other derivative content can be done while the live event is in progress.
“In other words, what is often done today with many creatives and a large technical staff on site is now more practical than ever from offsite locations. Some of the remote processing steps may include creating proxies and archive files, frame rate conversion, and HDR-SDR conversion. All of that can be, and is being, done today. In addition, as test, measurement, monitoring, and synchronization solutions increasingly offer fully functional remote user interfaces, the quality assessment of these workflows becomes increasingly practical.”
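The growing-file workflow Repech describes, editing a recording only seconds behind the live action, amounts to repeatedly reading whatever new bytes have landed on disk. A minimal sketch in Python; all names here are invented for illustration:

```python
import os
import time


def follow_growing_file(path, stop_at=None, chunk_size=1 << 20, poll_interval=0.5):
    """Yield chunks of a file that is still being written to.

    A remote editor can consume these chunks only seconds behind the
    on-site recorder. `stop_at` bounds the read for a finished file.
    """
    offset = 0
    while stop_at is None or offset < stop_at:
        size = os.path.getsize(path)
        if size > offset:
            with open(path, "rb") as f:
                f.seek(offset)
                data = f.read(min(chunk_size, size - offset))
            offset += len(data)
            yield data
        else:
            time.sleep(poll_interval)  # wait for the recorder to append more
```

Real systems layer format awareness (e.g. fragmented MP4 or MXF partitions) on top of this loop so that only complete frames are handed to the edit suite.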
Telestream offers both a flexible waveform monitor/network analyzer called PRISM and a product called Inspect 2110, designed for monitoring by exception of ST 2110 video and audio streams. These can be integrated into a complete solution.
“Imagine monitoring all of the most critical contribution and distribution feeds in a live, distributed production environment with the only human effort being to observe a single computer screen,” says Steven Bilow, Product Marketing Manager, Telestream. “Imagine being automatically notified of a PTP timing issue with a distribution stream, being able to explore the issue immediately, and with a single mouse click being able to dive as deeply as you need into every aspect of the ST 2110 streams until you have found and fixed the problem. The result is an environment with lower stress in the already stressful world of live event and sports production. It is that kind of solution-oriented innovation, that thinking about how to make life easier for users, that we have today, and more of it will be continually forthcoming.”
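The monitoring-by-exception pattern Bilow describes, where a single screen stays quiet until something drifts, reduces to checking each stream against thresholds and surfacing only the violations. A toy sketch; the field names and thresholds are invented here and are not Inspect 2110's actual metrics:

```python
from dataclasses import dataclass


@dataclass
class StreamStatus:
    name: str
    ptp_offset_ns: int   # measured offset from the PTP grandmaster
    dropped_packets: int


# Hypothetical thresholds for this sketch; real analyzers expose their own.
PTP_OFFSET_LIMIT_NS = 1_000  # alert if a stream drifts more than 1 microsecond
DROP_LIMIT = 0


def exceptions_only(statuses):
    """Report only streams that violate a threshold, so one operator can
    watch hundreds of feeds on a single screen."""
    alerts = []
    for s in statuses:
        if abs(s.ptp_offset_ns) > PTP_OFFSET_LIMIT_NS:
            alerts.append(f"{s.name}: PTP offset {s.ptp_offset_ns} ns out of range")
        if s.dropped_packets > DROP_LIMIT:
            alerts.append(f"{s.name}: {s.dropped_packets} packets dropped")
    return alerts
```

Healthy streams produce no output at all; the operator's attention is spent only on the exceptions.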
NDI, the group developing the Network Device Interface protocol and part of Vizrt, is bringing new enhancements that will allow even better usability and integration. “We are working on refining the latest version, NDI 5, to enhance the use of audio, NDI Bridge, and switching and routing, while simultaneously addressing the need for flexibility in determining bandwidth, quality, and latency to enable limitless content creation for all markets,” says Michael Namatinia, company President.
NDI points to the use that ATP Media with AWS and Gravity Media made of the signal protocol to complete a large-scale proof of concept virtualised live production for the Rolex Paris Masters. The project saw multiple freelance broadcast experts test various AWS-hosted live production solutions. David Sabine at AWS reported that freelance experts didn’t notice any difference between using cloud-based live production tools and conventional ones.
“To have the time to be able to experiment with combinations of vendors interoperating with each other and to understand the benefits and limitations of the single vendor solutions was invaluable,” said ATP Media CTO, Shane Warden.
Net Insight concurs that cloud-based remote live production and distributed architectures are steadily gaining traction “as a cost-effective alternative to hardware-based on-premise projects,” according to Kenth Andersson, Head of Strategic Alliances.
“Our customers are taking advantage of solutions like our cloud-based media delivery and routing technology Nimbra Edge, using the cloud as the means to deliver live feeds to remote distributed locations. They can connect talent sitting at home, doing the production on low-resolution feeds.”
Nimbra Edge is a cloud-agnostic, multi-cloud and multi-tenant solution that supports major industry standards such as RIST, Zixi and SRT for ARQ transmission. “In short, our solutions offer openness for technology vendors to integrate their products, which extends into a strong overall media ecosystem for the contribution and distribution of media services for customers.”
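RIST, Zixi and SRT all rely on ARQ: the receiver spots gaps in the packet sequence and asks the sender to retransmit just those packets, rather than blindly duplicating the whole stream. The receiver-side gap detection can be sketched as follows (a simplification; the real protocols add timers, round-trip bookkeeping and retry limits):

```python
def detect_gaps(received_seqs, expected_start=0):
    """Return the sequence numbers missing from a received packet stream.

    In an ARQ protocol these gaps become NACKs (retransmission requests)
    sent back to the source, so only the lost packets are resent.
    """
    if not received_seqs:
        return []
    highest = max(received_seqs)
    have = set(received_seqs)
    return [s for s in range(expected_start, highest + 1) if s not in have]
```

Because only the gaps are requested again, ARQ keeps the overhead proportional to the actual loss on the path, which is what makes these protocols practical for live contribution over the open internet.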
Adoption of remote production over the past two years may have been born out of necessity, but as we return to a sense of greater normality, many workflows that have been driven by innovation are here to stay.
“We’ve seen the evangelisation of remote production over the past year, and we’ll continue to see media companies break new boundaries with proven cloud-based workflows in 2022,” says Larissa Görner, Director of Cloud Product Management, Grass Valley. “Despite traditional workflows such as OBs still having a core role in live content creation, the skyrocketing global demand for content in our market means media organisations are turning to new and pioneering production methods to achieve greater scale and agility.”
GV AMPP, the firm’s cloud-native Agile Media Processing Platform, is in constant evolution. On top of core media production workflows within the platform, recent additions include the AMPP Audio Mixer and AMPP Asset Management solutions.
“GV AMPP is at the heart of our GV Media Universe (GVMU) vision,” says Görner, “a digital ecosystem that allows customers to seamlessly combine on-premise, hybrid and cloud technologies from Grass Valley and verified partners to design live production environments to fit their needs.
“While no company can truly claim to be an ‘end-to-end’ partner, GVMU allows us to be as close to that as possible, providing our customers across the media landscape with seamless access to software and hardware technologies.”
“Remote production is not yet the norm, but it is the direction that most of our live event production customers are heading,” says John Schur, President, Solutions Group, Telos Alliance.
“Some of the largest sporting events are being produced almost entirely remotely today, but it’s taking time for others to change over to new platforms and new workflows.”
The Telos Infinity VIP - Virtual Intercom Platform is a fully virtual, cloud-deployable professional intercom system. One of its unique features is the ability to easily integrate with on-prem and remote sites that are using Telos and third-party systems. Infinity VIP is also available as part of the Grass Valley AMPP platform.
French sports broadcaster L’Equipe TV recently launched its OTT platform using a remote production workflow, including a cloud-hosted remote voice-over solution from Broadcasting Center Europe (BCE).
Jérome Aubin, Production Director at L’Equipe, explains, “For us, it is obvious that sports production must reinvent itself. Voice-over is one of those areas. However, we will only move to full remote production when the economic interest is beneficial. To date, however, we cannot really say that this is the case, especially since we mainly produce small sports events.”
Going forward, 2,500 hours of live coverage will be commented using BCE’s remote voice-over solution. “The BCE solution is easy to use. All it takes is an internet connection and a computer. However, to ensure the best sound quality, we decided to add an external sound card. All our commentators have been using the system since the launch of our ‘Live’ platform. Pandemic or not, it was necessary for a channel like ours to find solutions to produce more content while better controlling costs. Remote voice-over is one of the chosen means.”
In addition to sound administration, the cloud remote controller grants access to graphics and titles management. Users can trigger text titles, write the texts during the event or pre-configure the titles in templates. Sign language video feeds can be integrated, and since the solution connects to a webcam or connected camera, users can decide whether to add this view as a picture-in-picture in the live feed.
“In my opinion, remote live TV production is not a replacement for traditional production workflow, but a good complement and opportunity to cover events and activities with limited budgets,” says Igor Vitiorets, CTO at Slomo.tv. “For serious events, the traditional OB and SDI based workflow is more reliable and preferable.”
He makes a convincing argument that for fairly simple broadcasts with lower levels of responsibility and quality requirements, remote production and broadcast automation are suitable at a reasonable cost.
“The situation is different with large-scale and important events, because ‘acceptable’ is not enough. Premier League broadcasts need a large number of cameras with high-magnification zoom lenses, SuperMotion cameras, SpyderCam and cameras on motorized rail systems. This expensive equipment is not permanently installed in the arena and requires on-site set-up.
“They require experienced camera personnel who are able to quickly react to any changes in the game. Any delays in controlling the cameras and receiving the director’s commands are simply unacceptable.”
As a rule, he says, video engineers should also be in the arena to adjust camera settings, such as iris, in real time.
Therefore, a fairly large number of personnel must be on site, while, in theory, the remote production centre can accommodate an audio engineer, broadcast director and replay operators.
“Despite the well-established procedures and ‘standardised’ broadcasts, directors often have to give direct commands to the members of the TV crew. There is a rule of thumb: to comfortably control live processes, the delay should not exceed 300 milliseconds.
“There are also unplanned events that may occur: ‘impossible’ goals, force majeure or conflicts that are of great interest to viewers,” Vitiorets says. “The procedures and algorithm for broadcasting such moments are difficult to formalize and require an instant reaction from the broadcast director.”
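The 300-millisecond rule of thumb Vitiorets cites can be treated as a budget to split across every stage between camera and director. The per-stage figures below are illustrative assumptions for this sketch, not measurements:

```python
LATENCY_BUDGET_MS = 300  # rule of thumb for comfortable live control

# Hypothetical per-stage contributions; real values depend on codec,
# network path and equipment.
stages_ms = {
    "camera capture + encode": 35,
    "contribution network to hub": 40,
    "decode + switcher processing": 25,
    "director's talkback return": 45,
}


def check_budget(stages, budget=LATENCY_BUDGET_MS):
    """Sum the stage latencies and report whether the chain stays
    within the budget for comfortable real-time control."""
    total = sum(stages.values())
    return total, total <= budget


total, ok = check_budget(stages_ms)
```

Under these assumed figures the chain is comfortably controllable; a single slow stage (a long-haul cloud hop, say) can consume the entire budget on its own, which is exactly why directors of large events still prefer crews in the arena.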
It seems a hybrid approach at the venue, combined with decentralized live production, will be the modus operandi for major events for some time.
While remote video production is still not considered the norm, we’ve seen a huge increase since the beginning of the pandemic.
“Previously considered out of reach, remote production has become an achievable deliverable in a short amount of time,” says EVS’ CTO Alex Redfern. “The pandemic has led to an irreversible change in the way broadcasters and media companies create live content, opening the doors to new work practices. A large percentage of organizations are currently undergoing a transition from SDI sources and systems to IP core infrastructures. This transformation, which is driven by the need for greater agility and scalability, also applies to OB trucks.”
Leveraging cloud processing and machine learning, XtraMotion is EVS’ software application that enables the transformation of footage from any camera angle on a production into high-speed video using frame interpolation. As a result, says Redfern, productions can easily increase their super slow-motion coverage without any extra cost and without the need for additional hardware on site. XtraMotion was first deployed as a Proof of Concept (POC) at Super Bowl LIV, in February 2020, after which FOX Sports decided to make XtraMotion integral to its productions.
“It was at the Daytona 500 that XtraMotion truly demonstrated the extent of its storytelling capabilities by allowing viewers to watch super slow-motion replays from the in-car cameras - a first in the history of live sports broadcasting,” says Redfern.
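XtraMotion's ML-based frame interpolation is proprietary, but the underlying idea, synthesizing intermediate frames so that normal-speed footage plays back as slow motion, can be illustrated with naive linear blending. This is a toy stand-in for the real algorithm, which predicts motion rather than cross-fading:

```python
def interpolate_frames(frames, factor):
    """Raise the frame count by `factor` via linear blending.

    `frames` is a list of equal-length pixel rows. Between each pair of
    source frames, `factor - 1` intermediate frames are synthesized, so
    the footage can be played back `factor` times slower at the same
    display rate.
    """
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        for i in range(1, factor):
            t = i / factor  # blend weight for the intermediate frame
            out.append([(1 - t) * pa + t * pb for pa, pb in zip(a, b)])
    out.append(frames[-1])
    return out
```

Linear blending smears moving objects; the value of an ML approach like EVS' is that it estimates where pixels moved between frames, producing sharp intermediates from ordinary-speed cameras.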
Impact of 5G
Of all the technologies likely to transform live remote production, it is 5G that holds the most promise.
Telestream’s Bilow points to mmWave 5G as having the capacity to provide bandwidth up to about 2 Gbps. This makes it possible to support bandwidth-intensive video up to 4K and potentially 8K. It appears that this bandwidth can even support volumetric video streaming on mobile devices.
He adds, “5G currently performs inconsistently in these applications because, as one moves their devices around, there are frequent handoffs between 5G and much lower performance 4G towers, among other things. Furthermore, 5G is very directional so achieving the theoretical bandwidth is difficult because there are rarely line-of-sight connections. This means that there are challenges to overcome. That said, these challenges are the perfect ones for a company with expertise in streaming media and monitoring to address. They are not insurmountable, and we view mobile and remote production over 5G as a realistic path forward.”
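The roughly 2 Gbps figure Bilow cites makes for simple capacity arithmetic: divide the link rate by a feed's contribution bitrate. The bitrates below are illustrative assumptions for this sketch; actual rates depend heavily on codec and quality settings:

```python
MMWAVE_CAPACITY_MBPS = 2_000  # approximate mmWave 5G throughput cited above

# Illustrative compressed contribution bitrates (assumed, not measured).
FEED_BITRATES_MBPS = {
    "HD 1080p": 50,
    "4K UHD": 200,
    "8K": 800,
}


def feeds_per_link(bitrate_mbps, capacity_mbps=MMWAVE_CAPACITY_MBPS):
    """How many simultaneous feeds of a given rate one link could carry,
    ignoring protocol overhead and the real-world variability Bilow notes."""
    return capacity_mbps // bitrate_mbps
```

Even under these generous assumptions, an 8K feed consumes nearly half the link, which is why line-of-sight constraints and 4G fallback matter so much in practice.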
NDI and Vizrt have completed several 5G proofs of concept. Sky Germany leveraged Vizrt and NDI to deliver real-time 5G coverage of a recent Bundesliga Handball final. It was the first time a broadcaster had delivered an end-to-end 5G live production with NDI Bridge and Vizrt graphics, analysis, and production tools - all in the cloud.
“5G simplifies and streamlines these benefits significantly with the provision of network slicing and related QoS functionalities,” says Namatinia. “For live production, it brings guaranteed low latency and corresponding bandwidth.
“However, low latency and corresponding bandwidth must not only be ensured in the 5G network; this also plays a role in the further processing of the signals. Corresponding connections to the hyperscalers such as AWS, Azure, or Google must also be ensured.
“Another approach is to use edge computing to reduce the number of hops between the source and the production backend, thus decreasing latency accordingly. Achieving this is something we are currently working on at NDI, to reduce latency, not only for 5G but for other applications using LAN and WAN.”
Net Insight’s Andersson says 5G will unlock smart stadiums by providing an additional means to collect feeds within a venue that is operating a private 5G standalone network.
“It will provide the ability to use more cameras within a stadium such as spot cameras, tracking favourite players or different angles of the field. These feeds can be distributed to production remotely, for example to create a fan zone viewing experience complementary to the main produced broadcast feed.”
Another trend that is starting to have a significant impact on how live events are produced and distributed is the proliferation of cloud-based production platforms. Schur reports that these platforms enable even small venues and specialized events to be produced with professional tools at affordable prices.
“I expect that we’ll see many more of these platforms come online, targeting specific types of productions and markets,” he says.