MAM rises to meet the content production surge
- With Kevin Hilton
The mind-boggling statistic that 500 hours of video are uploaded to YouTube every minute illustrates the explosion in video and audio material now being created for all platforms. As Contributing Editor Kevin Hilton reports, this is putting more pressure on the technologies being used to keep track of everything...
The rate of technological change in broadcasting has increased considerably in the past 20 years. While areas such as acquisition and remote production are high profile examples, media asset management (MAM), usually regarded as a background function, is now experiencing a similarly rapid evolution.
In the last six months there has been a dramatic rise in the amount of content being produced in all areas of media. This upsurge comes at the same time artificial intelligence (AI) is being used for a wide range of broadcast-related tasks. In terms of MAM, AI is finding a particular use for transcribing and auto-tagging material.
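To make the auto-tagging idea concrete, here is a minimal, self-contained sketch of the kind of record such a process produces: a transcript plus machine-generated tags attached to an asset's metadata. The asset structure and the keyword-based tagger are hypothetical stand-ins for a real speech-to-text and recognition pipeline, not any vendor's API.

```python
from dataclasses import dataclass, field

# Hypothetical keyword-to-tag table; a real system would run
# speech-to-text and image-recognition engines instead.
TAG_RULES = {
    "goal": "sport",
    "election": "news",
    "interview": "talk",
}

@dataclass
class MediaAsset:
    asset_id: str
    transcript: str = ""
    tags: list = field(default_factory=list)

def auto_tag(asset: MediaAsset) -> MediaAsset:
    """Attach machine-generated tags derived from the transcript."""
    words = asset.transcript.lower().split()
    asset.tags = sorted({TAG_RULES[w] for w in words if w in TAG_RULES})
    return asset

clip = MediaAsset("clip-001", transcript="post-match interview after the winning goal")
auto_tag(clip)
print(clip.tags)  # ['sport', 'talk']
```

The point is that tagging happens automatically at ingest, so later search does not depend on a human having logged the material.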
A further development is generative AI, which can produce different kinds of content, from text and images to audio and synthetic data, in a more realistic, human-like way rather than one that is obviously computer-generated.
ChatGPT (the GPT standing for generative pre-trained transformer) is the leading example of this and, since its launch in November 2022, has already made a big impact on broadcasting, including on media management.
"MAMs need to continually adapt to the media and metadata they're hosting and managing," comments David Candler, Senior Director of Customer Solutions at Veritone. "It's not just the management of media MAMs should enable but also its activation across multiple downstream paths. As new media platforms continue to explode and the likes of generative AI and ChatGPT, which are very much in people's minds now, come into play, a huge volume of media will be generated. But as you create content, you still need to manage and activate it. One of our specialties is not only in generating new forms of content - including synthetic voice and avatars - but also searching and discovering valuable insights into that content, activating it and monetising it."
Candler adds that AI has assisted some MAM systems with "deep discovery" for some time, using AI-based plug-ins for functions such as face/logo/object detection and speech-to-text. Veritone has taken this further by creating an entire "enterprise AI platform" called Veritone aiWARE. "This orchestrates hundreds of cognitive engines," he explains, "and our Digital Media Hub asset management platform is one of the applications sitting on top of the aiWARE stack. This revolutionises the deep discovery of moments within content as well as finding the assets themselves."
Stephen Tallamy, Chief Technology Officer at EditShare, agrees that the rate of change in both the media sector as a whole and for MAM in particular has been swift. "There are some pretty big transitions going on in the industry that are driving different workflows and requiring a different focus from MAM vendors," he comments. "We have seen changes in working behaviour, tooling and video formats over the last few years that have made for a rapidly evolving space."
In recent months, Tallamy continues, this has seen the emergence of two key topics that high-end broadcasters, production companies and post-production facilities now see as priorities. "The first is multi-site or multi-location, while the second is hybrid-type deployments," he explains. "Over the last five or six years, there was the idea of doing everything in the cloud but now the thinking is that a blend of the two might be the right way forward. As for what's driving that, some of it is consolidation - different companies coming together, such as Warner Bros and Discovery or post houses with different specialities - and sometimes that means operations in different locations."
Because of this, MAM systems have to be, in Tallamy's description, "multi-site aware", which he sees as very important for broadcasters and facilities today. "Businesses need to see where their assets are across multiple locations," he says. "They might be stored on-prem at two or three locations, with some in the cloud. It's not just providing a centralised view of those assets, it's also about being able to distribute and control automations across those sites as well."
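The "multi-site aware" idea can be sketched as a federation of per-site catalogues into one global index. The site names and record shapes below are invented for illustration; a real deployment would be querying live storage systems, not an in-memory dictionary.

```python
# Toy per-site catalogues: site name -> {asset id: filename}.
# Sites and assets here are illustrative, not any vendor's data model.
SITES = {
    "london-onprem": {"A001": "promo.mxf", "A002": "news_pkg.mov"},
    "la-onprem":     {"A003": "doc_cut.mxf"},
    "cloud":         {"A002": "news_pkg.mov", "A004": "archive.mp4"},
}

def locate(asset_id: str) -> list:
    """Return every site holding a copy of the asset."""
    return [site for site, catalogue in SITES.items() if asset_id in catalogue]

def global_index() -> dict:
    """Centralised view: asset id -> list of sites where it is stored."""
    index = {}
    for site, catalogue in SITES.items():
        for asset_id in catalogue:
            index.setdefault(asset_id, []).append(site)
    return index

print(locate("A002"))  # ['london-onprem', 'cloud']
```

A view like this is what lets a business see, at a glance, which assets are duplicated across on-prem and cloud storage and where automations need to run.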
Identifying and bringing together not just the assets themselves but also the functions controlling or manipulating them is, Tallamy comments, beginning to cause problems for those using MAM systems. "You get MAMs that are trying to handle all the different types of media and workflow from a common place and that is generally creating a tension in the user experience," he explains.
"The producers don't want to see the work in progress and if they do, they want to see it in a very different format. At the same time editors need a much more detailed view. And you either get a MAM that is spread thin in its support of all the different assets, or it goes into deep detail on one thing.
"Customers are now expecting depth and breadth but the trouble there is that it becomes an overly complex beast and the user experience goes out of the window. What we're trying to do with our FLOW media asset manager is create persona-based user experiences, so that within the breadth and depth, people like producers will see just what they need to see."
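A persona-based user experience of the kind Tallamy describes can be sketched as different roles seeing different slices of the same asset record. The field names and personas below are assumptions for illustration, not FLOW's actual data model.

```python
# One asset record; producers and editors see different projections of it.
# All field names and personas here are invented for the sketch.
ASSET = {
    "title": "Episode 12 rough cut",
    "status": "work in progress",
    "duration": "00:42:17",
    "timecode_notes": ["00:03:10 retake", "00:17:44 colour flag"],
    "codec": "ProRes 422",
}

PERSONA_FIELDS = {
    "producer": ["title", "status", "duration"],      # high-level view
    "editor": ["title", "timecode_notes", "codec"],   # detailed view
}

def view_for(persona: str, asset: dict) -> dict:
    """Project the asset record down to the fields this persona needs."""
    return {key: asset[key] for key in PERSONA_FIELDS[persona]}

print(view_for("producer", ASSET))
```

The single underlying record keeps depth; the projection keeps each user's view simple.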
A MAM system focused on production assets is Facilis' FastTracker, which is not standalone but integrated into the company's shared storage systems.
Facilis Chief Marketing Officer Jim McKenna explains that in this application, there is a need to track and catalogue non-video files, such as script notes, Excel sheets and project metadata files. "These are all catalogued the same as video and audio files," he says. "They can also be appended in the database with any metadata required for searching and secured against unauthorised viewing in the same way as the image and sound data."
McKenna agrees that AI is a "big topic right now", even though companies like Facilis have been applying it to their technologies for some time. "We're announcing substantial improvement in AI auto-tagging through image recognition, along with enhanced transcription support," he adds. "Not all customers will make use of AI but for those that own their content, a database of terms matching visual or audio elements of the assets is valuable for efficient production."
Something that has long been a critical factor in how MAM systems operate and are used is metadata, which, says Aaron Kroger, Product Marketing Manager for Media Workflows at Dalet, is becoming ever more important as the volume of content continues to rise. "To make your content findable and, ultimately, usable, advanced indexing and rich metadata are crucial for search tools to locate it," he says. "That metadata is also important to further utilise the workflow engine and smart automation to create mezzanine files to aid in previewing or even working with them in different aspects of an operation."
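The role of rich metadata in making content findable boils down to indexing: terms from the metadata point back to the assets that carry them. Here is a minimal inverted index over invented asset records; real MAM search engines are far more sophisticated, but the principle is the same.

```python
# Invented metadata records: asset id -> free-text metadata.
ASSETS = {
    "A1": "championship final highlights football",
    "A2": "studio interview football manager",
    "A3": "weather bulletin evening",
}

def build_index(assets: dict) -> dict:
    """Inverted index: metadata term -> set of asset ids containing it."""
    index = {}
    for asset_id, metadata in assets.items():
        for term in metadata.split():
            index.setdefault(term, set()).add(asset_id)
    return index

def search(index: dict, term: str) -> set:
    return index.get(term, set())

idx = build_index(ASSETS)
print(sorted(search(idx, "football")))  # ['A1', 'A2']
```

The richer the metadata attached at ingest (by humans or AI), the more terms land in the index and the more findable the asset becomes.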
Dalet's cloud-native MAM platform, Flex, recently added the new FlexMOBILE app for Android and iOS devices, plus further functionality within the FlexREVIEW asset review and approval tool. As for the increased volume and diversity of material that needs to be dealt with, Kroger says workflow engines have to be powerful enough to index and manage everything. "While there are differences between video and audio assets, we have user-defined objects that can specify how those objects are handled. They can be configured to act similarly or not, depending on your needs."
Toni Vilalta, Product Development Director at VSN, observes that the huge amount of material now being produced means "a well-preserved and catalogued archive is more important than ever", while the main challenge for MAM developers is to provide "nimble" and easy-to-use tools. "That means we will have to get deep into AI capabilities for automated cataloguing, improve the metadata customisation for our customers and improve our search engines," he says. To achieve this, the company has added a new interface for asset segmentation and cataloguing to its VSNExplorer MAM, plus an option to present and edit the results of AI analysis from one or more AI engines. As well as this, VSNExplorer has also been integrated with the Woody Technologies IN2IT ingest platform.
Developments in the media sector over the last few months appear to be changing not only how broadcasters and content owners are presenting their programming but also what they expect from asset management technologies.
"When we think of MAM in 2023, we no longer see customers looking for a monolithic solution that aggregates and organises a given company's media assets and processes from a single vendor tool," says Eric Carson, Chief Revenue Officer at Ateliere. "There is a marked shift toward best of breed tools connected via serverless, event-based architectures."
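The "best of breed tools connected via serverless, event-based architectures" pattern can be sketched with a simple event bus: independent tools subscribe to events rather than living inside one monolith. The bus, event names and handlers below are illustrative assumptions, not Ateliere's architecture.

```python
from collections import defaultdict

class EventBus:
    """Toy in-memory event bus; a serverless deployment would use a
    managed message service, but the subscribe/publish shape is the same."""
    def __init__(self):
        self.handlers = defaultdict(list)

    def subscribe(self, event: str, handler):
        self.handlers[event].append(handler)

    def publish(self, event: str, payload: dict):
        for handler in self.handlers[event]:
            handler(payload)

log = []
bus = EventBus()
# Two independent "best of breed" tools reacting to the same event.
bus.subscribe("asset.ingested", lambda p: log.append(f"transcribe {p['id']}"))
bus.subscribe("asset.ingested", lambda p: log.append(f"tag {p['id']}"))

bus.publish("asset.ingested", {"id": "A007"})
print(log)  # ['transcribe A007', 'tag A007']
```

Because each tool only knows about events, one can be swapped out or added without touching the others, which is the decoupling the quote is pointing at.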
In keeping with this more open approach, Carson describes Ateliere's Connect system as a "content hub" that works with cloud-based media supply chains. "It's more focused than MAM," he says, "and while it can cover all aspects of the content journey from ingest to distribution, the platform is designed to be used in an apportioned manner, allowing customers to use and pay only for the functions needed.
"The advantage of this approach is that it doesn’t require replacing legacy systems. It can be deployed quickly and easily next to existing systems, expanding a business' capability to monetise and deliver content in a matter of days." In the run-up to this year's NAB Show, Ateliere announced data analytics capabilities for Connect, enabling users to measure their media supply chain volume and performance.
With more and more content being generated for conventional broadcast, streaming, YouTube, social media and podcasts, the need to identify and store every item becomes both a bigger headache and increasingly vital. That means the development of media asset management systems shows no sign of slowing down any time soon.