This post originally appeared on Microsoft's Media & Cable blog
The explosion of video consumption across digital platforms and devices means that broadcasters and content creators are dealing with more complexity than ever. Production teams are asked to prepare and distribute content in multiple formats, multiple languages, and multiple versions, to multiple partners, all with different requirements. Many are struggling to keep up and, to make matters worse, they aren't getting additional resources or budget to do it.
In an effort to become more efficient, video businesses are examining every aspect of production and distribution, from their systems to their processes. Artificial Intelligence (AI) is proving to be a valuable technology, especially for extracting relevant metadata from videos.
At NAB (April 24-27 in Las Vegas), Ooyala will demonstrate how Ooyala Flex, our media logistics platform, integrates with Microsoft Cognitive Services. The goal: To simplify metadata capture by extracting transcripts, detecting faces within videos, and analyzing text to detect key topics.
Inputting, capturing, curating, and QC-ing metadata is a pain. It's a manual process, rife with errors, and let's face it: no one enjoys doing it. For media asset management, good metadata makes it easier to find assets, especially in large archives. Companies often shoot new or alternative footage simply because they can't find old footage in their archive, or because it takes too long to find a specific 10-second clip buried within a two-hour video. Simply put, the more information contained in the metadata, the better you know what's in the video and the easier it is to find.
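To make that concrete, time-coded metadata is what turns a buried 10-second clip into something searchable. Here is a minimal sketch in Python; the metadata structure and tag names are hypothetical illustrations, not Ooyala Flex's actual schema:

```python
# Hypothetical time-coded metadata for a long video: each entry records
# a tag (a detected face, topic, or transcript keyword) and the span of
# seconds in which it occurs.
metadata = [
    {"tag": "goal", "start": 754.0, "end": 764.0},
    {"tag": "interview", "start": 6841.0, "end": 6980.0},
    {"tag": "goal", "start": 5120.5, "end": 5131.0},
]

def find_clips(metadata, tag):
    """Return (start, end) times for every clip matching a tag,
    sorted by where they occur in the video."""
    hits = [(m["start"], m["end"]) for m in metadata if m["tag"] == tag]
    return sorted(hits)

# Jump straight to every 'goal' moment instead of scrubbing two hours.
print(find_clips(metadata, "goal"))
```

The richer the tagging, the more queries like this become possible without any manual review of the footage.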
But the value of good metadata doesn’t stop there. We’re working across our media logistics, video platform, and video advertising products to help our customers leverage their metadata to increase revenues, not just efficiencies.
Video recommendations are an obvious application of the metadata that can be captured via AI. Using metadata extracted through facial identification, we can recommend content from across your library based on your audience's preferences. For example, recommending content featuring a specific celebrity, whether the celebrity is the subject of the entire video or appears in only part of it. Or, for sports fans, we can highlight clips featuring specific players within a soccer match, detecting the precise moment they appear in the game. Personalized recommendations of videos or clips have proven to keep people watching longer. That means more opportunities to serve ads to that viewer, and more revenue for your business.
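A face-based recommendation like the one described above can be sketched in a few lines. The catalog structure and names below are invented for illustration; a real system would draw on a face-identification service's output and a proper recommendation engine:

```python
# Hypothetical catalog: each video lists the people a face-identification
# service detected in it, with their appearance times in seconds.
catalog = {
    "match-highlights": {"faces": {"Roger Federer": [(120, 300)]}},
    "press-conference": {"faces": {"Roger Federer": [(0, 600)],
                                   "Coach": [(30, 90)]}},
    "cooking-show": {"faces": {"Chef": [(0, 1800)]}},
}

def recommend_by_face(catalog, person):
    """Recommend every video in which a given person appears,
    whether they are the subject of the whole video or only part of it."""
    return sorted(vid for vid, meta in catalog.items()
                  if person in meta["faces"])

print(recommend_by_face(catalog, "Roger Federer"))
```

Because the appearance times are retained, the same data can drive both whole-video recommendations and clip-level highlights.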
Advertising is another interesting use case that we’ll demonstrate at NAB. Let’s use the same example as above. If you know that a certain celebrity appears in a series of videos, and that celebrity is relevant to a particular brand, you now have the ability to deliver ads at that precise moment they appear. An example of this might be an ad for a new Rolex watch appearing during Roger Federer’s post-match interview.
You can also leverage sentiment or topic data to deliver more relevant ads. For example, if someone’s watching a video about sports or fitness you can insert an ad for a sporting goods store or a fitness tracker. Or, if there’s a particular scene in which people are fearful, you might deliver an ad for a security system. (You get the idea.)
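Topic-targeted ad selection of this kind reduces to a lookup from detected topics to ad categories. The mapping and fallback below are hypothetical; a production system would delegate this to an ad server's targeting rules:

```python
# Hypothetical mapping from topics detected in a video to ad categories.
TOPIC_TO_AD = {
    "fitness": "sporting-goods",
    "sports": "fitness-tracker",
    "home-security": "security-system",
}

def pick_ad(detected_topics):
    """Return the first ad category matching any detected topic,
    falling back to a generic ad when nothing matches."""
    for topic in detected_topics:
        if topic in TOPIC_TO_AD:
            return TOPIC_TO_AD[topic]
    return "generic"

print(pick_ad(["weather", "sports"]))  # matches the 'sports' topic
print(pick_ad(["cooking"]))            # no match, falls back to generic
```

The same lookup generalizes to sentiment labels or detected faces, which is what makes the celebrity-timed ad placement described above possible.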
This allows your advertising team to create and sell highly targeted advertising packages at premium rates.
We're only scratching the surface of this new field, but the benefits are very real today. Look for more innovation from Ooyala on this front.