
Live stream analysis using Video Indexer


Video Indexer is an Azure service designed to extract deep insights from video and audio files offline, that is, to analyze media files that have already been created. However, for some use cases it's important to get the media insights from a live feed as quickly as possible to unlock operational and other time-sensitive scenarios. For example, such rich metadata on a live stream could be used by content producers to automate TV production (as in our example with the EndemolShine Group), by journalists in a newsroom to search live feeds, to build content-based notification services, and more.

To that end, I joined forces with Victor Pikula, a Cloud Solution Architect at Microsoft, to architect and build a solution that lets customers use Video Indexer in near real time on live feeds. With this solution the indexing delay can be as low as four minutes, depending on the size of the chunks being indexed, the input resolution, the type of content, and the compute power used for the process.


Figure 1 – Sample player displaying the Video Indexer metadata on the live stream

The stream analysis solution at hand uses Azure Functions and two Logic Apps to process a live program from a live channel in Azure Media Services with Video Indexer, and displays the results with Azure Media Player playing the near real-time stream.

At a high level, the solution is made up of two main steps. The first step runs every 60 seconds: it takes a sub-clip of the last 60 seconds played, creates an asset from it, and indexes it via Video Indexer. The second step is called once indexing is complete: the captured insights are processed, sent to Azure Cosmos DB, and the indexed sub-clip is deleted.
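To make the first step more concrete, here is a minimal sketch of how a per-minute sub-clip can be submitted to the Video Indexer upload API. It is not the sample project's actual code; the region, account id, subscription key, and the idea that the sub-clip is already published at `subclip_url` are assumptions, and the callback URL would point at the trigger of the second step.

```python
"""Sketch of the per-minute indexing step (illustrative, not the sample's code)."""
import requests

VI_API = "https://api.videoindexer.ai"
LOCATION = "westus2"            # placeholder region
ACCOUNT_ID = "<account-id>"     # placeholder Video Indexer account id
SUBSCRIPTION_KEY = "<api-key>"  # placeholder Video Indexer API key


def get_access_token() -> str:
    # Request a short-lived account access token from the Video Indexer API.
    resp = requests.get(
        f"{VI_API}/Auth/{LOCATION}/Accounts/{ACCOUNT_ID}/AccessToken",
        params={"allowEdit": "true"},
        headers={"Ocp-Apim-Subscription-Key": SUBSCRIPTION_KEY},
    )
    resp.raise_for_status()
    return resp.json()


def index_subclip(subclip_url: str, name: str, callback_url: str) -> str:
    # Submit the 60-second sub-clip for indexing; callback_url is invoked
    # by Video Indexer when indexing completes (the second step).
    resp = requests.post(
        f"{VI_API}/{LOCATION}/Accounts/{ACCOUNT_ID}/Videos",
        params={
            "accessToken": get_access_token(),
            "name": name,
            "videoUrl": subclip_url,
            "callbackUrl": callback_url,
            "privacy": "Private",
        },
    )
    resp.raise_for_status()
    return resp.json()["id"]  # id used later to fetch the insights
```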

The sample player plays the live stream and gets the insights from Azure Cosmos DB, using a dedicated Azure Function. It displays the metadata and thumbnails in sync with the live video.
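The shape of that dedicated function could look like the following sketch, which queries Cosmos DB for the insight documents that overlap the playback window the player is currently showing. The database, container, and field names here are illustrative assumptions, not the sample's actual schema.

```python
"""Sketch of a player-facing lookup of insights in Cosmos DB (illustrative)."""
from azure.cosmos import CosmosClient

# Placeholder endpoint, key, database, and container names.
client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
container = client.get_database_client("vi-live").get_container_client("insights")


def insights_for_window(start_seconds: float, end_seconds: float) -> list:
    # Return the insight documents whose sub-clip overlaps the requested
    # playback window, so the player can render them in sync with the video.
    query = (
        "SELECT * FROM c WHERE c.clipStart < @end AND c.clipEnd > @start "
        "ORDER BY c.clipStart"
    )
    return list(container.query_items(
        query=query,
        parameters=[
            {"name": "@start", "value": start_seconds},
            {"name": "@end", "value": end_seconds},
        ],
        enable_cross_partition_query=True,
    ))
```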


Figure 2 – The two logic apps processing the live stream every minute in the cloud.

Near real-time indexing for video production

At the EBU Production Technology Seminar in Geneva last month, Microsoft demonstrated an end-to-end solution. Several live feeds were ingested into Azure using Dejero technology or the WebRTC protocol, and sent to Make.TV Live Video Cloud to switch inputs. The selected input was sent as a transcoded stream to Azure Media Services for multi-bitrate transcoding and OTT delivery in low-latency mode. The same stream was also processed in near real time with Video Indexer.


Figure 3 – Example of live stream processing in Azure

Next steps

The full code and a step-by-step deployment guide can be found in this GitHub project for Live media analytics with Video Indexer. Need near real-time analytics for your content? Now you have a ready-made solution for it, so go ahead and give it a try!

Have questions or feedback? We would love to hear from you! Visit our UserVoice to help us prioritize features, or email VISupport@Microsoft.com with any questions.
