If you’ve had experience building search capabilities into your video applications, you know that indexing and searching video content is complex – not only because of the time dimension, but also because it requires searching through many objects and related metadata. You’ve had to write multiple queries that are time-consuming, fragile at scale, and hard to optimize for performance.
We’re excited to introduce a new Kaltura search API that will revolutionize video search. Built on the Elasticsearch engine, eSearch exposes a set of API actions that unlock a variety of search capabilities and simplify querying your video library.
We’ll demonstrate a few of the cool features below:
Search Term Highlighting
A really neat feature, highlighting, gives you insight into why a particular object was returned in the search results – that is, what caused the object to match the query.
Let’s assume you have an account in Kaltura with over a thousand entries and you’re looking for a pasta recipe. Searching “pasta” with the unified search would search through all entry data – and this is where highlighting comes in: you’d be able to determine whether pasta was found in the captions, description, or simply just the entry name.
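To make this concrete, here is a minimal sketch of what a unified eSearch request body could look like. The object names follow Kaltura API conventions, but the exact enum values (operator codes, item types) and endpoint are assumptions to verify against the eSearch documentation:

```python
import json

# Sketch of an eSearch request body for a unified "pasta" search.
# Enum values below are assumptions; check the Kaltura API reference.
pasta_search = {
    "searchParams": {
        "objectType": "KalturaESearchEntryParams",
        "searchOperator": {
            "objectType": "KalturaESearchEntryOperator",
            "operator": 1,  # assumed: 1 = AND
            "searchItems": [{
                "objectType": "KalturaESearchUnifiedItem",
                "itemType": 2,        # assumed: 2 = partial match
                "searchTerm": "pasta",
            }],
        },
    },
}

# Matching results come back with a highlight section per item, which is what
# tells you whether "pasta" was found in the captions, description, or name.
print(json.dumps(pasta_search, indent=2))
```

You would POST a body like this (with a valid session token) to the `elasticsearch_esearch` service’s `searchEntry` action.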
At this point you’re just hungry and you don’t care whether the recipe is for pasta or salmon. You’d like to eat something “delicious”, but you’re open to “yummy” as well. This is a case for the partial-match feature, which uses the WordNet English synonym dictionary by default and gets you that recipe with fewer searches!
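A synonym-aware search only changes the search item itself. Below is a hypothetical sketch of that item; the enum value for partial matching is an assumption to confirm against the docs:

```python
# Hypothetical search item for "delicious". With partial matching, eSearch
# consults the WordNet synonym dictionary, so entries described as "yummy"
# can surface too. The itemType value is an assumption.
synonym_item = {
    "objectType": "KalturaESearchUnifiedItem",
    "itemType": 2,            # assumed: 2 = partial match (synonym-aware)
    "searchTerm": "delicious",
}
```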
Now imagine that you have recipes with captions in various languages, such as Chinese or German. eSearch supports searching across 22 languages (and we’ll be adding more!), including Spanish, French, Russian, Dutch, and Chinese. Let’s demonstrate searching inside captions for the Chinese phrase 食谱 which, you guessed it, means recipe.
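Searching inside captions uses a caption-specific search item. This sketch assumes the `caption.content` field-name constant and the partial-match enum value; verify both against the eSearch reference:

```python
# Sketch of a caption search item for the Chinese word for "recipe".
# fieldName and itemType values are assumptions to check against the docs.
caption_item = {
    "objectType": "KalturaESearchCaptionItem",
    "itemType": 2,                   # assumed: partial match
    "fieldName": "caption.content",  # assumed field-name constant
    "searchTerm": "食谱",            # "recipe" in Chinese
}
```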
The unified option searches through all of the entry’s related objects – captions, metadata, cue points, and more. But if you know where your keyword is, eSearch makes it just as easy to search through specific objects.
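For instance, if you know “pasta” is in the entry name, you can swap the unified item for an entry-scoped one. The `fieldName` constant here is an assumption; consult the docs for the exact value:

```python
# Sketch: scope the search to the entry name only, instead of a unified search.
# fieldName and itemType values are assumptions.
name_item = {
    "objectType": "KalturaESearchEntryItem",
    "itemType": 2,         # assumed: partial match
    "fieldName": "name",   # assumed constant for the entry-name field
    "searchTerm": "pasta",
}
```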
Video often includes temporal (time-based) metadata such as annotations, comments, notes, overlays, in-video chapters and markers, synced PowerPoint slides, or descriptive data (often generated by video-analysis engines such as OCR, face detection, or scene detection). Cue points are the objects that store this temporal metadata, and you’ll often want to search through them to create smart, search-driven experiences.
For example, in our video cooking recipes library, all our videos were marked with ingredients as we were using them: sugar, flour, etc. Let’s find all the strawberry recipes where we’re not using sugar.
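“Strawberry but not sugar” can be expressed by nesting operators: an AND operator whose children are a cue-point item and a NOT operator wrapping a second cue-point item. The operator codes and the cue-point field name below are assumptions to verify against the eSearch reference:

```python
# Sketch of a nested operator tree: strawberry cue point AND NOT sugar cue point.
# Operator codes and fieldName are assumptions.
strawberry_no_sugar = {
    "objectType": "KalturaESearchEntryOperator",
    "operator": 1,  # assumed: AND
    "searchItems": [
        {
            "objectType": "KalturaESearchCuePointItem",
            "itemType": 2,                             # assumed: partial match
            "fieldName": "cue_points.cue_point_text",  # assumed field name
            "searchTerm": "strawberry",
        },
        {
            "objectType": "KalturaESearchEntryOperator",
            "operator": 3,  # assumed: NOT
            "searchItems": [{
                "objectType": "KalturaESearchCuePointItem",
                "itemType": 2,
                "fieldName": "cue_points.cue_point_text",
                "searchTerm": "sugar",
            }],
        },
    ],
}
```

Because operators nest arbitrarily, the same pattern scales to any boolean combination of ingredients.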
Now to make things interesting: what if you were looking for an entry with “pasta” in the name and 食谱 (recipe in Chinese) within the captions? Or if you were looking for “recipe” in custom metadata and wanted to highlight the results? eSearch has your back.
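Mixed-object queries are just another operator tree: combine an entry-name item with a caption item under a single AND. As before, the field-name constants and enum values are assumptions to confirm against the docs:

```python
# Sketch: "pasta" in the entry name AND 食谱 ("recipe") in the captions.
# fieldName and enum values are assumptions.
pasta_recipe_combo = {
    "objectType": "KalturaESearchEntryOperator",
    "operator": 1,  # assumed: AND
    "searchItems": [
        {
            "objectType": "KalturaESearchEntryItem",
            "itemType": 2,
            "fieldName": "name",             # assumed
            "searchTerm": "pasta",
        },
        {
            "objectType": "KalturaESearchCaptionItem",
            "itemType": 2,
            "fieldName": "caption.content",  # assumed
            "searchTerm": "食谱",
        },
    ],
}
```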
Kaltura's mission is to power any video experience. Our wide array of video solutions is deployed globally across thousands of enterprises, media companies, service providers, and educational institutions, leveraging video to teach, learn, communicate, collaborate, and entertain.