EBU MDN WORKSHOP 2019 – Day 1
This week, Konodrac is in Geneva at an event of the EBU (European Broadcasting Union). The event is organized by the EBU Strategic Programme on Media Information Management and AI (MIM-AI) and the Metadata Development Network (MDN), a developer community where knowledge is shared, current work is presented, feedback is gathered, and opportunities arise for collaboration on metadata-related projects.
Among other things, yesterday they talked about work done by the Swiss broadcaster RTS that allows their staff, especially journalists, to search for and access images and videos of people of interest. This is a new way of working that moves from traditional search, either exhaustive or based on written annotations, to one in which you search for an individual and get back all the video frames in the catalogue where that person appears. The system also lets you supply an image of a person, and it will automatically return all the videos where that person appears. In a similar way, it supports images of objects or relevant landmarks. To achieve this, they built a system that uses pre-trained neural networks (Mask R-CNN, ResNet-50, or FaceNet) to compare the query image against the whole archive, returning the most relevant and similar results.
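The core retrieval step of such a system can be sketched simply: each archive frame is reduced to an embedding vector by the pre-trained network, and a query image is matched by cosine similarity against those vectors. The following is a minimal illustration, assuming the embeddings have already been computed (the toy 2-D vectors and the `top_k_similar` helper are our own, not RTS's implementation):

```python
import numpy as np

def normalize(v):
    # Scale vectors to unit length so the dot product equals cosine similarity
    return v / np.linalg.norm(v, axis=-1, keepdims=True)

def top_k_similar(query_emb, archive_embs, k=5):
    """Return indices of the k archive frames whose embeddings are
    most similar (by cosine similarity) to the query embedding."""
    sims = normalize(archive_embs) @ normalize(query_emb)
    return np.argsort(sims)[::-1][:k]

# Toy archive of 4 frame embeddings (in a real system these would come
# from a network such as FaceNet and have hundreds of dimensions)
archive = np.array([[1.0, 0.0], [0.0, 1.0], [0.9, 0.1], [0.5, 0.5]])
query = np.array([1.0, 0.05])
print(top_k_similar(query, archive, k=2))  # → [0 2], the two closest frames
```

In practice the archive side is precomputed once and indexed, so a query only costs one forward pass through the network plus a nearest-neighbour lookup.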
Concerning Konodrac’s lines of work, we enjoyed a presentation that aligned very well with some of our current interests. Konodrac has been offering television channels reliable audience metrics in real time. Recently, we have focused on offering a wide range of categorical variables associated with TV consumption. For example, out of total consumption, we can see what fraction of viewers consume content from the web or from a certain category in a given taxonomy. This information allows broadcasters to better define their audience by answering questions such as “Who’s watching?”, “How are they watching?”, and “How long are they watching?” Yesterday’s talk sought to deepen this analysis by enriching consumption data with metadata associated not with the consumption but with the broadcast itself. Consider, for example, seeing the name of the talk-show guest who is speaking at a specific moment, or the protagonists of the news story being broadcast, alongside the audience curve. Just as we are now doing by integrating social networks, automatically incorporating quality annotations from the videos broadcasters are playing into the tools Konodrac offers would allow our clients not only to answer descriptive questions, but also to make inferences about the possible causes of changes in their audience’s behavior.
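The enrichment idea above amounts to joining a timestamped audience curve with timestamped broadcast annotations. As a minimal sketch (the sample data, annotation format, and `annotate` helper are all hypothetical, purely to illustrate the join):

```python
from bisect import bisect_right

# Hypothetical minute-by-minute audience curve: (minute, viewers)
audience = [(0, 120_000), (1, 125_000), (2, 140_000), (3, 138_000)]

# Hypothetical broadcast metadata: each annotation is valid from its
# start minute until the next annotation begins
annotations = [(0, "host monologue"), (2, "guest: Jane Doe")]

def annotate(audience, annotations):
    """Attach to each audience sample the annotation active at that minute."""
    starts = [minute for minute, _ in annotations]
    enriched = []
    for minute, viewers in audience:
        # Find the last annotation starting at or before this minute
        idx = bisect_right(starts, minute) - 1
        label = annotations[idx][1] if idx >= 0 else None
        enriched.append((minute, viewers, label))
    return enriched

for row in annotate(audience, annotations):
    print(row)
# (2, 140000, 'guest: Jane Doe') — the audience jump lines up with the guest
```

With a join like this in place, an analyst can move from “the curve rose at minute 2” to “the curve rose when the guest appeared”, which is exactly the kind of causal hypothesis the talk argued for.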
We will see what else our European colleagues have to offer to help us keep learning and keep looking for ways to improve our product. More on this tomorrow!