Augmenting a TV Broadcast with Synchronised User Generated Video and Relevant Social Network Content

Conference paper
As TNO, we have developed an Augmented Live Broadcast use case using components from the FP7 STEER project. In this use case, a television broadcast of a live event is augmented with user-generated content: videos made by users at the event, and relevant social network content. The current implementation synchronises related media streams using timestamps inserted in the streams, and social networks are searched using Electronic Programme Guide (EPG) information as a starting point. The content is presented on a TV (using a PC as a set-top box) and on a tablet. We propose the use of several components of the FIcontent project to enhance our use case: fingerprinting to synchronise with an actual broadcast instead of a dedicated one, audio mining to generate additional keywords for the social analytics framework, the TV application framework to port our application to a variety of TV sets, and the second screen framework to dynamically use tablets as a second screen.
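The timestamp-based synchronisation mentioned in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation; it only assumes that each stream carries an inserted wall-clock timestamp for its current frame, from which a play-out offset between the broadcast and a user-generated stream can be derived (all names here are hypothetical):

```python
def playout_offset(broadcast_ts: float, ugc_ts: float) -> float:
    """Offset in seconds to apply to the user-generated (UGC) stream so
    that frames carrying the same inserted timestamp play out together.
    A positive value means the UGC stream lags and must be advanced."""
    return broadcast_ts - ugc_ts


# Example: the broadcast frame is stamped 5.0 s into the event and the
# UGC frame currently shown is stamped 3.5 s, so the UGC player should
# seek forward by 1.5 s.
offset = playout_offset(5.0, 3.5)
```

In practice the offset would be re-evaluated periodically, since independently captured streams drift over time.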
TNO Identifier
506653
Publisher
ACM
Source title
ACM International Conference on Interactive Experiences for TV and Online Video, TVX 2014, 25-27 June 2014, Newcastle upon Tyne, UK
Place of publication
New York, NY