Media Orchestration between Streams and Devices via New MPEG Timed Metadata
article
The proliferation of affordable smart devices that capture, process, and render audiovisual media content creates a need to coordinate and orchestrate these devices, their capabilities, and the content flowing to and from them. The upcoming Moving Picture Experts Group (MPEG) Media Orchestration standard (MORE, ISO/IEC 23001-13) enables the temporal and spatial orchestration of multiple media and metadata streams. Temporal orchestration concerns the time synchronization of media and sensor captures, processing, and renderings, for which the MORE standard uses and extends a Digital Video Broadcasting standard. Spatial orchestration concerns the alignment of (global) position, altitude, and orientation, for which the MORE standard provides dedicated timed metadata. Other types of orchestration involve timed metadata for the region of interest, perceptual quality of media, audio-feature extraction, and media timeline correlation. This paper presents the status of the MORE standard as well as the associated technical and experimental support materials. We also link MORE to the recently initiated MPEG immersive project. © 2018 Society of Motion Picture and Television Engineers, Inc.
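The temporal and spatial orchestration described in the abstract can be illustrated with a minimal sketch. All names here (the sample fields and the clock-offset function) are our own illustrative assumptions, not the MORE bitstream syntax: a timed metadata sample carries the spatial fields the standard addresses, and mapping a device-local timestamp onto a shared timeline stands in for temporal orchestration.

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical illustration (not the actual MORE syntax): a timed metadata
# sample carrying the spatial fields the standard addresses.
@dataclass
class SpatialSample:
    timestamp: float                # capture time on the device's local clock (s)
    position: Tuple[float, float]   # (latitude, longitude) in degrees
    altitude: float                 # metres above sea level
    orientation: float              # heading in degrees, 0 = north

def to_common_timeline(sample: SpatialSample, clock_offset: float) -> SpatialSample:
    """Temporal-orchestration sketch: shift a device-local timestamp onto a
    shared timeline using an offset measured against a common clock."""
    return SpatialSample(sample.timestamp + clock_offset,
                         sample.position, sample.altitude, sample.orientation)

# A capture from one device, re-timed onto the orchestrator's timeline.
cam = SpatialSample(timestamp=12.0, position=(52.0, 4.3),
                    altitude=10.0, orientation=90.0)
aligned = to_common_timeline(cam, clock_offset=0.25)
print(aligned.timestamp)  # 12.25
```

In a real deployment the clock offset would come from a synchronization protocol such as the DVB mechanism the standard builds on, and the spatial fields would be carried as timed metadata alongside the media streams.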
TNO Identifier
843672
ISSN
1545-0279
Source
SMPTE Motion Imaging Journal, 127(10), pp. 32-38.
Publisher
Society of Motion Picture and Television Engineers
Article nr.
8532409
Pages
32-38