Multi-sensor capture and network processing for virtual reality conferencing
conference paper
Recent developments in key technologies such as 5G, Augmented and Virtual Reality (AR/VR) and the Tactile Internet open up new possibilities for communication. In particular, these technologies can enable communication and collaboration in remote experiences. In this demo, we work towards 6-degrees-of-freedom (6DoF) photo-realistic shared experiences by introducing a multi-view, multi-sensor capture end-to-end system. Our system acts as a baseline end-to-end pipeline for the capture, transmission and rendering of volumetric video of user representations. To handle multi-view video processing in a scalable way, we introduce a Multipoint Control Unit (MCU) that shifts processing from end devices into the cloud. MCUs are commonly used to bridge videoconferencing connections, and we design and deploy a VR-ready MCU to reduce both upload bandwidth and end-device processing requirements. In our demo, we focus on a remote meeting use case in which multiple people sit around a table and communicate in a shared VR environment. © 2019 Authors.
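The sketch below is a hypothetical, simplified illustration (not the paper's implementation) of the MCU idea described in the abstract: each participant uploads only its own capture to a central unit in the cloud, which assembles a single composite for each viewer, so end devices avoid exchanging N-1 streams peer-to-peer. All class and field names here (ToyMCU, Frame, payload_bytes) are invented for the example.

```python
# Hypothetical sketch of an MCU-style relay: clients upload one stream each;
# the cloud-side unit fans out one composite per viewer, reducing upload
# bandwidth and end-device processing compared to full-mesh exchange.

from dataclasses import dataclass
from typing import Dict, List


@dataclass
class Frame:
    participant: str      # who captured this frame
    payload_bytes: int    # size stands in for encoded volumetric data


class ToyMCU:
    """Collects the latest frame per participant, then serves composites."""

    def __init__(self) -> None:
        self.latest: Dict[str, Frame] = {}

    def upload(self, frame: Frame) -> None:
        # Each client sends only its own capture to the MCU.
        self.latest[frame.participant] = frame

    def composite_for(self, viewer: str) -> List[Frame]:
        # The MCU assembles everyone else's view server-side; the viewer
        # downloads this single composite instead of N-1 separate streams.
        return [f for p, f in self.latest.items() if p != viewer]


if __name__ == "__main__":
    mcu = ToyMCU()
    for name in ("alice", "bob", "carol"):
        mcu.upload(Frame(participant=name, payload_bytes=50_000))

    composite = mcu.composite_for("alice")
    print(f"alice downloads {len(composite)} merged views "
          f"({sum(f.payload_bytes for f in composite)} bytes) in one stream")
```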
Topics
Social VR, Telepresence, Videoconferencing, VR, 5G mobile communication systems, Microcontrollers, Mobile telecommunication systems, Multimedia systems, Video conferencing, Virtual reality, Visual communication, Augmented and virtual realities, Digital technologies, End-to-end systems, Network processing, Remote communication, Shared experiences, Video signal processing
TNO Identifier
868185
ISBN
9781450362979
Publisher
Association for Computing Machinery, Inc
Source title
Proceedings of the 10th ACM Multimedia Systems Conference (MMSys 2019), 18–21 June 2019
Pages
316-319
Files
To receive the publication files, please send an e-mail request to TNO Repository.