ACMMM’19 Tutorial: A Journey towards Fully Immersive Media Access
Abstract: Universal media access, as proposed in the late 1990s and early 2000s, is now a reality: we can generate, distribute, share, and consume any media content, anywhere, anytime, and with/on any device. A major technical breakthrough was adaptive streaming over HTTP, which resulted in the standardization of MPEG-DASH and is now successfully deployed in HTML5 environments thanks to the corresponding Media Source Extensions (MSE). The next big thing in adaptive media streaming is virtual reality applications and, specifically, omnidirectional (360°) media streaming, which is currently built on top of the existing adaptive streaming ecosystem. This tutorial provides a detailed overview of adaptive streaming of both traditional and omnidirectional media within HTML5 environments. It focuses on the basic principles and paradigms of adaptive streaming for both traditional and omnidirectional media, as well as on already deployed content generation, distribution, and consumption workflows. Additionally, the tutorial provides insights into standards and emerging technologies in the adaptive streaming space. Finally, it covers the latest approaches for immersive media streaming that enable 6DoF adaptive streaming through Point Cloud Compression (PCC) and concludes with open research issues and industry efforts in this domain.
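To make the MSE-based deployment mentioned above concrete, the following is a minimal, hypothetical sketch of feeding segments into an HTML5 video element via MSE. The segment URLs and codec string are placeholders, and production players such as dash.js or Shaka Player add manifest parsing, buffer management, and adaptation logic on top of these primitives.

```typescript
// Minimal MSE sketch (illustrative only; URLs and codec string are assumptions).
const video = document.querySelector('video') as HTMLVideoElement;
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener('sourceopen', async () => {
  const mime = 'video/mp4; codecs="avc1.64001f"'; // assumed codec string
  const sourceBuffer = mediaSource.addSourceBuffer(mime);

  // Fetch an initialization segment followed by media segments (placeholder URLs).
  for (const url of ['init.mp4', 'seg-1.m4s', 'seg-2.m4s']) {
    const data = await (await fetch(url)).arrayBuffer();
    await appendSegment(sourceBuffer, data);
  }
});

// Append a segment and wait until the SourceBuffer has finished updating.
function appendSegment(sb: SourceBuffer, data: ArrayBuffer): Promise<void> {
  return new Promise((resolve) => {
    sb.addEventListener('updateend', () => resolve(), { once: true });
    sb.appendBuffer(data);
  });
}
```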
Lecturers: Christian Timmerer, Alpen-Adria-Universität Klagenfurt & Bitmovin, Inc.
Ali C. Begen, Ozyegin University and Networked Media
Learning Objectives:
This tutorial consists of two main parts. In the first part, we provide a detailed overview of the HTML5 standard and show how it can be used for adaptive streaming deployments. In particular, we focus on HTML5 video and media source extensions; multi-bitrate encoding, encapsulation, and encryption workflows; and a survey of well-established streaming solutions. Furthermore, we present experiences from existing deployments and the relevant de jure and de facto standards (DASH, HLS, CMAF) in this space. In the second part, we focus on omnidirectional (360°) media from creation to consumption, as well as first thoughts on dynamic adaptive point cloud streaming. We survey means for the acquisition, projection, coding, and packaging of omnidirectional media, as well as delivery, decoding, and rendering methods. Emerging standards and industry practices (OMAF, VR-IF) are covered as well. Both parts present current research trends, open issues that need further exploration and investigation, and various efforts that are underway in the streaming industry.
- Principles of HTTP adaptive streaming for the Web/HTML5 (see the rate-selection sketch after this list)
- Principles of omnidirectional (360-degree) media delivery
- Content generation, distribution and consumption workflows for traditional and omnidirectional media
- Standards and emerging technologies in the adaptive streaming space
- Current and future research on traditional and omnidirectional media delivery, specifically enabling 6DoF adaptive streaming through point cloud compression
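As a primer for the adaptive streaming principles listed above, the following toy rate-selection rule illustrates the core idea behind client-driven adaptation: pick the highest-quality representation that fits the measured throughput. The interface, function names, and safety margin are illustrative assumptions, not part of MPEG-DASH or of any particular player implementation.

```typescript
// Toy throughput-based adaptation rule (illustrative only).
interface Representation {
  id: string;
  bandwidth: number; // bits per second, as advertised in the manifest
}

// Pick the highest-bitrate representation that fits within the measured throughput,
// discounted by a safety margin to absorb throughput fluctuations.
function selectRepresentation(
  representations: Representation[],
  measuredThroughputBps: number,
  safetyMargin = 0.8
): Representation {
  const budget = measuredThroughputBps * safetyMargin;
  const sorted = [...representations].sort((a, b) => a.bandwidth - b.bandwidth);
  let choice = sorted[0]; // fall back to the lowest bitrate
  for (const rep of sorted) {
    if (rep.bandwidth <= budget) {
      choice = rep;
    }
  }
  return choice;
}

// Example bitrate ladder: with ~4 Mbps measured throughput, "720p" is chosen.
const ladder = [
  { id: '360p', bandwidth: 800_000 },
  { id: '720p', bandwidth: 2_500_000 },
  { id: '1080p', bandwidth: 5_000_000 },
];
console.log(selectRepresentation(ladder, 4_000_000).id); // "720p"
```

Real players refine this basic rule with buffer-level feedback, throughput smoothing, and quality-stability constraints, which are among the topics covered in the tutorial.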
Table of Contents
- HTML5 video and media source extensions
- Survey of well-established streaming solutions
- Multi-bitrate encoding, encapsulation, and encryption workflows
- The MPEG-DASH standard, Apple HLS and the developing CMAF standard
- Common issues in scaling and improving quality, multi-screen/hybrid delivery
- Acquisition, projection, coding and packaging of 360-degree video
- Delivery, decoding and rendering methods
- The developing MPEG-OMAF and MPEG-I standards
- Ongoing industry efforts, specifically towards 6DoF adaptive streaming