ACM Mile High Video 2024 (mhv), Denver, Colorado, February 11-14, 2024

Authors: Daniele Lorenzi (Alpen-Adria-Universität Klagenfurt, Austria), Minh Nguyen (Alpen-Adria-Universität Klagenfurt, Austria), Farzad Tashtarian (Alpen-Adria-Universität Klagenfurt, Austria), and Christian Timmerer (Alpen-Adria-Universität Klagenfurt, Austria)

Abstract: HTTP Adaptive Streaming (HAS) is the de facto solution for delivering video content over the Internet. The climate crisis has highlighted the environmental impact of information and communication technologies (ICT) solutions and the need for green solutions to reduce ICT’s carbon footprint. As video streaming dominates Internet traffic, research in this direction is more vital now than ever. HAS relies on Adaptive BitRate (ABR) algorithms, which dynamically choose suitable video representations to accommodate device characteristics and network conditions. ABR algorithms typically prioritize video quality, ignoring the energy impact of their decisions. Consequently, they often select the video representation with the highest bitrate under good network conditions, thereby increasing energy consumption. This is problematic, especially for energy-limited devices, because it affects the device’s battery life and the user experience. To address these issues, we propose E-WISH, a novel energy-aware ABR algorithm that extends the existing WISH algorithm to consider energy consumption while selecting the quality for the next video segment. According to the experimental findings, E-WISH improves Quality of Experience (QoE) by up to 52% according to the ITU-T P.1203 model (mode 0) while simultaneously reducing energy consumption by up to 12% compared to state-of-the-art approaches.

Keywords: HTTP adaptive streaming, Energy, Adaptive Bitrate (ABR), DASH
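To make the energy-aware selection idea concrete, here is a minimal Python sketch of a weighted-sum ABR decision extended with an energy term, in the spirit of E-WISH. The cost terms, weights, segment duration, and per-representation energy estimates are illustrative assumptions, not the published E-WISH formulation.

```python
# Sketch: choose the next segment's representation by minimizing a weighted cost
# that includes an energy term. All terms and weights are illustrative assumptions.

def select_representation(representations, throughput_bps, buffer_s, weights):
    """Pick the representation with the lowest weighted cost.

    representations: list of dicts with 'bitrate' (bps), 'quality' (e.g., a VMAF-like score),
                     and 'energy' (estimated Joules to download and decode one segment).
    weights: dict with keys 'throughput', 'buffer', 'quality', 'energy'.
    """
    max_quality = max(r["quality"] for r in representations)
    max_energy = max(r["energy"] for r in representations)
    segment_duration = 4.0  # seconds, assumed
    best, best_cost = None, float("inf")
    for r in representations:
        # Throughput cost: penalize bitrates exceeding the measured throughput.
        throughput_cost = max(0.0, (r["bitrate"] - throughput_bps) / throughput_bps)
        # Buffer cost: penalize choices whose download time risks draining the buffer.
        download_time = r["bitrate"] * segment_duration / max(throughput_bps, 1.0)
        buffer_cost = max(0.0, (download_time - buffer_s) / segment_duration)
        # Quality cost: distance from the best available quality.
        quality_cost = (max_quality - r["quality"]) / max_quality
        # Energy cost: normalized estimated energy for this representation.
        energy_cost = r["energy"] / max_energy
        cost = (weights["throughput"] * throughput_cost
                + weights["buffer"] * buffer_cost
                + weights["quality"] * quality_cost
                + weights["energy"] * energy_cost)
        if cost < best_cost:
            best, best_cost = r, cost
    return best

if __name__ == "__main__":
    reps = [
        {"bitrate": 1_000_000, "quality": 60, "energy": 2.0},
        {"bitrate": 4_000_000, "quality": 85, "energy": 5.5},
        {"bitrate": 8_000_000, "quality": 95, "energy": 9.0},
    ]
    w = {"throughput": 1.0, "buffer": 1.0, "quality": 1.0, "energy": 0.5}
    print(select_representation(reps, throughput_bps=5_000_000, buffer_s=10.0, weights=w))
```

With these example numbers, the middle representation wins: the highest bitrate is penalized by both the throughput and energy terms, which is the behavior an energy-aware ABR aims for under good but not unlimited network conditions.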

 

IEEE Transactions on Network and Service Management

Title: ALIVE: A Latency- and Cost-Aware Hybrid P2P-CDN Framework for Live Video Streaming

Authors: Reza Farahani, Ekrem Cetinkaya, Christian Timmerer, Mohammad Shojafar, Mohammad Ghanbari, and Hermann Hellwagner

Abstract: Recent years have witnessed video streaming evolve into one of the most popular Internet applications. With the ever-increasing personalized demands for high-definition and low-latency video streaming services, network-assisted video streaming schemes employing modern networking paradigms have become a promising complementary solution in the HTTP Adaptive Streaming (HAS) context. The emergence of such techniques addresses long-standing challenges of enhancing users’ Quality of Experience (QoE), end-to-end (E2E) latency, as well as network utilization. However, designing a cost-effective, scalable, and flexible network-assisted video streaming architecture that supports the aforementioned requirements for live streaming services is still an open challenge. This article leverages novel networking paradigms, i.e., edge computing and Network Function Virtualization (NFV), and promising video solutions, i.e., HAS, Video Super-Resolution (SR), and Distributed Video Transcoding (TR), to introduce A Latency- and cost-aware hybrId P2P-CDN framework for liVe video strEaming (ALIVE). We first introduce the ALIVE multi-layer architecture and design an action tree that considers all feasible resources (i.e., storage, computation, and bandwidth) provided by peers, edge, and CDN servers for serving peer requests with acceptable latency and quality. We then formulate the problem as a Mixed Integer Linear Programming (MILP) optimization model executed at the edge of the network. To alleviate the optimization model’s high time complexity, we propose a lightweight heuristic, namely the Greedy-Based Algorithm (GBA). Finally, we (i) design and instantiate a large-scale cloud-based testbed including 350 HAS players, (ii) deploy ALIVE on it, and (iii) conduct a series of experiments to evaluate the performance of ALIVE in various scenarios. Experimental results indicate that ALIVE (i) improves users’ QoE by at least 22%, (ii) decreases the incurred cost of the streaming service provider by at least 34%, (iii) shortens clients’ serving latency by at least 40%, (iv) reduces edge server energy consumption by at least 31%, and (v) reduces backhaul bandwidth usage by at least 24% compared to baseline approaches.

Keywords: HTTP Adaptive Streaming (HAS); Edge Computing; Network Function Virtualization (NFV); Content Delivery Network (CDN); Peer-to-Peer (P2P); Quality of Experience (QoE); Video Transcoding; Video Super-Resolution.
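As a rough illustration of how a greedy heuristic such as GBA might assign client requests to the available peer, edge, and CDN resources, consider the Python sketch below. The action names, latency and cost estimates, capacity handling, and the weighted score are illustrative assumptions, not the published GBA or MILP formulation.

```python
# Sketch: greedily assign each request to the feasible serving option with the
# lowest weighted latency/cost score. Options and estimates are invented for illustration.

from dataclasses import dataclass

@dataclass
class Option:
    name: str            # e.g., "peer-fetch", "edge-transcode", "cdn-fetch"
    latency_ms: float    # estimated serving latency
    cost: float          # estimated cost for the streaming service provider
    capacity: int        # how many more requests this resource can absorb

def greedy_assign(requests, options, latency_weight=0.7, cost_weight=0.3):
    """Assign each request to the feasible option with the lowest weighted score."""
    assignment = {}
    for req in requests:
        feasible = [o for o in options if o.capacity > 0]
        if not feasible:
            assignment[req] = None  # everything saturated: reject or fall back
            continue
        best = min(feasible, key=lambda o: latency_weight * o.latency_ms + cost_weight * o.cost)
        best.capacity -= 1
        assignment[req] = best.name
    return assignment

if __name__ == "__main__":
    options = [
        Option("peer-fetch", latency_ms=30, cost=0.0, capacity=2),
        Option("edge-transcode", latency_ms=60, cost=0.5, capacity=3),
        Option("cdn-fetch", latency_ms=90, cost=1.0, capacity=100),
    ]
    print(greedy_assign(["r1", "r2", "r3", "r4"], options))
```

In this toy run, the first two requests are served from peers until their capacity is exhausted, after which the edge transcoder absorbs the rest, mirroring the intuition of preferring cheaper, closer resources before falling back to the CDN.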

IEEE Access, A Multidisciplinary, Open-access Journal of the IEEE

Title: Characterization of the Quality of Experience and Immersion of Point Cloud Video Sequences through a Subjective Study

Authors: Minh Nguyen, Shivi Vats, Sam Van Damme (Ghent University – imec and KU Leuven, Belgium), Jeroen van der Hooft (Ghent University – imec, Belgium), Maria Torres Vega (Ghent University – imec and KU Leuven, Belgium), Tim Wauters (Ghent University – imec, Belgium), Filip De Turck (Ghent University – imec, Belgium), Christian Timmerer, Hermann Hellwagner

Abstract: Point cloud streaming has recently attracted research attention as it has the potential to provide six degrees of freedom movement, which is essential for truly immersive media. The transmission of point clouds requires high-bandwidth connections, and adaptive streaming is a promising solution to cope with fluctuating bandwidth conditions. Thus, understanding the impact of different factors in adaptive streaming on the Quality of Experience (QoE) becomes fundamental. Point clouds have been evaluated in Virtual Reality (VR), where viewers are completely immersed in a virtual environment. Augmented Reality (AR) is a novel technology and has recently become popular, yet quality evaluations of point clouds in AR environments are still limited to static images.

In this paper, we perform a subjective study of four impact factors on the QoE of point cloud video sequences in AR conditions, including encoding parameters (quantization parameters, QPs), quality switches, viewing distance, and content characteristics. The experimental results show that these factors significantly impact the QoE. The QoE decreases if the sequence is encoded at high QPs and/or switches to lower quality and/or is viewed at a shorter distance, and vice versa. Additionally, the results indicate that the end user is not able to distinguish the quality differences between two quality levels at a specific (high) viewing distance. An intermediate-quality point cloud encoded at geometry QP (G-QP) 24 and texture QP (T-QP) 32 and viewed at 2.5 m can have a QoE (i.e., a score of 6.5 out of 10) comparable to a high-quality point cloud encoded at 16 and 22 for G-QP and T-QP, respectively, and viewed at a distance of 5 m. Regarding content characteristics, objects with lower contrast can yield better quality scores. Participants’ responses reveal that the visual quality of point clouds has not yet reached the desired level of immersion; the average QoE of the highest visual quality is less than 8 out of 10. There is also a good correlation between objective metrics (e.g., color Peak Signal-to-Noise Ratio (PSNR) and geometry PSNR) and the QoE score; in particular, the Pearson correlation coefficient for color PSNR is 0.84. Finally, we found that machine learning models are able to accurately predict the QoE of point clouds in AR environments.

The subjective test results and questionnaire responses are available on GitHub: https://github.com/minhkstn/QoE-and-Immersion-of-Dynamic-Point-Cloud.
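The finding that machine learning models can predict the QoE of point clouds in AR suggests a straightforward supervised-regression setup. The sketch below, using synthetic data and an off-the-shelf random forest, only illustrates the idea; the feature set, data, and model choice are assumptions rather than the study’s actual pipeline.

```python
# Sketch: predict a 0-10 QoE score from point cloud streaming features.
# Data is synthetic and only mimics the reported trends (lower QPs, larger viewing
# distance, and higher PSNR tend to yield higher scores).

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)
n = 200
# Features: geometry QP, texture QP, viewing distance (m), color PSNR (dB).
X = np.column_stack([
    rng.integers(16, 40, n),        # G-QP
    rng.integers(22, 48, n),        # T-QP
    rng.uniform(1.0, 5.0, n),       # viewing distance
    rng.uniform(25.0, 45.0, n),     # color PSNR
])
# Synthetic target on a 0-10 scale with mild noise.
y = np.clip(10 - 0.1 * X[:, 0] - 0.08 * X[:, 1] + 0.4 * X[:, 2] + 0.1 * X[:, 3]
            + rng.normal(0, 0.3, n), 0, 10)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("MAE:", mean_absolute_error(y_test, model.predict(X_test)))
```

Replacing the synthetic arrays with the published subjective scores and the corresponding encoding, distance, and PSNR features would turn this toy into an actual reproduction attempt.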

Sebastian Uitz and Michael Steinkellner presented their highly anticipated game, “A Webbing Journey,” at the biggest gaming event in Austria, the Game City in Vienna, from October 13th to 15th, 2023. This event was a bustling hub of innovation, bringing together game developers and enthusiasts from near and far. It offered a remarkable opportunity to connect with fellow developers and immerse themselves in a world of fantastic games from other indie developers and big publishers. 
Nestled within the heart of Game City, our booth provided a gateway into the captivating universe of “A Webbing Journey.” Attendees of all ages were invited to step into the eight-legged shoes of our adventurous spider, experiencing the game’s enchanting storyline and unique gameplay mechanics. Our setup, equipped with a laptop, a Steam Deck, and a Nintendo Switch, allowed players to traverse the spider’s wondrous journey, leaving no web unspun. 
One of the event’s highlights was our engaging interview with the FM4 radio channel. This platform provided an excellent opportunity to share the inspiration behind “A Webbing Journey,” explore the game’s captivating features, and show off the newest level in our game. We were thrilled to offer a glimpse into the game’s development process and reveal the magic that makes our project so unique.

Authors: Gregor Molan, Gregor Dolinar, Jovan Bojkovski, Radu Prodan, Andrea Borghesi, Martin Molan

Journal: IEEE Access

Purpose: The gap between software development requirements and the available resources of software developers continues to widen. This widening gap requires changes in how software development is organized and managed.

Objectives: We present a model introducing a quantitative software development management methodology that estimates the relative importance of functionalities and the risk of retaining or abandoning them, which determines the final value of the software product.

Method: The final value of the software product is interpreted as a function of the requirements and functionalities, represented as a computational graph (called a software product graph). The software product graph allows the relative importance of functionalities to be estimated by calculating the corresponding partial derivatives of the value function. The risk of not implementing a functionality is estimated as the resulting reduction in the final value of the product.
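As an illustration of the idea behind the software product graph, the following Python sketch encodes a toy value function over functionality completion levels, estimates each functionality’s relative importance via numerical partial derivatives, and reads its risk as the value lost when it is dropped. The value function, weights, and AND/OR structure are invented for illustration and are not the model from the paper.

```python
# Sketch: relative importance of functionalities as partial derivatives of a toy
# value function over a small "software product graph". All numbers are assumptions.

def product_value(f):
    """Toy value function over functionality completion levels f[i] in [0, 1].
    Requirement A needs f0 AND f1; requirement B is served by f2 OR f3."""
    req_a = min(f[0], f[1])           # both functionalities needed
    req_b = max(f[2], f[3])           # either functionality suffices
    return 0.7 * req_a + 0.3 * req_b  # weighted contribution to the product value

def relative_importance(value_fn, f, eps=1e-3):
    """Numerical partial derivative of the value w.r.t. each functionality."""
    base = value_fn(f)
    grads = []
    for i in range(len(f)):
        bumped = list(f)
        bumped[i] = min(1.0, bumped[i] + eps)
        grads.append((value_fn(bumped) - base) / eps)
    return grads

f = [0.8, 0.6, 0.4, 0.9]  # current completion of four functionalities
print(relative_importance(product_value, f))
# Risk of abandoning functionality i: value lost when its completion drops to zero.
print([product_value(f) - product_value(f[:i] + [0.0] + f[i + 1:]) for i in range(len(f))])
```

In this toy example the partial derivatives single out the bottleneck functionality of the AND-requirement as the most important, while dropping a functionality that is covered by an alternative costs nothing, which is the kind of ranking the methodology uses to decide what to retain and what to abandon.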

Validation: This model has been applied to two EU projects: CareHD and vINCI. In vINCI, the functionalities with the most significant added value to the application were developed based on the implemented model, and those that brought the least value were abandoned. Optimization was not implemented in the CareHD project, which proceeded as initially designed. Consequently, only 71% of CareHD’s potential value has been realized.

Conclusions: The presented model enables rational management and organization of software product development, with real-time quantitative evaluation of the impact of functionalities and assessment of the risk of omitting them without a significant loss of value. A quantitative evaluation of the impacts and risks of retaining or abandoning functionalities is possible based on the proposed algorithm, which is the core of the model. The model is thus a tool for the rational organization and development of software products.

Special Issue on Sustainable Multimedia Communications and Services, IEEE COMSOC MMTC Communications – Frontiers

Title: Towards Low-Latency and Energy-Efficient Hybrid P2P-CDN Live Video Streaming

Authors: Reza Farahani, Christian Timmerer, and Hermann Hellwagner

Abstract: Streaming segmented videos over the Hypertext Transfer Protocol (HTTP) is an increasingly popular approach in both live and video-on-demand (VoD) applications. However, designing a scalable and adaptable framework that reduces servers’ energy consumption and supports low-latency and high-quality services, particularly for live video streaming scenarios, is still challenging for Over-The-Top (OTT) service providers. To address such challenges, this paper introduces a new hybrid P2P-CDN framework that leverages new networking and computing paradigms, i.e., Network Function Virtualization (NFV) and edge computing, for live video streaming. The proposed framework introduces a multi-layer architecture and a tree of possible actions therein (an action tree), taking into account all available resources from peers, edge, and CDN servers to efficiently distribute video fetching and transcoding tasks across a hybrid P2P-CDN network, consequently reducing users’ latency and enhancing video quality. We also discuss our testbed designed to validate the framework and compare it with baseline methods. The experimental results indicate that the proposed framework improves user Quality of Experience (QoE), reduces client serving latency, and reduces edge server energy consumption compared to baseline approaches.
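To give a feel for how an action tree over peer, edge, and CDN resources might drive a serving decision, here is a small Python sketch that picks the lowest-energy action meeting a latency budget. The tree, the latency and energy estimates, and the selection rule are illustrative assumptions rather than the framework’s actual algorithm.

```python
# Sketch: an "action tree" for serving one live segment request in a hybrid
# P2P-CDN setting. Each leaf is a (source, action) pair with estimated latency
# and energy; all numbers below are invented for illustration.

ACTION_TREE = {
    "peer": {"fetch-cached": {"latency_ms": 25, "energy_j": 0.2}},
    "edge": {
        "fetch-cached": {"latency_ms": 40, "energy_j": 0.5},
        "transcode":    {"latency_ms": 70, "energy_j": 1.2},  # transcode a higher quality down
    },
    "cdn": {"fetch-origin": {"latency_ms": 110, "energy_j": 0.8}},
}

def choose_action(tree, available, latency_budget_ms):
    """Pick the lowest-energy action among available sources that meets the latency budget."""
    candidates = []
    for source, actions in tree.items():
        if source not in available:
            continue
        for action, est in actions.items():
            if est["latency_ms"] <= latency_budget_ms:
                candidates.append((est["energy_j"], est["latency_ms"], source, action))
    if not candidates:
        # No action meets the budget: fall back to the overall lowest-latency available action.
        fallback = min(
            (est["latency_ms"], source, action)
            for source, actions in tree.items() if source in available
            for action, est in actions.items()
        )
        return fallback[1], fallback[2]
    best = min(candidates)
    return best[2], best[3]

print(choose_action(ACTION_TREE, available={"edge", "cdn"}, latency_budget_ms=80))
```

With peers unavailable and an 80 ms budget, the sketch serves the request from the edge cache rather than transcoding or going to the CDN, reflecting the trade-off between latency and energy that the framework targets.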


Title: ARTICONF Decentralized Social Media Platform for Democratic Crowd Journalism

Authors: Ines Rito Lima, Vasco Filipe, Claudia Marinho, Alexandre Ulisses, Antorweep Chakravorty, Atanas Hristov, Nishant Saurabh, Zhiming Zhao, Ruyue Xin, Radu Prodan

Journal: Social Network Analysis and Mining (https://www.springer.com/journal/13278)

Abstract: Media production and consumption behaviors are changing in response to new technologies and demands, giving birth to a new generation of social applications. Among them, crowd journalism represents a novel way of constructing democratic and trustworthy news that relies on ordinary citizens arriving at breaking news locations and capturing relevant videos using their smartphones. The ARTICONF project proposes a trustworthy, resilient, and globally sustainable toolset for developing decentralized applications (DApps) to address this need. Its goal is to overcome the privacy, trust, and autonomy-related concerns associated with proprietary social media platforms flooded with fake news.

Leveraging the ARTICONF tools, we introduce a new DApp for crowd journalism called MOGPlay. MOGPlay collects and manages audio-visual content generated by citizens and provides a secure blockchain platform that rewards all stakeholders involved in professional news production.

Besides live streaming, MOGPlay offers a marketplace for audio-visual content trading among citizens and freelance journalists with an internal token ecosystem. We discuss the functionality and implementation of the MOGPlay DApp and illustrate four pilot crowd journalism live scenarios that validate the prototype.

Authors: Juanjuan Li, Rui Qin, Cristina Olaverri-Monreal, Radu Prodan, Fei-Yue Wang

Journal: IEEE Transactions on Intelligent Vehicles

Abstract: As part of TIV’s DHW on Vehicle 5.0, this letter introduces a novel concept, Logistics 5.0, to address the high complexities of logistics Cyber-Physical-Social Systems (CPSS). Building upon the theory of parallel intelligence and leveraging advanced technologies and methods such as blockchain, scenarios engineering, and Decentralized Autonomous Organizations and Operations (DAOs), Logistics 5.0 promises to accelerate the paradigm shift towards intelligent and sustainable logistics. First, the parallel logistics framework is proposed, and the logistics ecosystem is discussed. Then, human-oriented operating systems (HOOS) are suggested to provide intelligent Logistics 5.0 solutions. Logistics 5.0 serves as a critical catalyst in realizing the “6S” objectives, i.e., Safety, Security, Sustainability, Sensitivity, Service, and Smartness, within the logistics industry.