SIGMM Records Column

Samira Afzal (Alpen-Adria-Universität (AAU) Klagenfurt, Austria), Radu Prodan (Alpen-Adria-Universität (AAU) Klagenfurt, Austria), Christian Timmerer (Alpen-Adria-Universität (AAU) Klagenfurt and Bitmovin Inc., Austria)

Introduction:

According to the 2021 Intergovernmental Panel on Climate Change (IPCC) report and Sustainable Development Goal (SDG) 13, “climate action”, urgent measures are needed against climate change and global greenhouse gas (GHG) emissions within the next few years [1]. This urgency also applies to the energy consumption of digital technologies. Internet data traffic is responsible for more than half of digital technology’s global impact, accounting for 55% of its annual energy consumption. The Shift Project [2] forecasts a 25% annual increase in data traffic associated with 9% more energy consumption per year, reaching 8% of all GHG emissions by 2025.

Video flows represented 80% of global data flows in 2018, and this video data volume is increasing by 80% annually [2]. This exponential growth in streaming video is due to (i) improvements in Internet connections and service offerings [3], (ii) the rapid development of video entertainment (e.g., video games and cloud gaming services), (iii) the deployment of Ultra High-Definition (UHD, 4K, 8K), Virtual Reality (VR), and Augmented Reality (AR), and (iv) an increasing number of video surveillance and IoT applications [4]. Notably, video processing and streaming generate 306 million tons of CO2, which amounts to 20% of digital technology’s total GHG emissions and nearly 1% of worldwide GHG emissions [2].

Research has shown that the carbon footprint of video streaming has been decreasing in recent years [5]. This reduction is due to efficiency trends in cloud computing (e.g., renewable power), modern mobile networks (e.g., growing Internet speeds), and end-user devices (e.g., users preferring less energy-intensive mobile and tablet devices over larger PCs and laptops). Nevertheless, since the demand for video streaming is growing dramatically, the risk of increased energy consumption remains, and there is still a strong need to invest in research and development of efficient next-generation computing and communication technologies for video processing.

Investigating energy efficiency during video streaming is essential to developing sustainable video technologies. Every step, from video encoding to decoding and displaying the video on the end user’s screen, requires electricity and thus results in CO2 emissions. Consequently, the key question becomes: “How can we improve energy efficiency for video streaming systems while maintaining an acceptable Quality of Experience (QoE)?”

 

37th IEEE International Parallel & Distributed Processing Symposium (IPDPS), May 15-19, 2023, Florida, USA

Authors: Zahra Najafabadi Samani (Alpen-Adria-Universität Klagenfurt, Austria), Narges Mehran (Alpen-Adria-Universität Klagenfurt, Austria), Dragi Kimovski (Alpen-Adria-Universität Klagenfurt, Austria), Radu Prodan (Alpen-Adria-Universität Klagenfurt, Austria)

Abstract: The accelerating growth of modern distributed applications with low delivery deadlines leads to a paradigm shift towards the multi-tier computing continuum. However, the geographical dispersion, heterogeneity, and availability of continuum resources may result in failures and quality-of-service degradation, significantly negating its advantages and lowering user satisfaction. In this paper, we propose PROS, a proactive application placement method relying on distributed coordination to prevent quality-of-service violations through service-level agreements on the computing continuum. PROS employs a sigmoid function with adaptive weights for the different parameters to predict the service-level agreement assurance of devices based on their past credentials and current capabilities. We evaluate PROS using two application workloads with different traffic stress levels of up to 90 million services on a real testbed with 600 heterogeneous instances deployed over eight geographical locations. The results show that PROS increases the success rate by 7-33%, reduces the response time by 16-38%, and increases the deadline satisfaction rate by 19-42% compared to two related-work methods. A comprehensive simulation study with 1000 devices and a workload of up to 670 million services confirms the scalability of the results.
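To make the prediction step more concrete, the following is a minimal Python sketch of a sigmoid-based SLA-assurance score with adaptive weights. It is an illustration under assumed feature names, an assumed weight-update rule, and invented example values, not the PROS implementation.

```python
import math

def sla_assurance(features: dict[str, float], weights: dict[str, float]) -> float:
    """Map a device's weighted features (past credentials, current capabilities)
    to an SLA-assurance score in (0, 1) via a logistic sigmoid."""
    z = sum(weights[name] * value for name, value in features.items())
    return 1.0 / (1.0 + math.exp(-z))

def update_weights(weights, features, kept_sla: bool, lr: float = 0.1):
    """Assumed adaptation rule: nudge weights up when the device kept its SLA,
    down when it violated it."""
    sign = 1.0 if kept_sla else -1.0
    return {name: w + sign * lr * features[name] for name, w in weights.items()}

# Hypothetical device: past reliability (credential) and free CPU share (capability).
device = {"past_success_rate": 0.92, "free_cpu": 0.4}
weights = {"past_success_rate": 2.0, "free_cpu": 1.5}
print(f"assurance = {sla_assurance(device, weights):.3f}")
weights = update_weights(weights, device, kept_sla=True)
```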

HiPEAC magazine https://www.hipeac.net/news/#/magazine/

HiPEACINFO 68, pages 27-28.

Authors: Dragi Kimovski (Alpen-Adria-Universität Klagenfurt, Austria), Narges Mehran (Alpen-Adria-Universität Klagenfurt, Austria), Radu Prodan (Alpen-Adria-Universität Klagenfurt, Austria), Souvik Sengupta (iExec Blockchain Tech, France), Anthony Simonet-Boulgone (iExec Blockchain Tech, France), Ioannis Plakas (UBITECH, Greece), Giannis Ledakis (UBITECH, Greece), and Dumitru Roman (University of Oslo and SINTEF AS, Norway)

Abstract: Modern big-data pipeline applications, such as machine learning, encompass complex workflows for real-time data gathering, storage, and analysis. Big-data pipelines often have conflicting requirements, such as low communication latency and high computational speed. These require different kinds of computing resources, from cloud to edge, distributed across multiple geographical locations – in other words, the computing continuum. The Horizon 2020 DataCloud project is creating a novel paradigm for big-data pipeline processing over the computing continuum, covering the complete lifecycle of big-data pipelines. To overcome the runtime challenges associated with automating big-data pipeline processing on the computing continuum, we’ve created the DataCloud architecture. By separating the discovery, definition, and simulation of big-data pipelines from runtime execution, this architecture empowers domain experts with little infrastructure or software knowledge to take an active part in defining big-data pipelines.

This work received funding from the DataCloud European Union’s Horizon 2020 research and innovation programme under grant agreement no. 101016835.

IEEE International Conference on Communications (ICC)

28 May – 01 June 2023– Rome, Italy

Conference Website

Reza Farahani (Alpen-Adria-Universität Klagenfurt), Abdelhak Bentaleb (Concordia University, Canada), Christian Timmerer (Alpen-Adria-Universität Klagenfurt), Mohammad Shojafar (University of Surrey, UK), Radu Prodan (Alpen-Adria-Universität Klagenfurt), and Hermann Hellwagner (Alpen-Adria-Universität Klagenfurt)

Abstract: 5G and 6G networks are expected to support various novel emerging adaptive video streaming services (e.g., live, VoD, immersive media, and online gaming) with versatile Quality of Experience (QoE) requirements such as high bitrate, low latency, and sufficient reliability. It is widely agreed that these requirements can be satisfied by adopting emerging networking paradigms like Software-Defined Networking (SDN), Network Function Virtualization (NFV), and edge computing. Previous studies have leveraged these paradigms to present network-assisted video streaming frameworks, but mostly in isolation without devising chains of Virtualized Network Functions (VNFs) that consider the QoE requirements of various types of Multimedia Services (MS).

To bridge the aforementioned gaps, we first introduce a set of multimedia VNFs at the edge of an SDN-enabled network and form diverse Service Function Chains (SFCs) based on the QoE requirements of different MS services. We then propose SARENA, an SFC-enabled ArchitectuRe for adaptive VidEo StreamiNg Applications. Next, we formulate the problem as a central scheduling optimization model executed at the SDN controller. We also present a lightweight heuristic solution consisting of two phases that run on the SDN controller and edge servers to alleviate the time complexity of the optimization model in large-scale scenarios. Finally, we design a large-scale cloud-based testbed, including 250 HTTP Adaptive Streaming (HAS) players requesting two popular MS applications (i.e., live and VoD), conduct various experiments, and compare SARENA’s effectiveness with baseline systems. Experimental results illustrate that SARENA outperforms baseline schemes in terms of users’ QoE by at least 39.6%, latency by 29.3%, and network utilization by 30% in both MS services.

Index Terms: HAS; DASH; NFV; SFC; SDN; Edge Computing.
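As a rough illustration of what such a lightweight edge scheduling heuristic might look like, here is a minimal Python sketch that greedily maps streaming requests to the lowest-latency edge server with spare capacity. The server names, capacities, and latencies are invented, and this is not SARENA’s actual algorithm.

```python
def assign_requests(requests, servers):
    """Greedily place each request on the lowest-latency server that still has
    enough remaining capacity; fall back to the cloud when no edge server fits."""
    placement = {}
    for req in requests:
        candidates = sorted(servers, key=lambda s: s["latency_ms"])
        chosen = next((s for s in candidates if s["free_capacity"] >= req["demand"]), None)
        if chosen is None:
            placement[req["id"]] = "cloud"  # no edge capacity left
        else:
            chosen["free_capacity"] -= req["demand"]
            placement[req["id"]] = chosen["name"]
    return placement

# Hypothetical edge servers and streaming requests (live and VoD sessions).
servers = [
    {"name": "edge-1", "latency_ms": 5, "free_capacity": 100},
    {"name": "edge-2", "latency_ms": 12, "free_capacity": 200},
]
requests = [{"id": "live-1", "demand": 80}, {"id": "vod-1", "demand": 150}]
print(assign_requests(requests, servers))  # {'live-1': 'edge-1', 'vod-1': 'edge-2'}
```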

 

Where does technology help us in our daily lives?

Interview with Felix Schniz, Game Studies and Engineering SPL @ ITEC

 

We meet Felix Schniz for an interview in Lakeside Park, in the CD laboratory ATHENA, building B12B, to learn something about him and his work and why he chose his career. For those who don’t yet know Felix: he is always neatly dressed, has a smile on his lips, and is eager for a mutual exchange of ideas and opinions. So, he was quick to accept the invitation to be the first person featured in our new “People Behind Informatics” series. He is passionate about his work and is happy to share his views with us.

 

Hello Felix, thanks for taking the time to talk to us. Please tell me something about yourself, where you come from, and how your professional career has evolved.

I was born in Bietigheim-Bissingen near Stuttgart. I studied in Mannheim, focusing my Bachelor’s degree on English and American Studies. For my Master’s, I specialized in culture in the process of modernity. In addition to literature and film, we also dealt with digitization processes, and that is how I came to the area of video games. That was my “unusual entry” into the technical sciences. After my Master’s degree, it was clear to me: I wanted to write a doctoral thesis on video games. The academic path simply suits me, and the topic offers many exciting perspectives, as it is still largely unexplored. While searching for the right environment for such a research project, I met René Schallegger at a conference in Oxford, and we stayed in contact. When a vacancy for a university assistant was advertised at the Department of English in 2016, I applied for the position, started my doctorate at the same time, and have been here ever since.

 

Such a coincidence, and very lucky that you found exactly what you were looking for. How was your start at the University of Klagenfurt?

I started immediately and also took on the role of SPL (programme director) of the Master’s degree in “Game Studies and Engineering“, which combines both humanities and technical aspects. This is also what is special about this programme: the students learn technical approaches to video games and what role this technical medium plays in society.

 

What do you particularly like about your work?

I am taken seriously and can combine my passion for technology and the humanities. I love to ask questions: What is the reason for that, what is behind it, and what else needs to be considered? I can live that out to the full in my work.

 

And how did your doctorate continue?

In my doctorate, I asked the research question of what a video game experience actually is. It is not easy to pin down and has to be illuminated from many sides: philosophically, psychologically, sociologically, and from a media-studies perspective. The path leads from one’s own personal experience to the technical implementation. I wrote up the theoretical foundations, worked with content analyses, and scientifically processed my own experiences. This opened up a new, exciting field of questions for me and for research on video games: how can we speak scientifically about the content of the medium when we experience it in such a personal way?

 

What consensus emerged for you?

Video games help us to get a bigger, better picture of people in the digital age. We have to ask ourselves what influence video games can and should have in the future, and we need to raise awareness of the responsibility video game programmers have. Programmers should also ask themselves what they want to offer people. The virtual worlds that video games open up can offer us a lot, but we have to learn how to deal with them.

In short, I have to ask myself: What do I want to achieve with technology? What role should it play in my life?

Over the past few years, we have been able to observe what role virtual worlds can play in people’s lives. The well-known video game “Fortnite”, for example, was suddenly not just a popular game but also a much-needed social meeting point and a retreat for young people whose social and private spaces had been taken away by the pandemic.

Video games can be of great importance for each of us. They can offer us things we need emotionally, socially, or intellectually, or allow us to explore ourselves. This does not mean that the virtual should replace the real world – but it can be a great addition to it. To continue pursuing these thoughts in a more focused way, I have also written a lot about coping with grief alongside my doctoral thesis. I am currently working on a book about the spiritual experience of interactive media in general, which will be published later this year.

 

Thank you very much for inviting us into your interesting area of work. We wish you a lot of joy and success in your favourite research area.

Journal Website: Journal of Network and Computer Applications

[PDF]

Samira Afzal (Alpen-Adria-Universität Klagenfurt), Vanessa Testoni (unico IDtech), Christian Esteve Rothenberg (University of Campinas), Prakash Kolan (Samsung Research America), and Imed Bouazizi (Qualcomm)

Abstract:

Demand for wireless video streaming services increases as users expect access to high-quality video streaming experiences. Ensuring Quality of Experience (QoE) is quite challenging due to varying bandwidth and time constraints. Since most of today’s mobile devices are equipped with multiple network interfaces, one promising approach is to benefit from multipath communications. Multipathing leads to higher aggregate bandwidth, and distributing video traffic over multiple network paths improves stability, seamless connectivity, and QoE. However, most current transport protocols do not match the requirements of video streaming applications or are not designed to address relevant issues, such as network heterogeneity, head-of-line blocking, and delay constraints. In this comprehensive survey, we first review video streaming standards and technology developments. We then discuss the benefits and challenges of multipath video transmission over wireless networks. We provide a holistic literature review of multipath wireless video streaming, shedding light on the different alternatives from an end-to-end layered stack perspective, reviewing key multipath wireless scheduling functions, unveiling trade-offs of each approach, and presenting a suitable taxonomy to classify the state-of-the-art. Finally, we discuss open issues and avenues for future work.
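To give a flavour of the scheduling functions such surveys review, below is a minimal Python sketch of one common idea: splitting a video segment across paths in proportion to each path’s estimated bandwidth, preferring lower-delay paths. The path names, bandwidth and RTT values, and the exact splitting rule are illustrative assumptions, not taken from the survey.

```python
def split_segment(segment_bytes: int, paths: list[dict]) -> dict[str, int]:
    """Return how many bytes of one video segment to send over each path,
    proportionally to the path's estimated available bandwidth."""
    total_bw = sum(p["bandwidth_mbps"] for p in paths)
    shares, assigned = {}, 0
    for p in sorted(paths, key=lambda p: p["rtt_ms"]):  # visit low-delay paths first
        share = int(segment_bytes * p["bandwidth_mbps"] / total_bw)
        shares[p["name"]] = share
        assigned += share
    # Give any rounding remainder to the lowest-RTT path.
    fastest = min(paths, key=lambda p: p["rtt_ms"])["name"]
    shares[fastest] += segment_bytes - assigned
    return shares

# Hypothetical paths: Wi-Fi and LTE interfaces of the same device.
paths = [{"name": "wifi", "bandwidth_mbps": 40, "rtt_ms": 20},
         {"name": "lte", "bandwidth_mbps": 20, "rtt_ms": 45}]
print(split_segment(2_000_000, paths))  # byte split of a 2 MB segment
```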

 

Collaborative Edge-Assisted Systems for HTTP Adaptive Video Streaming

5G/6G Innovation Center, University of Surrey, UK

6th January 2023 | Guildford, UK

Abstract: The proliferation of novel video streaming technologies, advancement of networking paradigms, and steadily increasing numbers of users who prefer to watch video content over the Internet rather than using classical TV have made video the predominant traffic on the Internet. However, designing cost-effective, scalable, and flexible architectures that support low-latency and high-quality video streaming is still a challenge for both over-the-top (OTT) and ISP companies. In this talk, we first introduce the principles of video streaming and the existing challenges. We then review several 5G/6G networking paradigms and explain how we can leverage networking technologies to form collaborative network-assisted video streaming systems for improving users’ quality of experience (QoE) and network utilization.

 

 

Reza Farahani is a final-year Ph.D. candidate at the University of Klagenfurt, Austria, and a visiting Ph.D. student at the University of Surrey, UK. He received his B.Sc. from the University of Isfahan, Iran, in 2014 and his M.Sc. from the University of Tehran, Iran, in 2019. Currently, he is working on the ATHENA project in cooperation with its industry partner Bitmovin. His research focuses on designing modern network-assisted video streaming solutions (via SDN, NFV, MEC, SFC, and P2P paradigms), multimedia communication, computing continuum challenges, and parallel and distributed systems. He has also worked in various roles in the computer networking field, e.g., network administrator, ISP customer support engineer, Cisco network engineer, network protocol designer, network programmer, and Cisco instructor (R&S, SP).

Journal: Sensors

Authors: Akif Quddus Khan, Nikolay Nikolov, Mihhail Matskin, Radu Prodan, Dumitru Roman, Bekir Sahin, Christoph Bussler, Ahmet Soylu

Abstract: Big data pipelines are developed to process data characterized by one or more of the three big data features, commonly known as the three Vs (volume, velocity, and variety), through a series of steps (e.g., extract, transform, and move), laying the groundwork for the use of advanced analytics and ML/AI techniques. The computing continuum (i.e., cloud/fog/edge) allows access to a virtually infinite amount of resources, where data pipelines could be executed at scale; however, the implementation of data pipelines on the continuum is a complex task that needs to take computing resources, data transmission channels, triggers, data transfer methods, integration of message queues, etc., into account. The task becomes even more challenging when data storage is considered as part of the data pipelines. Local storage is expensive, hard to maintain, and comes with several challenges (e.g., data availability, data security, and backup). The use of cloud storage, i.e., storage-as-a-service (StaaS), instead of local storage has the potential to provide more flexibility in terms of scalability, fault tolerance, and availability. In this article, we propose a generic approach to integrate StaaS with data pipelines (i.e., computation runs on an on-premises server or on a specific cloud, while storage is integrated via StaaS) and develop a ranking method for available storage options based on five key parameters: cost, proximity, network performance, server-side encryption, and user weights/preferences. The evaluation carried out demonstrates the effectiveness of the proposed approach in terms of data transfer performance, utility of the individual parameters, and feasibility of dynamic selection of a storage option based on four primary user scenarios.
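As an illustration of the kind of weighted ranking described above, here is a minimal Python sketch that scores storage options on cost, proximity, network performance, and server-side encryption and combines them with user weights. The normalization, field names, and example values are assumptions for illustration, not the paper’s exact method.

```python
def rank_storage(options: list[dict], weights: dict[str, float]) -> list[tuple[str, float]]:
    """Return storage options sorted by descending weighted score
    (all numeric inputs assumed normalized to [0, 1])."""
    def score(opt):
        return (weights["cost"] * (1.0 - opt["cost_norm"])             # cheaper is better
                + weights["proximity"] * (1.0 - opt["distance_norm"])  # closer is better
                + weights["network"] * opt["throughput_norm"]          # faster is better
                + weights["encryption"] * (1.0 if opt["sse"] else 0.0))
    return sorted(((o["name"], score(o)) for o in options), key=lambda x: x[1], reverse=True)

# Hypothetical user preferences and storage options.
weights = {"cost": 0.4, "proximity": 0.2, "network": 0.3, "encryption": 0.1}
options = [
    {"name": "bucket-eu", "cost_norm": 0.3, "distance_norm": 0.2, "throughput_norm": 0.8, "sse": True},
    {"name": "bucket-us", "cost_norm": 0.2, "distance_norm": 0.7, "throughput_norm": 0.6, "sse": False},
]
print(rank_storage(options, weights))
```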

Every year, Carinthia celebrates its cultural and scientific greats by awarding a total of 13 prizes based on proposals from the Carinthian Cultural Board. This year, Hermann Hellwagner received one of the three appreciation prizes in the natural and technical sciences category. Congratulations! Further information: https://www.aau.at/blog/kulturpreise-des-landes-kaernten-fuer-hermann-hellwagner-roswitha-rissner-und-wolfgang-puschnig/