Author

Bob Lawlor

Bio: Bob Lawlor is an academic researcher from the Hamilton Institute. The author has contributed to research topics including White paper and Quality of experience. The author has an h-index of 1, having co-authored 1 publication receiving 617 citations.

Papers
12 Mar 2013
TL;DR: The concepts and ideas cited in this paper mainly refer to the Quality of Experience of multimedia communication systems but may also be helpful for other areas where QoE is an issue; given the large number of contributors, authors, and editors, the document does not reflect the opinion of each individual at all points.
Abstract: This White Paper is a contribution of the European Network on Quality of Experience in Multimedia Systems and Services, Qualinet (COST Action IC 1003, see www.qualinet.eu), to the scientific discussion about the term "Quality of Experience" (QoE) and its underlying concepts. It resulted from the need to agree on a working definition for this term that facilitates the communication of ideas within a multidisciplinary group, where a joint interest in multimedia communication systems exists, albeit approached from different perspectives. Thus, the concepts and ideas cited in this paper mainly refer to the Quality of Experience of multimedia communication systems, but may also be helpful for other areas where QoE is an issue. The Network of Excellence (NoE) Qualinet aims to extend the notion of network-centric Quality of Service (QoS) in multimedia systems by relying on the concept of Quality of Experience (QoE). The main scientific objective is the development of methodologies for subjective and objective quality metrics taking into account current and new trends in multimedia communication systems, as witnessed by the appearance of new types of content and interactions. A substantial scientific impact on fragmented efforts carried out in this field will be achieved by coordinating the research of European experts under the catalytic COST umbrella. The White Paper has been compiled on the basis of a first open call for ideas which was launched for the February 2012 Qualinet Meeting held in Prague, Czech Republic. The ideas were presented as short statements during that meeting, reflecting the ideas of the persons listed under the headline "Contributors" in the previous section. During the Prague meeting, the ideas were further discussed and consolidated into a general structure for the present document. An open call for authors was issued at that meeting, to which the persons listed as "Authors" in the previous section announced their willingness to contribute to the preparation of individual sections. For each section, a coordinating author was assigned who coordinated the writing of that section and who is underlined in the author list preceding each section. The individual sections were then integrated and aligned by an editing group (listed as "Editors" in the previous section), and the entire document was iterated with the entire group of authors. Furthermore, the draft text was discussed with the participants of the Dagstuhl Seminar 12181 "Quality of Experience: From User Perception to Instrumental Metrics", which was held in Schloss Dagstuhl, Germany, May 1-4, 2012, and a number of changes were proposed, resulting in the present document. As a result of the writing process and the large number of contributors, authors and editors, the document does not reflect the opinion of each individual person at all points. Still, we hope it will be useful for everybody working in the field of Quality of Experience of multimedia communication systems, and most probably also beyond that field.

686 citations


Cited by

Journal ArticleDOI
TL;DR: This paper reviews the technical development of HAS, existing open standardized solutions, and proprietary solutions as a foundation for deriving the QoE influence factors that emerge as a result of adaptation.
Abstract: Changing network conditions pose severe problems to video streaming in the Internet. HTTP adaptive streaming (HAS) is a technology employed by numerous video services that relieves these issues by adapting the video to the current network conditions. It enables service providers to improve resource utilization and Quality of Experience (QoE) by incorporating information from different layers in order to deliver and adapt a video in its best possible quality. Thereby, it allows taking into account end user device capabilities, available video quality levels, current network conditions, and current server load. For end users, the major benefits of HAS compared to classical HTTP video streaming are reduced interruptions of the video playback and higher bandwidth utilization, which both generally result in a higher QoE. Adaptation is possible by changing the frame rate, resolution, or quantization of the video, which can be done with various adaptation strategies and related client- and server-side actions. This paper reviews the technical development of HAS, existing open standardized solutions, and proprietary solutions as a foundation for deriving the QoE influence factors that emerge as a result of adaptation. The main contribution is a comprehensive survey of QoE-related works from the human-computer interaction and networking domains, which are structured according to the QoE impact of video adaptation. To be more precise, subjective studies that cover QoE aspects of adaptation dimensions and strategies are revisited. As a result, QoE influence factors of HAS and corresponding QoE models are identified, but open issues and conflicting results are also discussed. Furthermore, technical influence factors, which are often ignored in the context of HAS, affect perceptual QoE influence factors and are consequently analyzed. This survey gives the reader an overview of the current state of the art and recent developments. At the same time, it targets networking researchers who develop new solutions for HTTP video streaming or assess video streaming from a user-centric point of view. Therefore, this paper is a major step toward truly improving HAS.
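Since the abstract describes adaptation as selecting among available quality levels based on current network conditions, a minimal sketch of a throughput-based quality selector may help illustrate the idea. The bitrate ladder, smoothing factor, and safety margin below are illustrative assumptions, not taken from the surveyed paper or any specific HAS standard; real DASH/HLS clients use considerably more elaborate logic.

```python
# Minimal sketch of a throughput-based HAS quality selector (illustrative only).
# Assumes a fixed bitrate ladder and an exponentially smoothed throughput
# estimate; both are hypothetical, not part of any standard.

BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 6000]  # assumed available representations

class ThroughputBasedSelector:
    def __init__(self, smoothing=0.8, safety_margin=0.85):
        self.smoothing = smoothing          # weight given to the previous estimate
        self.safety_margin = safety_margin  # stay below the estimated throughput
        self.estimate_kbps = None

    def update(self, measured_kbps):
        """Update the smoothed throughput estimate after a segment download."""
        if self.estimate_kbps is None:
            self.estimate_kbps = measured_kbps
        else:
            self.estimate_kbps = (self.smoothing * self.estimate_kbps
                                  + (1 - self.smoothing) * measured_kbps)

    def next_quality(self):
        """Pick the highest representation that fits under the safe throughput budget."""
        if self.estimate_kbps is None:
            return BITRATE_LADDER_KBPS[0]   # start conservatively
        budget = self.safety_margin * self.estimate_kbps
        candidates = [b for b in BITRATE_LADDER_KBPS if b <= budget]
        return candidates[-1] if candidates else BITRATE_LADDER_KBPS[0]

selector = ThroughputBasedSelector()
for throughput in [1200, 2500, 5000, 900]:   # measured per-segment throughput (kbps)
    selector.update(throughput)
    print(throughput, "->", selector.next_quality())
```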

746 citations

Journal ArticleDOI
TL;DR: This paper critically examines MOS and the various ways it is being used today, and discusses a variety of alternative approaches that have been proposed for media quality measurement.
Abstract: Mean opinion score (MOS) has become a very popular indicator of perceived media quality. While there is a clear benefit to such a "reference quality indicator" and its widespread acceptance, MOS is often applied without sufficient consideration of its scope or limitations. In this paper, we critically examine MOS and the various ways it is being used today. We highlight common issues with both subjective and objective MOS and discuss a variety of alternative approaches that have been proposed for media quality measurement.
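As a concrete illustration of how a MOS is typically obtained from subjective ratings, the sketch below computes the mean opinion score and a 95% confidence interval for ratings on a 5-point scale; the sample ratings are invented for illustration and the normal approximation is an assumption.

```python
# Illustrative computation of a Mean Opinion Score (MOS) and its 95% confidence
# interval from subjective ratings on a 5-point scale; the ratings are made up.
import math
import statistics

ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]    # hypothetical ratings from 10 subjects

mos = statistics.mean(ratings)
stdev = statistics.stdev(ratings)            # sample standard deviation
# 95% CI using the normal approximation; for small panels a t-distribution
# quantile would be more appropriate.
ci95 = 1.96 * stdev / math.sqrt(len(ratings))

print(f"MOS = {mos:.2f} +/- {ci95:.2f} (95% CI)")
```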

336 citations

Journal ArticleDOI
TL;DR: The focus of this article is on the issue of reliability and the use of video quality assessment as an example for the proposed best practices, showing that the recommended two-stage QoE crowdtesting design leads to more reliable results.
Abstract: Quality of Experience (QoE) in multimedia applications is closely linked to the end users' perception and therefore its assessment requires subjective user studies in order to evaluate the degree of delight or annoyance as experienced by the users. QoE crowdtesting refers to QoE assessment using crowdsourcing, where anonymous test subjects conduct subjective tests remotely in their preferred environment. The advantages of QoE crowdtesting lie not only in the reduced time and costs for the tests, but also in a large and diverse panel of international, geographically distributed users in realistic user settings. However, conceptual and technical challenges emerge due to the remote test settings. Key issues arising from QoE crowdtesting include the reliability of user ratings, the influence of incentives, payment schemes and the unknown environmental context of the tests on the results. In order to counter these issues, strategies and methods need to be developed, included in the test design, and also implemented in the actual test campaign, while statistical methods are required to identify reliable user ratings and to ensure high data quality. This contribution therefore provides a collection of best practices addressing these issues based on our experience gained in a large set of conducted QoE crowdtesting studies. The focus of this article is in particular on the issue of reliability and we use video quality assessment as an example for the proposed best practices, showing that our recommended two-stage QoE crowdtesting design leads to more reliable results.
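The abstract mentions statistical methods for identifying reliable user ratings. One simple screening heuristic, sketched below under assumed data, is to correlate each worker's ratings with the leave-one-out mean of the remaining workers and discard workers whose correlation falls below a threshold; this is not the specific two-stage procedure recommended in the cited paper, and the ratings and threshold are hypothetical.

```python
# Sketch of a simple reliability screen for crowdsourced QoE ratings:
# correlate each worker's ratings with the leave-one-out mean over the other
# workers and flag workers below a threshold. Data and threshold are illustrative.
import math
import statistics

# ratings[worker] = list of ratings per test item on a 5-point scale (hypothetical)
ratings = {
    "w1": [4, 5, 3, 2, 4],
    "w2": [4, 4, 3, 2, 5],
    "w3": [1, 2, 5, 5, 1],   # deliberately inconsistent worker
    "w4": [5, 5, 2, 3, 4],
}

def pearson(x, y):
    """Pearson correlation between two equal-length rating vectors."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y))
    return num / den if den else 0.0

def leave_one_out_means(excluded):
    """Per-item mean rating computed over all workers except the excluded one."""
    others = [r for w, r in ratings.items() if w != excluded]
    return [statistics.mean(col) for col in zip(*others)]

THRESHOLD = 0.5
for worker, r in ratings.items():
    corr = pearson(r, leave_one_out_means(worker))
    status = "keep" if corr >= THRESHOLD else "discard"
    print(f"{worker}: correlation {corr:.2f} -> {status}")
```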

278 citations

Journal ArticleDOI
TL;DR: This survey presents a tutorial overview of the popular video streaming techniques deployed for stored videos, followed by identifying various metrics that could be used to quantify the QoE for video streaming services, and finally presents a comprehensive survey of the literature on various tools and measurement methodologies that have been proposed to measure or predict the QoE of online video streaming services.
Abstract: Video-on-demand streaming services have gained popularity over the past few years. An increase in the speed of the access networks has also led to a larger number of users watching videos online. Online video streaming traffic is estimated to further increase from the current value of 57% to 69% by 2017 (Cisco, 2014). In order to retain the existing users and attract new users, service providers attempt to satisfy the user's expectations and provide a satisfactory viewing experience. The first step toward providing a satisfactory service is to be able to quantify the users' perception of the current service level. Quality of experience (QoE) is a quality metric that provides a holistic measure of the users' perception of the quality. In this survey, we first present a tutorial overview of the popular video streaming techniques deployed for stored videos, followed by identifying various metrics that could be used to quantify the QoE for video streaming services; finally, we present a comprehensive survey of the literature on various tools and measurement methodologies that have been proposed to measure or predict the QoE of online video streaming services.
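To make the notion of quantifiable streaming-QoE metrics concrete, the sketch below derives a few commonly used objective indicators (startup delay, number of stalls, total stall duration) from a hypothetical player event log; the event format and values are assumptions for illustration, not taken from the surveyed paper.

```python
# Sketch: deriving simple objective streaming-QoE indicators from a player
# event log. The event format and sample values are hypothetical.

# Each event: (timestamp_in_seconds, event_name)
events = [
    (0.0, "play_requested"),
    (1.8, "playback_started"),    # startup delay = 1.8 s
    (35.0, "stall_start"),
    (37.5, "stall_end"),          # 2.5 s rebuffering
    (80.0, "stall_start"),
    (81.0, "stall_end"),          # 1.0 s rebuffering
    (120.0, "playback_ended"),
]

startup_delay = None
stall_count = 0
stall_total = 0.0
stall_started_at = None

for t, name in events:
    if name == "play_requested":
        request_time = t
    elif name == "playback_started":
        startup_delay = t - request_time
    elif name == "stall_start":
        stall_started_at = t
        stall_count += 1
    elif name == "stall_end" and stall_started_at is not None:
        stall_total += t - stall_started_at
        stall_started_at = None

print(f"startup delay: {startup_delay:.1f} s")
print(f"stalls: {stall_count}, total stall time: {stall_total:.1f} s")
```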

206 citations