Journal ArticleDOI

Assessing the Quality of Experience of SopCast

01 Mar 2009-International Journal of Internet Protocol Technology (Inderscience Publishers)-Vol. 4, Iss: 1, pp 11-23
TL;DR: The characteristics and important design issues of SopCast, as well as the QoE that the end users perceive are revealed, using both objective and subjective measurement technologies.
Abstract: Recently, there has been a growing interest in academic and commercial environments for live streaming using P2P technology. A number of new P2P digital Television (P2PTV) applications have emerged. Such P2PTV applications are developed with proprietary technologies. Their traffic characteristics and the Quality of Experience (QoE) provided by them are not well known. Therefore, investigating their mechanisms, analysing their performance, and measuring their quality are important objectives for researchers, developers and end users. In this paper, we present results from a measurement study of a BitTorrent-like P2PTV application called SopCast, using both objective and subjective measurement technologies. The results obtained in our study reveal the characteristics and important design issues of SopCast, as well as the QoE that the end users perceive.


Citations
Book ChapterDOI
01 Jun 2010
TL;DR: This paper presents an overview of various techniques for measuring QoE, thereby mostly focusing on freely available tools and methodologies.
Abstract: Quality of Experience (QoE) relates to how users perceive the quality of an application. To capture such a subjective measure, either by subjective tests or via objective tools, is an art on its own. Given the importance of measuring users’ satisfaction to service providers, research on QoE took flight in recent years. In this paper we present an overview of various techniques for measuring QoE, thereby mostly focusing on freely available tools and methodologies.

118 citations


Cites methods from "Assessing the Quality of Experience..."

  • ...In [15], with Wireshark and a small script that zaps from channel to channel, the zapping times of the P2PTV application SopCast have been measured....


  • ...This approach has been applied in [15]....

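The channel-zapping measurement described in the snippet above reduces to timing the gap between a channel-switch request and the arrival of the first video packet of the new channel. A minimal sketch of that idea (the event-tuple interface here is illustrative, not the authors' actual Wireshark-driven script):

```python
# Sketch of zapping-time measurement: the zapping delay is the gap
# between requesting a new channel and receiving its first video packet.
# The (timestamp, kind) event format is a hypothetical stand-in for
# parsing a real packet capture.

def zapping_times(events):
    """events: list of (timestamp, kind) tuples, where kind is
    'switch' (channel change requested) or 'video' (first data seen)."""
    delays = []
    switch_at = None
    for t, kind in events:
        if kind == 'switch':
            switch_at = t
        elif kind == 'video' and switch_at is not None:
            delays.append(t - switch_at)
            switch_at = None          # wait for the next switch event
    return delays

trace = [(0.0, 'switch'), (4.2, 'video'),
         (10.0, 'switch'), (16.5, 'video')]
print(zapping_times(trace))  # → [4.2, 6.5]
```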

Journal ArticleDOI
TL;DR: This paper proposes a method to compute a set of optimal client strategies for continuous video playback, optimizing overall video quality through proper selection of the next chunk from among the encoded versions.
Abstract: In state-of-the-art adaptive streaming solutions, to cope with varying network conditions, the client side can switch between several video copies encoded at different bit-rates during streaming. Each video copy is divided into chunks of equal duration. To achieve continuous video playback, each chunk needs to arrive at the client before its playback deadline. The perceptual quality of a chunk increases with the chunk size in bits, whereas bigger chunks require more transmission time and, as a result, have a higher risk of missing transmission deadline. Therefore, there is a trade-off between the overall video quality and continuous playback, which can be optimized by proper selection of the next chunk from the encoded versions. This paper proposes a method to compute a set of optimal client strategies for this purpose.
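The quality/deadline trade-off described above can be illustrated with a greedy rule: before fetching the next chunk, pick the highest bit-rate whose expected download time still meets the playback deadline. This is a simplified sketch of the trade-off, not the paper's optimal-strategy computation:

```python
# Greedy illustration of adaptive bit-rate selection: bigger chunks
# give better quality but risk missing the playback deadline, so pick
# the largest rate that still downloads before the buffer drains.

def pick_bitrate(bitrates_kbps, chunk_sec, bandwidth_kbps, buffer_sec):
    """Return the highest bit-rate (kbps) whose chunk can be downloaded
    before the playout buffer (buffer_sec of buffered video) drains."""
    for rate in sorted(bitrates_kbps, reverse=True):
        download_sec = rate * chunk_sec / bandwidth_kbps  # size / throughput
        if download_sec <= buffer_sec:
            return rate
    return min(bitrates_kbps)  # degraded mode: lowest quality anyway

# 4-second chunks, 3000 kbps measured throughput, 5 s of buffered video:
print(pick_bitrate([500, 1500, 3000, 6000], 4, 3000, 5))  # → 3000
```

Real clients also smooth the bandwidth estimate and look several chunks ahead, which is where the paper's optimal strategies improve on a greedy rule like this one.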

82 citations

Proceedings ArticleDOI
22 Oct 2012
TL;DR: CLIVE is a cloud-assisted P2P live streaming system that estimates the available capacity in the system through a gossip-based aggregation protocol and provisions the required resources from the cloud to guarantee a given level of QoS at low cost.
Abstract: Peer-to-peer (P2P) video streaming is an emerging technology that reduces the barrier to stream live events over the Internet. Unfortunately, satisfying soft real-time constraints on the delay between the generation of the stream and its actual delivery to users is still a challenging problem. Bottlenecks in the available upload bandwidth, both at the media source and inside the overlay network, may limit the quality of service (QoS) experienced by users. A potential solution for this problem is assisting the P2P streaming network by a cloud computing infrastructure to guarantee a minimum level of QoS. In such approach, rented cloud resources (helpers) are added on demand to the overlay, to increase the amount of total available bandwidth and the probability of receiving the video on time. Hence, the problem to be solved becomes minimizing the economical cost, provided that a set of constraints on QoS is satisfied. The main contribution of this paper is CLIVE, a cloud-assisted P2P live streaming system that demonstrates the feasibility of these ideas. CLIVE estimates the available capacity in the system through a gossip-based aggregation protocol and provisions the required resources from the cloud to guarantee a given level of QoS at low cost. We perform extensive simulations and evaluate CLIVE using large-scale experiments under dynamic realistic settings.
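The gossip-based aggregation CLIVE relies on can be sketched as follows: peers repeatedly average their local value with a random partner, so every peer's estimate converges to the global mean; scaling by the peer count then yields total available capacity. This is a simplified synchronous simulation of the averaging idea, not CLIVE's actual protocol:

```python
# Gossip averaging sketch: pairwise averaging preserves the global sum,
# so all peers' values converge to the true mean of the initial values.
import random

def gossip_average(values, rounds=50, seed=0):
    rng = random.Random(seed)
    vals = list(values)
    n = len(vals)
    for _ in range(rounds):
        i, j = rng.randrange(n), rng.randrange(n)
        avg = (vals[i] + vals[j]) / 2   # one pairwise averaging step
        vals[i] = vals[j] = avg
    return vals

upload_kbps = [100, 400, 250, 50, 200]            # per-peer upload capacity
estimates = gossip_average(upload_kbps, rounds=200)
# every peer's local estimate approaches the true mean (200 kbps here)
print(round(estimates[0], 1))
```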

81 citations


Cites background from "Assessing the Quality of Experience..."

  • ...Peer-to-peer (P2P) live streaming is becoming an increasingly popular technology, with a large number of academic [1], [2], [3], [4], [5] and commercial [6], [7] products being designed and deployed....


Journal ArticleDOI
TL;DR: Up to 19 use cases for IDMS are presented, each with its own synchronization requirements; Social TV is perhaps the most prominent use case in which IDMS is useful.
Abstract: Traditionally, the media consumption model has been a passive and isolated activity. However, the advent of media streaming technologies, interactive social applications, and synchronous communications, as well as the convergence between these three developments, point to an evolution towards dynamic shared media experiences. In this new model, geographically distributed groups of consumers, independently of their location and the nature of their end-devices, can be immersed in a common virtual networked environment in which they can share multimedia services, interact and collaborate in real-time within the context of simultaneous media content consumption. In most of these multimedia services and applications, apart from the well-known intra and inter-stream synchronization techniques that are important inside the consumers’ playout devices, also the synchronization of the playout processes between several distributed receivers, known as multipoint, group or Inter-destination multimedia synchronization (IDMS), becomes essential. Due to the increasing popularity of social networking, this type of multimedia synchronization has gained in popularity in recent years. Although Social TV is perhaps the most prominent use case in which IDMS is useful, in this paper we present up to 19 use cases for IDMS, each one having its own synchronization requirements. Different approaches used in the (recent) past by researchers to achieve IDMS are described and compared. As further proof of the significance of IDMS nowadays, relevant organizations’(such as ETSI TISPAN and IETF AVTCORE Group) efforts on IDMS standardization (in which authors have been and are participating actively), defining architectures and protocols, are summarized.

75 citations


Cites background from "Assessing the Quality of Experience..."

  • ...In [39], the importance of IDMS in web-based P2P TV systems for minimizing noticeable playout differences was revealed....


Book ChapterDOI
11 May 2010
TL;DR: Based on measurements of four representative applications, it is recommended to incorporate the following aspects when designing video conferencing applications: traffic load control/balancing algorithms to better use the limited bandwidth resources and maintain a stable conversation, and traffic shaping or adaptive real-time re-encoding to limit the overall traffic.
Abstract: More and more free multi-party video conferencing applications are readily available over the Internet and both Server-to-Client (S/C) or Peer-to-Peer (P2P) technologies are used. Investigating their mechanisms, analyzing their system performance, and measuring their quality are important objectives for researchers, developers and end users. In this paper, we take four representative video conferencing applications and reveal their characteristics and different aspects of Quality of Experience. Based on our observations and analysis, we recommend to incorporate the following aspects when designing video conferencing applications: 1) Traffic load control/balancing algorithms to better use the limited bandwidth resources and to have a stable conversation; 2) Use traffic shaping policy or adaptively re-encode streams in real time to limit the overall traffic. This work is, to our knowledge, the first measurement work to study and compare mechanisms and performance of existing free multi-party video conferencing systems.
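The traffic-shaping recommendation above is classically implemented as a token bucket: packets may be sent only when enough tokens have accumulated, which caps the long-term sending rate while allowing short bursts. An illustrative sketch, not code from any of the measured applications:

```python
# Token-bucket shaper sketch: tokens refill at the target rate; a packet
# is admitted only if enough tokens are available, limiting average
# throughput to rate_bps with bursts up to burst_bytes.

class TokenBucket:
    def __init__(self, rate_bps, burst_bytes):
        self.rate = rate_bps / 8.0      # refill rate in bytes/second
        self.capacity = burst_bytes     # maximum burst size in bytes
        self.tokens = burst_bytes       # start with a full bucket
        self.last = 0.0

    def allow(self, now, packet_bytes):
        """Return True if a packet of packet_bytes may be sent at time now."""
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= packet_bytes:
            self.tokens -= packet_bytes
            return True
        return False

bucket = TokenBucket(rate_bps=8000, burst_bytes=1000)  # 1000 bytes/s average
print(bucket.allow(0.0, 1000))   # initial burst allowance → True
print(bucket.allow(0.1, 1000))   # only 100 bytes refilled since → False
```

A real conferencing client would drop, delay, or re-encode when `allow` returns False; adaptive re-encoding (the paper's second recommendation) effectively shrinks `packet_bytes` instead of dropping.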

61 citations

References
Proceedings ArticleDOI
13 Mar 2005
TL;DR: This paper presents DONet, a data-driven overlay network for live media streaming, and presents an efficient member and partnership management algorithm, together with an intelligent scheduling algorithm that achieves real-time and continuous distribution of streaming contents.
Abstract: This paper presents DONet, a data-driven overlay network for live media streaming. The core operations in DONet are very simple: every node periodically exchanges data availability information with a set of partners, and retrieves unavailable data from one or more partners, or supplies available data to partners. We emphasize three salient features of this data-driven design: 1) easy to implement, as it does not have to construct and maintain a complex global structure; 2) efficient, as data forwarding is dynamically determined according to data availability while not restricted by specific directions; and 3) robust and resilient, as the partnerships enable adaptive and quick switching among multi-suppliers. We show through analysis that DONet is scalable with bounded delay. We also address a set of practical challenges for realizing DONet, and propose an efficient member and partnership management algorithm, together with an intelligent scheduling algorithm that achieves real-time and continuous distribution of streaming contents. We have extensively evaluated the performance of DONet over the PlanetLab. Our experiments, involving almost all the active PlanetLab nodes, demonstrate that DONet achieves quite good streaming quality even under formidable network conditions. Moreover, its control overhead and transmission delay are both kept at low levels. An Internet-based DONet implementation, called CoolStreaming v.0.9, was released on May 30, 2004, which has attracted over 30000 distinct users with more than 4000 simultaneously being online at some peak times. We discuss the key issues toward designing CoolStreaming in this paper, and present several interesting observations from these large-scale tests; in particular, the larger the overlay size, the better the streaming quality it can deliver.
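The data-driven scheduling DONet describes can be sketched as follows: each node knows which segments its partners hold (their buffer maps) and requests each missing segment from a partner that has it, rarest segments first. A simplified illustration of the idea, not DONet's full scheduling algorithm:

```python
# DONet-style pull scheduling sketch: request missing segments rarest
# first, since segments few partners hold are the most at risk of
# missing their playback deadline.

def schedule_requests(have, partner_maps):
    """have: set of segment ids held locally.
    partner_maps: dict partner -> set of segment ids that partner holds.
    Returns {segment: partner}, considering rarest segments first."""
    wanted = set().union(*partner_maps.values()) - have
    # count how many partners can supply each missing segment
    suppliers = {s: [p for p, m in partner_maps.items() if s in m]
                 for s in wanted}
    plan = {}
    for seg in sorted(wanted, key=lambda s: len(suppliers[s])):
        plan[seg] = suppliers[seg][0]   # naive pick: first able partner
    return plan

maps = {'A': {1, 2, 3}, 'B': {2, 3, 4}}
print(schedule_requests({1}, maps))  # segment 4 is rarest: only B has it
```

DONet additionally weighs partner bandwidth and segment deadlines when choosing the supplier; the naive first-able pick here is purely for illustration.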

1,310 citations

Journal ArticleDOI
TL;DR: The independent test results from the VQEG FR-TV Phase II tests are summarized, as well as results from eleven other subjective data sets that were used to develop the NTIA General Model.
Abstract: The National Telecommunications and Information Administration (NTIA) General Model for estimating video quality and its associated calibration techniques were independently evaluated by the Video Quality Experts Group (VQEG) in their Phase II Full Reference Television (FR-TV) test. The NTIA General Model was the only video quality estimator that was in the top performing group for both the 525-line and 625-line video tests. As a result, the American National Standards Institute (ANSI) adopted the NTIA General Model and its associated calibration techniques as a North American Standard in 2003. The International Telecommunication Union (ITU) has also included the NTIA General Model as a normative method in two Draft Recommendations. This paper presents a description of the NTIA General Model and its associated calibration techniques. The independent test results from the VQEG FR-TV Phase II tests are summarized, as well as results from eleven other subjective data sets that were used to develop the method.

1,268 citations


"Assessing the Quality of Experience..." refers methods in this paper

  • ...We broadcasted two videos at different data rates: one at 45 KB/s (the most common data rate used in SopCast) and another one at 1 Mb/s. VQM provided the following scores per node (see Figure 14): The minimum threshold for acceptable quality corresponds to the line MOS = 3.5....


  • ...VQM (Video Quality Metric) is a software tool developed by the Institute for Telecommunication Science to objectively measure perceived video quality....


  • ...3) Image Quality: In this section, after the start-up freezing phase of peers, we cut the received video at them and synchronized each frame of the cut received video with the cut original video to get the average objective Mean Opinion Score (MOS) (ITU-T, 1996), using VQM (Pinson and Wolf, 2004)....


  • ...VQM takes the original video and the processed video and produces quality scores that reflect the predicted fidelity of the impaired video with reference to its undistorted counterpart....

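VQM itself is a standardized tool with a sophisticated perceptual model; as a much simpler stand-in for the same full-reference idea (comparing the processed video against its undistorted counterpart), per-frame PSNR can be computed as below. Illustrative only: this is not VQM's algorithm.

```python
# Full-reference quality sketch: PSNR between an original and a
# processed frame. Higher PSNR means the impaired frame is closer to
# the undistorted reference; VQM's perceptual model goes well beyond
# this simple pixel-difference measure.
import math

def psnr(original, processed, peak=255):
    """original/processed: equal-length sequences of pixel values."""
    mse = sum((o - p) ** 2 for o, p in zip(original, processed)) / len(original)
    if mse == 0:
        return float('inf')             # identical frames
    return 10 * math.log10(peak ** 2 / mse)

ref = [10, 200, 30, 128]                # reference frame (toy 4-pixel row)
impaired = [12, 198, 33, 120]           # same frame after coding losses
print(round(psnr(ref, impaired), 1))    # → 35.1
```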

Journal ArticleDOI
Xiaojun Hei, Chao Liang, Jian Liang, Yong Liu, Keith W. Ross
TL;DR: An in-depth measurement study of one of the most popular P2P IPTV systems, PPLive, conducted with a dedicated crawler that enables the authors to study the global characteristics of the mesh-pull peer-to-peer IPTV system.
Abstract: An emerging Internet application, IPTV, has the potential to flood Internet access and backbone ISPs with massive amounts of new traffic. Although many architectures are possible for IPTV video distribution, several mesh-pull P2P architectures have been successfully deployed on the Internet. In order to gain insights into mesh-pull P2P IPTV systems and the traffic loads they place on ISPs, we have undertaken an in-depth measurement study of one of the most popular IPTV systems, namely, PPLive. We have developed a dedicated PPLive crawler, which enables us to study the global characteristics of the mesh-pull PPLive system. We have also collected extensive packet traces for various different measurement scenarios, including both campus access networks and residential access networks. The measurement results obtained through these platforms bring important insights into P2P IPTV systems. Specifically, our results show the following. 1) P2P IPTV users have the similar viewing behaviors as regular TV users. 2) During its session, a peer exchanges video data dynamically with a large number of peers. 3) A small set of super peers act as video proxy and contribute significantly to video data uploading. 4) Users in the measured P2P IPTV system still suffer from long start-up delays and playback lags, ranging from several seconds to a couple of minutes. Insights obtained in this study will be valuable for the development and deployment of future P2P IPTV systems.

1,070 citations

Journal ArticleDOI
TL;DR: Tribler is a peer-to-peer file-sharing system that exploits social phenomena by maintaining social networks and using them in content discovery, content recommendation, and downloading.
Abstract: Most current peer-to-peer (P2P) file-sharing systems treat their users as anonymous, unrelated entities, and completely disregard any social relationships between them. However, social phenomena such as friendship and the existence of communities of users with similar tastes or interests may well be exploited in such systems in order to increase their usability and performance. In this paper we present a novel social-based P2P file-sharing paradigm that exploits social phenomena by maintaining social networks and using these in content discovery, content recommendation, and downloading. Based on this paradigm's main concepts such as taste buddies and friends, we have designed and implemented the TRIBLER P2P file-sharing system as a set of extensions to BitTorrent. We present and discuss the design of TRIBLER, and we show evidence that TRIBLER enables fast content discovery and recommendation at a low additional overhead, and a significant improvement in download performance. Copyright © 2007 John Wiley & Sons, Ltd.
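The "taste buddy" concept above can be sketched with a simple set-overlap measure: peers whose download histories overlap strongly become buddies, and their unshared items become recommendation candidates. Jaccard similarity is used here purely for illustration; Tribler's actual similarity function and protocol differ.

```python
# Taste-buddy sketch: score peers by overlap of download histories,
# then recommend items that high-overlap peers have and we lack.

def jaccard(a, b):
    """Similarity of two sets of downloaded item ids, in [0, 1]."""
    return len(a & b) / len(a | b) if a | b else 0.0

def taste_buddies(me, others, threshold=0.3):
    """others: dict peer -> set of item ids. Returns buddies sorted by
    decreasing similarity, each with the items we do not have yet."""
    scored = [(jaccard(me, items), peer, items - me)
              for peer, items in others.items()]
    return [(peer, round(sim, 2), recs)
            for sim, peer, recs in sorted(scored, reverse=True)
            if sim >= threshold]

mine = {'a', 'b', 'c'}
peers = {'p1': {'a', 'b', 'd'}, 'p2': {'x', 'y'}}
print(taste_buddies(mine, peers))  # p1 is a buddy; 'd' is recommended
```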

463 citations

Proceedings ArticleDOI
13 Apr 2008
TL;DR: This paper takes an inside look at the new Coolstreaming system, exposing its design options and the rationale behind them, and demonstrates that resource distribution in such systems is highly skewed and that performance is mostly affected by system dynamics.
Abstract: The peer-to-peer (P2P) based video streaming has emerged as a promising solution for Internet video distribution. Leveraging the resource available at end users, this approach poses great potential to scale in the Internet. We have now seen the commercial P2P streaming systems that are orders of magnitude larger than the earlier academic systems. We believe understanding its basic principles and limitations are important in the design of future systems. The Coolstreaming, first released in summer 2004, arguably represented the first successful large-scale P2P live streaming system. Since then, the system has been significantly modified and commercially launched. This paper takes an inside look at the new Coolstreaming system by exposing its design options and rationale behind them, and examines their implications on streaming performance. Specifically, by leveraging a large set of traces obtained from recent live event broadcast and extensive simulations, we study the workload characteristics, system dynamics, and impact from a variety of system parameters. We demonstrate that there is a highly skewed resource distribution in such systems and the performance is mostly affected by the system dynamics. In addition, we show that there are inherent correlations and fundamental trade-off among different system parameters, which can be further explored to enhance the system performance.

299 citations