Author

Pradip Paudyal

Other affiliations: Kathmandu
Bio: Pradip Paudyal is an academic researcher from Roma Tre University. The author has contributed to research in topics: Video quality & Subjective video quality. The author has an h-index of 10 and has co-authored 17 publications receiving 245 citations. Previous affiliations of Pradip Paudyal include Kathmandu.

Papers
Journal ArticleDOI
TL;DR: The relation between depth map quality and the overall quality of light field (LF) images is studied, and the quality score estimated by the proposed framework is shown to have a significant correlation with subjective quality ratings.
Abstract: Immersive media, such as free-viewpoint video and 360° video, are expected to become dominant broadcasting services. Light field (LF) imaging is being considered as a next-generation imaging technology offering the possibility to provide new services, including six-degrees-of-freedom video. The drawback of this technology is the size of the generated content, which requires novel compression systems and the design of ad-hoc methodologies for evaluating the perceived quality. In this paper, the relation between depth map quality and the overall quality of LF images is studied. Next, a reduced-reference quality assessment metric for LF images is presented. To predict the quality of distorted LF images, the measure of distortion in the depth map is exploited. To test and validate the proposed framework, a subjective experiment has been performed and an LF image quality dataset has been created. The dataset is also used for evaluating the performance of state-of-the-art quality metrics when applied to LF images. The results show that the quality score estimated by the proposed framework has a significant correlation with subjective quality ratings. Consequently, reference data can be delivered to the clients, thus allowing local estimation of the perceived quality of service.
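
The reduced-reference idea described above can be illustrated with a small sketch (not the paper's actual metric): a score is derived from the distortion between the reference depth map, delivered as side information, and the depth map obtained from the degraded light field, and the scores are then correlated with subjective ratings. All names and the synthetic data below are hypothetical.

```python
import numpy as np
from scipy.stats import pearsonr, spearmanr

def depth_distortion_score(depth_ref, depth_dist):
    """Hypothetical reduced-reference score: RMSE between the reference depth
    map (sent as side information) and the depth map estimated at the client
    from the received light field, mapped to a bounded quality value."""
    err = np.asarray(depth_ref, dtype=float) - np.asarray(depth_dist, dtype=float)
    rmse = np.sqrt(np.mean(err ** 2))
    return 1.0 / (1.0 + rmse)  # higher = better quality

# Toy validation against subjective ratings (MOS) for five distortion levels.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
scores, mos = [], []
for level in range(1, 6):
    dist = ref + rng.normal(0, 0.02 * level, ref.shape)  # increasing degradation
    scores.append(depth_distortion_score(ref, dist))
    mos.append(5.0 - 0.8 * level)                        # synthetic MOS
plcc, _ = pearsonr(scores, mos)    # Pearson linear correlation
srocc, _ = spearmanr(scores, mos)  # Spearman rank-order correlation
print(f"PLCC={plcc:.3f}, SROCC={srocc:.3f}")
```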

73 citations

Journal ArticleDOI
TL;DR: This paper presents SMART light field image quality dataset, which consists of source images, compressed images, and annotated subjective quality scores, and analysis of perceptual effects of compression on SMART dataset.
Abstract: Evaluation of the perceived quality of light field images, as well as testing new processing tools or assessing the effectiveness of objective quality metrics, relies on the availability of a test dataset and corresponding quality ratings. This paper presents the SMART light field image quality dataset. The dataset consists of source images (raw data without optical corrections), compressed images, and annotated subjective quality scores. Furthermore, an analysis of the perceptual effects of compression on the SMART dataset is presented. Next, the impact of image content on the perceived quality is studied with the help of image quality attributes. Finally, the performances of 2-D image quality metrics when applied to light field images are analyzed.
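
A common way to apply 2-D metrics to light field content, as in the benchmarking mentioned above, is to score each sub-aperture view and pool the results. The sketch below is a simplification with synthetic data, not the procedure used for SMART; it relies on the PSNR and SSIM implementations in scikit-image.

```python
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def pooled_2d_metrics(views_ref, views_dist):
    """Score every sub-aperture view with 2-D metrics and average the results,
    a simple pooling strategy for light field content (illustrative only)."""
    psnrs, ssims = [], []
    for ref, dist in zip(views_ref, views_dist):
        psnrs.append(peak_signal_noise_ratio(ref, dist, data_range=1.0))
        ssims.append(structural_similarity(ref, dist, data_range=1.0))
    return np.mean(psnrs), np.mean(ssims)

# Toy example: 9x9 grid of 32x32 grayscale views with mild noise as "compression".
rng = np.random.default_rng(1)
refs = [rng.random((32, 32)) for _ in range(81)]
dists = [np.clip(v + rng.normal(0, 0.03, v.shape), 0, 1) for v in refs]
print(pooled_2d_metrics(refs, dists))
```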

62 citations

Journal ArticleDOI
TL;DR: This paper studies the impact of delay, jitter, packet loss, and bandwidth on Quality of Experience (QoE) and evaluates the relationship between content-related parameters and QoE for different levels of impairments.
Abstract: The analysis of the impact of video content and transmission impairments on Quality of Experience (QoE) is a relevant topic for the robust design and adaptation of multimedia infrastructures, services, and applications. The goal of this paper is to study the impact of video content on QoE for different levels of impairments. In more detail, this contribution aims at i) studying the impact of delay, jitter, packet loss, and bandwidth on QoE, ii) analyzing the impact of video content on QoE, and iii) evaluating the relationship between content-related parameters (spatial-temporal perceptual information, motion, and data rate) and QoE for different levels of impairments.
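
The spatial-temporal perceptual information mentioned above is commonly computed following ITU-T P.910; the sketch below shows that standard formulation as a generic illustration (it is not code from the paper).

```python
import numpy as np
from scipy import ndimage

def si_ti(frames):
    """Spatial Information (SI) and Temporal Information (TI) roughly following
    ITU-T P.910: SI is the maximum over frames of the spatial std of the
    Sobel-filtered luminance; TI is the maximum spatial std of successive
    frame differences."""
    si_values, ti_values = [], []
    prev = None
    for frame in frames:
        gx = ndimage.sobel(frame, axis=0)
        gy = ndimage.sobel(frame, axis=1)
        si_values.append(np.std(np.hypot(gx, gy)))
        if prev is not None:
            ti_values.append(np.std(frame - prev))
        prev = frame
    return max(si_values), max(ti_values)

# Toy clip: 10 random luminance frames.
rng = np.random.default_rng(2)
clip = [rng.random((48, 48)) for _ in range(10)]
print(si_ti(clip))
```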

45 citations

Proceedings ArticleDOI
10 May 2016
TL;DR: The design of a Light Field image dataset is presented, and the performed analysis shows that the proposed set of images is sufficient for addressing a wide range of attributes relevant for assessing Light Field image quality.
Abstract: In this contribution, the design of a Light Field image dataset is presented. It can be useful for designing, testing, and benchmarking Light Field image processing algorithms. As a first step, image content selection criteria have been defined based on selected image quality key attributes, i.e., spatial information, colorfulness, texture key features, depth of field, etc. Next, image scenes have been selected and captured using the Lytro Illum Light Field camera. The performed analysis shows that the proposed set of images is sufficient for addressing a wide range of attributes relevant for assessing Light Field image quality.
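
Content attributes such as colorfulness can be quantified with standard descriptors when screening candidate scenes; the sketch below implements the widely used Hasler-Süsstrunk colorfulness measure as one possible choice (illustrative only, not necessarily the authors' exact procedure).

```python
import numpy as np

def colorfulness(rgb):
    """Hasler-Suesstrunk colorfulness measure for an RGB image in [0, 255]."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    rg = r - g                    # red-green opponent channel
    yb = 0.5 * (r + g) - b        # yellow-blue opponent channel
    std_root = np.sqrt(np.std(rg) ** 2 + np.std(yb) ** 2)
    mean_root = np.sqrt(np.mean(rg) ** 2 + np.mean(yb) ** 2)
    return std_root + 0.3 * mean_root

# Toy candidate scene.
rng = np.random.default_rng(3)
scene = rng.integers(0, 256, size=(64, 64, 3))
print(f"colorfulness = {colorfulness(scene):.1f}")
```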

29 citations

Proceedings ArticleDOI
01 Dec 2014
TL;DR: A video database, ReTRiEVED, to be used for evaluating the performance of video quality metrics, is presented; the analysis shows that packet loss rate, throughput/bandwidth, and jitter have a significant effect on perceived quality, while an initial delay does not significantly affect the perceived quality.
Abstract: In this paper, a video database, ReTRiEVED, to be used for evaluating the performance of video quality metrics, is presented. The database contains 184 distorted videos obtained from eight videos of different content. Packet loss rate, jitter, delay, and throughput have been considered as possible distortions resulting from video transmission. Video sequences, collected subjective scores, and results of the performed analysis are made publicly available to the research community for designing, testing, and comparing objective video quality metrics. The analysis of the results shows that packet loss rate, throughput/bandwidth, and jitter have a significant effect on perceived quality, while an initial delay does not significantly affect the perceived quality.
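
Whether an impairment level has a significant effect on perceived quality, as reported above, is often checked with a one-way ANOVA over the subjective scores collected at each level; the sketch below uses purely synthetic scores for illustration.

```python
import numpy as np
from scipy.stats import f_oneway

# Synthetic subjective scores (MOS on a 1-5 scale) grouped by packet loss rate (%).
rng = np.random.default_rng(4)
mos_by_plr = {
    0.0: rng.normal(4.5, 0.3, 20),
    0.5: rng.normal(3.8, 0.3, 20),
    2.0: rng.normal(2.9, 0.3, 20),
    5.0: rng.normal(2.0, 0.3, 20),
}
stat, p = f_oneway(*mos_by_plr.values())
print(f"F={stat:.2f}, p={p:.4f}")  # a small p-value suggests packet loss affects quality
```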

27 citations


Cited by
Journal ArticleDOI
TL;DR: This exhaustive literature review provides a concrete definition of Industry 4.0 and defines its six design principles: interoperability, virtualization, decentralization, real-time capability, service orientation, and modularity.
Abstract: The manufacturing industry profoundly impacts economic and societal progress. The Industry 4.0 initiative has received considerable attention from the business and research communities and has become a commonly accepted term in research centers and universities. Although the idea is not new and has been on the academic research agenda for many years under different perceptions, the term "Industry 4.0" was only recently launched and is now well accepted, to some extent, not only in academia but also in industry. While academic research focuses on understanding and defining the concept and on developing related systems, business models, and respective methodologies, industry focuses its attention on the change of industrial machine suites and intelligent products, as well as on potential customers of this progress. It is therefore important for companies to first understand the features and content of Industry 4.0 in view of a potential transformation from machine-dominant manufacturing to digital manufacturing. In order to achieve a successful transformation, they should clearly review their positions and respective potentials against the basic requirements set forward for the Industry 4.0 standard; this will allow them to generate a well-defined road map. There have been several approaches and discussions along this line, and several road maps have already been proposed; some of those are reviewed in this paper. However, the literature clearly indicates a lack of respective assessment methodologies. Since the implementation and application of the theorems and definitions outlined for the fourth industrial revolution are not mature enough for most real-life implementations, a systematic approach for making respective assessments and evaluations seems to be urgently required for those who intend to speed this transformation up. It is now the main responsibility of the research community to develop the technological infrastructure, with physical systems, management models, business models, and well-defined Industry 4.0 scenarios, in order to make life easier for practitioners. Experts estimate that Industry 4.0 and the related progress along this line will have an enormous effect on social life. As outlined in the introduction, some social transformation is also expected. It is assumed that robots will become more dominant in manufacturing, and that implanted technologies, cooperating and coordinating machines, self-decision-making systems, autonomous problem solvers, learning machines, 3D printing, etc. will dominate the production process. Wearable internet, big data analysis, sensor-based life, smart city implementations, and similar applications will be the main concerns of the community. This social transformation will naturally push the manufacturing community to improve their manufacturing suites to cope with customer requirements and sustain competitive advantage. A summary of the potential progress along this line is reviewed in the introduction of the paper. It is obvious that future manufacturing systems will have a different vision composed of products, intelligence, communications, and information networks. This will bring about new business models that will become dominant in industrial life. Another important issue to take into account is that the time span of this so-called revolution will be short, triggering a continuous transformation process and causing new industrial areas to emerge.
This clearly puts great pressure on manufacturers to learn, understand, design, and implement the transformation process. Since the main motivation is to find the best way to follow this transformation, a comprehensive literature review provides remarkable support. This paper presents such a review, highlighting the progress made and aiming to improve awareness of the best experiences. It is intended to provide a clear idea for those wishing to generate a road map for digitizing their respective manufacturing suites. By presenting this review, it is also intended to provide a hands-on library of Industry 4.0 for both academics and industrial practitioners. The top 100 headings, abstracts, and keywords for each search term (a total of 619 publications of any kind) were independently analyzed in order to ensure the reliability of the review process. Note that this exhaustive literature review provides a concrete definition of Industry 4.0 and defines its six design principles: interoperability, virtualization, decentralization, real-time capability, service orientation, and modularity. These principles appear to have drawn the attention of scientists to carry out a wider variety of research on the subject and to develop implementable and appropriate scenarios. A comprehensive taxonomy of Industry 4.0 can also be developed by analyzing the results of this review.

1,011 citations

Journal ArticleDOI
TL;DR: 5G-QoE would enable a holistic video flow self-optimisation system employing the cutting-edge Scalable H.265 video encoding to transmit UHD video applications in a QoE-aware manner.
Abstract: Traffic on future fifth-generation (5G) mobile networks is predicted to be dominated by challenging video applications such as mobile broadcasting, remote surgery, and augmented reality, demanding real-time, ultra-high-quality delivery. Two of the main expectations of 5G networks are that they will be able to handle ultra-high-definition (UHD) video streaming and that they will deliver services that meet the end user's perceived-quality requirements by adopting quality of experience (QoE) aware network management approaches. This paper proposes a 5G-QoE framework to address QoE modeling for UHD video flows in 5G networks. In particular, it focuses on providing a QoE prediction model that is both sufficiently accurate and of low enough complexity to be employed as a continuous real-time indicator of the "health" of video application flows at the scale required in future 5G networks. The model has been developed and implemented as part of the EU 5G PPP SELFNET autonomic management framework, where it provides a primary indicator of the likely perceptual quality of UHD video application flows traversing a realistic multi-tenanted 5G mobile edge network testbed. The proposed 5G-QoE framework has been implemented in the 5G testbed, and the high accuracy of QoE prediction has been validated by comparing the predicted QoE values not only with subjective testing results but also with empirical measurements in the testbed. As such, 5G-QoE would enable a holistic video flow self-optimisation system employing the cutting-edge Scalable H.265 video encoding to transmit UHD video applications in a QoE-aware manner.

108 citations

Journal ArticleDOI
TL;DR: This work considers the distortions introduced in a typical light field processing chain and proposes a full-reference light field quality metric that outperforms state-of-the-art quality metrics applicable to light fields.
Abstract: Owing to the recorded light ray distributions, a light field contains much richer information and enables enlightening applications, and it has become more and more popular. To facilitate the relevant applications, many light field processing techniques have been proposed recently. These operations also introduce visual quality loss, so a light field quality metric is needed to quantify it. To reduce processing complexity and resource consumption, light fields are generally sparsely sampled, compressed, and finally reconstructed and displayed to the users. We consider the distortions introduced in this typical light field processing chain and propose a full-reference light field quality metric. Specifically, we measure light field quality from three aspects: global spatial quality based on view structure matching, local spatial quality based on near-edge mean square error, and angular quality based on multi-view quality analysis. These three aspects capture the most common distortions introduced in light field processing, including global distortions such as blur and blocking, local geometric distortions such as ghosting and stretching, and angular distortions such as flickering and sampling. Experimental results show that the proposed method can estimate light field quality accurately and that it outperforms state-of-the-art quality metrics that may be effective for light fields.
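
Of the three components listed in the abstract, the local spatial term based on near-edge mean square error is the simplest to illustrate: the sketch below computes MSE only on pixels near edges of the reference view, where ghosting and stretching are most visible. This is a simplified reading of the idea, not the authors' implementation; the threshold and dilation radius are arbitrary choices.

```python
import numpy as np
from scipy import ndimage

def near_edge_mse(ref, dist, edge_thresh=0.2, dilate=2):
    """MSE evaluated only on pixels close to edges of the reference view,
    where local geometric distortions (ghosting, stretching) are most visible.
    Threshold and dilation radius are illustrative, not tuned values."""
    gx = ndimage.sobel(ref, axis=0)
    gy = ndimage.sobel(ref, axis=1)
    grad = np.hypot(gx, gy)
    edges = grad > edge_thresh * grad.max()
    near_edge = ndimage.binary_dilation(edges, iterations=dilate)
    return np.mean((ref[near_edge] - dist[near_edge]) ** 2)

# Toy reference and distorted views.
rng = np.random.default_rng(5)
ref = rng.random((64, 64))
dist = ref + rng.normal(0, 0.05, ref.shape)
print(near_edge_mse(ref, dist))
```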

67 citations
