
On the Minimum Perceptual Temporal Video Sampling Rate and Its Application to Adaptive Frame Skipping

TLDR
This work determines the minimum temporal sampling rate at which a video should be presented to make temporal sampling imperceptible to viewers, and proposes a model that computes this minimum rate from the object speed in the image plane and the camera exposure time.
Abstract
Media technology, in particular video recording and playback, keeps improving to provide users with high-quality real and virtual visual content. In recent years, increasing the temporal sampling rate of videos and the refresh rate of displays has become one focus of technical innovation. This raises the question of how high these sampling and refresh rates need to be. To answer it, we determine the minimum temporal sampling rate at which a video should be presented so that temporal sampling is imperceptible to viewers. Through a psychophysical study, we find that this minimum sampling rate depends on both the speed of objects in the image plane and the exposure time of the recording camera. We propose a model that computes the required minimum sampling rate from these two parameters. In addition, state-of-the-art video codecs employ motion vectors from which the local object movement speed can be inferred. We therefore present a procedure to compute the minimum sampling rate given an encoded video and the camera exposure time. Since the object motion speed in a video may vary over time, the corresponding minimum frame rate varies as well. The results of this paper are therefore particularly applicable in combination with adaptive-frame-rate computer-generated graphics or novel video communication solutions that drop insignificant frames. In our experiments, videos played back at the minimum adaptive frame rate achieve an average bit rate reduction of 26% compared to constant frame rate playback, with no perceptible difference to viewers.
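The adaptive frame skipping described in the abstract can be pictured as a per-frame decision loop: estimate the local object speed (in practice from the codec's motion vectors), query the model for the minimum sampling rate at that speed and the given exposure time, and keep a frame only when skipping it would let the display interval exceed the perceptual limit. The Python sketch below illustrates only that loop; the paper's actual model is not reproduced here, so min_sample_rate and all example numbers are purely hypothetical placeholders.

from typing import Callable, List


def adaptive_frame_schedule(
    frame_speeds: List[float],                 # per-frame object speed in the image plane (px/s)
    base_rate_hz: float,                       # constant frame rate of the source video
    min_sample_rate: Callable[[float], float]  # placeholder for the paper's model; exposure time
                                               # is assumed fixed and folded into this callable
) -> List[bool]:
    """Return a keep/skip flag for each frame of a constant-rate source."""
    keep: List[bool] = []
    frame_dt = 1.0 / base_rate_hz
    elapsed = float("inf")  # force the first frame to be kept
    for speed in frame_speeds:
        elapsed += frame_dt
        # Longest display interval the model allows at this local speed.
        max_interval = 1.0 / min_sample_rate(speed)
        if elapsed >= max_interval:
            keep.append(True)
            elapsed = 0.0
        else:
            keep.append(False)
    return keep


if __name__ == "__main__":
    # Hypothetical stand-in model: faster motion requires a higher minimum rate,
    # clamped between 24 Hz and the 120 Hz source rate. Not the paper's model.
    def model(speed_px_s: float) -> float:
        return min(120.0, max(24.0, 0.05 * speed_px_s))

    speeds = [100, 150, 800, 1600, 900, 200, 50, 40]  # made-up speeds in px/s
    print(adaptive_frame_schedule(speeds, base_rate_hz=120.0, min_sample_rate=model))

In this sketch, slow-motion stretches let many frames be dropped while fast-motion stretches keep nearly every frame, which is the mechanism behind the variable-rate playback and the reported bit rate savings.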
