SciSpace - formally typeset
Topic

Residual frame

About: Residual frame is a research topic. Over the lifetime, 4443 publications have been published within this topic receiving 68784 citations.


Papers
Patent
08 Mar 1993
Abstract: Motion vectors from one video frame to another are detected by segmenting a present frame of video data into plural blocks and then comparing a block in the present frame to a corresponding block in a preceding frame to detect rotational and zoom movement of the present block relative to the preceding block, in addition to rectilinear movement.

29 citations
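The block-matching idea in the abstract above can be sketched as follows. This is a minimal illustration, not the patent's method: it estimates only rectilinear motion by exhaustive sum-of-absolute-differences (SAD) search, omitting the rotational and zoom detection the patent adds, and the frame representation, block size, and search range are assumptions.

```python
# Minimal block-matching sketch (hypothetical; frames are lists of
# pixel rows). Finds the (dy, dx) displacement of a block from the
# preceding frame to the present frame by exhaustive SAD search.

def sad(block_a, block_b):
    """Sum of absolute differences between two equal-sized blocks."""
    return sum(abs(a - b)
               for row_a, row_b in zip(block_a, block_b)
               for a, b in zip(row_a, row_b))

def extract_block(frame, top, left, size):
    """Cut a size x size block out of a frame."""
    return [row[left:left + size] for row in frame[top:top + size]]

def motion_vector(prev_frame, cur_frame, top, left, size, search=2):
    """Return the (dy, dx) offset into the preceding frame that best
    matches the current block, searching +/- `search` pixels."""
    cur_block = extract_block(cur_frame, top, left, size)
    best = None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ty, tx = top + dy, left + dx
            if ty < 0 or tx < 0 or ty + size > len(prev_frame) \
                    or tx + size > len(prev_frame[0]):
                continue  # candidate block falls outside the frame
            cost = sad(extract_block(prev_frame, ty, tx, size), cur_block)
            if best is None or cost < best[0]:
                best = (cost, (dy, dx))
    return best[1]
```

A full codec would run this per block over the whole segmented frame; extending the search to rotated and scaled candidate blocks is what the patent's rotational/zoom detection adds on top of this.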

Patent
10 Mar 2000
TL;DR: In this paper, the authors propose a method for segmenting variable-length frames into fixed-length cells in network equipment, preparing the information needed to build the cells that result from segmenting a succession of frames directed to the same destination.

Abstract: Method and apparatus for segmenting variable-length frames into fixed-length cells in network equipment. The method prepares information to build the cells resulting from the segmenting of a succession of frames directed to the same destination in said network equipment. A cell may be packed with more than one frame. The cell information comprises: the address from which to read the frame data in a first storage unit; the cell header itself, which indicates whether the cell includes data from one packet or from several packets; a pointer for each such packet, designating the end of the previous packet's data within the cell; and a cell type field indicating one of the following types: start of a new frame, continuation of frame, end of current frame, and start-and-end of a new frame. The segmenting apparatus comprises a finite state machine that uses an add/subtract unit to compute the cell information and write said cell information to a second storage unit.

29 citations
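The cell type field described in the abstract above can be sketched with a toy segmenter. This is a hypothetical illustration: the cell size and type mnemonics are made up, and the multi-frame packing, cell header, and end-of-packet pointers are omitted; none of it is the patent's actual cell format.

```python
# Toy segmenter: split one variable-length frame (bytes) into
# fixed-size cells, tagging each cell with a type field as in the
# abstract: start of frame, continuation, end, or start-and-end.

CELL_SIZE = 4  # payload bytes per cell (assumed for illustration)

START, CONT, END, START_END = "SOF", "COF", "EOF", "SEOF"

def segment(frame):
    """Return the frame as a list of (cell_type, payload) cells."""
    chunks = [frame[i:i + CELL_SIZE]
              for i in range(0, len(frame), CELL_SIZE)]
    cells = []
    for i, chunk in enumerate(chunks):
        first, last = i == 0, i == len(chunks) - 1
        if first and last:
            kind = START_END    # whole frame fits in one cell
        elif first:
            kind = START
        elif last:
            kind = END
        else:
            kind = CONT
        cells.append((kind, chunk))
    return cells
```

The type field is what lets a reassembler at the destination rebuild frame boundaries from a stream of fixed-size cells.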

Proceedings ArticleDOI
TL;DR: Reports the development of an algorithm to identify and reverse frame removal and insertion; at the heart of the algorithm lies the concept of a frame-pair [f, f*].
Abstract: A plausible motivation for video tampering is to alter the sequence of events. This goal can be achieved by re-indexing attacks such as frame removal, insertion, or shuffling. In this work, we report on the development of an algorithm to identify and, subject to certain limitations, reverse such acts. At the heart of the algorithm lies the concept of a frame-pair [f, f*]. Frame-pairs are unique in two ways. The first frame is the basis for watermarking of the second frame sometime in the future. A key that is unique to the location of frame f governs frame-pair temporal separation. Watermarking is done by producing a low-resolution version of the 24-bit frame, spreading it, and then embedding it in the color space of f*. As such, watermarking f* is tantamount to embedding a copy of frame f in a future frame. Having tied one frame, in content and timing, to another frame downstream, frame removal and insertion can be identified and, subject to certain limitations, reversed.

29 citations
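The frame-pair idea can be sketched in miniature. This is a deliberately simplified illustration, not the paper's scheme: spread-spectrum embedding in the color space is replaced here by plain LSB substitution on a grayscale frame, and the key-governed temporal separation is not modeled.

```python
# Toy frame-pair sketch: embed a low-resolution copy of frame f in a
# later frame f* by writing each thumbnail pixel's top bit into the
# carrier's least significant bit. (The paper spreads and embeds in
# color space; LSB substitution is a stand-in for illustration.)

def downsample(frame, factor=2):
    """Keep every `factor`-th pixel in each dimension."""
    return [row[::factor] for row in frame[::factor]]

def embed(carrier, thumb):
    """Return a copy of `carrier` with thumb's top bits in its LSBs."""
    out = [row[:] for row in carrier]
    for y, row in enumerate(thumb):
        for x, p in enumerate(row):
            out[y][x] = (out[y][x] & ~1) | (p >> 7 & 1)
    return out

def extract(carrier, h, w):
    """Recover the embedded h x w bit plane from the carrier's LSBs."""
    return [[carrier[y][x] & 1 for x in range(w)] for y in range(h)]
```

Comparing the bit plane extracted from f* against the (downsampled) frame found at the key-derived distance upstream is what exposes a removed or inserted frame.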

Patent
12 Nov 2004
TL;DR: In this article, a video codec efficiently signals that a frame is identical to its reference frame, such that separate coding of its picture content is skipped; for bit-rate efficiency, the skip information is represented jointly with the frame coding type in a single coding table.
Abstract: A video codec efficiently signals that a frame is identical to its reference frame, such that separate coding of its picture content is skipped. Information that a frame is skipped is represented jointly in a coding table of a frame coding type element for bit rate efficiency in signaling. Further, the video codec signals the picture type (e.g., progressive or interlaced) of skipped frames, which permits different repeat padding methods to be applied according to the picture type.

29 citations
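The joint-signaling idea can be illustrated with a toy variable-length coding table. The codewords and frame-type set below are invented for illustration, not taken from the patent; the point is only that SKIP is one entry in the same prefix-free table as the other frame coding types, so no separate skip flag is spent per frame.

```python
# Hypothetical joint coding table: frame coding type and "skipped"
# status share one prefix-free variable-length code.

FRAME_TYPE_CODE = {"P": "0", "I": "10", "B": "110", "SKIP": "111"}
DECODE = {v: k for k, v in FRAME_TYPE_CODE.items()}

def encode_types(types):
    """Concatenate the codewords for a sequence of frame types."""
    return "".join(FRAME_TYPE_CODE[t] for t in types)

def decode_types(bits):
    """Walk the bitstring, emitting a type at each codeword boundary
    (valid because the table is prefix-free)."""
    types, word = [], ""
    for b in bits:
        word += b
        if word in DECODE:
            types.append(DECODE[word])
            word = ""
    return types
```

The patent additionally signals the picture type (progressive or interlaced) of skipped frames so the decoder can choose the matching repeat-padding method; that second element is not modeled here.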

Patent
06 Apr 1998
TL;DR: In this paper, the authors propose a method of reducing noise in a video signal which includes a plurality of video frames composed of a plurality of pixels, the method comprising the steps of: comparing video information contained in a current video frame and a plurality of temporally adjacent video frames; selecting from the current video frame and the adjacent video frames the video information that, according to a predetermined condition, is likely to be correct for the current video frame; and finally assigning the selected video information to the current video frame to produce a video frame wherein noise has been reduced.
Abstract: A method of reducing noise in a video signal which includes a plurality of video frames being composed of a plurality of pixels, the method comprising the steps of: comparing video information contained in a current video frame and a plurality of temporally adjacent video frames; selecting from the current video frame and the adjacent video frames the video information that according to a predetermined condition is likely to be correct for the current video frame; and finally assigning the selected video information to the current video frame to thereby produce a video frame wherein noise has been reduced.

29 citations
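A minimal sketch of the selection step, assuming the "predetermined condition" is a temporal median — one common choice, which the patent does not fix: for each pixel, take the median across the current frame and its immediate temporal neighbours, which suppresses impulsive noise that appears in only one frame.

```python
# Temporal median denoising sketch (the median is an assumed stand-in
# for the patent's unspecified "predetermined condition").
from statistics import median

def temporal_denoise(frames, index):
    """Denoise frames[index] (a list of pixel rows) using its
    immediate temporal neighbours, clamped at sequence edges."""
    window = frames[max(index - 1, 0):index + 2]
    h, w = len(frames[index]), len(frames[index][0])
    return [[median(f[y][x] for f in window) for x in range(w)]
            for y in range(h)]
```

Because the median discards outliers, a pixel corrupted in a single frame is replaced by the consistent value seen in the adjacent frames, while genuine static content passes through unchanged.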


Network Information
Related Topics (5)
Feature (computer vision)
128.2K papers, 1.7M citations
81% related
Feature extraction
111.8K papers, 2.1M citations
80% related
Image segmentation
79.6K papers, 1.8M citations
80% related
Image processing
229.9K papers, 3.5M citations
78% related
Pixel
136.5K papers, 1.5M citations
78% related
Performance
Metrics
No. of papers in the topic in previous years
Year    Papers
2023    13
2022    23
2021    7
2020    4
2019    6
2018    11