Topic

Aperture

About: Aperture is a research topic. Over its lifetime, 44,310 publications have been published within this topic, receiving 467,495 citations. The topic is also known as: diaphragm.


Papers
Proceedings ArticleDOI
29 Jul 2007
TL;DR: A simple modification to a conventional camera is proposed: a patterned occluder is inserted within the aperture of the camera lens, creating a coded aperture, and a criterion for depth discriminability is introduced and used to design the preferred aperture pattern.
Abstract: A conventional camera captures blurred versions of scene information away from the plane of focus. Camera systems have been proposed that allow for recording all-focus images, or for extracting depth, but to record both simultaneously has required more extensive hardware and reduced spatial resolution. We propose a simple modification to a conventional camera that allows for the simultaneous recovery of both (a) high resolution image information and (b) depth information adequate for semi-automatic extraction of a layered depth representation of the image. Our modification is to insert a patterned occluder within the aperture of the camera lens, creating a coded aperture. We introduce a criterion for depth discriminability which we use to design the preferred aperture pattern. Using a statistical model of images, we can recover both depth information and an all-focus image from single photographs taken with the modified camera. A layered depth map is then extracted, requiring user-drawn strokes to clarify layer assignments in some cases. The resulting sharp image and layered depth map can be combined for various photographic applications, including automatic scene segmentation, post-exposure refocusing, or re-rendering of the scene from an alternate viewpoint.

1,489 citations
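
The coded-aperture recovery described in the abstract above can be illustrated with a rough sketch: the photograph is deconvolved with the aperture pattern scaled for each candidate depth, and the depth whose deconvolution looks most natural (least ringing) is selected. The Wiener-style deconvolution, the gradient-sparsity score, and the function names below are illustrative assumptions standing in for the paper's statistical image model; a real implementation would also run this per image window to build the layered depth map.

import numpy as np
from numpy.fft import fft2, ifft2

def wiener_deconvolve(image, psf, snr=0.01):
    # Frequency-domain deconvolution; an illustrative stand-in for the
    # paper's deconvolution under a sparse gradient prior.
    H = fft2(psf, s=image.shape)
    G = fft2(image)
    F = np.conj(H) * G / (np.abs(H) ** 2 + snr)
    return np.real(ifft2(F))

def gradient_score(img):
    # Deconvolving with a wrongly scaled kernel produces ringing, which
    # inflates gradient magnitudes; a lower score means a more plausible depth.
    return np.abs(np.diff(img, axis=0)).mean() + np.abs(np.diff(img, axis=1)).mean()

def estimate_depth(image, scaled_psfs):
    # scaled_psfs: list of (depth, psf) pairs, the coded-aperture pattern
    # scaled to the blur size expected at each candidate depth.
    candidates = [(depth, wiener_deconvolve(image, psf)) for depth, psf in scaled_psfs]
    scores = [gradient_score(est) for _, est in candidates]
    return candidates[int(np.argmin(scores))]   # (depth, all-focus estimate)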

Book
01 Jan 1995
TL;DR: In this book, the authors present spotlight SAR fundamentals and the polar format algorithm, including correction for non-planar motion, classification and management of phase errors, requirements on a practical SAR motion sensor, moving-target effects, autofocus techniques, and alternative image formation algorithms (range migration and chirp scaling).
Abstract: Part 1 Introduction: spotlight SAR; SAR modes; importance of spotlight SAR; early SAR chronology. Part 2 Synthetic aperture radar fundamentals: SAR system overview; imaging considerations; pulse compression and range resolution; synthetic aperture technique for azimuth resolution; SAR coherence requirements; signal phase equation; inverse SAR (ISAR); SAR sensor parametric design. Part 3 Spotlight SAR and polar format algorithm: scope of processing task; polar format overview; polar data storage as a two-dimensional signal; correction for non-planar motion; polar format algorithm limitations; Taylor series expansion procedures; phase of image pixels; image geometric distortion; image focus error equations; displacements and absolute positioning. Part 4 Digital polar format processing: sampling rate conversion; polyphase filters; polar interpolation; image scale factors; image distortion correction; signal history projections; stabilized scene polar interpolation; subpatch processing and mosaicking. Part 5 Phase errors: classification of phase error; management of phase error; magnitude of phase error; requirements on a practical SAR motion sensor; moving target effects. Part 6 Autofocus techniques: mapdrift; multiple aperture mapdrift; phase difference; phase gradient; prominent point processing; considerations for space-variant refocus. Part 7 Processor design examples: the common UNIX SAR processor; the ground to air imaging radar processor. Part 8 SAR system performance: image quality metrics; system performance budgeting; requirements on system impulse response; requirements on system noise; geometric distortion; secondary image quality metrics; test arrays. Part 9 Spotlight processing applications: spotlight processing of scan and stripmap SAR data; interferometric SAR; forward look SAR; vibrating target detection. Part 10 Range migration algorithm: model; algorithm overview; analytical development; discussion; efficient algorithms for range migration processing. Part 11 Chirp scaling algorithm: non-dechirped signal model; algorithm overview; analytical development; discussion. Part 12 Comparison of image formation algorithms: image formation algorithm models; computational complexity; memory requirements; other considerations.

1,381 citations
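
A minimal sketch of the polar format algorithm covered in Parts 3 and 4, under simplifying assumptions: the dechirped phase-history samples lie on a polar raster in the spatial-frequency plane, so they are resampled onto a Cartesian grid and a two-dimensional inverse FFT forms the complex image. The array names and the generic scattered-data interpolation below are mine; the book develops separable polyphase interpolation plus the motion compensation, phase error management, and autofocus steps omitted here.

import numpy as np
from numpy.fft import ifft2, fftshift
from scipy.interpolate import griddata

def polar_format_image(phase_history, k_radial, angles, n_grid=512):
    # phase_history: 2-D array of dechirped samples, one row per pulse.
    # k_radial: range spatial-frequency sample locations for each pulse.
    # angles: aspect angle of each pulse in radians.
    kx = k_radial * np.cos(angles)[:, None]     # polar sample locations
    ky = k_radial * np.sin(angles)[:, None]     # in the (kx, ky) plane

    # Target Cartesian grid spanning the annulus actually sampled.
    kx_lin = np.linspace(kx.min(), kx.max(), n_grid)
    ky_lin = np.linspace(ky.min(), ky.max(), n_grid)
    KX, KY = np.meshgrid(kx_lin, ky_lin)

    # Resample the polar data onto the Cartesian grid (real and imaginary parts).
    pts = np.column_stack([kx.ravel(), ky.ravel()])
    vals = phase_history.ravel()
    grid = griddata(pts, vals.real, (KX, KY), fill_value=0.0) \
         + 1j * griddata(pts, vals.imag, (KX, KY), fill_value=0.0)

    # A 2-D inverse FFT of the reformatted data yields the complex image.
    return fftshift(ifft2(fftshift(grid)))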

Journal Article
TL;DR: In this article, a self-scanned 1024-element photodiode array and a minicomputer are used to measure the phase (wavefront) in the interference pattern of an interferometer to lambda/100.
Abstract: A self-scanned 1024-element photodiode array and minicomputer are used to measure the phase (wavefront) in the interference pattern of an interferometer to lambda/100. The photodiode array samples intensities over a 32 x 32 matrix in the interference pattern as the length of the reference arm is varied piezoelectrically. Using these data, the minicomputer synchronously detects the phase at each of the 1024 points by a Fourier series method and displays the wavefront in contour and perspective plot on a storage oscilloscope in less than 1 min (Bruning et al. Paper WE16, OSA Annual Meeting, Oct. 1972). The array of intensities is sampled and averaged many times in a random fashion so that the effects of air turbulence, vibrations, and thermal drifts are minimized. Very significant is the fact that wavefront errors in the interferometer are easily determined and may be automatically subtracted from current or subsequent wavefronts. Various programs supporting the measurement system include software for determining the aperture boundary, sum and difference of wavefronts, removal or insertion of tilt and focus errors, and routines for spatial manipulation of wavefronts. FFT programs transform wavefront data into point spread function and modulus and phase of the optical transfer function of lenses. Display programs plot these functions in contour and perspective. The system has been designed to optimize the collection of data to give higher than usual accuracy in measuring the individual elements and final performance of assembled diffraction-limited optical systems, and furthermore, the short loop time of a few minutes makes the system an attractive alternative to constraints imposed by test glasses in the optical shop.

1,300 citations
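
The synchronous phase detection described above reduces to a compact calculation: with the reference arm stepped through N known phase shifts, the wrapped phase at each detector element is the arctangent of the sine- and cosine-weighted sums of the recorded intensities. The equal-step schedule, the fringe sign convention, and the function name in this sketch are assumptions; the original system additionally averages many randomized scans to suppress turbulence, vibration, and drift.

import numpy as np

def phase_from_steps(frames):
    # frames: array of shape (N, H, W) holding N interferograms recorded at
    # equally spaced reference-arm phase shifts of 2*pi/N. Returns the wrapped
    # wavefront phase at each of the H*W detector elements.
    n = frames.shape[0]
    deltas = 2.0 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(deltas), frames, axes=1)   # sum_i I_i * sin(delta_i)
    den = np.tensordot(np.cos(deltas), frames, axes=1)   # sum_i I_i * cos(delta_i)
    # For fringes I = A + B*cos(phi + delta), the wrapped phase is:
    return np.arctan2(-num, den)

# Example: a 32 x 32 sampling grid, as in the original system, with 8 phase
# steps per measurement cycle:
# phase = phase_from_steps(np.stack(interferograms))   # interferograms: 8 arrays of shape (32, 32)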

Journal ArticleDOI
01 Jul 2005
TL;DR: A unique array of 100 custom video cameras that the authors built is described, and their experiences using this array in a range of imaging applications are summarized.
Abstract: The advent of inexpensive digital image sensors and the ability to create photographs that combine information from a number of sensed images are changing the way we think about photography. In this paper, we describe a unique array of 100 custom video cameras that we have built, and we summarize our experiences using this array in a range of imaging applications. Our goal was to explore the capabilities of a system that would be inexpensive to produce in the future. With this in mind, we used simple cameras, lenses, and mountings, and we assumed that processing large numbers of images would eventually be easy and cheap. The applications we have explored include approximating a conventional single center of projection video camera with high performance along one or more axes, such as resolution, dynamic range, frame rate, and/or large aperture, and using multiple cameras to approximate a video camera with a large synthetic aperture. This permits us to capture a video light field, to which we can apply spatiotemporal view interpolation algorithms in order to digitally simulate time dilation and camera motion. It also permits us to create video sequences using custom non-uniform synthetic apertures.

1,285 citations
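
The large synthetic aperture mentioned in the abstract can be sketched as a shift-and-add refocus: each camera's frame is shifted in proportion to its offset from the array centre, scaled by a refocusing parameter, and the shifted frames are averaged so that objects at the chosen depth align and stay sharp while the rest blurs as if seen through one very large aperture. The planar shift model, integer-pixel shifts, and names below are simplifying assumptions, not the array's actual calibration and warping pipeline.

import numpy as np

def synthetic_aperture_refocus(images, offsets, alpha):
    # images: list of H x W (or H x W x 3) frames from the camera array.
    # offsets: (N, 2) array of each camera's (dx, dy) position relative to
    # the array centre, in baseline units.
    # alpha: refocusing parameter; it selects which depth plane stays sharp.
    acc = np.zeros_like(images[0], dtype=np.float64)
    for img, (dx, dy) in zip(images, offsets):
        # Integer-pixel shift proportional to the camera offset; sub-pixel
        # interpolation would be used in practice. np.roll wraps at the
        # borders, so a real pipeline would crop or pad instead.
        sx, sy = int(round(alpha * dx)), int(round(alpha * dy))
        acc += np.roll(np.roll(img, sy, axis=0), sx, axis=1)
    return acc / len(images)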

Journal ArticleDOI
TL;DR: The authors describe a camera for performing single-lens stereo analysis. It incorporates a single main lens along with a lenticular array placed at the sensor plane, and it extracts information about both horizontal and vertical parallax, which improves the reliability of the depth estimates.
Abstract: Ordinary cameras gather light across the area of their lens aperture, and the light striking a given subregion of the aperture is structured somewhat differently than the light striking an adjacent subregion. By analyzing this optical structure, one can infer the depths of the objects in the scene, i.e. one can achieve single lens stereo. The authors describe a camera for performing this analysis. It incorporates a single main lens along with a lenticular array placed at the sensor plane. The resulting plenoptic camera provides information about how the scene would look when viewed from a continuum of possible viewpoints bounded by the main lens aperture. Deriving depth information is simpler than in a binocular stereo system because the correspondence problem is minimized. The camera extracts information about both horizontal and vertical parallax, which improves the reliability of the depth estimates.

1,229 citations
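
One way to picture how the lenticular design yields single-lens stereo: each microlens images the main-lens aperture onto a small patch of pixels, so taking the same pixel position under every microlens extracts a sub-aperture view, and the disparity between views drawn from opposite sides of the aperture encodes depth. The square s x s microlens layout, the crude block matching, and the function names below are illustrative assumptions rather than the authors' processing.

import numpy as np

def subaperture_view(raw, s, u, v):
    # raw: sensor image whose height and width are multiples of s, with an
    # s x s block of pixels behind each microlens. (u, v) selects which pixel
    # within each block to take, i.e. which part of the main-lens aperture
    # the extracted view looks through.
    h, w = raw.shape[:2]
    return raw[u:h:s, v:w:s]

def horizontal_disparity(raw, s, row_block=8):
    # Crude block-matching disparity between the left- and right-most
    # sub-aperture views; vertical parallax can be measured the same way
    # using views taken from the top and bottom of the aperture.
    left = subaperture_view(raw, s, s // 2, 0).astype(np.float64)
    right = subaperture_view(raw, s, s // 2, s - 1).astype(np.float64)
    best = np.zeros(left.shape[0] // row_block)
    for i in range(best.size):
        strip_l = left[i * row_block:(i + 1) * row_block]
        strip_r = right[i * row_block:(i + 1) * row_block]
        errs = [np.mean((np.roll(strip_r, d, axis=1) - strip_l) ** 2)
                for d in range(-4, 5)]
        best[i] = np.argmin(errs) - 4
    return best   # per-strip horizontal shift, a proxy for depth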


Network Information
Related Topics (5)
Amplifier: 163.9K papers, 1.3M citations, 85% related
Optical fiber: 167K papers, 1.8M citations, 84% related
Scattering: 152.3K papers, 3M citations, 82% related
Beam (structure): 155.7K papers, 1.4M citations, 81% related
Image processing: 229.9K papers, 3.5M citations, 81% related
Performance Metrics
No. of papers in the topic in previous years:
Year    Papers
2022    9
2021    961
2020    1,429
2019    1,656
2018    1,526
2017    1,417