scispace - formally typeset
Topic

Light field

About: Light field is a research topic. Over the lifetime, 5357 publications have been published within this topic receiving 87424 citations.


Papers
Patent
28 Mar 2013
TL;DR: In this paper, an optical sensor assigned to a part of a motor vehicle is arranged as a light-field sensor to detect direction information of light beams contained in the light field, and an image output device outputs the acquired and/or evaluated information.
Abstract: The system has an optical sensor (4) assigned to a part of a motor vehicle (1). The optical sensor is arranged as a light field sensor for detection of direction information of a light beam contained in a light field. The optical sensor allows detection of position and/or intensity and/or wavelength and/or polarization and/or color of the light field. An image output device is utilized for outputting acquired and/or evaluated information.

39 citations

Patent
10 Jan 2012
TL;DR: In this paper, a 3D display consisting of a light source, a beam scanner and a beam deflector array is presented to reproduce a light field by changing a direction of light rays scanned by the beam scanner.
Abstract: Provided is a 3-dimensional (3D) display apparatus including a light source, a beam scanner, and a beam deflector array. The beam scanner scans light emitted by the light source, and the beam deflector array includes a plurality of beam deflectors arranged in an array to reproduce a light field by changing a direction of light rays scanned by the beam scanner.

39 citations

Journal ArticleDOI
TL;DR: Inspired by the inherent depth cues and geometry constraints of light fields, three novel unsupervised loss functions are introduced: a photometric loss, a defocus loss, and a symmetry loss; the method achieves satisfactory performance in most error metrics, and its effectiveness and generality are demonstrated on real-world light-field images.
Abstract: Learning-based depth estimation from light fields has made significant progress in recent years. However, most existing approaches work within the supervised framework, which requires vast quantities of ground-truth depth data for training. Furthermore, accurate depth maps of light fields are rarely available except for a few synthetic datasets. In this paper, we exploit the multi-orientation epipolar geometry of the light field and propose an unsupervised monocular depth estimation network. It predicts depth from the central view of the light field without any ground-truth information. Inspired by the inherent depth cues and geometry constraints of the light field, we then introduce three novel unsupervised loss functions: a photometric loss, a defocus loss, and a symmetry loss. We have evaluated our method on a public 4D light field synthetic dataset. As the first unsupervised method published on the 4D Light Field Benchmark website, our method achieves satisfactory performance in most error metrics. Comparison experiments with two state-of-the-art unsupervised methods demonstrate the superiority of our method. We also demonstrate the effectiveness and generality of our method on real-world light-field images.
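The photometric loss mentioned above compares the central view against a neighboring sub-aperture view warped by the predicted disparity. A minimal numpy sketch of that idea follows; the toy data, integer horizontal shifts, and function names are illustrative only, whereas the actual method uses per-pixel, sub-pixel disparity maps and differentiable warping inside a network:

```python
import numpy as np

def warp_view(side_view, disparity, direction):
    """Shift a neighboring sub-aperture view toward the center view.

    For simplicity the disparity is a single integer number of pixels and
    the shift is purely horizontal (with wrap-around via np.roll).
    """
    shift = int(round(disparity)) * direction
    return np.roll(side_view, shift, axis=1)

def photometric_loss(center_view, side_view, disparity, direction=1):
    """Mean absolute photometric error between the center view and a
    neighboring view warped by the candidate disparity."""
    warped = warp_view(side_view, disparity, direction)
    return float(np.mean(np.abs(center_view - warped)))

# Toy example: the side view is the center view shifted by 3 pixels,
# so a disparity of 3 should give exactly zero photometric loss.
center = np.tile(np.arange(16.0), (8, 1))   # simple horizontal ramp image
side = np.roll(center, -3, axis=1)          # simulated adjacent view
best = min(range(6), key=lambda d: photometric_loss(center, side, d))
```

Minimizing this loss over candidate disparities recovers the true shift (here, 3), which is the intuition behind using photometric consistency as an unsupervised training signal.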

39 citations

Journal ArticleDOI
TL;DR: In this article, a nanophotonic light-field camera incorporating a spin-multiplexed bifocal metalens array is demonstrated, capable of capturing high-resolution light-field images over a record depth of field ranging from centimeter to kilometer scale and simultaneously enabling macro and telephoto modes in snapshot imaging.
Abstract: A unique bifocal compound-eye visual system found in the now-extinct trilobite Dalmanitina socialis may have made it sensitive to light-field information and able to perceive both close and distant objects in the environment simultaneously. Here, inspired by the optical structure of its eyes, we demonstrate a nanophotonic light-field camera incorporating a spin-multiplexed bifocal metalens array capable of capturing high-resolution light-field images over a record depth of field ranging from centimeter to kilometer scale, simultaneously enabling macro and telephoto modes in snapshot imaging. By leveraging a multi-scale convolutional neural network-based reconstruction algorithm, optical aberrations induced by the metalens are eliminated, thereby significantly relaxing the design and performance limitations on metasurface optics. The elegant integration of nanophotonic technology with computational photography achieved here is expected to aid the development of future high-performance imaging systems.

39 citations

Patent
Joo-young Kang, Byung Kwan Park, Sang Wook Han, Seong-Deok Lee, Won-Hee Choe, Jae-guyn Lim
17 Sep 2010
TL;DR: In this paper, a light field data acquisition apparatus includes a modulator with an attenuation pattern to spatially modulate a 4D light field for an image, and a sensor to acquire 2D signals of the spatially modulated 4D light field.
Abstract: A technology for acquiring and processing light field data for images is provided. A light field data acquisition apparatus includes a modulator with an attenuation pattern to spatially modulate a 4D light field for an image, and a sensor to acquire 2D signals of the spatially modulated 4D light field. By utilizing the attenuation pattern of the modulator, more spatial data may be acquired in a low angular frequency region than in a high angular frequency region.
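The modulation-then-sensing pipeline described in this abstract can be sketched in a few lines of numpy. This is a toy illustration, not the patent's design: the dimensions, the random light field, and the random attenuation mask are all made-up placeholders; only the structure (attenuate each angular slice, then integrate over the angular dimensions onto a 2D sensor) reflects the text.

```python
import numpy as np

# Hypothetical dimensions: (U, V) angular samples, (X, Y) spatial samples.
U, V, X, Y = 3, 3, 8, 8
rng = np.random.default_rng(0)

light_field = rng.random((U, V, X, Y))   # toy 4D light field L(u, v, x, y)
mask = rng.random((X, Y))                # attenuation pattern, values in [0, 1]

def modulate_and_sense(lf, attenuation):
    """Apply a spatial attenuation pattern to every angular slice of the
    4D light field, then integrate over the angular dimensions, which is
    what a conventional 2D sensor measures."""
    modulated = lf * attenuation         # broadcasts the mask over (u, v)
    return modulated.sum(axis=(0, 1))    # 2D sensor measurement

sensor_image = modulate_and_sense(light_field, mask)
```

Recovering the 4D light field from such 2D measurements is then an inverse problem, and the shape of the attenuation pattern controls how the sampling budget is split between spatial and angular frequencies, as the abstract notes.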

39 citations


Network Information
Related Topics (5)
Optical fiber
167K papers, 1.8M citations
79% related
Image processing
229.9K papers, 3.5M citations
78% related
Pixel
136.5K papers, 1.5M citations
78% related
Laser
353.1K papers, 4.3M citations
78% related
Quantum information
22.7K papers, 911.3K citations
77% related
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2023  135
2022  375
2021  274
2020  493
2019  555
2018  503