Bio: Roland Shack is an academic researcher. The author has contributed to research in topics: Wavefront & Shack–Hartmann wavefront sensor. The author has an h-index of 1 and has co-authored 1 publication receiving 717 citations.
TL;DR: Images of satellites are enhanced by inserting a beam splitter in collimated space behind the eyepiece and placing a plate with holes in it at the image of the pupil, capturing a snapshot of the atmospheric aberrations rather than averaging over time.
Abstract: The Shack–Hartmann wavefront sensor developed out of a need to solve a problem. The problem was posed, in the late 1960s, to the Optical Sciences Center (OSC) at the University of Arizona by the US Air Force, which wanted to improve the images of satellites taken from earth. The earth's atmosphere limits the image quality and exposure time of stars and satellites photographed with telescopes over 5 inches in diameter at low altitudes and 10 to 12 inches in diameter at high altitudes. Dr. Aden Meinel was director of the OSC at that time. He came up with the idea of enhancing images of satellites by measuring the Optical Transfer Function (OTF) of the atmosphere and dividing the OTF of the image by the OTF of the atmosphere. The trick was to measure the OTF of the atmosphere at the same time the image was taken and to control the exposure time so as to capture a snapshot of the atmospheric aberrations rather than an average over time. The measured wavefront error in the atmosphere should not change by more than λ/10 over the exposure time; the exposure time for a low-earth-orbit satellite imaged from a mountaintop was determined to be about 1/60 second. Meinel was an astronomer and had used the standard Hartmann test (Fig. 1), in which large wooden or cardboard panels were placed over the aperture of a large telescope. The panels had an array of holes that would allow pencils of rays from stars to be traced through the telescope system. A photographic plate was placed inside and outside of focus, with sufficient separation that the pencils of rays would be separated from each other. Each hole in the panel would produce its own blurry image of the star. By taking two images a known distance apart and measuring the centroids of the images, one can trace the rays through the focal plane. Hartmann used these ray traces to calculate figures of merit for large telescopes. The data can also be used to make ray intercept curves (H′ vs. tan U′).
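In modern terms, Meinel's OTF-division scheme is an inverse filter: divide the Fourier transform of the recorded image by the simultaneously measured atmospheric OTF, guarding against division by near-zero values that would only amplify noise. A minimal sketch of that idea, assuming a Gaussian stand-in for the atmospheric OTF (function names and numbers are illustrative, not from the paper):

```python
import numpy as np

def otf_division_restore(blurred, atm_otf, eps=1e-3):
    """Inverse-filter an image: divide its spectrum by the measured
    atmospheric OTF, zeroing frequencies where the OTF is too small
    to divide safely (these would only amplify noise)."""
    img_ft = np.fft.fft2(blurred)
    otf = np.asarray(atm_otf, dtype=complex)
    restored_ft = np.zeros_like(img_ft)
    keep = np.abs(otf) > eps
    restored_ft[keep] = img_ft[keep] / otf[keep]
    return np.real(np.fft.ifft2(restored_ft))

# Toy check: blur a point source with a Gaussian stand-in OTF, then restore.
n = 64
obj = np.zeros((n, n)); obj[n // 2, n // 2] = 1.0
f = np.fft.fftfreq(n)
FX, FY = np.meshgrid(f, f)
otf = np.exp(-(FX**2 + FY**2) / (2 * 0.05**2))
blurred = np.real(np.fft.ifft2(np.fft.fft2(obj) * otf))
restored = otf_division_restore(blurred, otf)  # peak sharpens markedly
```

In practice a Wiener-style regularizer replaces the hard cutoff, but the division itself is exactly the operation Meinel proposed.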
When Meinel could not cover the aperture while taking an image of the satellite, he came up with the idea of inserting a beam splitter in collimated space behind the eyepiece and placing a plate with holes in it at the image of the pupil. Each hole would pass a pencil of rays to a vidicon tube (this was before …
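The quantitative step is the same in the classical Hartmann test and in the pupil-plane version: each hole produces a spot, and the spot centroid's displacement from a reference position, divided by the propagation distance (the lenslet focal length in the later Shack–Hartmann form), gives the local wavefront slope. A minimal sketch with toy numbers (names and values are illustrative):

```python
import numpy as np

def spot_centroid(spot):
    """Intensity-weighted centroid (row, col) of one subaperture spot."""
    ys, xs = np.indices(spot.shape)
    total = spot.sum()
    return (ys.ravel() @ spot.ravel() / total,
            xs.ravel() @ spot.ravel() / total)

def local_slopes(spot, ref_centroid, focal_length, pixel_pitch):
    """Local wavefront slope from centroid displacement: slope = dx / f."""
    cy, cx = spot_centroid(spot)
    ry, rx = ref_centroid
    return ((cy - ry) * pixel_pitch / focal_length,
            (cx - rx) * pixel_pitch / focal_length)

# Toy spot displaced +2 pixels in x from a reference at the subaperture center.
spot = np.zeros((9, 9)); spot[4, 6] = 1.0
sy, sx = local_slopes(spot, ref_centroid=(4.0, 4.0),
                      focal_length=5e-3, pixel_pitch=10e-6)
# sx = 2 px * 10 um / 5 mm = 4e-3 rad of local tilt
```

A full sensor repeats this per subaperture and integrates the slope field to reconstruct the wavefront.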
TL;DR: This method gives the wavefunction a straightforward and general definition in terms of a specific set of experimental operations, and shows that the concept is universal, being applicable to other degrees of freedom of the photon, and to other quantum systems—for example, electron spins, SQUIDs and trapped ions.
Abstract: The wavefunction, describing both the wave-like and the particle-like nature of everything in the Universe, is central to quantum theory. Physicists usually learn about it indirectly in tomographic experiments that measure only some aspects of its behaviour. Now a team from Canada's Institute for National Measurement Standards has developed a new and gentle technique that makes it possible to observe the wavefunction directly. They demonstrate the approach by measuring the transverse spatial wavefunction of a single photon. The discovery that the wavefunction can be probed directly provides a tool that could prove useful in a wide range of fields, and raises questions bordering on the philosophical about what the wavefunction actually is. The wavefunction is the complex distribution used to completely describe a quantum system, and is central to quantum theory. But despite its fundamental role, it is typically introduced as an abstract element of the theory with no explicit definition [1,2]. Rather, physicists come to a working understanding of the wavefunction through its use to calculate measurement outcome probabilities by way of the Born rule [3]. At present, the wavefunction is determined through tomographic methods [4–8], which estimate the wavefunction most consistent with a diverse collection of measurements. The indirectness of these methods compounds the problem of defining the wavefunction. Here we show that the wavefunction can be measured directly by the sequential measurement of two complementary variables of the system. The crux of our method is that the first measurement is performed in a gentle way through weak measurement [9–18], so as not to invalidate the second. The result is that the real and imaginary components of the wavefunction appear directly on our measurement apparatus. 
We give an experimental example by directly measuring the transverse spatial wavefunction of a single photon, a task not previously realized by any method. We show that the concept is universal, being applicable to other degrees of freedom of the photon, such as polarization or frequency, and to other quantum systems—for example, electron spins, SQUIDs (superconducting quantum interference devices) and trapped ions. Consequently, this method gives the wavefunction a straightforward and general definition in terms of a specific set of experimental operations [19]. We expect it to expand the range of quantum systems that can be characterized and to initiate new avenues in fundamental quantum theory.
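The structure of the direct measurement can be illustrated numerically: for a weak measurement of the position projector |x⟩⟨x| followed by postselection on zero transverse momentum, the weak value is proportional to ψ(x) itself, so the pointer's two readouts give Re ψ and Im ψ point by point. A toy sketch under that assumption (the discretized Gaussian ψ is illustrative, not data from the experiment):

```python
import numpy as np

# Illustrative discretized wavefunction: a Gaussian with a quadratic phase.
x = np.linspace(-5, 5, 256)
psi = np.exp(-x**2 / 2) * np.exp(1j * 0.3 * x**2)
psi /= np.sqrt(np.sum(np.abs(psi)**2))  # normalize

# Weak value of the projector |x><x|, postselected on p = 0:
#   <pi_x>_w = <p=0|x> <x|psi> / <p=0|psi>.
# Since <p=0|x> is the same constant for every x, the weak value is
# proportional to psi(x): its real and imaginary parts, read off the
# measurement pointer, trace out the wavefunction directly.
postselection_amp = np.sum(psi)          # proportional to <p=0|psi>
weak_values = psi / postselection_amp    # proportional to psi(x)
```

The single unknown proportionality constant is fixed afterwards by normalization, which is why the method yields the wavefunction itself rather than only probabilities.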
TL;DR: In this paper, a phase-space formulation of the transport of intensity equation (TIE) is presented for analyzing phase retrieval under partially coherent illumination, making the effect of partial coherence on phase retrieval explicit.
Abstract: The well-known transport of intensity equation (TIE) allows the phase of a coherent field to be retrieved non-interferometrically, given positive-definite intensity measurements and appropriate boundary conditions. However, in many cases, such as optical microscopy, the imaging system often involves extended and polychromatic sources for which the effect of the partial coherence is not negligible. In this work, we present a phase-space formulation of the TIE for analyzing phase retrieval under partially coherent illumination. The conventional TIE is reformulated in the joint space-spatial frequency domain using Wigner distribution functions. The phase-space formulation clarifies the physical meaning of the phase of partially coherent fields, and enables explicit account of partial coherence effects on phase retrieval. The correspondence between the Wigner distribution function and the light field in the geometric-optics limit further enables TIE to become a simple yet effective approach to realize high-resolution light field imaging for slowly varying phase specimens, in a purely computational way.
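For the coherent, uniform-intensity special case, the TIE reduces to a Poisson equation, ∇²φ = −(k/I₀) ∂I/∂z, which is routinely inverted with FFTs. A minimal sketch of that standard solver (not the paper's phase-space formulation; function names are illustrative):

```python
import numpy as np

def tie_solve_uniform(dIdz, I0, k, dx):
    """Recover phase from the axial intensity derivative via the TIE,
    assuming uniform intensity I0:  laplacian(phi) = -(k / I0) * dI/dz.
    Inverted with an FFT-based Poisson solver (periodic boundaries)."""
    n = dIdz.shape[0]
    f = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(f, f)
    lap = -4 * np.pi**2 * (FX**2 + FY**2)  # Fourier symbol of the Laplacian
    lap[0, 0] = 1.0                        # avoid 0/0 at DC
    phi_ft = np.fft.fft2(-(k / I0) * dIdz) / lap
    phi_ft[0, 0] = 0.0                     # mean phase set to zero
    return np.real(np.fft.ifft2(phi_ft))

# Self-consistency check with a known sinusoidal phase.
n, dx, k, I0 = 64, 1.0, 1.0, 1.0
xg = np.arange(n) * dx
X, Y = np.meshgrid(xg, xg)
phi_true = np.cos(2 * np.pi * 3 * X / (n * dx))
dIdz = (I0 / k) * (2 * np.pi * 3 / (n * dx))**2 * phi_true  # = -(I0/k)*lap(phi)
phi_rec = tie_solve_uniform(dIdz, I0, k, dx)
```

In experiments, dIdz is estimated by finite differences of intensity images captured at slightly different defocus distances; the partially coherent analysis above generalizes what this recovered "phase" means.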
TL;DR: A new era in which strict coherence and interferometry are no longer prerequisites for quantitative phase imaging and diffraction tomography is highlighted, paving the way toward new generation label-free three-dimensional microscopy, with applications in all branches of biomedicine.
Abstract: When it comes to “phase measurement” or “quantitative phase imaging”, many people will automatically connect them with “laser” and “interferometry”. Indeed, conventional quantitative phase imaging and phase measurement techniques generally rely on the superposition of two beams with a high degree of coherence: complex interferometric configurations, stringent requirements on environmental stability, and associated laser speckle noise severely limit their applications in optical imaging and microscopy. On a different note, as one of the most well-known phase retrieval approaches, the transport of intensity equation (TIE) provides a non-interferometric way to access quantitative phase information through intensity-only measurements. Unlike interferometry, TIE is applicable under partially coherent illumination (such as Köhler illumination in a conventional microscope), permitting optimum spatial resolution, higher signal-to-noise ratio, and better image quality. In this tutorial, we give an overview of the basic principle, research fields, and representative applications of TIE, focusing particularly on optical imaging, metrology, and microscopy. The purpose of this tutorial is twofold. It should serve as a self-contained introduction to TIE for readers with little or no knowledge of TIE. On the other hand, it attempts to give an overview of recent developments in this field. These results highlight a new era in which strict coherence and interferometry are no longer prerequisites for quantitative phase imaging and diffraction tomography, paving the way toward new-generation label-free three-dimensional microscopy, with applications in all branches of biomedicine.
TL;DR: A strong correlation between visual symptoms and ocular aberrations, such as monocular diplopia with coma, and starburst and glare with spherical aberration, suggests the LADARWave wavefront measurement device is a valuable diagnostic tool for measuring refractive error and ocular aberrations in post-LASIK eyes.
Abstract: Purpose: To evaluate the information assessed with the LADARWave wavefront measurement device and correlate it with visual symptoms, refraction, and corneal topography in previously LASIK-treated eyes. Participants: One hundred five eyes (58 patients) of individuals who underwent LASIK surgery were evaluated. Design: Retrospective, noncomparative case series. Main outcome measures: Complete ophthalmologic examination, corneal topography, and wavefront measurements were performed. Correlations were made between the examinations and symptoms. Methods: Wavefront measurements were assessed with the LADARWave device. Manifest, cycloplegic refraction, and topographic data were compared with wavefront refraction and higher-order aberrations. Visual symptoms were correlated to higher-order aberrations in 3 different pupil sizes (5-mm, 7-mm, and scotopic pupil size). Pearson's correlation coefficient and generalized estimating equations were used for statistical analysis. Results: In post-LASIK eyes, wavefront refraction components were poorly correlated to manifest and cycloplegic components. The comparison between manifest, cycloplegic, and wavefront refraction with the total amount of higher-order aberrations showed no strong correlation. The comparison between topography and manifest, cycloplegic, and wavefront refraction did not show strong correlation. Visual symptoms analysis showed correlation of double vision with total coma and with horizontal coma for the 5-mm and 7-mm pupil sizes; correlation between starburst and total coma for the 7-mm pupil size; and correlation of double vision with horizontal coma, glare with spherical aberrations and with total aberrations, and starburst with spherical aberrations for the scotopic pupil size. Scotopic pupil size had a positive association with starburst and a negative association with double vision. 
Conclusions: The LADARWave wavefront measurement device is a valuable diagnostic tool for measuring refractive error and ocular aberrations in post-LASIK eyes. A strong correlation between visual symptoms and ocular aberrations, such as monocular diplopia with coma, and starburst and glare with spherical aberration, suggests this device is valuable in diagnosing symptomatic LASIK-induced aberrations. Horizontal coma was correlated with double vision, whereas vertical coma was not.
TL;DR: An overview of recent advances in digital holography is presented, ranging from holographic techniques designed to increase the resolution of microscopic images, holographic imaging using incoherent illumination, phase retrieval with incoherent illumination, and imaging of occluded objects, to the holographic recording of depth-extended objects using a frequency-comb laser.
Abstract: This article presents an overview of recent advances in the field of digital holography, ranging from holographic techniques designed to increase the resolution of microscopic images, holographic imaging using incoherent illumination, phase retrieval with incoherent illumination, imaging of occluded objects, and the holographic recording of depth-extended objects using a frequency-comb laser, to the design of an infrastructure for remote laboratories for digital-holographic microscopy and metrology. The paper refers to current trends in digital holography and explains them using new results that were recently achieved at the Institute for Applied Optics of the University of Stuttgart.
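A numerical tool common to many of these techniques is free-space propagation of the reconstructed complex field, used for example to refocus a hologram onto the object plane. A minimal angular-spectrum sketch, assuming a square sampling grid (function name and parameters are illustrative):

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a sampled complex field a distance z by the
    angular-spectrum method (square grid, pixel pitch dx). This is
    the standard kernel for numerically refocusing digital holograms."""
    n = field.shape[0]
    f = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(f, f)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)  # evanescent components suppressed
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Round trip: propagating forward and back recovers the original field
# (exactly here, since this grid carries no evanescent components).
rng = np.random.default_rng(0)
field = rng.standard_normal((64, 64)) + 1j * rng.standard_normal((64, 64))
out = angular_spectrum_propagate(field, 0.5e-6, 5e-6, 1e-3)
back = angular_spectrum_propagate(out, 0.5e-6, 5e-6, -1e-3)
```

Because the transfer function is a pure phase for propagating components, the method conserves energy, which makes it well suited to scanning a recorded hologram through a range of refocus distances.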