Author

Zhishan Gao

Bio: Zhishan Gao is an academic researcher from Nanjing University of Science and Technology. The author has contributed to research in topics: Interferometry & Wavefront. The author has an h-index of 12, and has co-authored 78 publications receiving 450 citations.


Papers
Journal ArticleDOI
TL;DR: A detailed review of the different types of optical freeform surface representation techniques and their applications is presented, and their properties and differences are discussed.
Abstract: Modern advanced manufacturing and testing technologies allow the application of freeform optical elements. Compared with traditional spherical surfaces, an optical freeform surface has more degrees of freedom in optical design and provides substantially improved imaging performance. In freeform optics, the representation technique of a freeform surface has been a fundamental and key research topic in recent years. Moreover, it has a close relationship with other aspects of the design, manufacturing, testing, and application of optical freeform surfaces. Improvements in freeform surface representation techniques will make a significant contribution to the further development of freeform optics. We present a detailed review of the different types of optical freeform surface representation techniques and their applications and discuss their properties and differences. Additionally, we analyze the future trends of optical freeform surface representation techniques.

63 citations

Journal ArticleDOI
TL;DR: Results show that the Numerical orthogonal polynomials are superior to the other three because of their high accuracy and robustness, even in the case of a wavefront with incomplete data.
Abstract: Four orthogonal polynomials for reconstructing a wavefront over a square aperture based on the modal method are currently available, namely, the 2D Chebyshev polynomials, 2D Legendre polynomials, Zernike square polynomials, and Numerical polynomials. They are all orthogonal over the full unit square domain. 2D Chebyshev polynomials are defined by the product of Chebyshev polynomials in the x and y variables, as are 2D Legendre polynomials. Zernike square polynomials are derived by the Gram-Schmidt orthogonalization process, where the integration region is the full unit square, which circumscribes the unit circle. Numerical polynomials are obtained by numerical calculation. The present study compares these four orthogonal polynomials by theoretical analysis and numerical experiments in terms of reconstruction accuracy, remaining errors, and robustness. Results show that the Numerical orthogonal polynomials are superior to the other three because of their high accuracy and robustness, even in the case of a wavefront with incomplete data.
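The modal method compared here is easy to illustrate. The sketch below fits a wavefront sampled over a square aperture with a 2D Legendre product basis and a least-squares solve; the grid size, polynomial order, synthetic wavefront, and NaN masking are illustrative assumptions, not the paper's actual test cases.

```python
# Minimal sketch: modal wavefront fit over a square aperture using the
# 2D Legendre product basis (one of the four bases compared above).
import numpy as np
from numpy.polynomial import legendre

def legendre_2d_basis(x, y, order):
    """Evaluate products L_i(x) * L_j(y) for i + j <= order on 1-D point lists."""
    modes = []
    for i in range(order + 1):
        for j in range(order + 1 - i):
            ci = np.zeros(i + 1); ci[-1] = 1.0   # coefficient vector selecting L_i
            cj = np.zeros(j + 1); cj[-1] = 1.0   # coefficient vector selecting L_j
            modes.append(legendre.legval(x, ci) * legendre.legval(y, cj))
    return np.column_stack(modes)                # (n_points, n_modes) design matrix

# Synthetic wavefront sampled on the unit square [-1, 1]^2 (illustrative)
n = 64
x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
w = 0.3 * x**2 + 0.1 * x * y - 0.05 * y**3       # "measured" wavefront (waves)

valid = np.isfinite(w)                            # incomplete data: mask out NaNs
A = legendre_2d_basis(x[valid], y[valid], order=4)
coeffs, *_ = np.linalg.lstsq(A, w[valid], rcond=None)

w_fit = A @ coeffs
print("RMS residual:", np.sqrt(np.mean((w[valid] - w_fit) ** 2)))
```

Because the design matrix is evaluated only at valid samples, the same fit runs unchanged on a wavefront with missing data, which is the incomplete-data case the paper stresses.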

34 citations

Journal ArticleDOI
TL;DR: In this article, a windowed Fourier transform (WFT) was used to extract the phase of a white-light interferogram and to compensate for the difference in zero optical path difference (ZOPD) position in white-light scanning interferometry (WLSI).
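As a rough illustration of the WFT idea, the sketch below extracts the wrapped phase of a one-dimensional fringe signal by windowed Fourier ridges: at each sample, the candidate frequency whose Gaussian-windowed Fourier coefficient has maximal magnitude supplies the phase. The window width, candidate frequencies, and test signal are assumptions for illustration; the paper applies the WFT to white-light interferograms, not to this toy signal.

```python
# Minimal sketch: windowed Fourier ridge phase extraction from a 1-D fringe.
import numpy as np

def wft_ridge_phase(signal, freqs, sigma=8.0):
    """At each sample, keep the frequency maximizing the windowed Fourier
    coefficient magnitude; that coefficient's angle is the wrapped phase."""
    n = len(signal)
    kt = np.arange(-4 * int(sigma), 4 * int(sigma) + 1)
    window = np.exp(-kt**2 / (2 * sigma**2))      # Gaussian analysis window
    best_mag = np.zeros(n)
    phase = np.zeros(n)
    for f in freqs:
        kernel = window * np.exp(2j * np.pi * f * kt)
        coeff = np.convolve(signal, kernel, mode="same")
        mag = np.abs(coeff)
        better = mag > best_mag                   # ridge: maximal |coefficient|
        best_mag[better] = mag[better]
        phase[better] = np.angle(coeff[better])
    return phase                                  # wrapped phase estimate

# Chirped test fringe I(t) = a + b*cos(phi(t)); remove the DC term before the WFT.
t = np.arange(512)
phi = 2 * np.pi * 0.05 * t + 5e-5 * t**2
fringe = 1.0 + 0.8 * np.cos(phi)
freqs = np.linspace(0.03, 0.12, 46)               # candidate frequencies (cycles/sample)
wrapped = wft_ridge_phase(fringe - fringe.mean(), freqs)
```

Picking the ridge per sample lets the method track a slowly varying local frequency, which is what makes WFT attractive for fringe patterns whose carrier is not constant.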

26 citations

Journal ArticleDOI
TL;DR: In this article, the relationship between the Zernike polynomial coefficients and the sensitive variables (for example, the air-thickness error, and the tilt and decenter of the transmission sphere) was analyzed with an analysis program written in the Zemax program language.
Abstract: A reference transmission sphere is an important device for measuring spherical surfaces: it provides a high-quality spherical wave and a reference spherical surface with a peak-to-valley error of less than λ/20 (λ = 0.6328 μm). At this level, manual alignment alone is difficult to manage, so we study computer-aided alignment (CAA), which provides a guide for aligning the individual lenses. We describe the following work on non-iterative CAA for transmission spheres: (1) by analyzing the relationship between the Zernike polynomial coefficients and the sensitive variables (for example, the air-thickness error, the tilt, and the decenter of the transmission sphere) with an analysis program written in the Zemax program language, only the several Zernike coefficients that change linearly with these sensitive variables are chosen; (2) the magnitude and direction of the correction are found using the Moore-Penrose generalized inverse matrix; and (3) a numerical simulation and a successful alignment are carried out for a 4-in.-diameter, f/5 transmission sphere. © 2004 Society of Photo-Optical Instrumentation Engineers.
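The correction step described in (2) reduces to one linear solve. Below is a minimal sketch assuming a known linear sensitivity matrix from Zernike coefficients to misalignment variables; all numbers are illustrative stand-ins, not the paper's measured sensitivities.

```python
# Minimal sketch: computer-aided alignment correction via the Moore-Penrose
# pseudoinverse. S maps misalignment variables (air-gap error, tilt, decenter)
# to the selected Zernike coefficients, as a raytracing sensitivity analysis
# (e.g., in Zemax) would provide. Values here are made up for illustration.
import numpy as np

S = np.array([
    [0.80, 0.05, 0.00],
    [0.02, 0.60, 0.10],
    [0.00, 0.12, 0.45],
    [0.15, 0.00, 0.30],
])                                              # (n_zernike_terms, n_variables)

true_errors = np.array([0.02, -0.05, 0.03])     # unknown misalignments
measured = S @ true_errors + 1e-4 * np.random.randn(4)  # fitted Zernike terms

# Magnitude and direction of the correction in a single solve (no iteration):
correction = -np.linalg.pinv(S) @ measured
print("apply corrections:", correction)
```

The pseudoinverse delivers both the magnitude and the direction of all corrections at once, which is what makes the scheme non-iterative when the chosen Zernike coefficients really do vary linearly with the alignment variables.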

25 citations

Journal ArticleDOI
TL;DR: An iterative autofocusing method is proposed to correct the axial distance error; the accurately obtained distance can then be used to enhance the quality and resolution of the reconstructed image.
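As a rough sketch of how such an axial-distance search can work, the code below refocuses a complex field over candidate distances with angular-spectrum propagation and iteratively narrows the bracket around the distance that maximizes an image-sharpness metric. The propagator, the normalized-variance metric, and the bracketing strategy are stand-ins; the paper's own iterative scheme may differ in detail.

```python
# Minimal sketch: iterative autofocusing by numerical refocusing.
import numpy as np

def angular_spectrum(field, z, wavelength, dx):
    """Propagate a sampled complex field by distance z (scalar diffraction);
    evanescent components are simply dropped in this sketch."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0)
    kz = 2 * np.pi / wavelength * np.sqrt(arg)
    return np.fft.ifft2(np.fft.fft2(field) * np.exp(1j * kz * z))

def sharpness(field):
    """Normalized intensity variance; peaks near focus for amplitude objects."""
    I = np.abs(field) ** 2
    return I.var() / I.mean() ** 2

def autofocus(field, z_lo, z_hi, wavelength, dx, iters=5, samples=11):
    """Iteratively shrink the search bracket around the sharpest distance."""
    for _ in range(iters):
        zs = np.linspace(z_lo, z_hi, samples)
        scores = [sharpness(angular_spectrum(field, z, wavelength, dx)) for z in zs]
        k = int(np.argmax(scores))
        span = (z_hi - z_lo) / (samples - 1)
        z_lo, z_hi = zs[k] - span, zs[k] + span   # refine around the best z
    return 0.5 * (z_lo + z_hi)
```

Each iteration shrinks the bracket by roughly the sample spacing, so a few rounds pin the distance far more precisely than a single coarse scan of the same total cost.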

25 citations


Cited by
01 Jan 2017
TL;DR: In this article, the amplitude and phase are evaluated from the intensity distributions of electron micrographs using the relative defocus between micrographs, and the procedure is valid in both bright-field and dark-field microscopy for any specified coherence of the electron source.
Abstract: A method is given for the evaluation, in transmission electron microscopy, of the amplitude and phase from the intensity distribution of an electron micrograph. The method requires a minimum of two micrographs taken under different defocus conditions. The iterative scheme requires only the relative defocus between micrographs, and the procedure is valid both in bright-field and dark-field microscopy for any specified coherence of the electron source. Assumptions on the scattering properties of the specimen, such as the weak-phase-weak-amplitude object, are not required. For a complete determination of the amplitude-phase distribution for electron transmission through the specimen, the electron micrograph must be corrected for the effect of lens aberrations and defocusing to give the electron wavefunction immediately after transmission; only in the case of a weak-phase object can this wavefunction be directly related to the projected potential distribution in the object. Inelastic electron scattering is explicitly omitted from the analysis presented.
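The two-plane iterative scheme described here is close to what is now called a Misell-type algorithm, and a compact sketch is easy to give: propagate between the two defocus planes, and in each plane replace the computed modulus with the measured one while keeping the phase. The angular-spectrum propagator, grid, and parameters below are illustrative assumptions rather than the paper's electron-optical model.

```python
# Minimal sketch: iterative phase retrieval from two defocused intensity images.
import numpy as np

def propagate(field, dz, wavelength, dx):
    """Angular-spectrum propagation by defocus dz (stand-in propagator)."""
    n = field.shape[0]
    fx = np.fft.fftfreq(n, d=dx)
    FX, FY = np.meshgrid(fx, fx)
    arg = np.maximum(1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2, 0.0)
    H = np.exp(1j * 2 * np.pi / wavelength * np.sqrt(arg) * dz)
    return np.fft.ifft2(np.fft.fft2(field) * H)

def misell(amp1, amp2, dz, wavelength, dx, iters=200):
    """Recover the phase in plane 1 from the two measured amplitudes
    (square roots of the recorded intensities), given only the relative
    defocus dz between the planes."""
    field = amp1.astype(complex)                   # start with zero phase
    for _ in range(iters):
        f2 = propagate(field, dz, wavelength, dx)
        f2 = amp2 * np.exp(1j * np.angle(f2))      # enforce plane-2 modulus
        f1 = propagate(f2, -dz, wavelength, dx)
        field = amp1 * np.exp(1j * np.angle(f1))   # enforce plane-1 modulus
    return np.angle(field)
```

Only the relative defocus enters the iteration, matching the abstract's point that no assumptions about the specimen's scattering properties are needed.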

225 citations

Journal ArticleDOI
20 Feb 2021
TL;DR: This article begins with a brief history of freeform optics, focusing on imaging systems, including marketplace emergence, and describes fabrication methods, emphasizing deterministic computer numerical control grinding, polishing, and diamond machining.
Abstract: In the last 10 years, freeform optics has enabled compact and high-performance imaging systems. This article begins with a brief history of freeform optics, focusing on imaging systems, including marketplace emergence. The development of this technology is motivated by the clear opportunity to enable science across a wide range of applications, spanning from extreme ultraviolet lithography to space optics. Next, we define freeform optics and discuss concurrent engineering that brings together design, fabrication, testing, and assembly into one process. We then lay out the foundations of the aberration theory for freeform optics and emerging design methodologies. We describe fabrication methods, emphasizing deterministic computer numerical control grinding, polishing, and diamond machining. Next, we consider mid-spatial frequency errors that inherently result from freeform fabrication techniques. We realize that metrologies of freeform optics are simultaneously sparse in their existence but diverse in their potential. Thus, we focus on metrology techniques demonstrated for the measurement of freeform optics. We conclude this review with an outlook on the future of freeform optics.

123 citations

Journal ArticleDOI
TL;DR: This paper reviews recent developments in non-contact three-dimensional (3D) surface metrology using an active structured optical probe and discusses the principles, advantages, and limitations of each technology.
Abstract: This paper reviews recent developments of non-contact three-dimensional (3D) surface metrology using an active structured optical probe. We focus primarily on those active non-contact 3D surface measurement techniques that could be applicable to the manufacturing industry. We discuss the principles of each technology, along with its advantageous characteristics and limitations. Towards the end, we discuss our perspectives on the current technological challenges in designing and implementing these methods in practical applications.

109 citations

Book ChapterDOI
01 Jan 2004
TL;DR: The chapter reviews the most important methods for obtaining transformed signal characteristics such as principal component analysis, the discrete Fourier transform, and the discrete cosine and sine transform.
Abstract: This chapter gives an overview of the most relevant feature selection and extraction methods for biomedical image processing. Besides the traditional transformed and non-transformed signal characteristics and texture, feature extraction methods encompass structural and graph descriptors. The feature selection methods described in this chapter are the exhaustive search, branch and bound algorithm, max-min feature selection, sequential forward and backward selection, and also Fisher's linear discriminant. Feature extraction and selection in pattern recognition are based on finding mathematical methods for reducing the dimensionality of pattern representation. A lower-dimensional representation based on pattern descriptors is a so-called feature. It plays a crucial role in determining the separating properties of pattern classes. The choice of features, attributes, or measurements has an important influence on: the accuracy of classification, the time needed for classification, the number of examples needed for learning, and the cost of performing classification. The chapter reviews the most important methods for obtaining transformed signal characteristics such as principal component analysis, the discrete Fourier transform, and the discrete cosine and sine transform. The basic idea behind transformed signal characteristics is to find transform-based features that have low redundancy and a high information density relative to the original input.
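Of the transformed signal characteristics the chapter reviews, principal component analysis is the quickest to demonstrate. The sketch below projects synthetic patterns onto a lower-dimensional feature space via the SVD of the centered data matrix; the data shapes and component count are illustrative.

```python
# Minimal sketch: PCA as a transformed-signal feature extractor.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 64))                   # 200 patterns, 64 measurements

Xc = X - X.mean(axis=0)                          # center the data
# SVD of the centered data gives the principal directions (rows of Vt)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 8
features = Xc @ Vt[:k].T                         # k-dimensional feature vectors

explained = (s[:k] ** 2).sum() / (s ** 2).sum()
print(f"{k} components retain {explained:.1%} of the variance")
```

Keeping only the leading components is exactly the low-redundancy, high-information-density trade-off the abstract describes: each retained feature is uncorrelated with the others and ordered by how much of the input variance it carries.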

97 citations