Jingyu Yang
Researcher at Nanjing University of Science and Technology
Publications - 389
Citations - 16961
Jingyu Yang is an academic researcher at Nanjing University of Science and Technology. He has contributed to research on topics including feature extraction and linear discriminant analysis, has an h-index of 57, and has co-authored 385 publications receiving 15,583 citations. Previous affiliations of Jingyu Yang include Jiangsu University.
Papers
Journal Article
Two-dimensional PCA: a new approach to appearance-based face representation and recognition
TL;DR: A new technique coined two-dimensional principal component analysis (2DPCA) is developed for image representation. It is based on 2D image matrices rather than 1D vectors, so the image matrix does not need to be transformed into a vector prior to feature extraction.
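The key point of 2DPCA is that the scatter matrix is built directly from the image matrices, so for m×n images it is only n×n, far smaller than the (mn)×(mn) covariance of vectorized PCA. A minimal NumPy sketch of this idea (the function name and shapes are illustrative, not from the paper):

```python
import numpy as np

def two_d_pca(images, d):
    """2DPCA sketch: images is (M, m, n); returns the projection matrix
    X (n, d) and the feature matrices (M, m, d). The scatter matrix is
    built from the image matrices directly, with no flattening to 1D."""
    mean = images.mean(axis=0)
    centered = images - mean
    # image scatter matrix G = (1/M) * sum_i A_i^T A_i  (n x n)
    G = np.einsum('kij,kil->jl', centered, centered) / len(images)
    vals, vecs = np.linalg.eigh(G)        # eigenvalues in ascending order
    X = vecs[:, ::-1][:, :d]              # top-d eigenvectors
    return X, centered @ X
```

Each image is then represented by its m×d feature matrix `A_i @ X` rather than a single long vector.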
Journal Article
KPCA plus LDA: a complete kernel Fisher discriminant framework for feature extraction and recognition
TL;DR: A two-phase KFD framework is developed, i.e., kernel principal component analysis (KPCA) plus Fisher linear discriminant analysis (LDA), which provides novel insights into the nature of KFD.
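The two-phase structure can be illustrated, under simplifying assumptions of my own (an RBF kernel, two classes, and a ridge-regularized within-class scatter, none of which are prescribed by the paper), as KPCA followed by ordinary Fisher LDA:

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.1):
    """Gaussian (RBF) kernel matrix between row-sample matrices X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpca_plus_lda(X, y, n_kpca=5):
    """Two-phase sketch: KPCA first, then Fisher LDA in the KPCA space.
    Returns the 1-D LDA projection of each training sample (two classes)."""
    n = len(X)
    K = rbf_kernel(X, X)
    # center the kernel matrix
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_kpca]             # leading components
    alphas = vecs[:, idx] / np.sqrt(np.maximum(vals[idx], 1e-12))
    Z = Kc @ alphas                                   # KPCA features
    # Fisher LDA (two classes) on the KPCA features
    m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
    Sw = np.cov(Z[y == 0].T, bias=True) * (y == 0).sum() \
       + np.cov(Z[y == 1].T, bias=True) * (y == 1).sum()
    w = np.linalg.solve(Sw + 1e-6 * np.eye(n_kpca), m1 - m0)
    return Z @ w
```

Because KPCA is performed first, the subsequent LDA step works in a low-dimensional space where the within-class scatter is well conditioned.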
Journal Article
Why can LDA be performed in PCA transformed space
Jian Yang, Jingyu Yang +1 more
TL;DR: This paper points out a weakness of previous LDA-based methods and proposes a complete PCA plus LDA algorithm; experimental results indicate that the proposed method is more effective than the previous ones.
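The basic PCA-plus-LDA pipeline the paper analyzes can be sketched as below. This is a simplified two-class version; the "complete" algorithm additionally exploits the null space of the within-class scatter, which is omitted here:

```python
import numpy as np

def pca_then_lda(X, y, n_pca):
    """Sketch of PCA + LDA: project onto the leading principal components
    first (so the within-class scatter becomes nonsingular), then run
    Fisher LDA in the reduced space. Two-class case for brevity.
    Returns the discriminant direction mapped back to input space."""
    Xc = X - X.mean(0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_pca].T                    # PCA projection matrix (d, n_pca)
    Z = Xc @ P
    m0, m1 = Z[y == 0].mean(0), Z[y == 1].mean(0)
    Sw = sum(np.cov(Z[y == c].T, bias=True) * (y == c).sum()
             for c in (0, 1))
    w = np.linalg.solve(Sw + 1e-8 * np.eye(n_pca), m1 - m0)
    return P @ w                        # direction in the original space
```

Projecting the centered data onto the returned direction gives the one-dimensional discriminant used for classification.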
Journal Article
A Two-Phase Test Sample Sparse Representation Method for Use With Face Recognition
TL;DR: A two-phase test-sample representation method for face recognition that uses the representation ability of each training sample to determine M “nearest neighbors” for the test sample, then uses the representation result of those neighbors to perform classification.
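A rough sketch of such a two-phase scheme follows; the least-squares ridge formulation, the per-sample contribution measure, and all names are illustrative choices of mine, not the paper's exact procedure:

```python
import numpy as np

def tptsr_classify(X_train, y_train, x_test, M=5):
    """Two-phase test-sample representation sketch.
    Phase 1: code the test sample over all training samples and keep the
    M samples with the largest contribution ("nearest neighbors").
    Phase 2: re-code over those M samples only and pick the class whose
    samples yield the smallest reconstruction residual."""
    A = X_train.T                                 # columns are samples
    # Phase 1: x_test ~ A @ b (ridge-regularized least squares)
    b = np.linalg.solve(A.T @ A + 1e-6 * np.eye(A.shape[1]), A.T @ x_test)
    contrib = np.linalg.norm(A * b, axis=0)       # ||b_i * x_i|| per sample
    nn = np.argsort(contrib)[::-1][:M]            # M "nearest neighbors"
    # Phase 2: re-represent using only the selected samples
    An = A[:, nn]
    c = np.linalg.solve(An.T @ An + 1e-6 * np.eye(M), An.T @ x_test)
    best, best_err = None, np.inf
    for cls in np.unique(y_train[nn]):
        mask = y_train[nn] == cls
        err = np.linalg.norm(x_test - An[:, mask] @ c[mask])
        if err < best_err:
            best, best_err = cls, err
    return best
```

The class whose selected training samples reconstruct the test sample with the smallest residual wins.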
Journal Article
Globally Maximizing, Locally Minimizing: Unsupervised Discriminant Projection with Applications to Face and Palm Biometrics
TL;DR: In this paper, an unsupervised discriminant projection (UDP) technique is proposed for dimensionality reduction of high-dimensional data in small-sample-size cases. It can be seen as a linear approximation of a multi-manifold learning framework that takes both local and nonlocal quantities into account.
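The "globally maximizing, locally minimizing" criterion can be sketched as finding directions that maximize the ratio of nonlocal to local scatter; the k-NN adjacency and the ridge regularization below are my simplifying assumptions, not details taken from the paper:

```python
import numpy as np

def udp(X, k=3, d=1):
    """UDP sketch: build a symmetric k-NN adjacency, form the local
    scatter S_L (neighboring pairs) and nonlocal scatter S_N (all other
    pairs), then take the directions maximizing S_N relative to S_L."""
    n = X.shape[0]
    D2 = ((X[:, None] - X[None, :]) ** 2).sum(-1)     # pairwise sq. dists
    H = np.zeros((n, n))                              # k-NN adjacency
    for i in range(n):
        nb = np.argsort(D2[i])[1:k + 1]               # exclude self
        H[i, nb] = H[nb, i] = 1
    diff = X[:, None] - X[None, :]                    # pairwise differences
    SL = np.einsum('ij,ijk,ijl->kl', H, diff, diff) / (2 * n * n)
    ST = np.einsum('ijk,ijl->kl', diff, diff) / (2 * n * n)
    SN = ST - SL                                      # nonlocal scatter
    # leading eigenvectors of (S_L + eps I)^{-1} S_N
    M = np.linalg.solve(SL + 1e-6 * np.eye(X.shape[1]), SN)
    vals, vecs = np.linalg.eig(M)
    order = np.argsort(vals.real)[::-1][:d]
    return vecs[:, order].real
```

Minimizing local scatter keeps neighboring samples close after projection, while maximizing nonlocal scatter pushes samples from different clusters apart.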