Xiao Ling
Researcher at Ocean University of China
Publications - 5
Citations - 202
Xiao Ling is an academic researcher from Ocean University of China. The author has contributed to research on the topics of feature learning and deep learning, has an h-index of 4, and has co-authored 4 publications receiving 130 citations.
Papers
Journal ArticleDOI
An overview on data representation learning: From traditional feature learning to recent deep learning
TL;DR: This paper investigates both traditional feature learning algorithms and state-of-the-art deep learning models, gives a few remarks on the development of data representation learning, and suggests some interesting research directions in this area.
Journal ArticleDOI
From shallow feature learning to deep learning: Benefits from the width and depth of deep architectures
TL;DR: This advanced review describes the historical profile of shallow feature learning research, introduces the important developments of deep learning models, and surveys deep architectures that benefit from the optimization of their width and depth.
Proceedings ArticleDOI
Convolutional Discriminant Analysis
TL;DR: Beyond the softmax loss, CDA employs a convolutional discriminant loss (CD-Loss), which minimizes the distance between a sample and its class center while maximizing the distance between the sample and its adversarial class center in the space of the learned deep representations.
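The center-pull / adversarial-push idea in the TL;DR can be sketched as a hinge-style loss. This is a hypothetical illustration of the general technique, not the paper's exact CD-Loss; the margin parameter and squared-Euclidean distance are assumptions.

```python
import numpy as np

def cd_loss_sketch(embeddings, labels, centers, margin=1.0):
    """Sketch of a discriminant loss: pull each embedding toward its own
    class center, push it away from the nearest wrong-class ("adversarial")
    center. `margin` is an assumed hyperparameter, not from the paper."""
    total = 0.0
    for z, y in zip(embeddings, labels):
        d_own = np.sum((z - centers[y]) ** 2)  # distance to own class center
        d_adv = min(                            # nearest wrong-class center
            np.sum((z - c) ** 2) for j, c in enumerate(centers) if j != y
        )
        total += max(0.0, margin + d_own - d_adv)  # hinge over the two distances
    return total / len(embeddings)
```

A sample sitting exactly on its class center, far from all other centers, contributes zero loss; a sample equidistant from its own and a wrong center contributes the margin.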
Book ChapterDOI
The Necessary and Sufficient Conditions for the Existence of the Optimal Solution of Trace Ratio Problems
Guoqiang Zhong, Xiao Ling +1 more
TL;DR: This paper analyzes an algorithm that transforms the trace ratio problem into a series of trace difference problems, and establishes the necessary and sufficient conditions for the existence of the optimal solution of trace ratio problems.
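For context, the trace ratio problem mentioned in the TL;DR is commonly written in the following standard form (a generic formulation, not quoted from the paper, with symmetric matrices $A$ and $B$ and an orthonormality constraint on $W$):

```latex
\max_{W^{\top} W = I} \; \frac{\operatorname{tr}\!\left(W^{\top} A W\right)}{\operatorname{tr}\!\left(W^{\top} B W\right)},
```

and the transformation referred to replaces it with a sequence of trace difference problems, each solvable by an eigendecomposition, where $\lambda_t$ is the ratio value at the current iterate:

```latex
W_{t+1} = \arg\max_{W^{\top} W = I} \; \operatorname{tr}\!\left(W^{\top} \left(A - \lambda_t B\right) W\right).
```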
Journal ArticleDOI
Recurrent attention unit: A new gated recurrent unit for long-term memory of important parts in sequential data
TL;DR: The authors proposed a recurrent attention unit (RAU), which seamlessly integrates the attention mechanism into the interior of the GRU cell by adding an attention gate. This enhances the ability of the RAU to remember long-term information and to pay attention to important parts of sequential data.
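The "attention gate added inside a GRU cell" idea can be sketched as one extra sigmoid gate that reweights the candidate state. This is a minimal illustrative sketch, assuming standard GRU gate equations plus a hypothetical attention gate `a`; the exact placement of the gate in the published RAU may differ.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def rau_step_sketch(x, h, W, U, b):
    """One step of a GRU-like cell with an extra attention gate `a`.
    W[k]: (d_h, d_x), U[k]: (d_h, d_h), b[k]: (d_h,) for k in 'z','r','a','h'.
    The attention gate is the assumed addition relative to a plain GRU."""
    z = sigmoid(W['z'] @ x + U['z'] @ h + b['z'])              # update gate
    r = sigmoid(W['r'] @ x + U['r'] @ h + b['r'])              # reset gate
    a = sigmoid(W['a'] @ x + U['a'] @ h + b['a'])              # attention gate
    h_tilde = np.tanh(W['h'] @ x + U['h'] @ (r * h) + b['h'])  # candidate state
    return (1 - z) * h + z * (a * h_tilde)                     # attention-weighted update
```

The attention gate lets the cell scale how strongly the candidate state overwrites memory at each step, which is one way to realize the "remember long-term information, attend to important parts" behavior described above.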