Supervised dictionary learning using distance dependent Indian buffet process
Citations
Cites background from "Supervised dictionary learning usin..."
...Learning the model from the training samples instead of using some predefined bases such as Fourier or wavelet bases has been shown to produce more accurate results [28]–[33]....
[...]
Cites background from "Supervised dictionary learning usin..."
...metric [7–13] and developing a new feature representation [14–25]....
[...]
Cites background from "Supervised dictionary learning usin..."
...INTRODUCTION Linear subspace models are widely used in signal processing and data analysis since many datasets can be well-approximated with low-dimensional subspaces [1–12]....
[...]
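The excerpt above appeals to the fact that near-low-rank data is well captured by a few singular vectors. A minimal sketch of that idea, using synthetic data with an assumed intrinsic dimension of 3 (all names and dimensions here are illustrative, not from the cited papers):

```python
import numpy as np

# Sketch: approximate data with a low-dimensional linear subspace via SVD.
rng = np.random.default_rng(0)
r = 3                                    # assumed intrinsic dimension
basis = rng.normal(size=(50, r))         # ground-truth subspace basis
coeffs = rng.normal(size=(r, 200))
X = basis @ coeffs + 0.01 * rng.normal(size=(50, 200))  # near-low-rank data

U, s, Vt = np.linalg.svd(X, full_matrices=False)
X_r = U[:, :r] * s[:r] @ Vt[:r]          # best rank-r approximation

rel_err = np.linalg.norm(X - X_r) / np.linalg.norm(X)
print(rel_err)  # small: most energy lies in a 3-dimensional subspace
```

The relative Frobenius error is on the order of the noise level, which is the sense in which the dataset "can be well-approximated with a low-dimensional subspace".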
Cites background from "Supervised dictionary learning usin..."
...It is worth noting that algorithms based on supervised dictionary learning and subspace learning are also useful for deriving a smooth representation of the background component [25]–[28]....
[...]
References
"Supervised dictionary learning usin..." refers background in this paper
...MNIST [19] and USPS [20] are standard handwritten digit databases, the ISOLET dataset [21] comprises examples of letters from the alphabet spoken in isolation by 30 individual speakers, and COIL2 [22] is a two-class object recognition dataset....
[...]
"Supervised dictionary learning usin..." refers methods in this paper
...[11] used training data as atoms of the dictionary for face recognition task....
[...]
...We compare the proposed method (PM) to four popular DL-based classification methods, RECL [11], SRSC [23], DLSI [6], FDDL [12], and two classical classification methods, K-nearest neighbor (K=3) and linear SVM....
[...]
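The two classical baselines in the excerpt (K-nearest neighbor with K=3 and a linear SVM) can be sketched generically as below. The data is a synthetic stand-in; the excerpt's experiments use MNIST, USPS, ISOLET, and COIL2, and the dictionary-learning methods themselves are not reproduced here.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import LinearSVC

# Hypothetical stand-in data for the benchmark datasets in the excerpt.
X, y = make_classification(n_samples=400, n_features=20, random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.25, random_state=0)

knn = KNeighborsClassifier(n_neighbors=3).fit(Xtr, ytr)  # K = 3, as in the excerpt
svm = LinearSVC(C=1.0, max_iter=5000).fit(Xtr, ytr)      # linear SVM baseline

print("KNN accuracy:", knn.score(Xte, yte))
print("SVM accuracy:", svm.score(Xte, yte))
```

Both baselines are deliberately simple: K-NN uses no training beyond storing the data, and the linear SVM fits a single hyperplane, which is why dictionary-learning classifiers are compared against them.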
"Supervised dictionary learning usin..." refers methods in this paper
...Hence, we resort to Gibbs sampling [17] to approximate the posterior with S samples (the Gibbs sampling equations are available in [18] and are omitted due to lack of space)....
[...]
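The excerpt above approximates an intractable posterior by alternately drawing each variable from its full conditional and keeping S samples. A toy sketch of that scheme on a bivariate normal, where both conditionals are known in closed form (this is a generic illustration, not the paper's model-specific sampler):

```python
import numpy as np

def gibbs_bivariate_normal(rho, S=5000, burn_in=500, seed=0):
    """Gibbs sampler for a standard bivariate normal with correlation rho.

    Each full conditional x | y and y | x is univariate normal, so we
    alternate univariate draws and keep S samples after burn-in.
    """
    rng = np.random.default_rng(seed)
    x, y = 0.0, 0.0
    samples = []
    sd = np.sqrt(1.0 - rho**2)           # conditional standard deviation
    for t in range(S + burn_in):
        x = rng.normal(rho * y, sd)      # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, sd)      # y | x ~ N(rho*x, 1 - rho^2)
        if t >= burn_in:
            samples.append((x, y))
    return np.array(samples)

draws = gibbs_bivariate_normal(rho=0.8)
corr = np.corrcoef(draws.T)[0, 1]
print(draws.shape, corr)  # the S retained samples recover rho ~ 0.8
```

Averages over the S retained samples stand in for expectations under the true posterior, which is exactly how the S Gibbs samples are used in the excerpt.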