Learning the parts of objects by non-negative matrix factorization
Citations
15,106 citations
9,227 citations
Cites background or methods from "Learning the parts of objects by non-negative matrix factorization"
...They replaced the least-squares updates with the multiplicative update introduced in [151]....
[...]
...Welling and Weber [238] perform multiplicative updates like Lee and Seung [151] for NNCP....
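Both excerpts refer to the same multiplicative update rules. For the squared-error objective $\|V - WH\|_F^2$, the Lee–Seung updates take the standard form (sketched here as a reference; symbols follow the usual $V \approx WH$ convention):

```latex
H_{a\mu} \;\leftarrow\; H_{a\mu}\,\frac{(W^{T}V)_{a\mu}}{(W^{T}WH)_{a\mu}},
\qquad
W_{ia} \;\leftarrow\; W_{ia}\,\frac{(VH^{T})_{ia}}{(WHH^{T})_{ia}}
```

Because every factor on the right-hand side is nonnegative, the updates preserve the nonnegativity of $W$ and $H$, and each step is guaranteed not to increase the squared reconstruction error.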
[...]
...Paatero and Tapper [181] and Lee and Seung [151] proposed using nonnegative matrix factorizations for analyzing nonnegative data, such as environmental models and grayscale images, because it is desirable for the decompositions to retain the nonnegative characteristics of the original data and thereby facilitate easier interpretation....
[...]
7,345 citations
Cites background from "Learning the parts of objects by non-negative matrix factorization"
...We have previously shown that nonnegativity is a useful constraint for matrix factorization that can learn a parts representation of the data [4, 5]....
[...]
4,958 citations
Cites background from "Learning the parts of objects by non-negative matrix factorization"
...To be precise, the solution is also approached in two steps: 1) global model: use reconstruction constraint to recover a medium high-resolution face image, but the solution is searched only in the face subspace; and 2) local model: use the local sparse model to recover the image details. a) Nonnegative Matrix Factorization (NMF): In face SR, the most frequently used subspace method for modeling the human face is principal component analysis (PCA), which chooses a low-dimensional subspace that captures as much of the variance as possible....
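The PCA step described here — choosing a low-dimensional subspace that captures as much of the variance as possible — can be sketched with an SVD of the centered data. A minimal NumPy sketch; the function name and toy data are illustrative, not from the cited paper:

```python
import numpy as np

def pca_subspace(X, k):
    """Top-k principal axes of the rows of X: the k-dimensional
    subspace capturing the most variance of the centered data."""
    Xc = X - X.mean(axis=0)                      # center each feature
    # Right singular vectors of the centered data are the principal axes,
    # ordered by the variance (squared singular value) they capture.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Vt[:k]                                # shape (k, n_features)

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))                    # 100 samples, 8 features
components = pca_subspace(X, k=3)                # orthonormal rows
```

Projecting faces onto such a subspace gives a compact code, but the basis vectors are holistic (whole-face) and signed, which is what motivates the contrast with NMF's additive, parts-based factors.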
[...]
...Nonnegative Matrix Factorization (NMF) [31] seeks a representation of the given signals as an additive combination of local features....
[...]
...Column 4 shows the intermediate results from the NMF global modeling and column 5 demonstrates the results after local sparse modeling....
[...]
...NMF [29] seeks a representation of the given signals as an additive combination of local features....
[...]
...To find such a part-based subspace, NMF is formulated as the following optimization problem: $\min_{W,H \ge 0} \|X - WH\|_F^2$ (12), where $X \in \mathbb{R}^{m \times n}$ is the data matrix, $W \in \mathbb{R}^{m \times r}$ is the basis matrix and $H \in \mathbb{R}^{r \times n}$ is the coefficient matrix....
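A minimal NumPy sketch of solving this nonnegativity-constrained least-squares problem with Lee–Seung multiplicative updates; the function name, toy matrix sizes, and `eps` smoothing term are illustrative choices, not the cited paper's code:

```python
import numpy as np

def nmf(V, r, n_iter=200, eps=1e-9, seed=0):
    """Approximate nonnegative V (m x n) as W @ H, with W (m x r) and
    H (r x n) nonnegative, via Lee-Seung multiplicative updates for
    the squared Frobenius error ||V - WH||_F^2."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps                 # random nonnegative init
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Elementwise multiplicative steps: factors stay nonnegative,
        # and each update does not increase the reconstruction error.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = np.random.default_rng(1).random((20, 30))    # nonnegative toy data
W, H = nmf(V, r=5)
```

The columns of `W` play the role of the part-based basis, and each column of `H` gives the nonnegative (purely additive) coefficients reconstructing one signal.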
[...]
3,325 citations
Cites background from "Learning the parts of objects by non-negative matrix factorization"
...While PCA suffers from a number of shortcomings [8, 10], such as its implicit assumption of Gaussian distributions and its restriction to orthogonal linear combinations, it remains popular due to its simplicity....
[...]