Journal ArticleDOI
Multi-view learning via probabilistic latent semantic analysis
TLDR
This paper proposes a new generative model for multi-view learning via probabilistic latent semantic analysis, called MVPLSA, which jointly models the co-occurrences of features and documents from different views.
About:
This article was published in Information Sciences on 2012-09-01 and has received 39 citations to date. The article focuses on the topics: Probabilistic latent semantic analysis & Latent Dirichlet allocation.
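The TLDR above describes a PLSA-style generative model of feature and document co-occurrences. As background, here is a minimal sketch of standard single-view PLSA fitted by EM; the toy counts, the topic number `K`, and the iteration count are illustrative assumptions, not the paper's MVPLSA model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy term-document co-occurrence counts n[d, w] (hypothetical data)
n = rng.integers(0, 5, size=(8, 12)).astype(float)
D, W = n.shape
K = 3  # number of latent topics z (illustrative choice)

# Random initialisation of P(w|z) and P(z|d), row-normalised
p_w_z = rng.random((K, W)); p_w_z /= p_w_z.sum(axis=1, keepdims=True)
p_z_d = rng.random((D, K)); p_z_d /= p_z_d.sum(axis=1, keepdims=True)

for _ in range(50):
    # E-step: posterior P(z|d,w) ∝ P(w|z) P(z|d), shape (D, K, W)
    post = p_z_d[:, :, None] * p_w_z[None, :, :]
    post /= post.sum(axis=1, keepdims=True) + 1e-12
    # M-step: re-estimate parameters from expected counts n(d,w) P(z|d,w)
    exp_counts = n[:, None, :] * post                # shape (D, K, W)
    p_w_z = exp_counts.sum(axis=0)
    p_w_z /= p_w_z.sum(axis=1, keepdims=True) + 1e-12
    p_z_d = exp_counts.sum(axis=2)
    p_z_d /= p_z_d.sum(axis=1, keepdims=True) + 1e-12
```

MVPLSA extends this idea by sharing latent topics across the co-occurrence matrices of several views rather than fitting a single matrix.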
Citations
Journal ArticleDOI
Consensus and complementarity based maximum entropy discrimination for multi-view classification
Guoqing Chao,Shiliang Sun +1 more
TL;DR: This paper proposes a new method, consensus and complementarity based MED (MED-2C), for multi-view classification, which makes good use of the two principles of consensus and complementarity for multi-view learning (MVL).
Journal ArticleDOI
Multi-View Fusion with Extreme Learning Machine for Clustering
TL;DR: A novel multi-view fusion clustering framework based on an extreme learning machine (ELM), called MVEC, which exposes the underlying clustering structures embedded within multi-view data with a high degree of accuracy and provides a simple yet efficient solution to the optimization problem within MVEC.
Journal ArticleDOI
3D model retrieval and classification by semi-supervised learning with content-based similarity
TL;DR: A novel 3D model retrieval and recognition method that employs both a distance histogram and 3D moment invariants as features that are invariant to 3D object scaling, translation, and rotation is proposed.
Journal ArticleDOI
Learning very fast decision tree from uncertain data streams with positive and unlabeled samples
TL;DR: Experimental results demonstrate the strong ability and efficiency of puuCVFDT to handle concept drift with uncertainty under the positive and unlabeled learning scenario, while the classification performance of the proposed algorithm remains comparable to that of CVFDT, which is learned from fully labeled data without uncertainty.
Journal ArticleDOI
A multi-dimensional image quality prediction model for user-generated images in social networks
TL;DR: A multi-dimensional image quality prediction model for UGIs in social networks is proposed and the results indicate that the proposed model can be implemented to predict image quality in practical environments.
References
Book
Elements of information theory
Thomas M. Cover,Joy A. Thomas +1 more
TL;DR: The authors examine the role of entropy, inequality, and randomness in the design and construction of codes in a rapidly changing environment.
Proceedings ArticleDOI
Combining labeled and unlabeled data with co-training
Avrim Blum,Tom M. Mitchell +1 more
TL;DR: A PAC-style analysis is provided for a problem setting motivated by the task of learning to classify web pages, in which the description of each example can be partitioned into two distinct views, allowing inexpensive unlabeled data to augment a much smaller set of labeled examples.
Proceedings Article
A comparison of event models for naive bayes text classification
Andrew McCallum,Kamal Nigam +1 more
TL;DR: It is found that the multi-variate Bernoulli model performs well with small vocabulary sizes, but that the multinomial model usually performs even better at larger vocabulary sizes, providing on average a 27% reduction in error over the multi-variate Bernoulli model at any vocabulary size.
Proceedings ArticleDOI
Unsupervised word sense disambiguation rivaling supervised methods
TL;DR: An unsupervised learning algorithm for sense disambiguation that, when trained on unannotated English text, rivals the performance of supervised techniques that require time-consuming hand annotations.
Journal ArticleDOI
Unsupervised Learning by Probabilistic Latent Semantic Analysis
TL;DR: This paper proposes to make use of a temperature controlled version of the Expectation Maximization algorithm for model fitting, which has shown excellent performance in practice, and results in a more principled approach with a solid foundation in statistical inference.
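The tempered EM mentioned in this TLDR raises the E-step posterior to a power β (an inverse temperature), which smooths the posterior and acts as a regulariser against overfitting. A hedged sketch of such a tempered E-step follows; the function name, the β value, and the random toy parameters are assumptions for illustration, not the paper's exact procedure or annealing schedule.

```python
import numpy as np

def tempered_posterior(p_w_z, p_z_d, beta):
    """Tempered E-step for PLSA: P(z|d,w) ∝ (P(w|z) P(z|d))**beta.

    beta = 1 recovers the standard EM E-step; beta < 1 flattens the
    posterior over topics (hypothetical helper for illustration).
    """
    post = (p_z_d[:, :, None] * p_w_z[None, :, :]) ** beta
    return post / (post.sum(axis=1, keepdims=True) + 1e-12)

# Illustrative random parameters, row-normalised
rng = np.random.default_rng(1)
K, D, W = 3, 4, 5
p_w_z = rng.random((K, W)); p_w_z /= p_w_z.sum(axis=1, keepdims=True)
p_z_d = rng.random((D, K)); p_z_d /= p_z_d.sum(axis=1, keepdims=True)

post = tempered_posterior(p_w_z, p_z_d, beta=0.1)
```

In practice β is typically annealed downward from 1 only when held-out performance stops improving, trading a little fit for better generalisation.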