Kejun Huang
Researcher at University of Florida
Publications - 75
Citations - 3625
Kejun Huang is an academic researcher at the University of Florida. His research focuses on matrix decomposition and identifiability. He has an h-index of 20 and has co-authored 67 publications receiving 2,685 citations. His previous affiliations include Carnegie Mellon University and Oregon State University.
Papers
Journal Article
Tensor Decomposition for Signal Processing and Machine Learning
Nicholas D. Sidiropoulos, Lieven De Lathauwer, Xiao Fu, Kejun Huang, Evangelos E. Papalexakis, Christos Faloutsos +5 more
TL;DR: The material covered includes tensor rank and rank decomposition; basic tensor factorization models and their relationships and properties; broad coverage of algorithms ranging from alternating optimization to stochastic gradient; statistical performance analysis; and applications ranging from source separation to collaborative filtering, mixture and topic modeling, classification, and multilinear subspace learning.
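The rank decomposition at the heart of this survey, the canonical polyadic (CP) model, approximates a three-way tensor as a sum of rank-one terms. As a rough illustration of the alternating-optimization family of algorithms the paper covers, here is a minimal CP fit via alternating least squares in NumPy; the function name `cp_als` and the plain pseudoinverse-based updates are illustrative choices, not the specific algorithms analyzed in the paper.

```python
import numpy as np

def cp_als(T, R, n_iter=50, seed=0):
    """Minimal CP (CANDECOMP/PARAFAC) fit of a 3-way tensor via
    alternating least squares: T[i, j, k] ~ sum_r A[i,r] B[j,r] C[k,r]."""
    rng = np.random.default_rng(seed)
    I, J, K = T.shape
    A = rng.standard_normal((I, R))
    B = rng.standard_normal((J, R))
    C = rng.standard_normal((K, R))
    # Mode-n unfoldings of T (C-order reshapes)
    T1 = T.reshape(I, J * K)
    T2 = T.transpose(1, 0, 2).reshape(J, I * K)
    T3 = T.transpose(2, 0, 1).reshape(K, I * J)
    # Khatri-Rao (columnwise Kronecker) product
    kr = lambda X, Y: np.einsum('ir,jr->ijr', X, Y).reshape(-1, R)
    for _ in range(n_iter):
        # each step is a linear least-squares problem in one factor
        A = T1 @ np.linalg.pinv(kr(B, C).T)
        B = T2 @ np.linalg.pinv(kr(A, C).T)
        C = T3 @ np.linalg.pinv(kr(A, B).T)
    return A, B, C
```

On an exactly low-rank tensor this typically recovers a near-perfect fit; production code would add normalization, convergence checks, and the efficient normal-equation updates discussed in the survey.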
Journal Article
Non-Negative Matrix Factorization Revisited: Uniqueness and Algorithm for Symmetric Decomposition
TL;DR: Uniqueness aspects of NMF are revisited here from a geometrical point of view, and a new algorithm for symmetric NMF is proposed, which is very different from existing ones.
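Symmetric NMF seeks a nonnegative W with M ≈ W Wᵀ. To make the problem concrete, here is a short multiplicative-update heuristic (the classic damped update of Ding et al.); it is a baseline sketch for illustration only, not the new algorithm proposed in the paper, and the name `symmetric_nmf` is ours.

```python
import numpy as np

def symmetric_nmf(M, r, beta=0.5, n_iter=300, seed=0):
    """Fit M ~ W @ W.T with W >= 0 using a damped multiplicative update.
    beta controls the damping; beta=0.5 is the common default."""
    rng = np.random.default_rng(seed)
    W = rng.random((M.shape[0], r)) + 0.1  # positive init
    for _ in range(n_iter):
        num = M @ W
        den = W @ (W.T @ W) + 1e-12  # small eps avoids division by zero
        W *= (1 - beta) + beta * num / den  # elementwise; preserves W >= 0
    return W
```

Because the update multiplies W by a positive factor, nonnegativity is maintained automatically, which is the main appeal of this family of heuristics.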
Journal Article
Nonnegative Matrix Factorization for Signal and Data Analytics: Identifiability, Algorithms, and Applications
TL;DR: Nonnegative matrix factorization (NMF) aims to factor a data matrix into low-rank latent factor matrices subject to nonnegativity constraints.
Journal Article
Feasible Point Pursuit and Successive Approximation of Non-Convex QCQPs
Omar Mehanna, Kejun Huang, Balasubramanian Gopalakrishnan, Aritra Konar, Nicholas D. Sidiropoulos +4 more
TL;DR: In this article, a new feasible point pursuit successive convex approximation (FPP-SCA) algorithm is proposed for non-convex quadratically constrained quadratic programs (QCQPs), which adds slack variables to sustain feasibility and a penalty to ensure slacks are sparingly used.
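The key convexification step in SCA-type methods for QCQPs splits each indefinite quadratic into convex and concave parts and replaces the concave part by its tangent at the current iterate, yielding a convex constraint that safely restricts the original one. The sketch below, with our own function names, shows that step in isolation, assuming a symmetric constraint matrix; a full FPP-SCA implementation would then solve the resulting convex problem with slack variables at each iteration.

```python
import numpy as np

def psd_nsd_split(A):
    """Split a symmetric matrix into PSD and NSD parts via its
    eigendecomposition: A = A_plus + A_minus, A_plus >= 0, A_minus <= 0."""
    w, V = np.linalg.eigh(A)
    A_plus = (V * np.maximum(w, 0)) @ V.T
    A_minus = (V * np.minimum(w, 0)) @ V.T
    return A_plus, A_minus

def convexified_quadratic(A, z):
    """Return g(x) >= x^T A x, convex in x and tight at the linearization
    point z: the convex part x^T A_plus x is kept, the concave part
    x^T A_minus x is replaced by its tangent at z."""
    A_plus, A_minus = psd_nsd_split(A)
    def g(x):
        return x @ A_plus @ x + 2 * (z @ A_minus @ x) - z @ A_minus @ z
    return g
```

Since g(x) ≥ xᵀAx everywhere, g(x) ≤ b implies the original constraint xᵀAx ≤ b; the slack-plus-penalty mechanism in the TL;DR exists precisely because this restriction can make the surrogate problem infeasible.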
Journal Article
Consensus-ADMM for General Quadratically Constrained Quadratic Programming
TL;DR: The core components are carefully designed to make the overall algorithm more scalable, including efficient methods for solving QCQP-1, memory efficient implementation, parallel/distributed implementation, and smart initialization.
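The consensus-ADMM template behind this paper splits a sum of objectives across agents that each keep a local copy of the variable, with a shared consensus variable and dual updates enforcing agreement. As a toy illustration of that structure only (the paper applies it to far harder QCQP subproblems), here is consensus ADMM on the trivially solvable problem min_x Σᵢ ‖x − aᵢ‖², whose solution is the mean of the aᵢ; the function name and parameters are ours.

```python
import numpy as np

def consensus_admm(a_list, rho=1.0, n_iter=100):
    """Consensus ADMM for min_x sum_i ||x - a_i||^2. Each agent i holds a
    local copy x_i; a consensus variable z and scaled duals u_i enforce
    x_i = z. Returns z, which converges to the mean of the a_i."""
    d = a_list[0].shape[0]
    u = [np.zeros(d) for _ in a_list]
    z = np.zeros(d)
    for _ in range(n_iter):
        # local (parallelizable) updates:
        # argmin_x ||x - a_i||^2 + (rho/2)||x - z + u_i||^2
        x = [(2 * a + rho * (z - ui)) / (2 + rho) for a, ui in zip(a_list, u)]
        # consensus update, then dual ascent on the residuals
        z = np.mean([xi + ui for xi, ui in zip(x, u)], axis=0)
        u = [ui + xi - z for ui, xi in zip(u, x)]
    return z
```

The x-updates are independent across agents, which is what makes the parallel/distributed implementation mentioned in the TL;DR natural; in the QCQP setting each local update is itself a small (QCQP-1) subproblem.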