Solving Random Quadratic Systems of Equations Is Nearly as Easy as Solving Linear Systems
Yuxin Chen, Emmanuel J. Candès
TLDR
In this article, the authors consider the problem of solving quadratic systems of equations in $n$ variables, proposing a method that refines a spectral initial guess via adaptive update rules that drop terms bearing too much influence on the search direction.
Abstract:
We consider the fundamental problem of solving quadratic systems of equations in $n$ variables, where $y_i = |\langle \boldsymbol{a}_i, \boldsymbol{x} \rangle|^2$, $i = 1, \ldots, m$, and $\boldsymbol{x} \in \mathbb{R}^n$ is unknown. We propose a novel method which, starting with an initial guess computed by means of a spectral method, proceeds by minimizing a nonconvex functional as in the Wirtinger flow approach. There are several key distinguishing features, most notably a distinct objective functional and novel update rules, which operate in an adaptive fashion and drop terms bearing too much influence on the search direction. These careful selection rules provide a tighter initial guess, better descent directions, and thus enhanced practical performance. On the theoretical side, we prove that for certain unstructured models of quadratic systems, our algorithms return the correct solution in linear time, i.e., in time proportional to reading the data $\{\boldsymbol{a}_i\}$ and $\{y_i\}$, as soon as the ratio $m/n$ between the number of equations and unknowns exceeds a fixed numerical constant. We extend the theory to deal with noisy systems in which we only have $y_i \approx |\langle \boldsymbol{a}_i, \boldsymbol{x} \rangle|^2$ and prove that our algorithms achieve a statistical accuracy which is nearly un-improvable. We complement our theoretical study with numerical examples showing that solving random quadratic systems is both computationally and statistically not much harder than solving linear systems of the same size---hence the title of this paper. For instance, we demonstrate empirically that the computational cost of our algorithm is about four times that of solving a least-squares problem of the same size.
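As a rough illustration of the pipeline the abstract describes (spectral initialization followed by truncated gradient steps), here is a minimal real-valued NumPy sketch. The truncation rules, constants, and step size below are simplified stand-ins chosen for illustration, not the authors' exact selection rules.

```python
import numpy as np

def solve_quadratic_system(A, y, iters=500, mu=0.2, trunc=3.0):
    """Sketch of a truncated-gradient solver for y_i = <a_i, x>^2 (real case).

    The truncation constants here are illustrative, not the paper's rules.
    """
    m, n = A.shape
    lam = np.sqrt(np.mean(y))               # estimate of ||x||
    # Spectral initialization: leading eigenvector of a truncated data matrix,
    # keeping only equations whose y_i is not disproportionately large.
    keep = y <= trunc * np.mean(y)
    Y = (A[keep] * y[keep, None]).T @ A[keep] / m
    z = lam * np.linalg.eigh(Y)[1][:, -1]
    # Gradient refinement: drop equations whose residual bears too much
    # influence on the search direction, then take a gradient step.
    for _ in range(iters):
        Az = A @ z
        res = Az**2 - y
        keep = np.abs(res) <= trunc * np.mean(np.abs(res))
        grad = A[keep].T @ (res[keep] * Az[keep]) / m
        z -= (mu / lam**2) * grad
    return z
```

Since $\boldsymbol{x}$ is only identifiable up to a global sign in the real case, success is measured by $\min(\|z - x\|, \|z + x\|)$.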
Citations
Journal Article
Nonconvex Optimization Meets Low-Rank Matrix Factorization: An Overview
Yuejie Chi, Yue Lu, Yuxin Chen
TL;DR: This tutorial-style overview highlights the important role of statistical models in enabling efficient nonconvex optimization with performance guarantees and reviews two contrasting approaches: two-stage algorithms, which consist of a tailored initialization step followed by successive refinement; and global landscape analysis and initialization-free algorithms.
Journal Article
Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
TL;DR: This paper presents a new algorithm, termed truncated amplitude flow (TAF), to recover an unknown vector from a system of quadratic equations, and proves that as soon as the number of equations is on the order of the number of unknowns, TAF recovers the solution exactly.
Posted Content
Solving Systems of Random Quadratic Equations via Truncated Amplitude Flow
TL;DR: In this article, the authors proposed a new algorithm, termed \emph{truncated amplitude flow} (TAF), to recover an unknown vector $\bm{x}$ from a system of quadratic equations of the form $y_i=|\langle\bm{a}_i, \bm{x}\rangle|^2$, where the $\bm{a}_i$'s are given random measurement vectors.
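The amplitude-based update at the heart of this line of work can be sketched as follows. This is a simplified local refinement step with an illustrative truncation rule and step size, assuming a decent initial guess `z0` is already available (e.g. from a spectral method); it is not the paper's exact TAF rule.

```python
import numpy as np

def amplitude_flow(A, y, z0, iters=200, mu=0.6, gamma=7.0):
    """Refine z0 by gradient steps on (1/2m) * sum (|<a_i,z>| - sqrt(y_i))^2.

    Equations where |<a_i, z>| is small (so the sign of <a_i, z> is
    unreliable) are truncated out; gamma is an illustrative threshold.
    """
    m, _ = A.shape
    b = np.sqrt(y)                       # amplitude measurements |<a_i, x>|
    z = z0.copy()
    for _ in range(iters):
        Az = A @ z
        keep = np.abs(Az) >= b / gamma   # drop unreliable sign estimates
        grad = A[keep].T @ (Az[keep] - b[keep] * np.sign(Az[keep])) / m
        z -= mu * grad
    return z
```

Working with amplitudes $\sqrt{y_i}$ rather than intensities $y_i$ changes the loss landscape, which is the design choice distinguishing this family of methods from intensity-based flows.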
Journal Article
A Geometric Analysis of Phase Retrieval
Ju Sun, Qing Qu, John Wright
TL;DR: In this paper, it was shown that when the measurement vectors are generic (i.i.d. complex Gaussian) and numerous enough, with high probability, a natural least-squares formulation for GPR has the following benign geometric structure: (1) there are no spurious local minimizers, and all global minimizers are equal to the target signal up to a global phase; and (2) the objective function has a negative directional curvature around each saddle point.
Journal Article
Implicit Regularization in Nonconvex Statistical Estimation: Gradient Descent Converges Linearly for Phase Retrieval, Matrix Completion, and Blind Deconvolution
TL;DR: In this paper, the authors show that gradient descent can achieve near-optimal statistical and computational guarantees without explicit regularization for phase retrieval, low-rank matrix completion, and blind deconvolution.
References
Journal Article
Phase retrieval algorithms: a comparison.
TL;DR: Iterative algorithms for phase retrieval from intensity data are compared to gradient search methods, and it is shown that both the error-reduction algorithm for the problem of a single intensity measurement and the Gerchberg-Saxton algorithm for the problem of two intensity measurements converge.
Journal Article
A practical algorithm for the determination of phase from image and diffraction plane pictures
TL;DR: In this article, an algorithm is presented for the rapid solution of the phase of the complete wave function whose intensities in the diffraction and imaging planes of an imaging system are known.
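The Gerchberg-Saxton iteration referenced by these two entries alternates between enforcing the two known magnitudes while keeping the current phase estimates. A compact 1-D NumPy sketch (function and variable names are my own) might look like:

```python
import numpy as np

def gerchberg_saxton(mag_image, mag_diffraction, iters=200, seed=0):
    """Alternate between enforcing the known image-plane and
    diffraction-plane (Fourier) magnitudes, keeping the current phases."""
    rng = np.random.default_rng(seed)
    # Start from the known image magnitude with a random phase guess.
    g = mag_image * np.exp(1j * rng.uniform(0, 2 * np.pi, mag_image.shape))
    for _ in range(iters):
        G = np.fft.fft(g)
        G = mag_diffraction * np.exp(1j * np.angle(G))  # Fourier constraint
        g = np.fft.ifft(G)
        g = mag_image * np.exp(1j * np.angle(g))        # image constraint
    return g
```

Fienup's comparison shows that the error of this scheme is nonincreasing from iteration to iteration, although in practice it can stagnate short of the true phase.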
Book Chapter
Introduction to the non-asymptotic analysis of random matrices.
TL;DR: This is a tutorial on some basic non-asymptotic methods and concepts in random matrix theory, particularly for the problem of estimating covariance matrices in statistics and for validating probabilistic constructions of measurement matrices in compressed sensing.
Monograph
Lectures on modern convex optimization: analysis, algorithms, and engineering applications
TL;DR: The authors present the basic theory of state-of-the-art polynomial time interior point methods for linear, conic quadratic, and semidefinite programming as well as their numerous applications in engineering.
Journal Article
The Rotation of Eigenvectors by a Perturbation. III
Chandler Davis, William Kahan
TL;DR: In this article, the difference between two subspaces is characterized in terms of certain angles through which one subspace must be rotated in order most directly to reach the other; sharp bounds upon trigonometric functions of these angles are obtained from the gap and from bounds upon either the perturbation or a computable residual.
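For reference, one common textbook formulation of the $\sin\Theta$ theorem (stated here from memory, not necessarily in the paper's exact notation) reads:

```latex
% Davis--Kahan sin-Theta theorem (one common formulation).
% Let A and \hat{A} = A + E be symmetric. Let \hat{V} span the eigenspace
% of \hat{A} for eigenvalues lying in an interval S, let V span the
% corresponding eigenspace of A, and suppose the remaining eigenvalues of A
% lie at distance at least \delta > 0 from S. Then
\[
  \bigl\| \sin \Theta(V, \hat{V}) \bigr\| \;\le\; \frac{\|E\|}{\delta},
\]
% where \Theta(V, \hat{V}) is the diagonal matrix of principal angles
% between the two subspaces and \|\cdot\| denotes the spectral norm.
```

This bound is what underlies the spectral-initialization guarantees in the citing paper: a small perturbation $E$ relative to the eigengap $\delta$ forces the perturbed leading eigenspace to stay close to the true one.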