The Expected Norm of a Sum of Independent Random Matrices: An Elementary Approach
Citations
Probability and Random Processes
High-Dimensional Probability: An Introduction with Applications in Data Science
Randomized numerical linear algebra: Foundations and algorithms
Universality laws for randomized dimension reduction, with applications
References
Convex Optimization
Optimization by Vector Space Methods
Probability and Random Processes
Concentration Inequalities: A Nonasymptotic Theory of Independence
Frequently Asked Questions (13)
Q2. What is the simplest way to estimate the norm?
Once the authors have a bound for the expectation, they can use scalar concentration inequalities, such as [BLM13, Thm. 6.10], to obtain high-probability bounds on the deviation between the norm and its mean value.
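As an illustration of why a bound on the expectation is already informative, the following simulation (not the [BLM13] inequality itself; the matrix ensemble is chosen here purely for demonstration) shows empirically that the spectral norm of a sum of independent random matrices fluctuates little around its mean:

```python
import numpy as np

# Illustrative simulation: the spectral norm of a sum of independent random
# matrices concentrates near its mean, so a bound on the expectation
# controls the typical value of the norm.
rng = np.random.default_rng(0)
d, n, trials = 20, 50, 200

norms = []
for _ in range(trials):
    # Sum of n independent random-sign matrices, symmetrized to be Hermitian.
    S = np.zeros((d, d))
    for _ in range(n):
        A = rng.choice([-1.0, 1.0], size=(d, d))
        S += (A + A.T) / 2
    norms.append(np.linalg.norm(S, 2))

norms = np.array(norms)
# The standard deviation of the norm is small relative to its mean.
assert norms.std() < 0.25 * norms.mean()
```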
Q3. What is the eigenvalue decomposition of a Hermitian matrix?
Every Hermitian matrix H ∈ H_d can be expressed in the form

H = ∑_{i=1}^d λ_i u_i u_i^*   (2.2)

where the λ_i are uniquely determined real numbers, called eigenvalues, and {u_i} is an orthonormal basis for C^d.
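A quick numerical check of the eigenvalue decomposition (2.2), using NumPy's `eigh` for Hermitian matrices (the random matrix here is just for illustration):

```python
import numpy as np

# Check (2.2): a Hermitian H equals the sum of lambda_i * u_i u_i^*
# over its eigenpairs, with real eigenvalues and orthonormal u_i.
rng = np.random.default_rng(1)
d = 5
A = rng.standard_normal((d, d)) + 1j * rng.standard_normal((d, d))
H = (A + A.conj().T) / 2                       # a Hermitian matrix

lam, U = np.linalg.eigh(H)                     # eigenvalues and eigenvectors
recon = sum(lam[i] * np.outer(U[:, i], U[:, i].conj()) for i in range(d))

assert np.allclose(recon, H)                   # H = sum_i lambda_i u_i u_i^*
assert np.allclose(U.conj().T @ U, np.eye(d))  # {u_i} is orthonormal
assert np.allclose(lam.imag, 0)                # eigenvalues are real
```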
Q4. What is the important identity of the spectral norm?
The authors have the important identity ‖B‖² = ‖B*B‖ = ‖BB*‖ for every matrix B. (2.8) Furthermore, the spectral norm is a convex function, and it satisfies the triangle inequality.
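The identity (2.8) and the triangle inequality can be verified numerically (random matrices chosen here just for illustration):

```python
import numpy as np

# Check (2.8): ||B||^2 = ||B^* B|| = ||B B^*||, plus the triangle
# inequality for the spectral norm (ord=2 is the largest singular value).
rng = np.random.default_rng(2)
B = rng.standard_normal((4, 7)) + 1j * rng.standard_normal((4, 7))
C = rng.standard_normal((4, 7))

nB = np.linalg.norm(B, 2)
assert np.isclose(nB**2, np.linalg.norm(B.conj().T @ B, 2))
assert np.isclose(nB**2, np.linalg.norm(B @ B.conj().T, 2))
# Triangle inequality (small tolerance for floating point):
assert np.linalg.norm(B + C, 2) <= nB + np.linalg.norm(C, 2) + 1e-12
```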
Q5. How do the authors define a monomial function of an Hermitian matrix?
The authors define a monomial function of an Hermitian matrix H ∈ H_d by repeated multiplication: H⁰ = I_d, H¹ = H, H² = H·H, H³ = H·H², etc.
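The monomials built by repeated multiplication agree with the spectral-mapping formula (2.3), i.e. H^q has eigenvalues λ_i^q in the same eigenbasis; a short check (random H for illustration):

```python
import numpy as np

# Build monomials H^0, H^1, H^2, H^3 by repeated multiplication and
# compare with U diag(lambda^q) U^T from the eigenvalue decomposition.
rng = np.random.default_rng(3)
d = 4
A = rng.standard_normal((d, d))
H = (A + A.T) / 2

powers = [np.eye(d)]                 # H^0 = I_d
for _ in range(3):
    powers.append(H @ powers[-1])    # H^{k+1} = H * H^k

lam, U = np.linalg.eigh(H)
for q, Hq in enumerate(powers):
    # eigenvalues of H^q are the q-th powers of the eigenvalues of H
    assert np.allclose(Hq, U @ np.diag(lam**q) @ U.T)
```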
Q6. What is the first approximation of Qi?
{Q_i} is an independent family of Poisson(1) random variables, and the first approximation follows from the Poisson limit of a binomial.
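The Poisson limit of a binomial can be seen directly by comparing probability mass functions: Binomial(n, 1/n) approaches Poisson(1) as n grows. A small exact computation:

```python
from math import comb, exp, factorial

# Compare P{X = k} for X ~ Binomial(n, 1/n) against Poisson(1).
n = 1000
p = 1.0 / n
for k in range(5):
    binom_pmf = comb(n, k) * p**k * (1 - p) ** (n - k)
    poisson_pmf = exp(-1.0) / factorial(k)   # e^{-1} * 1^k / k!
    assert abs(binom_pmf - poisson_pmf) < 1e-3
```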
Q7. What is the spectral norm of a matrix Rademacher series?
Rudelson [Rud99] pointed out that the noncommutative Khintchine inequality also implies bounds for the spectral norm of a matrix Rademacher series.
Q8. What is the key step to prove the upper bound in the theorem?
To prove the upper bound in Theorem I, the key step is to establish the result for the special case of a sum of fixed matrices, each modulated by a random sign.
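The reduction is to a matrix Rademacher series, i.e. a sum of fixed Hermitian matrices each modulated by an independent random sign. A minimal Monte Carlo sketch of such a series (the matrices A_i here are random Hermitian matrices chosen only for illustration):

```python
import numpy as np

# Monte Carlo estimate of E || sum_i eps_i * A_i || for fixed Hermitian A_i
# modulated by independent random signs eps_i.
rng = np.random.default_rng(4)
d, n = 10, 30
As = []
for _ in range(n):
    G = rng.standard_normal((d, d))
    As.append((G + G.T) / 2)         # fixed Hermitian summands

def rademacher_norm_sample():
    eps = rng.choice([-1.0, 1.0], size=n)                  # random signs
    return np.linalg.norm(sum(e * A for e, A in zip(eps, As)), 2)

est = np.mean([rademacher_norm_sample() for _ in range(300)])
# Crude sanity bound from the triangle inequality: E||sum|| <= sum_i ||A_i||.
assert 0 < est <= sum(np.linalg.norm(A, 2) for A in As)
```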
Q9. What is the norm of a block-diagonal Hermitian matrix?
The norm of a block-diagonal Hermitian matrix is the maximum spectral norm of a block, which follows from the Rayleigh principle (2.4) with a bit of work.
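This block-diagonal fact is easy to confirm numerically (block sizes chosen arbitrarily for illustration):

```python
import numpy as np

# Check: the spectral norm of a block-diagonal Hermitian matrix equals
# the maximum of the spectral norms of its diagonal blocks.
rng = np.random.default_rng(5)
blocks = []
for d in (3, 4, 2):
    A = rng.standard_normal((d, d))
    blocks.append((A + A.T) / 2)

n = sum(B.shape[0] for B in blocks)
M = np.zeros((n, n))
pos = 0
for B in blocks:                      # place each block on the diagonal
    d = B.shape[0]
    M[pos:pos + d, pos:pos + d] = B
    pos += d

assert np.isclose(np.linalg.norm(M, 2),
                  max(np.linalg.norm(B, 2) for B in blocks))
```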
Q10. What is the function for the eigenvalue decomposition of a monomial?
Calculate that

tr[H W^q H Y^{2r−q}] = tr[H (∑_{i=1}^d λ_i^q u_i u_i^*) H (∑_{j=1}^d μ_j^{2r−q} v_j v_j^*)]
 = ∑_{i,j=1}^d λ_i^q μ_j^{2r−q} · tr[H u_i u_i^* H v_j v_j^*]
 ≤ ∑_{i,j=1}^d |λ_i|^q |μ_j|^{2r−q} · |u_i^* H v_j|².   (2.16)

The first identity relies on the formula (2.3) for the eigenvalue decomposition of a monomial.
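The chain (2.16) can be spot-checked numerically: expanding W and Y in their eigenbases gives tr[H u_i u_i^* H v_j v_j^*] = |u_i^* H v_j|², and taking absolute values of the eigenvalue factors yields the bound. A sketch with random Hermitian matrices (chosen only for illustration):

```python
import numpy as np

# Numerical check of (2.16):
# tr[H W^q H Y^{2r-q}] <= sum_{i,j} |lam_i|^q |mu_j|^{2r-q} |u_i^* H v_j|^2.
rng = np.random.default_rng(6)
d, q, r = 5, 2, 3

def rand_herm(d):
    A = rng.standard_normal((d, d))
    return (A + A.T) / 2

H, W, Y = rand_herm(d), rand_herm(d), rand_herm(d)
lam, U = np.linalg.eigh(W)           # W = sum_i lam_i u_i u_i^*
mu, V = np.linalg.eigh(Y)            # Y = sum_j mu_j v_j v_j^*

lhs = np.trace(H @ np.linalg.matrix_power(W, q)
                 @ H @ np.linalg.matrix_power(Y, 2 * r - q))
inner = np.abs(U.T @ H @ V) ** 2     # matrix of |u_i^* H v_j|^2
rhs = np.abs(lam)[:, None] ** q * np.abs(mu)[None, :] ** (2 * r - q) * inner

assert lhs <= rhs.sum() + 1e-9
```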
Q11. What is the variance parameter for the random matrix isv(Z)?
The variance parameter for the random matrix is

v(Z) = ‖∑_{i=1}^d ∑_{j=1}^n E(δ_ij − n^{-1})² · E_ii‖ = ‖∑_{i=1}^d ∑_{j=1}^n n^{-1}(1 − n^{-1}) · E_ii‖ = 1 − n^{-1},

since ∑_{j=1}^n n^{-1}(1 − n^{-1}) = 1 − n^{-1} and ∑_{i=1}^d E_ii = I_d.
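The scalar factor in this computation is just the variance of a Bernoulli(1/n) indicator, E(δ − n^{-1})² = n^{-1}(1 − n^{-1}), which sums over j to 1 − n^{-1} (assuming, as the display suggests, that each δ_ij is a Bernoulli(1/n) indicator):

```python
import math

# Variance of a Bernoulli(1/n) indicator delta:
# E(delta - 1/n)^2 = (1/n)(1 - 1/n), and summing over j = 1..n gives 1 - 1/n.
n = 10
p = 1.0 / n
second_moment = p * (1 - p) ** 2 + (1 - p) * (0 - p) ** 2   # E(delta - p)^2
assert math.isclose(second_moment, p * (1 - p))
assert math.isclose(n * second_moment, 1 - 1.0 / n)
```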
Q12. What is the first approximation of the central limit theorem?
{γi } is an independent family of standard normal variables, and the first approximation follows from the central limit theorem.
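The central limit theorem behind this approximation can be illustrated by simulation: a normalized sum of i.i.d. random signs has mean approximately 0 and standard deviation approximately 1, matching a standard normal γ:

```python
import numpy as np

# CLT illustration: (1/sqrt(n)) * sum of n i.i.d. random signs is
# approximately a standard normal random variable.
rng = np.random.default_rng(7)
n, trials = 500, 5000
sums = rng.choice([-1.0, 1.0], size=(trials, n)).sum(axis=1) / np.sqrt(n)

assert abs(sums.mean()) < 0.06          # mean approximately 0
assert abs(sums.std() - 1.0) < 0.05     # standard deviation approximately 1
```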
Q13. What is the spectral norm for a Hermitian matrix?
For an Hermitian matrix, the spectral norm can be written in terms of the eigenvalues:

‖H‖ = max{λ_max(H), −λ_min(H)} for each Hermitian matrix H. (2.9)

As a consequence,

‖A‖ = λ_max(A) for each positive-semidefinite matrix A. (2.10)

This discussion implies that ‖H‖^{2p} = ‖H^{2p}‖ for each Hermitian H and each nonnegative integer p.
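All three facts, (2.9), (2.10), and the even-power identity, can be verified numerically (random matrices chosen for illustration):

```python
import numpy as np

# Check (2.9): ||H|| = max(lam_max, -lam_min) for Hermitian H;
# (2.10): ||A|| = lam_max(A) for positive-semidefinite A;
# and ||H||^{2p} = ||H^{2p}|| for nonnegative integer p.
rng = np.random.default_rng(8)
d = 6
G = rng.standard_normal((d, d))
H = (G + G.T) / 2
lam = np.linalg.eigvalsh(H)                     # eigenvalues in ascending order

assert np.isclose(np.linalg.norm(H, 2), max(lam[-1], -lam[0]))      # (2.9)

A = G @ G.T                                     # positive-semidefinite matrix
assert np.isclose(np.linalg.norm(A, 2), np.linalg.eigvalsh(A)[-1])  # (2.10)

p = 3
assert np.isclose(np.linalg.norm(H, 2) ** (2 * p),
                  np.linalg.norm(np.linalg.matrix_power(H, 2 * p), 2))
```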