SCIENCE CHINA Mathematics · ARTICLES · January 2015, Vol. 58, No. 1: 197–212
doi: 10.1007/s11425-014-4930-z
© Science China Press and Springer-Verlag Berlin Heidelberg 2014   math.scichina.com   link.springer.com
Linear operators and positive semidefiniteness of symmetric tensor spaces

LUO ZiYan^{1,*}, QI LiQun^{2} & YE YinYu^{3}

^1 State Key Laboratory of Rail Traffic Control and Safety, Beijing Jiaotong University, Beijing 100044, China;
^2 Department of Applied Mathematics, The Hong Kong Polytechnic University, Kowloon, Hong Kong, China;
^3 Department of Management Science and Engineering, Stanford University, Stanford, CA 94305, USA

Email: zyluo@bjtu.edu.cn, maqilq@polyu.edu.hk, yinyu-ye@stanford.edu

Received August 26, 2013; accepted March 2, 2014; published online November 14, 2014
Abstract  We study symmetric tensor spaces and cones arising from polynomial optimization and physical sciences. We prove a decomposition invariance theorem for linear operators over the symmetric tensor space, which leads to several other interesting properties in symmetric tensor spaces. We then consider the positive semidefiniteness of linear operators, which yields the convexity of the Frobenius norm function of a symmetric tensor. Furthermore, we characterize the symmetric positive semidefinite tensor (SDT) cone by employing the properties of linear operators, describe some face structures of its dual cone, and analyze its relationship to many other tensor cones. In particular, we show that the cone is self-dual if and only if the polynomial is quadratic, give specific characterizations of tensors that are in the primal cone but not in the dual for higher-order cases, and develop a complete relationship map among the tensor cones appearing in the literature.

Keywords  symmetric tensor, symmetric positive semidefinite tensor cone, linear operator, SOS cone

MSC(2010)  15A69, 53A45, 47A05, 53C35

Citation: Luo Z Y, Qi L Q, Ye Y Y. Linear operators and positive semidefiniteness of symmetric tensor spaces. Sci China Math, 2015, 58: 197–212, doi: 10.1007/s11425-014-4930-z
1 Introduction

A symmetric tensor is a higher-order generalization of a symmetric matrix. Analogous to the fact that a symmetric matrix can be regarded both as a linear operator on some Euclidean space and as an element of the space of symmetric matrices, a symmetric tensor acts both as a multilinear operator on some Cartesian product of Euclidean spaces and as an element of the space of symmetric tensors. In the former role, it has recently gained intense attention due to its wide applications to polynomial optimization [9], higher-order derivatives of smooth functions [13], and moments and cumulants of random vectors [15] on the theoretical side, and to physical sciences such as imaging technologies on the practical side. In the latter role, new mathematical developments for symmetric tensors involve tensor eigenvalues [17], tensor ranks and symmetric outer product decompositions [6], all of which are generalizations of symmetric matrix concepts. Linear operators are fundamental and essential for any linear space, and hence for the symmetric tensor space. However, few papers concentrate on linear operators and their properties acting on symmetric tensors in a tensor-level manner.

* Corresponding author

In this paper, we are interested in a class of special linear operators on symmetric tensor spaces, which further contribute to exploiting properties of a special subset of symmetric tensor spaces: the cone of positive semidefinite symmetric tensors (the SDT cone for short). Intimately related to positive semidefinite symmetric tensors are nonnegative homogeneous polynomials, where nonnegativity is an intrinsic property of polynomial functions, as one can see from quadratic polynomial functions. Let x ∈ R^n and let m be a positive integer. An m-order homogeneous polynomial function in x can be written as

    f_A(x) = Σ_{i_1,...,i_m=1}^n A_{i_1···i_m} x_{i_1} x_{i_2} ··· x_{i_m}.    (1.1)
The coefficients of the polynomial can be regarded as an m-order n-dimensional real tensor, denoted by A, which is symmetric, i.e., invariant under any permutation of its indices. Such a tensor A is said to be positive semidefinite if the corresponding polynomial f_A(x) is nonnegative for every x ∈ R^n. Obviously, positive semidefinite tensors can only occur when m is even. The SDT cone consists of all such tensors. In this regard, verifying the nonnegativity of a homogeneous polynomial is equivalent to verifying that the associated tensor lies in the SDT cone. Additionally, in many real applications, a tensor, whether of second or higher order, must be positive semidefinite to be physically meaningful. For example, the diffusion tensor involved in diffusion-weighted MRI is a symmetric positive semidefinite tensor of order 4 and dimension 3 (see [3]). All of this shows that the study of the SDT cone is necessary and meaningful.
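As an illustration of (1.1) and of positive semidefiniteness, the following sketch (assuming NumPy; the helper names are ours, not from the paper) builds the rank-one symmetric tensor [u]^4 and evaluates f_A(x) = ⟨A, [x]^m⟩. Since here f_A(x) = (u^T x)^4 ≥ 0, this A lies in the SDT cone S(4, 3):

```python
import numpy as np

def outer_power(x, m):
    """Rank-one symmetric tensor [x]^m: outer product of m copies of x."""
    t = x
    for _ in range(m - 1):
        t = np.multiply.outer(t, x)
    return t

def f_A(A, x):
    """Evaluate the homogeneous form (1.1): f_A(x) = <A, [x]^m>."""
    m = A.ndim
    return float(np.sum(A * outer_power(x, m)))

u = np.array([1.0, -2.0, 0.5])
A = outer_power(u, 4)                 # A = [u]^4, a 3x3x3x3 symmetric tensor
x = np.array([0.3, 1.1, -0.7])
assert np.isclose(f_A(A, x), np.dot(u, x) ** 4)   # f_A(x) = (u^T x)^4
assert f_A(A, x) >= 0                               # hence A is in the SDT cone
```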
We employ a special class of linear operators extensively to get a better understanding of the SDT cone and the corresponding tensor space. This class of special linear operators on symmetric tensor spaces is just an m-order generalization of the symmetric-scaled representation used in interior-point algorithms in the literature of semidefinite programming [21], and shares a decomposition invariance theorem resembling that of the symmetric-scaled representation. Several other interesting properties in symmetric tensor spaces follow from this invariance property. All these properties have the potential to build a foundation for learning more about symmetric tensor structures, which provide greater descriptive flexibility than the corresponding reshaped matrix-based structures.

The positive semidefiniteness (i.e., monotonicity and self-adjointness) of this class of linear operators is also considered, which leads to the convexity of the Frobenius norm function of a symmetric tensor. Specifically, for the case n = 3, we exhibit the matrix representation and the rank of such a matrix for the aforementioned class of linear operators, and then exploit the positive semidefiniteness of linear operators via the properties of the corresponding matrices. Here the concept of a matrix representation for a general linear operator was first proposed by Qi and Ye [18] by employing a linear isomorphism L which builds up a one-to-one correspondence from a tensor of dimension 3 to a real vector in R^κ with κ = (m+2)(m+1)/2, the number of independent components of a 3-dimensional symmetric tensor.
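The count κ = (m+2)(m+1)/2 can be checked directly: an m-order 3-dimensional symmetric tensor is determined by one entry per multiset of size m drawn from {1, 2, 3}. A quick standard-library check (the function name is ours):

```python
from itertools import combinations_with_replacement

def kappa(m, n=3):
    """Independent entries of an m-order n-dimensional symmetric tensor:
    one per multiset of m indices from {1, ..., n}."""
    return sum(1 for _ in combinations_with_replacement(range(n), m))

# for n = 3 this matches kappa = (m+2)(m+1)/2 from the text
for m in range(1, 9):
    assert kappa(m) == (m + 2) * (m + 1) // 2
```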
Based on the aforementioned properties of this class of linear operators, we proceed to characterize the SDT cone. Just as all positive semidefinite symmetric matrices form a closed pointed convex self-dual cone in the space of symmetric matrices, the SDT cone is easily verified to be a closed pointed convex cone, while generally no longer self-dual in the space of symmetric tensors. In particular, we show that the symmetric positive semidefinite tensor cone is self-dual if and only if m = 2. We also present other results for the dual of the SDT cone, such as a class of its faces, and give specific characterizations of tensors that are in the primal cone but not in the dual. In addition, we analyze the SDT cone's relationship to many other tensor cones, such as the nonnegative tensor, partially diagonal tensor, and sum-of-squares (SOS) cones. We present a complete relationship map among these cones. This also exposes some striking differences between tensors and matrices.

The organization of this paper is as follows. In Section 2, we propose a decomposition invariance property for a class of linear operators and describe some other interesting properties of such linear operators based on this decomposition invariance property. In Section 3, the positive semidefiniteness of such linear operators is considered. The symmetric positive semidefinite tensor cone and its relationships to its dual cone and to several other cones, such as the nonnegative tensor cone, the partially diagonal tensor cone and the sum-of-squares (SOS) cone, are characterized by the aforementioned properties of linear operators in Sections 4 and 5. Some concluding remarks are made in Section 6.
Before we proceed, we describe some notation used throughout this paper: T(m, n) stands for the set of all m-order n-dimensional real symmetric tensors; [x]^m := x ∘ x ∘ ··· ∘ x is the rank-one symmetric tensor generated by the outer product of m copies of a vector x; S(m, n) denotes the set of all positive semidefinite tensors in T(m, n); V(m, n) is the dual cone of S(m, n) in T(m, n); L(·) is the linear isomorphism between T(m, 3) and R^κ with κ = (m+1)(m+2)/2. Let F be a convex subset of some closed convex cone K; F ⊴ K means that F is a face of K, i.e., for any x, y ∈ K with x + y ∈ F, we have x, y ∈ F. The inner product of two tensors X, Y ∈ T(m, n) is defined as the component-wise product ⟨X, Y⟩ = Σ_{i_1,...,i_m=1}^n X_{i_1···i_m} Y_{i_1···i_m}, and ‖·‖_F denotes the Frobenius norm induced by this inner product. Unless otherwise pointed out, we restrict m to be even in the rest of the paper.
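A small NumPy sketch of the notation just introduced (helper names are illustrative): projecting a raw array into T(m, n) by symmetrization, the component-wise inner product, and the Frobenius norm it induces:

```python
import math
from itertools import permutations

import numpy as np

def symmetrize(T):
    """Project an m-order tensor onto T(m, n): average over all index permutations."""
    m = T.ndim
    return sum(np.transpose(T, p) for p in permutations(range(m))) / math.factorial(m)

def inner(X, Y):
    """Component-wise inner product <X, Y> = sum X_{i1...im} Y_{i1...im}."""
    return float(np.sum(X * Y))

rng = np.random.default_rng(0)
X = symmetrize(rng.standard_normal((3, 3, 3, 3)))
# X is symmetric: invariant under any permutation of its indices
assert np.allclose(X, np.transpose(X, (2, 0, 3, 1)))
# the Frobenius norm is the norm induced by the inner product
assert np.isclose(math.sqrt(inner(X, X)), np.linalg.norm(X))
```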
2 A decomposition invariance property

A linear operator is a basic operation on any linear space, and so on the tensor space. Qi and Ye [18] proposed a class of linear operators based on matrices in tensor spaces for the purpose of the orthogonal similarity of E-eigenvalues. Comon and Sorensen [7] employed this class of linear operators based on orthogonal matrices for tensor diagonalization. The definition of this class of linear operators is reviewed as follows.
Definition 2.1.  Let P = (P_{ij}) ∈ R^{n×n}. Define a linear operator P^{[m]} such that

    (P^{[m]}(A))_{i_1···i_m} := Σ_{j_1,...,j_m=1}^n P_{i_1 j_1} ··· P_{i_m j_m} A_{j_1···j_m},    (2.1)

for any A ∈ T(m, n) with (i_1, ..., i_m)-th entry A_{i_1···i_m}.
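Definition 2.1 says P^{[m]} multiplies every mode of A by the matrix P. A minimal NumPy sketch (the function name P_m is ours) computes (2.1) as m successive mode contractions; for m = 2 it reduces to the congruence transform P A P^T:

```python
import numpy as np

def P_m(P, A):
    """The operator P^[m] of Definition 2.1:
    (P^[m](A))_{i1...im} = sum_{j1,...,jm} P_{i1 j1} ... P_{im jm} A_{j1...jm},
    implemented as one contraction of P against each mode of A."""
    for axis in range(A.ndim):
        A = np.moveaxis(np.tensordot(P, A, axes=(1, axis)), 0, axis)
    return A

# for m = 2, P^[2] is the congruence transform A -> P A P^T
P = np.array([[1.0, 2.0], [0.0, 1.0]])
A = np.array([[2.0, 1.0], [1.0, 3.0]])
assert np.allclose(P_m(P, A), P @ A @ P.T)
```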
We use P^{[m]} to denote this linear operator in this paper, instead of the P^m used in [17], to avoid notational confusion with the power operation on matrices in subsequent analyses. Note that P^{[m]} is a linear operator from T(m, n) to itself according to [17, Proposition 10], and its definition is given in an entrywise manner. We now turn our attention to the decomposition form of this linear operator, based on the so-called symmetric outer product decomposition of symmetric tensors; see Comon et al. [6]: for any A ∈ T(m, n), one can find α_1, ..., α_s ∈ R and u^{(1)}, ..., u^{(s)} ∈ R^n such that

    A = Σ_{i=1}^s α_i [u^{(i)}]^m.    (2.2)

Though the symmetric outer product decomposition (2.2) of any given A is not unique in general [6,12], we prove that the linear operator P^{[m]} enjoys the following decomposition invariance property.
Theorem 2.2 (Decomposition invariance).  For any P ∈ R^{n×n} and any A ∈ T(m, n) with any of its symmetric outer product decompositions A = Σ_{i=1}^s α_i [u^{(i)}]^m, we have

    P^{[m]}(A) = Σ_{i=1}^s α_i [P(u^{(i)})]^m.    (2.3)
Proof.  For any A ∈ T(m, n) and any i_1, ..., i_m ∈ {1, 2, ..., n}, we have

    ( Σ_{i=1}^s α_i [P(u^{(i)})]^m )_{i_1···i_m}
    = Σ_{i=1}^s α_i (P u^{(i)})_{i_1} ··· (P u^{(i)})_{i_m}
    = Σ_{i=1}^s α_i ( Σ_{j_1=1}^n P_{i_1 j_1} u^{(i)}_{j_1} ) ··· ( Σ_{j_m=1}^n P_{i_m j_m} u^{(i)}_{j_m} )
    = Σ_{i=1}^s α_i Σ_{j_1,...,j_m=1}^n P_{i_1 j_1} u^{(i)}_{j_1} ··· P_{i_m j_m} u^{(i)}_{j_m}
    = Σ_{j_1,...,j_m=1}^n P_{i_1 j_1} ··· P_{i_m j_m} Σ_{i=1}^s α_i u^{(i)}_{j_1} ··· u^{(i)}_{j_m}
    = Σ_{j_1,...,j_m=1}^n P_{i_1 j_1} ··· P_{i_m j_m} A_{j_1···j_m}
    = (P^{[m]}(A))_{i_1···i_m},

where the last equality is achieved by (2.1). This completes the proof.
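Theorem 2.2 can be sanity-checked numerically: applying the entrywise definition (2.1) to A = Σ α_i [u^{(i)}]^m agrees with Σ α_i [P u^{(i)}]^m. A sketch assuming NumPy (helper names illustrative):

```python
import numpy as np

def outer_power(x, m):
    """Rank-one symmetric tensor [x]^m."""
    t = x
    for _ in range(m - 1):
        t = np.multiply.outer(t, x)
    return t

def P_m(P, A):
    """Entrywise definition (2.1) via mode contractions."""
    for axis in range(A.ndim):
        A = np.moveaxis(np.tensordot(P, A, axes=(1, axis)), 0, axis)
    return A

rng = np.random.default_rng(1)
m, n, s = 4, 3, 5
alphas = rng.standard_normal(s)
us = rng.standard_normal((s, n))
P = rng.standard_normal((n, n))

A = sum(a * outer_power(u, m) for a, u in zip(alphas, us))          # decomposition (2.2)
lhs = P_m(P, A)                                                      # definition (2.1)
rhs = sum(a * outer_power(P @ u, m) for a, u in zip(alphas, us))     # invariance form (2.3)
assert np.allclose(lhs, rhs)
```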
It is worth pointing out that P^{[m]} can be regarded as an extension of the symmetric-scaled representation Q_P in the setting of the symmetric matrices space, i.e.,

    Q_P(A) = P A P^T = P^{[2]}(A),  for all A ∈ S(2, n).

Such an operator plays an essential role in the design of interior-point methods for symmetry preservation in the context of semidefinite programming [2, 21] and has also been generalized to the Euclidean Jordan algebra, where it is called the quadratic representation; see [8, 19, 20] for more details.
Next, we show that the operator P^{[m]} possesses several nice properties analogous to those of the quadratic representation. The decomposition invariance property developed in Theorem 2.2 greatly facilitates the proofs of these properties.

Proposition 2.3.  For any P ∈ R^{n×n}, we have
(i) (P^{[m]})^k = (P^k)^{[m]};
(ii) if P is invertible, then (P^{[m]})^{-1} = (P^{-1})^{[m]};
(iii) ⟨P^{[m]}(A), B⟩ = ⟨A, (P^T)^{[m]}(B)⟩, for any A, B ∈ T(m, n);
(iv) if P is invertible, then ⟨P^{[m]}(A), (P^{-T})^{[m]}(B)⟩ = ⟨A, B⟩, for any A, B ∈ T(m, n);
(v) if m is odd, then P^{[m]} is a symmetric operator iff P is a symmetric matrix;
(vi) if m is even, then P^{[m]} is a symmetric operator iff P is a symmetric matrix or a skew-symmetric matrix.

Proof.  The assertions in (i) and (ii) follow directly from the decomposition invariance property of P^{[m]}.
For any A, B ∈ T(m, n), with A = Σ_{i=1}^s α_i [u^{(i)}]^m and B = Σ_{j=1}^t β_j [w^{(j)}]^m, it follows from (2.3) that

    ⟨P^{[m]}(A), B⟩
    = ⟨ Σ_{i=1}^s α_i [P u^{(i)}]^m, Σ_{j=1}^t β_j [w^{(j)}]^m ⟩
    = Σ_{i=1}^s Σ_{j=1}^t α_i β_j ⟨[P u^{(i)}]^m, [w^{(j)}]^m⟩
    = Σ_{i=1}^s Σ_{j=1}^t α_i β_j ((w^{(j)})^T P u^{(i)})^m
    = Σ_{i=1}^s Σ_{j=1}^t α_i β_j ((u^{(i)})^T P^T w^{(j)})^m
    = Σ_{i=1}^s Σ_{j=1}^t α_i β_j ⟨[P^T w^{(j)}]^m, [u^{(i)}]^m⟩
    = ⟨ Σ_{i=1}^s α_i [u^{(i)}]^m, Σ_{j=1}^t β_j [P^T w^{(j)}]^m ⟩
    = ⟨A, (P^T)^{[m]}(B)⟩.

This proves the assertion in (iii). The statement in (iv) follows from (ii) and (iii), and (v) and (vi) follow from (iii).
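Items (i)–(iii) of Proposition 2.3 are easy to confirm numerically. A NumPy sketch (helper names illustrative; a random P is invertible with probability 1):

```python
import numpy as np

def P_m(P, A):
    """The operator P^[m] of Definition 2.1 via mode contractions."""
    for axis in range(A.ndim):
        A = np.moveaxis(np.tensordot(P, A, axes=(1, axis)), 0, axis)
    return A

rng = np.random.default_rng(2)
n, m = 3, 4
P = rng.standard_normal((n, n))
A = rng.standard_normal((n,) * m)
B = rng.standard_normal((n,) * m)

# (i)  (P^[m])^2 = (P^2)^[m]
assert np.allclose(P_m(P, P_m(P, A)), P_m(P @ P, A))
# (ii) (P^[m])^{-1} = (P^{-1})^[m]
assert np.allclose(P_m(np.linalg.inv(P), P_m(P, A)), A)
# (iii) <P^[m](A), B> = <A, (P^T)^[m](B)>
assert np.isclose(np.sum(P_m(P, A) * B), np.sum(A * P_m(P.T, B)))
```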

We now characterize the range and null spaces of the linear operator P^{[m]}; the proofs are straightforward and we omit them here. They are basically determined by the range and null spaces of the matrix P, due to Theorem 2.2.
Proposition 2.4.  For any P ∈ R^{n×n},
(i) the range space and the null space of P^{[m]} are given by

    R(P^{[m]}) = { Σ_{i=1}^r α_i [u^{(i)}]^m : α_i ∈ R, u^{(i)} ∈ R(P), i = 1, 2, ..., r, r ∈ N },
    N(P^{[m]}) = { Σ_{i=1}^r α_i [u^{(i)}]^m : α_i ∈ R, u^{(i)} ∈ N(P), i = 1, 2, ..., r, r ∈ N };

(ii) if P is invertible, then R(P^{[m]}) = T(m, n) and N(P^{[m]}) = {0}.
We are also able to give explicit expressions for the range and null spaces of P^{[m]} for any P ∈ R^{n×n} in terms of the generalized inverse of a matrix, instead of the symmetric outer product decomposition form presented above. This resembles the projection theory of regular matrices.

Proposition 2.5.  For any P ∈ R^{n×n}, let P^† be the generalized inverse of P. Then,
(i) (P^{[m]})^† = (P^†)^{[m]};
(ii) R(P^{[m]}) = (P P^†)^{[m]}(T(m, n)) and N(P^{[m]}) = (I − (P^† P)^{[m]})(T(m, n));
(iii) for any A ∈ T(m, n) and the tensor equation P^{[m]}(X) = A, X = (P^†)^{[m]}(A) + (I − (P^† P)^{[m]})(B), B ∈ T(m, n), is a general solution to the equation if it is consistent, or a least-squares solution if it is inconsistent.
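Item (iii) of Proposition 2.5 can be checked with a deliberately rank-deficient P, using the Moore–Penrose inverse as the generalized inverse (a sketch assuming NumPy; helper names illustrative). When the right-hand side is consistent, (P^†)^{[m]}(A) already solves the equation:

```python
import numpy as np

def P_m(P, A):
    """The operator P^[m] of Definition 2.1 via mode contractions."""
    for axis in range(A.ndim):
        A = np.moveaxis(np.tensordot(P, A, axes=(1, axis)), 0, axis)
    return A

rng = np.random.default_rng(3)
n, m = 3, 4
# a rank-2 matrix P, so P^[m] is genuinely singular
P = np.outer(rng.standard_normal(n), rng.standard_normal(n)) \
    + np.outer(rng.standard_normal(n), rng.standard_normal(n))
Pd = np.linalg.pinv(P)                # Moore-Penrose generalized inverse

C = rng.standard_normal((n,) * m)
A = P_m(P, C)                          # consistent right-hand side: A in R(P^[m])
X = P_m(Pd, A)                         # candidate solution (P^dagger)^[m](A)
assert np.allclose(P_m(P, X), A)       # it solves P^[m](X) = A
```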
Actually, the class of linear operators defined in Definition 2.1, together with its linear combinations, plays a fundamental role in the symmetric tensor space: it can act as a subtensor generator, as certain special projection operators, as diagonalization operators, and even as the Hadamard product, as the following proposition illustrates.
Proposition 2.6.  For any B ∈ T(m, n),
(i) let Γ_k be any index subset of {1, 2, ..., n} with |Γ_k| = k (k = 1, ..., n), let B_{Γ_k} ∈ T(m, k) be the corresponding m-order k-dimensional sub-tensor of B, and let \widehat{B}_{Γ_k} be the natural expansion of B_{Γ_k}. Then (I_{Γ_k})^{[m]}(B) = \widehat{B}_{Γ_k}, where I_{Γ_k} ∈ R^{n×n} is the matrix with (i, i)-entry 1 if i ∈ Γ_k and all other entries 0;
(ii) let w^{(1)}, w^{(2)}, ..., w^{(h)} ∈ R^n be pairwise orthogonal unit vectors, and let S_{{w^{(1)},w^{(2)},...,w^{(h)}}} := { Σ_{i=1}^h k_i [w^{(i)}]^m : k_i ∈ R, i = 1, 2, ..., h }. Then the projection of B onto S_{{w^{(1)},w^{(2)},...,w^{(h)}}} is Σ_{i=1}^h (P^{(i)})^{[m]}(B), with P^{(i)} := w^{(i)} (w^{(i)})^T;
(iii) let diag(B) be the diagonal tensor which retains the diagonal entries of B and sets the others to 0. Then diag(B) = Σ_{i=1}^n (\tilde{P}^{(i)})^{[m]}(B), with \tilde{P}^{(i)} := e^{(i)} (e^{(i)})^T, where e^{(i)} ∈ R^n has 1 at the i-th entry and 0 everywhere else;
(iv) let A := Σ_{i=1}^s α_i [u^{(i)}]^m and let A ∘ B be the Hadamard product of A and B. Then A ∘ B = Σ_{i=1}^s α_i (\bar{P}^{(i)})^{[m]}(B), with \bar{P}^{(i)} the diagonal matrix generated by u^{(i)}.
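Items (iii) and (iv) of Proposition 2.6 can be verified numerically: rank-one projectors e^{(i)}(e^{(i)})^T extract the diagonal tensor, and diagonal matrices diag(u^{(i)}) realize the Hadamard product. A sketch assuming NumPy (helper names illustrative):

```python
import numpy as np

def outer_power(x, m):
    """Rank-one symmetric tensor [x]^m."""
    t = x
    for _ in range(m - 1):
        t = np.multiply.outer(t, x)
    return t

def P_m(P, A):
    """The operator P^[m] of Definition 2.1 via mode contractions."""
    for axis in range(A.ndim):
        A = np.moveaxis(np.tensordot(P, A, axes=(1, axis)), 0, axis)
    return A

rng = np.random.default_rng(4)
n, m, s = 3, 4, 4
B = rng.standard_normal((n,) * m)

# (iii) diag(B) = sum_i (e_i e_i^T)^[m](B): diagonal entries kept, others zeroed
D = sum(P_m(np.outer(e, e), B) for e in np.eye(n))
assert np.isclose(D[0, 0, 0, 0], B[0, 0, 0, 0])
assert np.isclose(D[0, 1, 0, 0], 0.0)

# (iv) Hadamard product A∘B via the diagonal matrices diag(u^{(i)})
alphas = rng.standard_normal(s)
us = rng.standard_normal((s, n))
A = sum(a * outer_power(u, m) for a, u in zip(alphas, us))
H = sum(a * P_m(np.diag(u), B) for a, u in zip(alphas, us))
assert np.allclose(H, A * B)          # entrywise product equals A∘B
```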
Proof.  (i), (ii) and (iii) follow by definition. For (iv), take any B ∈ T(m, n) with any of its symmetric outer product decompositions B = Σ_{j=1}^t β_j [v^{(j)}]^m. It follows from the decomposition invariance property that, for any i_1, ..., i_m ∈ {1, 2, ..., n},

    ( Σ_{i=1}^s α_i (\bar{P}^{(i)})^{[m]}(B) )_{i_1···i_m}
    = Σ_{i=1}^s α_i Σ_{j=1}^t β_j ( [\bar{P}^{(i)} v^{(j)}]^m )_{i_1···i_m}
    = Σ_{i=1}^s α_i Σ_{j=1}^t β_j (u^{(i)})_{i_1} (v^{(j)})_{i_1} ··· (u^{(i)})_{i_m} (v^{(j)})_{i_m}
    = Σ_{i=1}^s α_i (u^{(i)})_{i_1} ··· (u^{(i)})_{i_m} ( Σ_{j=1}^t β_j (v^{(j)})_{i_1} ··· (v^{(j)})_{i_m} )
    = A_{i_1···i_m} B_{i_1···i_m} = (A ∘ B)_{i_1···i_m},

which proves (iv).