Topic

Spectral graph theory

About: Spectral graph theory is a research topic. Over the lifetime, 1334 publications have been published within this topic receiving 77373 citations.


Papers
Book ChapterDOI
01 Jan 1993
TL;DR: In this paper, the Laplacian L(G) of G is realized as the Gram matrix of the outward normals of ∑(G), and it is shown that the spectral properties of L(G) are reflected by the geometric shape of the Steiner circumscribed ellipsoid S of ∑(G). In particular, the squares of the half-axes of S are proportional to the reciprocals of the eigenvalues of L(G).
Abstract: Let G be a finite undirected connected graph with n vertices. We assign to G an (n - 1)-simplex ∑(G) in the point Euclidean (n - 1)-space in such a way that the Laplacian L(G) of G is the Gram matrix of the outward normals of ∑(G). It is shown that the spectral properties of L(G) are reflected by the geometric shape of the Steiner circumscribed ellipsoid S of ∑(G) in a simple manner. In particular, the squares of the half-axes of S are proportional to the reciprocals of the eigenvalues of L(G). Also, a previously discovered relationship to resistive electrical circuits is mentioned.
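The key computational object in the abstract is the graph Laplacian and its eigenvalues; a minimal sketch (using an assumed 4-vertex path graph, not an example from the paper) of the stated reciprocal relation between eigenvalues and the ellipsoid's half-axes:

```python
import numpy as np

# Illustrative small graph (an assumption): the path on 4 vertices.
# Laplacian L = D - A, with D the degree matrix and A the adjacency matrix.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A

# For a connected graph the Laplacian spectrum is 0 = mu_1 < mu_2 <= ...
mu = np.sort(np.linalg.eigvalsh(L))

# Per the abstract, the squares of the half-axes of the Steiner ellipsoid
# are proportional to the reciprocals of the nonzero eigenvalues of L(G),
# so up to a common scale factor the half-axes behave like:
half_axes = 1.0 / np.sqrt(mu[1:])   # drop the zero eigenvalue
```

Larger eigenvalues thus correspond to shorter axes, so a "round" ellipsoid signals a tightly clustered Laplacian spectrum.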

25 citations

Posted Content
TL;DR: This paper shows that, by sampling a significantly smaller subset of vertices and using simple least squares, the power spectrum of the graph signal can be reconstructed from the subsampled observations, without any spectral priors.
Abstract: In this paper we focus on subsampling stationary random processes that reside on the vertices of undirected graphs. Second-order stationary graph signals are obtained by filtering white noise and they admit a well-defined power spectrum. Estimating the graph power spectrum forms a central component of stationary graph signal processing and related inference tasks. We show that by sampling a significantly smaller subset of vertices and using simple least squares, we can reconstruct the power spectrum of the graph signal from the subsampled observations, without any spectral priors. In addition, a near-optimal greedy algorithm is developed to design the subsampling scheme.
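The quantities in the abstract can be made concrete with a small numerical sketch. This is a full-observation baseline only (the paper's contribution is recovering the same spectrum from a small vertex subset via least squares); the cycle graph and the low-pass filter are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Laplacian of a small cycle graph (an illustrative choice).
n = 8
A = np.roll(np.eye(n), 1, axis=0) + np.roll(np.eye(n), -1, axis=0)
L = np.diag(A.sum(axis=1)) - A
mu, U = np.linalg.eigh(L)          # graph Fourier basis U, frequencies mu

# A second-order stationary graph signal: white noise shaped by a
# graph filter with spectral response h (assumed low-pass here).
h = 1.0 / (1.0 + mu)
true_psd = h ** 2                  # well-defined power spectrum

# Estimate the power spectrum by averaging squared GFT coefficients
# over many independent realizations of the filtered process.
reps = 20000
X = U @ (h[:, None] * (U.T @ rng.standard_normal((n, reps))))
est_psd = np.mean((U.T @ X) ** 2, axis=1)
```

With enough realizations the empirical estimate converges to `h**2`, which is the power spectrum the paper recovers from far fewer vertex observations.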

25 citations

Proceedings ArticleDOI
04 May 2014
TL;DR: This paper proposes a novel graph signal coarsening method with spectral invariance, which means both the spectrum of the graph and the spectrum of the graph signal are approximately kept invariant.
Abstract: Signal processing on graphs is an emerging field that attracts increasing attention. For applications such as multiscale transforms on graphs, it is often necessary to get a coarsened version of graph signal with its underlying graph. However, most of the existing methods use only topology information but no property of graph signals to complete the process. In this paper, we propose a novel graph signal coarsening method with spectral invariance, which means both the spectrum of the graph and the spectrum of the graph signal are approximately kept invariant. The problem is formulated into an optimization problem and is solved by projected subgradient method. Experiment results verify the effectiveness of the coarsening method.
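The spectral-invariance criterion can be illustrated with a toy coarsening. The pairwise merging below is a simple stand-in (an assumption), not the paper's optimized method, but it shows what is compared: the Laplacian spectra before and after coarsening.

```python
import numpy as np

# Toy example: coarsen a 6-vertex path graph by merging adjacent pairs.
n = 6
A = np.diag(np.ones(n - 1), 1)
A = A + A.T
L = np.diag(A.sum(axis=1)) - A

# Partition matrix P: coarse vertex i aggregates fine vertices 2i, 2i+1
# (a naive topology-only coarsening, unlike the paper's signal-aware one).
P = np.zeros((3, n))
for i in range(3):
    P[i, 2 * i] = P[i, 2 * i + 1] = 1.0
P /= np.sqrt(2)                     # orthonormal rows

Lc = P @ L @ P.T                    # coarsened Laplacian (projection)

mu_fine = np.sort(np.linalg.eigvalsh(L))
mu_coarse = np.sort(np.linalg.eigvalsh(Lc))
# "Spectral invariance" asks mu_coarse to track the low end of mu_fine.
```

Because `P` has orthonormal rows, the coarse eigenvalues interlace with the fine ones; the paper's optimization additionally keeps the spectrum of the signal itself approximately invariant.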

25 citations

Proceedings ArticleDOI
06 Jul 2020
TL;DR: Graph Learning Neural Networks (GLNNs) as mentioned in this paper exploit the optimization of graphs (the adjacency matrix in particular) from both data and tasks by leveraging spectral graph theory, and propose the objective of graph learning from a sparsity constraint, properties of a valid adjacency matrix, as well as a graph Laplacian regularizer via maximum a posteriori estimation.
Abstract: Graph Convolutional Neural Networks (GCNNs) are generalizations of CNNs to graph-structured data, in which convolution is guided by the graph topology. In many cases where graphs are unavailable, existing methods manually construct graphs or learn task-driven adaptive graphs. In this paper, we propose Graph Learning Neural Networks (GLNNs), which exploit the optimization of graphs (the adjacency matrix in particular) from both data and tasks. Leveraging on spectral graph theory, we propose the objective of graph learning from a sparsity constraint, properties of a valid adjacency matrix as well as a graph Laplacian regularizer via maximum a posteriori estimation. The optimization objective is then integrated into the loss function of the GCNN, which adapts the graph topology to not only labels of a specific task but also the input data. Experimental results show that our proposed GLNN significantly outperforms state-of-the-art approaches over widely adopted social network datasets and citation network datasets for semi-supervised classification.
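The two spectral ingredients named in the abstract, an l1 sparsity term on the adjacency matrix and a graph Laplacian regularizer, are easy to write down. The variable names and trade-off weights below are assumptions for illustration, not the paper's code:

```python
import numpy as np

rng = np.random.default_rng(0)

# X holds node feature vectors; A is a candidate adjacency matrix.
n, d = 5, 3
X = rng.standard_normal((n, d))
A = np.abs(rng.standard_normal((n, n)))
A = (A + A.T) / 2                   # a valid adjacency is symmetric ...
np.fill_diagonal(A, 0.0)            # ... with an empty diagonal
L = np.diag(A.sum(axis=1)) - A

sparsity = np.abs(A).sum()              # l1 sparsity constraint on A
smoothness = np.trace(X.T @ L @ X)      # graph Laplacian regularizer
alpha, beta = 0.1, 1.0                  # assumed trade-off weights
loss = alpha * sparsity + beta * smoothness
```

The regularizer is small when strongly connected nodes have similar features, via the identity tr(XᵀLX) = ½ Σᵢⱼ Aᵢⱼ‖xᵢ − xⱼ‖², which is what lets the learned adjacency adapt to both the data and the task labels.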

25 citations

Journal ArticleDOI
TL;DR: In this article, a new method is given to calculate the transfer function of a graph by analyzing the strong components of the graph, the elementary paths between two nodes, and the linear subgraphs.
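The abstract is not shown in this listing, but one ingredient the TL;DR names, enumerating the elementary paths between two nodes, can be sketched directly (the graph and function name below are assumptions for illustration):

```python
def elementary_paths(adj, s, t, path=None):
    """Yield all elementary (simple, no repeated vertex) paths from s to t
    in a digraph given as an adjacency dict {node: [successors]}."""
    path = (path or []) + [s]
    if s == t:
        yield path
        return
    for v in adj.get(s, []):
        if v not in path:           # 'elementary' = no vertex revisited
            yield from elementary_paths(adj, v, t, path)

# Toy digraph (assumed): 0 -> {1, 2}, 1 -> {2, 3}, 2 -> {3}.
g = {0: [1, 2], 1: [2, 3], 2: [3], 3: []}
paths = list(elementary_paths(g, 0, 3))
```

These path enumerations are the raw material that transfer-function formulas over signal-flow graphs combine with loop (linear subgraph) terms.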

25 citations


Network Information
Related Topics (5)
Bounded function: 77.2K papers, 1.3M citations (82% related)
Upper and lower bounds: 56.9K papers, 1.1M citations (82% related)
Iterative method: 48.8K papers, 1.2M citations (81% related)
Matrix (mathematics): 105.5K papers, 1.9M citations (80% related)
Optimization problem: 96.4K papers, 2.1M citations (79% related)
Performance
Metrics
No. of papers in the topic in previous years
Year  Papers
2024  1
2023  16
2022  36
2021  53
2020  86
2019  81