Topic

Spectral graph theory

About: Spectral graph theory is a research topic. Over its lifetime, 1,334 publications have been published on this topic, receiving 77,373 citations.


Papers
Journal ArticleDOI
Zhibin Du
TL;DR: An algorithm using $O(n)$ time and space is presented to diagonalize the distance matrix of a cograph, from which one can deduce a diagonal matrix congruent to $D + \lambda I$.
Abstract: Cographs are a well-known class of graphs in graph theory; they can be generated from a single vertex by applying a series of complement (or, equivalently, join) operations and disjoint union operations. The distance spectrum of graphs has been a rather active topic in spectral graph theory in recent years. This paper is devoted to revealing some properties of the distance spectrum of cographs. More precisely, we present an algorithm, using $O(n)$ time and space, to diagonalize the distance matrix of a cograph, from which one can deduce a diagonal matrix congruent to the matrix $D + \lambda I$, where $D$ is the distance matrix of a cograph, $\lambda$ is a real number, and $I$ is the identity matrix. We also give some applications of this algorithm to the inertia of the distance matrix of complete multipartite graphs.

2 citations
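
The abstract describes a congruence-based $O(n)$ diagonalization; as a hedged illustration only, the sketch below computes the inertia of $D + \lambda I$ for a small cograph by brute-force eigendecomposition (an $O(n^3)$ NumPy/NetworkX stand-in, not the paper's algorithm). The complete multipartite example matches the application the abstract mentions.

```python
import numpy as np
import networkx as nx

# A complete multipartite graph is a cograph: it arises from
# complements and disjoint unions of single vertices.
G = nx.complete_multipartite_graph(2, 3, 4)
n = G.number_of_nodes()

# Dense distance matrix D (all-pairs shortest paths).
D = np.array(nx.floyd_warshall_numpy(G))

def inertia(M, tol=1e-9):
    """Return (n_+, n_0, n_-): counts of positive, zero, and
    negative eigenvalues of the symmetric matrix M."""
    eig = np.linalg.eigvalsh(M)
    return (int(np.sum(eig > tol)),
            int(np.sum(np.abs(eig) <= tol)),
            int(np.sum(eig < -tol)))

lam = 1.0  # the shift lambda in D + lambda*I
print(inertia(D + lam * np.eye(n)))
```

Congruence preserves inertia, so the triple printed here is exactly the invariant the paper's diagonalization recovers, only computed the slow way.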

Proceedings ArticleDOI
01 Dec 2012
TL;DR: The stability properties of the ensuing deformed consensus protocol are studied in terms of parameter s for some special families of undirected graphs, and for graphs of arbitrary topology by leveraging the spectral theory of quadratic eigenvalue problems.
Abstract: This paper studies a generalization of the standard continuous-time consensus protocol, obtained by replacing the Laplacian matrix of the undirected communication graph with the so-called deformed Laplacian. The deformed Laplacian is a second-degree matrix polynomial in the real variable s which reduces to the standard Laplacian for s equal to unity. The stability properties of the ensuing deformed consensus protocol are studied in terms of parameter s for some special families of undirected graphs, and for graphs of arbitrary topology by leveraging the spectral theory of quadratic eigenvalue problems. Examples and simulation results are provided to illustrate our theoretical findings.

2 citations
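
As a hedged sketch: assuming the deformed Laplacian takes the common second-degree polynomial form $\Delta(s) = I - sA + s^2(D - I)$ known from the Ihara zeta-function literature (an assumption; the paper's exact definition is not reproduced here), the snippet below builds $\Delta(s)$ for a small cycle graph, checks that $\Delta(1)$ recovers the standard Laplacian $D - A$, and inspects how the spectrum moves with $s$.

```python
import numpy as np
import networkx as nx

def deformed_laplacian(G, s):
    """Assumed form Delta(s) = I - s*A + s^2*(D - I), a second-degree
    matrix polynomial in s that reduces to D - A at s = 1."""
    A = nx.to_numpy_array(G)
    Dg = np.diag(A.sum(axis=1))          # degree matrix
    I = np.eye(G.number_of_nodes())
    return I - s * A + s**2 * (Dg - I)

G = nx.cycle_graph(5)
L = nx.laplacian_matrix(G).toarray()

# Sanity check: Delta(1) equals the standard Laplacian D - A.
assert np.allclose(deformed_laplacian(G, 1.0), L)

# Spectrum of Delta(s) as s varies; the protocol's stability
# analysis hinges on the signs of these eigenvalues.
for s in (0.5, 1.0, 1.5):
    print(s, np.round(np.linalg.eigvalsh(deformed_laplacian(G, s)), 3))
```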

Posted Content
TL;DR: In this article, the eccentricity version of the Laplacian energy of a graph G is investigated, where the Laplacian energy is the sum of the absolute values of the differences between the eigenvalues of the Laplacian matrix of G and the average degree of the vertices of G.
Abstract: The energy of a graph G is equal to the sum of the absolute values of the eigenvalues of the adjacency matrix of G, whereas the Laplacian energy of G is equal to the sum of the absolute values of the differences between the eigenvalues of the Laplacian matrix of G and the average degree of the vertices of G. Motivated by the work of Sharafdini et al. [R. Sharafdini, H. Panahbar, Vertex weighted Laplacian graph energy and other topological indices. J. Math. Nanosci. 2016, 6, 49-57], in this paper we investigate the eccentricity version of the Laplacian energy of a graph G.

2 citations
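
The two definitions stated in the abstract translate directly into a short sketch; the code below computes the adjacency energy E(G) and the Laplacian energy LE(G) with NumPy/NetworkX. The eccentricity variant the paper actually investigates is not reproduced here.

```python
import numpy as np
import networkx as nx

def graph_energy(G):
    """E(G) = sum of |eigenvalues| of the adjacency matrix of G."""
    A = nx.to_numpy_array(G)
    return np.sum(np.abs(np.linalg.eigvalsh(A)))

def laplacian_energy(G):
    """LE(G) = sum of |mu_i - 2m/n|, where the mu_i are the Laplacian
    eigenvalues and 2m/n is the average vertex degree."""
    L = nx.laplacian_matrix(G).toarray().astype(float)
    mu = np.linalg.eigvalsh(L)
    avg_deg = 2 * G.number_of_edges() / G.number_of_nodes()
    return np.sum(np.abs(mu - avg_deg))

G = nx.petersen_graph()
print(graph_energy(G), laplacian_energy(G))
```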

Proceedings Article
12 Jul 2020
TL;DR: A novel regularization approach for deep learning that respects the connectivity structure of the neural network's underlying graph, with theoretical support from spectral graph theory and an alternative but equivalent formulation as a structurally weighted L1 penalty.
Abstract: We introduce a novel regularization approach for deep learning that incorporates and respects the underlying graphical structure of the neural network. Existing regularization methods often drop or penalize weights in a global manner that ignores the connectivity structure of the network. We instead propose the Fiedler value of the neural network's underlying graph as a tool for regularization, and provide theoretical support for this choice via spectral graph theory. We list several useful properties of the Fiedler value that make it suitable for regularization, give an approximate variational approach for fast computation during practical training of neural networks, and prove bounds on these approximations. We also provide an alternative but equivalent formulation of this framework as a structurally weighted L1 penalty, thus linking our approach to sparsity induction. Experiments comparing Fiedler regularization with traditional regularization methods such as dropout and weight decay demonstrate its efficacy.

2 citations
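
A minimal sketch of the quantity at the heart of this approach: the Fiedler value, i.e. the second-smallest eigenvalue of the graph Laplacian. The layer-to-graph construction below (a bipartite graph on a dense layer's units, weighted by the magnitudes of its weights) is a hypothetical illustration, not the paper's exact scheme or its fast variational approximation.

```python
import numpy as np

def fiedler_value(W):
    """Second-smallest eigenvalue of the Laplacian of the undirected
    weighted graph with weight matrix |W| + |W|^T."""
    A = np.abs(W) + np.abs(W).T          # symmetric nonnegative weights
    L = np.diag(A.sum(axis=1)) - A       # weighted graph Laplacian
    return np.sort(np.linalg.eigvalsh(L))[1]

# Hypothetical layer graph: a 4->3 dense layer induces a 7-node
# bipartite graph whose edges carry the weight magnitudes.
rng = np.random.default_rng(0)
Wlayer = rng.normal(size=(4, 3))
n_in, n_out = Wlayer.shape
W = np.zeros((n_in + n_out, n_in + n_out))
W[:n_in, n_in:] = Wlayer                 # bipartite block

penalty = fiedler_value(W)               # would be added to the loss
print(penalty)
```

Shrinking weights disconnects the graph and drives the Fiedler value toward zero, which is the intuition linking this penalty to sparsity induction.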


Network Information
Related Topics (5)
Bounded function: 77.2K papers, 1.3M citations, 82% related
Upper and lower bounds: 56.9K papers, 1.1M citations, 82% related
Iterative method: 48.8K papers, 1.2M citations, 81% related
Matrix (mathematics): 105.5K papers, 1.9M citations, 80% related
Optimization problem: 96.4K papers, 2.1M citations, 79% related
Performance Metrics
No. of papers in the topic in previous years:

Year    Papers
2024    1
2023    16
2022    36
2021    53
2020    86
2019    81