scispace - formally typeset
Topic

Spectral graph theory

About: Spectral graph theory is a research topic. Over its lifetime, 1334 publications have been published within this topic, receiving 77373 citations.


Papers
Proceedings Article
16 Jun 2013
TL;DR: The bigraphical lasso is introduced, an estimator for precision matrices of matrix-normals based on the Cartesian product of graphs, a prominent product in spectral graph theory that has appealing properties for regression, enhanced sparsity, and interpretability.
Abstract: The i.i.d. assumption in machine learning is endemic, but often flawed. Complex data sets exhibit partial correlations between both instances and features. A model specifying both types of correlation can have a number of parameters that scales quadratically with the number of features and data points. We introduce the bigraphical lasso, an estimator for precision matrices of matrix-normals based on the Cartesian product of graphs. A prominent product in spectral graph theory, this structure has appealing properties for regression, enhanced sparsity and interpretability. To deal with the parameter explosion we introduce l1 penalties and fit the model through a flip-flop algorithm that results in a linear number of lasso regressions. We demonstrate the performance of our approach with simulations and an example from the COIL image data set.
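The key structural idea above, the Cartesian product of graphs, corresponds to a Kronecker-sum precision matrix for the matrix-normal. A minimal sketch of that structure (the names `psi` and `theta` for the row and column precision matrices are illustrative, not taken from the paper):

```python
import numpy as np

def kronecker_sum(psi: np.ndarray, theta: np.ndarray) -> np.ndarray:
    """Joint precision of a matrix-normal whose dependency graph is the
    Cartesian product of the row graph (psi) and the column graph (theta)."""
    n, p = psi.shape[0], theta.shape[0]
    return np.kron(psi, np.eye(p)) + np.kron(np.eye(n), theta)

psi = np.array([[2.0, -1.0], [-1.0, 2.0]])   # precision over 2 instances
theta = np.array([[3.0, 0.0], [0.0, 3.0]])   # precision over 2 features
omega = kronecker_sum(psi, theta)            # 4x4 joint precision
# The Kronecker sum has n^2 + p^2 free parameters instead of (n*p)^2,
# which is why the model avoids the quadratic parameter explosion.
print(omega.shape)  # (4, 4)
```

Estimating `psi` and `theta` with l1 penalties, as the abstract describes, then keeps both factor graphs sparse while the joint precision stays structured.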

46 citations

Proceedings ArticleDOI
19 Jul 2018
TL;DR: A sublinear time algorithm is presented that, given the ability to query a random node in the graph and select a random neighbor of a given node, computes a succinct representation of an approximation of the spectrum of large networks.
Abstract: The spectrum of a network or graph $G=(V,E)$ with adjacency matrix $A$ consists of the eigenvalues of the normalized Laplacian $L = I - D^{-1/2} A D^{-1/2}$. This set of eigenvalues encapsulates many aspects of the structure of the graph, including the extent to which the graph possesses community structure at multiple scales. We study the problem of approximating the spectrum, $\lambda = (\lambda_1,\dots,\lambda_{|V|})$, of $G$ in the regime where the graph is too large to explicitly calculate the spectrum. We present a sublinear time algorithm that, given the ability to query a random node in the graph and select a random neighbor of a given node, computes a succinct representation of an approximation $\widetilde\lambda = (\widetilde\lambda_1,\dots,\widetilde\lambda_{|V|})$ such that $\|\widetilde\lambda - \lambda\|_1 \le \varepsilon |V|$. Our algorithm has query complexity and running time $\exp(O(1/\varepsilon))$, which is independent of the size of the graph, $|V|$. We demonstrate the practical viability of our algorithm on synthetically generated graphs and on 15 different real-world graphs from the Stanford Large Network Dataset Collection, including social networks, academic collaboration graphs, and road networks. For the smallest of these graphs, we are able to validate the accuracy of our algorithm by explicitly calculating the true spectrum; for the larger graphs, such a calculation is computationally prohibitive. The spectra of these real-world networks reveal insights into the structural similarities and differences between them, illustrating the potential value of our algorithm for efficiently approximating the spectrum of large networks.
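For small graphs, the quantity being approximated can be computed exactly from the definition in the abstract. A minimal sketch of that exact computation (which the paper's sublinear algorithm deliberately avoids, since it never builds the full matrices):

```python
import numpy as np

def normalized_laplacian_spectrum(A: np.ndarray) -> np.ndarray:
    """Exact spectrum of L = I - D^{-1/2} A D^{-1/2} for a small graph."""
    deg = A.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    L = np.eye(A.shape[0]) - d_inv_sqrt @ A @ d_inv_sqrt
    return np.sort(np.linalg.eigvalsh(L))

# Path graph on 3 nodes; normalized-Laplacian eigenvalues lie in [0, 2].
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
print(normalized_laplacian_spectrum(A))  # approximately [0, 1, 2]
```

This exact eigendecomposition costs $O(|V|^3)$ time, which is what makes the spectrum computationally prohibitive for the larger networks mentioned above.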

46 citations

16 Sep 2006
TL;DR: An approach is developed for searching over a nonparametric family of spectral transforms, using convex optimization to maximize kernel alignment to the labeled data; this results in a flexible family of kernels that is more data-driven than the standard parametric spectral transforms.
Abstract: Many graph-based semi-supervised learning methods can be viewed as imposing smoothness conditions on the target function with respect to a graph representing the data points to be labeled. The smoothness properties of the functions are encoded in terms of Mercer kernels over the graph. The central quantity in such regularization is the spectral decomposition of the graph Laplacian, a matrix derived from the graph's edge weights. The eigenvectors with small eigenvalues are smooth, and ideally represent large cluster structures within the data. The eigenvectors having large eigenvalues are rugged, and considered noise. Different weightings of the eigenvectors of the graph Laplacian lead to different measures of smoothness. Such weightings can be viewed as spectral transforms, that is, as transformations of the standard eigenspectrum that lead to different regularizers over the graph. Familiar kernels, such as the diffusion kernel resulting from solving a discrete heat equation on the graph, can be seen as simple parametric spectral transforms. The question naturally arises whether one can obtain effective spectral transforms automatically. In this paper we develop an approach to searching over a nonparametric family of spectral transforms by using convex optimization to maximize kernel alignment to the labeled data. Order constraints are imposed to encode a preference for smoothness with respect to the graph structure. This results in a flexible family of kernels that is more data-driven than the standard parametric spectral transforms. Our approach relies on a quadratically constrained quadratic program (QCQP), and is computationally practical for large datasets.
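The diffusion kernel mentioned above is a concrete example of a parametric spectral transform: each Laplacian eigenvalue $\lambda_i$ is reweighted by $r(\lambda_i) = e^{-t\lambda_i}$, suppressing the rugged (large-eigenvalue) eigenvectors. A minimal sketch, with the graph and the bandwidth `t` chosen purely for illustration:

```python
import numpy as np

def diffusion_kernel(A: np.ndarray, t: float) -> np.ndarray:
    """Diffusion kernel K = sum_i exp(-t * lambda_i) v_i v_i^T,
    built from the eigenpairs of the combinatorial graph Laplacian."""
    L = np.diag(A.sum(axis=1)) - A
    lam, V = np.linalg.eigh(L)
    # The spectral transform r(lambda) = exp(-t * lambda) downweights
    # high-frequency eigenvectors, encoding smoothness over the graph.
    return V @ np.diag(np.exp(-t * lam)) @ V.T

A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)   # small star graph
K = diffusion_kernel(A, t=0.5)
print(np.allclose(K, K.T))  # True: a symmetric PSD Mercer kernel
```

The paper's contribution is to replace the fixed form $r(\lambda) = e^{-t\lambda}$ with nonparametric weights learned by convex optimization, subject to order constraints that preserve the smoothness preference.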

46 citations

Journal ArticleDOI
TL;DR: In this paper, the spectral characterization problem for lollipop graphs was extended to signed graphs, with respect to both the adjacency matrix and the Laplacian matrix, and it was shown that a signed lollipop graph is determined by the spectrum of its Laplacian matrix.

46 citations

Proceedings ArticleDOI
04 May 2014
TL;DR: It is proved that LFVC can be related to a monotonic submodular set function that guarantees that greedy node or edge removals come within a factor 1-1/e of the optimal non-greedy batch removal strategy.
Abstract: In this paper, a new centrality called local Fiedler vector centrality (LFVC) is proposed to analyze the connectivity structure of a graph. It is associated with the sensitivity of algebraic connectivity to node or edge removals and features distributed computation via the associated graph Laplacian matrix. We prove that LFVC can be related to a monotonic submodular set function that guarantees that greedy node or edge removals come within a factor 1 - 1/e of the optimal non-greedy batch removal strategy. Due to the close relationship between graph topology and community structure, we use LFVC to detect deep and overlapping communities on real-world social network datasets. The results offer new insights on community detection by discovering new significant communities and key members in the network. Notably, LFVC is also shown to significantly outperform other well-known centralities for community detection. We apply this approach to detect significant communities and key members in the network, which we refer to as deep and overlapping community detection. For instance, when our proposed approach is applied to the network scientist coauthorship dataset, we show that a zoologist is correctly identified as an outlier node during the detection process, since the authors are mostly physicists, thus leading to the revelation of new community structures. Local Fiedler vector centrality (LFVC) is proposed to evaluate the connectivity structure of a graph based on spectral graph theory (13). LFVC is associated with an upper bound on algebraic connectivity (14) when a subset of nodes or edges is removed from a graph. We show that LFVC relates to a monotonic submodular set function such that greedy node or edge removals can be employed with bounded performance loss relative to the optimal non-greedy batch removal strategy. Moreover, LFVC can be computed in a distributed manner and is applicable to large-scale network analysis.
We apply this method to real-world social network datasets and compare to the modularity method and other well-known centralities.
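An illustrative sketch in the spirit of LFVC, not the paper's exact definition: score each node by the sum of squared differences of Fiedler-vector entries across its incident edges, so that nodes whose removal most disturbs algebraic connectivity score highest. The bowtie graph below is a made-up example:

```python
import numpy as np

def fiedler_vector(A: np.ndarray) -> np.ndarray:
    """Eigenvector of the graph Laplacian for the second-smallest
    eigenvalue (the algebraic connectivity)."""
    L = np.diag(A.sum(axis=1)) - A
    _, V = np.linalg.eigh(L)
    return V[:, 1]

def node_scores(A: np.ndarray) -> np.ndarray:
    """score(i) = sum over neighbors j of (y_i - y_j)^2, where y is
    the Fiedler vector -- a simplified Fiedler-based centrality."""
    y = fiedler_vector(A)
    n = len(y)
    return np.array([sum((y[i] - y[j]) ** 2 for j in range(n) if A[i, j])
                     for i in range(n)])

# Bowtie graph: two triangles joined at node 2, the cut vertex.
A = np.zeros((5, 5))
for i, j in [(0, 1), (0, 2), (1, 2), (2, 3), (2, 4), (3, 4)]:
    A[i, j] = A[j, i] = 1.0

scores = node_scores(A)
print(int(np.argmax(scores)))  # 2: the bridge node gets the top score
```

The cut vertex sits where the Fiedler vector changes sign, so the squared differences across its edges are large; this is the intuition behind using Fiedler-vector sensitivity to find nodes critical to connectivity.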

46 citations


Network Information
Related Topics (5)
Bounded function
77.2K papers, 1.3M citations
82% related
Upper and lower bounds
56.9K papers, 1.1M citations
82% related
Iterative method
48.8K papers, 1.2M citations
81% related
Matrix (mathematics)
105.5K papers, 1.9M citations
80% related
Optimization problem
96.4K papers, 2.1M citations
79% related
Performance
Metrics
No. of papers in the topic in previous years
Year	Papers
2024	1
2023	16
2022	36
2021	53
2020	86
2019	81