
Showing papers in "Internet Mathematics in 2016"


Book ChapterDOI
TL;DR: This work surveys recent results on temporal graphs and temporal graph problems that have appeared in the Computer Science community.
Abstract: A temporal graph is, informally speaking, a graph that changes with time. When time is discrete and only the relationships between the participating entities may change and not the entities themselves, a temporal graph may be viewed as a sequence \(G_1, G_2, \ldots, G_l\) of static graphs over the same (static) set of nodes V. Though static graphs have been extensively studied, for their temporal generalization we are still far from having a concrete set of structural and algorithmic principles. Recent research shows that many graph properties and problems become radically different and usually substantially more difficult when an extra time dimension is added to them. Moreover, there is already a rich and rapidly growing set of modern systems and applications that can be naturally modeled and studied via temporal graphs. This further motivates the need for the development of a temporal extension of graph theory. We survey here recent results on temporal graphs and temporal graph problems that have appeared in the Computer Science community.

118 citations
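To make the snapshot view concrete, here is a minimal Python sketch (not taken from the survey) that stores a temporal graph as a sequence of edge sets over a fixed node set and computes which nodes are reachable from a source by a time-respecting path. The function name `temporal_reachable` and the at-most-one-hop-per-time-step semantics are illustrative assumptions.

```python
# A minimal sketch: a temporal graph as a sequence G_1, ..., G_l of
# static edge sets, with a time-respecting reachability check
# (assumed semantics: at most one hop per time step).

def temporal_reachable(snapshots, source):
    """Return the nodes reachable from `source` by a time-respecting
    path that uses an edge of snapshot G_t only at step t."""
    reached = {source}
    for edges in snapshots:            # edges of snapshot G_t
        newly = set()
        for u, v in edges:             # undirected temporal edges
            if u in reached:
                newly.add(v)
            if v in reached:
                newly.add(u)
        reached |= newly
    return reached

# Example: node 1 reaches node 4 because the edge (1, 4) appears in G_3.
print(temporal_reachable([{(1, 2)}, {(2, 3)}, {(1, 4)}], 1))  # {1, 2, 3, 4}
```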


Journal ArticleDOI
TL;DR: The burning number, introduced in this paper, measures the speed of the spread of contagion in a graph: the lower the burning number, the faster the contagion spreads. It is computed for several graph classes and derived for the graphs generated by the Iterated Local Transitivity model for social networks.
Abstract: We introduce a new graph parameter called the burning number, inspired by contact processes on graphs such as graph bootstrap percolation, and graph searching paradigms such as Firefighter. The burning number measures the speed of the spread of contagion in a graph; the lower the burning number, the faster the contagion spreads. We provide a number of properties of the burning number, including characterizations and bounds. The burning number is computed for several graph classes, and is derived for the graphs generated by the Iterated Local Transitivity model for social networks.

60 citations
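The burning process is simple enough to simulate directly. The following sketch (illustrative only, exponential time) brute-forces the burning number of a small graph by trying source sequences of increasing length; in each round the fire first spreads from all burned vertices and then a new source is ignited.

```python
from itertools import permutations

def burning_number(adj):
    """Brute-force the burning number of a small graph given as an
    adjacency dict {v: set_of_neighbours}.  For tiny graphs only."""
    nodes = list(adj)
    for k in range(1, len(nodes) + 1):
        for seq in permutations(nodes, k):
            burned = set()
            for i, s in enumerate(seq):
                if i > 0:   # fire spreads before the next source is lit
                    burned |= {w for v in burned for w in adj[v]}
                burned.add(s)
            if burned == set(nodes):
                return k
    return len(nodes)

# Example: a path on 4 vertices has burning number 2 (sources 1 and 3).
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(burning_number(path4))   # -> 2
```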


Journal ArticleDOI
TL;DR: This article considers the global clustering coefficient of random graphs on the hyperbolic plane, proposed recently by Krioukov and colleagues as a mathematical model of complex networks, under the fundamental assumption thathyperbolic geometry underlies the structure of these networks.
Abstract: Clustering is a fundamental property of complex networks and it is the mathematical expression of a ubiquitous phenomenon that arises in various types of self-organized networks such as biological networks, computer networks, or social networks. In this article, we consider what is called the global clustering coefficient of random graphs on the hyperbolic plane. This model of random graphs was proposed recently by Krioukov and colleagues as a mathematical model of complex networks, under the fundamental assumption that hyperbolic geometry underlies the structure of these networks. We give a rigorous analysis of clustering and characterize the global clustering coefficient in terms of the parameters of the model. We show how the global clustering coefficient can be tuned by these parameters and we give an explicit formula for this function.

39 citations
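For readers who want to experiment, a minimal sampler for the threshold version of the hyperbolic random graph model fits in a few lines. The parameterisation below (disk radius R, radial density controlled by alpha) follows the standard presentation of the model; the function name `hyperbolic_random_graph` and the defaults are our own choices, not code from the paper.

```python
import math, random

def hyperbolic_random_graph(n, R, alpha=1.0, seed=0):
    """Sample n points in the hyperbolic disk of radius R (radial
    density proportional to sinh(alpha * r), uniform angle) and join
    two points whenever their hyperbolic distance is at most R."""
    rng = random.Random(seed)
    pts = []
    for _ in range(n):
        u = rng.random()   # inverse-CDF sampling of the radial coordinate
        r = math.acosh(1 + u * (math.cosh(alpha * R) - 1)) / alpha
        pts.append((r, rng.uniform(0, 2 * math.pi)))
    edges = []
    for i in range(n):
        for j in range(i + 1, n):
            (r1, t1), (r2, t2) = pts[i], pts[j]
            dt = math.pi - abs(math.pi - abs(t1 - t2))   # angular difference
            cosh_d = (math.cosh(r1) * math.cosh(r2)
                      - math.sinh(r1) * math.sinh(r2) * math.cos(dt))
            if math.acosh(max(cosh_d, 1.0)) <= R:
                edges.append((i, j))
    return pts, edges
```

The global clustering coefficient of such a sample can then be computed directly and compared against the explicit formula derived in the paper.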


Journal ArticleDOI
TL;DR: In this paper, the first detailed empirical evaluation of the use of tree decomposition (TD) heuristics for structure identification and extraction in social graphs is presented, showing that TD methods can identify structures that correlate strongly with the core-periphery structure of realistic networks.
Abstract: Recent work has established that large informatics graphs such as social and information networks have non-trivial tree-like structure when viewed at moderate size scales. Here, we present results from the first detailed empirical evaluation of the use of tree decomposition (TD) heuristics for structure identification and extraction in social graphs. Although TDs have historically been used in structural graph theory and scientific computing, we show that—even with existing TD heuristics developed for those very different areas—TD methods can identify interesting structure in a wide range of realistic informatics graphs. Our main contributions are the following: we show that TD methods can identify structures that correlate strongly with the core-periphery structure of realistic networks, even when using simple greedy heuristics; we show that the peripheral bags of these TDs correlate well with low-conductance communities (when they exist) found using local spectral computations; and we show that ...

27 citations
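A flavour of the TD heuristics evaluated in such studies is the greedy min-degree elimination ordering: eliminating a minimum-degree vertex and turning its neighbourhood into a clique yields one bag per vertex and an upper bound on the treewidth. The sketch below is a generic illustration of this idea, not the specific heuristics or tree construction used in the paper.

```python
def min_degree_bags(adj):
    """Greedy min-degree elimination: repeatedly eliminate a vertex of
    minimum degree and turn its neighbourhood into a clique.  Each
    eliminated vertex contributes a bag (itself plus its neighbours at
    elimination time); max bag size minus one upper-bounds the treewidth."""
    g = {v: set(ns) for v, ns in adj.items()}
    bags = []
    while g:
        v = min(g, key=lambda x: len(g[x]))
        nbrs = set(g[v])
        bags.append({v} | nbrs)
        for a in nbrs:                      # fill in the clique, drop v
            g[a] |= nbrs - {a}
            g[a].discard(v)
        del g[v]
    return bags

# Width upper bound for a 4-cycle: largest bag has size 3, so width <= 2.
cycle4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(max(len(b) for b in min_degree_bags(cycle4)) - 1)
```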


Journal ArticleDOI
TL;DR: In this article, the authors propose the first betweenness centrality approximation algorithms with a provable guarantee on the maximum approximation error for dynamic networks, building on new intermediate results that include a fully dynamic algorithm for approximating the vertex diameter and a faster algorithm for updating single-source shortest paths in unweighted graphs.
Abstract: Betweenness is a well-known centrality measure that ranks the nodes of a network according to their participation in shortest paths. Because exact computations are prohibitive in large networks, several approximation algorithms have been proposed. Besides that, recent years have seen the publication of dynamic algorithms for efficient recomputation of betweenness in networks that change over time. In this article, we propose the first betweenness centrality approximation algorithms with a provable guarantee on the maximum approximation error for dynamic networks. Several new intermediate algorithmic results contribute to the respective approximation algorithms: (i) new upper bounds on the vertex diameter, (ii) the first fully dynamic algorithm for updating an approximation of the vertex diameter in undirected graphs, and (iii) an algorithm with lower time complexity for updating single-source shortest paths in unweighted graphs after a batch of edge actions. Using approximation, our algorithms are t...

26 citations
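As background for the approximation guarantees discussed above, the following sketch shows the static sampling idea: sample ordered pairs (s, t), sample one shortest s-t path uniformly at random via BFS path counts, and count interior vertices, which gives an unbiased estimate of normalised betweenness. The dynamic machinery of the paper (vertex-diameter bounds, batch updates) is not reproduced here, and `approx_betweenness` is a name of our choosing.

```python
import random
from collections import deque, defaultdict

def approx_betweenness(adj, samples=1000, seed=0):
    """Monte-Carlo estimate of normalised betweenness centrality:
    sample ordered pairs (s, t), sample a uniform shortest s-t path,
    and count how often each vertex appears as an interior vertex."""
    rng = random.Random(seed)
    nodes = list(adj)
    counts = defaultdict(float)
    for _ in range(samples):
        s, t = rng.sample(nodes, 2)
        # BFS from s: distances, shortest-path counts, predecessors
        dist, sigma, preds = {s: 0}, {s: 1}, defaultdict(list)
        q = deque([s])
        while q:
            u = q.popleft()
            for w in adj[u]:
                if w not in dist:
                    dist[w] = dist[u] + 1
                    sigma[w] = 0
                    q.append(w)
                if dist[w] == dist[u] + 1:
                    sigma[w] += sigma[u]
                    preds[w].append(u)
        if t not in dist:
            continue                      # unreachable pair contributes zero
        v = t                             # walk back along one uniform path
        while v != s:
            ps = preds[v]
            v = rng.choices(ps, weights=[sigma[p] for p in ps])[0]
            if v != s:
                counts[v] += 1.0 / samples
    return counts                         # counts[v] estimates normalised betweenness
```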


Journal ArticleDOI
TL;DR: This work introduces a new sampling method that guarantees termination while achieving speed comparable to the MCMC method for creating random realizations of very large degree sequences.
Abstract: We examine the problem of creating random realizations of very large degree sequences. Although fast in practice, the Markov chain Monte Carlo (MCMC) method for selecting a realization has limited usefulness for creating large graphs because of memory constraints. Instead, we focus on sequential importance sampling (SIS) schemes for random graph creation. A difficulty with SIS schemes is assuring that they terminate in a reasonable amount of time. We introduce a new sampling method by which we guarantee termination while achieving speed comparable to the MCMC method.

20 citations
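For contrast with the randomised SIS schemes discussed in the paper, here is the deterministic Havel-Hakimi construction, which greedily realises a graphical degree sequence; SIS samplers replace the deterministic partner choice with a randomised one while still guaranteeing termination. This baseline is only an illustration, not the paper's method.

```python
def havel_hakimi(degrees):
    """Deterministic Havel-Hakimi realization of a graphical degree
    sequence; returns an edge list or raises ValueError otherwise."""
    remaining = sorted(((d, i) for i, d in enumerate(degrees)), reverse=True)
    edges = []
    while remaining and remaining[0][0] > 0:
        d, v = remaining.pop(0)           # vertex with largest residual degree
        if d > len(remaining):
            raise ValueError("degree sequence is not graphical")
        partners = remaining[:d]          # connect to the d largest residuals
        for j, (dp, u) in enumerate(partners):
            if dp == 0:
                raise ValueError("degree sequence is not graphical")
            edges.append((v, u))
            partners[j] = (dp - 1, u)
        remaining[:d] = partners
        remaining.sort(reverse=True)
    return edges

# Example: one realization of the sequence (3, 2, 2, 2, 1).
print(havel_hakimi([3, 2, 2, 2, 1]))
```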


Journal ArticleDOI
TL;DR: In this article, the authors study the behavior of network diffusions based on the PageRank random walk from a set of seed nodes, and propose a new method that quickly approximates the result of the diffusion for all values of this parameter.
Abstract: We study the behaviour of network diffusions based on the PageRank random walk from a set of seed nodes. These diffusions are known to reveal small, localized clusters (or communities), and also large macro-scale clusters by varying a parameter that has a dual-interpretation as an accuracy bound and as a regularization level. We propose a new method that quickly approximates the result of the diffusion for all values of this parameter. Our method efficiently generates an approximate solution path or regularization path associated with a PageRank diffusion, and it reveals cluster structures at multiple size-scales between small and large. We formally prove a runtime bound on this method that is independent of the size of the network, and we investigate multiple optimizations to our method that can be more practical in some settings. We demonstrate that these methods identify refined clustering structure on a number of real-world networks with up to 2 billion edges.

18 citations
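The diffusion being swept over is the personalised PageRank vector, which is commonly approximated by the push procedure of Andersen, Chung, and Lang. The sketch below shows that baseline (with eps playing the dual accuracy/regularisation role mentioned above), not the path-approximation method proposed in the paper; the parameter defaults are illustrative.

```python
from collections import deque

def approximate_ppr(adj, seed_node, alpha=0.15, eps=1e-4):
    """Push-style approximation of the personalised PageRank vector
    from a single seed (lazy-walk variant).  Smaller eps means a more
    accurate, less localised solution."""
    p = {}                               # approximate PageRank mass
    r = {seed_node: 1.0}                 # residual mass
    queue = deque([seed_node])
    while queue:
        u = queue.popleft()
        deg = len(adj[u])
        if deg == 0 or r.get(u, 0.0) < eps * deg:
            continue                     # stale or below threshold
        ru = r[u]
        p[u] = p.get(u, 0.0) + alpha * ru
        r[u] = (1 - alpha) * ru / 2
        push = (1 - alpha) * ru / (2 * deg)
        for w in adj[u]:
            r[w] = r.get(w, 0.0) + push
            if r[w] >= eps * len(adj[w]):
                queue.append(w)
        if r[u] >= eps * deg:
            queue.append(u)
    return p
```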


Journal ArticleDOI
TL;DR: The expected value and variance of the time of the first return of a random walk decrease with increasing vertex weight, so for a given time budget, returns to high-weight vertices should give the best property estimates.
Abstract: We study the use of random walks as an efficient estimator of global properties of large undirected graphs, for example the number of edges, vertices, triangles, and generally, the number of small fixed subgraphs. We consider two methods based on first returns of random walks: the cycle formula of regenerative processes and weighted random walks with edge weights defined by the property under investigation. We review the theoretical foundations for these methods, and indicate how they can be adapted for the general non-intrusive investigation of large online networks.

12 citations
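The cycle-formula idea can be seen in a toy estimator: on a connected undirected graph the stationary probability of vertex v is deg(v)/(2m), so the expected first-return time to v is 2m/deg(v), and averaging observed return times yields an estimate of the number of edges m. The weighted-walk variants from the paper are not shown; `estimate_edges` is a hypothetical name.

```python
import random

def estimate_edges(adj, v, num_returns=10000, seed=0):
    """Estimate the number of edges m of a connected undirected graph
    from first returns of a simple random walk to vertex v, using
    E[first return time to v] = 2m / deg(v)."""
    rng = random.Random(seed)
    total_steps, returns, cur = 0, 0, v
    while returns < num_returns:
        cur = rng.choice(list(adj[cur]))
        total_steps += 1
        if cur == v:
            returns += 1
    mean_return = total_steps / num_returns
    return len(adj[v]) * mean_return / 2

# Example: a triangle has m = 3 edges; the estimate should be close to 3.
triangle = {0: {1, 2}, 1: {0, 2}, 2: {0, 1}}
print(estimate_edges(triangle, 0))
```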


Journal ArticleDOI
TL;DR: In this article, the authors presented a detailed analysis of the global clustering coefficient in scale-free graphs and analyzed the clustering coefficients for both weighted and unweighted graphs.
Abstract: In this article, we present a detailed analysis of the global clustering coefficient in scale-free graphs. Many observed real-world networks of diverse nature have a power-law degree distribution. Moreover, the observed degree distribution usually has an infinite variance. Therefore, we are especially interested in such degree distributions. In addition, we analyze the clustering coefficient for both weighted and unweighted graphs. There are two well-known definitions of the clustering coefficient of a graph: the global and the average local clustering coefficients. There are several models proposed in the literature for which the average local clustering coefficient tends to a positive constant as a graph grows. However, there are no models of scale-free networks with an infinite variance of the degree distribution and with an asymptotically constant global clustering coefficient. Models with constant global clustering and finite variance were also proposed. Therefore, in this work we focus only ...

12 citations
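The two definitions being contrasted are easy to state in code: the global clustering coefficient is the fraction of wedges (paths of length two) that are closed, while the average local clustering coefficient averages the per-vertex closure ratios. A small exact computation for reference (our own illustration, suitable only for small graphs):

```python
from itertools import combinations

def clustering_coefficients(adj):
    """Return (global, average local) clustering coefficients of a
    small undirected graph given as {v: set_of_neighbours}."""
    closed_wedges = 0   # each triangle is counted once per centre, i.e. 3 times
    wedges = 0
    local = []
    for v, nbrs in adj.items():
        k = len(nbrs)
        wedges += k * (k - 1) // 2
        closed = sum(1 for a, b in combinations(nbrs, 2) if b in adj[a])
        closed_wedges += closed
        local.append(closed / (k * (k - 1) / 2) if k >= 2 else 0.0)
    global_cc = closed_wedges / wedges if wedges else 0.0
    return global_cc, sum(local) / len(local)

# A triangle has both coefficients equal to 1.
print(clustering_coefficients({0: {1, 2}, 1: {0, 2}, 2: {0, 1}}))
```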


Journal ArticleDOI
TL;DR: In this paper, the authors studied the behavior of the spatial preferential attachment (SPA) model when the distribution of the nodes is nonuniform, and proved precise theoretical results with regard to the degree of a node, the number of common neighbors, and the average out-degree in a region.
Abstract: The spatial preferential attachment (SPA) is a model for complex networks. In the SPA model, nodes are embedded in a metric space, and each node has a sphere of influence whose size increases if the node gains an in-link, and otherwise decreases with time. In this work, we study the behavior of the SPA model when the distribution of the nodes is nonuniform. Specifically, the space is divided into dense and sparse regions, where it is assumed that the dense regions correspond to coherent communities. We prove precise theoretical results with regard to the degree of a node, the number of common neighbors, and the average out-degree in a region. Moreover, we show how these theoretically derived results about the graph properties of the model can be used to formulate a reliable estimator for the distance between certain pairs of nodes, and to estimate the density of the region containing a given node.

10 citations
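A simplified generator conveys the SPA mechanism: spheres of influence grow with in-degree and shrink with time, and a newly placed node links to every node whose sphere it falls into with probability p. The sketch below uses uniform placement on the 2-dimensional unit torus and illustrative parameters A1, A2, p, whereas the paper's focus is precisely the nonuniform case.

```python
import math, random

def spa_model(n, p=0.7, A1=1.0, A2=1.0, seed=0):
    """Sketch of the SPA model on the unit torus [0,1)^2: at time t the
    sphere of influence of node u has area min((A1*indeg(u) + A2)/t, 1),
    and a new node links (directed, new -> old) to u with probability p
    whenever it falls inside u's sphere of influence."""
    rng = random.Random(seed)
    pos, indeg, edges = [], [], []
    for t in range(1, n + 1):
        x, y = rng.random(), rng.random()
        for u in range(len(pos)):
            area = min((A1 * indeg[u] + A2) / t, 1.0)
            radius = math.sqrt(area / math.pi)
            dx = min(abs(x - pos[u][0]), 1 - abs(x - pos[u][0]))  # torus metric
            dy = min(abs(y - pos[u][1]), 1 - abs(y - pos[u][1]))
            if dx * dx + dy * dy <= radius * radius and rng.random() < p:
                edges.append((t - 1, u))
                indeg[u] += 1
        pos.append((x, y))
        indeg.append(0)
    return pos, edges
```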


Journal ArticleDOI
TL;DR: This work investigates the existence of pure Nash equilibria for at least three players on different classes of graphs including paths, cycles, grid graphs, and hypercubes, and answers an open question by proving that there is no Nash equilibrium for three players on sufficiently large grids.
Abstract: We study competitive diffusion games on graphs introduced by Alon et al. [1] to model the spread of influence in social networks. Extending results of Roshanbin [7] for two players, we investigate the existence of pure Nash equilibria for at least three players on different classes of graphs including paths, cycles, and grid graphs. As a main result, we answer an open question proving that there is no Nash equilibrium for three players on \(m\times n\) grids with \(\min \{m,n\}\ge 5\).
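The payoff structure of competitive diffusion is easy to simulate, which is convenient for checking candidate equilibria on small graphs: each player seeds one vertex, uncoloured vertices adjacent to exactly one colour adopt it, and vertices contested by several colours are blocked. The sketch below is a generic simulator written for illustration, not code from the paper.

```python
from collections import defaultdict

def diffusion_payoffs(adj, seeds):
    """Simulate competitive diffusion from one seed vertex per player
    and return the number of vertices finally held by each player."""
    GRAY = -1                                   # blocked vertices
    colour = {}
    for player, v in enumerate(seeds):
        colour[v] = GRAY if v in colour else player
    changed = True
    while changed:
        changed = False
        claims = defaultdict(set)
        for v in adj:
            if v in colour:
                continue
            for w in adj[v]:
                c = colour.get(w)
                if c is not None and c != GRAY:
                    claims[v].add(c)
        for v, cs in claims.items():
            colour[v] = cs.pop() if len(cs) == 1 else GRAY
            changed = True
    payoffs = [0] * len(seeds)
    for c in colour.values():
        if c != GRAY:
            payoffs[c] += 1
    return payoffs

# Two players on a path of 5 vertices, seeding the endpoints: the middle
# vertex is contested and blocked, so each player ends with 2 vertices.
path5 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3}}
print(diffusion_payoffs(path5, (0, 4)))   # [2, 2]
```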

Journal ArticleDOI
TL;DR: The theory demonstrates that security of networks can be achieved by merging natural selection and combinatorial principles, and that both the natural selection principle and the combinatorial principles are essential to the security of networks.
Abstract: We propose a definition of security of networks against the cascading failure models of deliberate attacks. We propose a model of networks based on the natural selection of homophyly/kinship, randomness, and preferential attachment, referred to as the security model. We show that the networks generated by the security model are provably secure against any attacks of size poly(log n) under the cascading failure models. The underlying reasons are the principles of natural selection and the combinatorial principles of the networks of the security model, including a power law, a self-organizing principle, a small diameter property, a local navigation law, a degree priority principle, an inclusion-exclusion principle, and an infection priority tree principle. Furthermore, we show that the networks generated by the security model have an expander core. This property ensures that the networks of the security model satisfy the requirement of global communications in engineering. Based on our theory, we ...

Journal ArticleDOI
TL;DR: For a random intersection graph with a power law degree sequence having a finite mean and an infinite variance, the authors showed that the global clustering coefficient admits a tunable asymptotic distribution.
Abstract: For a random intersection graph with a power law degree sequence having a finite mean and an infinite variance we show that the global clustering coefficient admits a tunable asymptotic distribution.
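For readers unfamiliar with the model, a random intersection graph can be generated by giving every vertex a random set of attributes and joining two vertices whenever their sets intersect; a heavy-tailed choice of the per-vertex attribute probabilities produces power-law degree sequences of the kind studied here. The construction below is one common variant, written purely as an illustration.

```python
import random
from itertools import combinations

def random_intersection_graph(n, m, weights, seed=0):
    """Each of n vertices includes each of m attributes independently
    with probability weights[v]; two vertices are joined when their
    attribute sets intersect."""
    rng = random.Random(seed)
    attrs = [{a for a in range(m) if rng.random() < weights[v]}
             for v in range(n)]
    return [(u, v) for u, v in combinations(range(n), 2)
            if attrs[u] & attrs[v]]
```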

Journal ArticleDOI
TL;DR: A randomized algorithm is presented that can, in O(log n) rounds, detect and reach consensus about the health of the leader (i.e., whether it is able to maintain good communication with the rest of the network), and that guarantees with high probability that there is at most one leader at any time.
Abstract: We investigate the problem of electing a leader in a sparse but well-connected synchronous dynamic network in which up to a fraction of the nodes chosen adversarially can leave/join the network per time step. At this churn rate, all nodes in the network can be replaced by new nodes in a constant number of rounds. Moreover, the adversary can shield a fraction of the nodes (which may include the leader) by repeatedly churning their neighborhood and, thus, hindering their communication with the rest of the network. However, empirical studies in peer-to-peer networks have shown that a significant fraction of the nodes are usually stable and well connected. It is, therefore, natural to take advantage of such stability to establish a leader that can maintain good communication with the rest of the nodes. Because the dynamics could change eventually, it is also essential to reelect a new leader whenever the current leader either has left the network or is not well-connected with the rest of the nodes. In su...

Journal ArticleDOI
TL;DR: A strategic game model for the Firefighter Problem is introduced to tackle its complexity from a different angle and it turns out that it is possible to compute an equilibrium in polynomial time, even for constant-size coalitions.
Abstract: The Firefighter Problem was proposed in 1995 as a deterministic discrete-time model for the spread and containment of a fire. The problem is defined on an undirected finite graph G = (V, E), where fire breaks out initially at f nodes. In each subsequent time-step, two actions occur: a certain number b of firefighters are placed on nonburning nodes, permanently protecting them from the fire, and then the fire spreads to all nondefended neighbors of the nodes on fire. Because the graph is finite, at some point each node is either on fire or saved, and thus the fire cannot spread further. One of the objectives for the problem is to place the firefighters in such a way that the number of saved nodes is maximized. The applications of the Firefighter Problem range from real fires to the spreading of diseases and the containment of floods. Furthermore, it can be used to model the spread of computer viruses or viral marketing in communication networks. Most research on the problem, including this work, considers the case in which the fire starts in a single place (i.e., f = 1) and in which the budget of available firefighters per time-step is one (i.e., b = 1). This configuration already leads to hard problems; even in this case, the problem is known to be NP-hard. In this work, we study the problem from a game-theoretical perspective. We introduce a strategic game model for the Firefighter Problem to tackle its complexity from a different angle. We refer to it as the Firefighter Game. Such a game-based context seems very appropriate when applied to large networks where entities may act and make decisions based on their own interests, without global coordination. At every time-step of the game, a player decides whether to place a new firefighter in a nonburning node of the graph. If so, he must decide where to place it. By placing it, the player is indirectly deciding which nodes to protect at that time-step. We define different utility functions in order to model selfish and nonselfish scenarios, which lead to equivalent games. We show that the Price of Anarchy (PoA) is linear for a particular family of graphs, but it is at most two for trees. We also analyze the quality of the equilibria when coalitions among players are allowed. It turns out that it is possible to compute an equilibrium in polynomial time, even for constant-size coalitions. This yields a polynomial-time approximation algorithm for the problem, and its approximation ratio equals the PoA of the corresponding game. We show that for some specific topologies, the PoA is constant when constant-size coalitions are considered.
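The process itself (with f = 1 fire source and budget b = 1) is straightforward to simulate, and such a simulator is all that is needed to evaluate the outcome of a strategy, whether it comes from a centralised algorithm or from the players of the Firefighter Game. A minimal sketch, with the hypothetical name `firefighter`:

```python
def firefighter(adj, fire_sources, protect_sequence):
    """Simulate the Firefighter process with b = 1: in each round one
    node from `protect_sequence` is protected (if not already burning),
    then the fire spreads to every unprotected neighbour of a burning
    node.  Returns the set of saved nodes."""
    burning = set(fire_sources)
    protected = set()
    step = 0
    while True:
        if step < len(protect_sequence):
            cand = protect_sequence[step]
            if cand not in burning:
                protected.add(cand)
        spread = {w for v in burning for w in adj[v]
                  if w not in burning and w not in protected}
        if not spread:
            break
        burning |= spread
        step += 1
    return set(adj) - burning

# Example: fire at one end of a path; protecting its neighbour saves the rest.
path4 = {0: {1}, 1: {0, 2}, 2: {1, 3}, 3: {2}}
print(firefighter(path4, [0], [1]))   # {1, 2, 3}
```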

Journal ArticleDOI
TL;DR: LiveRank as discussed by the authors is a ranking of the old pages so that active nodes are more likely to appear first, and the quality of a LiveRank is measured by the number of queries necessary to identify a given fraction of the active nodes when using the LiveRank order.
Abstract: This paper considers the problem of refreshing a dataset. More precisely, given a collection of nodes gathered at some time (Web pages, users from an online social network) along with some structure (hyperlinks, social relationships), we want to identify a significant fraction of the nodes that still exist at present time. The liveness of an old node can be tested through an online query at present time. We call LiveRank a ranking of the old pages so that active nodes are more likely to appear first. The quality of a LiveRank is measured by the number of queries necessary to identify a given fraction of the active nodes when using the LiveRank order. We study different scenarios from a static setting where the LiveRank is computed before any query is made, to dynamic settings where the LiveRank can be updated as queries are processed. Our results show that building on the PageRank can lead to efficient LiveRanks, for Web graphs as well as for online social networks.
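The quality measure described above reduces to a few lines of code: walk the LiveRank order, query nodes one at a time, and count the queries needed to uncover a given fraction of the active nodes. The helper below (`liverank_cost`, a name of our choosing) assumes the set of active nodes is known, as it would be in an offline evaluation.

```python
def liverank_cost(order, active, fraction=0.9):
    """Number of queries needed, following `order`, to find the given
    fraction of the nodes in the set `active`."""
    target = fraction * len(active)
    found = 0
    for queries, node in enumerate(order, start=1):
        if node in active:
            found += 1
        if found >= target:
            return queries
    return len(order)
```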

Journal Article
TL;DR: In this paper, the role of geometry in bootstrap percolation is analyzed mathematically for geometric scale-free networks, and the authors show that the process exhibits a phase transition in terms of the initial infection rate in the seed region.
Abstract: Geometric inhomogeneous random graphs (GIRGs) are a model for scale-free networks with underlying geometry. We study bootstrap percolation on these graphs, which is a process modelling the spread of an infection of vertices starting within a (small) local region. We show that the process exhibits a phase transition in terms of the initial infection rate in this region. We determine the speed of the process in the supercritical case, up to lower order terms, and show that its evolution is fundamentally influenced by the underlying geometry. For vertices with given position and expected degree, we determine the infection time up to lower order terms. Finally, we show how this knowledge can be used to contain the infection locally by removing relatively few edges from the graph. This is the first time that the role of geometry on bootstrap percolation is analysed mathematically for geometric scale-free networks.
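Bootstrap percolation itself is a simple threshold process: a vertex becomes infected once at least k of its neighbours are infected, and infections are permanent. The sketch below simulates the process on an arbitrary graph (the GIRG geometry, which drives the paper's results, is not modelled here); the threshold k and the function name are illustrative assumptions.

```python
from collections import deque

def bootstrap_percolation(adj, initially_infected, k=2):
    """Simulate k-bootstrap percolation: a healthy vertex becomes
    infected as soon as at least k of its neighbours are infected.
    Returns the final infected set."""
    infected = set(initially_infected)
    count = {v: 0 for v in adj}          # infected neighbours seen so far
    queue = deque(infected)
    while queue:
        v = queue.popleft()
        for w in adj[v]:
            if w in infected:
                continue
            count[w] += 1
            if count[w] >= k:
                infected.add(w)
                queue.append(w)
    return infected
```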

Journal ArticleDOI
TL;DR: It is shown how standard sampling techniques can be used to obtain efficient estimators for the most commonly used measures of weighted clustering coefficient, and a novel graph-theoretic notion of clustering coefficients in weighted networks is proposed.
Abstract: The clustering coefficient of an unweighted network has been extensively used to quantify how tightly connected the neighborhood around a node is, and it has been widely adopted for assessing the quality of nodes in a social network. Computing the clustering coefficient is challenging because it requires counting the number of triangles in the graph. Several recent works proposed efficient sampling, streaming, and MapReduce algorithms that overcome this computational bottleneck. In fact, the intensity of the interaction between nodes, which is usually represented by weights on the edges of the graph, is also an important measure of the statistical cohesiveness of a network. Recently, various notions of weighted clustering coefficient have been proposed, but all those techniques are hard to implement on large-scale graphs.
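The sampling techniques alluded to are typically wedge-sampling estimators. As a point of reference, the sketch below estimates the unweighted global clustering coefficient by sampling wedges with the correct probabilities and checking closure; the weighted notions proposed in the paper would reweight these samples, which is not shown here.

```python
import random

def sampled_clustering(adj, samples=100000, seed=0):
    """Wedge-sampling estimate of the global clustering coefficient:
    pick a centre with probability proportional to its number of
    wedges, pick two random neighbours, and check whether the wedge is
    closed (i.e., forms a triangle)."""
    rng = random.Random(seed)
    nodes = [v for v in adj if len(adj[v]) >= 2]
    weights = [len(adj[v]) * (len(adj[v]) - 1) / 2 for v in nodes]
    closed = 0
    for _ in range(samples):
        v = rng.choices(nodes, weights=weights)[0]
        a, b = rng.sample(list(adj[v]), 2)
        if b in adj[a]:
            closed += 1
    return closed / samples
```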