Proceedings ArticleDOI

An entropy maximization problem in shortest path routing networks

21 May 2014 - pp. 1-6
TL;DR: In the context of an IP network, an interesting case of the inverse shortest path problem is investigated using the concept of network centrality, and a heuristic approach is proposed to obtain a centrality distribution that maximizes the entropy.


Abstract: In the context of an IP network, we investigate an interesting case of the inverse shortest path problem using the concept of network centrality. For a given network, the centrality distribution associated with its links can be determined from the number of shortest paths passing through each link. An entropy measure for this distribution is defined, and we formulate the inverse shortest path problem in terms of maximizing this entropy, i.e., obtaining a centrality distribution that is as broadly distributed as possible subject to the topology constraints. An appropriate change in the weight of a link alters the number of shortest paths that pass through it, thereby modifying the centrality distribution. The problem is shown to be NP-hard, and a heuristic approach is proposed. An application to handling link failure scenarios in Open Shortest Path First routing is discussed.
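
As an illustration of the quantity being maximized, the following is a minimal sketch (not the authors' implementation): it builds the link centrality distribution from shortest-path counts and computes its Shannon entropy, assuming an undirected weighted graph and using networkx to enumerate shortest paths; all names are illustrative.

    import math
    from collections import Counter
    from itertools import combinations

    import networkx as nx

    def link_centrality_entropy(G, weight="weight"):
        # Count, for every link, how many shortest paths pass through it.
        counts = Counter()
        for s, t in combinations(G.nodes, 2):
            for path in nx.all_shortest_paths(G, s, t, weight=weight):
                for u, v in zip(path, path[1:]):
                    counts[frozenset((u, v))] += 1
        # Normalize the counts into the centrality distribution ...
        total = sum(counts.values())
        p = [c / total for c in counts.values()]
        # ... and return its Shannon entropy, the quantity to be maximized.
        return -sum(pi * math.log(pi) for pi in p)

    G = nx.Graph()
    G.add_weighted_edges_from([("a", "b", 1), ("b", "c", 1),
                               ("a", "c", 2), ("c", "d", 1)])
    print(link_centrality_entropy(G))

Changing a link weight changes which paths are shortest and hence the per-link counts; that is the lever the proposed heuristic adjusts to push the distribution toward maximum entropy.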


Topics: K shortest path routing (71%), Constrained Shortest Path First (67%), Network theory (66%)
Citations

Journal ArticleDOI
20 Jul 2016 - Computer Networks
TL;DR: This paper investigates an interesting case of the inverse shortest path problem using the concept of network centrality, and a heuristic approach is proposed to obtain a centrality distribution that maximizes the entropy.


Abstract: In the context of an IP network, this paper investigates an interesting case of the inverse shortest path problem using the concept of network centrality. For a given network, a special probability distribution, namely the centrality distribution associated with the links of the network, can be determined from the number of shortest paths passing through each link. An entropy measure for this distribution is defined, and the inverse shortest path problem is formulated in terms of maximizing this entropy, i.e., obtaining a centrality distribution that is as broadly distributed as possible subject to the topology constraints; a maximum entropy distribution signifies the decentralization of the network. An appropriate change in the weight of a link alters the number of shortest paths that pass through it, thereby modifying the centrality distribution. The problem is shown to be NP-hard, and a heuristic approach is proposed. An application to handling link failure scenarios in Open Shortest Path First routing is discussed.


12 citations


Journal ArticleDOI
Yamila M. Omar, Peter Plapper
15 Dec 2020 - Entropy
TL;DR: A narrative literature review of information entropy metrics for complex networks is conducted following the PRISMA guidelines, identifying the areas in need of further development and aiming to guide future research efforts.


Abstract: Information entropy metrics have been applied to a wide range of problems that were abstracted as complex networks. This growing body of research is scattered in multiple disciplines, which makes it difficult to identify available metrics and understand the context in which they are applicable. In this work, a narrative literature review of information entropy metrics for complex networks is conducted following the PRISMA guidelines. Existing entropy metrics are classified according to three different criteria: whether the metric provides a property of the graph or a graph component (such as the nodes), the chosen probability distribution, and the types of complex networks to which the metrics are applicable. Consequently, this work identifies the areas in need of further development, aiming to guide future research efforts.


3 citations


Cites background or methods from "An entropy maximization problem in ..."

  • ...They further proposed the use of entropy maximization and betweenness entropy in order to make communications routing decentralized [25] and handle single edge failures [34]....


  • ...For example, information functionals are based on edge or node betweenness centrality [24,25,34,50,53], distances to a given vertex [28], degree, degree power or probability distribution of degrees [31,41], paths or paths’ length [16,35], and closeness or eigenvector centrality [53]....


  • ...[Excerpt from the citing paper's classification table: references [14]-[63], including [25] and [34], marked according to its three criteria (graph vs. component property, probability distribution used, and applicable network types, with notes such as "acyclic", "connected", "strongly connected, aperiodic", and "no self-loops").]...


  • ...[24] Chellappan, Vanniarajan and Sivalingam, Krishna M., 2013, Proceedings of the 2013 19th IEEE Workshop on Local & Metropolitan Area Networks (LANMAN); [25] V....


  • ...[24,25,34] H(G) = -\sum_{(u,v) \in E} p(u,v) \log p(u,v), where p(u,v) = \frac{\eta_{\bullet,\bullet}(u,v)}{\sum_{(x,y) \in E} \eta_{\bullet,\bullet}(x,y)} and \eta_{\bullet,\bullet}(u,v) is the...



Journal ArticleDOI
TL;DR: This work aims to develop a hybrid Dijkstra's-Floyd-Warshall algorithm (HDFWA) to solve an entropy-maximization routing problem; the proposed algorithm is compared with existing approaches in order to find the best and shortest paths.


Abstract: The shortest path problem is to find a path between two vertices of a given graph such that the sum of the weights of its constituent edges is minimized. The classic Dijkstra algorithm was designed to solve the single-source shortest path problem on a static graph; it works by starting from the source node and computing shortest paths over the whole network. This work aims to develop a hybrid Dijkstra's-Floyd-Warshall algorithm to solve an entropy-maximization routing protocol problem. The algorithm has to find the shortest path between the source and destination nodes; a route-guidance algorithm is used to find the best shortest path in the routing network, minimizing the cost between the origin and destination nodes. The proposed algorithm is compared with existing approaches in order to find the best and shortest paths. General Terms: Dijkstra's Algorithm, Floyd-Warshall Algorithm, Hybrid Dijkstra's-Floyd-Warshall Algorithm (HDFWA), Entropy.
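
The hybrid itself is not spelled out here; for orientation only, a hedged sketch of its two classical ingredients using networkx (the graph and all names are illustrative, and this is not the proposed HDFWA):

    import networkx as nx

    G = nx.Graph()
    G.add_weighted_edges_from([("s", "a", 2), ("a", "t", 3),
                               ("s", "b", 1), ("b", "t", 5)])

    # Dijkstra: single-source/single-target shortest path on non-negative weights.
    path = nx.dijkstra_path(G, "s", "t", weight="weight")
    dist = nx.dijkstra_path_length(G, "s", "t", weight="weight")

    # Floyd-Warshall: all-pairs shortest-path distances in a single computation.
    all_pairs = nx.floyd_warshall(G, weight="weight")

    print(path, dist, all_pairs["s"]["t"])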


2 citations


Cites background from "An entropy maximization problem in ..."

  • ...Network Centrality [1]: The network centrality or the entropy of SPBC is defined by...


  • ...The Shortest Path Betweenness Centrality (SPBC) of a link i is defined by equation (1). Network Centrality [1]: the network centrality, i.e., the entropy of the SPBC, is defined by equation (2), where (3) represents the random variable associated with the underlying probability distribution (see the reconstruction after this list)....


  • ...Shortest Path Betweenness Centrality [1]: consider the total number of shortest paths taken over every pair of source-destination nodes (s, t), with s, t ∈ V....

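
A hedged reconstruction of the equations (1)-(3) referenced in the excerpts above, consistent with the betweenness-entropy expression quoted in the earlier citation context (the symbols b_i, n_i, and N are illustrative and may differ from the cited paper's notation):

    % (1) SPBC of link i: the fraction of all shortest paths that traverse it,
    %     where n_i counts shortest paths through link i and N is the total
    %     number of shortest paths over all source-destination pairs (s, t).
    b_i = \frac{n_i}{N}
    % (2) Network centrality: the entropy of the SPBC distribution over the links E.
    H = -\sum_{i \in E} b_i \log b_i
    % (3) The b_i form the probability distribution of the random variable
    %     over which the entropy in (2) is taken.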


References

Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.


Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.


60,029 citations


Journal ArticleDOI
E. T. Jaynes
15 Oct 1957 - Physical Review
Abstract: Information theory provides a constructive criterion for setting up probability distributions on the basis of partial knowledge, and leads to a type of statistical inference which is called the maximum-entropy estimate. It is the least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information. If one considers statistical mechanics as a form of statistical inference rather than as a physical theory, it is found that the usual computational rules, starting with the determination of the partition function, are an immediate consequence of the maximum-entropy principle. In the resulting "subjective statistical mechanics," the usual rules are thus justified independently of any physical argument, and in particular independently of experimental verification; whether or not the results agree with experiment, they still represent the best estimates that could have been made on the basis of the information available. It is concluded that statistical mechanics need not be regarded as a physical theory dependent for its validity on the truth of additional assumptions not contained in the laws of mechanics (such as ergodicity, metric transitivity, equal a priori probabilities, etc.). Furthermore, it is possible to maintain a sharp distinction between its physical and statistical aspects. The former consists only of the correct enumeration of the states of a system and their properties; the latter is a straightforward example of statistical inference.


11,158 citations


Journal ArticleDOI
Linton C. Freeman
01 Mar 1977
Abstract: A family of new measures of point and graph centrality based on early intuitions of Bavelas (1948) is introduced. These measures define centrality in terms of the degree to which a point falls on the shortest path between others and therefore has a potential for control of communication. They may be used to index centrality in any large or small network of symmetrical relations, whether connected or unconnected.
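
In the notation that has since become standard (not necessarily the paper's original symbols), the betweenness centrality of a point v is

    % sigma_{st} is the number of shortest paths between s and t, and
    % sigma_{st}(v) the number of those that pass through v.
    C_B(v) = \sum_{s \neq v \neq t} \frac{\sigma_{st}(v)}{\sigma_{st}}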


6,934 citations


Journal ArticleDOI
01 Mar 1992 - Biometrics
TL;DR: Partial table of contents: Maximum--Entropy Probability Distributions: Principles, Formalism and Techniques.


Abstract: Partial table of contents: Maximum--Entropy Probability Distributions: Principles, Formalism and Techniques. Maximum--Entropy Discrete Univariate Probability Distributions. Maximum--Entropy Discrete Multivariate Probability Distributions. Maximum--Entropy Continuous Multivariate Probability Distributions. Maximum--Entropy Distributions in Statistical Mechanics. Minimum Discrepancy Measures. Concavity (Convexity) of Maximum--Entropy (Minimum Information) Functions. Equivalence of Maximum--Entropy Principle and Gauss's Principle of Density Estimation. Maximum--Entropy Principle and Contingency Tables. Maximum--Entropy Principle and Statistics. Maximum--Entropy Models in Regional and Urban Planning. Maximum--Entropy Models in Marketing and Elections. Maximum--Entropy Spectral Analysis. Maximum--Entropy Image Reconstruction. Maximum--Entropy Principle in Operations Research. References. Author Index. Subject Index.


690 citations


"An entropy maximization problem in ..." refers background in this paper

  • ...Principle (MEP) aims to determine a uniform or as broad a probability distribution as possible subject to the available constraints [11], [12]....

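
As a standard textbook illustration of the principle quoted above (not taken from the paper): with no constraint beyond normalization, maximizing the entropy with a Lagrange multiplier yields the uniform distribution, and each added constraint (here, the network topology) moves the solution away from uniformity as little as possible.

    % Maximize H(p) = -\sum_i p_i \log p_i  subject to  \sum_i p_i = 1.
    \frac{\partial}{\partial p_i}\Big(-\sum_j p_j \log p_j
        + \lambda\big(\textstyle\sum_j p_j - 1\big)\Big)
        = -\log p_i - 1 + \lambda = 0
    % Every p_i therefore equals the same constant; normalizing over n outcomes gives
    p_i = e^{\lambda - 1} = \frac{1}{n}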


Book
01 Jan 1989
Abstract: Partial table of contents: Maximum--Entropy Probability Distributions: Principles, Formalism and Techniques. Maximum--Entropy Discrete Univariate Probability Distributions. Maximum--Entropy Discrete Multivariate Probability Distributions. Maximum--Entropy Continuous Multivariate Probability Distributions. Maximum--Entropy Distributions in Statistical Mechanics. Minimum Discrepancy Measures. Concavity (Convexity) of Maximum--Entropy (Minimum Information) Functions. Equivalence of Maximum--Entropy Principle and Gauss's Principle of Density Estimation. Maximum--Entropy Principle and Contingency Tables. Maximum--Entropy Principle and Statistics. Maximum--Entropy Models in Regional and Urban Planning. Maximum--Entropy Models in Marketing and Elections. Maximum--Entropy Spectral Analysis. Maximum--Entropy Image Reconstruction. Maximum--Entropy Principle in Operations Research. References. Author Index. Subject Index.


603 citations


Performance Metrics
No. of citations received by the paper in previous years:

Year   Citations
2020   1
2019   1
2016   1