Journal ArticleDOI

A Survey of Information Entropy Metrics for Complex Networks.

15 Dec 2020-Entropy (Multidisciplinary Digital Publishing Institute (MDPI))-Vol. 22, Iss: 12, pp 1417
TL;DR: A narrative literature review of information entropy metrics for complex networks is conducted following the PRISMA guidelines, identifying the areas in need of further development to guide future research efforts.
Abstract: Information entropy metrics have been applied to a wide range of problems that were abstracted as complex networks. This growing body of research is scattered across multiple disciplines, which makes it difficult to identify available metrics and understand the context in which they are applicable. In this work, a narrative literature review of information entropy metrics for complex networks is conducted following the PRISMA guidelines. Existing entropy metrics are classified according to three different criteria: whether the metric provides a property of the graph or of a graph component (such as the nodes), the chosen probability distribution, and the types of complex networks to which the metrics are applicable. Consequently, this work identifies the areas in need of further development, aiming to guide future research efforts.
Citations
Journal ArticleDOI
17 Sep 2021
TL;DR: This paper shows by empirical means that node importance can be evaluated from a dual perspective: by combining the traditional centrality measures that regard the whole network as one unit, and by analyzing the node clusters yielded by community detection.
Abstract: The stability and robustness of a complex network can be significantly improved by determining important nodes and by analyzing their tendency to group into clusters. Several centrality measures for evaluating the importance of a node in a complex network exist in the literature, each one focusing on a different perspective. Community detection algorithms can be used to determine clusters of nodes based on the network structure. This paper shows by empirical means that node importance can be evaluated from a dual perspective: by combining the traditional centrality measures that regard the whole network as one unit, and by analyzing the node clusters yielded by community detection. These approaches offer not only overlapping results but also complementary information regarding the top important nodes. To confirm this mechanism, we performed experiments on synthetic and real-world networks, and the results indicate an interesting relation between important nodes at the community and network levels.

15 citations

Journal ArticleDOI
01 Feb 2022-Entropy
TL;DR: A new multiple local attributes-weighted centrality (LWC) based on information entropy, combining degree and clustering coefficient is proposed; both one-step and two-step neighborhood information are considered for evaluating the influence of nodes and identifying influential nodes in complex networks.
Abstract: Identifying influential nodes in complex networks has attracted the attention of many researchers in recent years. However, due to their high time complexity, methods based on global attributes have become unsuitable for large-scale complex networks. In addition, considering multiple attributes can enhance a method's performance compared with considering only a single attribute. Therefore, this paper proposes a new multiple local attributes-weighted centrality (LWC) based on information entropy, combining degree and clustering coefficient; both one-step and two-step neighborhood information are considered for evaluating the influence of nodes and identifying influential nodes in complex networks. Firstly, the influence of a node in a complex network is divided into direct influence and indirect influence. The degree and clustering coefficient are selected as direct influence measures. Secondly, based on the two direct influence measures, we define two indirect influence measures: two-hop degree and two-hop clustering coefficient. Then, information entropy is used to weight the above four influence measures, and the LWC of each node is obtained by calculating the weighted sum of these measures. Finally, all the nodes are ranked by their LWC value, and the influential nodes can be identified. The proposed LWC method is applied to identify influential nodes in four real-world networks and is compared with five well-known methods. The experimental results demonstrate the good performance of the proposed method in terms of discrimination capability and accuracy.
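The entropy-weighting step described in the abstract can be sketched in a few lines. The following is an illustrative reimplementation, not the authors' code: the toy graph, the exact weight formula (a common entropy-weight variant where less uniform measures get larger weights), and the two-hop aggregation are all assumptions.

```python
import math

# Tiny undirected graph as adjacency sets (hypothetical example).
adj = {
    'a': {'b', 'c', 'd'},
    'b': {'a', 'c'},
    'c': {'a', 'b', 'd'},
    'd': {'a', 'c', 'e'},
    'e': {'d'},
}

def degree(n):
    return len(adj[n])

def clustering(n):
    """Fraction of a node's neighbour pairs that are themselves connected."""
    nbrs = list(adj[n])
    k = len(nbrs)
    if k < 2:
        return 0.0
    links = sum(1 for i in range(k) for j in range(i + 1, k)
                if nbrs[j] in adj[nbrs[i]])
    return 2 * links / (k * (k - 1))

def two_hop(metric, n):
    """Sum a one-step measure over the node's neighbours (indirect influence)."""
    return sum(metric(m) for m in adj[n])

nodes = list(adj)
measures = [degree, clustering,
            lambda n: two_hop(degree, n),
            lambda n: two_hop(clustering, n)]

def entropy_weights(cols):
    """Entropy-weight method: more dispersed measures get larger weights."""
    ents = []
    for col in cols:
        total = sum(col) or 1.0
        p = [v / total for v in col]
        ents.append(-sum(pi * math.log(pi) for pi in p if pi > 0)
                    / math.log(len(col)))
    slack = [1 - e for e in ents]
    return [s / sum(slack) for s in slack]

cols = [[m(n) for n in nodes] for m in measures]
w = entropy_weights(cols)
lwc = {n: sum(wj * col[i] for wj, col in zip(w, cols))
       for i, n in enumerate(nodes)}
ranking = sorted(lwc, key=lwc.get, reverse=True)
```

On this toy graph the leaf node `e` ranks last, as expected for a node with low direct and indirect influence.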

12 citations

Journal ArticleDOI
TL;DR: In this paper, the authors introduce and validate a mathematical model of the information processing capacity of a brain region in terms of neuronal activity, input storage capacity, and the arrival rate of afferent information.
Abstract: Neurophysiological measurements suggest that human information processing is evinced by neuronal activity. However, the quantitative relationship between the activity of a brain region and its information processing capacity remains unclear. We introduce and validate a mathematical model of the information processing capacity of a brain region in terms of neuronal activity, input storage capacity, and the arrival rate of afferent information. We applied the model to fMRI data obtained from a flanker paradigm in young and old subjects. Our analysis showed that, for a given cognitive task and subject, higher information processing capacity leads to lower neuronal activity and faster responses. Crucially, processing capacity, as estimated from fMRI data, predicted task- and age-related differences in reaction times, speaking to the model's predictive validity. This model offers a framework for modelling brain dynamics in terms of information processing capacity, and may be exploited for studies of predictive coding and Bayes-optimal decision-making.

4 citations

Journal ArticleDOI
01 Aug 2022-Sensors
TL;DR: An attention-fusion entropy weight method (En-Attn) for capturing warning features is proposed and an attention-based temporal convolutional neural network (ATCN) is used to predict the warning signals.
Abstract: The capture and prediction of rainfall-induced landslide warning signals is the premise for the implementation of landslide warning measures. An attention-fusion entropy weight method (En-Attn) for capturing warning features is proposed. An attention-based temporal convolutional neural network (ATCN) is used to predict the warning signals. Specifically, the sensor data are analyzed using Pearson correlation analysis after obtaining data from the sensors on rainfall, moisture content, displacement, and soil stress. The comprehensive evaluation score is obtained offline using multiple entropy weight methods. Then, the attention mechanism is used to weight and sum different entropy values to obtain the final landslide hazard degree (LHD). The LHD realizes the warning signal capture of the sensor data. The prediction process adopts a model built by ATCN and uses a sliding window for online dynamic prediction. The input is the landslide sensor data at the last moment, and the output is the LHD at the future moment. The effectiveness of the method is verified by two datasets obtained from the rainfall-induced landslide simulation experiment.

4 citations

Journal ArticleDOI
TL;DR: In this article, a novel family of Szeged-like entropies is introduced, several features of the entropy family are investigated, and a cut method for computing these entropies from quotient graphs is proposed.

2 citations

References
Journal ArticleDOI
TL;DR: This final installment of the paper considers the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now.
Abstract: In this final installment of the paper we consider the case where the signals or the messages or both are continuously variable, in contrast with the discrete nature assumed until now. To a considerable extent the continuous case can be obtained through a limiting process from the discrete case by dividing the continuum of messages and signals into a large but finite number of small regions and calculating the various parameters involved on a discrete basis. As the size of the regions is decreased these parameters in general approach as limits the proper values for the continuous case. There are, however, a few new effects that appear and also a general change of emphasis in the direction of specialization of the general results to particular cases.

65,425 citations


"A Survey of Information Entropy Met..." refers background in this paper

  • ...This work aims at conducting a survey of existing graph entropy metrics that are specifically based on information entropy, as described by Shannon’s formulation [1]....

    [...]

  • ...Shannon [1] developed the concept of information entropy, which quantifies the average number of bits needed to store or communicate a message: one cannot store or communicate a message with n different symbols in less than log2 n bits....

    [...]
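The log2 n bound quoted above is easy to check numerically. A minimal sketch of Shannon's entropy formula, using a hypothetical four-symbol source:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p): average bits per symbol of a discrete source."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A uniform source over n = 4 symbols needs log2(4) = 2 bits per symbol.
uniform = shannon_entropy([0.25] * 4)                # 2.0 bits

# A skewed source over the same alphabet falls below that bound.
skewed = shannon_entropy([0.5, 0.25, 0.125, 0.125])  # 1.75 bits
```

The uniform distribution attains the maximum log2 n; any bias lowers the entropy and hence the achievable compression.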

Journal ArticleDOI
TL;DR: Moher et al. introduce PRISMA, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses.
Abstract: David Moher and colleagues introduce PRISMA, an update of the QUOROM guidelines for reporting systematic reviews and meta-analyses

62,157 citations

Journal ArticleDOI
04 Jun 1998-Nature
TL;DR: Simple models of networks that can be tuned through this middle ground: regular networks ‘rewired’ to introduce increasing amounts of disorder are explored, finding that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs.
Abstract: Networks of coupled dynamical systems have been used to model biological oscillators, Josephson junction arrays, excitable media, neural networks, spatial games, genetic control networks and many other self-organizing systems. Ordinarily, the connection topology is assumed to be either completely regular or completely random. But many biological, technological and social networks lie somewhere between these two extremes. Here we explore simple models of networks that can be tuned through this middle ground: regular networks 'rewired' to introduce increasing amounts of disorder. We find that these systems can be highly clustered, like regular lattices, yet have small characteristic path lengths, like random graphs. We call them 'small-world' networks, by analogy with the small-world phenomenon (popularly known as six degrees of separation). The neural network of the worm Caenorhabditis elegans, the power grid of the western United States, and the collaboration graph of film actors are shown to be small-world networks. Models of dynamical systems with small-world coupling display enhanced signal-propagation speed, computational power, and synchronizability. In particular, infectious diseases spread more easily in small-world networks than in regular lattices.
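The 'rewiring' procedure described in the abstract can be sketched as follows. This is an illustrative reimplementation under assumed parameter names (n nodes, k nearest neighbours, rewiring probability p), not the authors' original construction.

```python
import random

def ring_lattice(n, k):
    """Regular ring: each node linked to its k nearest neighbours (k even)."""
    return {(i, (i + j) % n) for i in range(n) for j in range(1, k // 2 + 1)}

def rewire(edges, n, p, seed=0):
    """With probability p, replace each edge's far endpoint with a random node,
    avoiding self-loops and duplicate rewired edges."""
    rng = random.Random(seed)
    out = set()
    for u, v in sorted(edges):
        if rng.random() < p:
            w = rng.randrange(n)
            while w == u or (u, w) in out or (w, u) in out:
                w = rng.randrange(n)
            out.add((u, w))
        else:
            out.add((u, v))
    return out

regular = ring_lattice(20, 4)              # 20 * 4 / 2 = 40 edges
small_world = rewire(regular, 20, p=0.1)   # a few long-range shortcuts
```

Small p keeps the high clustering of the lattice while the few shortcuts sharply reduce characteristic path length, which is the tunable 'middle ground' the paper explores.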

39,297 citations

Journal ArticleDOI
TL;DR: In this article, three distinct intuitive notions of centrality are uncovered and existing measures are refined to embody these conceptions, and the implications of these measures for the experimental study of small groups are examined.

14,757 citations


"A Survey of Information Entropy Met..." refers background in this paper

  • ...Closeness centrality is defined in terms of distance and it can be interpreted either as a metric of independence from control by others or as a measure of access or efficiency [4]....

    [...]
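The distance-based definition in the excerpt above can be made concrete with a short BFS sketch. The path graph and function names below are hypothetical illustrations of the standard closeness formulation, not the cited paper's notation.

```python
from collections import deque

# Toy path graph a-b-c-d-e (hypothetical example).
adj = {'a': ['b'], 'b': ['a', 'c'], 'c': ['b', 'd'],
       'd': ['c', 'e'], 'e': ['d']}

def closeness(src):
    """Closeness = (number of reachable nodes) / (sum of BFS distances)."""
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    total = sum(dist.values())
    return (len(dist) - 1) / total if total else 0.0
```

The middle node of the path is nearest to everyone and so scores highest, matching the "access or efficiency" reading of closeness quoted above.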