
Showing papers on "Network theory published in 1987"


Journal ArticleDOI
TL;DR: In this article, the rank orderings produced by three centrality measures (betweenness, nearness, and degree) are compared across the four networks whose analysis forms the heart of the paper.
Abstract: In an influential paper, Freeman (1979) identified three aspects of centrality: betweenness, nearness, and degree. Perhaps because they are designed to apply to networks in which relations are binary valued (they exist or they do not), these types of centrality have not been used in interlocking directorate research, which has almost exclusively used formula (2) below to compute centrality. Conceptually, this measure, of which c(α, β) is a generalization, is closest to being a nearness measure when β is positive. In any case, there is no discrepancy between the measures for the four networks whose analysis forms the heart of this paper. The rank orderings by the

4,482 citations
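The generalized centrality measure discussed in the abstract can be sketched concretely. The following is a minimal illustration, assuming the standard vector form of Bonacich's measure, c(α, β) = α(I − βA)⁻¹A·1, where A is the adjacency matrix and |β| must be less than the reciprocal of A's largest eigenvalue; the function name and example graph are illustrative, not from the paper.

```python
import numpy as np

def bonacich_centrality(A, alpha=1.0, beta=0.1):
    """c(alpha, beta) = alpha * (I - beta*A)^(-1) @ A @ 1.

    With beta = 0 this reduces to (alpha times) degree centrality;
    positive beta weights a node by the centrality of its neighbors.
    """
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # Solve (I - beta*A) c = alpha * A @ 1 instead of forming the inverse.
    return np.linalg.solve(np.eye(n) - beta * A, alpha * A @ np.ones(n))

# 4-node path graph: 0 - 1 - 2 - 3
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]])
c = bonacich_centrality(A, alpha=1.0, beta=0.2)
# With positive beta the interior nodes (1, 2) outrank the endpoints.
```

For the path graph above, the largest eigenvalue is about 1.618, so any beta below roughly 0.618 keeps the series convergent.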


Journal ArticleDOI
TL;DR: A positive assessment of the past scientific accomplishments of network research is tempered by serious epistemological problems: inadequate attention to network theory, network sampling problems that restrict the generalizability of results, and an underemphasis on data gathering and measurement.

102 citations


Journal ArticleDOI
10 Sep 1987
TL;DR: In this article, the authors present work published in the Proceedings of the Thirteenth Annual Meeting of the Berkeley Linguistics Society (1987).
Abstract: Proceedings of the Thirteenth Annual Meeting of the Berkeley Linguistics Society (1987), pp. 195-206

64 citations


Journal ArticleDOI
TL;DR: In this paper, the authors use the complexity-from-noise principle and probabilistic information theory to state formally, and in a kind of negative way, the necessary conditions for self-organization with an increase in complexity (i.e., creation of information).
Abstract: In talking about self-creation of meaning, a general introduction is in order to state the scope of the problem, since it would be self-defeating to talk about meaning without noting the different meanings of meaning in different contexts of investigation and in different disciplines. I will then concentrate on two different attempts to state the problem of the creation of meaning in biological systems. The first has to do with the so-called complexity-from-noise principle and makes use of probabilistic information theory to state formally, and in a kind of negative way (stemming from the fact that Shannon's information theory does not explicitly take into account the meaning of information), what the necessary conditions are for self-organization with an increase in complexity (i.e., creation of information). The second approach, which I will discuss, makes use of automata network theory. It is applied to: (i) computer simulations of phenotypic expressions of genomes viewed as collective behaviors of large numbers of interacting genes; (ii) a computer model of a machine which is built randomly, i.e., with no purpose in mind, and whose behavior is that of a pattern recognizer where the pattern to be recognized is the outcome of the functioning of the machine itself. A posteriori, this pattern can be analyzed and the functioning of the recognition system can be understood. As in the observation and analysis of a natural, non-man-made, living system, the criterion for meaningfulness does not necessarily follow an a priori order set up by a purposeful rational being.

62 citations


Journal ArticleDOI
TL;DR: The theory is a generalization of the centrality function, applicable to networks in which a weight is assigned to each point and a length and capacity are assigned to each edge.
Abstract: Often, in a system with a network structure, such as a communication network, traffic network, or social relationships, the centrality of a point is discussed. The centrality of a point is usually measured by its relation to other points, and distance has traditionally been used as the measure of that relation. Recently, a method based on capacity has also been proposed. In contrast to past centrality-function theory, which treated distance-based and capacity-based relations between points as separate cases, this paper presents a unified theory by introducing the concept of modification of the space with respect to a point. Based on this modification of the space, an axiomatic system for centrality and semi-centrality functions is newly defined, extracting the properties shared by past centrality functions. A real-valued function defined on the points is characterized on the basis of the proposed axiomatic system, and it is shown that the proposed theory includes the past major results concerning the centrality function. Finally, the theory is applied to networks in which a weight is assigned to each point and a length and capacity are assigned to each edge, yielding a generalization of the centrality function for such networks.

1 citation


Journal ArticleDOI
TL;DR: A simple method of numerically evaluating network functions by computer is presented; it is based on, and reinforces, fundamental concepts encountered in introductory network theory courses and is readily grasped by students.
Abstract: A simple method of numerically evaluating network functions by computer is presented. The method is based on, and reinforces, fundamental concepts encountered in introductory network theory courses and is readily grasped by students. BASIC-language code is given for three subroutines which numerically perform Thevenin, series-to-parallel, and parallel-to-series transformations.
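The paper's BASIC subroutines are not reproduced in this listing, but the series-to-parallel and parallel-to-series transformations it names follow standard formulas. The sketch below, in Python rather than BASIC, shows the two conversions for a single-frequency impedance Z = R + jX; the function names are illustrative.

```python
def series_to_parallel(rs, xs):
    """Series R + jX -> equivalent parallel (Rp, Xp) at one frequency.

    Derived from the admittance Y = 1/Z: Rp = |Z|^2 / R, Xp = |Z|^2 / X.
    """
    m2 = rs**2 + xs**2  # |Z|^2
    return m2 / rs, m2 / xs

def parallel_to_series(rp, xp):
    """Parallel (Rp, Xp) -> equivalent series (Rs, Xs)."""
    d = rp**2 + xp**2
    return rp * xp**2 / d, rp**2 * xp / d

# Round trip: 3 + j4 ohms in series form (|Z|^2 = 25) converts to
# Rp = 25/3, Xp = 25/4, and converting back recovers (3.0, 4.0).
rp, xp = series_to_parallel(3.0, 4.0)
rs, xs = parallel_to_series(rp, xp)
```

The round trip is a natural sanity check and mirrors the kind of hand verification an introductory network theory course would ask of students.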