
Showing papers in "Systems Research and Behavioral Science" in 1978


Journal ArticleDOI
TL;DR: In this paper, the authors show how to convert a deterministic catastrophe model into a stochastic model with the aid of several reasonable assumptions, and how to calculate explicitly the resulting multimodal equilibrium probability density.
Abstract: Nonlinear models such as have been appearing in the applied catastrophe theory literature are almost universally deterministic, as opposed to stochastic (probabilistic). The purpose of this article is to show how to convert a deterministic catastrophe model into a stochastic model with the aid of several reasonable assumptions, and how to calculate explicitly the resulting multimodal equilibrium probability density. Examples of such models from epidemiology, psychology, sociology, and demography are presented. Lastly, a new statistical technique is presented, with which the parameters of empirical multimodal frequency distributions may be estimated.

152 citations
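
The conversion the abstract describes can be illustrated with the standard result for gradient systems driven by additive white noise: for dx = -V'(x) dt + sigma dW, the equilibrium density is proportional to exp(-2V(x)/sigma^2), which is multimodal when the potential has several wells. The sketch below is a hedged illustration using an assumed cusp potential and assumed parameter values (a, b, sigma); it is not the authors' models or their estimation technique.

```python
import numpy as np

def cusp_potential(x, a=0.0, b=1.5):
    # V(x) = x^4/4 - b*x^2/2 - a*x, the canonical cusp potential (parameters assumed)
    return x**4 / 4 - b * x**2 / 2 - a * x

def stationary_density(a=0.0, b=1.5, sigma=0.5, lo=-3.0, hi=3.0, n=2001):
    # For dx = -V'(x) dt + sigma dW, the equilibrium density is proportional
    # to exp(-2 V(x) / sigma^2); normalize it numerically on a grid.
    x = np.linspace(lo, hi, n)
    unnorm = np.exp(-2.0 * cusp_potential(x, a, b) / sigma**2)
    return x, unnorm / (unnorm.sum() * (x[1] - x[0]))

x, p = stationary_density()
modes = x[1:-1][(p[1:-1] > p[:-2]) & (p[1:-1] > p[2:])]
print("density modes near x =", modes)   # two modes: the bimodal (cusp) case
```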


Journal ArticleDOI
TL;DR: In this paper, catastrophe models are proposed for social behavior, attitude change, and some other related processes, synthesizing many diverse and sometimes seemingly contradictory findings and suggesting some unique hypotheses.
Abstract: Much past mathematical modeling of psychological processes has assumed (a) smooth and continuous change in behavior or cognitions or, if not, (b) simple step functions or thresholds. Many psychological phenomena which are not smooth and continuous, or do not meet the assumptions of simple step functions, seem to demonstrate the properties of the cusp or butterfly catastrophes. Catastrophe models have already been proposed for many psychological phenomena. In this paper catastrophe models are proposed for social behavior, attitude change, and some other related processes. These models synthesize many diverse and sometimes seemingly contradictory findings and suggest some unique hypotheses. The difficulties of testing catastrophe models are discussed and some means for improving empirical tests are suggested. It is concluded that catastrophe models hold promise for theoretical development in social psychology wherever high quality measurement and scaling techniques are available or can be developed.

101 citations


Journal ArticleDOI
TL;DR: The authors propose several conditions expressing the notion that social choice should be based on majority voting on pairs, examine five voting procedures or social choice functions with respect to these conditions plus a condition requiring the social choice set to be externally stable, and conclude that two of the five functions are clearly better than the remaining three.
Abstract: This paper is concerned with the use of voting procedures by groups of individuals to produce a social choice. Several new conditions that voting systems might be required to meet are suggested in this paper. These criteria are varying expressions of the notion that when there are more than two alternatives, social choice should be based on the results of majority voting on pairs. Five voting procedures or social choice functions are examined with respect to these conditions plus a condition requiring the social choice set to be externally stable. Two functions of the five examined are clearly better, according to these conditions, than the remaining three.

79 citations
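
The criterion the abstract emphasizes, that social choice "should be based on the results of majority voting on pairs", can be made concrete with a small sketch that tallies pairwise majority contests from ranked ballots and reports the Condorcet winner when one exists. The ballots and candidate names are made up for illustration; the paper's five procedures and its external-stability condition are not implemented here.

```python
from itertools import combinations

def condorcet_winner(ballots, candidates):
    # ballots: rankings (most preferred first); returns the candidate who beats
    # every other candidate in pairwise majority voting, or None if there is a cycle.
    wins = {c: set() for c in candidates}
    for a, b in combinations(candidates, 2):
        a_over_b = sum(ballot.index(a) < ballot.index(b) for ballot in ballots)
        if 2 * a_over_b > len(ballots):
            wins[a].add(b)
        elif 2 * a_over_b < len(ballots):
            wins[b].add(a)
    for c in candidates:
        if wins[c] == set(candidates) - {c}:
            return c
    return None

ballots = [["A", "B", "C"], ["B", "C", "A"], ["A", "C", "B"]]
print(condorcet_winner(ballots, ["A", "B", "C"]))   # -> "A"
```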


Journal ArticleDOI
TL;DR: The paper shows that the use of catastrophe concepts need not imply a special catastrophe model if analysis of any given dynamic model is extended to the problem of the implications for the events in state space of trajectories in parameter space.
Abstract: Even social and biological scientists familiar with the uses of mathematical models in their fields will find conceptually novel and sometimes mathematically forbidding elements in catastrophe models. One major difficulty is that these models use concepts that are more familiar to mathematicians and physicists than to behavioral scientists: potential functions and gradient systems, for instance. Starting from the basic elements of linear systems and their usual analysis (trajectories, equilibrium, stability of equilibria, comparative statics) it is possible to construct a conceptual and mathematical path from these familiar notions to more unfamiliar ideas and techniques dealing with structural stability and catastrophes. This is the aim: to present such a path from simple to complex, from old to new, from familiar to unfamiliar. In a later section of the paper, the approaches of Thom and Zeeman to the scientific use of the catastrophe concept are presented. The conceptual framework employed in the paper is that of general systems theory and the basic notion within which all the remaining techniques and ideas find their place is that of a state-determined dynamic system. Apart from such an expository function, the paper also shows that the use of catastrophe concepts need not imply a special catastrophe model if analysis of any given dynamic model is extended to the problem of the implications for the events in state space of trajectories in parameter space. In this way, an effort is made to make this notion more accessible and useful in science.

77 citations


Journal ArticleDOI
TL;DR: A tentative nonlinear model is proposed on the hypothesis that the graph of this relation is the equilibrium set of a dynamic system and is naturally compatible with the subjective dichotomy of bistable perception.
Abstract: Multistable figures show that the stimulus-percept relation is not a single valued function. We therefore propose a tentative nonlinear model on the hypothesis that the graph of this relation is the equilibrium set of a dynamic system. For simplicity and to obtain testable predictions, we consider a system whose bifurcations are gradient-like and thus generically described by the elementary catastrophes. We motivate this general model, and then show how, in conjunction with the principle of minimal singularity, it implies cusp catastrophe geometry in a specific perceptual example. Indeed, we argue for canonical cusp geometry in this case. The model incorporates naturally certain observed features of multistable perception, such as hysteresis and bias effects. Despite being a continuum model it is naturally compatible with the subjective dichotomy of bistable perception. The model makes testable predictions which may easily be extended to other specific examples of multistable perception.

62 citations
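
The hysteresis the model predicts can be illustrated numerically: let the percept x track a stable equilibrium of the canonical cusp dynamics dx/dt = -(x^3 - b*x - a) while the stimulus parameter a is swept up and then down. The parameter values and integration scheme below are assumptions for illustration only, not the authors' fitted model.

```python
import numpy as np

def settle(x, a, b=3.0, dt=0.01, steps=2000):
    # relax the state to a nearby stable equilibrium of dx/dt = -(x^3 - b*x - a)
    for _ in range(steps):
        x += -(x**3 - b * x - a) * dt
    return x

a_values = np.linspace(-3.0, 3.0, 61)

upward = []
x = -2.0                              # start on the lower branch
for a in a_values:                    # sweep the stimulus parameter upward
    x = settle(x, a)
    upward.append(x)

downward = []
x = 2.0                               # start on the upper branch
for a in a_values[::-1]:              # sweep it back downward
    x = settle(x, a)
    downward.append(x)
downward = downward[::-1]

up_jump = a_values[int(np.argmax(np.diff(upward) > 1.0))]
down_jump = a_values[int(np.argmax(np.diff(downward) > 1.0))]
print(f"jump on the upward sweep near a = {up_jump:.1f}")      # roughly +2
print(f"jump on the downward sweep near a = {down_jump:.1f}")  # roughly -2
```

The two jump points differ, which is the hysteresis effect mentioned in the abstract.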


Journal ArticleDOI
TL;DR: In this paper, an empirical taxonomy of influence strategy mixes is developed, and the choice of a particular influence mix is shown to be significantly related to the characteristics of the target person and of the dyadic relationship between focal and target persons.
Abstract: This paper proposes and evaluates a model of the choice and context of individual influence strategy mixes between dyads in complex organizational systems. Such influence strategy mixes are relevant to dyads from the same work group subsystem and from different work group subsystems, as well as to dyads from different organizational systems. An empirical taxonomy of influence strategy mixes is developed and the choice of a particular influence mix is shown to be significantly related to the characteristics of the target person and of the dyadic relationship between focal and target persons. Multivariate analytical procedures are used to delineate the potency of these characteristics in differentiating among the mix of influence strategies employed. The results are consistent with those of earlier studies and provide new insights on the nature of dyadic influence in organizational life.

55 citations


Journal ArticleDOI
TL;DR: Using the war model of Isnard and Zeeman as a paradigm, the authors show that many catastrophe theory models in social science possess serious weaknesses: the catastrophes supposedly account for real-life behavior but are actually only a restatement of the fact that discontinuities exist.
Abstract: Using the war model of Isnard and Zeeman as a paradigm, it is shown that many catastrophe theory models in social science possess serious weaknesses. The catastrophes supposedly account for real-life behavior, but actually are only a restatement of the fact that discontinuities exist. No deep mathematical results are actually used. The hypotheses are ambiguous or far-fetched. In addition, Thom's theorem, the mathematical centerpiece of applied catastrophe theory, is inherently uninformative for applications. The theory is helpful on neither the qualitative nor the quantitative level. Finally, better and simpler mathematical tools exist.

53 citations


Journal ArticleDOI
TL;DR: In this article, the usefulness of general systems theory for conceptualizing the young child as a member of a family, a living system at the level of the group, is explored.
Abstract: In the past, approaches to the analysis of early socialization have emphasized the importance of the mother-child dyad. As researchers have broadened their interests to include the father's role in early socialization, the need to conceptualize the infant as a member of a family has emerged. The usefulness of general systems theory for conceptualizing the young child as a member of a family, a living system at the level of the group, is explored. Current research findings are related to a system framework and suggestions for conceptualizing the steady state processes of the family are proposed.

45 citations


Journal ArticleDOI
TL;DR: In this article, the role of equity considerations in decisions about resource allocation and policy decisions at the levels of the group, organization, society, and supranational systems is discussed.
Abstract: Some controversy surrounds the role of equity considerations in decisions about resource allocation and policy decisions at the levels of the group, organization, society, and supranational systems. In two descriptive studies arbitration judgments for hypothetical two-party conflicts made by 70 public administration students were compared to predictions from four social welfare functions incorporating equity considerations: Sen's absolute deviation model, Sen's variance model, Keeney and Raiffa's multilinear model, and Rawls' maximin principle. Two experiments—one using regression analyses of ratings, the other using conjoint measurement analyses of rankings—yielded the same conclusions. The arbitration judgments are best described by Sen's absolute deviation model, W = θ[(a + b)/2] + (1 − θ)[−|a − b|/2], where W is the arbiter's evaluative rating of a contract which assigns utilities a and b to parties 1 and 2, respectively, and where θ is an empirical constant representing the tradeoff between utility and equity. This model not only has a better statistical fit to the data than do the others tested, but also predicts the substantial violations of Pareto optimality which did occur. Over half the participants in each experiment showed a willingness to accept reduced total utility in order to obtain a more equitable distribution of utility to the two parties. Implications for axiomatic social welfare functions, for future research, and for policy applications are discussed.

45 citations
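
The fitted rule can be illustrated with a small worked example. The tradeoff constant θ and the contract utilities below are assumptions made for illustration (the paper estimates θ empirically); with θ below one half, the rule can rate a contract that gives one party strictly less and the other no more above its Pareto-superior alternative, reproducing the kind of Pareto-optimality violation the authors report.

```python
def arbiter_value(a, b, theta=0.4):
    # W = theta*[(a + b)/2] + (1 - theta)*[-|a - b|/2]; theta is the paper's
    # empirical tradeoff constant -- the value 0.4 here is only an assumption.
    return theta * (a + b) / 2 + (1 - theta) * (-abs(a - b) / 2)

# Contract 1 gives party 1 strictly more and party 2 the same as Contract 2,
# yet the more equal Contract 2 receives the higher arbitration value.
print(arbiter_value(10, 6))   # -> 2.0
print(arbiter_value(7, 6))    # -> 2.3
```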


Journal ArticleDOI
TL;DR: In this article, a model is presented to describe how a calculating lobbyist should allocate resources most effectively among voters in a legislature, given that there is no opposition lobbying effort; equilibrium prices exist provided there is no veto player.
Abstract: A model is presented to describe how a calculating lobbyist should allocate resources most effectively among voters in a legislature, given that there is no opposition lobbying effort. Equilibrium prices exist provided there is no veto player. When there is opposition, a different model and a different concept of equilibrium result. The outcome of this model is treated for a case when the opposing forces have unequal resources. This results in an equilibrium which is essentially the nucleolus. Application is made to US Presidential campaigning for the Electoral College, and to the setting of legislators' salaries. The models are also shown to lead to new concepts of measuring the relative power of voters. While these measures are related for the competitive and noncompetitive models, their differences also point to the importance of considering the context in which power is to be measured.

35 citations


Journal ArticleDOI
TL;DR: The results of an experiment to test hypotheses derived from Olson's theory of collective action at the group level are reported in this article, where strong evidence in favor of the theory is found with respect to the effects of group size on the amount of the collective good provided and on the degree of suboptimality associated with noncooperative provision of the good.
Abstract: The results of an experiment to test hypotheses derived from Olson's theory of collective action at the group level are reported. Strong evidence in favor of the theory is found with respect to the effects of group size on the amount of the collective good provided and on the degree of suboptimality associated with noncooperative provision of the good. For reasons discussed in the paper, the experimental results do not support the exploitation hypothesis.

Journal ArticleDOI
TL;DR: In this article, a new experimental paradigm for two-person bargaining is introduced, which is intended to control equity and related processes as they affect the process and outcome of bargaining and is used in a comparative evaluation of the Nash and Smorodinsky-Kalai solutions to the bargaining problem.
Abstract: A new experimental paradigm for two-person bargaining is introduced. It can be applied to decision making in two-person systems at the level of the group as well as at the level of the organization, the society, and the supranational system. The paradigm is used in a comparative evaluation of the Nash and Smorodinsky-Kalai solutions to the bargaining problem. The aim is to devise a means for experimentally analyzing two-person two-party bargaining processes which avoids a number of difficulties afflicting previous experimental paradigms. In particular, the paradigm presented here is intended to control equity and related processes as they affect the process and outcome of bargaining. A discussion of means for conceiving equity and related processes using the Nash and Smorodinsky-Kalai bargaining models is provided together with an illustration of procedures for experimentally controlling them. The experimental results strongly support the Smorodinsky-Kalai model over the Nash model.
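
For readers unfamiliar with the two solution concepts being compared, the sketch below computes both for an assumed bargaining problem with disagreement point (0, 0) and a linear Pareto frontier; the frontier, grid resolution, and numbers are illustrative assumptions, not the experimental task used in the paper.

```python
import numpy as np

def frontier(u1):
    # Assumed Pareto frontier of the feasible utility set; disagreement point is (0, 0).
    return 1.0 - 0.3 * u1

u1 = np.linspace(0.0, 1.0, 100001)
u2 = frontier(u1)

# Nash solution: maximize the product of the two parties' gains.
i_nash = int(np.argmax(u1 * u2))

# Kalai-Smorodinsky solution: the frontier point on the ray from the
# disagreement point toward the ideal point (max attainable u1, max attainable u2).
ideal = (u1.max(), u2.max())
i_ks = int(np.argmin(np.abs(u2 / ideal[1] - u1 / ideal[0])))

print("Nash solution:              ", (round(u1[i_nash], 3), round(u2[i_nash], 3)))
print("Kalai-Smorodinsky solution: ", (round(u1[i_ks], 3), round(u2[i_ks], 3)))
```

On this asymmetric frontier the two solutions disagree (roughly (1.0, 0.7) versus (0.769, 0.769)), which is what makes an experimental comparison between them informative.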


Journal ArticleDOI
TL;DR: In this article, the authors show that the correlation between two characteristics within a unit (the individual correlation) may be a different parameter from the correlation among summary measures of each unit (the ecological or systemwide correlation), a distinction often not taken into consideration in designing studies to evaluate relationships of characteristics of variable systems.
Abstract: This is a methodological study, applicable to all levels of living systems, which contrasts system wide measures with measures of individual units or components. If one samples K units and, in Unit i, samples ni observations of two characteristics, the correlation between these characteristics within a unit, the individual correlation, may be a different parameter from the correlation between summary measures of each unit, the ecological or systemwide correlation. If such a distinction is not taken into consideration in designing studies to evaluate relationships of characteristics of variable systems and in analyzing data from such studies, confusing and inaccurate inferences may be drawn. This general methodological point is illustrated by a study of testosterone level and orgasmic frequency in the human male.
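
The methodological point can be demonstrated with a short simulation. The numbers below are assumptions (not the paper's testosterone data): within each unit the two characteristics move against one another, while the unit means move together, so the individual and ecological correlations even take opposite signs.

```python
import numpy as np

rng = np.random.default_rng(0)
K, n = 10, 200                                   # K units, n observations per unit

unit_means = rng.normal(0.0, 5.0, size=(K, 1))   # unit-level means rise and fall together
noise = rng.normal(size=(K, n))
x = unit_means + noise
y = unit_means - 0.8 * noise + 0.3 * rng.normal(size=(K, n))   # y moves against x within a unit

individual_r = np.mean([np.corrcoef(x[i], y[i])[0, 1] for i in range(K)])
ecological_r = np.corrcoef(x.mean(axis=1), y.mean(axis=1))[0, 1]
print(f"mean within-unit (individual) correlation: {individual_r:.2f}")   # strongly negative
print(f"between-unit (ecological) correlation:     {ecological_r:.2f}")   # strongly positive
```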

Journal ArticleDOI
TL;DR: In this paper, it is argued that the sociotechnical macrosystem can best be viewed as a field of interacting forces which manifests certain unique phenomena; two contemporary computer simulation models of societies are found not to incorporate such behavioral and social forces nontrivially and therefore to lack fidelity, the faithful capturing of real-world phenomena.
Abstract: It is proposed that further qualitative progress in the computer simulation modeling of complex societal systems is dependent on incorporation of field-theoretic constructs embodying truly behavioral and social forces. A number of separate developments are reviewed in the context of the revitalization of field theory in social science. These are: field theory in sociotechnical systems theory, hierarchy theory, critical phenomena, and catastrophe theory. It is argued that the sociotechnical macrosystem can best be viewed as a field of interacting forces which manifests certain unique phenomena. The phenomena emphasized are: slow, continuous change followed by sudden, discontinuous or catastrophic jumps; incipient changes in the field; hierarchical restructuring; emergence of new properties at successive hierarchical levels; and turbulence of the environmental field. Examples of these phenomena, taken from the dynamics and evolution of societies, are given. Two important contemporary advances in the computer simulation modeling of societies, the Systems Dynamics National Model of the United States socioeconomic system and Mankind at the Turning Point, are discussed with regard both to reflecting field-theoretic interpretations of the dynamics of the macrosystem and to the nontrivial incorporation of behavioral and social factors. Both modeling approaches are found wanting. It is argued that these models therefore lack fidelity, the faithful capturing of real world phenomena.

Journal ArticleDOI
TL;DR: In this paper, the authors tested four models of coalition behavior in decision-making on systems at the group level, in four different games, each of which was played by five individuals.
Abstract: This study tested four models of coalition behavior in decision making on systems at the group level, in four different games, each of which was played by five individuals. Each game established a different distribution of power among the players. Data for coalition frequencies and for the payoffs of the players when they were included in a particular coalition supported Komorita & Chertkoff's (1973) bargaining theory over Komorita's (1974) weighted probability model and Gamson's (1961) minimum resource theory. When the overall payoffs received by each of the players in each game were used as a measure of the player's success in bargaining, the predictions of the Roth-Shipley and the weighted probability models received mixed support. Finally, the results suggested when the “strength is weakness” phenomenon might be expected to occur. Players with equal Shapley values but different resources within a particular game supported “strength is weakness.” Players with different Shapley values supported a “strength is strength” conclusion. Reports of the players provided a possible explanation of the underlying causes of this phenomenon.
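
Since the "strength is weakness" discussion turns on Shapley values in weighted majority games, a small sketch of the computation may help. The game below (weights 3, 2, 2 with quota 4) is an assumption chosen for illustration rather than one of the paper's games: every two-player coalition wins, so all three players have the same Shapley value despite unequal resources.

```python
from itertools import permutations
from math import factorial
from fractions import Fraction

def shapley(weights, quota):
    # Shapley value of each player in a weighted majority game, by counting
    # how often the player is pivotal over all orders of arrival.
    n = len(weights)
    pivots = [0] * n
    for order in permutations(range(n)):
        running = 0
        for player in order:
            if running < quota <= running + weights[player]:
                pivots[player] += 1
                break
            running += weights[player]
    return [Fraction(p, factorial(n)) for p in pivots]

# Assumed game: weights 3, 2, 2 with quota 4 -- any two players win, none alone.
print(shapley([3, 2, 2], quota=4))   # -> three equal values of 1/3: equal power, unequal resources
```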

Journal ArticleDOI
TL;DR: In this paper, the authors present a review of sensitivity analysis of multiattribute utility models in an attempt to answer the question of whether such additional complexities are worthwhile.
Abstract: Multiattribute utility models are used for evaluating alternatives when more than one criterion is present. There is a trend toward the development of complicated versions of these models. These versions, although theoretically more accurate in the representation of decision makers' attitudes, require assessment procedures which are more difficult and time consuming to implement than simpler models. This paper reviews theoretical and empirical research involving the sensitivity analysis of multiattribute utility models in an attempt to answer the question of whether such additional complexities are worthwhile. Both deterministic and probabilistic models are considered and the studies are divided into four areas: (1) those involving sensitivity to the form of the multiattribute utility function; (2) those involving sensitivity to the parameters of the functions; (3) those involving sensitivity to the form and parameters of individual single attribute utility functions; and (4) those involving the relationship between deterministic and probabilistic models. A discussion of the results is given at the end.
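
As a concrete, assumption-laden example of the sensitivity questions the review classifies (here, area 2, sensitivity to the parameters), the sketch below perturbs the weights of a simple additive multiattribute utility model and checks how often the top-ranked alternative changes. The alternatives, scores, and perturbation range are made up for illustration.

```python
import numpy as np

scores = np.array([          # rows: alternatives; columns: single-attribute utilities in [0, 1]
    [0.9, 0.4, 0.6],
    [0.6, 0.8, 0.5],
    [0.5, 0.6, 0.9],
])
base_weights = np.array([0.5, 0.3, 0.2])

def best_alternative(weights):
    # simple additive multiattribute utility: weighted sum of attribute utilities
    return int(np.argmax(scores @ (weights / weights.sum())))

rng = np.random.default_rng(1)
baseline = best_alternative(base_weights)
same = [best_alternative(base_weights + rng.uniform(-0.1, 0.1, size=3)) == baseline
        for _ in range(1000)]
print("baseline best alternative:", baseline)
print("fraction of perturbed weight vectors giving the same choice:", np.mean(same))
```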

Journal ArticleDOI
TL;DR: In this article, a cusp model of consumer behavior was developed to describe the effects of both price and price sensitivity on brand loyalty, and the qualitative model was instrumental in planning a strategy for a changing market environment characterized by inflation and decreasing price sensitivity.
Abstract: A cusp model of consumer behavior was developed to describe the effects of both price and price sensitivity on brand loyalty. The qualitative model was instrumental in planning a strategy for a changing market environment characterized by inflation and decreasing price sensitivity.

Journal ArticleDOI
TL;DR: In this article, a measure of power based on the probability that an individual is in a winning coalition is defined for weighted voting games; it is proved that the Banzhaf index of power is equivalent to this measure, suggesting that the Banzhaf index is the appropriate measure to use when the results, rather than the process of coalition formation, are the important considerations.
Abstract: A measure of power is defined for a weighted voting game which is based on the probability that an individual is in a winning coalition. It is proved that the Banzhaf index of power is equivalent to this measure. This suggests that the Banzhaf index is the appropriate measure to use when the results, rather than the process of coalition formation, are the important considerations. This article is applicable to decision making in living systems at the levels of the group, the organization, society, or the supranational system.
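
A brief sketch of the standard (absolute) Banzhaf index may clarify what is being measured: for each player, the fraction of coalitions of the other players in which that player's vote is decisive. The weights and quota below are assumptions for illustration; the paper's equivalence proof relating this index to a probability-of-winning-coalition measure is not reproduced here.

```python
from itertools import combinations

def banzhaf(weights, quota):
    # Absolute Banzhaf index: for each player, the fraction of coalitions of
    # the other players in which adding that player turns a loss into a win.
    n = len(weights)
    index = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        swings = 0
        for size in range(len(others) + 1):
            for coalition in combinations(others, size):
                w = sum(weights[j] for j in coalition)
                if w < quota <= w + weights[i]:
                    swings += 1
        index.append(swings / 2 ** (n - 1))
    return index

# Assumed example: weights 4, 2, 1, 1 with quota 5.
print(banzhaf([4, 2, 1, 1], quota=5))   # -> [0.875, 0.125, 0.125, 0.125]
```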


Journal ArticleDOI
Yves Balasko
TL;DR: In this paper, a general economic equilibrium theory is formulated in a way similar to Thom's catastrophe theory, which permits the use of differential topology in the study of the laws of change of economic equilibria, especially from a qualitative viewpoint.
Abstract: A general economic equilibrium theory is formulated in a way similar to Thom's catastrophe theory. This permits the use of differential topology in the study of the laws of change of economic equilibria, especially from a qualitative viewpoint. This paper is primarily concerned with the relationship between the behavior of economic equilibria and their number, and it includes an application to international trade theory.

Journal ArticleDOI
TL;DR: In this paper, the catastrophe manifold is used as an illustrative metaphor for the local description of biological development, and it replaces earlier theories of development that utilized paths on an ad hoc epigenetic landscape, for example, with one that generates trajectories on a well-defined hypersurface.
Abstract: Elementary catastrophe theory provides a method for the qualitative description of systems with associated potential energy functions. It can serve at least as an illustrative metaphor for the local description of biological development. It replaces earlier theories of development that utilized paths on an ad hoc epigenetic landscape, for example, with one that generates trajectories on a well-defined hypersurface, the catastrophe manifold. While the elementary theory does provide a much richer mathematical language than the earlier geometric theories, it suffers from several drawbacks, outlined in this paper, which prevent it from being a conclusive theory for biological development.

Journal ArticleDOI
TL;DR: It is suggested that a university community might be a good place for early trials of such redesigned governance systems, because these geographically compact communities have the participants and the technologies necessary for such trials and presumably, the capability for intelligent self-analysis of the experience.
Abstract: It is reasoned that the time must be ripe for a serious effort aimed at restructuring the political processes supporting the ideal of democracy. The many theoretical insights and experimental findings published by scientists and other scholars during recent decades, especially at the group level, are important for developing the new design. One such design is presented in this paper and defended and criticized in order to illustrate the nature of such an effort. The scheme is labeled dynamic value voting with affinity group representation and includes computer conferencing as an emergent, interactive communication technology that makes possible the implementation of such processes for the first time ever. The design draws upon results from game theory, decision theory, social choice theory, mathematical programming theory, and adds a market concept under which group members are enabled to dynamically express their preferences to allow for differences in interpersonal utilities and in personal expectations. It is suggested that a university community might be a good place for early trials of such redesigned governance systems. These geographically compact communities have the participants and the technologies necessary for such trials and, presumably, the capability for intelligent self-analysis of the experience.

Journal ArticleDOI
TL;DR: In this article, a model of employee turnover is developed using explanatory theories and studies of systems at the organization level dealing with behavior and cost data, and normative guidelines are produced by which the costs due to turnover are balanced against the costs of the measures needed to reduce turnover.
Abstract: A model of employee turnover is developed using explanatory theories and studies of systems at the organization level dealing with behavior and cost data. The model optimizes the costs associated with turnover and ignores the turnover rates. It is a synthesis of economic theory and behavioral phenomena. From this model normative guidelines are produced by which the costs due to turnover are balanced against the costs of the measures needed to reduce turnover. Obtaining the point of optimality detailed by the model rests upon the organization's knowledge of its costs and the attitudinal dispositions of its employees. The crucial assumptions behind the model necessitating further investigation are discussed and refinements to the model are suggested.
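
The cost-balancing logic can be illustrated with a deliberately toy model, not the paper's: assume the turnover rate falls exponentially with retention spending and find the spending level that minimizes turnover costs plus the cost of the measures. Every number and functional form below is an assumption made for the illustration.

```python
import numpy as np

employees = 500
cost_per_leaver = 4000.0                        # assumed replacement cost per leaver
spend = np.linspace(0.0, 2000.0, 20001)         # candidate retention spending per employee

turnover_rate = 0.25 * np.exp(-spend / 600.0)   # assumed response of turnover to spending
total_cost = employees * (turnover_rate * cost_per_leaver + spend)

optimum = spend[int(np.argmin(total_cost))]
print(f"cost-minimizing retention spending is roughly {optimum:.0f} per employee")
```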

Journal ArticleDOI
TL;DR: A review of empirical evidence generally supports the hypothesis that the larger the number of times a signal is retransmitted in a noisy channel, the smaller the optimal number of categories into which a sample of the signal can be broken down.
Abstract: Applying information theory to living systems suggests a hypothesis: The larger the number of times a signal is retransmitted in a noisy channel, the smaller is the optimal number of categories into which a sample of the signal can be broken down. This cross-level hypothesis is tested at the organism and organ levels of living systems. At the organism level, the hypothesis suggests that the “magical number” of categories into which a person breaks down information input is a function of signal-to-noise ratio. This number drops when one anticipates having to retransmit information which one has received. At the organ level, the hypothesis suggests a reason for the hemispheric specialization in the human brain. In particular, since information to be retransmitted seems first to undergo verbalization and since the optimal number of categories decreases when information is to be retransmitted, the left hemisphere of the brain, the verbalization hemisphere, has a smaller optimum number of categories than the right hemisphere. A review of empirical evidence indicates that it generally supports the hypothesis. Suggestions for further studies include possible extensions of the research to other levels of living systems.
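
One way to see the direction of the hypothesis is a toy channel calculation, offered purely as an illustration and not as the paper's derivation: if each retransmission adds independent Gaussian noise, the effective signal-to-noise ratio falls with the number of relays, and with it a rough count of reliably distinguishable categories. The signal and noise powers, and the sqrt(1 + S/N) approximation for distinguishable levels, are all assumptions of this sketch.

```python
import math

signal_power = 100.0      # assumed signal power
noise_per_hop = 4.0       # assumed noise power added at each retransmission

for hops in (1, 2, 4, 8):
    snr = signal_power / (noise_per_hop * hops)       # noise accumulates over the relays
    categories = math.sqrt(1.0 + snr)                 # rough distinguishable-level count
    print(f"{hops} retransmission(s): about {categories:.1f} distinguishable categories")
```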

Journal ArticleDOI
TL;DR: The initial findings show that present workload upstream (toward the beginning of the criminal justice system) affects future behavior of the system downstream, rather than the reverse, especially for the more serious crime types.
Abstract: Much has been written recently about the underlying motivation of criminals and about the behavior of the criminal justice system itself. Typically, such studies have shown negative correlations between crime rate, probability of apprehension and probability of conviction. However, in addressing the behavior of the system, the present study examines measures of workload for the criminal justice system rather than performance indicators such as those above. Preliminary evidence is examined over several provinces of Canada and is used to investigate the hypothesis that there is adaptation by different subsystems in the criminal justice system to changing workloads in other parts. The initial findings show that present workload upstream (i.e., toward the beginning of the system) affects future behavior of the system downstream (i.e., toward the end), rather than the reverse, especially for the more serious crime types.

Journal ArticleDOI
TL;DR: In this article, the Murngin tribe's marriage structures are analyzed graphically, rather than merely delineated, and the dynamics result from the evolution of societies from one homomorphic structure to another.
Abstract: This article concerns interrelated variables in the subsystems that produce the next generation in particular societal systems of various Australian tribes. Special attention is directed to the Murngin tribe. Previous analytic approaches have been static, employing group theory. This literature is critically reviewed and then a dynamic analysis is suggested as an improved procedure. In this approach tribal marriage structures are analyzed, and not merely delineated, graphically. The dynamics result from the evolution of societies from one homomorphic structure to another. Foundations are laid using such analytic methods to study marriage structures that are less rigid than those of the Australian tribes.

Journal ArticleDOI
TL;DR: The authors examined the effects of the prior relationship between the bargaining parties, constituent pressure on the bargaining representative, and the counterpart's strategy on decision making by bargaining in systems at the levels of groups, organizations, societies, and supranational systems and found that a cooperative-competitive counterpart strategy gave a higher level of overall subject cooperation, but a more competitive final decision, a less favorable attitude toward the counterpart, and less willingness to retain the counterpart in future bargaining.
Abstract: This article concerns decision making by bargaining in systems at the levels of groups, organizations, societies, and supranational systems. A 2 × 2 × 2 (conflictual or peaceful × strong or weak × cooperative-competitive or competitive-cooperative) factorial design was used to examine the effects of the prior relationship between the bargaining parties, constituent pressure on the bargaining representative, and the counterpart's strategy. The dependent variables were subject decisions in the bargaining simulation, attitudinal factors, and future intentions. Data were collected from 160 male subjects in a laboratory simulation of group interactions. The task consisted of assuming the role of a management bargaining representative and making ten bargaining decisions on various issues. Results indicated that a cooperative-competitive counterpart strategy gave a higher level of overall subject cooperation, but a more competitive final decision, a less favorable attitude toward the counterpart, and less willingness to retain the counterpart in future bargaining. Strong constituent pressure produced greater subject competitiveness, but a more negative attitude toward the constituent. Peaceful prior relations resulted in a more cooperative initial decision, but did not affect the overall level of cooperation. Interactive effects showed that subjects encountering a cooperative-competitive strategy and conflictual prior relations were more willing to participate in future bargaining when they received strong constituent pressure. Conversely, if a competitive-cooperative strategy and peaceful relations occurred, there was more willingness to participate in the future when weak constituent pressure existed.

Journal ArticleDOI
TL;DR: In this article, the authors present a model of participation in discussion aimed at explaining the deviations observed in testing earlier models dealing with this question, which are accounted for by lack of homogeneity of the group of discussants, who may form not one, but two (or more) subgroups, each with its own ranking.
Abstract: This article deals with decision processes concerning who shall speak in discussions in living systems at the level of the group. It presents a model of participation in discussion, aimed at explaining the deviations observed in testing earlier models dealing with this question. The deviations are accounted for by lack of homogeneity of the group of discussants, who may form not one, but two (or more) subgroups, each with its own ranking.

Journal ArticleDOI
TL;DR: Negative information concerns the introduction of confusion (which increases the number of alternatives under consideration) into the process of obtaining information in order to deduce the correct message from among a set of possible alternatives.
Abstract: Negative information concerns the introduction of confusion (which increases the number of alternatives under consideration) into the process of obtaining information in order to deduce the correct message from among a set of possible alternatives. This intuitive description can be represented by a graph theoretic interpretation. An analogy to the game of “twenty questions” is depicted in terms of standard binary trees. Normative theory imposes certain constraints on the corresponding path through the trees associated with the sequential process, in that each edge of the tree should correspond to the procurement of one bit of information. Negative information occurs when there is a deviation from either the tree or the path. A classification scheme is developed to distinguish 12 common instances associated with the concept.
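
The bit-counting intuition can be made concrete with a small sketch, constructed here for illustration rather than taken from the paper's 12-case classification: measure each answer's contribution as log2 of the ratio of candidate-set sizes before and after it, so an answer that enlarges the candidate set contributes negative information. The game size and the sequence of candidate-set sizes are assumptions.

```python
import math

def information_gained(before, after):
    # bits contributed by an answer that narrows (or widens) the candidate set
    return math.log2(before / after)

candidates = 16                          # a "twenty questions" game over 16 alternatives
after_each_answer = [8, 4, 8, 4, 2, 1]   # the third answer reintroduces confusion
for after in after_each_answer:
    bits = information_gained(candidates, after)
    print(f"{candidates:2d} -> {after:2d} alternatives: {bits:+.0f} bit(s)")
    candidates = after
```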