
Showing papers in "Management Science in 1981"


Journal ArticleDOI
TL;DR: A model for measuring the efficiency of Decision Making Units (DMUs) is presented, along with related methods of implementation and interpretation; the DEA approach also suggests the possibility of new approaches obtained from PFT-NFT combinations which may be superior to either of them alone.
Abstract: A model for measuring the efficiency of Decision Making Units (DMUs) is presented, along with related methods of implementation and interpretation. The term DMU is intended to emphasize an orientation toward managed entities in the public and/or not-for-profit sectors. The proposed approach is applicable to the multiple outputs and designated inputs which are common for such DMUs. A priori weights, or imputations of a market-price-value character, are not required. A mathematical programming model applied to observational data provides a new way of obtaining empirical estimates of extremal relations, such as the production functions and/or efficient production possibility surfaces that are a cornerstone of modern economics. The resulting extremal relations are used to envelop the observations in order to obtain the efficiency measures that form a focus of the present paper. An illustrative application utilizes data from Program Follow Through (PFT), a large-scale social experiment in public school education designed to test the advantages of PFT relative to designated Non-Follow Through (NFT) counterparts in various parts of the U.S. It is possible that the resulting observations are contaminated with inefficiencies due to the way DMUs were managed en route to assessing whether PFT as a program is superior to its NFT alternative. A further mathematical programming development is therefore undertaken to distinguish between "management efficiency" and "program efficiency." This is done via procedures referred to as Data Envelopment Analysis (DEA), in which one first obtains boundaries or envelopes from the data for PFT and NFT, respectively. These boundaries provide a basis for estimating the relative efficiency of the DMUs operating under these programs. These DMUs are then adjusted up to their program boundaries, after which a new inter-program envelope is obtained for evaluating the PFT and NFT programs with the estimated managerial inefficiencies eliminated. The claimed superiority of PFT fails to be validated in this illustrative application. Our DEA approach, however, suggests the additional possibility of new approaches obtained from PFT-NFT combinations which may be superior to either of them alone. Validating such possibilities cannot be done by statistical or other modeling alone; it requires recourse to field studies, including audits (e.g., of the U.S. General Accounting Office variety), and therefore ways in which the results of a DEA approach may be used to guide such further studies or audits are also indicated.

1,544 citations
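
To make the envelopment idea concrete, here is a minimal sketch of the input-oriented efficiency computation: one small linear program per DMU, with the empirical frontier implied by the constraints. The data are invented and the formulation is the textbook CCR envelopment form solved with scipy, not the paper's PFT/NFT implementation.

```python
# Minimal input-oriented DEA sketch: invented data, textbook CCR
# envelopment form -- not the paper's PFT/NFT implementation.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0, 3.0, 5.0],    # inputs:  rows = input types, cols = DMUs
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 2.0, 1.5, 2.5]])   # outputs: rows = output types

m, n = X.shape
for k in range(n):
    # decision vector z = [theta, lambda_1, ..., lambda_n]; minimize theta
    c = np.r_[1.0, np.zeros(n)]
    A_inputs = np.hstack([-X[:, [k]], X])                   # sum_j l_j x_ij <= theta * x_ik
    A_outputs = np.hstack([np.zeros((Y.shape[0], 1)), -Y])  # sum_j l_j y_rj >= y_rk
    b = np.r_[np.zeros(m), -Y[:, k]]
    res = linprog(c, A_ub=np.vstack([A_inputs, A_outputs]), b_ub=b,
                  bounds=[(None, None)] + [(0, None)] * n)
    print(f"DMU {k}: efficiency = {res.x[0]:.3f}")
```

A DMU scoring 1.0 lies on the empirical envelope; a score below 1.0 says its inputs could be proportionally contracted while still producing its outputs.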


Journal ArticleDOI
TL;DR: The results strongly suggest that users who hold realistic expectations prior to implementation are more satisfied with the system and use it more than users whose pre-implementation expectations are unrealistic.
Abstract: Much of the research on MIS implementation which has been conducted in the past decade has focused on identifying and measuring the organizational characteristics which appear to be particularly co...

699 citations


Journal ArticleDOI
TL;DR: This paper explicitly recognizes the nature of F&P as future-oriented decision making activities and, as such, their dependence upon judgmental inputs, and suggests reconceptualizing F&P through use of decision-theoretic concepts.
Abstract: The formal practice of forecasting and planning (F&P) has risen to prominence within a few decades and now receives considerable attention from both academics and practitioners. This paper explicitly recognizes the nature of F&P as future-oriented decision making activities and, as such, their dependence upon judgmental inputs. A review of the extensive psychological literature on human judgmental abilities is provided from this perspective. It is argued that many of the numerous information processing limitations and biases revealed in this literature apply to tasks performed in F&P. In particular, the "illusion of control," accumulation of redundant information, failure to seek possible disconfirming evidence, and overconfidence in judgment are liable to induce serious errors in F&P. In addition, insufficient attention has been given to the implications of numerous studies that show that the predictive judgment of humans is frequently less accurate than that of simple quantitative models. Applied studies of F&P are also reviewed and shown to mirror many of the findings from psychology. The paper subsequently draws implications from these reviews and suggests reconceptualizing F&P through use of decision-theoretic concepts. At the organizational level this involves recognizing that F&P may perform many, often conflicting, manifest and latent functions which should be identified and evaluated through a multi-attribute utility framework. Operationally, greater use should be made of sensitivity analysis and the concept of the value of information.

600 citations


Journal ArticleDOI
TL;DR: Inferences or decisions in the face of uncertainty should be based on all available information, as when probability distributions for an uncertain quantity are obtained from sources such as experts or models.
Abstract: Inferences or decisions in the face of uncertainty should be based on all available information. Thus, when probability distributions for an uncertain quantity are obtained from experts, models, or...

456 citations


Journal ArticleDOI
TL;DR: In this article, a case study is used to estimate the parameters and the problem is solved within a geometric programming framework; an extensive comparison with alternative procedures suggests this general model leads to significantly different allocation rules and superior profit performance.
Abstract: The allocation of scarce shelf space among competing products is a central problem in retailing. Space allocation affects store profitability both through the demand function, where both main and cross space elasticities have to be considered, and through the cost function (procurement, carrying, and out-of-stock costs). A model is developed which uniquely incorporates both effects. A case study is used to estimate the parameters and the problem is solved within a geometric programming framework. An extensive comparison with alternative procedures suggests this general model leads to significantly different allocation rules and superior profit performance.

404 citations
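
The demand side of such a model is typically multiplicative in shelf space, with main and cross elasticities; the toy sketch below shows how an allocation that respects cross effects can be computed. All numbers are invented, the cost side is omitted for brevity, and a general-purpose solver stands in for the paper's geometric programming.

```python
# Toy shelf-space allocation: multiplicative demand with main and cross
# space elasticities, maximized over a fixed total shelf length.
# (Invented numbers; cost terms omitted; not the paper's case-study model.)
import numpy as np
from scipy.optimize import minimize

a = np.array([10.0, 8.0, 6.0])        # demand scale constants
b = np.array([0.30, 0.25, 0.20])      # main (own-space) elasticities
c = np.array([[0.00, -0.05, -0.02],   # cross elasticities c[i, j], diag = 0
              [-0.04, 0.00, -0.03],
              [-0.02, -0.01, 0.00]])
margin = np.array([1.0, 1.2, 0.9])    # unit margins
S = 30.0                              # total shelf space available

def neg_profit(s):
    log_q = np.log(a) + b * np.log(s) + c @ np.log(s)   # log demand per item
    return -np.sum(margin * np.exp(log_q))

res = minimize(neg_profit, x0=np.full(3, S / 3), method="SLSQP",
               bounds=[(0.5, S)] * 3,
               constraints=[{"type": "eq", "fun": lambda s: s.sum() - S}])
print("allocation:", res.x.round(2), " profit:", round(-res.fun, 2))
```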


Journal ArticleDOI
TL;DR: In this paper, a technical system for place and show betting is formulated as a nonlinear program, and a model is developed to demonstrate that the profits are not due to chance but rather to proper identification of market inefficiencies.
Abstract: Many racetrack bettors have systems. Since the track is a market similar in many ways to the stock market, one would expect that the basic strategies would be either fundamental or technical in nature. Fundamental strategies utilize past data available from racing forms, special sources, etc. to "handicap" races. The investor then wagers on one or more horses whose probability of winning exceeds that determined by the odds by an amount sufficient to overcome the track take. Technical systems require less information and only utilize current betting data. They attempt to find inefficiencies in the "market" and bet on such "overlays" when they have positive expected value. Previous studies and our data confirm that for win bets these inefficiencies, which exist for underbet favorites and overbet longshots, are not sufficiently great to result in positive profits. This paper describes a technical system for place and show betting for which it appears to be possible to make substantial positive profits and thus to demonstrate market inefficiency in a weak form sense. Estimated theoretical probabilities of all possible finishes are compared with the actual amounts bet to determine profitable betting situations. Since the amount bet influences the odds, and since theory suggests that a logarithmic utility function is appropriate for maximizing long-run growth, the resulting model is a nonlinear program. Side calculations generally reduce the number of possible bets in any one race to three or fewer, hence the actual optimization is quite simple. The system was tested on data from Santa Anita and Exhibition Park, using exact and approximate solutions that make the system operational at the track given the limited time available for placing bets, and was found to produce substantial positive profits. A model is developed to demonstrate that the profits are not due to chance but rather to proper identification of market inefficiencies.

245 citations
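
The heart of such a system is the log-utility bet-sizing step, in which the bettor's own wager dilutes the pari-mutuel payoff. Below is a stripped-down single-bet version; the pool sizes, the probability estimate, and the simplified payoff formula are all invented for illustration, whereas the paper's program sizes several place and show bets jointly.

```python
# Kelly-style bet sizing sketch: maximize expected log wealth for one
# place bet, with a stylized pari-mutuel payoff diluted by our own wager.
# (All figures invented; the paper's payoff model is more detailed.)
import numpy as np
from scipy.optimize import minimize_scalar

w0 = 1000.0      # current wealth
p = 0.30         # estimated probability the horse places
Q = 50_000.0     # total place pool
q = 10_000.0     # amount already bet on this horse
take = 0.17      # track take

def neg_exp_log_wealth(x):
    payoff = x * (1 - take) * (Q + x) / (q + x)   # stylized gross payoff
    return -(p * np.log(w0 - x + payoff) + (1 - p) * np.log(w0 - x))

res = minimize_scalar(neg_exp_log_wealth, bounds=(0.0, 0.5 * w0),
                      method="bounded")
print(f"log-optimal bet: ${res.x:.0f}")
```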


Journal ArticleDOI
TL;DR: In this article, the authors integrate psychological theories on how consumers process information with economic models and psychometric measurement, and develop a theory for selecting physical features and price to achieve a profit maximizing perceptual position.
Abstract: An important component of marketing strategy is to "position" a product in perceptual space. But to realize a perceptual position we must model the link from physical characteristics to perceptual dimensions, and we must use this model to maximize profit. This paper integrates psychological theories on how consumers process information with economic models and psychometric measurement, and it develops a theory for selecting physical features and price to achieve a profit-maximizing perceptual position. The theory begins with a Lancaster-like transformation from goods space to characteristics space and investigates the implications of a mapping to a third space, perceptual space. We show that all products efficient in perceptual space must be efficient in characteristics space, but not conversely. But consumers vary in their preferences and in the way they perceive products. Thus we introduce distributional components to the theory and derive both geometric and analytic methods to incorporate this consumer heterogeneity. Next, we investigate costs and derive a perceptual expansion path along which a profit-maximizing position must exist. Conjoint analysis and quantal choice models provide the measurements to implement the theory. The theory is illustrated with a hypothetical example from the analgesics market and some of the potential psychometric measurements are illustrated with an empirical application in the communications market.

227 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examine how expectations concerning earnings per share affect share price, showing that knowledge of analysts' forecasts of earnings cannot by itself lead to excess returns, and that much larger excess returns are earned if one is able to determine those stocks for which analysts most underestimate return.
Abstract: It is generally believed that security prices are determined by expectations concerning firm and economic variables. Despite this belief there is very little research examining expectational data. In this paper we examine how expectations concerning earnings per share affect share price. We first show that knowledge concerning analysts' forecasts of earnings per share cannot by itself lead to excess returns. Any information contained in the consensus estimate of earnings per share is already included in share price. Investors or managers who buy high growth stocks, where high growth is determined by consensus beliefs, should not earn an excess return. This is not due to earnings having no effect upon share price, since knowledge of actual earnings leads to excess returns. Much larger excess returns are earned if one is able to determine those stocks for which analysts most underestimate return. Finally, the largest returns can be earned by knowing the stocks for which analysts will make the greatest revision in their estimates. This pattern of results suggests that share price is affected by expectations about earnings per share. Given any degree of forecasting ability, managers can obtain the best results by acting on the differences between their forecasts and consensus forecasts.

224 citations


Journal ArticleDOI
TL;DR: This paper investigates two alternative strategies for implementing Decision Support Systems (DSS), evolutionary and traditional; results indicate significantly higher utilization of the DSS with the evolutionary approach.
Abstract: This paper investigates two alternative strategies for implementing Decision Support Systems (DSS): evolutionary and traditional. The evolutionary approach utilizes judgment modeling (bootstrapping) as a means to create felt need, to provide insight into the decision process and the implied weighting of decision variables, and to establish a learning-based, participatory implementation strategy. In contrast, the traditional approach is characterized by a problem solving orientation wherein the DSS is portrayed as providing a valuable "product" that can be theoretically justified. Decision making in a simulated production environment is used to test the alternative strategies. Decision style is also included in the experimental design. Results indicate significantly higher utilization of the DSS with the evolutionary approach. Decision style findings are consistent with reported research.

217 citations


Journal ArticleDOI
TL;DR: The authors identified the determinants of information value and synthesized some general results concerning their effects, while some attributes of an information system exhibit a consistent directional effect on information value, attributes of the decision setting and decision maker do not.
Abstract: This paper identifies the determinants of information value and synthesizes some general results concerning their effects. While some attributes of an information system exhibit a consistent directional effect on information value, attributes of the decision setting and decision maker do not.

215 citations


Journal ArticleDOI
TL;DR: The emphasis is on the practical problems and potential of applying the method in the simulation of complex systems and the conditions under which control variables could be profitably applied in practical simulations.
Abstract: This is a survey paper on the application of control variables to increase the efficiency of discrete event simulations. The emphasis is on the practical problems and potential of applying the method in the simulation of complex systems. The basic theory of control variables is reviewed and the equivalence of control variables and multiple estimators is discussed. Techniques for generating control variables are described. Inefficiencies resulting from the statistical estimation of control variable coefficients and the problem of confidence interval generation are treated. This is done within the context of both the method of independent replications and the regenerative method. The application literature is reviewed and the conditions under which control variables could be profitably applied in practical simulations are described. Finally, there is a set of recommended directions for future research.
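
A self-contained illustration of the basic idea the survey reviews, on a toy problem rather than a complex-system simulation: estimate E[e^U] for U ~ Uniform(0,1), using U itself, whose mean is known to be 0.5, as the control variable.

```python
# Control-variate demo: the controlled estimator has the same mean as the
# crude one but a visibly smaller standard error.
import numpy as np

rng = np.random.default_rng(0)
u = rng.random(100_000)
y = np.exp(u)      # crude estimator samples; true mean is e - 1
c = u              # control variable with known mean 0.5

beta = np.cov(y, c)[0, 1] / np.var(c, ddof=1)   # estimated optimal coefficient
y_cv = y - beta * (c - 0.5)                     # controlled samples

for name, s in (("crude", y), ("controlled", y_cv)):
    print(f"{name:10s} mean = {s.mean():.5f}  s.e. = {s.std(ddof=1) / len(s) ** 0.5:.6f}")
print(f"true value      = {np.e - 1:.5f}")
```

Note that beta is itself estimated from the same run, which is exactly the source of the bias and confidence-interval complications the survey treats.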

Journal ArticleDOI
TL;DR: An inventory system which maintains stock to meet both high and low priority demands, suggested by the operation of a spare parts pool in a military depot, is considered; the objective is to develop methods for comparing fill rates when there is rationing and when there is no rationing.
Abstract: This paper considers an inventory system which maintains stock to meet both high and low priority demands. This model is suggested by the operation of a spare parts pool in a military depot: high priority demands are those which might result in the grounding of an aircraft, for example, while low priority demands are those which arise from the routine restocking of base level inventories. We analyze the following type of control policy: there is a support level, say K > 0, such that when the level of on-hand stock reaches K, all low priority demands are backordered while high priority demands continue to be filled. Both continuous review and periodic review systems are considered. The objective of the analysis is to develop methods for comparing fill rates when there is rationing and when there is no rationing, for specified values of the reorder point, order quantity and support level.
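
A rough simulation of the support-level policy makes the trade-off visible: raising K protects the high-priority fill rate at the expense of the low-priority one. All parameters are invented, demand is Poisson, and backorder bookkeeping is reduced to counting unmet demand; the paper develops analytical comparisons instead.

```python
# Continuous-review (Q, r) inventory with a support level K: when on-hand
# stock is at or below K, only high-priority demand is filled.
# (Invented parameters; a crude day-by-day simulation, not the paper's analysis.)
import numpy as np

def fill_rates(K, Q=50, r=20, lead=5, days=100_000, lam_hi=1.0, lam_lo=3.0):
    rng = np.random.default_rng(1)
    on_hand, pipeline = Q, []                # pipeline holds order arrival days
    filled = {"hi": 0, "lo": 0}
    demand = {"hi": 0, "lo": 0}
    for t in range(days):
        on_hand += Q * sum(1 for d in pipeline if d == t)   # receive orders
        pipeline = [d for d in pipeline if d > t]
        for cls, lam in (("hi", lam_hi), ("lo", lam_lo)):
            for _ in range(rng.poisson(lam)):
                demand[cls] += 1
                if on_hand > (0 if cls == "hi" else K):     # rationing rule
                    on_hand -= 1
                    filled[cls] += 1
        if on_hand + Q * len(pipeline) <= r:                # reorder point
            pipeline.append(t + lead)
    return filled["hi"] / demand["hi"], filled["lo"] / demand["lo"]

for K in (0, 5, 10):
    hi, lo = fill_rates(K)
    print(f"K = {K:2d}: high-priority fill = {hi:.3f}, low-priority fill = {lo:.3f}")
```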

Journal ArticleDOI
TL;DR: This paper addresses the problem of n jobs to be scheduled on a single machine in such a way that flow time variation is minimized and a heuristic method for scheduling is proposed.
Abstract: This paper addresses the problem of n jobs to be scheduled on a single machine in such a way that flow time variation is minimized. When the measure of variation is the total absolute difference of completion times (TADC), the problem is shown to be quite simple. Sufficient conditions are shown for minimal TADC and a simple method for generating an optimal solution is provided. When the measure of variation is variance of flow time, the problem is much more difficult. For this case a heuristic method for scheduling is proposed. The heuristic is simple and provides solutions which compare favorably with others found in the literature.
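
Why the TADC case is simple: for a fixed sequence, TADC = sum_k (k-1)(n-k+1) p_[k], where p_[k] is the processing time in position k, so by the rearrangement inequality the longest jobs belong in the positions with the smallest weights. The sketch below checks this rule against brute force on arbitrary example data (our illustration of the flavor of result involved, not the paper's own proof).

```python
# TADC = total absolute difference of completion times. Greedy rule:
# assign the longest jobs to the positions with the smallest weights
# (k-1)(n-k+1); verified here against brute-force enumeration.
from itertools import permutations

def tadc(seq):
    comp, t = [], 0
    for p in seq:
        t += p
        comp.append(t)                      # completion times
    return sum(abs(a - b) for i, a in enumerate(comp) for b in comp[i + 1:])

p = [2, 7, 3, 9, 4]                         # arbitrary processing times
n = len(p)
weights = [(k - 1) * (n - k + 1) for k in range(1, n + 1)]

sched = [0] * n
positions = sorted(range(n), key=lambda k: weights[k])
for pos, job in zip(positions, sorted(p, reverse=True)):
    sched[pos] = job                        # longest job -> smallest weight

print("greedy TADC:        ", tadc(sched))
print("brute-force optimum:", min(tadc(s) for s in permutations(p)))
```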

Journal ArticleDOI
TL;DR: This note extends the work reported in Payne, Laughhunn, and Crum on the need to incorporate a target return, reference point, or aspiration level concept in the analysis of risky choice behavior.
Abstract: This Note extends the work reported in Payne, Laughhunn, and Crum (Payne, J. W., D. J. Laughhunn, R. Crum. 1980. Translation of gambles and aspiration level effects in risky choice behavior. Management Sci. 26 1039-1060) on the need to incorporate a target return, reference point, or aspiration level concept in the analysis of risky choice behavior. Two experiments are reported. The first experiment provides a more complete test of the model of reference point effects developed by Payne, Laughhunn, and Crum. A translation of outcomes procedure, which adds a constant to all outcomes, was used to vary the relationship of pairs of gambles to an assumed target or reference point. The results fully support the model. The second experiment provides evidence of the conceptual validity of the model by using explicit instructions to vary the target levels of managers, while holding gamble values constant.

Journal ArticleDOI
TL;DR: A repeat-purchase diffusion model is developed, incorporating the effect of marketing variables (detailing force effects in particular) as well as a word-of-mouth effect, to forecast and control the rate of sales for a new product.
Abstract: This paper develops a model and an associated estimation procedure to forecast and control the rate of sales for a new product. A repeat-purchase diffusion model is developed, incorporating the effect of marketing variables (detailing force effects in particular) as well as a word-of-mouth effect. Bayesian estimation, with priors developed from past products, is used to estimate and update the parameters of the model. The procedure is used to develop marketing policies for new product introduction.
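
The paper's model is richer than this (detailing-force effects and Bayesian parameter updating), but a bare-bones repeat-purchase diffusion in discrete time, with invented coefficients, shows the kind of sales path such a model forecasts:

```python
# Bass-style trial diffusion plus a constant repeat-purchase rate.
# (Invented coefficients; a simplification of the paper's model.)
M, p, q, r = 100_000, 0.02, 0.35, 0.15   # market size, innovation,
                                         # imitation, repeat rates
N, sales = 0.0, []                       # N = cumulative triers
for t in range(24):                      # months since launch
    trial = (p + q * N / M) * (M - N)    # new triers (word-of-mouth term)
    N += trial
    sales.append(trial + r * N)          # trial plus repeat purchases
print([round(s) for s in sales[:8]])
```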

Journal ArticleDOI
TL;DR: This paper provides a recursive procedure to solve knapsack problems; the method differs from classical optimization algorithms of convex programming in that it determines at each iteration the optimal value of at least one variable.
Abstract: The allocation of a specific amount of a given resource among competitive alternatives can often be modelled as a knapsack problem. This model formulation is extremely efficient because it allows a convex cost representation with bounded variables to be solved without great computational effort. Practical applications of this problem abound in the fields of operations management, finance, manpower planning, marketing, etc. In particular, knapsack problems emerge in hierarchical planning systems when a first level of decisions needs to be further allocated among specific activities which have been previously treated in an aggregate way. In this paper we provide a recursive procedure to solve such problems. The method differs from classical optimization algorithms of convex programming in that it determines at each iteration the optimal value of at least one variable. Applications and computational results are presented.
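
A pegging-style recursion in this spirit is easy to sketch. The quadratic costs below are our own choice, picked so each relaxed subproblem has a closed form; the essential feature matches the description above, in that every pass either terminates or fixes at least one variable at its bound.

```python
# Pegging recursion for a continuous knapsack with convex (here quadratic)
# costs: min sum a_i * x_i^2  s.t.  sum x_i = R, 0 <= x_i <= u_i.
# (Illustrative stand-in, not the paper's exact algorithm.)
def allocate(a, u, R):
    free = list(range(len(a)))
    x = [0.0] * len(a)
    while free:
        lam = R / sum(1.0 / (2 * a[i]) for i in free)   # equal marginal cost
        trial = {i: lam / (2 * a[i]) for i in free}
        pegged = [i for i in free if trial[i] > u[i]]
        if not pegged:                                   # all bounds respected
            for i in free:
                x[i] = trial[i]
            return x
        for i in pegged:                                 # fix at upper bound
            x[i] = u[i]
            R -= u[i]
            free.remove(i)
    return x

print(allocate(a=[1.0, 2.0, 4.0], u=[3.0, 3.0, 3.0], R=6.0))
# -> [3.0, 2.0, 1.0]: the two free variables end with equal marginal costs
```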

Journal ArticleDOI
TL;DR: The results of the experiment provide limited support for the use of graphics presentation in an information system; decision or cognitive style also appears to be an important variable influencing the performance of an individual and the reaction to an information system.
Abstract: This paper presents the results of an experiment designed to investigate the impact of computer-based graphics on decision making. The experimental task consisted of selecting quarterly reorder quantities for an importer under conditions of uncertain demand. Subjects in the experiment were participants in an executive program for middle and upper level managers. Each subject received information on the cumulative probability distribution of demand and had an opportunity to run up to eight trial simulations with past demand data using his or her order quantities. After completing the trial simulations, the subjects made quarterly ordering decisions for one year in which the quantities demanded were drawn from the demand distribution. Treatments included the use of a hard copy terminal and five different types of displays on a CRT. The results of the experiment provide limited support for the use of graphics presentation in an information system. Decision or cognitive style also appears to be an important variable influencing the performance of an individual and the reaction to an information system. The implications of the findings for the design of information systems are discussed.

Journal ArticleDOI
TL;DR: This paper presents a simplex-based solution procedure for the multiple objective linear fractional programming problem that solves for all weakly efficient vertices of the augmented feasible region.
Abstract: This paper presents a simplex-based solution procedure for the multiple objective linear fractional programming problem. By (1) departing slightly from the traditional notion of efficiency and (2) augmenting the feasible region as in goal programming, the solution procedure solves for all weakly efficient vertices of the augmented feasible region. The article discusses the difficulties that must be addressed in multiple objective linear fractional programming and motivates the solution algorithm that is developed.

Journal ArticleDOI
TL;DR: A highly automated, real-time dispatch system is described which uses embedded optimization routines to replace extensive manual operations and to substantially reduce operating costs for a nation-wide fleet of petroleum tank trucks.
Abstract: A highly automated, real-time dispatch system is described which uses embedded optimization routines to replace extensive manual operations and to substantially reduce operating costs for a nation-wide fleet of petroleum tank trucks. The system is currently used in daily operations by the Order Entry and Dispatch segment of the Chevron U.S.A. Marketing System. Refined petroleum products valued at several billion dollars per year are dispatched from more than 80 bulk terminals on a fleet exceeding 300 vehicles in approximately 2600 loads per day. Centralized use of the dispatch system required its design and implementation as a set of transaction modules within a large management information system. This environment presents special challenges for the optimization methods; a heuristic sequential network assignment was developed for certified performance on these dispatch models in lieu of their solution as integer programs. Objectives include minimizing transportation costs approaching $100 million annually while maintaining equitable man and equipment workload distribution, safety standards, and customer service, and satisfying equipment compatibility restrictions.

Journal ArticleDOI
TL;DR: In this paper, a portfolio selection procedure was developed to assist the U.S. Department of Energy in selecting a portfolio of solar energy applications experiments, where the technical quality of each proposed applications experiment was summarized through the use of multiple evaluation measures, or attributes.
Abstract: This article reports a procedure developed to assist the U.S. Department of Energy in selecting a portfolio of solar energy applications experiments. The procedure has also been used in other government procurements and appears to be applicable in a variety of project funding processes. The technical quality of each proposed applications experiment was summarized through the use of multiple evaluation measures, or attributes. These were combined into a single index of the overall technical quality of an experiment through the use of a multiattribute utility function. Recently derived results in measurable value theory were applied to derive an index of the overall technical quality of a portfolio of experiments. Budgetary and programmatic issues were handled through the use of constraints. This approach allowed the portfolio selection problem to be formulated as an integer linear program. Details of the application are presented, including a discussion of the data requirements and assessment procedure used. The portfolio selection procedure was successfully applied, and variations of it have been successfully used in four other solar energy procurements.
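
As a much-simplified stand-in for the model described above (invented costs and scores, an additive score in place of the measurable-value portfolio index, and a single budget constraint in place of the programmatic ones), a dynamic-programming knapsack selects the portfolio:

```python
# 0/1 knapsack by dynamic programming: pick experiments maximizing total
# quality score within a budget. (Invented data; the paper used an integer
# linear program with richer constraints.)
def best_portfolio(cost, score, budget):
    dp = [(0.0, frozenset())] * (budget + 1)   # dp[b] = (best score, chosen set)
    for i in range(len(cost)):
        new = dp[:]
        for b in range(cost[i], budget + 1):
            cand = dp[b - cost[i]][0] + score[i]
            if cand > new[b][0]:
                new[b] = (cand, dp[b - cost[i]][1] | {i})
        dp = new
    return dp[budget]

cost = [4, 3, 5, 2, 6]                 # cost per proposed experiment ($M)
score = [7.0, 5.5, 8.0, 3.0, 9.5]      # overall technical-quality index
print(best_portfolio(cost, score, budget=10))   # -> (16.5, frozenset({0, 4}))
```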

Journal ArticleDOI
TL;DR: A simple computer-compatible algebraic notation scheme for identifying structure (facility interrelationships) within networks of the three types considered is discussed; surprisingly, two particularly naive heuristics also perform quite well in certain situations.
Abstract: Seven heuristic algorithms are discussed. Each can be used for production scheduling in an assembly network (a network where each work station has at most one immediate successor work station, but may have any number of immediate predecessor work stations), distribution scheduling in an arborescence network (a network where each warehouse or stocking point is supplied by at most one immediate predecessor stocking point, but may itself supply any number of immediate successor stocking points), and joint production-distribution scheduling in a conjoined assembly-arborescence network. The objective of each algorithm is to determine a production and/or product distribution schedule which satisfies final product demand and minimizes the sum of the average inventory holding costs and average fixed charges for processing (ordering, delivery, or setup costs) per period, over an infinite planning horizon. Exogenous demand for product is assumed to be deterministic, at a constant rate, and to occur only at "retail" facilities of the networks. On the basis of their performance in 11,000 computer generated problems, the seven heuristic methods are compared with each other and with a dynamic programming algorithm. The results indicate that for most of the network structures considered, the best heuristic is the method of steepest descent; the second best is a simple extension of a method originally developed by Crowston, Wagner, and Henshaw. The improved myopic procedure of Graves and Schwarz performs very well for some particular types of structures. Surprisingly, two particularly naive heuristics also perform quite well in certain situations. In addition to the computer simulation experiments, we also discuss a simple computer-compatible algebraic notation scheme for identifying structure (facility interrelationships) within networks of the three types considered.

Journal ArticleDOI
TL;DR: Issues raised by the study include the use of automated and controlled baseline strategies to study decision making in complex situations, the need to develop normative guidelines for use in turbulent, competitive environments, and the multidimensional nature of the functions of decision making in organizations.
Abstract: Are the costs of time and effort spent on analyzing decisions outweighed by benefits? This issue was examined in the context of a competitive business game where human teams were pitted against two kinds of simple-minded arbitrary decision rules: one where rules were applied consistently ("arbitrary-consistent"); the other where rules were subject to a random component ("arbitrary-random"). The arbitrary-consistent rules outperformed, on average, 41% of human opponents; the corresponding figure for arbitrary-random was 19%. These results are discussed within the more general context of consistency in decision making, which has received considerable attention in both the management and psychological literatures, albeit in the more restricted case of non-competitive and stable environments. Issues raised by the study include the use of automated and controlled baseline strategies to study decision making in complex situations, the need to develop normative guidelines for use in turbulent, competitive environments, and the multidimensional nature of the functions of decision making in organizations.

Journal ArticleDOI
TL;DR: An algorithm is presented for the constant due-date assignment problem in a dynamic job shop, and it is shown that the optimal lead time is the unique minimum point of a strictly convex function.
Abstract: This paper is concerned with the study of the constant due-date assignment policy in a dynamic job shop. Assuming that production times are randomly distributed, each job has a penalty cost that is some non-linear function of its due-date and its actual completion time. The due date is found by adding a constant to the time the job arrives at the shop. This constant time allowed in the shop is the lead time that a customer might expect between the time of placing the order and the time of delivery. The objective is to minimize the expected aggregate cost per job subject to restrictive assumptions on the priority discipline and the penalty functions. This aggregate cost includes (1) a cost that increases with increasing lead times, (2) a cost, proportional to tardiness, for jobs that are delivered after their due dates, and (3) a cost proportional to earliness for jobs that are completed prior to their due dates. We present an algorithm for solving this problem and show that the optimal lead time is the unique minimum point of a strictly convex function. The algorithm utilizes analytical procedures; computations can be made manually. No specific distributions are assumed; the distribution of the total time a job is in the shop is utilized by the algorithm. This distribution can be theoretical or empirical. An example of a production system is presented.
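
Because the expected aggregate cost is convex in the constant allowance, a numerical version is easy to set up. The cost rates and the gamma flow-time distribution below are invented; the paper's own algorithm is analytical and works from any theoretical or empirical flow-time distribution.

```python
# Expected cost of a constant due-date allowance k: lead-time cost plus
# expected tardiness and earliness penalties over sampled flow times.
# (Invented costs/distribution; illustrates the convexity the paper exploits.)
import numpy as np

rng = np.random.default_rng(2)
W = rng.gamma(shape=4.0, scale=2.5, size=50_000)   # sampled shop flow times

c_lead, c_tardy, c_early = 1.0, 10.0, 2.0

def expected_cost(k):
    return (c_lead * k
            + c_tardy * np.maximum(W - k, 0).mean()
            + c_early * np.maximum(k - W, 0).mean())

ks = np.linspace(0.0, 40.0, 801)
costs = [expected_cost(k) for k in ks]
best = ks[int(np.argmin(costs))]
print(f"optimal constant allowance ~ {best:.2f}  (expected cost {min(costs):.2f})")
```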

Journal ArticleDOI
TL;DR: This work presents an approach for periodically reallocating beds to services to minimize the expected overflows in a large public health care delivery system and suggests that the model is relatively "robust" with respect to the case under consideration.
Abstract: Due to changing patient loads and demand patterns over time, assigning bed complements for various medical services in a hospital is a recurring problem facing the administrators. For a large public health care delivery system, we present an approach for periodically reallocating beds to services to minimize the expected overflows. Using a queueing model to approximate the patient population dynamics for each service (with admission rates provided by forecasts), the expected overflows under each configuration are computed via a Normal loss integral. Bed allocation is done in two stages. First, we establish a base line requirement for each service so that it can handle a prescribed amount of patient load based on a yearly projection of demand. We then use marginal analysis to distribute the remaining beds to minimize the expected total average overflows while taking month-to-month demand fluctuations into account. The proposed model requires only a modest amount of computation because of several simplifying assumptions, which were tested for reasonableness. For the two largest services, we used empirical data to evaluate the nonhomogeneous Poisson representation of admissions, and we performed simulation experiments to assess the extent of the discrepancy in performance characteristics caused by ignoring the day-of-week effect on admission rates. In view of the intrinsic complexity of the underlying system, the results obtained from the validation studies suggest that the model is relatively "robust" with respect to the case under consideration. It is hoped that the simplicity of the model and the usefulness of the results will induce practitioners to use this type of formal analysis for bed allocation in an institutional setting on a routine basis.
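
The two computational ideas named above are compact enough to render directly: expected overflow from a Normal approximation via the Normal loss integral, then marginal analysis handing out the remaining beds one at a time. The census figures are fabricated, and the base line here is simply the mean load, a placeholder for the paper's yearly-projection stage.

```python
# Expected overflow via the Normal loss integral, then marginal analysis:
# each extra bed goes to the service whose expected overflow drops most.
# (Fabricated demand figures; a simplified version of the paper's two stages.)
import numpy as np
from scipy.stats import norm

def expected_overflow(beds, mu, sigma):
    z = (beds - mu) / sigma
    return sigma * (norm.pdf(z) - z * (1 - norm.cdf(z)))   # Normal loss integral

mu = np.array([110.0, 65.0, 40.0])     # mean census per service
sigma = np.array([12.0, 9.0, 7.0])
beds = np.ceil(mu).astype(int)         # base line: cover the mean load
extra = 40                             # beds left to distribute

for _ in range(extra):
    gain = [expected_overflow(b, m, s) - expected_overflow(b + 1, m, s)
            for b, m, s in zip(beds, mu, sigma)]
    beds[int(np.argmax(gain))] += 1

total = sum(expected_overflow(b, m, s) for b, m, s in zip(beds, mu, sigma))
print("allocation:", beds, " total expected overflow:", round(total, 2))
```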

Journal ArticleDOI
Yair Aharoni
TL;DR: The use of SOEs as instruments of public policy and the resulting clashes between these enterprises and private firms on the one hand and government and other controllers on the other, are causing concern.
Abstract: State-Owned Enterprises (SOEs) have become important instruments of social and economic policy in industrialized mixed economies and in developing countries. The use of SOEs as instruments of public policy, and the resulting clashes between these enterprises and private firms on the one hand and government and other controllers on the other, are causing concern. Public committees in different countries as well as international organizations have been searching for a positive theory for guidance in handling the multitude of problems related to these enterprises. Theoretical models have made important contributions to the formalization of certain problems and the classification of the information needed to solve them. Unfortunately, these theoretical models have had little relevance for the solution of important real problems. Much of the research on SOEs is concerned with how these enterprises should behave, and what should be the product of their operations. Almost no research has been done on why SOEs function as they do. The paucity of knowledge about the operation of SOEs stems both from insufficient research effort and from the concern of researchers with the formal structures and products of these organizations rather than with management behavior or decision processes. The purpose of this paper is to call for research beyond the confines of traditional economics, using the tools of management science to obtain insights into the difficult but salient problems of SOEs.

Journal ArticleDOI
TL;DR: There is some effect of participation on the attitudes of participants, but no measurable effect on the amount of system usage, and the nature of system use was found to be related to inputs provided in the design process by participants.
Abstract: The concept of "participative systems design" should be as applicable to strategic management decision support systems as it is generally believed to be to lower-level systems. This study tests hypotheses concerning participative design using data describing the participation and performance of managers prior to, during, and after implementation of a participatively designed system. The study concludes that there is some effect of participation on the attitudes of participants, but no measurable effect on the amount of system usage. The nature of system use was found to be related to inputs provided in the design process by participants. No advantage was found for participants in terms of the "quality" of decision making.

Journal ArticleDOI
TL;DR: The problems considered are generalizations of both the classical project scheduling problem and the time-cost trade-off problem, and the properties of optimal schedules are given for strictly concave, concave and convex activity models.
Abstract: This paper deals with a class of project scheduling problems concerning the allocation of continuously divisible resources under conditions in which both total usage at every moment and total consumption over the period of project duration are constrained. Typical examples of such resources, called doubly constrained, are money or energy, when the constraint on power or the rate of expenditure cannot be ignored, as neither, of course, can the constraint on resource consumption. Manpower, too, must often be considered a doubly constrained resource. Mathematical models of project activities in which performing speeds are continuous functions of resource amounts are considered. The objective is a schedule which minimizes project duration. Thus, the problems considered are generalizations of both the classical project scheduling problem and the time-cost trade-off problem. The properties of optimal schedules are given for strictly concave, concave, and convex activity models. On the basis of these properties, methods for finding optimal schedules are described for independent and dependent activities. We also consider the minimum resource consumption ensuring minimum project duration for a given level of resource usage, and the minimum level of resource usage ensuring minimum project duration for a given level of resource consumption. Possible generalizations of the presented results are indicated.

Journal ArticleDOI
TL;DR: In this paper, a conceptual model is presented relating four sources of job-related ambiguity and two individual difference variables (locus of control and need for clarity) to salesperson job satisfaction and job performance.
Abstract: This study presents a conceptual model relating four sources of job-related ambiguity and two individual difference variables (locus of control and need for clarity) to salesperson job satisfaction and job performance. Previous research related to the model is briefly reviewed. Then, drawing data from a multicompany sample of industrial salespersons and their managers, behavioral research methods are used to clarify the nature and strength of the relationships in the model. The analysis reveals that ambiguity concerning family expectations is positively related to performance, but ambiguity regarding sales manager and customer expectations is negatively related to performance. Lower levels of satisfaction are explained primarily by ambiguous managerial expectations. The individual difference variables are shown to be related to job outcomes even after adjusting for different levels of perceived ambiguity. The individual difference variables, however, do not moderate the relationships between sources of ambiguity and job outcomes.

Journal ArticleDOI
TL;DR: In this paper, the authors compare CCP to stochastic programming with recourse (SPR) and conclude that CCP is seriously deficient as a modeling technique and of limited value as a computational device.
Abstract: Some important conceptual problems concerning the application of chance constrained programming (CCP) to risky practical decision problems are discussed by comparing CCP to stochastic programming with recourse (SPR). We expand on Garstka's distinction between mathematical equivalence and economic equivalence, showing that much of practical usefulness is lost in the transition between SPR and CCP. By examining the literature on CCP applications, we conclude that there is little evidence that CCP is used with the care that is necessary. Finally, we conclude that CCP is seriously deficient as a modeling technique and of limited value as a computational device.
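
The loss of economic content can be seen in a single variable. In the invented example below, the chance constraint fixes a service level alpha with no cost rationale behind it, while the recourse model prices the shortfall and derives its own critical ratio; the two prescriptions differ unless alpha happens to equal 1 - c/q.

```python
# One-variable contrast: chance-constrained capacity vs. recourse model.
# (Invented numbers; a textbook-style illustration of the paper's point.)
import numpy as np
from scipy.stats import norm

mu, sd = 100.0, 20.0    # demand ~ Normal(mu, sd)
c, q = 1.0, 4.0         # unit capacity cost, unit shortfall penalty

x_ccp = norm.ppf(0.95, mu, sd)         # CCP: P(x >= demand) >= 0.95
x_spr = norm.ppf(1 - c / q, mu, sd)    # SPR: newsvendor critical ratio

def total_cost(x):                     # capacity cost + expected shortfall cost
    z = (x - mu) / sd
    return c * x + q * sd * (norm.pdf(z) - z * (1 - norm.cdf(z)))

for name, x in (("CCP", x_ccp), ("SPR", x_spr)):
    print(f"{name}: x = {x:6.1f}, expected total cost = {total_cost(x):6.1f}")
```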

Journal ArticleDOI
TL;DR: In this article, Nash bargaining theory and recent developments in economic contract theory are employed in the analysis of the marketing channels and individual dyadic contracts involving payment schedules between members of a simple 3-level channel are investigated with particular reference to monitoring problems and intrachannel power relations.
Abstract: Nash bargaining theory and recent developments in economic contract theory are employed in the analysis of the marketing channels. Individual dyadic contracts involving payment schedules between members of a simple 3 level channel are investigated with particular reference to monitoring problems and intrachannel power relations. The interrelations between individual contracts are examined and the equilibrium set of contracts constituting the channel derived. The performance of the channel in terms of risk sharing, allocative efficiency and the distribution of gains is then evaluated. It is found that the risk aversion of channel members and the cost of monitoring and enforcement affect channel efficiency, and that under certain types of interdependencies and externalities, the nature of the power structure is crucial to channel efficiency.