
Showing papers in "Management Science in 1989"


Journal ArticleDOI
TL;DR: In this article, the authors address the ability to predict people's computer acceptance from a measure of their intentions, and to explain those intentions in terms of attitudes, subjective norms, perceived usefulness, perceived ease of use, and related variables.
Abstract: Computer systems cannot improve organizational performance if they aren't used. Unfortunately, resistance to end-user systems by managers and professionals is a widespread problem. To better predict, explain, and increase user acceptance, we need to better understand why people accept or reject computers. This research addresses the ability to predict people's computer acceptance from a measure of their intentions, and the ability to explain their intentions in terms of their attitudes, subjective norms, perceived usefulness, perceived ease of use, and related variables. In a longitudinal study of 107 users, intentions to use a specific system, measured after a one-hour introduction to the system, were correlated 0.35 with system use 14 weeks later. The intention-usage correlation was 0.63 at the end of this time period. Perceived usefulness strongly influenced people's intentions, explaining more than half of the variance in intentions at the end of 14 weeks. Perceived ease of use had a small but significant effect on intentions as well, although this effect subsided over time. Attitudes only partially mediated the effects of these beliefs on intentions. Subjective norms had no effect on intentions. These results suggest the possibility of simple but powerful models of the determinants of user acceptance, with practical value for evaluating systems and guiding managerial interventions aimed at reducing the problem of underutilized computer technology.

21,880 citations


Journal ArticleDOI
Ingemar Dierickx1, Karel Cool1
TL;DR: The authors show that the sustainability of a firm's asset position hinges on how easily its assets can be substituted or imitated, and that imitability is linked to the characteristics of the asset accumulation process: time compression diseconomies, asset mass efficiencies, interconnectedness, asset erosion, and causal ambiguity.
Abstract: Given incomplete factor markets, appropriate time paths of flow variables must be chosen to build required stocks of assets. That is, critical resources are accumulated rather than acquired in "strategic factor markets" (Barney [Barney, J. 1986. Strategic factor markets: Expectations, luck, and business strategy. Management Sci. October 1231-1241.]). Sustainability of a firm's asset position hinges on how easily assets can be substituted or imitated. Imitability is linked to the characteristics of the asset accumulation process: time compression diseconomies, asset mass efficiencies, interconnectedness, asset erosion and causal ambiguity.

8,271 citations


Journal ArticleDOI
TL;DR: In this paper, the authors report an experiment on the generation of macrodynamics from microstructure in a common managerial context, where subjects manage a simulated inventory distribution system which contains multiple actors, feedbacks, nonlinearities, and time delays.
Abstract: Studies in the psychology of individual choice have identified numerous cognitive and other bounds on human rationality, often producing systematic errors and biases. Yet for the most part models of aggregate phenomena in management science and economics are not consistent with such micro-empirical knowledge of individual decision-making. One explanation has been the difficulty of extending the experimental methods used to study individual decisions to aggregate, dynamic settings. This paper reports an experiment on the generation of macrodynamics from microstructure in a common managerial context. Subjects manage a simulated inventory distribution system which contains multiple actors, feedbacks, nonlinearities, and time delays. The interaction of individual decisions with the structure of the simulated firm produces aggregate dynamics which systematically diverge from optimal behavior. An anchoring and adjustment heuristic for stock management is proposed as a model of the subjects' decision processes. ...

2,209 citations
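The anchoring and adjustment heuristic the paper proposes can be made concrete with a small sketch: anchor the order on expected demand, then adjust partially for gaps in the stock and in the supply line. Parameter names and values below are illustrative assumptions, not the paper's estimates.

```python
def order_decision(expected_demand, desired_stock, stock,
                   desired_supply_line, supply_line,
                   alpha=0.3, beta=0.2):
    """Anchor on expected demand, then partially adjust for the stock gap
    and the supply-line gap; orders cannot be negative."""
    stock_adjustment = alpha * (desired_stock - stock)
    supply_line_adjustment = beta * (desired_supply_line - supply_line)
    return max(0.0, expected_demand + stock_adjustment + supply_line_adjustment)

# A decision maker holding 8 units against a 12-unit target, with 10 on
# order against a target pipeline of 8, would order about:
print(order_decision(expected_demand=4, desired_stock=12, stock=8,
                     desired_supply_line=8, supply_line=10))  # 4.8
```

A common reading of the experiment is that underweighting the supply line (beta well below alpha) is what produces the oscillation and amplification observed in the simulated firm.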



Journal ArticleDOI
TL;DR: In this article, the authors examined the effect on product development of project scope, i.e., the extent to which a new product is based on unique parts developed in-house.
Abstract: This paper examines the effect on product development of project scope: the extent to which a new product is based on unique parts developed in-house. Using data from a larger study of product development in the world auto industry, the paper presents evidence on the impact of scope on lead time and engineering productivity. Studies of the automotive supplier industry suggest that very different structures and relationships exist in Japan, the U.S., and Europe. Yet there has been little study of the impact of these differences on development performance. Further, little is known about the effects of different parts strategies (i.e. unique versus common or carryover parts) on development. The evidence presented here suggests that project scope differs significantly in the industry, even for comparable products. These differences in strategy, in turn, explain an important part of differences in performance. In particular, it appears that a distinctive approach to scope among Japanese firms—high levels of un...

858 citations


Journal ArticleDOI
TL;DR: Students of economics and finance participated in an intertemporal choice experiment that manipulated three dimensions in a 4 × 4 × 4 factorial design: scenario (postponing a receipt, postponing a payment, expediting a receipt, expediting a payment), time delay (0.5, 1, 2, and 4 years), and size of cashflow ($40, $200, $1000, and $5000).
Abstract: Two hundred and four students of economics and finance participated in an intertemporal choice experiment which manipulated three dimensions in a 4 × 4 × 4 factorial design: scenario (postponing a receipt, postponing a payment, expediting a receipt, expediting a payment), time delay (0.5, 1, 2, and 4 years), and size of cashflow ($40, $200, $1000, and $5000). Individual discount rates were inferred from the responses, and then used to test competitively four hypotheses regarding the behavior of discount rates. The classical hypothesis asserting that the discount rate is uniform across scenarios, time delays, and sums of cashflow was flatly rejected. A market segmentation approach was found lacking. The results support an implicit risk hypothesis according to which delayed consequences are associated with an implicit risk value, and an added compensation hypothesis which asserts that individuals require compensation for a change in their financial position.

838 citations
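The inference step described in the abstract reduces to simple arithmetic: given an indifference response, back out the rate that equates the present and delayed amounts. A minimal sketch with hypothetical responses (continuous compounding is our choice; the paper's exact elicitation may differ):

```python
import math

def implied_annual_rate(present_amount, future_amount, years):
    """Continuously compounded rate at which the subject is indifferent."""
    return math.log(future_amount / present_amount) / years

# A subject indifferent between $200 now and $260 in one year:
print(f"{implied_annual_rate(200, 260, 1):.1%}")    # ~26.2%
# The same subject indifferent between $5000 now and $5600 in one year:
print(f"{implied_annual_rate(5000, 5600, 1):.1%}")  # ~11.3% (a magnitude effect)
```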


Journal ArticleDOI
TL;DR: The authors propose a branch and bound method for the job-shop problem, based on one-machine scheduling problems and made more efficient by several propositions that limit the search tree through immediate selections.
Abstract: In this paper, we propose a branch and bound method for solving the job-shop problem. It is based on one-machine scheduling problems and is made more efficient by several propositions which limit the search tree by using immediate selections. It solved for the first time the famous 10 × 10 job-shop problem proposed by Muth and Thompson in 1963.

836 citations
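For intuition, one classical single-machine lower bound used inside job-shop branch and bound takes, over job subsets J with heads r, processing times p, and tails q, the value min r + sum p + min q. The brute-force sketch below is for illustration only; practical codes obtain the best such bound efficiently from preemptive one-machine schedules, and the paper strengthens the search further with immediate selections.

```python
from itertools import combinations

def one_machine_bound(jobs):
    """jobs: list of (head r, processing time p, tail q) triples.
    Returns max over subsets of min(r) + sum(p) + min(q)."""
    best = 0
    for k in range(1, len(jobs) + 1):
        for subset in combinations(jobs, k):
            lb = (min(r for r, p, q in subset)
                  + sum(p for r, p, q in subset)
                  + min(q for r, p, q in subset))
            best = max(best, lb)
    return best

print(one_machine_bound([(0, 3, 5), (1, 4, 2), (2, 2, 6)]))  # 11
```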


Journal ArticleDOI
TL;DR: Computer systems cannot improve organizational performance if they aren't used, and resistance to end-user systems by managers and professionals is a widespread problem.
Abstract: Computer systems cannot improve organizational performance if they aren't used. Unfortunately, resistance to end-user systems by managers and professionals is a widespread problem. To better predic...

749 citations


Journal ArticleDOI
TL;DR: Computation of the theoretical moments arising in importance sampling is discussed, and applications are given to a GI/G/1 queueing problem and to response surface estimation, with numerical examples.
Abstract: Importance sampling is one of the classical variance reduction techniques for increasing the efficiency of Monte Carlo algorithms for estimating integrals. The basic idea is to replace the original...

646 citations
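The identity behind the technique is E_f[h(X)] = E_g[h(X) f(X)/g(X)] for any proposal density g covering the support of h·f. A small self-contained illustration (our own toy example, not the paper's): estimating a Gaussian tail probability by sampling from a shifted proposal.

```python
import random, math

def normal_pdf(x, mu=0.0):
    return math.exp(-0.5 * (x - mu) ** 2) / math.sqrt(2 * math.pi)

def tail_prob_is(n=100_000, threshold=4.0):
    """Estimate P(X > threshold) for X ~ N(0,1) via importance sampling."""
    total = 0.0
    for _ in range(n):
        x = random.gauss(threshold, 1.0)       # draw from proposal g = N(4,1)
        if x > threshold:                      # h(x) is the indicator
            total += normal_pdf(x) / normal_pdf(x, mu=threshold)  # weight f/g
    return total / n

print(tail_prob_is())  # close to 3.17e-5; naive N(0,1) sampling would
                       # almost never see an exceedance at this sample size
```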


Journal ArticleDOI
Ingemar Dierickx1, Karel Cool1
TL;DR: The authors reply to comments on their 1989 Management Science paper on asset stock accumulation and the sustainability of competitive advantage.
Abstract: Authors' reply to comments regarding their paper: Dierickx, I., K. Cool. 1989. Asset stock accumulation and the sustainability of competitive advantage. Management Sci.

565 citations


Journal ArticleDOI
TL;DR: The results suggest that information presentation format influences decision time and the selection of acquisition and evaluation strategies by altering the cognitive costs and benefits of the task environment, and that the cognitive cost/benefit framework can provide a robust theoretical foundation for design decisions regarding graphical presentation formats in decision support systems.
Abstract: The designers of decision support systems lack theoretically based principles for designing graphical interfaces. The purpose of the reported research is to take a step toward developing such princ...

Journal ArticleDOI
TL;DR: A survey of over 50 representative problems in location research, focusing on problems for which operations research-type models have been developed, most of which are formulated as optimization problems.
Abstract: We present a survey of over 50 representative problems in location research. Our goal is not to review all variants of different location models or to describe solution results, but rather to provide a broad overview of major location problems that have been studied, indicating briefly how they are formulated and how they relate to one another. We review standard problems such as median, center, and warehouse location problems, as well as less traditional location problems which have emerged in recent years. Our primary focus is on problems for which operations research-type models have been developed. Most of the problems we review have been formulated as optimization problems.
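As one concrete instance of the "median" class mentioned above, the standard p-median problem locates p facilities to minimize total demand-weighted distance. In the usual notation (ours, not the survey's), with demand weights d_i, distances c_ij, assignment variables x_ij, and location variables y_j:

```latex
\min \sum_{i}\sum_{j} d_i c_{ij} x_{ij}
\quad\text{s.t.}\quad
\sum_{j} x_{ij} = 1 \;\;\forall i,\qquad
x_{ij} \le y_j \;\;\forall i,j,\qquad
\sum_{j} y_j = p,\qquad
x_{ij},\, y_j \in \{0,1\}.
```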

Journal ArticleDOI
TL;DR: A theoretical basis for scheduling mixed-model assembly lines and Just-In-Time (JIT) production systems is developed, and new scheduling algorithms and heuristics are presented.
Abstract: Mixed-model assembly lines are used to produce many different products without carrying large inventories. The effective utilization of these lines requires that a schedule for assembling the different products be determined. For Just-In-Time (JIT) production systems, which require producing only the necessary products in the necessary quantities at the necessary times, the objective is to keep a constant rate of usage of all parts used by the line. This is called levelling or balancing the schedule. This paper develops a theoretical basis for scheduling these systems, and presents new scheduling algorithms and heuristics.
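A minimal sketch of a usage-levelling rule in the spirit described here: at each position in the sequence, pick the product that keeps cumulative production closest to the ideal constant-rate line (a "goal chasing" style greedy rule; the paper's algorithms and optimality analysis go further).

```python
def level_schedule(demand):
    """demand: dict product -> units required. Returns a levelled sequence."""
    total = sum(demand.values())
    produced = {p: 0 for p in demand}
    sequence = []
    for k in range(1, total + 1):
        def deviation(p):
            # squared distance from the ideal cumulative output after slot k
            return sum((produced[q] + (q == p) - k * demand[q] / total) ** 2
                       for q in demand)
        best = min((p for p in demand if produced[p] < demand[p]), key=deviation)
        produced[best] += 1
        sequence.append(best)
    return sequence

print(level_schedule({"A": 3, "B": 2, "C": 1}))
# ['A', 'B', 'A', 'C', 'B', 'A'] -- usage of each product stays near its rate
```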

Journal ArticleDOI
TL;DR: In this paper, a Lagrangian relaxation of the capacity constraints of CLSP allows it to be decomposed into a set of uncapacitated single product lot sizing problems, which are solved by dynamic programming.
Abstract: This research focuses on the effect of setup time on lot sizing. The setting is the Capacitated Lot Sizing Problem (CLSP), the single-machine, multi-product lot sizing problem with nonstationary costs, demands, and setup times. A Lagrangian relaxation of the capacity constraints of the CLSP allows it to be decomposed into a set of uncapacitated single-product lot sizing problems. The Lagrangian dual costs are updated by subgradient optimization, and the single-item problems are solved by dynamic programming. A heuristic smoothing procedure constructs feasible solutions (production plans) which do not require overtime. The algorithm solves problems with setup time or setup cost. Problems with extremely tightly binding capacity constraints were much more difficult to solve than anticipated. Solutions without overtime could not always be found for them. The most significant results are that (1) the tightness of the capacity constraint is a good indicator of problem difficulty for problems with setup time; and (2) the algorithm solve...
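The multiplier update at the heart of subgradient optimization is compact enough to sketch: raise the price of capacity in overloaded periods and lower it (never below zero) elsewhere. The Polyak-style step size and all numbers below are illustrative assumptions, not the paper's settings.

```python
def subgradient_step(multipliers, usage, capacity, upper_bound, lower_bound):
    """One update: lambda_t <- max(0, lambda_t + step * (usage_t - capacity_t))."""
    violations = [u - c for u, c in zip(usage, capacity)]
    norm_sq = sum(v * v for v in violations) or 1.0
    step = (upper_bound - lower_bound) / norm_sq   # Polyak-style step size
    return [max(0.0, lam + step * v) for lam, v in zip(multipliers, violations)]

print(subgradient_step([0.0, 0.0], usage=[120, 80], capacity=[100, 100],
                       upper_bound=500, lower_bound=460))
# [1.0, 0.0]: the capacity price rises only in the overloaded period
```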

Journal ArticleDOI
TL;DR: A survey of more than 4,000 significant innovations and innovating firms in the UK from 1945-1983 shows that the scope and organisation of technological activities vary greatly as functions of firms' principal activities and size.
Abstract: A survey of more than 4,000 significant innovations and innovating firms in the UK from 1945-1983 shows that the scope and organisation of technological activities vary greatly as functions of firms' principal activities and size. 1. Technological opportunities and threats are greatest in firms in chemicals and engineering. Opportunities in such science-based and specialist-supplier firms in general emerge horizontally in related product markets and downstream in user sectors. In scale-intensive (e.g. steel, vehicles) and supplier-dominated (e.g. printing, construction) firms, opportunities tend to be upstream in related production technologies. Breakthrough innovations in science-based firms also induce clusters of technological opportunities upstream for suppliers, horizontally for partners, and downstream for users. Their effective exploitation requires diversity of firms' technological activities greater than that strictly required for current output. 2. The nature of technological opportunities, and of organisation for their exploitation, also varies with firm size. Firms with fewer than 1,000 employees have major opportunities with specialised strategies in mechanical engineering and instruments. The prevalence of broad-front technological strategies, and of divisionalisation, increases sharply with firm size, together with dependence on formal R&D activities. The size of innovating divisions has diminished sharply over the period. Divisionalisation improves the "goodness of fit" between the core business of innovating divisions and the innovations themselves, but 40% have remained consistently outside the core business of divisions. 3. These findings help identify the key tasks of technological strategy in firms in different industries, and of different sizes. Thus, in large firms, divisionalisation can create the small size of unit conducive to effective implementation, but it cannot absolve central management from the continuous task of matching technological opportunities with organisational forms and boundaries.

Journal ArticleDOI
TL;DR: Based on a longitudinal sample of 71 ventures, the hypothesis that TBNVs progress through four discrete growth stages is tested and supported, although not all firms progressed as expected.
Abstract: This paper presents a stage-of-growth model for technology-based new ventures (TBNVs). TBNVs are postulated to evolve through four discrete stages of growth: Conception and Development, Commercialization, Growth, and Stability. Based on a longitudinal sample of 71 ventures in the computer and electronics industries, the hypothesis that TBNVs progress according to this model is tested using the del procedure for prediction analysis of cross-classification tables. The hypotheses are supported, although not all firms progressed as expected. These results suggest that some variation in interstage transition patterns is due to a progression imperative.

Journal ArticleDOI
TL;DR: Within the context of traditional data processing, the MIS literature has devoted considerable attention to the relationship between user involvement and MIS success; unfortunately, this relationship remains poorly understood.
Abstract: Within the context of traditional data processing, the MIS literature has devoted considerable attention to the relationship between user involvement and MIS success: unfortunately, this research h...

Journal ArticleDOI
TL;DR: This paper develops the framework for assessment and analysis of linear-quadratic-Gaussian models within the influence diagram representation, and provides algorithms to translate between the Gaussian influence diagram and covariance matrix representations for the normal distribution.
Abstract: An influence diagram is a network representation of probabilistic inference and decision analysis models. The nodes correspond to variables that can be either constants, uncertain quantities, decisions, or objectives. The arcs reveal probabilistic dependence of the uncertain quantities and information available at the time of the decisions. The influence diagram focuses attention on relationships among the variables. As a result, it is increasingly popular for eliciting and communicating the structure of a decision or probabilistic model. This paper develops the framework for assessment and analysis of linear-quadratic-Gaussian models within the influence diagram representation. The "Gaussian influence diagram" exploits conditional independence in a model to simplify elicitation of parameters for the multivariate normal distribution. It is straightforward to assess and maintain a positive semi-definite covariance matrix. Problems of inference and decision making can be analyzed using simple transformations to the assessed model, and these procedures have attractive numerical properties. Algorithms are also provided to translate between the Gaussian influence diagram and covariance matrix representations for the normal distribution.
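The diagram-to-covariance direction can be sketched compactly: take the nodes in topological order, each with a conditional variance and regression coefficients on its parents, and build the joint covariance incrementally. Notation is ours; the paper develops the full set of transformations, including the reverse direction.

```python
def diagram_to_covariance(v, b):
    """v[i]: conditional variance of node i; b[i]: dict parent index -> coeff.
    Nodes must be listed in topological order."""
    n = len(v)
    cov = [[0.0] * n for _ in range(n)]
    for i in range(n):
        # covariance of node i with every earlier node k
        for k in range(i):
            cov[i][k] = cov[k][i] = sum(b[i].get(j, 0.0) * cov[j][k]
                                        for j in range(i))
        # variance of node i: its own noise plus inherited variance
        cov[i][i] = v[i] + sum(b[i].get(j, 0.0) * cov[j][i] for j in range(i))
    return cov

# X0 ~ N(0, 1); X1 = 2*X0 + noise with variance 1:
print(diagram_to_covariance([1.0, 1.0], [{}, {0: 2.0}]))
# [[1.0, 2.0], [2.0, 5.0]]
```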

Journal ArticleDOI
TL;DR: The authors present an approach for managing conflict in project groups during information system development, an important but often neglected aspect of such projects.
Abstract: Information system development projects engage organizational members in a process with potential for conflict. Managing such conflicts in project groups is an important but often neglected aspect ...

Journal ArticleDOI
TL;DR: This study addresses the proposition that the characteristics of innovative and noninnovative small firms differ significantly, based on cluster analysis of 50 Texas manufacturers.
Abstract: This study addresses the proposition that the characteristics of innovative and noninnovative small firms have significant differences. It is based on a sample of 50 Texas manufacturers. Cluster analysis yielded five distinct groups: two innovative and three noninnovative. The two innovative groups were either young firms dubbed the "Young Turks," or established, managerially competent firms headed by newcomers called the "Blue Chips." Among the three noninnovative groups, the "Silver Spoons" appeared to be surviving on past success; the "Striving Stoics" displayed continuing managerial effort but were led by executives who had been at the helm far longer than average; and the "Kismets" showed lesser competence and effort, were highly centralized, and were headed by executives tending more towards an external locus of control. Correlational analysis indicated a significant positive relationship between scanning and innovation. Challenges to the firm in the form of environmental dynamism and heterogeneity evoked positive innovatory responses, but environmental hostility was a weak negative correlate, inducing the firm to "pull in its horns." Finally, an abundance of resources encouraged proactiveness.

Journal ArticleDOI
TL;DR: A variation of the Beam Search method, called Filtered Beam Search, is proposed; it uses priority functions to search a number of solution paths in parallel, and proved not only efficient but also consistent in providing near-optimal solutions with a relatively small search tree.
Abstract: We examine the problem of scheduling a given set of jobs on a single machine to minimize total early and tardy costs. Two dispatch priority rules are proposed and tested for this NP-complete problem. These were found to perform far better than known heuristics that ignored early costs. For situations where the potential cost savings are sufficiently high to justify more sophisticated techniques, we propose a variation of the Beam Search method developed by researchers in artificial intelligence. This variant, called Filtered Beam Search, is able to use priority functions to search a number of solution paths in parallel. A computational study showed that this search method was not only efficient but also consistent in providing near-optimal solutions with a relatively small search tree. The study also includes an investigation of the impacts of Beam Search parameters on three variations of Beam Search for this problem.
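The skeleton of beam search is short enough to show: expand all partial schedules, rank them with a priority function, and keep only the best few at each depth. The cost model below is our own toy early/tardy example; the paper's filtered variant adds a cheap filtering pass before the more expensive evaluation.

```python
def beam_search(jobs, cost_of, beam_width=3):
    """jobs: list of job ids; cost_of(sequence) -> estimated total cost."""
    beam = [[]]                                 # partial sequences
    for _ in jobs:
        candidates = [seq + [j] for seq in beam
                      for j in jobs if j not in seq]
        candidates.sort(key=cost_of)
        beam = candidates[:beam_width]          # keep only the best few
    return beam[0]

# Toy instance: job -> (processing time, due date), unit early/tardy penalties.
jobs = {1: (4, 5), 2: (2, 9), 3: (3, 6)}

def cost_of(seq):
    t, cost = 0, 0
    for j in seq:
        p, d = jobs[j]
        t += p
        cost += abs(t - d)                      # earliness + tardiness
    return cost

print(beam_search(list(jobs), cost_of))         # [1, 3, 2], total cost 2
```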

Journal ArticleDOI
TL;DR: The concepts of influence diagrams are used to construct knowledge maps that capture the diverse information possessed by an individual or a group, with redundant knowledge maps assessed iteratively to handle cases where the most comfortable way to assess the information does not correspond to any proper assessment order for the diagram.
Abstract: To get fragmented information out of people's heads, onto paper, and ultimately into a computer is a continually challenging problem. We show how to use the concepts of influence diagrams to construct knowledge maps that capture the diverse information possessed by an individual or a group. We use redundant knowledge maps assessed iteratively to handle cases where the most comfortable way to assess the information does not correspond to any proper assessment order for the diagram. We use disjoint knowledge maps when the particular assessment to be made does not require a complete joint distribution. The necessary inferential calculations are readily performed in simple cases by spreadsheet programs. Knowledge maps facilitate the processes of representing knowledge and of determining its implications.
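The "inferential calculations" mentioned above amount, in the simplest discrete case, to Bayes' rule: reversing an arc converts the assessed order P(A), P(B|A) into the order P(B), P(A|B) that another expert may find more natural. Toy numbers of our own:

```python
# Assessed in one order: P(A) and P(B | A)
p_a = 0.3
p_b_given_a = {True: 0.9, False: 0.2}

# "Reversed" order via Bayes' rule: P(B) and P(A | B)
p_b = p_a * p_b_given_a[True] + (1 - p_a) * p_b_given_a[False]   # 0.41
p_a_given_b = p_a * p_b_given_a[True] / p_b                      # ~0.659

print(f"P(B) = {p_b:.2f}, P(A|B) = {p_a_given_b:.3f}")
```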

Journal ArticleDOI
TL;DR: Following Hershey and Schoemaker, the authors employ both the probability and certainty equivalence methods to explore bias in utility assessment, and find that the direction and degree of bias depend on characteristics of the assessment gamble such as the reference probability and the difference between outcomes.
Abstract: Judgments about simple gambles, such as those used in utility assessment, can generate sizable and systematic bias. After Hershey and Schoemaker (Hershey, J. C., P. J. H. Schoemaker. 1985. Probability vs. certainty equivalence methods in utility measurement: Are they equivalent? Management Sci. 31 1213–1231.), we employ both the probability and certainty equivalence methods to explore bias. Our results show that: (1) the direction and degree of bias depend on characteristics of the assessment gamble such as the reference probability and the difference between outcomes, (2) presenting subjects with explicit anchors can change the size and direction of the bias, and (3) subjects using heuristic response strategies show significantly more bias than those using expectation strategies. We also discuss the status of possible explanations for the bias, in light of these new results, including PE mode reframing, random error, and anchoring and adjustment.
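For reference, here is what the two assessment modes ask of an idealized expected-utility subject (hypothetical utility and numbers, not the paper's data): certainty equivalence (CE) fixes the gamble and asks for the indifferent sure amount, while probability equivalence (PE) fixes the sure amount and asks for the indifference probability.

```python
import math

def certainty_equivalent(p, high, low, u, u_inv):
    """CE of a two-outcome gamble for an expected-utility maximizer."""
    return u_inv(p * u(high) + (1 - p) * u(low))

u = math.sqrt                 # a risk-averse utility, assumed for illustration
u_inv = lambda x: x ** 2

# A 50-50 gamble over $0 / $100:
print(certainty_equivalent(0.5, 100, 0, u, u_inv))   # 25.0
```

The paper's point is precisely that real subjects' CE and PE answers imply inconsistent utilities, with the size and direction of the gap depending on the reference probability and the spread between outcomes.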

Journal ArticleDOI
TL;DR: A laboratory experiment was undertaken to test selected variables that should (or should not) affect forgetting during production interruptions; the psychological literature suggests forgetting may be negligible for "continuous control" tasks (bicycle riding is representative) but considerable for "procedural" tasks (such as operating a computer).
Abstract: The industrial learning curve is widely used to predict costs and labor requirements wherever learning is taking place. Little is known, however, about the reverse of this process: the forgetting that occurs during production interruptions. The ability to estimate cost increases due to forgetting would be useful for economic lot size determinations, bidding on repeat orders, estimating the cost of strikes, and so on. Empirical studies apparently have not been published. Field data are difficult to obtain and easily confounded by extraneous variables. Thus a laboratory experiment was undertaken to test selected variables that should (or should not) affect forgetting. A review of relevant psychological literature reveals two key findings: (1) Forgetting may be negligible for “continuous control” tasks but considerable for “procedural” tasks. Bicycle riding is representative of continuous control, while operating a computer is clearly procedural. (2) Forgetting is a function of the amount learned and the pas...
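The industrial learning curve referenced above is T_n = T_1 · n^b with b = log(learning rate)/log 2. A toy sketch of how forgetting during an interruption might then raise costs, modeled here as losing part of the accumulated experience (this relearning treatment is our simplification, not the paper's model):

```python
import math

def unit_time(n, t1=10.0, rate=0.8):
    """Time for unit n on an 80% learning curve: doubling output cuts time 20%."""
    b = math.log(rate) / math.log(2)      # ~ -0.322 for an 80% curve
    return t1 * n ** b

print(f"{unit_time(1):.2f} {unit_time(2):.2f} {unit_time(100):.2f}")
# 10.00  8.00  2.27

# If a break "forgets" half the experience accumulated by unit 100, the
# next unit is produced as if it were unit 50:
print(f"{unit_time(50):.2f} vs {unit_time(100):.2f}")   # 2.84 vs 2.27
```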

Journal ArticleDOI
TL;DR: In this article, the authors use option pricing theory to value and analyze many performance-based fee contracts that are currently in use and present conditions for contract parameters that provide proper risk incentives for classes of investment strategies.
Abstract: This paper uses option pricing theory to value and analyze many performance-based fee contracts that are currently in use. A potential problem with some of these contracts is that they may induce portfolio managers to adversely alter the risk of the portfolios they manage. The paper is prescriptive in that it presents conditions for contract parameters that provide proper risk incentives for classes of investment strategies. For buy-and-hold and rebalancing strategies adverse risk incentives are avoided when the penalties for poor performance outweigh the rewards for good performance.
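The option-pricing connection can be illustrated directly: a fee of k·max(V_T − X, 0) on terminal portfolio value V_T is k call options, so a generic Black-Scholes valuation (our sketch; the paper's contracts and conditions are richer) shows both the fee's value and the incentive problem, since that value rises with volatility.

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    return 0.5 * (1 + erf(x / sqrt(2)))

def bs_call(V, X, r, sigma, T):
    """Standard Black-Scholes call on value V, strike X, rate r, vol sigma."""
    d1 = (log(V / X) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return V * norm_cdf(d1) - X * exp(-r * T) * norm_cdf(d2)

# A fee of 20% of gains above a $100 benchmark, one year out (numbers ours):
print(0.20 * bs_call(V=100, X=100, r=0.05, sigma=0.2, T=1.0))   # ~2.09
# Raising sigma raises the fee's value, which is why the paper studies
# penalty terms that offset the manager's reward for adding risk.
```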

Journal ArticleDOI
TL;DR: A model is presented that planners can use both to locate maintenance stations and to develop flight schedules that better meet the cyclical demand for maintenance; it is formulated as a min-cost, multicommodity flow network with integral constraints and solved using a two-phase heuristic.
Abstract: In an effort to control costs, airlines have begun to concentrate on their maintenance operations as a potential source for savings. Nevertheless, federal regulations and internal safety policies effectively limit cost savings to improvements in productivity and scheduling. The purpose of this paper is to present a model that can be used by planners to both locate maintenance stations and to develop flight schedules that better meet the cyclical demand for maintenance. The problem is formulated as a min-cost, multicommodity flow network with integral constraints, and solved using a two-phase heuristic. The procedure is demonstrated with data supplied by American Airlines for their Boeing 727 fleet. The results show a significant improvement over current techniques, and indicate that substantial cost reductions can be achieved by eliminating up to 5 of the 22 maintenance bases now in operation. Similar results were obtained for American's Super 80 and DC-10 fleets. Perturbation analysis confirms the robustness of these findings, and suggests that loss in flexibility due to interruptions in the flight schedule will be negligible.
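In generic form, the min-cost multicommodity flow model with integrality mentioned above reads as follows (notation ours; the paper's formulation adds maintenance-station and schedule-specific constraints), with commodities k, arc costs c, arc capacities u, and node balances b:

```latex
\min \sum_{k}\sum_{(i,j)} c_{ij}^{k}\, x_{ij}^{k}
\quad\text{s.t.}\quad
\sum_{j} x_{ij}^{k} - \sum_{j} x_{ji}^{k} = b_i^{k} \;\;\forall i,k,\qquad
\sum_{k} x_{ij}^{k} \le u_{ij} \;\;\forall (i,j),\qquad
x_{ij}^{k} \in \mathbb{Z}_{\ge 0}.
```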

Journal ArticleDOI
TL;DR: The authors show that unless channel members can offer credible guarantees that unobservable agreements do not exist, the strategic effects of delegation disappear; they conclude that mechanisms other than strategic ones must be responsible for the existence of delegated channels, and suggest promising avenues for future theory research in channel structure.
Abstract: Several papers in the recent marketing literature have suggested that delegation in distribution (e.g., the use of independent middlemen) helps manufacturers to precommit strategically to profit-enhancing competitive actions. Further, the literature suggests that the profitability of such actions depends on market structure. We challenge these conclusions here. This is done in two steps. First, we perform an analysis of the entire class of models which have been used in the literature. Using internally consistent assumptions about market structure and contracting, the only subgame perfect equilibrium is one in which all distribution channels have infinitely many levels of delegation. Obviously, this is not what we see in the real world. Next, we relax a key hidden assumption, namely that all intra-channel agreements are observable to competitors. Without this assumption, the usual results unravel. Unless channel members can offer credible guarantees that unobservable agreements do not exist, the strategic effects of delegation disappear. Since these guarantees are virtually impossible to maintain credibly, we would expect to reject the hypothesis in the earlier literature relating channel structure to competition in an empirical study controlling for observability. We conclude that mechanisms other than strategic ones must be responsible for the existence of delegated channels, and make some suggestions about promising avenues for future theory research in channel structure.

Journal ArticleDOI
TL;DR: In this paper, the effects of risk sensitivity and bargaining power on quantity discounts are discussed for alternative bargaining models, and all-units and incremental quantity discounts that permit transaction at a negotiated outcome are described.
Abstract: Quantity discounts offered by a monopolist are considered in the context of a bargaining problem in which the buyer and the seller negotiate over the order quantity and the average unit price. All-units and incremental quantity discounts that permit transaction at a negotiated outcome are described. The effects of risk sensitivity and bargaining power on quantity discounts are discussed for alternative bargaining models.
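The two discount forms compared above differ in how the bracket price is applied; a small illustration with hypothetical price breaks:

```python
# Hypothetical schedule: (minimum quantity, unit price), sorted ascending.
BREAKS = [(0, 10.0), (100, 9.0), (500, 8.0)]

def all_units_cost(q):
    """All-units discount: the bracket price applies to every unit ordered."""
    price = [p for m, p in BREAKS if q >= m][-1]
    return q * price

def incremental_cost(q):
    """Incremental discount: each price applies only within its bracket."""
    cost = 0.0
    uppers = [b[0] for b in BREAKS[1:]] + [float("inf")]
    for (m, p), nxt in zip(BREAKS, uppers):
        if q > m:
            cost += (min(q, nxt) - m) * p
    return cost

print(all_units_cost(120))     # 120 * 9.0       = 1080.0
print(incremental_cost(120))   # 100*10 + 20*9   = 1180.0
```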

Journal ArticleDOI
TL;DR: Simple heuristic formulas are developed to estimate the simulation run lengths required to achieve desired statistical precision in queueing simulations; they apply to stochastic processes that can be approximated by reflected Brownian motion, such as the queue-length process in the standard GI/G/1 model.
Abstract: Simple heuristic formulas are developed to estimate the simulation run lengths required to achieve desired statistical precision in queueing simulations. The formulas are intended to help in the early planning stages before any data have been collected. The queueing simulations considered are single replications (one long run) conducted to estimate steady-state characteristics such as expected equilibrium queue lengths. The formulas can be applied to design simulation experiments to develop and evaluate queueing approximations. In fact, this work was motivated by efforts to develop approximations for packet communication networks with multiple classes of traffic having different service characteristics and bursty arrival processes. In addition to indicating the approximate simulation run length required in each case of a designed experiment, the formulas can help determine what cases to consider, what statistical precision to aim for, and even whether to conduct the experiment at all. The formulas are bas...
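The flavor of such planning formulas: for a single long run of a congested queue, the required run length grows roughly like 1/(1 − ρ)² in the traffic intensity ρ. The constant and the precision scaling below are illustrative assumptions, not the paper's formulas.

```python
def rough_run_length(rho, rel_precision=0.1, c=10.0):
    """Very rough required simulated time (c is an assumed planning constant)."""
    return c / (rel_precision ** 2 * (1 - rho) ** 2)

for rho in (0.5, 0.8, 0.9, 0.95):
    print(rho, f"{rough_run_length(rho):,.0f}")
# Congested systems (rho near 1) need dramatically longer runs.
```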

Journal ArticleDOI
TL;DR: The operational control problem in a general N-stage serial production system is modeled as a discrete-time Markov process, describing the effects of the number of kanbans, machine reliability, demand variability, and safety stock requirements on the performance of a kanban-controlled pull system.
Abstract: Three problem areas exist in designing and implementing a kanban-controlled JIT system: the flow line identification problem, the flow line loading problem, and the operational control problem. This paper addresses the operational control problem. A general N-stage serial production system is modeled as a discrete-time Markov process. Capacity constraints, stochastic machine reliability and demand variability are included. The model is illustrated by a 3-stage system, describing the effects of the number of kanbans, the machine reliability, the demand variability and safety stock requirements on the performance of a kanban-controlled pull system.
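A discrete-time toy simulation conveys the quantities such a model captures: with few kanbans, unreliable machines, or variable demand, the fill rate at the final stage degrades. This sketch is our own simplified model, not the paper's Markov formulation.

```python
import random

def simulate(stages=3, kanbans=2, up=0.9, demand_prob=0.8, periods=10_000):
    """Each stage holds at most `kanbans` units and, when authorized and up,
    moves one unit per period; demand arrives at the last stage."""
    random.seed(1)
    buffers = [kanbans] * stages          # start with all buffers full
    served = demanded = 0
    for _ in range(periods):
        if random.random() < demand_prob:          # customer demand
            demanded += 1
            if buffers[-1] > 0:
                buffers[-1] -= 1
                served += 1
        for s in range(stages - 1, -1, -1):        # replenish downstream first
            src_ok = (s == 0) or buffers[s - 1] > 0  # stage 0 has raw material
            if buffers[s] < kanbans and src_ok and random.random() < up:
                buffers[s] += 1
                if s > 0:
                    buffers[s - 1] -= 1
    return served / max(demanded, 1)      # fill rate

print(f"fill rate: {simulate():.3f}")
```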