
Showing papers in "Management Science in 1993"


Journal ArticleDOI
TL;DR: In this paper, a modified version of DEA based upon comparison of efficient DMUs relative to a reference technology spanned by all other units is developed, which provides a framework for ranking efficient units and facilitates comparison with rankings based on parametric methods.
Abstract: Data Envelopment Analysis (DEA) evaluates the relative efficiency of decision-making units (DMUs) but does not allow for a ranking of the efficient units themselves. A modified version of DEA based upon comparison of efficient DMUs relative to a reference technology spanned by all other units is developed. The procedure provides a framework for ranking efficient units and facilitates comparison with rankings based on parametric methods.

3,320 citations
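
The ranking idea is concrete enough to sketch. Below is a minimal input-oriented, constant-returns super-efficiency LP in the spirit of the paper's procedure: DMU o is scored against a frontier spanned by all other units, so efficient units receive scores above 1 that can be ranked. The toy data and the use of scipy's linprog are my assumptions, not the authors' exact formulation.

```python
# Hedged sketch: input-oriented, constant-returns super-efficiency DEA.
# DMU `o` is evaluated against a reference technology spanned by all
# OTHER units, so efficient DMUs can score above 1 and be ranked.
# Inputs X and outputs Y are toy data.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 3.0, 4.0, 5.0],    # rows = input dimensions, cols = DMUs
              [3.0, 1.0, 2.0, 4.0]])
Y = np.array([[1.0, 1.0, 1.5, 2.0]])   # rows = output dimensions

def super_efficiency(o):
    m, n = X.shape
    s = Y.shape[0]
    others = [j for j in range(n) if j != o]
    c = np.zeros(1 + len(others))
    c[0] = 1.0                                            # minimize theta
    A_in = np.hstack([-X[:, [o]], X[:, others]])          # sum lam_j x_j <= theta x_o
    A_out = np.hstack([np.zeros((s, 1)), -Y[:, others]])  # sum lam_j y_j >= y_o
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[:, o]]),
                  bounds=[(None, None)] + [(0, None)] * len(others))
    return res.fun if res.success else float("inf")       # infeasible: unbeatable unit

for o in range(X.shape[1]):
    print("DMU", o, "score:", round(super_efficiency(o), 3))
```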


Journal ArticleDOI
TL;DR: In this article, the authors examined the effect of statistical aggregation in mitigating relative risk in decision making in organizations and found that overly optimistic forecasts result from the adoption of an inside view of the problem, which anchors predictions on plans and scenarios.
Abstract: Decision makers have a strong tendency to consider problems as unique. They isolate the current choice from future opportunities and neglect the statistics of the past in evaluating current plans. Overly cautious attitudes to risk result from a failure to appreciate the effects of statistical aggregation in mitigating relative risk. Overly optimistic forecasts result from the adoption of an inside view of the problem, which anchors predictions on plans and scenarios. The conflicting biases are documented in psychological research. Possible implications for decision making in organizations are examined.

2,120 citations


Journal ArticleDOI
TL;DR: The authors present a context-dependent model that expresses the value of each option as an additive combination of two components: a contingent weighting process that captures the effect of the background context, and a binary comparison process that describes the local context.
Abstract: The standard theory of choice—based on value maximization—associates with each option a real value such that, given an offered set, the decision maker chooses the option with the highest value. Despite its simplicity and intuitive appeal, there is a growing body of data that is inconsistent with this theory. In particular, the relative attractiveness of x compared to y often depends on the presence or absence of a third option z, and the “market share” of an option can actually be increased by enlarging the offered set. We review recent empirical findings that are inconsistent with value maximization, and present a context-dependent model that expresses the value of each option as an additive combination of two components: a contingent weighting process that captures the effect of the background context, and a binary comparison process that describes the effect of the local context. The model accounts for observed violations of the standard theory and provides a framework for analyzing context-dependent preferences.

1,281 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a conceptual model for identifying specific flexibility dimensions and the manner in which these dimensions may limit the effectiveness of a manufacturing process, and the problems in operationalizing them.
Abstract: To help meet competitive realities, operations managers need to know more about the strategic aspects of manufacturing flexibility. This paper takes steps toward meeting that need by critically reviewing the literature and establishing a research agenda for the area. A conceptual model, which places flexibility within a broad context, helps to identify certain assumptions of theoretical studies which need to be challenged. The model also provides a basis for identifying specific flexibility dimensions. The manner in which these dimensions may limit the effectiveness of a manufacturing process, and the problems in operationalizing them, are discussed. Focusing next on the neglected area of applied work, concepts are presented for analyzing whether desired amounts of flexibility are being achieved and whether the potential for flexibility built into a manufacturing process is being tapped. Finally, a procedure is outlined for altering a plant's types and amounts of flexibility over time. The research agenda...

1,145 citations


Journal ArticleDOI
TL;DR: In this article, the authors provided a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA) and showed that DEA estimators of the best practice monotone increasing and concave production function are also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotonically decreasing probability density function.
Abstract: This paper provides a formal statistical basis for the efficiency evaluation techniques of data envelopment analysis (DEA). DEA estimators of the best practice monotone increasing and concave production function are shown to be also maximum likelihood estimators if the deviation of actual output from the efficient output is regarded as a stochastic variable with a monotone decreasing probability density function. While the best practice frontier estimator is biased below the theoretical frontier for a finite sample size, the bias approaches zero for large samples. The DEA estimators exhibit the desirable asymptotic property of consistency, and the asymptotic distribution of the DEA estimators of inefficiency deviations is identical to the true distribution of these deviations. This result is then employed to suggest possible statistical tests of hypotheses based on asymptotic distributions.

908 citations
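
The consistency claim lends itself to a quick numerical illustration. The sketch below uses the simplest possible setting, one input and one output under constant returns, where the DEA frontier estimate reduces to the maximal output-input ratio; the production function and inefficiency distribution are my assumptions, not the paper's general monotone-concave model.

```python
# Hedged illustration of consistency in the simplest case: one input, one
# output, constant returns, true frontier y = 2x. The efficiency factor is
# Beta(4, 1), so the deviation from the frontier has a monotone decreasing
# density, matching the paper's assumption. The frontier-slope estimate
# max(y/x) sits below the true value 2 in finite samples (the bias noted
# in the abstract) and approaches 2 as n grows (consistency).
import numpy as np

rng = np.random.default_rng(0)
for n in [10, 100, 1_000, 10_000]:
    x = rng.uniform(1.0, 5.0, size=n)
    eff = rng.beta(4, 1, size=n)          # fraction of frontier output achieved
    y = 2.0 * x * eff
    print(n, round((y / x).max(), 4))     # biased below 2, converging upward
```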


Journal ArticleDOI
TL;DR: This paper examined the determinants of Japanese entry into the United States; by focusing on firms of one country entering a single market, the authors were able to separate the impact of a firm's strategy from that of the characteristics of the target industry or country.
Abstract: Multinational firms can enter a foreign market by taking over existing local firms (acquisitions) or by setting up new ventures (greenfield investments). Surprisingly, there has been limited empirical work on this topic. This paper examines the determinants of this choice by looking at Japanese entries into the United States. By focusing on firms of one country entering a single market, we are able to separate the impact of a firm's strategy from that of the characteristics of the target industry or country. This paper tests simultaneously a number of competing hypotheses. The results suggest that acquisitions are used by Japanese investors with weak competitive advantages, while investors with strong advantages find that greenfield investments are a more efficient way to transfer these advantages to the U.S. Acquisitions are also chosen to enter industries with either very high or very low growth rates, when entry is at a scale that is large relative to the parent, and when entry is into a different industry. The Japanese investor's previous experience of the U.S. market, its financial situation, and its status as a follower in an oligopolistic industry have no statistically significant impact on the entry mode. Neither do U.S. stock market conditions.

720 citations


Journal ArticleDOI
TL;DR: In this article, the authors examined the influence of a set of antecedent constructs (superordinate goals, accessibility, physical proximity and formalized rules and procedures) on the attainment of both cross-functional cooperation and perceived project outcomes.
Abstract: Cross-functional teams can greatly facilitate the successful implementation of projects. This study examined the influence of a set of four antecedent constructs (superordinate goals, accessibility, physical proximity and formalized rules and procedures) on the attainment of both cross-functional cooperation and perceived project outcomes. Through the use of path analysis, the results indicated that superordinate goals, physical proximity and project team rules and procedures have significant direct and/or indirect effects on project outcomes through influencing cross-functional cooperation. Further, cross-functional cooperation was a significant predictor of both perceived task and psychosocial project outcomes. Directions for management practice and future research are discussed.

719 citations


Journal ArticleDOI
TL;DR: The branch-and-cut solver as discussed by the authors generates cutting planes based on the underlying structure of the polytope defined by the convex hull of the feasible integer points and incorporates these cuts into a tree-search algorithm that uses automatic reformulation procedures, heuristics and linear programming technology to assist in the solution.
Abstract: The crew scheduling problem is one that has been studied almost continually for the past 40 years, but all prior approaches have approximated the problem of finding an optimal schedule for even the smallest of an airline's fleets. The problem is especially important today: the costs of flying personnel at major U.S. carriers have grown to often exceed $1.3 billion a year, the second-largest component of total operating cost after fuel. Thus even small percentage savings amount to substantial dollar amounts. We present a branch-and-cut approach to solving to proven optimality large set partitioning problems arising within the airline industry. We first provide some background related to this important application and then describe the approach for solving representative problems in this problem class. The branch-and-cut solver generates cutting planes based on the underlying structure of the polytope defined by the convex hull of the feasible integer points and incorporates these cuts into a tree-search algorithm that uses automatic reformulation procedures, heuristics and linear programming technology to assist in the solution. Numerical experiments are reported for a sample of 68 large-scale real-world crew scheduling problems. These problems include both pure set partitioning problems and set partitioning problems with side constraints. These "base constraints" represent contractual labor requirements and have heretofore not been represented explicitly in the construction of crew schedules, making it impossible to provide any measure of how far the obtained solution was from optimality. An interesting result of obtaining less costly schedules is that the crews themselves are happier with the schedules because they spend more of their duty time flying than waiting on the ground.

559 citations
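
The core model being solved here is compact enough to state in code. The sketch below sets up a toy set-partitioning instance, flights as rows and candidate crew pairings as columns, and solves it with an off-the-shelf MILP call; the data are invented, and the generic solver stands in for, rather than reproduces, the paper's branch-and-cut machinery.

```python
# Toy set-partitioning core of crew scheduling (invented data; the MILP
# call stands in for the paper's branch-and-cut): rows are flights,
# columns are candidate pairings, and each flight is covered exactly once.
import numpy as np
from scipy.optimize import Bounds, LinearConstraint, milp

A = np.array([[1, 0, 1, 0, 0],           # A[i, j] = 1 if pairing j covers flight i
              [1, 1, 0, 0, 1],
              [0, 1, 0, 1, 0],
              [0, 0, 1, 1, 1]])
cost = np.array([7.0, 5.0, 6.0, 4.0, 3.0])

res = milp(c=cost,
           constraints=LinearConstraint(A, lb=1, ub=1),  # partition rows exactly
           integrality=np.ones(A.shape[1]),              # binary pairing choices
           bounds=Bounds(0, 1))
print("chosen pairings:", np.flatnonzero(res.x > 0.5), "cost:", res.fun)
```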


Journal ArticleDOI
TL;DR: In this article, the authors address the operational issue of quantity allocation between two uncertain suppliers and its effects on the inventory policies of the buyer, and derive the optimal ordering policies that minimize the total ordering, holding and penalty costs with backlogging.
Abstract: Supply chain management is becoming an increasingly important issue, especially when in most industries the cost of materials purchased comprises 40-60% of the total sales revenue. Despite the benefits cited for single sourcing in the popular literature, there is enough evidence of industries having two/three sources for most parts. In this paper we address the operational issue of quantity allocation between two uncertain suppliers and its effects on the inventory policies of the buyer. Based on the type of delivery contract a buyer has with the suppliers, we suggest three models for the supply process. Model I is a one-delivery contract with all of the order quantity delivered either in the current period with probability β, or in the next period with probability 1-β. Model II is also a one-delivery contract with a random fraction of the order quantity delivered in the current period; the portion of the order quantity not delivered is cancelled. Model III is similar to Model II with the remaining quantity delivered in the next period. We derive the optimal ordering policies that minimize the total ordering, holding and penalty costs with backlogging. We show that the optimal ordering policy in period n for each of these models has a threshold structure: for inventory levels x ≥ ūn, order nothing; for vn ≤ x < ūn, use only one supplier; and for x < vn, use both suppliers.

469 citations
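
Model I's delivery mechanism is easy to mimic in simulation, which gives a feel for how delivery uncertainty inflates the required stocking level. Everything below, the base-stock policy, β, the cost rates, and the Poisson demand, is an assumption for illustration; the paper instead derives the optimal policy analytically.

```python
# Hedged simulation of a Model I style delivery process: the whole order
# arrives this period with probability beta, otherwise next period. The
# base-stock policy, cost rates, and Poisson demand are my assumptions
# for illustration; the paper derives the optimal policy analytically.
import numpy as np

rng = np.random.default_rng(1)
beta, h, p, T = 0.7, 1.0, 4.0, 100_000   # delivery prob., holding/penalty costs

def avg_cost(base_stock):
    inv, pipeline, total = 0.0, 0.0, 0.0
    for _ in range(T):
        inv += pipeline                  # any delayed order arrives now
        pipeline = 0.0
        q = max(0.0, base_stock - inv)   # order up to the base-stock level
        if rng.random() < beta:
            inv += q                     # delivered in the current period
        else:
            pipeline = q                 # deferred to the next period
        inv -= rng.poisson(5)            # demand; negative inv = backlog
        total += h * max(inv, 0.0) + p * max(-inv, 0.0)
    return total / T

for s in (6, 8, 10, 12):                 # delivery risk pushes the best level up
    print(s, round(avg_cost(s), 3))
```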


Journal ArticleDOI
TL;DR: A decision-making task portraying new product dynamics is used to test the theory by varying the strength of key feedback processes in a simulated market; the results support the misperception-of-feedback hypothesis, and implications for the educational use of simulations and games are discussed.
Abstract: Boom and bust is a pervasive dynamic for new products. Word of mouth, marketing, and learning curve effects can fuel rapid growth, often leading to overcapacity, price war, and bankruptcy. Previous experiments suggest such dysfunctional behavior can be caused by systematic "misperceptions of feedback," where decision makers do not adequately account for critical feedbacks, time delays, and nonlinearities which condition system dynamics. However, prior studies often failed to vary the strength of these feedbacks as treatments, omitted market processes, and failed to allow for learning. A decision making task portraying new product dynamics is used to test the theory by varying the strength of key feedback processes in a simulated market. Subjects performed the task repeatedly, encouraging learning. Nevertheless, performance relative to potential is poor and is severely degraded when the feedback complexity of the environment is high, supporting the misperception of feedback hypothesis. The negative effects of feedback complexity on performance were not moderated by experience, even though average performance improved. Models of the subjects' decision making heuristics are estimated; changes over trials in estimated cue weights explain why subjects improve on average but fail to gain insight into the dynamics of the system. Though conditions for learning are excellent, experience does not appear to mitigate the misperceptions of feedback or systematic dysfunction they cause in dynamic decision making tasks. We discuss implications for educational use of simulations and games.

363 citations
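
The boom-and-bust mechanism itself fits in a few lines. The toy loop below uses my own equations and parameters, not the authors' experimental market: word-of-mouth adoption drives demand growth while capacity chases recent demand with a lag, and once the market saturates the firm is left holding excess capacity.

```python
# Toy boom-and-bust loop (all equations and parameters assumed): word of
# mouth drives adoption, capacity adjusts toward demand with a delay, and
# the delay produces overshoot and then excess capacity after saturation.
pot, adopters, capacity = 10_000.0, 10.0, 50.0
for t in range(40):
    demand = 0.5 * adopters * (pot - adopters) / pot   # word-of-mouth contacts
    sales = min(demand, capacity)                      # capacity-constrained sales
    adopters += sales
    capacity += 0.3 * (demand - capacity)              # delayed capacity adjustment
    print(t, round(demand), round(capacity))
```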


Journal ArticleDOI
TL;DR: In this paper, the authors analyze the value of a corporation as a function of its ownership structure and show that under a strong condition on the purchase or sale of shares by large stockholders, investors have incentives to trade toward the ownership structure that maximizes the social surplus.
Abstract: This article analyzes the value of a corporation as a function of its ownership structure. Shareholders can acquire costly information about the manager's effort to produce output. Concentrating share ownership leads the largest shareholder to (i) acquire more precise signals of effort and (ii) modify the compensation contract. Better monitoring increases output, and hence firm value. However, the (risk averse) large shareholder bears more idiosyncratic firm risk as his stake in the firm increases. These forces equilibrate at a unique welfare maximizing ownership structure. Under a strong condition on the purchase or sale of shares by large stockholders, investors have incentives to trade toward the ownership structure that maximizes the social surplus. When all investors are price takers only a diffuse ownership structure can arise in a competitive equilibrium.

Journal ArticleDOI
TL;DR: In this paper, the makespan minimization problem in the 3-machine assembly-type flow shop is considered and shown to be strongly NP-complete in general, and a branch and bound solution scheme is suggested, along with three heuristics whose error bounds are analyzed.
Abstract: This paper considers minimizing the makespan in the 3-machine assembly-type flowshop scheduling problem. After problem formulation, we present a proof to show that the general version of this problem is strongly NP-complete. We then discuss a few polynomially solvable cases of the problem and present the solution algorithms. Next, a branch and bound solution scheme is suggested. Finally, three heuristics to find approximate solutions to the general problem are proposed and their error bounds are analyzed.
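
The scheduling recursion at the heart of the problem is simple even though optimizing over it is strongly NP-complete. Below is a sketch with invented processing times: machines 1 and 2 fabricate each job's two components in parallel, machine 3 assembles once both are done, and brute force over permutations stands in for the paper's branch and bound.

```python
# Sketch of makespan evaluation for the 3-machine assembly flowshop (toy
# data): machines 1 and 2 make the two components of each job in parallel;
# machine 3 assembles a job once both components are finished and it is
# free. Exhaustive search replaces the paper's branch and bound here.
from itertools import permutations

a = [3, 5, 2, 4]   # component times on machine 1
b = [4, 2, 5, 3]   # component times on machine 2
c = [2, 3, 4, 2]   # assembly times on machine 3

def makespan(seq):
    t1 = t2 = t3 = 0
    for j in seq:
        t1 += a[j]                      # machine 1 finishes job j's first part
        t2 += b[j]                      # machine 2 finishes job j's second part
        t3 = max(t3, t1, t2) + c[j]     # assembly waits for both parts
    return t3

best = min(permutations(range(4)), key=makespan)
print(best, makespan(best))
```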

Journal ArticleDOI
TL;DR: In this article, the authors provide evidence on the time pattern of lottery participation to see whether actual behavior is consistent with the gambler's fallacy, and they find a clear and consistent tendency for the amount of money bet on a particular number to fall sharply immediately after it is drawn, and then gradually to recover to its former level over the course of several months.
Abstract: The "gambler's fallacy" is the belief that the probability of an event is lowered when that event has recently occurred, even though the probability of the event is objectively known to be independent from one trial to the next. This paper provides evidence on the time pattern of lottery participation to see whether actual behavior is consistent with this fallacy. Using data from the Maryland daily numbers game, we find a clear and consistent tendency for the amount of money bet on a particular number to fall sharply immediately after it is drawn, and then gradually to recover to its former level over the course of several months. This pattern is consistent with the hypothesis that lottery players are in fact subject to the gambler's fallacy.

Journal ArticleDOI
TL;DR: In this article, the agency theory approach to understanding salesforce compensation plans is modified to incorporate the intratemporal nature of the salesperson's effort-rate decision, i.e., the effort-rate decision at any given point in time potentially depends upon the sales performance up to that point in the accounting period.
Abstract: The agency theory approach to understanding salesforce compensation plans is modified to incorporate the intratemporal nature of the salesperson's effort-rate decision, i.e., the decision about the effort-rate at any given point in time potentially depends upon the sales performance up to that point in time in the accounting period. Under the assumptions considered in this paper, Holmstrom and Milgrom (1987) have shown that the optimal compensation plan is linear in total sales over the accounting period. The comparative statics results obtained here corroborate most of the corresponding results in the salesforce compensation literature; moreover, we derive many additional results not available in the literature. It is demonstrated that the commission income as a fraction of total compensation goes up with an increase in the effectiveness of the sales-effort or an increase in base sales. On the other hand, the salary component of the total compensation goes up with increases in uncertainty, absolute risk ...
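
The Holmstrom and Milgrom (1987) linear benchmark the paper builds on can be stated concretely. The sketch below uses the standard textbook version of that model, with my own notation and parameter values rather than the paper's exact formulation; its comparative statics move the same way the abstract describes.

```python
# Standard Holmstrom-Milgrom linear-contract benchmark (my notation, not
# the paper's exact model): sales = h*effort + noise, noise ~ N(0, sigma2),
# CARA risk aversion r, quadratic effort cost k*e^2/2. The optimal
# commission rate on sales is beta* = h^2 / (h^2 + r*k*sigma2).
def commission_rate(h, r, k, sigma2):
    return h**2 / (h**2 + r * k * sigma2)

# Comparative statics echo the abstract: the commission share rises with
# sales-effort effectiveness h and falls as uncertainty sigma2 grows.
print(commission_rate(h=2.0, r=1.0, k=1.0, sigma2=1.0))   # 0.8
print(commission_rate(h=1.0, r=1.0, k=1.0, sigma2=1.0))   # 0.5
print(commission_rate(h=1.0, r=1.0, k=1.0, sigma2=4.0))   # 0.2
```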

Journal ArticleDOI
TL;DR: In this article, the authors formalize the problem as a mathematical program where the objective of the firm is either profit or total welfare, and they develop a new greedy heuristic for the profit problem, and its application to simulated problems shows that it too runs quickly, and with better performance than various alternatives and previously published heuristics.
Abstract: Designing and pricing a product-line is the very essence of every business. In recent years quantitative methods to assist managers in this task have been gaining in popularity. Conjoint analysis is already widely used to measure preferences for different product profiles, and build market simulation models. In the last few years several papers have been published that suggest how to optimally choose a product-line based on such data. We formalize this problem as a mathematical program where the objective of the firm is either profit or total welfare. Unlike alternative published approaches, we introduce fixed and variable costs for each product profile. The number of products to be introduced is endogenously determined on the basis of their desirability, fixed and variable costs, and in the case of profits, their cannibalization effect on other products. While the problem is difficult (NP-complete), we show that the maximum welfare problem is equivalent to the uncapacitated plant location problem, which can be solved very efficiently using the greedy interchange heuristic. Based on past published experience with this problem, and on simulations we perform, we show that optimal or near optimal solutions are obtained in seconds for large problems. We develop a new greedy heuristic for the profit problem, and its application to simulated problems shows that it too runs quickly, and with better performance than various alternatives and previously published heuristics. We also show how the methodology can be applied, taking existing products of both the firm and the competition into account.
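
The plant-location connection suggests the following toy greedy step (illustrative utilities and fixed costs, not the paper's data or exact heuristic): products play the role of plants and consumer segments the role of customers, and products are added while the welfare gain exceeds the fixed cost, so the line length comes out endogenously.

```python
# Greedy step for the welfare problem viewed as uncapacitated plant
# location (toy numbers): each segment is "assigned" to its best offered
# product, and a product joins the line only if the welfare it adds
# outweighs its fixed cost. The stopping rule sets the line length.
import numpy as np

U = np.array([[9.0, 4.0, 7.0],     # U[i, j]: segment i's surplus from product j
              [2.0, 8.0, 5.0],
              [6.0, 3.0, 8.0]])
fixed = np.array([5.0, 4.0, 6.0])  # fixed cost of introducing each product

def welfare(S):
    if not S:
        return 0.0
    return U[:, S].max(axis=1).sum() - fixed[S].sum()

line = []
while True:
    gains = {j: welfare(line + [j]) - welfare(line)
             for j in range(U.shape[1]) if j not in line}
    j, g = max(gains.items(), key=lambda kv: kv[1])
    if g <= 0:
        break
    line.append(j)
print(line, welfare(line))  # number of products is endogenous, as in the paper
```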

Journal ArticleDOI
TL;DR: In this article, a method is proposed for obtaining and quantitatively evaluating verbal judgments in which each analyst uses a limited vocabulary that he or she has individually selected and scaled, and an experiment compared this method to standard numerical responding under three different payoff conditions.
Abstract: Despite the common reliance on numerical probability estimates in decision research and decision analysis, there is considerable interest in the use of verbal probability expressions to communicate opinion. A method is proposed for obtaining and quantitatively evaluating verbal judgments in which each analyst uses a limited vocabulary that he or she has individually selected and scaled. An experiment compared this method to standard numerical responding under three different payoff conditions. Response mode and payoff never interacted. Probability scores and their components were virtually identical for the two response modes and for all payoff groups. Also, judgments of complementary events were essentially additive under all conditions. The two response modes differed in that the central response category was used more frequently in the numerical than the verbal case, while overconfidence was greater verbally than numerically. Response distributions and degrees of overconfidence were also affected by payoffs. Practical and theoretical implications are discussed.

Journal ArticleDOI
TL;DR: In this article, the authors evaluate the effect of ambiguity on individual decisions and the resulting market prices, examining whether ambiguity effects persist in the face of market incentives and feedback.
Abstract: Prior studies have shown that individuals are averse to ambiguity in probability. Many decisions are, however, made in market settings where an individual's decision is influenced by decisions of others participating in the market. In this paper, we extend the previous research to evaluate the effect of ambiguity on individual decisions and the resulting market price in market settings. We therefore examine an important issue: whether ambiguity effects persist in the face of market incentives and feedback. Two different market organizations, the sealed bid auction and the double oral auction, were employed. The subjects in the experiments were graduate business students and bank executives. Our results show that the individual bids and market prices for lotteries with ambiguous probabilities are consistently lower than the corresponding bids and market prices for equivalent lotteries with well-defined probabilities. The aversion to ambiguity therefore does not vanish in market settings. Our results provide insights into what a manager can expect in bidding situations where the object of the sale (oil leases, mineral rights) involves ambiguity in probability due to, for example, lack of information or prior experience. The results may also be useful in understanding some phenomena in insurance and equity markets.

Journal ArticleDOI
TL;DR: In this article, the authors explain why gain/loss discount rate differences reported in previous studies cannot be attributed to outcome sign alone, but rather, must be associated with particular outcome-sign/question-frame combinations.
Abstract: This study explains why gain/loss discount rate differences reported in previous studies cannot be attributed to outcome sign alone, but rather, must be associated with particular outcome-sign/question-frame combinations. To do so, it extends Loewenstein's (1988) framing model of intertemporal choice to negative outcomes and uses the resulting predictions to interpret and integrate the results of three previous studies comparing subjective discount rates (Thaler 1981, Loewenstein 1988, Benzion et al. 1989). The new framework reveals previously unidentified linkages among outcome signs, question frames, and discount rates. To investigate whether losses and gains are, in fact, discounted differently, an experiment is conducted that includes a neutral-frame intertemporal choice scenario (no proposed change in outcome timing) for each outcome sign. The results show that subjective discount rates vary in a predictable way according to the outcome sign and question frame combination examined.
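
The quantity being compared across signs and frames is the subjective discount rate implied by an intertemporal choice, which is a one-line computation (the numbers below are mine, not the study's stimuli).

```python
# One-line arithmetic behind a "subjective discount rate" (assumed numbers):
# indifference between $100 now and $150 in two years implies
# 100 = 150 / (1 + r)^2.
now, later, years = 100.0, 150.0, 2.0
r = (later / now) ** (1.0 / years) - 1.0
print(round(r, 4))   # ~0.2247; sign/frame combinations shift this rate
```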

Journal ArticleDOI
TL;DR: In this article, the authors compared installation and echelon stock policies for multilevel inventory control for serial and assembly systems and concluded that echelon stock policies are, in general, superior to installation stock policies.
Abstract: This paper compares installation and echelon stock policies for multilevel inventory control. The major results are for serial and assembly systems. For (Q, r)-rules, echelon stock policies are, in general, superior to installation stock policies. A Kanban-policy is identified as a restricted type of installation stock (Q, r)-policy.
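
The bookkeeping distinction driving the result can be made explicit. In the two-stage sketch below (all quantities and reorder points assumed), the warehouse's installation inventory position counts only its own stock and inbound orders, while its echelon position also counts everything downstream; a (Q, r) trigger keyed to the echelon position therefore reacts to retailer-level depletion that the installation view cannot see.

```python
# Two-stage serial system bookkeeping (all numbers assumed): installation
# vs echelon inventory position at the warehouse, and the (Q, r) reorder
# decision each view implies.
retailer_stock, retailer_on_order = 3, 0      # retailer has nearly run dry
warehouse_stock, warehouse_on_order = 30, 10

installation_pos = warehouse_stock + warehouse_on_order
echelon_pos = installation_pos + retailer_stock + retailer_on_order

Q, r_installation, r_echelon = 20, 25, 50     # illustrative reorder points
print("installation position:", installation_pos,
      "-> order" if installation_pos <= r_installation else "-> wait")
print("echelon position:", echelon_pos,
      "-> order" if echelon_pos <= r_echelon else "-> wait")
```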

Journal ArticleDOI
TL;DR: An integrated conceptual framework for resource planning is presented to examine how tool management issues, depending upon their scope, can be classified into tool-level, machine- level, and system-level concerns, and how information from lower levels feeds back to higher level decisions.
Abstract: The evidence is clear that a lack of attention to structured tool management has resulted in the poor performance of many manufacturing systems. Plant tooling systems affect product design options, machine loading, job batching, capacity scheduling, and real-time part routing decisions. With increasing automation in manufacturing systems, there is a growing need to integrate tool management more thoroughly into system design, planning and control. This paper critically evaluates various tool management approaches, identifying the operational tradeoffs and analyzing the models developed to address management decisions involving tooling. These decisions range from selecting the optimal machining parameters and the most economic processing rate for a particular operation, to the loading of tools and jobs on machines and the determination of the optimal tool-mix inventories needed for a particular production schedule. We present an integrated conceptual framework for resource planning to examine how tool management issues, depending upon their scope, can be classified into tool-level, machine-level, and system-level concerns. This framework specifies how decisions made at one level constrain those at lower levels, and how information from lower levels feeds back to higher level decisions. The framework structures our critical evaluation of the modeling approaches found in the academic literature and points to promising directions for future research.

Journal ArticleDOI
TL;DR: The simulation results show that the injection of autocorrelation into interarrival times, and to a lesser extent into service demands, can have a dramatic impact on performance measures.
Abstract: The performance of single-server queues with independent interarrival intervals and service demands is well understood, and often analytically tractable. In particular, the M/M/1 queue has been thoroughly studied, due to its analytical tractability. Little is known, though, when autocorrelation is introduced into interarrival times or service demands, resulting in loss of analytical tractability. Even the simple case of an M/M/1 queue with autocorrelations does not appear to be well understood. Such autocorrelations do, in fact, abound in real-life systems, and worse, simplifying independence assumptions can lead to very poor estimates of performance measures. This paper reports the results of a simulation study of the impact of autocorrelation on performance in a FIFO queue. The study used two computer methods for generating autocorrelated random sequences, with different autocorrelation characteristics. The simulation results show that the injection of autocorrelation into interarrival times, and to a lesser extent into service demands, can have a dramatic impact on performance measures. From a performance viewpoint, these effects are generally deleterious, and their magnitude depends on the method used to generate the autocorrelated process. The paper discusses these empirical results and makes some recommendations to practitioners of performance analysis of queuing systems.
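
The experiment's flavor is easy to reproduce with a waiting-time recursion. The sketch below uses my own generator, a Gaussian-copula AR(1) giving exponential marginals with tunable autocorrelation, rather than the paper's two methods, and shows the deleterious effect: positive autocorrelation in interarrival times inflates mean waits far beyond the M/M/1 value at the same utilization.

```python
# FIFO single-server queue via the Lindley recursion, with interarrival
# times given exponential marginals but AR(1)-style autocorrelation via a
# Gaussian copula (my generator, not the paper's two methods).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)
n, mean_a, mean_s = 200_000, 1.25, 1.0     # utilization 0.8

def mean_wait(rho):
    eps = rng.standard_normal(n)
    z = np.empty(n)
    z[0] = eps[0]
    for i in range(1, n):                  # latent AR(1) process
        z[i] = rho * z[i - 1] + np.sqrt(1.0 - rho**2) * eps[i]
    a = -mean_a * np.log(norm.sf(z))       # correlated exponential interarrivals
    s = rng.exponential(mean_s, n)
    w, total = 0.0, 0.0
    for i in range(1, n):                  # Lindley: W_i = max(0, W_{i-1}+S-A)
        w = max(0.0, w + s[i - 1] - a[i])
        total += w
    return total / (n - 1)

print(round(mean_wait(0.0), 2))   # independent arrivals: near the M/M/1 value 4.0
print(round(mean_wait(0.8), 2))   # autocorrelated arrivals: dramatically larger
```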

Journal ArticleDOI
TL;DR: This paper found that subjects do not adjust attribute weights properly when the range of outcomes of an attribute is varied, and that the bias was smaller for a regression procedure than for the direct ratio method.
Abstract: Multiattribute utility theory requires a specific relation between the range of outcomes of each attribute and the weight for that attribute. The greater the range, the greater the weight has to be. Experimental results show that subjects do not adjust their judgments properly if the range is varied. For the two methods tested the adjustment is smaller than required by theory. The bias was smaller for a regression procedure than for the direct ratio method. Weights based on an intuitive range were not found to be superior to those elicited over different ranges.

Journal ArticleDOI
TL;DR: In this paper, the authors explore the rational effect of price variation on sales and consumption in markets where consumers are uncertain about the future price of goods and derive an optimal ordering policy which expresses the amount a consumer should purchase and consume in a given period as a function of the observed price of the good, the distribution of future prices, and the nature of his or her inventory.
Abstract: We explore the rational effect of price variation on sales and consumption in markets where consumers are uncertain about the future price of goods. We first derive an optimal ordering policy which expresses the amount a consumer should purchase and consume in a given period as a function of the observed price of the good, the distribution of future prices, and the nature of his or her inventory. This policy extends previous normative models of inventory control, such as those by Golabi 1985 and Kalymon 1970 to the case where the amount to consume in a given period is an explicit decision variable and prices follow a first-order stochastic process. We then use this model to explore how changes in the long-run frequency and temporal correlations of price promotions should normatively affect the contemporaneous relationship between purchase, consumption and price. Among the predictions which follow from the model are that consumption should rationally increase with the size of existing inventories, the short-term sensitivity of sales to prices should be greater than that of consumption to price, and this discrepancy increases with decreases in the temporal correlation of price deals and the long-term relative frequency of price deals.
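
A back-of-envelope version of the stockpiling logic conveys the deal-frequency effect (all numbers are assumed, and this is a caricature of, not a substitute for, the paper's stochastic dynamic model): buying ahead at a deal price pays when the discount beats the expected holding cost until the next purchase opportunity.

```python
# Stockpiling caricature (assumed numbers): buy an extra unit at the deal
# price if the discount exceeds the expected holding cost until the next
# deal, approximated as 1/p periods for per-period deal probability p.
deal_price, regular_price = 2.00, 2.50
holding_cost_per_period = 0.10          # storage/capital cost per unit
deal_probability = 0.25                 # long-run deal frequency
expected_periods_held = 1.0 / deal_probability
stockpile_gain = regular_price - deal_price
stockpile_cost = holding_cost_per_period * expected_periods_held
print(stockpile_gain > stockpile_cost)  # True here: buying ahead pays
```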

Journal ArticleDOI
TL;DR: In this article, the feasibility of improving performance in dynamic tasks by providing cognitive feedback or feedforward was examined, and the results indicated that subjects provided with cognitive feedback performed best, followed by those provided with feedforward.
Abstract: Studies conducted in recent years have shown that outcome feedback in dynamic decision-making tasks does not lead to improved performance. This has led researchers to examine alternatives to outcome feedback for improving decision makers' performance in such tasks. This study examines the feasibility of improving performance in dynamic tasks by providing cognitive feedback or feedforward. We report a laboratory experiment in which subjects managed a set of simulated software development projects. Results indicate that subjects provided with cognitive feedback performed best, followed by those provided with feedforward. Subjects provided with outcome feedback performed poorly. We discuss the implications of the results for decision support in dynamic tasks.

Journal ArticleDOI
TL;DR: In this article, the mean number of busy servers as a function of time in an Mt/G/∞ queue having a nonhomogeneous Poisson arrival process with a sinusoidal arrival rate function is described.
Abstract: In this paper we describe the mean number of busy servers as a function of time in an Mt/G/∞ queue having a nonhomogeneous Poisson arrival process with a sinusoidal arrival rate function. For an Mt/G/∞ model with appropriate initial conditions, it is known that the number of busy servers at time t has a Poisson distribution for each t, so that the full distribution is characterized by its mean. Our formulas show how the peak congestion lags behind the peak arrival rate and how much smaller the range of congestion is than the range of offered load. The simple formulas can also be regarded as consequences of linear system theory, because the mean function can be regarded as the image of a linear operator applied to the arrival rate function. We also investigate the quality of various approximations for the mean number of busy servers such as the pointwise stationary approximation and several polynomial approximations. Finally, we apply the results for sinusoidal arrival rate functions to treat general periodic arrival rate functions using Fourier series. These results are intended to provide a better understanding of the behavior of the Mt/G/∞ model and related Mt/G/s/r models where some customers are lost or delayed.
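
Both effects, the lag of peak congestion and the damped range, can be seen numerically from the standard infinite-server formula; the sketch below assumes exponential service for concreteness and computes the integral by quadrature rather than using the paper's closed forms.

```python
# For Mt/G/inf, the mean number of busy servers is
#   m(t) = integral_0^inf lambda(t - u) * P(S > u) du.
# With a sinusoidal arrival rate and exponential service (assumed here),
# the congestion peak lags the arrival-rate peak and its range is damped
# relative to the offered load lambda(t) * E[S].
import numpy as np

lam_bar, beta, gamma, mean_s = 10.0, 5.0, 1.0, 1.0
u = np.linspace(0.0, 40.0, 40_001)
du = u[1] - u[0]
surv = np.exp(-u / mean_s)                     # P(S > u) for exponential service

def m(t):
    return np.sum((lam_bar + beta * np.sin(gamma * (t - u))) * surv) * du

t = np.linspace(0.0, 2.0 * np.pi, 629)
mt = np.array([m(ti) for ti in t])
load = lam_bar * mean_s + beta * mean_s * np.sin(gamma * t)
print("peak lag:", round(t[mt.argmax()] - t[load.argmax()], 3))       # ~pi/4
print("range:", round(mt.max() - mt.min(), 2), "vs", load.max() - load.min())
```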

Journal ArticleDOI
TL;DR: In this article, the authors analyze diffusion rates in Japan, Western Europe and the United States of flexible manufacturing systems and determine whether U.S. firms have been relatively slow to introduce this innovation, and if so, why.
Abstract: This paper is concerned with the rate of diffusion of flexible manufacturing systems, one of the most important industrial applications of information technology. Using data from about 175 firms, I analyze diffusion rates in Japan, Western Europe and the United States, a major purpose being to determine whether, as is often claimed, U.S. firms have been relatively slow to introduce this innovation, and if so, why.

Journal ArticleDOI
TL;DR: Probabilities of survival assessed by physicians for patients admitted to an intensive care unit are studied, and the key factor in relative overall performance is found to be the level of discrimination provided by the probabilities.
Abstract: In this paper, probabilities of survival assessed by physicians for patients admitted to an intensive care unit are studied. The probabilities from each of four types of physicians are evaluated on an overall basis and in terms of specific attributes, and the groups are compared. The physicians with the most experience and expertise perform better overall. All four groups appear to be reasonably well calibrated, and the key factor in relative overall performance is the level of discrimination provided by the probabilities. Averages of two, three, and four probabilities for each individual patient are also analyzed. As the number of the probabilities in the average increases, performance improves on average on all dimensions, although the best overall performance is exhibited by a combination of probabilities from the two physician types performing best individually. Some comparisons are made with previous work, and implications for probability assessment and combination in medicine and more generally in other areas of application are discussed. Important characteristics of the study are the fact that it was conducted on-line in a real setting, the involvement of individuals with different levels of expertise, the use of a true predictive situation with a clearly-defined event, the consideration of multiple dimensions of the quality of judgments, and the collection of multiple probabilities for each case to permit the investigation of a variety of possible combinations of probabilities.
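
The evaluation machinery such a study relies on is standard and worth making concrete. The sketch below computes the Brier score and its Murphy decomposition on synthetic forecasts (not the ICU data): reliability measures calibration error, and resolution captures the discrimination the paper identifies as the key performance driver.

```python
# Brier score and its Murphy decomposition on synthetic forecasts:
#   Brier = reliability - resolution + uncertainty,
# with forecasts binned by their stated value.
import numpy as np

p = np.array([0.1, 0.1, 0.3, 0.3, 0.7, 0.7, 0.9, 0.9])  # assessed P(survival)
y = np.array([0,   0,   0,   1,   1,   0,   1,   1  ])  # actual outcomes

brier = np.mean((p - y) ** 2)
base = y.mean()
rel = res = 0.0
for v in np.unique(p):                       # bin forecasts by stated value
    m = p == v
    rel += m.mean() * (v - y[m].mean()) ** 2          # calibration error
    res += m.mean() * (y[m].mean() - base) ** 2       # discrimination
unc = base * (1 - base)
print(round(brier, 4), round(rel - res + unc, 4))     # the two sides agree
```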

Journal ArticleDOI
TL;DR: In this article, a graph-theoretic approach is used to determine an optimal solution for this goal under a new, nonconvex objective function, and it is shown that a schedule always exists such that, at all times, the deviation of actual production from the desired level of production for every product is never more than one unit.
Abstract: A mixed-model manufacturing facility running under a Just-in-Time (JIT) production system is controlled by setting the production sequence of the final assembly process. This sequence is set to achieve the primary goal of an organization operating under a JIT system, which is to maintain a constant rate of part usage. In this paper, a graph-theoretic approach is used to determine an optimal solution for this goal for a new, nonconvex objective function. Furthermore, it is shown that a schedule always exists such that, at all times, the deviation of actual production from the desired level of production for every product is never more than one unit.
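
A simple Toyota-style goal-chasing rule illustrates the objective being optimized; it is a stand-in for, not a reproduction of, the paper's graph-theoretic algorithm, and the product mix below is assumed.

```python
# Goal-chasing sketch: at each slot, assemble the product whose cumulative
# output lags its ideal share k*d_i/total the most, then track the maximum
# deviation of actual from ideal production.
demand = {"A": 4, "B": 3, "C": 2}          # units per cycle (assumed mix)
total = sum(demand.values())
produced = {prod: 0 for prod in demand}
worst = 0.0
for k in range(1, total + 1):
    # product with the largest shortfall against its ideal level k*d_i/total
    pick = max(demand, key=lambda i: k * demand[i] / total - produced[i])
    produced[pick] += 1
    dev = max(abs(produced[i] - k * demand[i] / total) for i in demand)
    worst = max(worst, dev)
    print(k, pick, round(dev, 3))
print("max deviation:", round(worst, 3))   # stays below one unit here
```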

Journal ArticleDOI
TL;DR: Computational studies reveal that the heuristic policies applied to the Fixed-life Perishability Problem are near optimal, and are easy to compute.
Abstract: This paper details the application of a class of heuristics to the Fixed-life Perishability Problem formulated by Nahmias (1975a) and Fries (1975). Various assumptions for this model include i.i.d. demand and linear ordering, holding and penalty costs. Goods have a known fixed lifetime, and perished goods cause a linear outdating cost to be incurred. The approach we use, that of developing heuristics from 'near myopic' bounds, involves viewing periodic inventory problems in the framework of the classic "newsboy" model. We exploit various properties of the problem under consideration to derive tight bounds on the newsboy parameters, thus leading to efficient bounds on the order quantities. Computational studies reveal that the heuristic policies are near optimal, and are easy to compute.
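
The "newsboy" framing the heuristics exploit can be made concrete. The fractile calculation below is an illustrative caricature, with my own numbers and a crude outdating adjustment, not the paper's near-myopic bounds: perishability effectively raises the overage cost, pulling the order-up-to target down.

```python
# Newsboy caricature of the perishable problem (assumed numbers): the
# classic critical fractile p/(p+h), versus a version where the outdating
# charge theta is added to the overage cost, lowering the target.
from scipy.stats import poisson

h, p, theta, mean_demand = 1.0, 9.0, 4.0, 20   # holding, shortage, outdating

frac_plain = p / (p + h)                 # classic newsboy critical fractile
frac_perish = p / (p + h + theta)        # outdating charged as extra overage
print(poisson.ppf(frac_plain, mean_demand))    # target ignoring perishability
print(poisson.ppf(frac_perish, mean_demand))   # tighter perishable target
```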