
Showing papers in "Management Science in 1991"


Journal ArticleDOI
TL;DR: In this article, the authors show that a portfolio optimization model using the L1 risk (mean absolute deviation) function removes most of the difficulties associated with the classical Markowitz model while maintaining its advantages over equilibrium models.
Abstract: The purpose of this paper is to demonstrate that a portfolio optimization model using the L1 risk (mean absolute deviation risk) function can remove most of the difficulties associated with the classical Markowitz model while maintaining its advantages over equilibrium models. In particular, the L1 risk model leads to a linear program instead of a quadratic program, so that a large-scale optimization problem consisting of more than 1,000 stocks may be solved on a real-time basis. Numerical experiments using the historical data of NIKKEI 225 stocks show that the L1 risk model generates a portfolio quite similar to that of the Markowitz model in a fraction of the time required to solve the latter.
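
To make the linearity claim concrete, here is a minimal sketch of the L1 (mean absolute deviation) model written as a linear program and solved with SciPy. The return matrix and all parameter values are made-up illustrative data, not the NIKKEI 225 series used in the paper.

```python
# A hedged sketch of a MAD-style portfolio LP.
# Variables: asset weights w (n of them) and absolute-deviation auxiliaries d (T).
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
T, n = 60, 5                          # T return scenarios, n assets (toy sizes)
R = rng.normal(0.01, 0.05, (T, n))    # hypothetical historical returns
mu = R.mean(axis=0)                   # expected returns
A = R - mu                            # deviations from the mean
RHO = mu.mean()                       # required return (feasible by construction)

c = np.concatenate([np.zeros(n), np.ones(T) / T])   # minimize (1/T) * sum d_t
# d_t >= |A_t . w| becomes two inequalities per scenario; plus -mu.w <= -RHO
A_ub = np.vstack([
    np.hstack([A, -np.eye(T)]),
    np.hstack([-A, -np.eye(T)]),
    np.hstack([-mu, np.zeros(T)])[None, :],
])
b_ub = np.concatenate([np.zeros(2 * T), [-RHO]])
A_eq = np.hstack([np.ones(n), np.zeros(T)])[None, :]  # weights sum to one

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, 1)] * n + [(0, None)] * T)
print("weights:", res.x[:n].round(3), "MAD:", round(res.fun, 5))
```
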

1,408 citations


Journal ArticleDOI
TL;DR: In this paper, the authors developed the perspective that joint ventures are created as real options to expand in response to future technological and market developments, and the exercise of the option is accompanied by an acquisition of the venture.
Abstract: This article develops the perspective that joint ventures are created as real options to expand in response to future technological and market developments. The exercise of the option is accompanied by an acquisition of the venture. It is hypothesized that the timing of the acquisition should be triggered by a product market signal indicating an increase in the venture's valuation. Based on a sample of 92 manufacturing joint ventures, this hypothesis is tested by estimating the effect of product market signals on the hazard of acquisition. The results indicate that unexpected growth in the product market increases the likelihood of acquisition; unexpected shortfalls in product shipments have no effect on the likelihood of dissolution. This asymmetry in the results strongly supports the interpretation of joint ventures as options to expand.

1,261 citations


Journal ArticleDOI
TL;DR: In this article, a large sample empirical study of the factors which influence the choice of Japanese firms between full or partial ownership of their U.S. manufacturing subsidiaries is presented, showing that the degree of ownership taken by Japanese manufacturing investors in their American subsidiaries is driven by the same general transaction costs variables that determine the choices made by their Japanese parents joint venture when they need to combine with other firms intermediate inputs.
Abstract: This paper offers the first large sample empirical study of the factors which influence the choice of Japanese firms between full or partial ownership of their U.S. manufacturing subsidiaries. It studies for the first time the ownership policies of investors of a single home country in a single host country, thus keeping variations within home and host countries constant. One methodological improvement over previous studies is the use as independent variables of the relevant characteristics of the investing firms. These had been proxied in previous studies by data on U.S. industries entered. The results suggest that the degree of ownership taken by Japanese manufacturing investors in their American subsidiaries is driven by the same general transaction costs variables that determine the choices made by their U.S. counterparts: Japanese parents joint venture when they need to combine with other firms' intermediate inputs which are subject to high market transaction costs. An intriguing result, however, is the lack of significance of two variables which, in the U.S. case, strongly push towards full control of foreign subsidiaries. In this study neither the Japanese parent's R&D nor its advertising intensities had any significant impact on their ownership policies.

1,004 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the failure to match strategy and environment hurts financial performance, and that the match between environment and strategy is positively related to financial performance.
Abstract: It has often been argued that an organization's strategy and structure must be tailored or matched to the challenges posed by its environment. Our research shows that this match is less likely to be achieved by long-tenured CEOs than by their counterparts with less tenure. It also suggests that the failure to match strategy and environment hurts financial performance. More specifically, CEO tenure related inversely to the prescribed match between organization and environment, especially in uncertain settings and where ownership was concentrated. The match between environment and strategy was in turn positively related to financial performance.

934 citations


Journal ArticleDOI
TL;DR: In this article, the authors show that in some circumstances, even with significant piracy, not protecting can be the best policy, both raising firm profits and lowering selling prices, since consumers have an incentive to economize on post-purchase learning and customization costs.
Abstract: Software piracy by users is generally believed to harm both software firms through lower profits and buying customers through higher prices. Thus, it is thought that perfect and costless technological protection would benefit both firms and consumers. The model developed here suggests that in some circumstances, even with significant piracy, not protecting can be the best policy, both raising firm profits and lowering selling prices. Key to the analysis is joining the presence of a positive network externality with the fact that piracy increases the total number of program users. The network externality exists because consumers have an incentive to economize on post-purchase learning and customization costs.

676 citations


Journal ArticleDOI
TL;DR: This exploratory paper sketches some of the behavioral processes that give rise to the learning curve, and constructs a model of productivity improvement as a function of cumulative output and two managerial variables: engineering changes and workforce training.
Abstract: This exploratory paper sketches some of the behavioral processes that give rise to the learning curve. Using data from two manufacturing departments in an electronic equipment company, we construct a model of productivity improvement as a function of cumulative output and two managerial variables: engineering changes and workforce training. Exploration of this model highlights the complex relationship between first-order and second-order learning.

608 citations


Journal ArticleDOI
TL;DR: In this article, the authors consider a one-to-one correspondence between the behavioral violations of the respective normative theories for the two domains (i.e., expected utility and discounted utility models) and argue that such violations are broadly consistent with three propositions about the weight that an attribute receives in both types of multiattribute choice.
Abstract: This paper considers a number of parallels between risky and intertemporal choice. We begin by demonstrating a one-to-one correspondence between the behavioral violations of the respective normative theories for the two domains (i.e., expected utility and discounted utility models). We argue that such violations (or preference reversals) are broadly consistent with three propositions about the weight that an attribute receives in both types of multiattribute choice. Specifically, it appears that: (1) if we add a constant to all values of an attribute, then that attribute becomes less important; (2) if we proportionately increase all values of an attribute, then that attribute becomes more important; and (3) if we change the sign of an attribute from positive to negative, then that attribute becomes more important. The generality of these propositions, as well as the constraints they would impose on separable representations of multiattribute preferences, is discussed.

585 citations


Journal ArticleDOI
TL;DR: In this article, a simple forward algorithm for the generalized Wagner-Whitin model is proposed, which solves the general model in O(n log n) time and O(n) space, as opposed to the well-known shortest-path algorithm advocated over the last 30 years, which requires O(n²) time.
Abstract: This paper is concerned with the general dynamic lot size model, or generalized Wagner-Whitin model. Let n denote the number of periods into which the planning horizon is divided. We describe a simple forward algorithm which solves the general model in O(n log n) time and O(n) space, as opposed to the well-known shortest path algorithm advocated over the last 30 years with O(n²) time. A linear, i.e., O(n)-time and space, algorithm is obtained for two important special cases: (a) models without speculative motives for carrying stock, i.e., where in each interval of time the per-unit order cost increases by less than the cost of carrying a unit in stock; (b) models with nondecreasing setup costs. We also derive conditions for the existence of monotone optimal policies and relate these to known planning horizon and other results from the literature.
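
For reference, the well-known quadratic-time recursion that this paper improves upon is easy to state. The sketch below implements that classic O(n²) dynamic program (not the authors' O(n log n) forward algorithm) on a toy instance with made-up demands and costs; per-unit order costs are omitted for brevity.

```python
# Classic Wagner-Whitin dynamic program: F[t] is the minimum cost of
# covering demands for periods 1..t, with the last order placed in some
# period s <= t (setup cost setup[s-1] plus linear holding costs).
def wagner_whitin(demand, setup, hold):
    n = len(demand)
    F = [0.0] + [float("inf")] * n
    for t in range(1, n + 1):
        for s in range(1, t + 1):
            carry = sum(hold * (j - s) * demand[j - 1] for j in range(s, t + 1))
            F[t] = min(F[t], F[s - 1] + setup[s - 1] + carry)
    return F[n]

print(wagner_whitin([20, 50, 10, 50, 50], [100] * 5, 1.0))  # toy instance
```
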

403 citations


Journal ArticleDOI
TL;DR: In this article, the authors combine Data Envelopment Analysis (DEA) with regression modelling to estimate relative efficiency in the public school districts of Connecticut, and find that while productivity of school inputs varies considerably across districts, this can be attributed to differences in the socio-economic background of the communities served.
Abstract: This study combines Data Envelopment Analysis (DEA) with regression modelling to estimate relative efficiency in the public school districts of Connecticut. Factors affecting achievements are classified as school inputs and other socio-economic factors. DEA is performed with the school inputs only. Efficiency measures obtained from DEA are subsequently related to the socio-economic factors in a regression model with a one-sided disturbance term. The findings suggest that while productivity of school inputs varies considerably across districts, this can be ascribed to a large extent to differences in the socio-economic background of the communities served. Variation in managerial efficiency is much less than what is implied by the DEA results.
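
The DEA stage of such a study reduces to one small linear program per district. The sketch below solves the input-oriented CCR model with SciPy on entirely hypothetical input/output data; the paper's second-stage regression on the resulting scores is not reproduced here.

```python
# Input-oriented CCR DEA: for each unit k, find the smallest theta such that
# a nonnegative combination of all units uses at most theta * inputs(k)
# while producing at least outputs(k).
import numpy as np
from scipy.optimize import linprog

X = np.array([[5., 3.], [8., 1.], [7., 4.], [4., 2.]])  # inputs, one row per unit
Y = np.array([[2.], [3.], [4.], [1.]])                  # outputs

def ccr_efficiency(k):
    n, m, s = X.shape[0], X.shape[1], Y.shape[1]
    c = np.concatenate([[1.0], np.zeros(n)])            # minimize theta
    A_in = np.hstack([-X[k].reshape(m, 1), X.T])        # X.T @ lam <= theta * x_k
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])         # Y.T @ lam >= y_k
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([np.zeros(m), -Y[k]]),
                  bounds=[(0, None)] * (1 + n))
    return res.fun

for k in range(len(X)):
    print(f"unit {k}: efficiency = {ccr_efficiency(k):.3f}")
```
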

403 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the relationship between organizational quality context and actual and ideal quality management using data collected from 152 managers from 77 business units of 20 manufacturing and service companies in order to measure managers' perceptions of ideal and actual quality management in terms of eight critical factors including product/service design, training, employee relations and top management leadership.
Abstract: While the quality literature abounds with prescriptions for how quality should be managed, no one has proposed an organization-theory explanation for how quality is managed in organizations. This paper proposes a system-structural model of quality management that relates organizational quality context, actual quality management, ideal quality management, and quality performance. The relationships between organizational quality context and actual and ideal quality management are investigated using data collected from 152 managers from 77 business units of 20 manufacturing and service companies. A previously reported instrument is used to measure managers' perceptions of ideal and actual quality management in terms of eight critical factors including product/service design, training, employee relations, and top management leadership. Several measures are used to characterize organizational quality context including company type, company size, degree of competition, and corporate support for quality. The results indicate that organizational quality context influences managers' perceptions of both ideal and actual quality management. This suggests that knowledge of organizational quality context is useful for explaining and predicting quality management practice. Important contextual variables are corporate support for quality, past quality performance, managerial knowledge, and the extent of external quality demands.

354 citations


Journal ArticleDOI
David Abramson1
TL;DR: This paper considers a solution to the school timetabling problem using simulated annealing, and presents a parallel algorithm which can provide a faster solution than the equivalent sequential algorithm.
Abstract: This paper considers a solution to the school timetabling problem. The timetabling problem involves scheduling a number of tuples, each consisting of a class of students, a teacher, a subject and a room, to a fixed number of time slots. A Monte Carlo scheme called simulated annealing is used as an optimisation technique. The paper introduces the timetabling problem, and then describes the simulated annealing method. Annealing is then applied to the timetabling problem. A prototype timetabling environment is described, followed by some experimental results. A parallel algorithm which can be implemented on a multiprocessor is presented. This algorithm can provide a faster solution than the equivalent sequential algorithm. Some further experimental results are given.
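
The core of the method is a generic annealing loop. The sketch below applies one to a toy version of the problem; the clash-counting energy function, the data structures, and the cooling schedule are simplified stand-ins for the paper's actual cost function and tuning.

```python
# Simulated annealing on a toy timetabling instance: assign tuples to slots,
# minimizing teacher/room clashes. Worsening moves are accepted with
# Boltzmann probability exp(-delta / T), with T reduced geometrically.
import math, random

def clashes(slot_of, tuples):
    bad = 0
    for i in range(len(tuples)):
        for j in range(i + 1, len(tuples)):
            if slot_of[i] == slot_of[j] and (
                    tuples[i]["teacher"] == tuples[j]["teacher"]
                    or tuples[i]["room"] == tuples[j]["room"]):
                bad += 1
    return bad

def anneal(tuples, n_slots, T=10.0, cooling=0.995, steps=20000):
    slot_of = [random.randrange(n_slots) for _ in tuples]
    cost = clashes(slot_of, tuples)
    for _ in range(steps):
        i = random.randrange(len(tuples))       # perturb one assignment
        old = slot_of[i]
        slot_of[i] = random.randrange(n_slots)
        new_cost = clashes(slot_of, tuples)
        if new_cost > cost and random.random() >= math.exp((cost - new_cost) / T):
            slot_of[i] = old                    # reject the worsening move
        else:
            cost = new_cost
        T *= cooling
    return slot_of, cost

tuples = [{"teacher": t % 3, "room": t % 2} for t in range(8)]  # toy data
print(anneal(tuples, n_slots=4))
```
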

Journal ArticleDOI
TL;DR: An estimable production frontier model of software maintenance is developed, using a new methodology that allows the simultaneous estimation of both the production frontier and the effects of several productivity factors.
Abstract: The cost of maintaining application software has been rapidly escalating, and is currently estimated to comprise 50-80% of corporate information systems department budgets. In this research we develop an estimable production frontier model of software maintenance, using a new methodology that allows the simultaneous estimation of both the production frontier and the effects of several productivity factors. Our model allows deviations on both sides of the estimated frontier to reflect the impact of both production inefficiencies and random effects such as measurement errors. The model is then estimated using an empirical dataset of 65 software maintenance projects from a large commercial bank. The insights obtained from the estimation results are found to be quite consistent for reasonable variations in the specification of the model. Estimates of the marginal impacts of all of the included productivity factors are obtained to aid managers in improving productivity in software maintenance.

Journal ArticleDOI
TL;DR: In this article, new multinomial models are presented that include existing models as special cases, and the more general models are shown to be computationally more efficient.
Abstract: Contingent claims whose values depend on multiple sources of uncertainty arise in many financial contracts and in the analysis of real projects. Unfortunately, closed-form solutions for these options are rare, and numerical methods can be computationally expensive. This article extends the literature on multinomial approximating models. Specifically, new multinomial models are presented that include existing models as special cases. The more general models are shown to be computationally more efficient.
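
The simplest member of the lattice family being generalized is the one-dimensional binomial tree. The sketch below prices a European call with the Cox-Ross-Rubinstein model using illustrative parameters; the paper's contribution, multinomial trees for options on several state variables, is not reproduced here.

```python
# Cox-Ross-Rubinstein binomial tree for a European call option.
import math

def crr_call(S, K, r, sigma, T, n):
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))       # up factor per step
    d = 1 / u
    p = (math.exp(r * dt) - d) / (u - d)      # risk-neutral up probability
    disc = math.exp(-r * dt)
    values = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    for step in range(n, 0, -1):              # backward induction
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

print(crr_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0, n=200))  # ~10.45
```
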

Journal ArticleDOI
TL;DR: In this article, a methodology for evaluating and selecting R&D projects in a collective decision setting, especially useful at sectorial and national levels, is proposed, which consists of two major phases: Evaluation and Selection.
Abstract: This paper proposes a methodology for evaluating and selecting R&D projects in a collective decision setting, especially useful at sectorial and national levels. It consists of two major phases: Evaluation and Selection. The evaluation process repeatedly uses mathematical programming models to determine the “relative values” of a given R&D project from the viewpoint of the other R&D projects. The selection process of R&D projects is based on these “relative values” and is done through a model-based outranking method. The salient features of the methodology developed are its ability to (1) permit the evaluation of an R&D project from the viewpoint of the other R&D projects without at first imposing a uniform evaluation scheme, and (2) maximize the level of consensus as to which projects should not be retained in the R&D Program being funded, thus minimizing the level of possible resentment in those organizations or departments whose projects are not included in the R&D Program. Also discussed in this paper...

Journal ArticleDOI
TL;DR: The accuracy of an easily computed approximation for long run, average performance measures such as expected delay and probability of delay in multiserver queueing systems with exponential service times and periodic sinusoidal Poisson arrival processes is empirically explored.
Abstract: We empirically explore the accuracy of an easily computed approximation for long run, average performance measures such as expected delay and probability of delay in multiserver queueing systems with exponential service times and periodic sinusoidal Poisson arrival processes. The pointwise stationary approximation is computed by integrating over time, that is, by taking the expectation of the formula for the stationary performance measure with the arrival rate that applies at each point in time. This approximation, which has been empirically confirmed as a tight upper bound of the true value, is shown to be very accurate for a range of parameter values corresponding to a reasonably broad spectrum of real systems.
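
In code, the approximation is just a stationary formula averaged over the arrival-rate cycle. The sketch below does this for the probability of delay (the Erlang-C formula) in an M(t)/M/s queue with a sinusoidal Poisson arrival rate; all parameter values are made up, and the time integral is approximated by a Riemann sum.

```python
# Pointwise stationary approximation (PSA): average the stationary Erlang-C
# delay probability over one cycle of lam(t) = lam_bar + amp * sin(2*pi*t).
import math

def erlang_c(lam, mu, s):
    """Stationary delay probability in M/M/s (requires lam < s * mu)."""
    a = lam / mu
    head = sum(a**k / math.factorial(k) for k in range(s))
    tail = a**s / (math.factorial(s) * (1 - a / s))
    return tail / (head + tail)

def psa_delay(lam_bar, amp, mu, s, n=1000):
    total = 0.0
    for i in range(n):
        lam_t = lam_bar + amp * math.sin(2 * math.pi * i / n)
        total += erlang_c(lam_t, mu, s)
    return total / n

print(psa_delay(lam_bar=8.0, amp=1.5, mu=1.0, s=10))  # peak load 9.5 < 10 servers
```
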

Journal ArticleDOI
TL;DR: In this paper, the authors analyze dual sourcing in the context of the "reorder point, order quantity" inventory model with constant demand and stochastic lead times and compare it with single sourcing, showing that when the uncertainty in the lead times is high and the ordering costs are low, dual sourcing could be cost effective.
Abstract: When supply lead times are uncertain, the simultaneous procurement from two sources offers savings in inventory holding and shortage costs. Economies are achieved if these savings outweigh the increase in ordering costs. In this paper we analyze dual sourcing in the context of the "reorder point, order quantity" inventory model with constant demand and stochastic lead times and compare it with single sourcing. Two cases are studied, using the uniform and the exponential distributions, which may be thought of as two extreme ways of representing stochastic lead times. In our two-vendor model, the order quantity is split equally between the two vendors and the split orders are placed simultaneously when the inventory position reaches the reorder level. A comparison of the total expected costs suggests that when the uncertainty in the lead times is high and the ordering costs are low, dual sourcing could be cost effective.
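
A quick way to see the effect is by simulation. The sketch below compares expected time-weighted backorders per cycle for a single order of Q against two simultaneous half-orders with independent exponential lead times, holding the reorder point fixed; all parameters are made up, and the doubled ordering cost of the split policy is not netted out here.

```python
# Monte Carlo comparison of single vs. dual sourcing with exponential lead
# times: demand is constant at rate D, stock starts at the reorder point R,
# and backorders are integrated over time until the last lot arrives.
import random

def cycle_backorders(R, Q, D, mean_lead, dual, n=100_000):
    random.seed(1)
    total = 0.0
    for _ in range(n):
        if dual:
            t1, t2 = sorted(random.expovariate(1 / mean_lead) for _ in range(2))
            lots = [(t1, Q / 2), (t2, Q / 2)]
        else:
            lots = [(random.expovariate(1 / mean_lead), Q)]
        stock, short, t_prev = float(R), 0.0, 0.0
        for t, qty in lots:
            dt = t - t_prev
            if stock >= 0:
                tail = max(dt - stock / D, 0.0)     # time spent out of stock
                short += D * tail**2 / 2
            else:                                    # interval starts in backlog
                short += -stock * dt + D * dt**2 / 2
            stock += qty - D * dt
            t_prev = t
        total += short
    return total / n

print("single:", cycle_backorders(R=10, Q=40, D=5, mean_lead=3, dual=False))
print("dual  :", cycle_backorders(R=10, Q=40, D=5, mean_lead=3, dual=True))
```
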

Journal ArticleDOI
TL;DR: The use of anonymity to separate personalities from the issues and promote more objective evaluation was found to improve option generation in some circumstances, particularly those with increased criticalness and/or power differences among the participants.
Abstract: The study of negotiating groups, whether distributive (between competing parties, i.e., "win-lose") or integrative (between essentially friendly parties from the same organization, i.e., "win-win"), remains important. While much previous research in this area has focused on key analytical issues such as evaluating proposed options, much less research has addressed the equally important initial stage of negotiation: generating options for mutual gain. In general, groups do this poorly, as there are many obstacles that inhibit successful option generation. Recent advances in computer technology provide additional approaches that can be used to support option generation as one component in an overall Negotiation Support System. This paper presents an integrated series of laboratory and field studies that investigated various aspects of computer-supported option generation for groups that meet at the same place and time. The use of anonymity to separate personalities from the issues and promote more objective evaluation was found to improve option generation in some circumstances, particularly those with increased criticalness and/or power differences among the participants. Larger groups were found to be more effective than smaller groups, several smaller groups combined, and nominal groups. We present several implications for theory development and system design and use, as well as a tentative model for computer-supported group option generation.

Journal ArticleDOI
TL;DR: In this paper, the authors report a laboratory experiment to examine how a general purpose group decision support system (GDSS) influenced conflict management in small groups making a budget allocation decision.
Abstract: Computer systems to support decision-making, planning, and negotiation in groups have the potential for wide-ranging application. However, knowledge of their effects is sparse, particularly for difficult situations such as group conflict. This study reports a laboratory experiment to examine how a general purpose group decision support system (GDSS) influenced conflict management in small groups making a budget allocation decision. The study tests a model that posits that GDSS impacts on conflict outcomes are mediated by group interaction processes, particularly how the GDSS enters into group interaction. The model posits seven potential impacts, some positive and some negative, that GDSS technology might have on conflict interaction processes. The impacts do not automatically occur, but depend on the nature of the GDSS and how the group applies it. Hence, a given GDSS might result in only a subset of the seven impacts. Among other things, the model predicts that the particular combination of GDSS impacts that materializes differs across groups and that the balance of these impacts, positive or negative, determines positive or negative conflict outcomes. Groups using a particular GDSS, the Software Aided Meeting Management (SAMM) system, were compared to groups using a manual version of the same decision structures built into SAMM and to unsupported groups. Results indicated that there were differences in the level of conflict in SAMM-supported versus manually-supported and unsupported groups, and in the conflict management behaviors adopted under the different conditions. Moreover, there were differences in the impacts of SAMM for different groups, and there is some evidence that these contributed to consensus change. Overall, the theoretical model was supported by the study. The model and approach used in this study seem useful for designing future studies concerning the impacts of computer technology on group judgment and choice.

Journal ArticleDOI
TL;DR: In this paper, the authors compared four weighting methods in multiattribute utility measurement: the ratio method, the swing weighting method, the tradeoff method, and the pricing out method.
Abstract: This paper compares four weighting methods in multiattribute utility measurement: the ratio method, the swing weighting method, the tradeoff method, and the pricing out method. Two hundred subjects used these methods to weight attributes for evaluating nuclear waste repository sites in the United States. The weighting methods were compared with respect to their internal consistency, convergent validity, and external validity. Internal consistency was measured by the degree to which ordinal and cardinal or ratio responses agreed within the same weighting method. Convergent validity was measured by the degree of agreement between the weights elicited with different methods. External validity was determined by the degree to which weights elicited in this experiment agreed with weights that were elicited with managers of the Department of Energy. In terms of internal consistency, the tradeoff method fared worst. In terms of convergent validity, the pricing out method turned out to be an outlier. In terms of external validity, the pricing out method showed the best results. While the ratio and swing methods are quite consistent and show a fair amount of convergent validity, their external validity problems cast doubt on their usefulness. The main recommendation for applications is to improve the internal consistency of the tradeoff method by careful interactive elicitation and to use it in conjunction with the pricing out method to enhance its external validity.

Journal ArticleDOI
TL;DR: This work addresses the problem of sequencing jobs, each of which is characterized by one of a large number of possible combinations of customer-specified options, on a paced assembly line, in the automotive industry and describes the optimal solution for this problem.
Abstract: We address the problem of sequencing jobs, each of which is characterized by one of a large number of possible combinations of customer-specified options, on a paced assembly line. These problems arise frequently in the automotive industry. One job must be launched into the system at equal time intervals, where the time interval (or cycle time) is prespecified. The problem is to sequence the jobs to maximize the total amount of work completed, or equivalently, to minimize the total amount of incomplete work (or work overload). Since there is a large number of option combinations, each job is almost unique. This fact precludes the use of existing mixed model assembly line sequencing techniques. We first consider the sequencing problem for a single station which can perform two different sets of operations. We characterize the optimal solution for this problem and use the results as the basis for a heuristic procedure for multiple stations. Computational results with data from a major automobile company are...

Journal ArticleDOI
TL;DR: In this paper, the authors examined the use of a similar, newly-developing organizational form for purposes of innovation; namely, the internal corporate joint venture (ICJV), which has characteristics of both traditional joint ventures and internal corporate venturing.
Abstract: Organizations have increasingly turned to alternative organizational forms such as joint ventures and internal corporate ventures to enhance innovation. The present study examines the use of a similar, newly-developing organizational form for purposes of innovation; namely, the internal corporate joint venture (ICJV), which has characteristics of both traditional joint ventures and internal corporate venturing. This study presents an industry-specific analysis of innovation across 53 ICJVs (hospital/physician group combinations), using qualitative and quantitative analyses to identify those factors most strongly associated with the degree of innovativeness in these new organizations. The empirical findings suggest three factors most significantly associated with innovation in the ICJVs in our sample: (1) age similarity among organizational members, (2) the sponsoring organization's orientation towards innovation, and (3) ICJV participation in integrative activities with the sponsoring organization. The study concludes by suggesting that greater attention be devoted to studying "nested innovation," i.e., innovation within a new organizational form that is itself an administrative innovation.

Journal ArticleDOI
TL;DR: This paper reviewed several of the current controversies in the relative value of judgemental and statistical forecasting methods, and concluded that the quality of expert, informed judgemental forecasts is higher than many researchers have previously asserted, and circumstances favourable to this are identified.
Abstract: This paper reviews several of the current controversies in the relative value of judgemental and statistical forecasting methods. Where expert, informed judgemental forecasts are being used, a critical analysis of the evidence suggests that their quality is higher than many researchers have previously asserted, and circumstances favourable to this are identified. The issue of the interaction of judgemental and statistical methods is, however, identified as a more worthwhile line of inquiry, and research in this area is reviewed, differentiating approaches aimed at synthesising both of these inputs.

Journal ArticleDOI
TL;DR: In this article, the principal-agent agency paradigm is used to find incentive plans that will induce marketing and manufacturing managers to act in their self-interest so that the owner of the firm can attain as much as possible of the residual returns.
Abstract: Stereotypically, marketing is mainly concerned about satisfying customers and manufacturing is mainly interested in factory efficiency. Using the principal-agent agency paradigm, which assumes that the marketing and manufacturing managers of the firm will act in their self-interest, we seek incentive plans that will induce those managers to act so that the owner of the firm can attain as much as possible of the residual returns. One optimal incentive plan can be interpreted as follows: The owner subcontracts to pay the manufacturing manager a fixed rate for all capacity he delivers. Each marketing manager receives all of the returns from his product. In turn, all managers pay a fixed fee to the owner. Under this plan, the marketing managers will often complain about the stock level decisions, even though these levels are announced in advance. Under a revised plan, the owner can eliminate such complaints by delegating the stocking decisions to the respective marketing managers, without any loss. This plan is interpreted as requiring the owner to make a futures market for manufacturing capacity, paying the manufacturing manager the expected marginal value for each unit of capacity delivered, receiving the realized marginal value from the marketing managers, and losing money on average in the process.

Journal ArticleDOI
TL;DR: The results show that, in sharp contrast to prior multiechelon models, transit-time variances play an important role in system performance.
Abstract: This paper is concerned with multiechelon inventory systems where each demand triggers a separate order (one-for-one replenishment policies). The exogenous demands are independent Poisson processes. The transit times between locations may be stochastic. The way we model these transit times follows closely the standard treatment of stochastic leadtimes in single-location models. This is different, however, from the usual approach in the multiechelon setting (e.g., METRIC). We describe simple methods for computing or approximating the steady-state behavior of the system. The results show that, in sharp contrast to prior multiechelon models, transit-time variances play an important role in system performance.

Journal ArticleDOI
Andreas Drexl1
TL;DR: This paper considers the nonpreemptive variant of a resource-constrained project job-assignment problem, where job durations as well as costs depend upon the assigned resource, and presents a hybrid branch-and-bound/dynamic programming algorithm with a rather efficient Monte Carlo type heuristic upper bounding technique.
Abstract: A recurring problem in project management involves the allocation of scarce resources to the individual jobs comprising the project. In many situations such as audit scheduling, the resources correspond to individuals (skilled labour). This naturally leads to an assignment type project scheduling problem, i.e. a project has to be processed by assigning one of several individuals (resources) to each job. In this paper we consider the nonpreemptive variant of a resource-constrained project job-assignment problem, where job durations as well as costs depend upon the assigned resource. Given precedence relations as well as release dates and deadlines, the question arises of which resources should be assigned to which jobs in order to minimize overall costs. For solving this time-resource-cost-tradeoff problem we present a hybrid branch-and-bound/dynamic programming algorithm with a rather efficient Monte Carlo type heuristic upper bounding technique as well as various relaxation procedures for determining lower bounds. Computational results are presented as well.

Journal ArticleDOI
Abstract: The Maximal Covering Location Problem (MCLP) has been the focus of considerable attention both in research and practice for some time, and numerous extensions have been proposed to broaden its appeal and enhance its applicability. In this paper, we are concerned with the addition of workload limits on the facilities. While not generally difficult to formulate, these capacity constraints make the model substantially more difficult to solve, as well as create certain pathological results, particularly in the assignment of uncovered demand to facilities. First we discuss these pathologies and extend the capacitated MCLP to address them. Then, we present an efficient solution procedure that is applicable to both simple and extended problem formulations. Finally, results of extensive tests on the solution procedure are presented and a "real-world" scale example is solved to explore the implications of the model.
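
To make the formulation concrete, here is a compact sketch of the capacitated MCLP as a small integer program in the PuLP modeling library. The demands, coverage sets, capacities, and the number of facilities to open are toy values, and the sketch omits the paper's extensions for handling uncovered demand.

```python
# Capacitated maximal covering: open p facilities and assign covered demand
# to them, maximizing covered demand subject to per-facility workload limits.
import pulp

demand = {1: 30, 2: 20, 3: 40, 4: 10}                # demand at each node
covers = {"A": {1, 2}, "B": {2, 3}, "C": {3, 4}}     # nodes each site can cover
cap = {"A": 45, "B": 45, "C": 45}                    # workload limit per site
p = 2                                                # facilities to open

prob = pulp.LpProblem("capacitated_MCLP", pulp.LpMaximize)
y = pulp.LpVariable.dicts("open", covers, cat="Binary")
x = pulp.LpVariable.dicts(
    "assign", [(i, j) for j in covers for i in covers[j]], cat="Binary")

prob += pulp.lpSum(demand[i] * x[i, j] for (i, j) in x)      # covered demand
prob += pulp.lpSum(y[j] for j in covers) == p
for i in demand:                                             # cover node once
    prob += pulp.lpSum(x[i, j] for j in covers if i in covers[j]) <= 1
for j in covers:                                             # workload limit
    prob += pulp.lpSum(demand[i] * x[i, j] for i in covers[j]) <= cap[j] * y[j]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([j for j in covers if y[j].value() > 0.5], pulp.value(prob.objective))
```
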

Journal ArticleDOI
TL;DR: In this paper, two classes of multi-item lot-sizing problems are considered and solved as mixed integer programs, based on an appropriate choice of the initial problem formulation and the addition of cuts which are generated automatically by a mathematical programming system (MPSARX).
Abstract: We consider two classes of multi-item lot-sizing problems. The first is a class of single stage problems involving joint machine capacity constraints and/or start-up costs, and the second is a class of multistage problems with general product structure. The problems are solved as mixed integer programs based on (i) an appropriate choice of the initial problem formulation and (ii) the addition of cuts which are generated automatically by a mathematical programming system (MPSARX). Our results extend and complement those of Karmarkar and Schrage (1985), Afentakis and Gavish (1986), Eppen and Martin (1987), and Van Roy and Wolsey (1987). A major advantage of this approach is its robustness or flexibility. By using just a matrix generator and a mathematical programming system with automatic cut generation routines, we can formulate and solve model variants without incurring the costs of adapting an algorithm.

Journal ArticleDOI
TL;DR: In this paper, the authors present a framework and mechanisms for problem restructuring based on the goals and goal relationships of the negotiating parties as well as means of manipulating the parties' utility estimates.
Abstract: To achieve movement towards a negotiated settlement, it is often necessary to restructure the problem under negotiation. Problem restructuring can lead to changed perception of the issues by the parties, thus breaking deadlocks and increasing the parties' willingness to compromise. We present a framework and mechanisms for problem restructuring based on the goals and goal relationships of the negotiating parties as well as means of manipulating the parties' utility estimates. In addition, previous negotiations are a source of heuristic advice in the restructuring task. The restructuring approach has been implemented in the PERSUADER, a computer program that acts as a labor mediator in labor management disputes. To achieve its task, the PERSUADER negotiates separately with each party, company and union, to guide them in reaching agreement.

Journal ArticleDOI
TL;DR: In this paper, a polynomial algorithm was proposed to determine the optimal sequence for an objective function that is mathematically different, but intuitively similar to the objective functions of previous researchers.
Abstract: We provide a new formulation and solution procedure to sequence a mixed model just-in-time (JIT) assembly system. Mixed model JIT assembly systems are a fundamental part of the well-known “Toyota Production System.” The underlying idea in sequencing these systems is to maintain a constant usage rate of all parts on the line. We give a polynomial algorithm to determine the optimal sequence for an objective function that is mathematically different, but intuitively similar to the objective functions of previous researchers. Furthermore, a computational study indicates that the new algorithm is robust in that its sequences are as good as those generated by other algorithms when evaluated with respect to traditional objectives, but are found 200 times faster.
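
For intuition about the constant-usage objective, the sketch below implements Toyota's classic goal-chasing heuristic: at each position in the sequence, launch the product that keeps cumulative production closest to the ideal demand proportions. This greedy method is illustrative only; it is not the paper's optimal polynomial algorithm, and the demand mix is made up.

```python
# Goal-chasing heuristic for mixed-model JIT sequencing.
def goal_chasing(demand):
    total = sum(demand.values())
    produced = {p: 0 for p in demand}
    seq = []
    for k in range(1, total + 1):
        # ideal cumulative output of product q after k launches: k * d_q / total
        best = min(
            (p for p in demand if produced[p] < demand[p]),
            key=lambda p: sum(
                (produced[q] + (q == p) - k * demand[q] / total) ** 2
                for q in demand))
        produced[best] += 1
        seq.append(best)
    return seq

print("".join(goal_chasing({"A": 4, "B": 2, "C": 2})))  # a balanced sequence
```
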

Journal ArticleDOI
TL;DR: In this article, a general form of parametric quadratic programming is used to perform sensitivity analysis for mean-variance portfolio problems, which allows an investor to examine how parametric changes in either the means or the right-hand side of the constraints affect the composition, mean and variance of the optimal portfolio.
Abstract: This paper shows how to perform sensitivity analysis for mean-variance (MV) portfolio problems using a general form of parametric quadratic programming. The analysis allows an investor to examine how parametric changes in either the means or the right-hand side of the constraints affect the composition, mean, and variance of the optimal portfolio. The optimal portfolio and associated multipliers are piecewise linear functions of the changes in either the means or the right-hand side of the constraints. The parametric parts of the solution show the rates of substitution of securities in the optimal portfolio, while the parametric parts of the multipliers show the rates at which constraints are either tightening or loosening. Furthermore, the parametric parts of the solution and multipliers change in different intervals when constraints become active or inactive. The optimal MV paths for sensitivity analyses are piecewise parabolic, as in traditional MV analysis. However, the optimal paths may contain negatively sloping segments and are characterized by types of kinks, i.e., points of nondifferentiability, not found in MV analysis.
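
The building block of this analysis is easy to reproduce numerically: with only the two equality constraints active, the optimal weights solve a single KKT linear system and move linearly in the target mean; inequality constraints becoming active or inactive are what create the pieces and kinks. The sketch below uses made-up data.

```python
# Trace the equality-constrained mean-variance solution as the target mean
# varies: minimize 0.5 * w' Sigma w subject to mu'w = m and 1'w = 1.
import numpy as np

mu = np.array([0.08, 0.12, 0.10])
Sigma = np.array([[0.04, 0.01, 0.00],
                  [0.01, 0.09, 0.02],
                  [0.00, 0.02, 0.06]])

def mv_weights(m):
    n = len(mu)
    A = np.vstack([mu, np.ones(n)])                    # equality constraints
    K = np.block([[Sigma, A.T], [A, np.zeros((2, 2))]])
    rhs = np.concatenate([np.zeros(n), [m, 1.0]])
    return np.linalg.solve(K, rhs)[:n]                 # drop the multipliers

for m in (0.09, 0.10, 0.11):                           # weights are linear in m
    w = mv_weights(m)
    print(m, w.round(3), "variance =", round(float(w @ Sigma @ w), 5))
```
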