
Showing papers in "Management Science in 1986"


Journal ArticleDOI
TL;DR: Models are proposed that show how organizations can be designed to meet the information needs of technology, interdepartmental relations, and the environment to both reduce uncertainty and resolve equivocality.
Abstract: This paper answers the question, "Why do organizations process information?" Uncertainty and equivocality are defined as two forces that influence information processing in organizations. Organization structure and internal systems determine both the amount and richness of information provided to managers. Models are proposed that show how organizations can be designed to meet the information needs of technology, interdepartmental relations, and the environment. One implication for managers is that a major problem is lack of clarity, not lack of data. The models indicate how organizations can be designed to provide information mechanisms to both reduce uncertainty and resolve equivocality.

8,674 citations


Journal ArticleDOI
TL;DR: In this paper, the authors introduce the concept of a strategic factor market, i.e., a market where the resources necessary to implement a strategy are acquired, and show that such markets will be imperfectly competitive when different firms have different expectations about the future value of strategic resources.
Abstract: Much of the current thinking about competitive strategy focuses on ways that firms can create imperfectly competitive product markets in order to obtain greater than normal economic performance. However, the economic performance of firms does not depend simply on whether or not their strategies create such markets, but also on the cost of implementing those strategies. Clearly, if the cost of strategy implementation is greater than returns obtained from creating an imperfectly competitive product market, then firms will not obtain above normal economic performance from their strategizing efforts. To help analyze the cost of implementing strategies, we introduce the concept of a strategic factor market, i.e., a market where the resources necessary to implement a strategy are acquired. If strategic factor markets are perfect, then the cost of acquiring strategic resources will approximately equal the economic value of those resources once they are used to implement product market strategies. Even if such strategies create imperfectly competitive product markets, they will not generate above normal economic performance for a firm, for their full value would have been anticipated when the resources necessary for implementation were acquired. However, strategic factor markets will be imperfectly competitive when different firms have different expectations about the future value of a strategic resource. In these settings, firms may obtain above normal economic performance from acquiring strategic resources and implementing strategies. We show that other apparent strategic factor market imperfections, including when a firm already controls all the resources needed to implement a strategy, when a firm controls unique resources, when only a small number of firms attempt to implement a strategy, and when some firms have access to lower cost capital than others, are all special cases of differences in expectations held by firms about the future value of a strategic resource. Firms can attempt to develop better expectations about the future value of strategic resources by analyzing their competitive environments or by analyzing skills and capabilities they already control. Environmental analysis cannot be expected to improve the expectations of some firms more than others, and thus cannot be a source of more accurate expectations about the future value of a strategic resource. However, analyzing a firm's skills and capabilities can be a source of more accurate expectations. Thus, from the point of view of firms seeking greater than normal economic performance, our analysis suggests that strategic choices should flow mainly from the analysis of a firm's unique skills and capabilities, rather than from the analysis of its competitive environment.

5,339 citations


Journal ArticleDOI
TL;DR: Explores how lead users can be systematically identified, and how lead user perceptions and preferences can be incorporated into marketing research analyses.
Abstract: Accurate marketing research depends on accurate user judgments regarding their needs. However, for very novel products or in product categories characterized by rapid change—such as “high technology” products—most potential users will not have the real-world experience needed to problem solve and provide accurate data to inquiring market researchers. In this paper I explore the problem and propose a solution: Marketing research analyses which focus on what I term the “lead users” of a product or process. Lead users are users whose present strong needs will become general in a marketplace months or years in the future. Since lead users are familiar with conditions which lie in the future for most others, they can serve as a need-forecasting laboratory for marketing research. Moreover, since lead users often attempt to fill the need they experience, they can provide new product concept and design data as well. In this paper I explore how lead users can be systematically identified, and how lead user perceptions and preferences can be incorporated into marketing research analyses.

4,604 citations


Journal ArticleDOI
TL;DR: It is argued that electronic mail does not simply speed up the exchange of information but leads to the exchange of new information as well; much of the information conveyed through electronic mail would not have been conveyed through another medium.
Abstract: This paper examines electronic mail in organizational communication. Based on ideas about how social context cues within a communication setting affect information exchange, it argues that electronic mail does not simply speed up the exchange of information but leads to the exchange of new information as well. In a field study in a Fortune 500 company, we used questionnaire data and actual messages to examine electronic mail communication at all levels of the organization. Based on hypotheses from research on social communication, we explored effects of electronic communication related to self-absorption, status equalization, and uninhibited behavior. Consistent with experimental studies, we found that decreasing social context cues has substantial deregulating effects on communication. And we found that much of the information conveyed through electronic mail was information that would not have been conveyed through another medium.

2,452 citations


Journal ArticleDOI
TL;DR: In this article, the authors empirically tested whether different models are needed to predict the adoption of technical process innovations that contain a high degree of new knowledge (radical innovations) and those that contain a low degree of new knowledge (incremental innovations).
Abstract: This paper proposes and empirically tests whether different models are needed to predict the adoption of technical process innovations that contain a high degree of new knowledge (radical innovations) and a low degree of new knowledge (incremental innovations). Results from a sample of 40 footwear manufacturers suggest that extensive knowledge depth, measured by the number of technical specialists, is important for the adoption of both innovation types. Larger firms are likely both to have more technical specialists and to adopt radical innovations. The study did not find associations between the adoption of either innovation type and decentralized decision making, managerial attitudes toward change, or exposure to external information. By implication, managers trying to encourage technical process innovation adoption need not be as concerned about modifying centralization of decision making, managerial attitudes, and exposure to external information as would managers trying to encourage other types of innovation adoption, e.g., innovations in social services, where these factors have been found to be important. Instead, investment in human capital in the form of technical specialists appears to be a major facilitator of technical process innovation adoption.

2,389 citations


Journal ArticleDOI
TL;DR: In this article, an axiomatic treatment of the Analytic Hierarchy Process (AHP) is presented; the hierarchic axioms are a special case of axioms for priority setting in systems with feedback, which allow for a wide class of dependencies.
Abstract: This paper contains an axiomatic treatment of the Analytic Hierarchy Process (AHP). The set of axioms corresponding to hierarchic structures is a special case of axioms for priority setting in systems with feedback which allow for a wide class of dependencies. The axioms highlight: (1) the reciprocal property that is basic in making paired comparisons; (2) homogeneity, characteristic of people's ability to make comparisons among things that are not too dissimilar with respect to a common property and, hence, the need for arranging them within an order-preserving hierarchy; (3) dependence of a lower level on the adjacent higher level; (4) the idea that an outcome can only reflect expectations when the latter are well represented in the hierarchy. The AHP assumes neither transitivity nor the stronger condition of consistency, nor does it include strong assumptions of the usual notions of rationality. A number of facts are derived from these axioms, providing an operational basis for the AHP.

1,646 citations
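
To make the reciprocal axiom concrete, here is a minimal sketch (the 3x3 comparison matrix and criteria are hypothetical, not from the paper) of the computation the axioms underpin: priorities derived as the normalized principal eigenvector, plus Saaty's consistency index CI = (lambda_max - n) / (n - 1).

```python
import numpy as np

# Hypothetical reciprocal pairwise-comparison matrix for three criteria.
# The reciprocal axiom: A[j, i] == 1 / A[i, j] for every pair, A[i, i] == 1.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priorities are conventionally the principal right eigenvector,
# normalized to sum to one.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty's consistency index; zero for a perfectly consistent matrix.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
print("priorities:", np.round(w, 3), "CI:", round(ci, 3))
```

Note that the axioms do not require consistency (CI = 0); the index only measures how far the judgments deviate from it.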


Journal ArticleDOI
TL;DR: In this article, the authors report the results of an empirical investigation based on data obtained from a random sample of 100 U.S. manufacturing firms, providing new findings bearing on each of these questions.
Abstract: To what extent would the rate of development and introduction of inventions decline in the absence of patent protection? To what extent do firms make use of the patent system, and what differences exist among firms and industries and over time in the propensity to patent? These questions are in need of much more study. This paper, which reports the results of an empirical investigation based on data obtained from a random sample of 100 U.S. manufacturing firms, provides new findings bearing on each of these questions.

1,240 citations


Journal ArticleDOI
TL;DR: In this paper, the authors summarize what areas are becoming consensual among most writers on effectiveness, and point out continuing areas of disagreement and conflict, concluding that agreement about effectiveness is mainly an agreement to disagree.
Abstract: Attention to the subject of organizational effectiveness has been increasing in the last several years as popular management books have extolled management excellence, almost two million jobs have been lost due to poor U.S. competitiveness, and economic conditions have put pressure on organizations to become more accountable with their resources. However, despite its popularity, much confusion continues in the organizational literature regarding the definition, circumscription, and appropriate criteria for assessing effectiveness. In this paper, I summarize what areas are becoming consensual among most writers on effectiveness, and I point out continuing areas of disagreement and conflict. The five statements summarizing consensual characteristics of effectiveness and the three statements summarizing areas of continuing conflict point out that agreement about effectiveness is mainly an agreement to disagree. Conflicts center mainly on the incompatibility and inappropriateness of commonly selected criteria. The main theme of the paper, however, is a discussion of an inherent, but largely ignored, characteristic of effectiveness in organizations: the paradoxical nature of effectiveness criteria. This discussion illustrates that the most effective organizations are also those characterized by paradoxes, i.e., contradictions, simultaneous opposites, and incompatibilities. Taking account of this characteristic helps explain one reason why so much confusion and disagreement continues to surround effectiveness, and it uncovers a new set of research questions that can guide future investigations. Some suggestions are provided for how research on paradoxes in effectiveness might be pursued in the future.

910 citations




Journal ArticleDOI
TL;DR: The simple assembly line balancing problem (SALBP) as discussed by the authors is a deterministic optimization problem where all input parameters are assumed to be known with certainty and all the algorithms discussed are exact.
Abstract: In this survey paper we discuss the development of the simple assembly line balancing problem (SALBP); modifications and generalizations over time; present alternate 0-1 programming formulations and a general integer programming formulation of the problem; discuss other well-known problems related to SALBP; describe and comment on a number of exact (i.e., optimum-seeking) methods; and present a summary of the reported computational experiences. All models discussed here are deterministic (i.e., all input parameters are assumed to be known with certainty) and all the algorithms discussed are exact. The problem is termed "simple" in the sense that no "mixed-models," "subassembly lines," "zoning restrictions," etc. are considered. Due to the richness of the literature, we exclude from discussion here (a) the inexact (i.e., heuristic/approximate) algorithms for SALBP and (b) the algorithms for the general assembly line balancing problem, including the stochastic models.

834 citations
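
For reference, one generic 0-1 formulation of the station-minimization version the survey covers, in our own notation (not necessarily the paper's): x_{ik} = 1 if task i is assigned to station k, y_k = 1 if station k is opened, t_i the task times, c the cycle time, P the precedence relation, and m-bar an upper bound on the number of stations.

```latex
\begin{aligned}
\min\;& \sum_{k=1}^{\bar{m}} y_k \\
\text{s.t.}\;& \sum_{k=1}^{\bar{m}} x_{ik} = 1, && i = 1,\dots,n, \\
& \sum_{i=1}^{n} t_i\, x_{ik} \le c\, y_k, && k = 1,\dots,\bar{m}, \\
& \sum_{k=1}^{\bar{m}} k\, x_{hk} \le \sum_{k=1}^{\bar{m}} k\, x_{jk}, && (h,j) \in P, \\
& x_{ik},\, y_k \in \{0,1\}.
\end{aligned}
```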


Journal ArticleDOI
TL;DR: In this paper, an empirical investigation of 97 firms was conducted to determine the relationships that three aspects of the chief executive officer's (CEO's) personality have with the strategies, structures, decision making methods and performance of their firms.
Abstract: An empirical investigation of 97 firms was conducted to determine the relationships that three aspects of the chief executive officer's (CEO's) personality have with the strategies, structures, decision making methods and performance of their firms. CEO flexibility was associated with niche strategies; simple, informal structures; and intuitive, risk-embracing decision making. CEO need for achievement was related to broadly focused, marketing-oriented strategies, formal and sophisticated structures, and proactive, analytical decision making. Executives with an internal locus of control pursued more product innovation, were more future oriented, and tailored their approaches to the circumstances facing their firms. The relationships between personality and organizational characteristics were found to be by far the strongest in small firms and also somewhat more significant in dynamic environments. Flexibility and locus of control related to corporate performance under certain conditions; need for achievement did not.

Journal ArticleDOI
TL;DR: The paper treats the cases in which the categorical variable is controllable or uncontrollable by the manager, for both technical and scale inefficiency, and the approach is illustrated using real data.
Abstract: Data Envelopment Analysis has now been extensively applied in a range of empirical settings to identify relative inefficiencies and provide targets for improvements. It accomplishes this by developing peer groups for each unit being operated. The use of categorical variables is an important extension which can improve the peer group construction process and incorporate "on-off" characteristics (e.g., presence of a drive-in window or not in a banking network). It relaxes the stringent need for factors to display piecewise constant marginal productivities. In so doing, it substantially strengthens the credibility of the insights obtained. The paper treats the cases in which the categorical variable is controllable or uncontrollable by the manager, for both technical and scale inefficiency. The approach is illustrated using real data.
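
A minimal sketch of the uncontrollable-category idea on a tiny hypothetical data set: each unit's peer group is restricted to units in its own or a less favorable category, and the usual input-oriented envelopment LP is solved over that subset. The data and the use of scipy are ours; the paper's formulations are richer.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical data: one input, one output, and a 0/1 categorical flag
# (e.g., drive-in window present or not) for five bank branches.
x = np.array([20.0, 30.0, 40.0, 35.0, 25.0])   # input
y = np.array([10.0, 18.0, 30.0, 20.0, 12.0])   # output
cat = np.array([0, 0, 1, 1, 0])                # uncontrollable category

def ccr_efficiency(j0):
    # Peer group: only units whose category is no more favorable than
    # unit j0's, the restriction used for uncontrollable categories.
    peers = np.where(cat <= cat[j0])[0]
    n = len(peers)
    c = np.r_[1.0, np.zeros(n)]                # minimize theta
    # input constraint:  sum_j lam_j x_j - theta * x_{j0} <= 0
    A_in = np.r_[-x[j0], x[peers]][None, :]
    # output constraint: -sum_j lam_j y_j <= -y_{j0}
    A_out = np.r_[0.0, -y[peers]][None, :]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]), b_ub=[0.0, -y[j0]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.fun

for j in range(5):
    print(f"unit {j}: efficiency = {ccr_efficiency(j):.3f}")
```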

Journal ArticleDOI
TL;DR: The results suggest that MIS research is not well-grounded in organization theory nor have MIS research results been widely diffused in the organizational literature, and suggestions for developing a better link between MIS and organizational theory are presented.
Abstract: Researchers in all academic disciplines benefit from an understanding of the intellectual development of their field. This understanding is essential for conducting studies which build systematically on prior research. The purpose of this study is to document the intellectual development of the ideas represented by published research in Management Information Systems (MIS), based on an author co-citation analysis. The resulting mapping is intended to serve as a benchmark for future assessments of MIS as a field as well as a means for documenting the emergence of new research specialties. The study sought to identify (1) the subfields which constitute MIS research, (2) the reference disciplines of these subfields, (3) the diffusion of the ideas represented by these subfields to other disciplines, and (4) which of these subfields represent active areas of current MIS research. Nine invisible colleges, or informal clusters of research, were uncovered. These nine empirically defined conceptual groupings collectively define the intellectual foundations of MIS as well as the forces currently shaping MIS research. Four of the clusters represent early MIS research themes which are still popular, based on subsequent citation patterns. Despite the centrality of the concept of the organization to widely accepted definitions of MIS, the results suggest that MIS research is not well grounded in organization theory, nor have MIS research results been widely diffused in the organizational literature. Suggestions for developing a better link between MIS and organizational theory are presented, based on the concept of organizational effectiveness.

Journal ArticleDOI
TL;DR: In this paper, the authors review the extent to which the components of a contingent behavioral theory of organizational effectiveness already exist, one that incorporates the paradoxes and tradeoffs inherent in real life organizations.
Abstract: Concern with the effectiveness, productivity, efficiency or excellence of organizations is a subject that has motivated the writings of economists, organization theorists, management philosophers, financial analysts, management scientists, consultants, and practitioners. It has served as a unifying theme for over a century of research on the management and design of organizations, yet the empirical research has not contributed to the development of a universal theory of organizational effectiveness. In this paper we review the extent to which the components already exist for a contingent behavioral theory of organizational effectiveness, one that incorporates the paradoxes and tradeoffs inherent in real-life organizations. We consider the problem of effectiveness measurement, and we propose a research approach utilizing a strategy of engineering organizational effectiveness which could lead to an inductive, applied, empirically-based theory of contingent organization design. In this context the engineering of organizational effectiveness has a dual purpose. For the organizations and their managers participating in the research, it refers to understanding, constructing and managing organizational activities in given contexts so as to achieve and maintain improved performance as measured by one or more situationally-determined criteria. In addition, to provide the empirical basis from which induction can proceed, the engineered events must be structured and implemented in such a way as to facilitate theory construction and testing.

Journal ArticleDOI
TL;DR: In this article, the joint problem of ordering and offering price discount by a supplier to his sole/major buyer is analyzed, where the objective is to induce the buyer to alter his order schedule and size so that the supplier can benefit from lower set up, ordering and inventory holding costs.
Abstract: In this paper, the joint problem of ordering and offering price discount by a supplier to his sole/major buyer is analyzed. The objective is to induce the buyer to alter his order schedule and size so that the supplier can benefit from lower set up, ordering, and inventory holding costs. We generalize the quantity discount pricing model of Monahan (Monahan, J. P. 1984. A quantity discount pricing model to increase vendor profits. Management Sci. 30 (6) 720–726.) to: (1) explicitly incorporate constraints imposed on the amount of discount that can be offered; and (2) relax the implicit assumption of a lot-for-lot (or order-for-order) policy adopted by the supplier. An algorithm is developed to solve the supplier's joint ordering and price discount problem.
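
As context for the generalization, a sketch of the baseline lot-for-lot Monahan model that the paper relaxes, with hypothetical numbers: the buyer's EOQ is Q* = sqrt(2 A_b D / h_b), the supplier's best order-size multiple is k* = sqrt(1 + A_s / A_b), and the minimum discount equals the buyer's extra inventory cost of ordering k* Q*.

```python
import math

# Hypothetical parameters (not from the paper): annual demand, the buyer's
# ordering cost and holding rate, and the supplier's setup cost per order.
D, A_b, h_b, A_s = 10_000, 50.0, 2.0, 400.0

Q_star = math.sqrt(2 * A_b * D / h_b)    # buyer's unconstrained EOQ
C_b = math.sqrt(2 * A_b * D * h_b)       # buyer's optimal annual cost

# Under the lot-for-lot assumption the supplier's best multiple is
# k* = sqrt(1 + A_s / A_b); the discount just compensates the buyer's
# extra inventory-related cost at order size k * Q_star.
k = math.sqrt(1 + A_s / A_b)
extra_buyer_cost = C_b * ((k + 1 / k) / 2 - 1)   # minimum discount to offer
supplier_saving = (A_s * D / Q_star) * (1 - 1 / k)
print(f"k* = {k:.2f}, discount needed = {extra_buyer_cost:.0f}, "
      f"setup saving = {supplier_saving:.0f}, "
      f"net supplier gain = {supplier_saving - extra_buyer_cost:.0f}")
```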

Journal ArticleDOI
TL;DR: It is shown that significant levels of backup coverage can be provided within a system without substantial loss of first coverage.
Abstract: Backup coverage, the second coverage of a demand node, is suggested as a decision criterion in modelling the location of emergency services on a network. The efficient handling of stochastic demand by vehicles which can respond to only one call at a time may require backup coverage in areas of high demand as a means to maintain a more uniform level of service. This new criterion is applied in the context of the classic covering models, the Location Set Covering Problem and the Maximal Covering Location Problem. First coverage as defined in these models is traded off against backup coverage in the present work. Other efforts which incorporate additional levels of coverage are reviewed. We also show how to extend these models to third as well as subsequent coverage. Based on an example problem, we show that significant levels of backup coverage may be provided within a system without substantial loss of first coverage.
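
A minimal sketch of the first-versus-backup coverage tradeoff on a made-up instance (all data and the weights w1, w2 are hypothetical; the paper works with integer programming formulations rather than enumeration):

```python
from itertools import combinations

# Hypothetical instance: demand weights at six nodes; coverage[j] is the
# set of demand nodes a facility at candidate site j reaches within the
# response-time standard.
demand = {0: 10, 1: 30, 2: 20, 3: 40, 4: 15, 5: 25}
coverage = {"A": {0, 1, 2}, "B": {1, 3}, "C": {2, 3, 4}, "D": {3, 4, 5}}
p = 2               # number of vehicles/facilities to locate
w1, w2 = 1.0, 0.3   # relative value of first vs. backup coverage

def value(sites):
    total = 0.0
    for node, w in demand.items():
        hits = sum(node in coverage[s] for s in sites)
        total += w * (w1 * (hits >= 1) + w2 * (hits >= 2))
    return total

# Tiny instances can be enumerated; the paper extends the LSCP and MCLP
# integer programs instead.
best = max(combinations(coverage, p), key=value)
print("best sites:", best, "weighted coverage:", value(best))
```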

Journal ArticleDOI
TL;DR: In this paper, the authors compared the relative strengths of the estimation methods by applying both models to the same data to obtain inferences about the nature of underlying cost and production correspondences.
Abstract: This paper compares inferences about hospital cost and production correspondences from two different estimation models: (1) the econometric modeling of the translog cost function, and (2) the application of Data Envelopment Analysis (DEA). While there are numerous examples of the application of each approach to empirical data, this paper provides insights into the relative strengths of the estimation methods by applying both models to the same data. The translog results suggest that constant returns are operant, whereas the DEA results suggest that both increasing and decreasing returns to scale may be observed in different segments of the production correspondence, in turn suggesting that the translog model may be 'averaging' diametrically opposite behavior. On the other hand, by examining the rate of output transformation, both models agree that patient days devoted to care of children are more resource intensive than those devoted to adults or to the elderly. In addition, we compare estimates of technical efficiencies of individual hospitals obtained from the two methods. The DEA estimates are found to be highly related to capacity utilization, but no such relationship was found for the translog estimates. This comparative application of different estimation models to the same data to obtain inferences about the nature of underlying cost and production correspondences sheds interesting light on the strengths of each approach, and suggests the need for additional research comparing estimation models using real as well as simulated data.
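
For reference, the translog cost function underlying the first estimation model has the standard form (C total cost, y_i outputs, w_r input prices, with symmetry imposed on the second-order coefficients):

```latex
\ln C = \alpha_0 + \sum_i \alpha_i \ln y_i + \sum_r \beta_r \ln w_r
      + \tfrac{1}{2}\sum_i \sum_j \gamma_{ij} \ln y_i \ln y_j
      + \tfrac{1}{2}\sum_r \sum_s \delta_{rs} \ln w_r \ln w_s
      + \sum_i \sum_r \rho_{ir} \ln y_i \ln w_r .
```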

Journal ArticleDOI
TL;DR: A number of directions in which models require extension are outlined, in particular the representation of such aspects of FMS operation as the tool delivery systems, the blocking phenomenon, the transient behavior and the differences between flexible machining systems and flexible assembly systems.
Abstract: This paper reviews recent work on the development of analytical models of Flexible Manufacturing Systems (FMSs). The contributions of each of the groups concerned with model development are summarized, and an assessment is made of the strengths and weaknesses of each group's modelling approach. A number of directions in which models require extension are outlined, in particular the representation of such aspects of FMS operation as the tool delivery systems, the blocking phenomenon, the transient behavior, and the differences between flexible machining systems and flexible assembly systems. Further work is also required on the structure of FMS control and the integration with plant production planning and control.
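
The workhorse among analytical FMS models of this era is the closed queueing network. A minimal sketch of exact mean value analysis (MVA) for single-server stations; the visit ratios, service times, and pallet count below are hypothetical:

```python
# Exact mean value analysis for a product-form closed queueing network
# with single-server stations, the canonical analytical FMS model.
v = [1.0, 0.6, 0.4]     # visits per part at each station
s = [2.0, 3.0, 5.0]     # mean service time at each station
N = 8                   # pallets (parts concurrently in the system)

Q = [0.0] * len(v)      # mean queue lengths with n - 1 parts
for n in range(1, N + 1):
    # Arrival theorem: an arriving part sees the (n-1)-part queue lengths.
    R = [s[i] * (1.0 + Q[i]) for i in range(len(v))]   # residence times
    X = n / sum(v[i] * R[i] for i in range(len(v)))    # system throughput
    Q = [X * v[i] * R[i] for i in range(len(v))]       # Little's law

print(f"throughput: {X:.3f} parts per time unit")
for i in range(len(v)):
    print(f"station {i}: utilization {X * v[i] * s[i]:.2f}, queue {Q[i]:.2f}")
```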

Journal ArticleDOI
TL;DR: A branch and bound algorithm for the generalized assignment problem in which bounds are obtained from a Lagrangian relaxation with multipliers set by an adjustment method; it appears to be about one order of magnitude faster than the best previously existing algorithms for this problem.
Abstract: We describe a branch and bound algorithm for the generalized assignment problem in which bounds are obtained from a Lagrangian relaxation with the multipliers set by a heuristic adjustment method. The algorithm was tested on a large sample of small random problems and a number of large problems derived from a vehicle routing application. Computation times were reasonable in all cases and the branch and bound trees generated had nearly two orders of magnitude fewer nodes than for competing algorithms. Although comparison of running times on different machines is difficult, the multiplier adjustment method appears to be about one order of magnitude faster than the best previously existing algorithms for this problem.
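
A sketch of the bounding idea on a tiny hypothetical instance: relaxing the assignment constraints with multipliers decomposes the problem into one 0-1 knapsack per agent, and any multiplier vector yields a lower bound. We use a crude subgradient update purely for illustration; the paper's multiplier adjustment method is a smarter, specialized update.

```python
# Tiny generalized assignment instance (all data hypothetical): c[i][j] is
# the cost of giving job j to agent i, a[i][j] the capacity it consumes,
# and b[i] the agent's capacity.
c = [[4, 7, 3, 5], [6, 4, 5, 8]]
a = [[2, 3, 2, 3], [3, 2, 2, 4]]
b = [5, 5]
n_agents, n_jobs = len(b), len(c[0])

def knapsack_max(values, weights, cap):
    """Best-value subset by DP over used capacity; returns (value, items)."""
    best = {0: (0.0, frozenset())}
    for j, (val, wt) in enumerate(zip(values, weights)):
        for used, (v0, s0) in list(best.items()):
            u = used + wt
            if u <= cap and (u not in best or best[u][0] < v0 + val):
                best[u] = (v0 + val, s0 | {j})
    return max(best.values(), key=lambda t: t[0])

def solve_relaxation(lam):
    # Relaxing "each job assigned exactly once" gives the bound
    # L(lam) = sum_j lam_j - sum_i max{sum_j (lam_j - c_ij) x_ij : capacity},
    # i.e., one independent 0-1 knapsack per agent.
    total, count = sum(lam), [0] * n_jobs
    for i in range(n_agents):
        gain, chosen = knapsack_max(
            [lam[j] - c[i][j] for j in range(n_jobs)], a[i], b[i])
        total -= gain
        for j in chosen:
            count[j] += 1
    return total, count

lam, best_bound = [0.0] * n_jobs, float("-inf")
for t in range(200):
    bound, count = solve_relaxation(lam)
    best_bound = max(best_bound, bound)
    step = 2.0 / (t + 1)
    lam = [lam[j] + step * (1 - count[j]) for j in range(n_jobs)]
print(f"Lagrangian lower bound on the optimal cost: {best_bound:.2f}")
```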

Journal ArticleDOI
TL;DR: By introducing the idea of a quality-based learning curve, this paper links the previously disjoint literatures of quality control and learning curves to explain why high quality and low costs need not be inconsistent.
Abstract: Recent interest in product quality suggests that effort devoted to improving the quality of manufactured products may reduce unit costs. This conjecture, that improving quality can lower costs, challenges the traditional assumption that unit costs increase with increased quality assurance activities, and it has significant implications for quality management. By introducing the idea of a quality-based learning curve, this paper links the previously disjoint literatures of quality control and learning curves to explain why high quality and low costs need not be inconsistent. When costs are affected by a quality-based learning curve, product quality favorably influences the rate of cost reduction due to learning. Thus, costs decline more rapidly with the experience of producing higher quality products. Two formulations of the quality-based learning phenomenon are presented. The first assumes that quality-based experience affects direct manufacturing costs. For this formulation, the optimal quality level is decreasing over time, but is always larger than the optimal base-case quality level. The optimal production quantity is constant if the interest rate is zero and increasing in time when the interest rate is positive. The second formulation assumes that quality-based experience affects quality control costs. In this case, the optimal quality level is always increasing over time. The optimal quantity behavior is qualitatively similar to the first formulation. A key feature of the second quality-based model is that it resolves the controversy between the economic conformance level model of Juran, which asserts that one should use cost-tradeoff analysis to find the optimal quality level, and the claims of Deming and Crosby that zero defects is always the optimal quality level. For certain parameter values, the optimal quality policy in the second model conforms to the economic conformance level model, but the dynamics of the model demonstrate the optimality of always pushing towards zero defects.
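
A toy simulation of the mechanism in the first formulation (functional forms and parameters are ours, chosen only to illustrate the direction of the effect, not the paper's exact model): if cumulative quality-weighted output drives a power-law learning curve, unit cost falls faster at higher quality.

```python
# Quality-weighted experience drives learning, so higher quality speeds
# the decline of direct unit cost. All numbers are illustrative.
c0, learn_rate, volume_per_period = 100.0, 0.3, 50

def unit_cost_path(q, periods=10):
    experience, path = 1.0, []
    for _ in range(periods):
        path.append(c0 * experience ** (-learn_rate))  # learning-curve cost
        experience += q * volume_per_period            # scaled by quality q
    return path

low, high = unit_cost_path(q=0.5), unit_cost_path(q=0.9)
for t, (lo, hi) in enumerate(zip(low, high)):
    print(f"t={t}: unit cost at q=0.5 -> {lo:6.2f}, at q=0.9 -> {hi:6.2f}")
```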

Journal ArticleDOI
TL;DR: In this article, the authors examined the effects of component commonality on optimal safety stock levels in a two-product, two-level inventory model, where the criterion is to minimize system safety stock subject to a service level constraint.
Abstract: We examine the effects of component commonality on optimal safety stock levels in a two-product, two-level inventory model. The criterion is to minimize system safety stock subject to a service level constraint. Although our model is specialized, its analysis provides insights not available in other multilevel inventory models.
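
The risk-pooling arithmetic behind the model, on hypothetical numbers: a common component faces the sum of the two demands, so with independent normal demands its safety stock scales with sqrt(sigma1^2 + sigma2^2) instead of sigma1 + sigma2. The paper's joint service-level constraint is subtler than this single-constraint sketch.

```python
from math import sqrt

# z is the safety factor for the target service level (about 97.7% for
# z = 2 under normal demand); sigma1, sigma2 are demand std deviations.
z, sigma1, sigma2 = 2.0, 40.0, 30.0

dedicated = z * sigma1 + z * sigma2        # two product-specific parts
common = z * sqrt(sigma1**2 + sigma2**2)   # one shared component
print(f"dedicated safety stock: {dedicated:.0f} units")
print(f"common-component safety stock: {common:.0f} units")
print(f"reduction: {100 * (1 - common / dedicated):.0f}%")
```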

Journal ArticleDOI
TL;DR: Both optimal and heuristic procedures are developed for this problem and are based on a dynamic programming formulation, which depends on the ability to solve the static problem efficiently.
Abstract: The problem of plant layout has generally been treated as a static one. In this paper, we deal with the dynamic nature of this problem. Both optimal and heuristic procedures are developed for this problem and are based on a dynamic programming formulation. The use of one of these approaches depends on the ability to solve the static problem efficiently. Finally, we briefly discuss the issue of extending the planning horizon, and how to resolve system nervousness when previously planned layouts need to be changed.
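
A minimal sketch of the dynamic programming recursion (all costs hypothetical): stage t is a period, the state is the layout used in t, and the recursion trades off each period's material-handling cost against rearrangement costs between periods. In realistic instances each "layout" is itself the solution of a static layout problem, which is why the approach depends on solving the static problem efficiently.

```python
# Dynamic layout as a stage/state DP over a three-period horizon.
flow_cost = [            # material-handling cost of layout l in period t
    [100, 140, 120],
    [150, 110, 130],
    [160, 115, 105],
]
switch_cost = [          # rearrangement cost from layout k to layout l
    [0, 40, 60],
    [40, 0, 35],
    [60, 35, 0],
]
T, L = len(flow_cost), len(flow_cost[0])

best = list(flow_cost[0])            # cheapest cost through period 0
for t in range(1, T):
    best = [flow_cost[t][l]
            + min(best[k] + switch_cost[k][l] for k in range(L))
            for l in range(L)]
print("minimum total cost over the horizon:", min(best))
```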

Journal ArticleDOI
TL;DR: In this article, the results of a study of 29 organizations indicate that certain managerial strategies are strongly associated with high static scores and with improving effectiveness over time, and that proactive strategies and those with an external emphasis are more successful than internal and reactive strategies.
Abstract: Some authors have argued that research on organizational effectiveness should cease. This study demonstrates why organizational effectiveness studies are crucial in certain types of organizations, and it points out how many of the weaknesses and criticisms of past investigations can be addressed. The results of this study of 29 organizations indicate that certain managerial strategies are strongly associated with high static scores and with improving effectiveness over time. Managerial strategies, in fact, were found to be more important than structure, demographics, finances, and other factors. Proactive strategies and those with an external emphasis are more successful than internal and reactive strategies. Managerial strategies that are multifaceted are more likely to lead to effectiveness than monolithic strategies.

Journal ArticleDOI
TL;DR: It is argued that the domains of existing design paradigms are declining in scope, and that the nature of current and future organizational environments requires use of a design paradigm that responds to the increasing frequency and criticality of the decision-making process.
Abstract: This paper introduces and explicates the decision-making paradigm of organizational design. We argue that the domains of existing design paradigms are declining in scope, and that the nature of current and future organizational environments requires use of a design paradigm that responds to the increasing frequency and criticality of the decision-making process. In particular, we argue that the decision-making paradigm is applicable when organizational environments are hostile, complex, and turbulent. The focal concept of the decision-making paradigm is that organizations should be designed primarily to facilitate the making of organizational decisions. The paper sets forth the paradigm's six major concepts and discusses the principal domains of its application. The paper also examines the relationships between the decision-making paradigm and the literatures on (1) organizational decision making, (2) the information processing view of organizations, and (3) the need for compatibility between the organization's design and the design of its technologically supported information systems. The paper concludes by identifying ten organizational design guidelines that follow from the decision-making paradigm.

Journal ArticleDOI
TL;DR: In this paper, a survey of cost allocation methods based on the nucleolus and the Shapley value is presented, along with a new one, the so-called cost gap allocation method, which is based on the τ-value.
Abstract: Problems of allocating joint costs in a reasonable way arise in many practical situations where people decide to work together to save costs. Cost allocation methods based on game theoretical concepts take into account the strategic aspects of cost allocation situations. We give a survey of cost allocation methods based on the nucleolus and the Shapley value, and introduce also a new one, the so-called cost gap allocation method, which is based on the τ-value. It is shown that for some large subclasses of cost allocation problems this new cost allocation method coincides with old separable cost methods proposed in the thirties by the Tennessee Valley Authority and also with the separable costs-remaining benefits (SCRB) method. Properties of this cost gap allocation method are also treated.
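
For the game-theoretic flavor, a sketch of the Shapley allocation on a tiny hypothetical three-player cost game. The nucleolus and the paper's cost gap (τ-value) method allocate differently; this only illustrates the common setup of charging each player its average marginal cost over all joining orders.

```python
from itertools import permutations

# Hypothetical 3-player cost game: cost[S] is the stand-alone cost of
# coalition S; cooperating saves money (subadditive costs).
cost = {
    frozenset(): 0,
    frozenset("A"): 60, frozenset("B"): 50, frozenset("C"): 40,
    frozenset("AB"): 90, frozenset("AC"): 80, frozenset("BC"): 70,
    frozenset("ABC"): 110,
}
players = "ABC"

# Shapley value: average marginal cost of joining, over all orderings.
shapley = {p: 0.0 for p in players}
orders = list(permutations(players))
for order in orders:
    so_far = frozenset()
    for p in order:
        shapley[p] += cost[so_far | {p}] - cost[so_far]
        so_far = so_far | {p}
allocation = {p: round(v / len(orders), 2) for p, v in shapley.items()}
print(allocation, "total =", sum(allocation.values()))  # total = cost(ABC)
```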

Journal ArticleDOI
TL;DR: A computational experiment designed to assess the efficacy of 26 heuristic decision rules which group work tasks into work stations along an assembly line such that the number of work stations required is minimized.
Abstract: In this paper, we report on a computational experiment designed to assess the efficacy of 26 heuristic decision rules which group work tasks into work stations along an assembly line such that the number of work stations required is minimized. The heuristic decision rules investigated vary from simple list processing procedures that consider a single attribute of each work task for assignment, to procedures which are optimum seeking but have had their search terminated through the imposition of a limit on the amount of computation time that can be devoted to each search. Also included are heuristic decision rules which backtrack in an attempt to locate an improved solution, and decision rules which probabilistically search for improved solutions. Our investigation differs from those reported previously in that the objective in balancing each line is to determine the minimum number of work stations for a given limit on the time available for assembly at each work station (the cycle time). Previous approaches have investigated the problem of determining the minimum cycle time for a given line length. We also compare the results obtained with the optimal solution for a subset of the problems investigated. Both problems which have appeared in the open literature and problems which are solved here for the first time are included. Because a portion of our results differs from those reported previously, we suggest why these differences have occurred. Guidelines are also given to those balancing industrial assembly lines on the choice of heuristic decision rule, whether one is attempting to obtain a minimum station balance given a limit on the time available for assembly at each work station, or attempting to minimize the time devoted to assembly at a work station given a limit on the number of work stations available.
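
A sketch of the simplest class studied, a single-attribute list-processing rule, on a made-up six-task instance: open a station and repeatedly assign the precedence-feasible unassigned task with the longest processing time that fits in the remaining cycle time.

```python
# Hypothetical tasks: processing times and immediate predecessors.
task_time = {"a": 5, "b": 4, "c": 3, "d": 6, "e": 2, "f": 4}
preds = {"a": [], "b": ["a"], "c": ["a"], "d": ["b"], "e": ["c"],
         "f": ["d", "e"]}
cycle = 10  # time available at each work station

stations, assigned = [], set()
while len(assigned) < len(task_time):
    load, station = 0, []
    while True:
        ready = [t for t in task_time
                 if t not in assigned
                 and all(p in assigned for p in preds[t])
                 and task_time[t] <= cycle - load]
        if not ready:
            break
        t = max(ready, key=task_time.get)   # the single-attribute rule
        station.append(t); assigned.add(t); load += task_time[t]
    stations.append(station)
print(f"{len(stations)} stations: {stations}")
```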

Journal ArticleDOI
TL;DR: A new linearization technique is presented for the solution of linearly constrained zero-one quadratic programming problems, demonstrated to yield a tighter continuous (or linear programming) relaxation than is available through other methods.
Abstract: This paper is concerned with the solution of linearly constrained zero-one quadratic programming problems. Problems of this kind arise in numerous economic, location decision, and strategic planning situations, including capital budgeting, facility location, quadratic assignment, media selection, and dynamic set covering. A new linearization technique is presented for this problem which is demonstrated to yield a tighter continuous (or linear programming) relaxation than is available through other methods. An implicit enumeration algorithm which uses Lagrangian relaxation, Benders' cutting planes, and local explorations is designed to exploit the strength of this linearization. Computational experience is provided to demonstrate the usefulness of the proposed linearization and algorithm.
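
For contrast with the paper's tighter technique, the classical linearization device replaces each product x_i x_j by a continuous variable y_ij whose constraints force y_ij = x_i x_j whenever x is binary:

```latex
y_{ij} \le x_i, \qquad
y_{ij} \le x_j, \qquad
y_{ij} \ge x_i + x_j - 1, \qquad
y_{ij} \ge 0 .
```

The continuous relaxation produced by this standard device is the benchmark the proposed linearization is shown to dominate.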

Journal ArticleDOI
TL;DR: This paper develops a research design for examining the relative influence of managers and environments on organizational activity over time and describes procedures for operationalizing two basic parameters of research design.
Abstract: This paper develops a research design for examining the relative influence of managers and environments on organizational activity over time. We outline three basic models of organization evolution: (1) an inertial model, which emphasizes constraints on evolution imposed by early patterns of exchange; (2) an external control model, which posits change in organizational activities that is guided by changes in environmental conditions over time; and (3) a strategic management model, which emphasizes the role of senior executives in choosing patterns and domains of competitive activity. Using the general logic of experimental design, we outline methods for comparing longitudinal patterns in change and persistence that will distinguish between these alternative perspectives. Specifically, we describe procedures for operationalizing two basic parameters of research design: (1) the organization population cohort, which imposes systematic restrictions on sampling; and (2) a generalized version of the product class life cycle, which helps isolate changes in environmental conditions for comparing organizational activity patterns over time. Data from an ongoing study of firms in the minicomputer product class are presented to illustrate these concepts.

Journal ArticleDOI
TL;DR: A simple procedure for assessing utility functions is proposed which avoids many difficulties of the standard techniques, such as chaining responses from one question to the next and constantly changing ranges and reference points, which introduce range effects and other distortions.
Abstract: This note describes a simple procedure for assessing utility functions which avoids many difficulties of the standard techniques. The conventional methods suffer from at least three drawbacks: they (1) generate utility functions that depend on the probability levels used; (2) chain responses from one question to the next, so that any bias is propagated and even magnified; and (3) change ranges and reference points constantly, introducing range effects and other distortions. Noting the evidence linking the dependence of utility functions on probability levels to the "certainty effect," our method: (1) compares lotteries with other lotteries rather than with certain amounts; (2) does not "chain" responses; and (3) consistently uses "elementary lotteries" which control for range and reference points. Experimental work supports the proposed procedure.

Journal ArticleDOI
TL;DR: A nonlinear integer mathematical programming formulation of the loading problem is presented, and an efficient solution procedure is proposed and illustrated with an example; computational results demonstrate the efficiency of the suggested special-purpose procedures.
Abstract: A flexible manufacturing system (FMS) is an integrated system of computer numerically controlled machine tools connected with automated material handling. A set of production planning problems for FMSs has been defined by Stecke (Stecke, Kathryn E. 1983. Formulation and solution of nonlinear integer production planning problems for flexible manufacturing systems. Management Sci. 29(3) 273-288.), and this paper considers one called the loading problem. This problem involves assigning to the machine tools the operations and associated cutting tools required for part types that have been selected to be produced simultaneously. The part types will be machined during the upcoming production period (say, of one to three weeks duration on average) and according to a prespecified part mix. This assignment is constrained by the capacity of each machine's tool magazine as well as by the production capacities of both the system and each machine type. There are several loading objectives that are applicable in a flexible manufacturing situation. This paper considers the most commonly applied one, that of balancing the workload on all machines. The paper first discusses a nonlinear integer mathematical programming formulation of the loading problem, stated in full detail. Then an efficient solution procedure is proposed and illustrated with an example. Computational results are provided to demonstrate the efficiency of the suggested special-purpose procedures.
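
To make the balancing objective and constraint structure concrete, a greedy sketch on hypothetical data (the paper solves a nonlinear integer program with special-purpose procedures; this is illustrative only): assign each operation, with its workload and tool-slot requirement, to the least-loaded machine whose tool magazine still has room.

```python
# Hypothetical operations: (workload in minutes, tool slots required).
operations = [(40, 2), (35, 3), (30, 1), (25, 2), (20, 2), (15, 1)]
n_machines, magazine_capacity = 3, 5

load = [0] * n_machines
slots = [0] * n_machines
plan = [[] for _ in range(n_machines)]
# Longest-processing-time order helps a greedy rule balance workloads.
for k, (work, tools) in enumerate(sorted(operations, reverse=True)):
    feasible = [m for m in range(n_machines)
                if slots[m] + tools <= magazine_capacity]
    m = min(feasible, key=lambda i: load[i])  # least-loaded feasible machine
    load[m] += work; slots[m] += tools
    plan[m].append(k)  # k indexes the LPT-sorted operation list

print("machine loads:", load, "tool slots used:", slots)
```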