
Showing papers in "Management Science in 1983"


Journal ArticleDOI
Danny Miller1
TL;DR: In this paper, the authors derive a crude typology of firms: Simple firms are small and their power is centralized at the top; Planning firms are bigger, their goal being smooth and efficient operation through the use of formal controls and plans; and Organic firms strive to be adaptive to their environments.
Abstract: The objective of the research was to discover the chief determinants of entrepreneurship, the process by which organizations renew themselves and their markets by pioneering, innovation, and risk taking. Some authors have argued that personality factors of the leader are what determine entrepreneurship, others have highlighted the role played by the structure of the organization, while a final group have pointed to the importance of strategy making. We believed that the manner and extent to which entrepreneurship would be influenced by all of these factors would in large measure depend upon the nature of the organization. Based upon the work of a number of authors we derived a crude typology of firms: Simple firms are small and their power is centralized at the top. Planning firms are bigger, their goal being smooth and efficient operation through the use of formal controls and plans. Organic firms strive to be adaptive to their environments, emphasizing expertise-based power and open communications. The predictiveness of the typology was established upon a sample of 52 firms using hypothesis-testing and analysis of variance techniques. We conjectured that in Simple firms entrepreneurship would be determined by the characteristics of the leader; in Planning firms it would be facilitated by explicit and well integrated product-market strategies, and in Organic firms it would be a function of environment and structure. These hypotheses were largely borne out by correlational and multiple regression analyses. Any programs which aim to stimulate entrepreneurship would benefit greatly from tailoring recommendations to the nature of the target firms.

5,067 citations


Journal ArticleDOI
TL;DR: This paper presents a framework for organizational analysis that organizes the organizational effectiveness literature, indicates which concepts are most central to the construct of organizational effectiveness, makes clear the values in which the concepts are embedded, and provides an overarching framework to guide subsequent efforts at organizational assessment.
Abstract: This paper presents a framework for organizational analysis. The empirically derived approach does not emerge from the observation of actual organizations, but from the ordering, through multivariate techniques, of criteria that organizational theorists and researchers use to evaluate the performance of organizations. In a two-stage study, organizational theorists and researchers were impaneled to make judgments about the similarity of commonly used effectiveness criteria. The model derived from the second group closely replicated the first, and in convergence suggested that three value dimensions (control-flexibility, internal-external, and means-ends) underlie conceptualizations of organizational effectiveness. When these value dimensions are juxtaposed, a spatial model emerges. The model serves a number of important functions. It organizes the organizational effectiveness literature, indicates which concepts are most central to the construct of organizational effectiveness, makes clear the values in which the concepts are embedded, demonstrates that the effectiveness literature and the general literature on organizational analysis are analogues of one another, and provides an overarching framework to guide subsequent efforts at organizational assessment.

3,183 citations


Journal ArticleDOI
TL;DR: This paper reports on a technique for measuring and analyzing computer user satisfaction, starting with the literature and using the critical incident interview technique, and creating a questionnaire for measuring satisfaction using the semantic differential scaling technique.
Abstract: This paper reports on a technique for measuring and analyzing computer user satisfaction. Starting with the literature and using the critical incident interview technique, 39 factors affecting satisfaction were identified. Adapting the semantic differential scaling technique, a questionnaire for measuring satisfaction was then created. Finally, the instrument was pilot tested to prove its validity and reliability. The results of this effort and suggested uses of the questionnaire are reported here.

2,634 citations


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the relationships between stage of development in organizational life cycles and organizational effectiveness and conclude that major criteria of effectiveness change in predictable ways as organizations develop through their life cycles.
Abstract: This paper discusses the relationships between stage of development in organizational life cycles and organizational effectiveness. We begin the paper by reviewing nine models of organizational life cycles that have been proposed in the literature. Each of these models identifies certain characteristics that typify organizations in different stages of development. A summary model of life cycle stages is derived that integrates each of these nine models. Next, a framework of organizational effectiveness developed by Quinn and Rohrbaugh is introduced. This framework organizes criteria of effectiveness into four models: the rational goal, open systems, human relations, and internal processes models. We hypothesize that certain of the models are important in evaluating the effectiveness of organizations in particular life cycle stages but not in others. The analysis of a state agency's development over five years provides some evidence to support these hypothesized relationships between life cycle stages and criteria of effectiveness. We conclude that major criteria of effectiveness change in predictable ways as organizations develop through their life cycles. Some shifts in stage of development are resisted by the organization much more than are others, and intervention into organizations may be needed to help make the transitions less painful and costly. We also discuss why the predictions of contingency theory often are not substantiated by research because the responses of organizations to the external environment vary in different life cycle stages.

1,693 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a model of the strategic process concerning entrepreneurial activity in large, complex organizations and make the following key points: First, firms need both diversity and order in their strategic activities to maintain their viability.
Abstract: This paper presents a model of the strategic process concerning entrepreneurial activity in large, complex organizations. Previous empirical and theoretical findings can be integrated in this new conceptual framework. The paper makes the following key points. First, firms need both diversity and order in their strategic activities to maintain their viability. Diversity results primarily from autonomous strategic initiatives of participants at the operational level. Order results from imposing a concept of strategy on the organization. Second, managing diversity requires an experimentation-and-selection approach. Middle level managers play a crucial role in this through their support for autonomous strategic initiatives early on, by combining these with various capabilities dispersed in the firm's operating system, and by conceptualizing strategies for new areas of business. Third, top management's critical contribution consists in strategic recognition rather than planning. By allowing middle level managers to redefine the strategic context, and by being fast learners, top management can make sure that entrepreneurial activities will correspond to their strategic vision, retroactively. Fourth, strategic management at the top should be to a large extent concerned with balancing the emphasis on diversity and order over time. Top management should control the level and the rate of change rather than the specific content of entrepreneurial activity. Finally, new managerial approaches and innovative administrative arrangements are required to facilitate the collaboration between entrepreneurial participants and the organizations in which they are active.

1,329 citations


Journal ArticleDOI
TL;DR: This paper defines a set of five production planning problems that must be solved for efficient use of an FMS, and addresses specifically the grouping and loading problems.
Abstract: A flexible manufacturing system (FMS) is an integrated, computer-controlled complex of automated material handling devices and numerically controlled machine tools that can simultaneously process medium-sized volumes of a variety of part types. FMSs are becoming an attractive substitute for the conventional means of batch manufacturing, especially in the metal-cutting industry. This new production technology has been designed to attain the efficiency of well-balanced, machine-paced transfer lines, while utilizing the flexibility that job shops have to simultaneously machine multiple part types. Some properties and constraints of these systems are similar to those of flow and job shops, while others are different. This technology creates the need to develop new and appropriate planning and control procedures that take advantage of the system's capabilities for higher production rates. This paper defines a set of five production planning problems that must be solved for efficient use of an FMS, and addresses specifically the grouping and loading problems. These two problems are first formulated in detail as nonlinear 0-1 mixed integer programs. In an effort to develop solution methodologies for these two planning problems, several linearization methods are examined and applied to data from an existing FMS. To decrease computational time, the constraint size of the linearized integer problems is reduced according to various methods. Several real world problems are solved in very reasonable time using the linearization that results in the fewest additional constraints and/or variables. The problem characteristics that determine which linearization to use, and the application of the linearized models in the solution of actual planning problems, are also discussed.

638 citations
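The nonlinear 0-1 terms in formulations like the grouping and loading models above are typically handled by replacing each product of binary variables with a new variable plus linear constraints. The paper compares several linearizations; the sketch below shows only the generic textbook one (not necessarily the variant the authors select), verified by enumerating all 0-1 points:

```python
# Standard linearization of a product of binary variables: replace
# y = x1 * x2 with the linear constraints
#   y <= x1,  y <= x2,  y >= x1 + x2 - 1,  y in {0, 1}.
# Enumerating every 0-1 assignment shows the constraints force y to
# equal the product exactly.
from itertools import product

for x1, x2 in product([0, 1], repeat=2):
    feasible_y = [y for y in (0, 1)
                  if y <= x1 and y <= x2 and y >= x1 + x2 - 1]
    assert feasible_y == [x1 * x2]
print("linearization exact on all 0-1 points")
```

Each eliminated product costs one variable and three constraints, which is why the paper's comparison of linearizations by the number of additional constraints and variables matters for solution time.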


Journal ArticleDOI
TL;DR: In this article, the authors investigate empirically the impact of the number and choice of forecasting methods on the accuracy of simple averages, and conclude that the forecasting accuracy improves, and that the variability of accuracy among different combinations decreases, as the number of methods in the average increases.
Abstract: An alternative to using a single forecasting method is to average the forecasts obtained from several methods. In this paper we investigate empirically the impact of the number and choice of forecasting methods on the accuracy of simple averages. It is concluded that the forecasting accuracy improves, and that the variability of accuracy among different combinations decreases, as the number of methods in the average increases. Thus, combining forecasts seems to be a reasonable practical alternative when, as is often the case, a "true" model of the data-generating process or a single "best" forecasting method cannot be or is not, for whatever reasons, identified.

609 citations
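The combination scheme studied above is just an unweighted average of the individual methods' forecasts. A minimal sketch with purely illustrative forecasts and an assumed actual value (not the paper's data):

```python
# Combine point forecasts from several methods by simple averaging.
# The four forecasts and the actual value are illustrative only.

def combine(forecasts):
    """Simple (unweighted) average of a list of point forecasts."""
    return sum(forecasts) / len(forecasts)

actual = 100.0
method_forecasts = [108.0, 95.0, 103.0, 92.0]  # four individual methods

errors = [abs(f - actual) for f in method_forecasts]
combined_error = abs(combine(method_forecasts) - actual)

print(combined_error)            # 0.5
print(min(errors), max(errors))  # 3.0 8.0
```

Here the average (99.5) beats every individual method because the errors partially cancel, which is the intuition behind the paper's finding that accuracy improves, and its variability shrinks, as more methods enter the average.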


Journal ArticleDOI
TL;DR: This model sees the extent of customer contact with the service organization as a major variable affecting system performance and advocates reconfiguring the structure of the service organizations to reflect this impact.
Abstract: The literature on organization design has been dominated by descriptive models in its dealing with structure and operations. This paper takes an alternative view advocating the use of a normative model to be used in the design of service organizations. This model sees the extent of customer contact with the service organization as a major variable affecting system performance and advocates reconfiguring the structure of the service organization to reflect this impact. The discussion describes a taxonomy used to classify firms along the contact dimension and develops 13 propositions which convey critical distinctions between high and low contact services. Application of the model for managerial decision making involves the use of decoupling and the paper identifies factors which favor/disfavor decoupling in light of existing and desired service delivery objectives.

452 citations


Journal ArticleDOI
TL;DR: In this article, the authors present six specific bases for these two conclusions: (1) the currently available literature on cognitive style is an unsatisfactory basis for deriving operational design guidelines, and (2) further cognitive style research is unlikely to provide a satisfactory body of knowledge from which to derive such guidelines.
Abstract: It is commonly believed that the user's cognitive style should be considered in the design of Management Information Systems and Decision Support Systems. In contrast, an examination of the literature and a consideration of some of the broader issues involved in MIS and DSS design lead to the conclusions that: (1) the currently available literature on cognitive style is an unsatisfactory basis for deriving operational design guidelines, and (2) further cognitive style research is unlikely to provide a satisfactory body of knowledge from which to derive such guidelines. The article presents six specific bases for these two conclusions. From a manager's perspective, the outcome of the study is a suggestion: maintain a healthy skepticism if it is suggested that paper and pencil assessments of the user's cognitive style should be used as a basis for MIS or DSS designs. From a researcher's viewpoint, the study raises two questions: (1) If our research interest is MIS and DSS design, does it seem that further research in cognitive style is a wise allocation of our research resources? (2) If our research interest is cognitive style, does it seem that the use of cognitive style as a basis for MIS and DSS designs will become an important application area?

407 citations


Journal ArticleDOI
TL;DR: The final topic of this paper, Product Structure Compression, is introduced as a method to reduce the size of the problem without losing optimality.
Abstract: This paper introduces a line of research on capacity-constrained multi-stage production scheduling problems. The first section introduces the problem area as it arises from a failure of MRP systems. Then a review of the literature and an analysis of the type of problems that exist are presented in §2. Section 3 outlines linear and mixed integer-linear programming formulations. These formulations compute the required production lead times according to the demands on available capacity, thereby reducing in-process inventory compared to the usual practice in MRP. A discussion of how to use the LP version is included. However, the size of the problems in practice implies that more efficient solution techniques must be found. The final topic of this paper, Product Structure Compression, is introduced as a method to reduce the size of the problem without losing optimality.

394 citations


Journal ArticleDOI
TL;DR: The authors compared a number of approximations used to estimate means and variances of continuous random variables and/or to serve as substitutes for the probability distributions of such variables, with particular emphasis on three-point approximants.
Abstract: This paper compares a number of approximations used to estimate means and variances of continuous random variables and/or to serve as substitutes for the probability distributions of such variables, with particular emphasis on three-point approximations. Numerical results from estimating means and variances of a set of beta distributions indicate surprisingly large differences in accuracy among approximations in current use, with some of the most popular ones such as the PERT and triangular-density-function approximations faring poorly. A simple new three-point approximation, which is a straightforward extension of earlier work by Pearson and Tukey, outperforms the others significantly in these tests, and also performs well in related multivariate tests involving the Dirichlet family of distributions. It offers an attractive alternative to currently used approximations in a variety of applications.
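The extended Pearson-Tukey rule the paper favors approximates a continuous distribution by its 5th, 50th, and 95th percentiles with weights 0.185, 0.63, and 0.185. A sketch with assumed parameters, using a lognormal rather than the paper's beta family because its quantiles and exact mean are available from the Python standard library:

```python
# Extended Pearson-Tukey three-point approximation of a mean:
#   E[X] ~ 0.185*x(.05) + 0.63*x(.50) + 0.185*x(.95).
# Demonstrated on a lognormal with an assumed sigma (illustrative choice,
# not the paper's beta test set), whose exact mean is exp(sigma^2 / 2).
from statistics import NormalDist
import math

sigma = 0.5
z = NormalDist()

def lognorm_quantile(p):
    """Quantile of a lognormal with underlying N(0, sigma^2)."""
    return math.exp(sigma * z.inv_cdf(p))

approx_mean = (0.185 * lognorm_quantile(0.05)
               + 0.63 * lognorm_quantile(0.50)
               + 0.185 * lognorm_quantile(0.95))
exact_mean = math.exp(sigma ** 2 / 2)

print(round(approx_mean, 4), round(exact_mean, 4))  # agree to ~3 decimals
```

The rule needs only three quantiles yet reproduces the mean closely here, consistent with the paper's report that it outperforms PERT-style and triangular-density approximations.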

Journal ArticleDOI
TL;DR: In this paper, the authors present a survey of the literature on tree-based network location problems, focusing on single objective function problems, where the objective is to minimize either a sum of transport costs proportional to network travel distances between existing facilities and closest new facilities, or a maximum of "losses" proportional to such travel distances, or the total number of new facilities to be located.
Abstract: Network location problems occur when new facilities are to be located on a network. The network of interest may be a road network, an air transport network, a river network, or a network of shipping lanes. For a given network location problem, the new facilities are often idealized as points, and may be located anywhere on the network; constraints may be imposed upon the problem so that new facilities are not too far from existing facilities. Usually some objective function is to be minimized. For single objective function problems, typically the objective is to minimize either a sum of transport costs proportional to network travel distances between existing facilities and closest new facilities, or a maximum of "losses" proportional to such travel distances, or the total number of new facilities to be located. There is also a growing interest in multiobjective network location problems. Of the approximately 100 references we list, roughly 60 date from 1978 or later; we focus upon work which deals directly with the network of interest, and which exploits the network structure. The principal structure exploited to date is that of a tree, i.e., a connected network without cycles. Tree-like networks may be encountered when having cycles is very expensive, as with portions of interstate highway systems. Further, simple distribution systems with a single distributor at the "hub" can often be modeled as star-like trees. With trees, "reasonable" functions of distance are often convex, whereas for a cyclic network such functions of distance are usually nonconvex. Convexity explains, to some extent, the tractability of tree network location problems.
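The tractability of tree networks that the survey emphasizes can be made concrete with Goldman's classic leaf-pruning algorithm, which locates a 1-median (a vertex minimizing total weighted distance) of a tree in linear time. The weights and edges below are illustrative:

```python
# Goldman's 1-median algorithm on a tree: repeatedly examine a leaf;
# if its weight is at least half the total, it is a median; otherwise
# fold its weight into its neighbor and delete it.
def tree_median(adj, w):
    w = dict(w)
    adj = {v: set(ns) for v, ns in adj.items()}
    half = sum(w.values()) / 2
    while True:
        v = next(u for u in adj if len(adj[u]) <= 1)  # some leaf (or last vertex)
        if w[v] >= half or not adj[v]:
            return v
        (nbr,) = adj[v]          # a leaf has exactly one neighbor
        w[nbr] += w[v]           # fold the pruned weight inward
        adj[nbr].discard(v)
        del adj[v], w[v]

# Star-like "hub" tree of the kind the survey mentions:
# vertex 1 is the distributor hub serving leaves 2-4.
adj = {1: {2, 3, 4}, 2: {1}, 3: {1}, 4: {1}}
w = {1: 1.0, 2: 2.0, 3: 2.0, 4: 2.0}
print(tree_median(adj, w))  # prints 1, the hub
```

On a cyclic network no such pruning argument applies, which mirrors the survey's point that convexity of distance functions on trees explains their tractability.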

Journal ArticleDOI
TL;DR: This paper develops a method for interactive multiple objective linear programming assuming an unknown pseudoconcave utility function satisfying certain general properties and presents the supporting theory and algorithm.
Abstract: This paper develops a method for interactive multiple objective linear programming assuming an unknown pseudoconcave utility function satisfying certain general properties. The method is an extension of our earlier method published in this journal (Zionts, S., Wallenius, J. 1976. An interactive programming method for solving the multiple criteria problem. Management Sci. 22(6) 652-663). Various technical problems present in predecessor versions have been resolved. In addition to presenting the supporting theory and algorithm, we discuss certain options in implementation and summarize our practical experience with several versions of the method.

Journal ArticleDOI
TL;DR: In this paper, the authors investigate the normative implications of decision regret and look at situations where the joint probability distribution of consequences between alternatives is not specified at the time of the decision, including cases where the outcomes produced by alternatives not chosen are never resolved.
Abstract: Some people find decision making under uncertainty difficult because they fear making the “wrong decision,” wrong in the sense that the outcome of their chosen alternative proves to be worse than could have been achieved with another alternative. These people may be willing to pay a premium to avoid consequences that produce this decision regret. This paper continues an earlier investigation into the normative implications of decision regret and looks at situations where the joint probability distribution of consequences between alternatives is not specified at the time of the decision. It includes a discussion of cases where the outcomes produced by alternatives not chosen are never resolved. A consequence of this model of preferences for risky situations is that two components of risk aversion may be identified, decreasing marginal value and regret aversion.

Journal ArticleDOI
TL;DR: A review of the empirical literature on subjective probability encoding from a psychological and psychometric perspective can be found in this paper, where it is suggested that the usual encoding techniques can be regarded as instances of the general methods used to scale psychological variables.
Abstract: In order to review the empirical literature on subjective probability encoding from a psychological and psychometric perspective, it is first suggested that the usual encoding techniques can be regarded as instances of the general methods used to scale psychological variables. It is then shown that well-established concepts and theories from measurement and psychometric theory can provide a general framework for evaluating and assessing subjective probability encoding. The actual review of the literature distinguishes between studies conducted with nonexperts and with experts. In the former class, findings related to the reliability, internal consistency, and external validity of the judgments are critically discussed. The latter class reviews work relevant to some of these characteristics separately for several fields of expertise. In the final section of the paper the results from these two classes of studies are summarized and related to a view of vague subjective probabilities. Problems deserving addi...

Journal ArticleDOI
TL;DR: In this article, a new procedure based on Gaussian quadrature is developed that can decrease the error in the approximation to any desired level.
Abstract: Practical limits on the size of most probabilistic models require that probability distributions be approximated by a few representative values and associated probabilities. This paper demonstrates that methods commonly used to determine discrete approximations of probability distributions systematically underestimate the moments of the original distribution. A new procedure based on Gaussian quadrature is developed in this paper. It can be used to decrease the error in the approximation to any desired level.
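The quadrature idea can be illustrated with the standard three-point Gauss-Hermite rule for the standard normal, whose nodes and probabilities have well-known closed forms (these values are the textbook rule, not taken from the paper):

```python
# Three-point Gauss-Hermite discretization of N(0, 1): nodes -sqrt(3),
# 0, sqrt(3) with probabilities 1/6, 2/3, 1/6. Unlike bracket-median
# approximations, which understate spread, this discrete distribution
# matches the normal's low-order moments.
import math

points = [-math.sqrt(3.0), 0.0, math.sqrt(3.0)]
probs = [1 / 6, 2 / 3, 1 / 6]

def moment(k):
    """kth raw moment of the discrete approximation."""
    return sum(p * x ** k for p, x in zip(probs, points))

print(round(moment(2), 9), round(moment(4), 9))  # 1.0 3.0 — matches N(0,1)
```

A median-based three-point rule would place its outer points well inside ±sqrt(3) and so underestimate the variance, which is exactly the systematic bias the paper documents.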

Journal Article
TL;DR: In this article, the authors discuss how a lack of coordination among the members of a marketing distribution channel, each deciding according to its own criteria, leads to undesirable consequences for every channel member.
Abstract: A marketing distribution channel consists of several participants, each of which has its own decision-making criteria. However, each participant's decisions affect the profits, and hence the activities, of all the other participants in the marketing channel. A lack of coordination among the channel participants' activities leads to undesirable consequences for all.

Journal ArticleDOI
TL;DR: In this paper, it is shown that if the ability to produce the nonmonetary effect does not diminish too quickly over time, failure to discount benefits implies that programs are always improved by delay.
Abstract: Cost-effectiveness analysts generally assume that preferences over time are such that streams of monetary and nonmonetary program effects can be reduced to one discounted sum of monetary costs and another of effects. It is known that if the nonmonetary effects can be cashed out in a way that does not vary with time, then the rates of discount for monetary and nonmonetary effects have to be equal. This paper presents a more compelling argument for the equality of those rates when hard-to-monetize benefits such as life-saving are involved. It shows that if the ability to produce the nonmonetary effect does not diminish too quickly over time, failure to discount benefits implies that programs are always improved by delay. In general, discounting benefits and costs at different rates can lead to peculiar results.
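The delay argument can be checked with simple arithmetic (the program cost, effect size, and 5% discount rate below are all illustrative assumptions): discounting costs but not benefits makes every postponement look like an improvement in the cost-per-effect ratio, while equal discounting leaves the ratio unchanged.

```python
# Illustrative numbers only: a program costing $1,000,000 that saves
# 100 lives, with monetary costs discounted at 5% per year.
import math

r = 0.05
cost, lives_saved = 1_000_000.0, 100.0

def cost_per_life(delay_years, discount_benefits):
    pv_cost = cost / (1 + r) ** delay_years
    benefit = (lives_saved / (1 + r) ** delay_years
               if discount_benefits else lives_saved)
    return pv_cost / benefit

# Benefits left undiscounted: the ratio keeps "improving" with delay.
print(cost_per_life(0, False) > cost_per_life(1, False) > cost_per_life(10, False))  # True
# Costs and benefits discounted at the same rate: delay changes nothing.
print(math.isclose(cost_per_life(0, True), cost_per_life(10, True)))  # True
```

So under unequal rates the analysis always prefers the postponed program, the peculiar result the paper turns into an argument for equal discounting.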

Journal ArticleDOI
TL;DR: This paper reports a replication of Mintzberg's field study of managerial work; the replication supported Mintzberg in all important dimensions, and explanations for similarities and differences between organizations and industries are briefly discussed.
Abstract: This paper reports a replication of Mintzberg (McCall, M. W., Jr., A. M. Morrison, R. L. Hannan. 1978. Studies of managerial work: results and methods. Technical Report #9, Center for Creative Leadership). Structured observation with supplemental unstructured interviewing was used to study four top managers for one week each. Mintzberg's field study was supported by our replication in all important dimensions. Explanations for similarities and differences between organizations and industries are briefly discussed.

Journal ArticleDOI
TL;DR: In this paper, the effect of alternative utility functions and parameter values on the optimal composition of a risky investment portfolio is examined and the results agree well with the available theory and imply utility functions that are appropriate for investors with particular risk-bearing attitudes.
Abstract: This paper examines the effect of alternative utility functions and parameter values on the optimal composition of a risky investment portfolio. Normally distributed assets are the setting for the theoretical and empirical analyses. The results agree well with the available theory and imply utility functions and parameter values that are appropriate for investors with particular risk-bearing attitudes. The results give strong empirical support to the proposition that utility functions having different functional forms and parameter values but "similar" absolute risk aversion indices have "similar" optimal portfolios. These results suggest that over horizons up to one year one can safely substitute "convenient" surrogate utility functions for other utility functions, for reasons of tractability or otherwise. The results also provide guidance regarding the significance of the magnitude and change of particular numerical values of the risk aversion index. Moreover, theoretical "exact" results are obtained using Rubinstein's measure of global risk aversion.
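For the normally distributed assets the paper studies, the exponential-utility (CARA) case has a well-known closed form in which the optimal portfolio depends on the utility function only through the risk aversion index, which illustrates the paper's "similar risk aversion, similar portfolio" finding. The sketch below uses this standard result with assumed parameter values; it is not the paper's computation:

```python
# Standard CARA/normal closed form: with a single risky asset of mean
# return mu, riskless rate r, and standard deviation sigma, exponential
# utility u(w) = -exp(-a*w) gives optimal risky fraction
#   w* = (mu - r) / (a * sigma**2),
# so only the absolute risk aversion index a matters.
mu, r, sigma = 0.10, 0.03, 0.20  # illustrative market parameters

def optimal_fraction(a):
    """Optimal fraction of wealth in the risky asset for risk aversion a."""
    return (mu - r) / (a * sigma ** 2)

for a in (1.0, 2.0, 4.0):
    print(a, round(optimal_fraction(a), 3))
```

Any two utility functions matched to the same index a pick the same portfolio here, which is the sense in which "convenient" surrogate utility functions can safely stand in for less tractable ones.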

Journal ArticleDOI
TL;DR: In this paper, the authors show that even a very restricted version of the original problem is NP-hard, and they also give an implicit enumeration procedure for testing the feasibility of a schedule in which product i is set up an integer number of times per year.
Abstract: Given N products and a set of "basic data" for each product, e.g., production rate, demand rate, setup time, setup cost and holding cost, etc., the Economic Lot Scheduling Problem is to find a feasible schedule that allows a cyclic production pattern for each product and such that the sum of the setup and inventory carrying costs for all products per unit time is minimized. The routine application of the economic lot size formula to each product separately often leads to the phenomenon of "interference," i.e., the machine will be required to produce two items at the same time, which is impossible. In this paper we show that even a very restricted version of the original problem becomes NP-hard. We also give an implicit enumeration procedure for testing the feasibility of a schedule in which product i is set up a given integer number of times per year. The main advantage of our procedure is that we can by-pass the tedious study of the start times and concentrate on the combinatorial structure of the production runs, which theoretically cuts down the number of possible branchings in the enumeration.
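As a point of contrast with the general schedules whose feasibility the paper tests, the classic common-cycle heuristic produces every product exactly once per cycle, which avoids interference by construction at some cost in inventory. This baseline (not the paper's enumeration procedure, and with purely illustrative data) can be sketched as:

```python
# Common-cycle heuristic for the Economic Lot Scheduling Problem:
# all products share one cycle of length T. Cost per unit time is
#   sum(S_i)/T + (T/2) * sum(h_i * d_i * (1 - d_i/p_i)),
# minimized at T = sqrt(2*sum(S_i) / sum(h_i*d_i*(1 - d_i/p_i))).
# All product data below are illustrative.
import math

products = [  # d: demand rate, p: production rate, S: setup cost,
              # st: setup time, h: holding cost
    dict(d=40.0, p=200.0, S=50.0, st=0.05, h=2.0),
    dict(d=60.0, p=300.0, S=80.0, st=0.10, h=1.5),
]

num = 2 * sum(it["S"] for it in products)
den = sum(it["h"] * it["d"] * (1 - it["d"] / it["p"]) for it in products)
T = math.sqrt(num / den)  # cost-minimizing common cycle length

# Feasibility: production time plus setup times must fit in each cycle.
utilization = sum(it["d"] / it["p"] for it in products)
feasible = utilization + sum(it["st"] for it in products) / T <= 1

print(round(T, 3), feasible)  # T ~ 1.38, feasible here
```

Allowing different (integer) setup frequencies per product, as the paper does, can cut inventory costs further, but then interference must be checked, which is where the NP-hardness result bites.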

Journal ArticleDOI
TL;DR: In this paper, a general sales territory alignment model which accommodates these properties is developed and an actual implementation of the general model is described, along with a comparison with a similar model which has been frequently cited in the marketing literature.
Abstract: The sales territory alignment problem may be viewed as the problem of grouping small geographic sales coverage units into larger geographic clusters called sales territories in a way that the sales territories are acceptable according to managerially relevant alignment criteria. This paper first reviews sales territory alignment models which have appeared in the marketing literature. A framework for sales territory alignment and several properties of a good sales territory alignment are developed in the course of the review. A general sales territory alignment model which accommodates these properties is developed. A solution procedure for the general model is presented. Finally, an actual implementation of the general model is described. The implementation provides a comparison of the general model with a similar model which has been frequently cited in the marketing literature.

Journal ArticleDOI
TL;DR: In this article, the role of problem formulation in planning and design is discussed, including problematic, physiological, psychological and environmental factors that can affect the formulation process; and problem formulation heuristics.
Abstract: This paper deals with the role of problem formulation in planning and design, including: (a) the importance of problem formulation to planning and design; (b) problematic, physiological, psychological and environmental factors that can affect the formulation process; and (c) problem formulation heuristics. Two types of formulation heuristics are identified: problem reduction and problem expansion. Because the latter type has received little empirical research, an initial study of a problem expansion heuristic (Problem-Purpose Expansion) was conducted. Experimentation showed that Problem-Purpose Expansion may have a positive effect on idea generation, particularly for individuals working on problems that fall outside their area of expertise. Exhorting the importance of problem formulation, a second treatment studied in these experiments, produced little measurable effect on idea generation.

Journal ArticleDOI
TL;DR: Heavy use and high perceived impact of financial methods for project selection, selective use of network models, some dissatisfaction over the methods available for project scheduling and control, and no usage of mathematical programming models for R&D resource allocation were key findings.
Abstract: The results of an empirical study on the current usage of quantitative techniques for R&D project management in “Fortune 500” industrial firms are presented. A nonrandom sample of 40 respondents from 29 firms were selected to represent a mix of industrial sectors and geographic regions. The information was obtained via personal interview of R&D budget heads and some high level staff. Extensive demographic data on the respondent and his company were collected and related to familiarity and usage of project management techniques. Data were also collected on the perceived impact of techniques on project decision making, and any recent/planned changes in the cadre of techniques. Heavy use and high perceived impact of financial methods for project selection, selective use of network models, some dissatisfaction over the methods available for project scheduling and control, and no usage of mathematical programming models for R&D resource allocation were key findings. As a result, R&D managers must have a thorou...

Journal ArticleDOI
TL;DR: An approach that makes computer programs more "human-like" by basing them on human decision making behavior is presented, assessing the extent that the computer program "thinks" and "talks" like a human decision maker.
Abstract: A major complaint of people who use "decision-making" computer programs is that these programs merely provide a final decision, and fail to present the supporting argumentation in terms the user can understand. This article presents an approach that makes computer programs more "human-like" by basing them on human decision making behavior. Decision making processes of student financial analysts are captured by asking them to think aloud during their evaluation. These verbal traces, called protocols, are analyzed at various levels of detail, resulting in specific models of the decision making processes involved, the strategies used, and the task-specific financial knowledge that is required to perform the task. The models and strategies are translated into executable computer programs. Extensive comparisons between human behavior and model simulation output are provided, assessing the extent to which the computer program "thinks" and "talks" like a human decision maker. Although the model clearly suffers from "linguistic rigidity," it does appear to perform the evaluation in much the same manner as the human decision maker, examining the same information in the same order, making the same inferences, and reporting the same conclusions.

Journal ArticleDOI
TL;DR: In this paper, an alternative charging scheme for an exponential service case with pre-emptive LIFO service is presented which confirms that differences between individual and social optima occur precisely because individuals fail to consider the inconvenience that they cause to others.
Abstract: Customers arrive at a service area according to a Poisson process. An arriving customer must choose one of K servers without observing present congestion levels. The only available information about the kth server is the service time distribution, with expected duration 1/µ_k, and the cost per unit time of waiting at the kth server, h_k. Although service distributions may differ from server to server and need not be exponential, it is assumed that they share the same coefficient of variation. Individuals acting in self-interest induce an arrival rate pattern λ̄_1, λ̄_2, …, λ̄_K. In contrast, the social optimum is the arrival rate pattern λ_1*, λ_2*, …, λ_K* which minimizes long-run average cost per unit time for the entire system. The main result is that the λ̄_k's and λ_k*'s differ systematically: individuals overload the servers with the smallest h_k/µ_k values. For an exponential service case with pre-emptive LIFO service, an alternative charging scheme is presented which confirms that differences between individual and social optima occur precisely because individuals fail to consider the inconvenience that they cause to others.
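The contrast between the self-interested pattern λ̄_k and the social optimum λ_k* can be illustrated for the exponential (M/M/1) case, where the expected time in system at arrival rate λ is 1/(µ − λ). The sketch below uses toy parameters of my own (two servers, equal waiting costs), not data from the paper: individuals equalize the expected cost h_k/(µ_k − λ_k) across used servers, while the social planner equalizes the marginal cost h_k µ_k/(µ_k − λ_k)²; both patterns are found by bisection on that common value.

```python
from math import sqrt

MU = [2.0, 1.0]   # service rates (assumed toy values)
H = [1.0, 1.0]    # waiting cost per unit time at each server
LAM = 1.5         # total Poisson arrival rate to split across servers

def split(rate_at):
    """Bisect on a common (marginal) cost value c so the induced rates sum to LAM."""
    lo, hi = 1e-9, 1e9
    for _ in range(200):
        c = (lo + hi) / 2
        if sum(rate_at(c)) > LAM:
            hi = c
        else:
            lo = c
    return rate_at((lo + hi) / 2)

# Individual equilibrium: customers equalize expected cost h_k/(mu_k - lam_k) = c
individual = split(lambda c: [max(0.0, m - h / c) for m, h in zip(MU, H)])

# Social optimum: equalize marginal cost h_k*mu_k/(mu_k - lam_k)^2 = c
social = split(lambda c: [max(0.0, m - sqrt(h * m / c)) for m, h in zip(MU, H)])
```

With these numbers the faster server (the one with the smallest h_k/µ_k) receives about 1.25 of the 1.5 arrival rate under self-interest but only about 1.12 at the social optimum, matching the paper's main result that individuals overload the cheapest-looking servers.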

Journal ArticleDOI
TL;DR: In this article, the authors explored the due-date performance of job shop control systems which base job due dates on a time-phased representation of the workload and the machine capacity in the shop.
Abstract: This study explores the due-date performance of job shop control systems which base job due dates on a time-phased representation of the workload and the machine capacity in the shop. The performance is measured by the mean and the standard deviation of the lateness. Two parameters are used to vary the functioning of the due-date assignment system: a minimum allowance for waiting, denoted by SL, and a maximum fraction of the available capacity allowed for loading, denoted by CLL. The system increases the waiting time allowance if congestion is observed when loading a new job. The capability of the system to observe congestion is determined by the parameters CLL and SL. Simulation experiments are used to investigate the performance of the assignment system. It is shown that the assignment system performs quite well with respect to reducing the standard deviation of the lateness; the performance, however, is not very sensitive to the parameter values used: with an expected capacity utilization of 90%, CLL should be set between 0.80 and 1.00 times the mean available capacity and SL should be set between 0.55 and 0.90 times the mean operation waiting time in the shop. The assignment system may also perform well with respect to controlling the mean lateness. If SL is set between 0.55 and 0.75 times the mean expected waiting time in the shop, a constant mean lateness is obtained independent of the utilization of the shop if CLL is set between 0.70 and 0.80 times the mean available capacity. However, the mean lateness turns out to be quite sensitive to variations in the job-mix of the workload. Finally it is shown that if the values of the assignment parameters are adequate, the mean job lateness is independent of the number of operations in a job. This property can be used to monitor the correctness of the parameter values.
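A minimal sketch of such a time-phased assignment rule may help fix ideas. This is a hypothetical simplification of my own, not the paper's simulation model: SL is expressed in whole periods, CLL as a fraction of period capacity, and each operation is loaded into the earliest period whose booked work stays under the CLL limit, so observed congestion automatically lengthens the quoted flow allowance.

```python
def assign_due_date(op_work, load, capacity, cll, sl):
    """Quote a due-date offset (in periods) for a job with operations op_work,
    loading each operation into the time-phased workload profile `load`.
    Congestion observed while loading extends the waiting allowance."""
    period = 0
    for work in op_work:
        period += sl                                  # minimum waiting allowance (SL)
        while load[period] + work > cll * capacity:   # congestion: wait one more period
            period += 1
        load[period] += work                          # book the operation's work content
        period += 1                                   # the operation's own period
    return period

load = [0.0] * 20   # time-phased booked workload over a toy horizon
due_a = assign_due_date([5, 5], load, capacity=10, cll=0.8, sl=1)
due_b = assign_due_date([5], load, capacity=10, cll=0.8, sl=1)
```

Job B arrives second and finds period 1 already loaded to the CLL limit (0.8 × 10 = 8), so its allowance is stretched by one period beyond the bare minimum, illustrating how the system trades a longer quoted lead time for a lower lateness variance.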

Journal ArticleDOI
TL;DR: Computational experience is presented to show that an easily implemented application of linear programming frequently produces optimal solutions to shift and days-off scheduling problems.
Abstract: Shift and days-off scheduling problems have received much attention in the literature of integer programming approaches to workforce scheduling. A typical managerial use would be to schedule full-time employees to minimize the number of labor hours while satisfying variable workforce requirements of a service delivery system. We present computational experience to show that an easily implemented application of linear programming frequently produces optimal solutions to these problems. When the context progresses toward a continuous operating environment (service delivery over 24 hours a day, 7 days a week) we stress the need to shed the myopic views of the shift and days-off scheduling formulations in favor of an integrative tour scheduling formulation. For this problem we observe that a simple heuristic initiated by rounding down the associated LP solution consistently produces near optimal solutions. This observation is based on experiments over varying workforce requirement patterns.
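The covering formulation underlying these scheduling problems can be shown on a toy instance. The shift definitions and requirements below are my own, and where the paper solves the LP relaxation and rounds down, this sketch simply brute-forces the tiny integer program: minimize total employees such that every period's staffing requirement is covered.

```python
from itertools import product

shifts = {"early": [0, 1], "mid": [1, 2], "late": [2, 3]}  # periods covered (toy data)
req = [1, 2, 2, 1]                                         # staff required per period

def covers(counts):
    """Check that the given per-shift headcounts meet every period's requirement."""
    cover = [0] * len(req)
    for periods, n in zip(shifts.values(), counts):
        for p in periods:
            cover[p] += n
    return all(c >= r for c, r in zip(cover, req))

# minimize total employees subject to covering every period's requirement
best_total, best_counts = min(
    (sum(x), x) for x in product(range(4), repeat=len(shifts)) if covers(x)
)
```

On this instance the optimum assigns one employee to each shift (three in total); on realistic instances the enumeration explodes, which is why the paper's LP-plus-round-down heuristic matters.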

Journal ArticleDOI
TL;DR: In this article, the authors present and empirically test two procedures for correcting the bias in OLS estimates of beta, which are affected by friction in the trading process which delays the adjustment of a security's price to informational change and hence leads to an "intervalling effect" bias.
Abstract: The concept of beta as the measure of systematic risk has been widely accepted in the academic and financial community. Increasingly, betas are being used to estimate the cost of capital for corporations. Despite this, however, biases are generally present in ordinary least squares (OLS) estimates of beta. In particular, empirical estimates of beta are affected by friction in the trading process which delays the adjustment of a security's price to informational change and hence leads to an “intervalling-effect” bias. In this paper, we present and empirically test two procedures for correcting this bias. The first is to estimate the asymptotic value that OLS beta approaches as the differencing interval is lengthened without bound. The second procedure is to infer the value of beta by adjusting OLS beta for cross-sectional differences in the intervalling effect as a function of the depth of the market for a security, as measured by its value of shares outstanding. Our results suggest that a substantial correction is needed to get “true” beta estimates from short differencing interval data.
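The first correction can be illustrated with simulated data. In the sketch below (all parameters my own, not the paper's estimator), a thinly traded stock's observed return is a two-period moving average of its frictionless return, so short-interval OLS betas are biased toward zero; lengthening the differencing interval L shrinks the bias, and since beta_L is then approximately linear in 1/L, the intercept of a fit of beta_L on 1/L recovers the asymptotic beta.

```python
import random

random.seed(0)
TRUE_BETA = 1.2
n = 60000
m = [random.gauss(0, 1) for _ in range(n)]                       # market returns
true = [TRUE_BETA * x for x in m]                                # frictionless stock returns
obs = [0.5 * true[t] + 0.5 * true[t - 1] for t in range(1, n)]   # sluggish price adjustment
mkt = m[1:]

def ols_beta(y, x):
    """Slope of the OLS regression of y on x."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    return cov / var

def agg(r, L):
    """Non-overlapping L-period returns (differencing interval L)."""
    return [sum(r[i:i + L]) for i in range(0, len(r) - L + 1, L)]

intervals = [1, 2, 3, 4, 5]
betas = [ols_beta(agg(obs, L), agg(mkt, L)) for L in intervals]

# extrapolate: beta_L is ~linear in 1/L, so the intercept estimates the asymptotic beta
inv = [1.0 / L for L in intervals]
slope = ols_beta(betas, inv)
asymptotic = sum(betas) / len(betas) - slope * sum(inv) / len(inv)
```

Under this friction model the one-period beta is only about 0.6 of the true value, while the extrapolated intercept lands close to 1.2, echoing the paper's finding that a substantial correction is needed when betas are estimated from short differencing intervals.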

Journal ArticleDOI
TL;DR: The discussion of the fundamental p-center and p-median problems in Part I provides the basis for the work surveyed in Part II, which deals with the minimax and minisum location problems with mutual communication, location problems involving multiple objectives, the distance constraints problem and problems involving the location of paths.
Abstract: The discussion of the fundamental p-center and p-median problems in Part I of this paper provides the basis for the work surveyed in Part II. Part II deals with the minimax and minisum location problems with mutual communication, location problems involving multiple objectives, the distance constraints problem, and problems involving the location of paths. In addition, convexity issues in network location problems are discussed. Virtually all of this work exploits network structure; specifically, it is based on the assumption that the network is a tree. The concluding section gives a brief discussion of the state of the art and of current trends in network location research.
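As one example of the tree-exploiting results this survey builds on, Goldman's classic leaf-folding algorithm finds a 1-median of a vertex-weighted tree in linear time: a vertex is a 1-median exactly when no branch hanging off it carries more than half the total weight. A minimal sketch on a toy tree of my own:

```python
def tree_one_median(adj, weight):
    """Goldman's leaf-folding algorithm for the 1-median of a vertex-weighted tree.
    Repeatedly fold a leaf's weight into its neighbor; a vertex holding at least
    half the total weight is a 1-median."""
    w = dict(weight)
    nbrs = {v: set(ns) for v, ns in adj.items()}
    half = sum(w.values()) / 2.0
    leaves = [v for v in adj if len(nbrs[v]) == 1]
    while leaves:
        v = leaves.pop()
        if w[v] >= half:          # a majority vertex is a 1-median
            return v
        (u,) = nbrs[v]            # the leaf's single remaining neighbor
        nbrs[u].discard(v)
        nbrs[v].clear()
        w[u] += w[v]              # fold the leaf's weight into its neighbor
        if len(nbrs[u]) == 1:
            leaves.append(u)
        elif len(nbrs[u]) == 0:   # last vertex standing holds all the weight
            return u
    return None  # unreachable for a valid nonempty tree

# toy path a - b - c with most of the demand weight at c
adj = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
weight = {"a": 1, "b": 1, "c": 3}
median = tree_one_median(adj, weight)
```

On the path above the heavy endpoint c already carries more than half the weight, so it is the 1-median; on a star with equal leaf weights the algorithm folds its way to the center.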