
Showing papers in "Journal of the Operational Research Society in 2003"


Journal ArticleDOI
TL;DR: It is found that both the LS-SVM and neural network classifiers yield a very good performance, but also simple classifiers such as logistic regression and linear discriminant analysis perform very well for credit scoring.
Abstract: In this paper, we study the performance of various state-of-the-art classification algorithms applied to eight real-life credit scoring data sets. Some of the data sets originate from major Benelux and UK financial institutions. Different types of classifiers are evaluated and compared. Besides the well-known classification algorithms (eg logistic regression, discriminant analysis, k-nearest neighbour, neural networks and decision trees), this study also investigates the suitability and performance of some recently proposed, advanced kernel-based classification algorithms such as support vector machines and least-squares support vector machines (LS-SVMs). The performance is assessed using the classification accuracy and the area under the receiver operating characteristic curve. Statistically significant performance differences are identified using the appropriate test statistics. It is found that both the LS-SVM and neural network classifiers yield a very good performance, but also simple classifiers such as logistic regression and linear discriminant analysis perform very well for credit scoring.
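The area under the receiver operating characteristic curve used above to compare classifiers can be computed directly from classifier scores via the Mann–Whitney rank-sum statistic. A minimal sketch (function name and data are illustrative, not from the paper):

```python
def auc_from_scores(scores, labels):
    """Area under the ROC curve via the Mann-Whitney rank-sum statistic.

    scores: classifier scores (higher = more likely positive); labels: 1/0.
    """
    # Rank all scores ascending, averaging ranks over tied blocks.
    order = sorted(range(len(scores)), key=lambda i: scores[i])
    ranks = [0.0] * len(scores)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and scores[order[j + 1]] == scores[order[i]]:
            j += 1
        avg = (i + j) / 2.0 + 1.0  # 1-based average rank of the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    pos = [r for r, y in zip(ranks, labels) if y == 1]
    n_pos, n_neg = len(pos), len(labels) - len(pos)
    # AUC = P(score of a random positive > score of a random negative)
    return (sum(pos) - n_pos * (n_pos + 1) / 2.0) / (n_pos * n_neg)
```

The statistic equals the probability that a randomly chosen good applicant scores above a randomly chosen bad one, which is why it is a natural complement to raw classification accuracy.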

906 citations


Journal ArticleDOI
TL;DR: The forecasts produced by the new double seasonal Holt–Winters method outperform those from traditional Holt-Winters and from a well-specified multiplicative double seasonal ARIMA model.
Abstract: This paper considers univariate online electricity demand forecasting for lead times from a half-hour-ahead to a day-ahead. A time series of demand recorded at half-hourly intervals contains more than one seasonal pattern. A within-day seasonal cycle is apparent from the similarity of the demand profile from one day to the next, and a within-week seasonal cycle is evident when one compares the demand on the corresponding day of adjacent weeks. There is strong appeal in using a forecasting method that is able to capture both seasonalities. The multiplicative seasonal ARIMA model has been adapted for this purpose. In this paper, we adapt the Holt-Winters exponential smoothing formulation so that it can accommodate two seasonalities. We correct for residual autocorrelation using a simple autoregressive model. The forecasts produced by the new double seasonal Holt-Winters method outperform those from traditional Holt-Winters and from a well-specified multiplicative double seasonal ARIMA model.
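Taylor's double seasonal method is multiplicative and adds an AR(1) residual adjustment; a simplified additive error-correction sketch conveys the core idea of updating two seasonal index vectors from the same one-step-ahead error (parameter values and the flat initialisation are illustrative, not the paper's):

```python
def additive_double_seasonal_smoothing(y, m1, m2, alpha=0.1, g1=0.2, g2=0.2):
    """Simplified additive exponential smoothing with two seasonalities.

    m1: short cycle length (e.g. 48 half-hours in a day),
    m2: long cycle length (e.g. 336 half-hours in a week).
    Returns the one-step-ahead forecast made before each observation.
    """
    level = sum(y[:m2]) / m2     # crude level initialisation
    s1 = [0.0] * m1              # within-day seasonal terms
    s2 = [0.0] * m2              # within-week seasonal terms
    forecasts = []
    for t, obs in enumerate(y):
        f = level + s1[t % m1] + s2[t % m2]
        forecasts.append(f)
        err = obs - f
        # Correct the level and both seasonal components from one error.
        level += alpha * err
        s1[t % m1] += g1 * err
        s2[t % m2] += g2 * err
    return forecasts
```

On a series with a stable additive two-cycle pattern the one-step errors shrink geometrically, which is the behaviour the full multiplicative method exploits on half-hourly demand.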

640 citations


Journal ArticleDOI
TL;DR: It is assumed that the retailer also adopts a trade credit policy to stimulate customer demand; the retailer's replenishment model is developed and a theorem is derived to determine the retailer's optimal ordering policies efficiently.
Abstract: The main purpose of this note is to modify the assumption of the trade credit policy in previously published results to reflect real-life situations. All previously published models implicitly assumed that the supplier would offer the retailer a delay period, but the retailer would not offer a trade credit period to his/her customer. In most business transactions, this assumption is debatable. In this note, we assume that the retailer also adopts a trade credit policy to stimulate customer demand and develop the retailer's replenishment model accordingly. Furthermore, we assume that the trade credit period M offered to the retailer by the supplier is not shorter than the trade credit period N offered by the retailer to the customer (M⩾N). Under these conditions, we model the retailer's inventory system as a cost minimization problem to determine the retailer's optimal ordering policies. A theorem is then developed to determine the optimal ordering policies for the retailer efficiently. We deduce some previously published results of other researchers as special cases. Finally, numerical examples are given to illustrate the theorem obtained in this note.

429 citations


Journal ArticleDOI
TL;DR: This survey paper starts with a critical analysis of various performance metrics for supply chain management (SCM), used by a specific manufacturing company, and summarizes how economic theory treats multiple performance metrics.
Abstract: This survey paper starts with a critical analysis of various performance metrics for supply chain management (SCM), used by a specific manufacturing company. Then it summarizes how economic theory treats multiple performance metrics. Actually, the paper proposes to deal with multiple metrics in SCM via the balanced scorecard — which measures customers, internal processes, innovations, and finance. To forecast how the values of these metrics will change — once a supply chain is redesigned — simulation may be used. This paper distinguishes four simulation types for SCM: (i) spreadsheet simulation, (ii) system dynamics, (iii) discrete-event simulation, and (iv) business games. These simulation types may explain the bullwhip effect, predict fill rate values, and educate and train users. Validation of simulation models requires sensitivity analysis; a statistical methodology is proposed. The paper concludes with suggestions for a possible research agenda in SCM. A list with 50 references for further study is included.

337 citations


Journal ArticleDOI
TL;DR: A novel type of Kriging is discussed, one that ‘detrends’ data through the use of linear regression; Kriging gives more weight to ‘neighbouring’ observations in random or stochastic simulation.
Abstract: Whenever simulation requires much computer time, interpolation is needed. Simulationists use different interpolation techniques (eg linear regression), but this paper focuses on Kriging. This technique was originally developed in geostatistics by DG Krige, and has recently been widely applied in deterministic simulation. This paper, however, focuses on random or stochastic simulation. Essentially, Kriging gives more weight to ‘neighbouring’ observations. There are several types of Kriging; this paper discusses—besides Ordinary Kriging—a novel type, which ‘detrends’ data through the use of linear regression. Results are presented for two examples of input/output behaviour of the underlying random simulation model: Ordinary and Detrended Kriging give quite acceptable predictions; traditional linear regression gives the worst results.
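Ordinary Kriging in its simplest form fits in a few lines: solve a bordered correlation system so the interpolation weights sum to one, then take the weighted sum of the observations. A sketch (the Gaussian correlation function and one-dimensional inputs are illustrative choices, not the paper's setup):

```python
import numpy as np

def ordinary_kriging(x_obs, y_obs, x_new, corr_range=1.0):
    """Ordinary Kriging predictor with a Gaussian correlation function.

    Solves the standard system  [[R, 1], [1', 0]] [w; mu] = [r; 1]
    so the weights w sum to one; the prediction is w'y.
    """
    x_obs = np.asarray(x_obs, dtype=float)
    n = len(x_obs)
    # Correlations between observed points ...
    R = np.exp(-((x_obs[:, None] - x_obs[None, :]) / corr_range) ** 2)
    # ... and between observed points and the prediction point.
    r = np.exp(-((x_obs - x_new) / corr_range) ** 2)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = R
    A[:n, n] = 1.0
    A[n, :n] = 1.0
    b = np.append(r, 1.0)
    sol = np.linalg.solve(A, b)
    return float(sol[:n] @ np.asarray(y_obs, dtype=float))
```

With no nugget effect the predictor interpolates the data exactly, and because the weights sum to one it reproduces constant data everywhere; both properties distinguish it from ordinary least-squares regression.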

265 citations


Journal ArticleDOI
TL;DR: New genetic algorithms are developed for the Multi-Mode Resource-Constrained Project Scheduling Problem with makespan minimisation as the objective, extending the representation and operators previously designed for the single-mode version of the problem; a new fitness function is defined for infeasible individuals.
Abstract: In this paper we consider the Multi-Mode Resource-Constrained Project Scheduling Problem with makespan minimisation as the objective. We have developed new genetic algorithms, extending the representation and operators previously designed for the single-mode version of the problem. Moreover, we have defined a new fitness function for infeasible individuals. We have tested different variants of the algorithm and chosen the best to be compared with different heuristics previously published, using standard sets of instances included in PSPLIB. Results illustrate the good performance of our algorithm.

260 citations


Journal ArticleDOI
TL;DR: It is argued that this conception of critical systems thinking (CST) is neither theoretically sound nor conducive to reflective practice, and the paper proposes a new view of CST, and of reflective professional practice in general, as critically systemic discourse.
Abstract: Professional competence in applied disciplines such as OR/MS requires both technical expertise and critically reflective skills. Yet, a widespread misconception has taken hold of the OR/MS community…

248 citations


Journal ArticleDOI
TL;DR: This paper presents a framework within which to examine and compare the main philosophical assumptions underpinning management science methods, and their principal aims and purposes, in order to enable more informed and critically aware choices when designing particular combinations in practice.
Abstract: This paper presents a framework within which to examine and compare the main philosophical assumptions underpinning management science methods. It takes the position that they all have in common the basic mechanism of modelling, but that they differ in terms of what they model (ontology), how they model (epistemology), and why they model (axiology). A wide range of both hard and soft methods and methodologies are categorised within the paper. One of the purposes of the framework is to assist in the process of multimethodology—that is, combining together several methods in an intervention. In particular, it will assist users in understanding both the implicit or explicit assumptions underlying methods, and their principal aims and purposes, in order to be able to make more informed and critically aware choices when designing particular combinations in practice.

226 citations


Journal ArticleDOI
TL;DR: A new super-efficiency model is proposed that generates the same super-efficiency scores as conventional super-efficiency models for all units having a feasible solution under the latter, and generates a feasible solution for all units not having one.
Abstract: DEA super-efficiency models were introduced originally with the objective of providing a tie-breaking procedure for ranking units rated as efficient in conventional DEA models. This objective has been expanded to include sensitivity analysis, outlier identification and inter-temporal analysis. However, not all units rated as efficient in conventional DEA models have feasible solutions in DEA super-efficiency models. We propose a new super-efficiency model that (a) generates the same super-efficiency scores as conventional super-efficiency models for all units having a feasible solution under the latter, and (b) generates a feasible solution for all units not having a feasible solution under the latter. Empirical examples are provided to compare the two super-efficiency models.
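The conventional model that the paper generalises can be written as a small linear programme: evaluate unit k against the frontier formed by all other units, so efficient units can score above one and infeasibility can arise. A sketch of the input-oriented, constant-returns (CCR) form, which is an assumption here, not necessarily the exact model in the paper:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_super_efficiency(X, Y, k):
    """Input-oriented CCR super-efficiency score of unit k.

    X: (inputs x units), Y: (outputs x units). Unit k is excluded from
    the reference set; a score > 1 marks k as super-efficient.
    """
    m, n = X.shape
    s = Y.shape[0]
    others = [j for j in range(n) if j != k]
    # Decision variables: theta, then lambda_j for every j != k.
    c = np.zeros(1 + len(others))
    c[0] = 1.0                                     # minimise theta
    # sum_j lambda_j * x_ij - theta * x_ik <= 0    (input constraints)
    A_in = np.hstack([-X[:, [k]], X[:, others]])
    b_in = np.zeros(m)
    # -sum_j lambda_j * y_rj <= -y_rk              (output constraints)
    A_out = np.hstack([np.zeros((s, 1)), -Y[:, others]])
    b_out = -Y[:, k]
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.concatenate([b_in, b_out]),
                  bounds=[(0, None)] * (1 + len(others)))
    # None signals infeasibility -- the very issue the paper addresses.
    return res.fun if res.success else None
```

In a one-input/one-output example with units (x, y) = (1, 1) and (2, 1), excluding the efficient first unit forces its score above one, while the dominated second unit scores below one.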

184 citations


Journal ArticleDOI
TL;DR: A districting study undertaken for the Côte-des-Neiges local community health clinic in Montreal is described; the problem is solved by means of a tabu search technique that iteratively moves a basic unit to an adjacent district or swaps two basic units between adjacent districts.
Abstract: This article describes a districting study undertaken for the Côte-des-Neiges local community health clinic in Montreal. A territory must be partitioned into six districts by suitably grouping territorial basic units. Five districting criteria must be respected: indivisibility of basic units, respect for borough boundaries, connectivity, visiting personnel mobility, and workload equilibrium. The last two criteria are combined into a single objective function and the problem is solved by means of a tabu search technique that iteratively moves a basic unit to an adjacent district or swaps two basic units between adjacent districts. The problem was solved and the clinic management confirmed its satisfaction after a 2-year implementation period.

146 citations


Journal ArticleDOI
TL;DR: In this article, the authors demonstrate that Data Envelopment Analysis (DEA) can augment the traditional ratio analysis and provide a consistent and reliable measure of managerial or operational efficiency of a firm.
Abstract: Ratio analysis is a commonly used analytical tool for verifying the performance of a firm. While ratios are easy to compute, which in part explains their wide appeal, their interpretation is problematic, especially when two or more ratios provide conflicting signals. Indeed, ratio analysis is often criticized on the grounds of subjectivity, that is, the analyst must pick and choose ratios in order to assess the overall performance of a firm. In this paper we demonstrate that Data Envelopment Analysis (DEA) can augment the traditional ratio analysis. DEA can provide a consistent and reliable measure of managerial or operational efficiency of a firm. We test the null hypothesis that there is no relationship between DEA and traditional accounting ratios as measures of performance of a firm. Our results reject the null hypothesis, indicating that DEA can provide information to analysts that is additional to that provided by traditional ratio analysis. We also apply DEA to the oil and gas industry to demonstrate how financial analysts can employ DEA as a complement to ratio analysis.

Journal ArticleDOI
TL;DR: A new hybrid genetic algorithm to address the capacitated VRP is proposed, combining variations of key concepts inspired by routing techniques and search strategies used for a time-window variant of the problem to provide search guidance while balancing intensification and diversification.
Abstract: Although recently proved successful for variants of the vehicle routing problem (VRP) involving time windows, genetic algorithms have not yet been shown to compete with, or challenge, the current best search techniques in solving the classical capacitated VRP. A new hybrid genetic algorithm to address the capacitated VRP is proposed. The basic scheme consists of concurrently evolving two populations of solutions to minimize total travelled distance, using genetic operators that combine variations of key concepts inspired by routing techniques and search strategies used for a time-window variant of the problem, providing search guidance while balancing intensification and diversification. Results from a computational experiment on common benchmark problems show the proposed approach to be very competitive with the best-known methods.

Journal ArticleDOI
TL;DR: This paper argues that, within an organisational context, a useful alternative view is one in which knowledge is viewed as a systemic property of the organisational system to which it belongs, and investigates the potential that this stance offers OR practitioners.
Abstract: A central issue in the knowledge management literature is the definition of the nature of knowledge, and particularly the distinction between tacit and explicit knowledge. This paper reviews some of the common standpoints on this issue, but argues that, within an organisational context, a useful alternative view is one in which knowledge is viewed as a systemic property of the organisational system to which it belongs. Thus, attempts to codify knowledge, and position it on a tacit-explicit continuum, are sometimes misplaced. Instead, this paper advocates approaches that view knowledge as a holistic system property. The paper considers the practical implication of this stance, from the perspective of knowledge transfer between individuals and between organisations, and investigates the potential that this stance offers OR practitioners.

Journal ArticleDOI
TL;DR: The optimal ordering policy exhibits nice structural properties and can easily be implemented by a computer program; the service level and profit uncertainty level under the optimal policy are also discussed.
Abstract: We investigate in this paper an optimal two-stage ordering policy for seasonal products. Before the selling season, a retailer can place orders for a seasonal product from her supplier at two distinct stages satisfying the lead-time requirement. Market information is collected at the first stage and is used to update the demand forecast at the second stage by using Bayesian approach. The ordering cost at the first stage is known but the ordering cost at the second stage is uncertain. A two-stage dynamic optimization problem is formulated and an optimal policy is derived using dynamic programming. The optimal ordering policy exhibits nice structural properties and can easily be implemented by a computer program. The detailed implementation scheme is proposed. The service level and profit uncertainty level under the optimal policy are discussed. Extensive numerical analyses are carried out to study the performance of the optimal policy.
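The Bayesian forecast update at the heart of such a model can be illustrated with a normal–normal conjugate pair, followed by a critical-fractile order quantity at the second stage. This is a simplified sketch, not the paper's dynamic programme; all names and parameter values are illustrative:

```python
from statistics import NormalDist

def posterior_mean_var(prior_mean, prior_var, obs_mean, obs_var, n_obs):
    """Normal-normal conjugate update of the belief about mean demand."""
    precision = 1.0 / prior_var + n_obs / obs_var
    post_var = 1.0 / precision
    post_mean = post_var * (prior_mean / prior_var + n_obs * obs_mean / obs_var)
    return post_mean, post_var

def second_stage_order(post_mean, post_var, demand_var, price, cost, q1):
    """Newsvendor critical-fractile quantity under the updated belief,
    net of the first-stage order q1 (zero salvage value assumed)."""
    fractile = (price - cost) / price
    total_sd = (post_var + demand_var) ** 0.5   # predictive std deviation
    q_star = post_mean + total_sd * NormalDist().inv_cdf(fractile)
    return max(0.0, q_star - q1)
```

With a prior of N(100, 400) and four early-market observations averaging 140, the belief shifts most of the way toward the data; at a 0.5 critical fractile the second-stage order is simply the posterior mean minus the units already ordered.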

Journal ArticleDOI
TL;DR: In this article, the authors compare the classification accuracy of a model based only on accepted applicants, relative to one based on a sample of all applicants, and find that the lower the risk band of the training sample, the less accurate the predictions for all applicants.
Abstract: One of the aims of credit scoring models is to predict the probability of repayment of any applicant and yet such models are usually parameterised using a sample of accepted applicants only. This may lead to biased estimates of the parameters. In this paper we examine two issues. First, we compare the classification accuracy of a model based only on accepted applicants, relative to one based on a sample of all applicants. We find only a minimal difference, given the cutoff scores for the old model used by the data supplier. Using a simulated model we examine the predictive performance of models estimated from bands of applicants, ranked by predicted creditworthiness. We find that the lower the risk band of the training sample, the less accurate the predictions for all applicants. We also find that the lower the risk band of the training sample, the greater the overestimate of the true performance of the model, when tested on a sample of applicants within the same risk band — as a financial institution would do. The overestimation may be very large. Second, we examine the predictive accuracy of a bivariate probit model with selection (BVP). This parameterises the accept–reject model allowing for (unknown) omitted variables to be correlated with those of the original good–bad model. The BVP model may improve accuracy if the loan officer has overridden a scoring rule. We find that a small improvement is sometimes possible when using the BVP model.

Journal ArticleDOI
TL;DR: Computational results indicate that the linear version of the new model performs significantly better than the most successful linearization of the old model both in terms of average and maximum CPU times as well as in core storage requirements.
Abstract: We study the hub covering problem which, so far, has remained one of the unstudied hub location problems in the literature. We give a combinatorial and a new integer programming formulation of the hub covering problem that is different from earlier integer programming formulations. Both new and old formulations are nonlinear binary integer programs. We give three linearizations for the old model and one linearization for the new one and test their computational performances based on 80 instances of the CAB data set. Computational results indicate that the linear version of the new model performs significantly better than the most successful linearization of the old model both in terms of average and maximum CPU times as well as in core storage requirements.

Journal ArticleDOI
TL;DR: In this paper, a data envelopment analysis (DEA) approach is proposed to measure the relative efficiency of decision-making units in the presence of a multiple input-multiple output (MIMO) structure.
Abstract: The solidarity and social responsibility features that characterize the ethical mutual funds satisfy the fulfillment of humanitarian aims, but may lower the investment profitability. Hence, when we measure the performance of ethical mutual funds, we cannot disregard the ethical component. In this contribution, we propose a performance indicator that considers the expected return, the investment risk, the ethical component and the subscription and redemption costs together. The performance measure proposed is obtained using a data envelopment analysis (DEA) approach, which allows one to measure the relative efficiency of decision-making units in the presence of a multiple input–multiple output structure. The DEA performance indicator for ethical funds can be computed with different models, according to the nature of the ethical indicator that characterizes the socially responsible funds. In particular, a DEA categorical variable model seems appropriate.

Journal ArticleDOI
TL;DR: System dynamics is employed to illustrate the relationship between recruitment, training, skills, and knowledge in a causal loop form, and it is anticipated that system dynamics modelling will help organisations to devise efficient human resource management strategies.
Abstract: Recent transitions from the industrial to the knowledge economy suggest an immediate and wholesale retraining scenario if many organisations are to remain at the cutting edge of technology. The dynamics of the job market is creating a challenge for many organisations in recruiting and retaining their core staff. In fact, many companies fear losing critical business knowledge when their employees leave. In this paper, system dynamics is employed to illustrate the relationship between recruitment, training, skills, and knowledge in a causal loop form. Strategies for human resource management are developed by conducting time-based dynamic analysis. We anticipate that system dynamics modelling will help organisations to devise efficient human resource management strategies.
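A causal-loop structure of this kind is typically quantified as a small stock-and-flow model and integrated numerically. A sketch with purely illustrative parameters (a trainee stock feeding a skilled-staff stock through a first-order training delay, with constant fractional attrition):

```python
def simulate_skill_stock(recruits_per_month, attrition_rate, months, dt=0.25):
    """Euler integration of a two-stock model: trainees become skilled
    staff after an average training delay; skilled staff leave at a
    constant fractional attrition rate. All parameters are illustrative.
    """
    training_delay = 6.0          # months until a recruit is productive
    trainees, skilled = 0.0, 0.0
    history = []
    steps = int(months / dt)
    for _ in range(steps):
        completion = trainees / training_delay    # first-order delay outflow
        trainees += dt * (recruits_per_month - completion)
        skilled += dt * (completion - attrition_rate * skilled)
        history.append(skilled)
    return history
```

In the long run the skilled-staff stock settles at recruitment divided by the attrition rate, which is the kind of steady-state insight time-based dynamic analysis makes visible to HR planners.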

Journal ArticleDOI
TL;DR: A theoretical framework for measuring the efficiency of banking services taking into account physical and human resources, service quality and performance is proposed here, thereby linking the marketing variables to the financial metrics.
Abstract: In this paper, we present the development of a theoretical framework for measuring the efficiency of banking services taking into account physical and human resources, service quality and performance. Expenditures on quality improvement efforts and the impact of service quality on financial outcomes have long intrigued researchers. Banks have traditionally focused on how to transform their physical resources to generate financial performance, and they inadvertently ignored the mediating intangible factor of service quality. A theoretical framework on the optimization triad of resource, service quality and performance is proposed here, thereby linking the marketing variables to the financial metrics. A measure for the return on quality is developed as the ratio of the potential improvements in financial performance by enhancement of service quality to the observed performance figures. Empirical results obtained from a study of 27 Indian public sector banks and their customers allow us to measure the impact of service quality on financial performance, optimal level of service quality that can be generated using existing resources and the opportunity cost for sub-optimal service delivery. Banks delivering better service are shown to have better transformation of resource to performance using superior service delivery as the medium. Our results confirm the linkage between resource, service quality and performance for services.

Journal ArticleDOI
TL;DR: This paper addresses a shortcoming of Chang and Dye's inventory model by adding a non-constant purchase cost to the model, and shows that the total cost is a convex function of the number of replenishments.
Abstract: For seasonal products, fashionable commodities and high-tech products with a short product life cycle, the willingness of a customer to wait for backlogging during a shortage period diminishes with the length of the waiting time. Recently, Chang and Dye developed an inventory model in which the backlogging rate declines as the waiting time increases. In this paper, we address a shortcoming of their model by adding a non-constant purchase cost to the model. In addition, we show that the total cost is a convex function of the number of replenishments. We further simplify the search process by providing an intuitively good starting value, which reduces the computational complexity significantly. Finally, we characterize the influence of the demand patterns on the replenishment cycles and other quantities.

Journal ArticleDOI
C S Sung1, Sang Hwa Song1
TL;DR: This paper considers an integrated service network design problem for a given set of freight demands, concerned with integrating the location of cross-docking (CD) centers and the allocation of vehicles for the associated direct (transportation) services from an origin node to a CD center or from a CD center to a destination node.
Abstract: This paper considers an integrated service network design problem for a given set of freight demands that is concerned with integration of locating cross-docking (CD) centers and allocating vehicles for the associated direct (transportation) services from origin node to a CD center or from a CD center to the destination node. For the vehicle allocation, direct services (sub-routes) should be determined for the given freight demands, and then the vehicle allocation has to be made in consideration of routing for the associated direct service fulfillment subject to vehicle capacity and service time restriction. The problem is modeled as a path-based formulation for which a tabu-search-based solution algorithm is proposed. To guarantee the performance of the proposed solution algorithm, strong valid inequalities are derived based on the polyhedral characteristics of the problem domain and an efficient separation heuristic is derived for identifying any violated valid inequalities. Computational experiments are performed to test the performance of the proposed solution algorithm and also that of a valid-inequality separation algorithm, which finds that the solution algorithm works quite well and the separation algorithm provides strengthened lower bounds. Its immediate application may be made to strategic (integrated) service network designs and to tactical service network planning for the CD network.

Journal ArticleDOI
TL;DR: This paper reviews some of the work seeking to model and explain the behaviour of complex projects, and explains why lessons are difficult to learn from such projects.
Abstract: The ‘learning organisation’ is frequently emphasised in the literature and in practice, and this is particularly important for project-oriented organisations. However, experience tells us that organisations tend not to learn adequately from project experiences. This paper reviews some of the work seeking to model and explain the behaviour of complex projects, which explains why lessons are difficult to learn from such projects—not the easy and obvious lessons but the lessons about complex non-intuitive project behaviours. From there it looks at why projects are frequently not reviewed, and seeks to offer practical proposals for carrying out reviews, using small models to enable lessons to be learned that provide understanding (rather than simply data), and distributing that learning around the organisation.

Journal ArticleDOI
TL;DR: It is indicated that use of standard yield management approaches to pricing by airlines can result in significantly reduced revenues when buyers are using an informed and strategic approach to purchasing.
Abstract: Using tools from operations research, airlines have, for many years, taken a strategic approach to pricing the seats available on a particular flight based on demand forecasts and information. The result of this approach is that the same seat on the same flight is often offered at different fares at different times. Setting of these prices using yield-management approaches is a major activity for many airlines and is well studied in the literature. However, consumers are becoming increasingly aware of the existence of pricing strategies used by airlines. In addition, the availability of airline travel pricing on the Internet affords consumers the opportunity to behave more strategically when making purchase decisions. The onset of the information age makes it possible for an informed consumer or a third party, such as a travel agent, to obtain demand information similar to that used by the airlines. In particular, it is possible for consumers or travel agents to purchase historical data or to obtain it by monitoring the seats that are available at various prices for a given flight. If a consumer understands the pricing strategy and has access to demand information, he/she may decide to defer purchase of a ticket because they believe that a cheaper seat may yet become available. If consumers were to make use of this information to make such strategic purchasing decisions, what would be the impact on airline revenues? The purpose of this paper is to investigate these impacts. This work indicates that use of standard yield management approaches to pricing by airlines can result in significantly reduced revenues when buyers are using an informed and strategic approach to purchasing. Therefore, when airlines are setting or presenting prices, they should investigate the effect of strategic purchasing on their decisions.

Journal ArticleDOI
TL;DR: The proposed metaheuristic has a remarkably simple structure; it is lean and parsimonious, and it produces high-quality solutions over a set of published benchmark instances.
Abstract: In real-life situations most companies that deliver or collect goods own a heterogeneous fleet of vehicles. Their goal is to find a set of vehicle routes, each starting and ending at a depot, making the best possible use of the given vehicle fleet such that total cost is minimized. The specific problem can be formulated as the Heterogeneous Fixed Fleet Vehicle Routing Problem (HFFVRP), a variant of the classical Vehicle Routing Problem. This paper describes a variant of the threshold accepting heuristic for the HFFVRP. The proposed metaheuristic has a remarkably simple structure; it is lean and parsimonious, and it produces high-quality solutions over a set of published benchmark instances. Improvement over several previous best solutions further demonstrates the capabilities of the method and is encouraging for further research.
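Threshold accepting differs from simulated annealing only in its acceptance rule: a move is accepted whenever its cost increase falls below a deterministic threshold, with no acceptance probabilities. A generic sketch on a toy minimisation problem (not the paper's HFFVRP implementation; the toy function and neighbourhood are made up):

```python
import random

def threshold_accepting(cost, neighbour, x0, thresholds, iters_per_level, rng):
    """Accept any move whose cost increase is below the current threshold;
    thresholds decrease towards zero over the run."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    for thr in thresholds:
        for _ in range(iters_per_level):
            cand = neighbour(x, rng)
            fc = cost(cand)
            if fc - fx < thr:            # deterministic rule, no probabilities
                x, fx = cand, fc
                if fc < fbest:
                    best, fbest = cand, fc
    return best, fbest

# Toy usage: minimise a bumpy function over the integers.
f = lambda x: (x - 7) ** 2 + 3 * (x % 3)       # global minimum f(6) = 1
step = lambda x, r: x + r.choice([-2, -1, 1, 2])
best, fbest = threshold_accepting(f, step, 40, [8.0, 4.0, 2.0, 0.5], 200,
                                  random.Random(0))
```

Early, generous thresholds let the search climb out of poor regions; the final near-zero threshold reduces it to a local descent, which is what makes the method so lean compared with annealing schedules.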

Journal ArticleDOI
TL;DR: A discrete event simulation model was developed and used to explore changes to Edmonton EMS operations, finding that an SS system increased average unit availability and the fraction of calls reached within the department's response time standard, particularly during the current shift changeover periods.
Abstract: The City of Edmonton's Emergency Medical Services (EMS) department proposed to move to a ‘single start station system’ (SS system) in which all ambulances would begin and end their shifts at the same location. We developed a discrete event simulation model to estimate the impact of this change and subsequently used this model to explore other changes to Edmonton EMS operations, including the addition of stations, the addition of ambulances, different shifts, and a different redeployment system. We found that an SS system increased average unit availability and the fraction of calls reached within the department's response time standard, particularly during the current shift changeover periods. The paper describes the development and validation of the simulation model and summarizes the results of its application.

Journal ArticleDOI
TL;DR: There is no evidence of a skills gap, rather a feeling that there is no net gain from employing simulation methods when simpler methods will suffice, and that BPS projects are typically short, relatively non-technical, and rely on good project management for their success.
Abstract: In order to understand the requirements of people engaged in business process simulation (BPS), a survey was conducted among potential business process simulation users. The survey had a 37% response rate and revealed a low usage of simulation in the design, modification and improvement of business processes. It confirms that BPS projects are typically short, relatively non-technical, and rely on good project management for their success. Most BPS users employ general-purpose simulation software rather than purpose-designed business process simulators. There is no evidence of a skills gap, rather a feeling that there is no net gain from employing simulation methods when simpler methods will suffice. These findings are discussed and conclusions drawn.

Journal ArticleDOI
TL;DR: The transitivity of AHP scales is discussed, a scale is derived from the transitivity requirement so that it is naturally transitive, and a consistency measure is proposed to reflect judgmental inconsistency.
Abstract: Transitivity is important in multicriteria decision-making. The analytic hierarchy process (AHP), one of the most widely used decision analysis tools, has been criticized because it suffers from scale intransitivity. This paper first reviews and compares different scales from several perspectives, then discusses the transitivity of AHP scales and derives a scale from the transitivity requirement, so that it is naturally transitive. Two approaches are also provided for determining the scale parameter of the derived transitive scale. To deal with the transitivity problem, the AHP provides a consistency index for testing pairwise comparison consistency; finally, therefore, this paper proposes a consistency measure to reflect judgmental inconsistency.
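The standard consistency check referred to here is Saaty's consistency index, CI = (λmax − n)/(n − 1), which is zero exactly when the pairwise judgements are transitive (a_ik = a_ij · a_jk). A minimal sketch, estimating λmax by power iteration; the paper's own proposed consistency measure is not reproduced here, and the example matrices are illustrative only:

```python
def lambda_max(A, iters=200):
    """Principal eigenvalue of a positive matrix via power iteration."""
    n = len(A)
    v = [1.0 / n] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        s = sum(w)
        v = [x / s for x in w]
    w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    return sum(w[i] / v[i] for i in range(n)) / n   # averaged Rayleigh ratios

def consistency_index(A):
    """Saaty's CI = (lambda_max - n) / (n - 1); zero iff judgements are transitive."""
    n = len(A)
    return (lambda_max(A) - n) / (n - 1)

# Transitive judgements (a13 = a12 * a23 = 4): CI is zero.
consistent = [[1, 2, 4], [1/2, 1, 2], [1/4, 1/2, 1]]
# Intransitive judgements (a13 = 6 != 4): lambda_max > n, so CI is positive.
inconsistent = [[1, 2, 6], [1/2, 1, 2], [1/6, 1/2, 1]]
```

Since λmax ≥ n for any positive reciprocal matrix, with equality only under full transitivity, the index directly quantifies how far a set of judgements departs from a transitive scale.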

Journal ArticleDOI
TL;DR: Examination of half a million observations of customer order sizes at an electrical wholesaler finds a strong relationship linking the mean and the variance of order size; the scheme employed is described.
Abstract: This paper examines half a million observations of the size of orders from customers at an electrical wholesaler. It notes that: the distribution of order size for a single item (stock keeping unit, or SKU) is very skewed and resembles a geometric distribution; while the average order size differs between items, for a given SKU the mean order size is effectively the same at different branches, even when the branches have very different demand rates; and across a range of SKUs there is a strong relationship linking the mean and the variance of order size. These general results are shown to apply even to the slowest movers. This extension is important because, for items with intermittent demand, the size of customer orders is required to produce an unbiased estimate of demand. Knowledge of the distribution of demand is also important for setting maximum and minimum stock levels, and the scheme employed is described.
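A mean-variance link of the kind noted here follows directly from the geometric shape of the order-size distribution: for a geometric distribution on {1, 2, …} with mean m, the variance is m(m − 1), a function of the mean alone. The small simulation below illustrates this under assumed parameters; it uses synthetic draws, not the wholesaler's data, and the target mean of 3 is hypothetical.

```python
import random

def geometric_order_size(p, rng):
    """Trials to first success: order sizes on {1, 2, ...} with mean 1/p."""
    k = 1
    while rng.random() > p:
        k += 1
    return k

rng = random.Random(42)
target_mean = 3.0                 # hypothetical mean order size for one SKU
p = 1.0 / target_mean
orders = [geometric_order_size(p, rng) for _ in range(100_000)]

m = sum(orders) / len(orders)
v = sum((x - m) ** 2 for x in orders) / len(orders)
# For this geometric family, Var = m * (m - 1): knowing the mean order
# size pins down the variance, one simple form of a mean-variance link.
```

This is why, for intermittent demand, an estimate of mean order size carries so much information: under a geometric model it determines the whole distribution, and hence the quantities needed for unbiased demand estimates and stock-level settings.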

Journal ArticleDOI
TL;DR: This paper develops a simple approximation for queuing systems with a non-integer number of servers and shows that the near-optimal utilisation rate of the repair servers is usually high and depends mainly on the relative price of the servers compared with inventory items.
Abstract: The availability of repairable technical systems depends on the availability of (repairable) spare parts, which is influenced by (1) inventory levels and (2) repair capacity. In this paper, we present a procedure for simultaneous optimisation of these two factors. Our method is based on a modification of the well-known VARI-METRIC procedure for determining near-optimal spare part inventory levels and results for multi-class, multi-server queuing systems representing repair shops. The modification is required to avoid non-convexity problems in the optimisation procedure. To include part-time and overtime working, we allow for a non-integer repair capacity. To this end, we develop a simple approximation for queuing systems with a non-integer number of servers. Our computational experiments show that the near-optimal utilisation rate of the repair servers is usually high (0.80–0.98) and depends mainly on the relative price of the servers compared with inventory items. Further, the size of the repair shop (the minimal number of servers required for a stable system) also plays a part. We also show that our optimisation procedure is robust to the choice of step size for the server capacity.
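One simple way to evaluate a queueing measure at a non-integer number of servers is to interpolate a standard M/M/c quantity, such as the Erlang-C mean queue length, between the neighbouring integer server counts. The sketch below illustrates that idea only; it is an assumption for illustration, not necessarily the approximation developed in the paper, and the arrival and service rates are hypothetical.

```python
import math

def erlang_c_wait_prob(c, lam, mu):
    """Probability an arrival waits in a stable M/M/c queue (Erlang C)."""
    a = lam / mu                        # offered load in Erlangs
    rho = a / c                         # per-server utilisation, must be < 1
    head = sum(a**k / math.factorial(k) for k in range(c))
    tail = a**c / (math.factorial(c) * (1 - rho))
    return tail / (head + tail)

def mean_queue_length(c, lam, mu):
    """Mean number of jobs waiting, Lq = C(c, a) * rho / (1 - rho)."""
    rho = lam / (c * mu)
    return erlang_c_wait_prob(c, lam, mu) * rho / (1 - rho)

def mean_queue_length_fractional(c, lam, mu):
    """Linear interpolation between the neighbouring integer server counts."""
    lo, hi = math.floor(c), math.ceil(c)
    if lo == hi:
        return mean_queue_length(lo, lam, mu)
    w = c - lo
    return ((1 - w) * mean_queue_length(lo, lam, mu)
            + w * mean_queue_length(hi, lam, mu))
```

For example, with arrival rate 2 and service rate 1 per server, the interpolated queue length at c = 3.5 lands between the exact values at 3 and 4 servers, giving a smooth objective in the server capacity, which is what an optimisation over part-time and overtime working needs.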

Journal ArticleDOI
TL;DR: This research is based on detailed records from the use of JOURNEY Making, in which special-purpose Group Support software was used to aid group problem structuring, and suggests a typology of knowledge sharing.
Abstract: Problem-structuring techniques are an integral aspect of 'Soft-OR'. SSM, SAST, Strategic Choice, and JOURNEY Making all depend for their success on a group developing a shared view of a problem through some form of explicit modelling. The negotiated problem structure becomes the basis for problem resolution. Implicit in this process is an assumption that members of the group share and build their knowledge about the problem domain. This paper explores the extent to which this assumption is reasonable. The research is based on detailed records from the use of JOURNEY Making, in which special-purpose Group Support software was used to aid group problem structuring. This software continuously tracks the contributions of each member of the group, and thus the extent to which members appear to be 'connecting' and augmenting their own knowledge with that of other members of the group. Software records of problem resolution in real organisational settings are used to explore the sharing of knowledge among senior managers. These explorations suggest a typology of knowledge sharing. The implications of this typology for problem structuring and an agenda for future research are considered.