
Showing papers in "Omega-international Journal of Management Science in 2009"


Journal ArticleDOI
TL;DR: In this article, the authors examined various factors associated with the adoption of e-procurement and found that firm size, top management support, perceived indirect benefits, and business partner influence are positively and significantly associated with e-procurement adoption.
Abstract: This study examines various factors associated with the adoption of e-procurement. A survey questionnaire was administered to collect data from 141 companies in Singapore. Using logistic regression analysis, we found that firm size, top management support, perceived indirect benefits, and business partner influence are positively and significantly associated with the adoption of e-procurement. Further, industry type does not show any relationship with e-procurement adoption. Implications of our results are discussed.

372 citations


Journal ArticleDOI
TL;DR: In this paper, the impact of supply disruption risks on the choice between single and dual sourcing methods in a two-stage supply chain with non-stationary and price-sensitive demand is evaluated.
Abstract: This paper evaluates the impact of supply disruption risks on the choice between single and dual sourcing methods in a two-stage supply chain with non-stationary and price-sensitive demand. The expected profit functions of the two sourcing modes in the presence of supply chain disruption risks are first obtained, and then compared so that the critical values of the key factors affecting the final choice are identified. Finally, the sensitivity of the buyer's expected profit to various input factors is examined through numerical examples, which provide guidelines on how to use each sourcing method.

369 citations


Journal ArticleDOI
TL;DR: Knowledge management is a set of relatively new organizational activities that are aimed at improving knowledge, knowledge-related practices, organizational behaviors and decisions, and organizational performance; the "intermediate outcomes" of KM are improved organizational behaviors, decisions, products, services, processes and relationships.
Abstract: Knowledge management (KM) is a set of relatively new organizational activities that are aimed at improving knowledge, knowledge-related practices, organizational behaviors and decisions, and organizational performance. KM focuses on knowledge processes: knowledge creation, acquisition, refinement, storage, transfer, sharing and utilization. These processes support organizational processes involving innovation, individual learning, collective learning and collaborative decision-making. The "intermediate outcomes" of KM are improved organizational behaviors, decisions, products, services, processes and relationships that enable the organization to improve its overall performance. Knowledge Management and Organizational Learning presents some 20 papers organized into five sections covering basic concepts of knowledge management; knowledge management issues; knowledge management applications; measurement and evaluation of knowledge management and organizational learning; and organizational learning. Volume editor William R. King is the University Professor of Business Administration at the Joseph M. Katz Graduate School of Business and College of Business Administration, University of Pittsburgh. He was the founding president of the Association for Information Systems (AIS) and a past president of The Institute of Management Sciences (TIMS) (1989–90), an international professional society with 8,000 members, which he guided to merge with the Operations Research Society of America to form INFORMS. He has twice served as chair of ICIS, the annual International Conference on Information Systems (1988, 2005), has served as editor-in-chief of the Management Information Systems Quarterly, the primary journal in the field of information systems, and was the key figure in the founding of a new journal, Information Systems Research.

351 citations


Journal ArticleDOI
TL;DR: In this article, the authors developed and tested a model of innovation behavior in the hotel industry, which relates four types of innovation (management, external communication, service scope and back-office) to the key determinants: service provider characteristics, customer competences and the market drivers.
Abstract: We develop and test a model of innovation behavior in the hotel industry. The model relates four types of innovation (management, external communication, service scope and back-office) to the key determinants: service provider characteristics, customer competences and market drivers. Using statistical probit models and cross-sectional survey data from a stratified sample of hotels in the Balearic Islands (N = 331), we verify the model, including the determinants of the innovation types and the impact of innovation on hotel performance. The main findings confirm the effects of these determinants on innovation and the positive impact of innovation on hotel performance. The determinants of innovation decisions are: the additional services on offer, whether bookings are made through tour operators, whether the hotel is part of a hotel chain, and whether the owners of the hotel run the business.

336 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used loss aversion to model the manager's decision-making behavior in the single-period newsvendor problem, and found that if the shortage cost is not negligible, then a loss-averse newsvendor may order more than a risk-neutral newsvendor.
Abstract: Newsvendor models are widely used in the literature, and are usually based on the assumption of risk neutrality. This paper uses loss aversion to model the manager's decision-making behavior in the single-period newsvendor problem. We find that if the shortage cost is not negligible, then a loss-averse newsvendor may order more than a risk-neutral newsvendor. We also find that the loss-averse newsvendor's optimal order quantity may increase in the wholesale price and decrease in the retail price, which can never occur in the risk-neutral newsvendor model.
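
As a reference point for the comparison above, the following sketch (Python; all parameter values are illustrative assumptions, not taken from the paper) computes the classic risk-neutral newsvendor order quantity from the critical fractile when a shortage cost is included. The paper's loss-averse model itself is not reproduced here.

# Risk-neutral newsvendor benchmark with a shortage cost (illustrative numbers).
from scipy.stats import norm

p = 10.0   # retail price
w = 6.0    # wholesale (purchase) cost
s = 2.0    # shortage penalty per unsatisfied unit, beyond the lost margin
v = 1.0    # salvage value per leftover unit

underage = p - w + s                      # cost of ordering one unit too few
overage = w - v                           # cost of ordering one unit too many
critical_fractile = underage / (underage + overage)

# Assume normally distributed demand with mean 100 and standard deviation 20
q_star = norm.ppf(critical_fractile, loc=100, scale=20)
print(f"risk-neutral order quantity: {q_star:.1f}")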

290 citations


Journal ArticleDOI
TL;DR: In this article, a modified variable returns to scale (VRS) model was proposed to evaluate the performance of countries in the Olympic Games, where each DMU is viewed as a competitor via a non-cooperative game and a multiplier bundle is determined that optimizes the efficiency score for that DMU.
Abstract: A number of studies have used data envelopment analysis (DEA) to evaluate the performance of countries in the Olympic Games. While competition exists among the countries in Olympic games and rankings, none of these DEA studies models competition among peer decision making units (DMUs), or countries. These DEA studies find a set of weights/multipliers that keep the efficiency scores of all DMUs at or below unity. Although cross efficiency goes a step further by providing an efficiency measure in terms of the best multiplier bundle for the unit and all the other DMUs, it is not always unique. This paper presents a new and modified DEA game cross-efficiency model in which each DMU is viewed as a competitor via a non-cooperative game. For each competing DMU, a multiplier bundle is determined that optimizes the efficiency score for that DMU, with the additional constraint that the resulting score should be at or above that DMU's estimated best performance. The problem, of course, is that we will not know this best performance score for the DMU under evaluation until the best performances of all other DMUs are known. To combat this "chicken and egg" phenomenon, an iterative approach leading to the Nash equilibrium is presented. The paper also provides a modified variable returns to scale (VRS) model that yields non-negative cross-efficiency scores. The approach is applied to the last six Summer Olympic Games. Our results suggest that our game cross-efficiency model implicitly incorporates the relative importance of gold, silver and bronze medals without the need to specify exact assurance regions.
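
For readers unfamiliar with the underlying DEA machinery, the sketch below (Python, with invented input/output data) solves the standard input-oriented CCR multiplier model that cross-efficiency methods build on; the paper's game cross-efficiency iteration and its VRS modification are not reproduced here.

# Standard input-oriented CCR multiplier model solved as a linear program.
import numpy as np
from scipy.optimize import linprog

X = np.array([[20.0, 300.0], [30.0, 200.0], [40.0, 100.0]])  # inputs, one row per DMU
Y = np.array([[1.0, 5.0], [2.0, 7.0], [3.0, 4.0]])           # outputs, one row per DMU

def ccr_efficiency(o):
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s) followed by input weights v (length m)
    c = np.concatenate([-Y[o], np.zeros(m)])            # maximize u'y_o
    A_eq = np.concatenate([np.zeros(s), X[o]])[None]    # v'x_o = 1
    A_ub = np.hstack([Y, -X])                           # u'y_j - v'x_j <= 0 for every DMU j
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                  A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
    return -res.fun

print([round(ccr_efficiency(o), 3) for o in range(len(X))])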

221 citations


Journal ArticleDOI
TL;DR: In this paper, the impact of financial constraint on the performance of a supply chain is examined under the preorder mode, the consignment mode and a combination of the two, and the authors show that with financial constraint the combination mode is the most efficient mode even if the retailer earns zero internal capital.
Abstract: A supply chain may operate under the preorder mode, the consignment mode or a combination of these two modes. Under preorder, the retailer procures before the sale and takes full inventory risk during the sale, while under consignment, the retailer sells the product for the supplier, with the supplier taking the inventory risk. The combination mode shares the risk within the supply chain. Existing research has examined these supply chain modes from various operational aspects, but the impact of financial constraint has been neglected. This paper examines the impact of financial constraint and investigates supply chain efficiency under each mode. Based on a Stackelberg game with the supplier as the leader, we show that without financial constraint the supplier always prefers the consignment mode, taking full inventory risk. In the presence of financial constraint, however, the supplier will sell part of the inventory to the retailer through preorder, which shares the inventory risk in the supply chain. We show that with financial constraint, the combination mode is the most efficient mode even if the retailer earns zero internal capital.

202 citations


Journal ArticleDOI
TL;DR: This study develops an OSS success model from a previous Information Systems success model incorporating the characteristics of OSS, and demonstrates that software quality and community service quality have significant effects on user satisfaction.
Abstract: Since the mid-1990s, there has been a surge of interest among academics and practitioners in open source software (OSS). While there is an abundance of literature on OSS, most studies on OSS success are either qualitative or exploratory in nature. To identify the factors that influence OSS success and to establish generalizability, an empirical study measuring OSS success would enable OSS developers and users to improve OSS usage. In this study, we develop an OSS success model from a previous Information Systems success model, incorporating the characteristics of OSS. Using the proposed model, we identify five determinants of OSS success as well as a number of significant relationships among these determinants. Our findings demonstrate that software quality and community service quality have significant effects on user satisfaction. Software quality and user satisfaction, in turn, have significant effects on OSS use. Additionally, OSS use and user satisfaction have significant effects on individual net benefits. This research contributes towards advancing the theoretical understanding of OSS success as well as offering OSS practitioners guidance for enhancing OSS success.

196 citations


Journal ArticleDOI
TL;DR: In this paper, the authors used the Kruskal-Wallis test to statistically verify the existence of a relationship between the organizational form of a chain and its efficiency score, and showed that plural form networks are on average more efficient than strictly franchised and wholly owned chains.
Abstract: Plural form tends to be the most popular organizational form in retail and service networks compared to purely franchised or purely company-owned systems. In the first part, this paper traces the evolution of researchers' thinking from the view that considers franchising and ownership as substitutable organizational forms to theories that analyze the use of both franchise and company arrangements. The paper describes the main attempts to explain theoretically the superiority of plural forms. In the second part, the paper examines the hypothesis that there is a relationship between the organizational form of a chain and its efficiency score. The application of a data envelopment analysis method to French hotel chains shows that plural form networks are on average more efficient than strictly franchised and wholly owned chains. The Kruskal–Wallis test, a distribution-free rank-order statistic, is used to statistically verify this relationship. The result does not permit rejection of the null hypothesis that no organizational form is more efficient than another. Hence, this paper opens prospects for research aimed at testing the organizational form effect on different samples and with other methods.

187 citations


Journal ArticleDOI
TL;DR: In this article, the authors illustrate an application of non-oriented network slacks-based measure using simulated profit center data that, in turn, rely on actual aggregate data on domestic commercial banks in the United Arab Emirates (UAE).
Abstract: Standard data envelopment analysis (DEA) does not provide adequate detail to identify the specific sources of inefficiency embedded in interacting divisions of an organization. On the other hand, network DEA gives access to this underlying diagnostic information that would otherwise remain undiscovered. As a first study of its kind, the paper illustrates an application of a non-oriented network slacks-based measure using simulated profit center data that, in turn, rely on actual aggregate data on domestic commercial banks in the United Arab Emirates (UAE). The study also contributes to a perennial research problem, namely, the inability of the outside researcher to access internal data for developing or testing new methods. In addition to these contributions to the Operations Research literature, the focus on the UAE contributes to the banking literature because this rapidly expanding part of the Middle East seldom appears in the frontier efficiency literature.

183 citations


Journal ArticleDOI
TL;DR: In this paper, the authors reconsider one of the centralized data envelopment analysis (DEA) models and suggest modifying it to only consider adjustments of previously inefficient units, for situations where some variables are controlled by a central authority (e.g. Head Office) rather than individual unit managers.
Abstract: In two recent papers, Lozano and Villa [Centralized resource allocation using data envelopment analysis. Journal of Productivity Analysis 2004;22:143–61. [1]] and Lozano et al. [Centralized target setting for regional recycling operations using DEA. OMEGA 2004;32:101–10. [2]] introduce the concept of "centralized" data envelopment analysis (DEA) models, which aim at optimizing the combined resource consumption by all units in an organization rather than considering the consumption by each unit separately. This is particularly relevant for situations where some variables are controlled by a central authority (e.g. Head Office) rather than individual unit managers. In this paper we reconsider one of the centralized models proposed by the above-mentioned authors and suggest modifying it to only consider adjustments of previously inefficient units. We show how this new model formulation relates to a standard DEA model, namely as the analysis of the mean inefficient point. We also provide a procedure that can be used to generate alternative optimal solutions, enabling a decision maker to search through alternative solution possibilities in order to select the preferred one. We then extend the model to incorporate non-transferable as well as strictly non-discretionary variables and illustrate the models using an empirical example of a public service organization.

Journal ArticleDOI
TL;DR: Five new methods that outperform NEH are presented, as supported by careful statistical analyses using the well-known instances of Taillard; the methods counter the excessive greediness of NEH by carrying out re-insertions of already inserted jobs at some points in the construction of the solution.
Abstract: The well-known NEH heuristic from Nawaz, Enscore and Ham proposed in 1983 has been recognized as the highest performing method for the permutation flowshop scheduling problem under the makespan minimization criterion. This performance lead is maintained even today when compared against contemporary and more complex heuristics as shown in recent studies. In this paper we show five new methods that outperform NEH as supported by careful statistical analyses using the well-known instances of Taillard. The proposed methods try to counter the excessive greediness of NEH by carrying out re-insertions of already inserted jobs at some points in the construction of the solution. The five proposed heuristics range from extensions that are slightly slower than NEH in most tested instances to more comprehensive methods based on local search that yield excellent results at the expense of some added computational time. Additionally, NEH has been profusely used in the flowshop scheduling literature as a seed sequence in high performing metaheuristics. We demonstrate that using some of our proposed heuristics as seeds yields better final results in comparison.
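
Since the paper benchmarks against NEH, a minimal sketch of the classic NEH heuristic itself may help; it is written in Python with made-up processing times, and the authors' re-insertion-based improvements are not reproduced here.

# Classic NEH heuristic (Nawaz, Enscore and Ham, 1983) for permutation flowshop
# makespan minimization.

def makespan(seq, p, n_machines):
    # completion[m] is the completion time of the last scheduled job on machine m
    completion = [0] * n_machines
    for j in seq:
        for m in range(n_machines):
            start = max(completion[m], completion[m - 1] if m > 0 else 0)
            completion[m] = start + p[j][m]
    return completion[-1]

def neh(p, n_machines):
    # Step 1: order jobs by non-increasing total processing time
    order = sorted(range(len(p)), key=lambda j: -sum(p[j]))
    seq = [order[0]]
    # Step 2: insert each remaining job in the position that gives the lowest makespan
    for j in order[1:]:
        seq = min((seq[:k] + [j] + seq[k:] for k in range(len(seq) + 1)),
                  key=lambda s: makespan(s, p, n_machines))
    return seq, makespan(seq, p, n_machines)

# Four jobs, three machines; processing times are invented for the demo
p = [[5, 9, 8], [9, 3, 10], [9, 4, 5], [4, 8, 8]]
print(neh(p, 3))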

Journal ArticleDOI
TL;DR: This paper presents a note on mean-variance analysis of the newsvendor model with stockout cost.
Abstract: Note: Pre-published version entitled: A Note on Mean-variance Analysis of the Newsvendor Model with Stockout Cost.

Journal ArticleDOI
TL;DR: In this paper, the capacity decisions and expected performance of two alternative manufacturing network configurations when demand and return flows are both uncertain are examined, and the underlying decision problems are formulated as two-stage stochastic programs with recourse.
Abstract: Efficient implementation of product recovery requires appropriate network structures. In this paper, we study the network design problem of a firm that manufactures new products and remanufactures returned products in its facilities. We examine the capacity decisions and expected performance of two alternative manufacturing network configurations when demand and return flows are both uncertain. Concerning the market structure, we further distinguish between the case where newly manufactured and remanufactured products are sold on the same market and the case where recovered products have to be sold on a secondary market. We consider network structures where manufacturing and remanufacturing are both conducted in common plants as well as structures that pool all remanufacturing activities in a separate plant. The underlying decision problems are formulated as two-stage stochastic programs with recourse. Based on numerical studies with normally distributed demands and returns, we show that particularly network size, investment costs of (re-)manufacturing capacity, and market structure have a strong impact on the choice of a network configuration. Concerning the general role of manufacturing configuration in a system with product recovery, our results indicate that the investigated structures can lead to very different expected profits. We also examine the sensitivity of network performance to changes in return volumes, return variability and correlation between return and demand. Based on these results, we find that integrated plants are more beneficial in the common market setting. This relative advantage tends to diminish when demand is segmented, thus investing in more specialized, dedicated resources should be considered.
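
To make the modeling idea concrete, here is a deliberately simplified two-stage sketch in the spirit of a stochastic program with recourse, solved here by sampling and enumeration rather than a stochastic programming solver: stage one fixes plant capacity, stage two reacts to sampled demand and return volumes. All parameters are invented and the model is far coarser than the network configurations studied in the paper.

# Toy two-stage capacity problem with recourse, solved by Monte Carlo sampling
# and a grid search over the first-stage capacity decision.
import numpy as np

rng = np.random.default_rng(0)
scenarios = list(zip(rng.normal(100, 20, 1000),   # demand per scenario
                     rng.normal(40, 15, 1000)))   # returns per scenario

price, c_man, c_reman, c_cap = 10.0, 6.0, 3.0, 2.0

def expected_profit(capacity):
    total = 0.0
    for d, r in scenarios:
        served = min(max(d, 0.0), capacity)   # recourse: serve what demand and capacity allow
        reman = min(max(r, 0.0), served)      # use cheaper remanufacturing first
        total += price * served - c_reman * reman - c_man * (served - reman)
    return total / len(scenarios) - c_cap * capacity

best_capacity = max(range(0, 201), key=expected_profit)
print(best_capacity, round(expected_profit(best_capacity), 1))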

Journal ArticleDOI
TL;DR: This paper provides a state-of-the-art review of the research and future research projections and is a starting point for anyone conducting research in the deterministic dynamic demand lot-sizing field.
Abstract: Due to their importance in industry and their mathematical complexity, dynamic demand lot-sizing problems are frequently studied. In this article, we consider coordinated lot-size problems, their variants, and exact and heuristic solution approaches. The problem class provides a comprehensive approach for representing single and multiple items, coordinated and uncoordinated setup cost structures, and capacitated and uncapacitated problem characteristics. While efficient solution approaches have eluded researchers, recent advances in problem formulation and algorithms are enabling large-scale problems to be solved effectively. This paper updates a 1988 review of the coordinated lot-sizing problem and complements recent reviews on the single-item lot-sizing problem and the capacitated lot-sizing problem. It provides a state-of-the-art review of the research and projections for future research, and is a starting point for anyone conducting research in the deterministic dynamic demand lot-sizing field.

Journal ArticleDOI
TL;DR: In this article, a new set of financial and non-financial performance indicators that can be used by high-tech manufacturing companies and have developed a business performance evaluation model are used in the model.
Abstract: The nature of competition in the high-technology manufacturing industry has changed dramatically over the last two decades, and any of the traditional indicators of business performance are insufficient today. We have identified a new set of financial and non-financial performance indicators that can be used by high-tech manufacturing companies and have developed a business performance evaluation model. A data envelopment analysis (DEA), an analytic hierarchy process (AHP), and a fuzzy multi-criteria decision-making approach are used in the model. Data from large-sized thin-film transistor liquid-crystal display panel companies in Taiwan were collected via a field survey and from various published databases and were fed into the model to determine the relative business performance of the companies. We hope that our findings will help high-tech manufacturing executives determine their companies’ strengths and weaknesses and lead to future improvements in business operations.

Journal ArticleDOI
TL;DR: This work highlights the role of multi-criteria decision analysis (MCDA) within RODOS in ensuring the transparency of decision processes within emergency and remediation management and improving the acceptability of the system as a whole.
Abstract: Environmental emergency situations can differ in many ways, for instance according to their causes and the dimension of their impacts. Yet, they share the characteristic of sudden onset and the necessity for a coherent and effective emergency management. In this paper we consider decision support in the event of a nuclear or radiological accident in Europe. RODOS, an acronym for real-time on-line decision support system, is a decision support system designed to provide support from the early phases through to the medium and long-term phases. This work highlights the role of multi-criteria decision analysis (MCDA) within RODOS in ensuring the transparency of decision processes within emergency and remediation management. Special emphasis is placed on the evaluation of alternative remediation or countermeasure strategies using the multi-criteria decision support tool Web-HIPRE in scenario focused decision making workshops involving different stakeholder and expert groups. Decision support is enhanced by a module that generates natural language explanations to facilitate the understanding of the evaluation process, therefore contributing to the direct involvement of the decision makers, with the aim of increasing their confidence in the results of the analyses carried out, forming an audit trail for the decision making process and improving the acceptability of the system as a whole.

Journal ArticleDOI
TL;DR: In this article, a special form of the single-period inventory problem (newsvendor problem) with a known demand and stochastic supply (yield) is studied, and a general analytic solution for two types of yield risks, additive and multiplicative, is described.
Abstract: A special form of the single-period inventory problem (newsvendor problem) with a known demand and stochastic supply (yield) is studied. A general analytic solution for two types of yield risks, additive and multiplicative, is described. Numerical examples demonstrate the solutions for special cases of uniform distribution yield risks. An analysis of a two-tier supply chain of customer and producer reveals that the customer may find it optimal to order more than is needed, since a larger order increases the producer's optimal production quantity.
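
The qualitative finding that it can pay to order more than the known demand is easy to illustrate numerically. The sketch below uses Monte Carlo simulation with a multiplicative uniform yield and made-up cost parameters; it is not the paper's analytic solution.

# Order inflation under multiplicative yield risk (illustrative numbers only).
import numpy as np

rng = np.random.default_rng(1)
D = 100.0                                    # known demand
yield_rate = rng.uniform(0.6, 1.0, 10000)    # multiplicative yield, U ~ Uniform(0.6, 1.0)
c, under_pen, over_pen = 1.0, 5.0, 0.5       # unit order cost, shortage and excess penalties

def expected_cost(Q):
    received = Q * yield_rate
    return (c * Q
            + under_pen * np.maximum(D - received, 0).mean()
            + over_pen * np.maximum(received - D, 0).mean())

Q_star = min(np.arange(80, 181), key=expected_cost)
print(Q_star)   # comes out well above the demand of 100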

Journal ArticleDOI
TL;DR: In this article, the authors analyzed technical efficiency of Italian and Spanish football during three recent seasons, to shed light on the sport performance of professional football clubs, using mathematical optimization methods, particularly DEA models, which enable the calculation of the frontiers of efficient production.
Abstract: This paper analyses the technical efficiency of Italian and Spanish football during three recent seasons, to shed light on the sport performance of professional football clubs. To achieve this we have used mathematical optimisation methods, particularly DEA models, which enable the calculation of the frontiers of efficient production. Some of the most interesting results are the following. Firstly, the Spanish league is clearly more homogeneous and competitive than the Italian league. Secondly, to obtain a better classification in the Italian league, it is much more important to improve defensive, rather than offensive, efficiency; the popular maxim holds in Italy: the best attack begins with a good defence. Thirdly, in Spain our analysis supports the idea that, to improve a club's ranking in the league, the best-rewarded strategy consists of improving offensive efficiency when playing at home, followed by increasing offensive efficiency when playing away from home.

Journal ArticleDOI
TL;DR: The aim is to differentiate between two dimensions of flexibility important to the manufacturing value chain, i.e., volume and product mix flexibility, and to investigate how different flexibility configurations are related to various manufacturing practices.
Abstract: In this paper we address flexibility and investigate the relationship between volume and product mix flexibility. One view of flexibility is that of being a capability in itself: another view is th ...

Journal ArticleDOI
TL;DR: New ways of utilizing preference information specified by the decision maker in interactive reference point based methods are introduced to take the desires of the decision makers into account more closely when projecting the reference point onto the set of nondominated solutions.
Abstract: In this paper, we introduce new ways of utilizing preference information specified by the decision maker in interactive reference point based methods. A reference point consists of desirable values for each objective function. The idea is to take the desires of the decision maker into account more closely when projecting the reference point onto the set of nondominated solutions. In this way we can support the decision maker in finding the most satisfactory solutions faster. In practice, we adjust the weights in the achievement scalarizing function that projects the reference point. We identify different cases depending on the amount of additional information available and demonstrate the cases with examples. Finally, we summarize results of extensive computational tests that give evidence of the efficiency of the ideas proposed.
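
For readers who have not seen reference point methods, the sketch below shows one common form of an augmented achievement scalarizing function for objectives to be minimized; the weight-adjustment schemes proposed in the paper are not reproduced, and all numbers are illustrative.

# One standard augmented achievement scalarizing function (minimization objectives).
def achievement(f, ref, weights, rho=1e-4):
    # f: objective values of a candidate solution, ref: the reference point,
    # weights: positive scaling weights; rho adds the usual augmentation term.
    terms = [w * (fi - ri) for fi, ri, w in zip(f, ref, weights)]
    return max(terms) + rho * sum(terms)

# Lower values indicate solutions that better satisfy the decision maker's aspirations
print(achievement(f=[2.0, 3.5], ref=[1.5, 4.0], weights=[1.0, 0.5]))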

Journal ArticleDOI
TL;DR: In this article, data envelopment analysis (DEA) and a Malmquist index are combined with a bootstrap method to determine the performance of grain producers in Eastern Norway.
Abstract: Previous applications of data envelopment analysis (DEA) and its subsequent Malmquist indices to efficiency and productivity measurements have been criticised for not providing statistical inferences regarding the significance of observed results. In this paper, DEA and a Malmquist index are combined with a bootstrap method in order to provide succinct statistical inferences that determine the performance of grain producers in Eastern Norway. The data cover the period between 1987 and 1997. Results reveal: (i) a significant degree of inefficiency (approximately 11%) and an average productivity progress of 38% over the period considered; (ii) the formidable productivity progress observed is primarily explained by technical efficiency changes that enabled producers to catch up with front runners; and (iii) environmental factors, such as weather conditions, impact both efficiency and productivity. Finally, the analysis reveals that using bootstrapping to make statistical inferences suggests that researchers should be careful in making performance comparisons based on conventional DEA methods, as any discovered differences may not be significant.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated changes in productive efficiency for 62 of the largest US public accounting firms between the periods (2000-2001 and (2003-2004) and the periods before and after enactment of SOX in July of 2002.
Abstract: There have been many criticisms of the Sarbanes–Oxley (SOX) Act passed in July of 2002 to correct business accountability and performance practices. The act has a major emphasis on accounting and its practices. This paper attempts a response to these criticisms by investigating changes in productive efficiency for 62 of the largest US public accounting firms between the periods (2000–2001) and (2003–2004), the periods before and after enactment of SOX in July of 2002. DEA is used to calculate Malmquist indexes of productivity and efficiency changes. This index is used because it can distinguish between changes in the technology, which limit the possibilities, and changes in the efficiency performance of each firm. Contrary to many of the criticisms, results indicate that accounting firms have exhibited significant post-SOX growth in productive efficiency, which is better than their pre-SOX performance.

Journal ArticleDOI
TL;DR: In this paper, a two-period pricing and ordering model for a dominant retailer with demand uncertainty in a declining price environment is presented, and it is shown that the maximum expected profit function is continuous concave.
Abstract: Retailing channels are increasingly being dominated by ‘power’ retailers who are in a position to dictate prices and ordering schedules to manufacturers and suppliers. A dominant retailer, such as Wal-Mart, has the ‘power’ to decide retail prices of products because there are so many manufacturers who are keen to sell their products through or to such a large and powerful retailer. Several products, such as electronic products, can be sold in the market for some periods during their lifecycles before they retreat, except when they are not popular with consumers after been introduced. Therefore, in case of such products, the retailer should not just consider a single-period pricing and ordering policy. It should make dynamic pricing and ordering decisions based on market demand forecast, in order to obtain maximum cumulative profit from the product during its lifecycle. In this study, we consider this scenario and construct a two-period model to discuss pricing and ordering problems for a dominant retailer with demand uncertainty in a declining price environment. We show that the maximum expected profit function is continuous concave, so the optimal solution to pricing and ordering policy exists and it is the one and only. We also analyze sensitivity of retailer's expected profit to the effects of parameters of price-discount sharing scheme and market demand.

Journal ArticleDOI
TL;DR: The SMAA-P method as discussed by the authors combines the piecewise linear difference functions of prospect theory with stochastic multicriteria acceptability analysis (SMAA) and computes indices that measure how widely acceptable different alternatives are with assumed behavior.
Abstract: We consider problems where multiple decision makers (DMs) want to choose their most preferred alternative from a finite set based on multiple criteria. Several approaches to support DMs in such problems have been suggested. Prospect theory has appealed to researchers through its descriptive power, but few attempts have been made to apply it to support multicriteria decision making. The basic idea of prospect theory is that alternatives are evaluated by a difference function in terms of gains and losses with respect to a reference point. The function is suggested to be concave for gains and convex for losses, and steeper for losses than for gains. Stochastic multicriteria acceptability analysis (SMAA) is a family of multicriteria decision support methods that allows representing inaccurate, uncertain, or partly missing information about criteria measurements and preferences through probability distributions. SMAA methods are based on exploring the weight and criteria measurement spaces in order to describe the weights that would result in a certain rank for an alternative. This paper introduces the SMAA-P method, which combines the piecewise linear difference functions of prospect theory with SMAA. SMAA-P computes indices that measure how widely acceptable different alternatives are under the assumed behavior. SMAA-P can be used in decision problems where the DMs’ preferences (weights, reference points and coefficients of loss aversion) are difficult to assess accurately. SMAA-P can also be used to measure how robust a decision problem is with respect to preference information. We demonstrate the method by reanalyzing a past real-life example.

Journal ArticleDOI
TL;DR: This paper develops a deterministic model as well as a stochastic model under continuous review for the system, and provides numerical examples for illustration.
Abstract: Product take-back and recovery activities have grown in recent times as a consequence of stringent government regulations and increased customer awareness of environmental pollution. Inventory management in the context of product returns has drawn the attention of many researchers. However, the inherent complexity of the system with uncertain returns makes the analysis of the system extremely difficult. So far, the literature on this type of system is mostly limited to single echelons. The few papers available in literature on multi-echelon systems with returns base their analyses on simplified assumptions such as non-existence or non-relevance of set-up and holding costs at different levels. In this paper, we relax these assumptions and consider a two-echelon system with returns under more generalized conditions. We develop a deterministic model as well as a stochastic model under continuous review for the system, and provide numerical examples for illustration.

Journal ArticleDOI
TL;DR: In this article, three composite heuristics are proposed by integrating forward pair-wise exchange-restart (FPE-R) and FPE with an effective iterative method.
Abstract: In this paper, permutation flow shops with total flowtime minimization are considered. General flowtime computing (GFC) is presented to accelerate flowtime computation. A newly generated schedule is divided into an unchanged subsequence and a changed part. GFC computes the total flowtime of a schedule by inheriting temporal parameters from its parent in the unchanged part and computing only those of the changed part. Iterative methods and LR (developed by Liu J, Reeves CR. Constructive and composite heuristic solutions to the P∥ΣC_i scheduling problem. European Journal of Operational Research 2001;132:439–52) are evaluated and compared as the solution improvement phase and the index development phase. Three composite heuristics are proposed in this paper by integrating forward pair-wise exchange-restart (FPE-R) and FPE with an effective iterative method. Computational results show that the three proposed heuristics outperform the best three existing composite heuristics in effectiveness, and two of them are much faster than the existing ones.

Journal ArticleDOI
TL;DR: This work surveys applications where quality has been incorporated into DEA models, considers the concerns that arise when the results show that quality measures have been effectively ignored, and identifies three modeling techniques that address these concerns, two of which prove effective in ensuring that DEA results discriminate between high and low quality performance.
Abstract: When using data envelopment analysis (DEA) as a benchmarking technique for nursing homes, it is essential to include measures of the quality of care. We survey applications where quality has been incorporated into DEA models and consider the concerns that arise when the results show that quality measures have been effectively ignored. Three modeling techniques are identified that address these concerns. Each of these techniques requires some input from management as to the proper emphasis to be placed on the quality aspect of performance. We report the results of a case study in which we apply these techniques to a DEA model of nursing home performance. We examine in depth not only the resulting efficiency scores, but also the benchmark sets and the weights given to the input and output measures. We find that two of the techniques are effective in ensuring that DEA results discriminate between high and low quality performance.

Journal ArticleDOI
TL;DR: An effective heuristic is developed to provide a near-optimal schedule for the single-machine scheduling problem with periodic maintenance, and some important theorems associated with the problem are implemented in a branch-and-bound algorithm.
Abstract: This paper considers a single-machine scheduling problem with periodic maintenance. In this study, a schedule consists of several maintenance periods, and each maintenance period is scheduled after a periodic time interval. The objective is to find a schedule that minimizes the number of tardy jobs subject to periodic maintenance and nonresumable jobs. Based on Moore's algorithm, an effective heuristic is developed to provide a near-optimal schedule for the problem. A branch-and-bound algorithm is also proposed to find the optimal schedule. Some important theorems associated with the problem are implemented in the algorithm. Computational results are presented to demonstrate the effectiveness of the proposed heuristic.
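
As background for the heuristic described above, here is a minimal sketch of the classic Moore-Hodgson algorithm for minimizing the number of tardy jobs on a single machine without maintenance periods; the paper's adaptation to periodic maintenance and its branch-and-bound algorithm are not reproduced, and the job data are made up.

# Moore-Hodgson algorithm: maximal set of on-time jobs on a single machine.
import heapq

def moore_hodgson(jobs):
    # jobs: list of (processing_time, due_date)
    scheduled, heap, t = [], [], 0
    for p, d in sorted(jobs, key=lambda j: j[1]):   # earliest due date first
        scheduled.append((p, d))
        heapq.heappush(heap, (-p, d))               # keep track of the longest scheduled job
        t += p
        if t > d:                                   # current job would finish late:
            neg_p, dd = heapq.heappop(heap)         # reject the longest job scheduled so far
            scheduled.remove((-neg_p, dd))
            t += neg_p                              # neg_p is negative, so this shortens the schedule
    return scheduled                                # jobs completed on time; the rest are tardy

print(moore_hodgson([(2, 3), (4, 5), (3, 7), (5, 9)]))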

Journal ArticleDOI
TL;DR: In this article, the deterministic EOQ with partial backordering, in which only a percentage of stockouts will be backordered, is extended to develop a comparable model for the EPQ with partial backordering.
Abstract: Several authors have developed models for the EOQ when only a percentage of stockouts will be backordered. Most of these models are complicated, with equations unlike those for the EOQ with full backordering. In this paper we extend work by Pentico and Drake [The deterministic EOQ with partial backordering: a new approach. European Journal of Operational Research 2008; in press], which developed equations for the EOQ with partial backordering that are more like those for the EOQ with full backordering, to develop a comparable model for the EPQ with partial backordering.
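
For orientation, the sketch below computes the classic EOQ with full backordering, the limiting case that partial-backordering models approach as the backordered fraction goes to one; it is not the paper's EPQ model, and the parameter values are illustrative.

# Classic EOQ with full backordering (illustrative parameters).
from math import sqrt

D = 1200.0   # annual demand
K = 50.0     # fixed cost per order
h = 4.0      # holding cost per unit per year
b = 10.0     # backorder cost per unit per year

Q = sqrt(2 * D * K / h) * sqrt((h + b) / b)   # optimal order quantity
S = Q * h / (h + b)                           # optimal maximum backorder level
print(round(Q, 1), round(S, 1))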