
Showing papers in "Journal of the Operational Research Society in 2013"


Journal ArticleDOI
TL;DR: A critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas are presented.
Abstract: Hyper-heuristics comprise a set of approaches that are motivated (at least in part) by the goal of automating the design of heuristic methods to solve hard computational search problems. An underlying strategic research challenge is to develop more generally applicable search methodologies. The term hyper-heuristic is relatively new; it was first used in 2000 to describe heuristics to choose heuristics in the context of combinatorial optimisation. However, the idea of automating the design of heuristics is not new; it can be traced back to the 1960s. The definition of hyper-heuristics has been recently extended to refer to a search method or learning mechanism for selecting or generating heuristics to solve computational search problems. Two main hyper-heuristic categories can be considered: heuristic selection and heuristic generation. The distinguishing feature of hyper-heuristics is that they operate on a search space of heuristics (or heuristic components) rather than directly on the search space of solutions to the underlying problem that is being addressed. This paper presents a critical discussion of the scientific literature on hyper-heuristics including their origin and intellectual roots, a detailed account of the main types of approaches, and an overview of some related areas. Current research trends and directions for future research are also discussed.

1,023 citations
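To make the "search over a space of heuristics" idea concrete, here is a minimal selection hyper-heuristic sketch in Python. It is illustrative only: the function names, the random heuristic selection and the accept-if-not-worse rule are assumptions for the sketch, not a method taken from the paper; real selection hyper-heuristics add learning to the selection step and more sophisticated move acceptance.

```python
import random

def selection_hyper_heuristic(initial, low_level_heuristics, evaluate, iters=1000, seed=0):
    """Minimal 'heuristic selection' loop: repeatedly pick a low-level heuristic
    at random, apply it, and keep the move only if it does not worsen the objective."""
    rng = random.Random(seed)
    best, best_cost = initial, evaluate(initial)
    for _ in range(iters):
        heuristic = rng.choice(low_level_heuristics)
        candidate = heuristic(best, rng)
        cost = evaluate(candidate)
        if cost <= best_cost:
            best, best_cost = candidate, cost
    return best, best_cost

# Toy domain: minimise the number of ones in a bit string.
def flip_one(bits, rng):
    i = rng.randrange(len(bits))
    return bits[:i] + [1 - bits[i]] + bits[i + 1:]

def flip_two(bits, rng):
    return flip_one(flip_one(bits, rng), rng)

best, cost = selection_hyper_heuristic([1] * 10, [flip_one, flip_two], evaluate=sum)
print(best, cost)
```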


Journal ArticleDOI
TL;DR: This work considers the problem of coordination of a manufacturer and a retailer in a vertical supply chain, who put in efforts for ‘greening’ their operations, and finds that the ratio of the optimal greening efforts put in by the manufacturer and the retailer is equal to the ratio of their green sensitivity ratios and greening cost ratios.
Abstract: Environmental consciousness has become increasingly important in everyday life and business practice. The effort to reduce the impact of business activities on the environment has been labelled as ...

407 citations


Journal ArticleDOI
Bin Zhu1, Zeshui Xu1
TL;DR: The weighted hesitant fuzzy Bonferroni mean (WHFBM) is proposed to account for the different importance degrees of the input arguments, and a procedure for multi-criteria decision making based on the WHFBM under a hesitant fuzzy environment is given as a typical application.

Abstract: Due to the desirable characteristic of the Bonferroni mean (BM) that it can capture the interrelationship between input arguments, and in order to extend the properties and modelling capability of BMs to the hesitant fuzzy environment, we explore some new hesitant fuzzy Bonferroni means (HFBMs). The properties and the special cases of HFBMs are studied in detail. We also define the concept of the hesitant Bonferroni element (HBE), which is considered a ‘bonding satisfaction’ factor and used as the calculation unit in the HFBM. The HBE can reflect the correlation between hesitant fuzzy arguments, which gives the HFBM particular advantages in aggregating arguments. In addition, the weighted hesitant fuzzy Bonferroni mean (WHFBM) is also proposed to account for the different importance degrees of the input arguments. Furthermore, the procedure of multi-criteria decision making based on the WHFBM under a hesitant fuzzy environment is given as a typical application, which is meaningful in both theory and practice for the BM.

140 citations
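For reference, the classical Bonferroni mean that underlies the HFBMs is BM^{p,q}(a_1, ..., a_n) = ( (1/(n(n-1))) * sum_{i != j} a_i^p a_j^q )^{1/(p+q)}. The Python sketch below computes it and, purely as an illustration, aggregates hesitant fuzzy elements by taking the BM of every combination of one membership value per argument; this combination-wise treatment is an assumption used to convey the idea, not the paper's exact HFBM operations.

```python
from itertools import product

def bonferroni_mean(values, p=1, q=1):
    """Classical BM^{p,q} of crisp arguments in [0, 1]."""
    n = len(values)
    s = sum(a ** p * b ** q
            for i, a in enumerate(values)
            for j, b in enumerate(values) if i != j)
    return (s / (n * (n - 1))) ** (1.0 / (p + q))

def hesitant_bonferroni_mean(hfes, p=1, q=1):
    """Illustrative aggregation of hesitant fuzzy elements (sets of possible
    membership degrees): collect the BM of every combination of one value per
    argument (assumption made for this sketch)."""
    return sorted({round(bonferroni_mean(combo, p, q), 4) for combo in product(*hfes)})

print(bonferroni_mean([0.3, 0.5, 0.8]))
print(hesitant_bonferroni_mean([{0.2, 0.4}, {0.5}, {0.6, 0.9}]))
```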


Journal ArticleDOI
TL;DR: Investigation of the suitability and performance of several resampling techniques when applied in conjunction with statistical and artificial intelligence prediction models over five real-world credit data sets, which have artificially been modified to derive different imbalance ratios.
Abstract: In real-life credit scoring applications, the case in which the class of defaulters is under-represented in comparison with the class of non-defaulters is a very common situation, but it ha...

139 citations
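As a concrete illustration of the kind of resampling techniques evaluated, here is a minimal random-oversampling sketch in Python. The helper name, the target-ratio parameter and the data handling are assumptions for the sketch; SMOTE, undersampling and the other techniques compared in the paper are not reproduced.

```python
import numpy as np

def random_oversample(X, y, minority_label=1, target_ratio=1.0, seed=0):
    """Duplicate randomly chosen minority-class rows (e.g., defaulters) until the
    minority/majority ratio reaches target_ratio."""
    rng = np.random.default_rng(seed)
    minority = np.where(y == minority_label)[0]
    majority = np.where(y != minority_label)[0]
    n_needed = int(target_ratio * len(majority)) - len(minority)
    if n_needed <= 0:
        return X, y
    extra = rng.choice(minority, size=n_needed, replace=True)
    idx = np.concatenate([majority, minority, extra])
    rng.shuffle(idx)
    return X[idx], y[idx]

X = np.arange(20).reshape(10, 2)
y = np.array([0, 0, 0, 0, 0, 0, 0, 0, 1, 1])   # 20% defaulters
X_bal, y_bal = random_oversample(X, y)
print(np.bincount(y_bal))                       # balanced classes
```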


Journal ArticleDOI
TL;DR: A review of the methods and algorithms developed to examine the area of construction schedule optimization (CSO) is undertaken and the developed algorithms can be classified into three methods: mathematical, heuristic and metaheuristic.
Abstract: Optimizing construction project scheduling has received a considerable amount of attention over the past 20 years. As a result, a plethora of methods and algorithms have been developed to address specific scenarios or problems. A review of the methods and algorithms that have been developed to examine the area of construction schedule optimization (CSO) is undertaken. The developed algorithms for solving the CSO problem can be classified into three methods: mathematical, heuristic and metaheuristic. The application of these methods to various scheduling problems is discussed and implications for future research are identified.

121 citations


Journal ArticleDOI
TL;DR: The aim of this paper is to summarize the most recent developments in the application of evolutionary algorithms to credit scoring by means of a thorough review of scientific articles published during the period 2000–2012.
Abstract: Recent years have seen the development of many credit scoring models for assessing the creditworthiness of loan applicants. Traditional credit scoring methodology has involved the use of statistical and mathematical programming techniques such as discriminant analysis, linear and logistic regression, linear and quadratic programming, or decision trees. However, the importance of credit grant decisions for financial institutions has caused growing interest in using a variety of computational intelligence techniques. This paper concentrates on evolutionary computing, which is viewed as one of the most promising paradigms of computational intelligence. Taking into account the synergistic relationship between the communities of Economics and Computer Science, the aim of this paper is to summarize the most recent developments in the application of evolutionary algorithms to credit scoring by means of a thorough review of scientific articles published during the period 2000–2012.

97 citations


Journal ArticleDOI
TL;DR: This paper asks whether there is value in being much more open and analytical about what makes a ‘good’ problem structuring group (PSG) and, if so, how the unwritten processes and outcomes of PSGs can be made written.
Abstract: In problem structuring methods, facilitators often ask of themselves questions such as: what makes a ‘good’ problem structuring group (PSG) and indeed what does ‘good’ mean? How can group dynamics be improved and does it matter in terms of the quality of the problem structuring that that group engages in? On the surface these questions seem to be straightforward. Indeed, those who have helped facilitate many participatory workshops will think they intuitively know the answers to these questions; they can, from their professional practice, ‘feel’ which PSGs are doing well and producing novel insights and those which are functioning less well and perhaps generating something that is less imaginative and more routine as a consequence. The intuitive, practice-learned insight will depend upon a rich array of visual signals that become more obvious with experience. This paper asks whether there is value in being much more open and analytical about these questions and answers. If so, then how can we make the unwritten processes and outcomes of PSGs written? Indeed, open to whom? Finally, how much of any insights learned by facilitators should be shared with those engaged in workshops?

93 citations


Journal ArticleDOI
TL;DR: A new method is presented that jointly considers uncertainty in metal content and commodity prices, and incorporates time-dependent discounted values of mining blocks when designing optimal production phases and ultimate pit limit, while honouring production capacity constraints.
Abstract: Conventional open pit mine optimization models for designing mining phases and ultimate pit limit do not consider expected variations and uncertainty in metal content available in a mineral deposit (supply) and commodity prices (market demand). Unlike the conventional approach, a stochastic framework relies on multiple realizations of the input data so as to account for uncertainty in metal content and financial parameters, reflecting potential supply and demand. This paper presents a new method that jointly considers uncertainty in metal content and commodity prices, and incorporates time-dependent discounted values of mining blocks when designing optimal production phases and ultimate pit limit, while honouring production capacity constraints. The structure of a graph representing the stochastic framework is proposed, and it is solved with a parametric maximum flow algorithm. Lagrangian relaxation and the subgradient method are integrated in the proposed approach to facilitate producing practical designs. An application at a copper deposit in Canada demonstrates the practical aspects of the approach and quality of solutions over conventional methods, as well as the effectiveness of the proposed stochastic approach in solving mine planning and design problems.

77 citations
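To ground the graph and maximum-flow machinery, the sketch below solves the deterministic ultimate-pit (maximum closure) problem via a minimum s-t cut with networkx. This is only the classical building block: the paper's stochastic scenarios, time-dependent discounting, parametric max flow and Lagrangian relaxation are not reproduced, and the toy block model is invented.

```python
import networkx as nx

def ultimate_pit(block_value, precedence):
    """Deterministic ultimate-pit limit as a maximum-closure problem solved by a
    minimum s-t cut: positive-value blocks hang off the source, negative-value
    blocks feed the sink, and uncapacitated precedence arcs force overlying
    blocks to be mined whenever a block below them is selected."""
    G = nx.DiGraph()
    for b, v in block_value.items():
        if v > 0:
            G.add_edge("s", b, capacity=v)
        else:
            G.add_edge(b, "t", capacity=-v)
    for b, must_mine_first in precedence.items():
        for p in must_mine_first:
            G.add_edge(b, p)               # no 'capacity' attribute -> infinite capacity
    _, (source_side, _) = nx.minimum_cut(G, "s", "t")
    return source_side - {"s"}             # blocks in the optimal pit

# Toy deposit: two ore blocks beneath one waste block.
values = {"waste": -3.0, "ore1": 5.0, "ore2": 4.0}
precedence = {"ore1": ["waste"], "ore2": ["waste"]}
print(ultimate_pit(values, precedence))    # all three blocks; net value 6.0
```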


Journal ArticleDOI
TL;DR: A comparative analysis with a recent work by Barlas and Gunduz is performed, showing that the adoption of the proposed performance measurement system can help academics and practitioners to better understand, study and avoid the bullwhip effect.
Abstract: A bullwhip measurement system based on a two-criterion assessment—‘internal process efficiency’ and ‘customer service level’—is developed in this paper. The framework is designed to assess both individual (single member) and systemic (whole supply chain) performances. Data collection and calculation methods, update and monitoring mechanisms, as well as related procedures for each metric used, are detailed. A comparative analysis with a recent work by Barlas and Gunduz is performed, showing that the adoption of the proposed performance measurement system can help academics and practitioners to better understand, study and avoid the bullwhip effect. Such analysis also provides evidence of the relevance of considering, when analysing the bullwhip effect in supply chains, the ‘customer importance’ aspect that is often forgotten in the published literature.

76 citations
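For orientation, the sketch below computes the classic variance-amplification bullwhip metric in Python. It is only a familiar baseline: the paper's two-criterion measurement system (internal process efficiency and customer service level) is not reproduced, and the demand and order series are made up.

```python
import numpy as np

def bullwhip_ratio(orders, demand):
    """Classic bullwhip metric: variance of orders relative to variance of demand.
    Values greater than 1 indicate amplification of demand variability upstream."""
    return np.var(orders, ddof=1) / np.var(demand, ddof=1)

demand = np.array([10, 12, 9, 11, 10, 13, 8, 12])
orders = np.array([10, 15, 5, 14, 9, 17, 3, 15])
print(round(bullwhip_ratio(orders, demand), 2))
```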


Journal ArticleDOI
TL;DR: Search mechanisms, based on the tabu search methodology, are developed for the maximally diverse grouping problem, including a strategic oscillation that enables search paths to cross a feasibility boundary.
Abstract: We propose new heuristic procedures for the maximally diverse grouping problem (MDGP). This NP-hard problem consists of forming maximally diverse groups—of equal or different size—from a given set of elements. The most general formulation, which we address, allows for the size of each group to fall within specified limits. The MDGP has applications in academics, such as creating diverse teams of students, or in training settings where it may be desired to create groups that are as diverse as possible. Search mechanisms, based on the tabu search methodology, are developed for the MDGP, including a strategic oscillation that enables search paths to cross a feasibility boundary. We evaluate construction and improvement mechanisms to configure a solution procedure that is then compared to state-of-the-art solvers for the MDGP. Extensive computational experiments with medium and large instances show the advantages of a solution method that includes strategic oscillation.

68 citations
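A minimal sketch of the MDGP objective and a naive swap-based improvement step, in Python. It is illustrative only: the tabu memory and the strategic oscillation across the feasibility boundary developed in the paper are not implemented, and the distance matrix and groups below are invented.

```python
import itertools
import random

def diversity(groups, dist):
    """MDGP objective: total pairwise distance summed within each group."""
    return sum(dist[i][j] for g in groups for i, j in itertools.combinations(g, 2))

def swap_improvement(groups, dist, iters=1000, seed=0):
    """Try random swaps of elements between two groups and keep only those
    that do not decrease total diversity (a toy stand-in for tabu search)."""
    rng = random.Random(seed)
    for _ in range(iters):
        g1, g2 = rng.sample(range(len(groups)), 2)
        a, b = rng.choice(groups[g1]), rng.choice(groups[g2])
        before = diversity(groups, dist)
        groups[g1].remove(a); groups[g2].remove(b)
        groups[g1].append(b); groups[g2].append(a)
        if diversity(groups, dist) < before:      # revert if the swap hurt diversity
            groups[g1].remove(b); groups[g2].remove(a)
            groups[g1].append(a); groups[g2].append(b)
    return groups

dist = [[0, 5, 1, 4],
        [5, 0, 3, 2],
        [1, 3, 0, 6],
        [4, 2, 6, 0]]
groups = [[0, 1], [2, 3]]
print(swap_improvement(groups, dist), diversity(groups, dist))
```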


Journal ArticleDOI
TL;DR: A range of discrete hierarchical location models with bicriteria efficiency/equity objectives are presented for use in location of facilities within hierarchical systems where a fair but efficient hierarchical service is sought.
Abstract: As is often the case in healthcare provision, public services may offer facilities at a hierarchy of levels in different locations, ranging from basic to specialised levels of care. In addition to efficiency objectives, with public services there is the concern of equity of provision when locating new facilities. We present, as a tool-kit for decision makers, a range of discrete hierarchical location models with bicriteria efficiency/equity objectives. These models are for use in location of facilities within hierarchical systems where a fair but efficient hierarchical service is sought. The hierarchical models have as efficiency criteria both p-median and maximal-covering types. These components are combined in a novel manner with appropriate equity objectives to give decision makers a range of choices of scenarios. We illustrate use of the models in a healthcare setting.
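As background for the efficiency criteria mentioned above, here is a minimal sketch of the classical single-level p-median formulation that such hierarchical models build on; the notation is generic rather than the paper's, and the hierarchical levels and equity objectives are not shown. With $d_{ij}$ the demand-weighted distance from client $i$ to candidate site $j$:

$$\min \sum_i \sum_j d_{ij}\, x_{ij} \quad \text{s.t.} \quad \sum_j x_{ij} = 1 \;\; \forall i, \qquad x_{ij} \le y_j \;\; \forall i,j, \qquad \sum_j y_j = p, \qquad x_{ij}, y_j \in \{0,1\},$$

where $y_j = 1$ if a facility is opened at site $j$ and $x_{ij} = 1$ if client $i$ is assigned to it. The bicriteria hierarchical models combine components of this kind (and maximal-covering analogues) with equity objectives.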

Journal ArticleDOI
TL;DR: An economic order quantity model is proposed from the seller's perspective to determine its optimal trade credit and order quantity simultaneously, and the important and relevant fact that trade credit has a positive impact on demand rate but a negative impact on receiving the buyer's debt obligations is incorporated.

Abstract: In practice, to attract new buyers and to avoid lasting price competition, a seller frequently offers its buyers a permissible delay in payment (ie, trade credit). However, the policy of granting a permissible delay in payment adds an additional dimension of default risk to the seller. In contrast to previous research that finds optimal solutions for buyers, we first propose an economic order quantity model from the seller's perspective to determine its optimal trade credit and order quantity simultaneously. In addition, we incorporate the important and relevant fact that trade credit has a positive impact on demand rate but a negative impact on receiving the buyer's debt obligations. Then the necessary and sufficient conditions to obtain the seller's optimal trade credit and order quantity are derived. An algorithm to determine the seller's optimal trade credit is also proposed. Finally, we use some numerical examples to illustrate the theoretical results and to provide some managerial insights.
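For context, the classical EOQ on which such trade-credit models build is $Q^* = \sqrt{2DK/h}$, with demand rate $D$, ordering cost $K$ and holding cost $h$; the trade-credit and default-risk extensions of the paper are not shown. A one-line check with illustrative numbers:

```python
from math import sqrt

def eoq(demand_rate, ordering_cost, holding_cost):
    """Classic economic order quantity: Q* = sqrt(2 D K / h)."""
    return sqrt(2 * demand_rate * ordering_cost / holding_cost)

print(round(eoq(demand_rate=1200, ordering_cost=50, holding_cost=2), 1))   # ~244.9 units
```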

Journal ArticleDOI
TL;DR: This paper presents a brief summary of barriers and facilitators to the successful use of the S:G software, but its main purpose is to focus more broadly on factors influencing the successful adoption of simulation tools in general within healthcare organisations.
Abstract: This paper addresses a key issue in the health OR literature, namely the apparent failure of OR modelling to become embedded and widely implemented within healthcare organisations. The research presented here is a case study to evaluate the adoption of one particular simulation modelling tool, Scenario Generator (S:G), which was developed by the SIMUL8 Corporation in a PPI partnership with the UK's National Health Service (NHS) Institute for Innovation and Improvement. The study involved semi-structured interviews with employees of 28 Primary Care Trusts who had all been engaged in some way with the initiative, with participants classified as ‘Not Started’, ‘Given Up’ and ‘Actively Using’. This paper presents a brief summary of barriers and facilitators to the successful use of the S:G software, but its main purpose is to focus more broadly on factors influencing the successful adoption of simulation tools in general within healthcare organisations. The insights gained in this study are relevant to improving the uptake of OR modelling in general within the NHS.

Journal ArticleDOI
TL;DR: An enhanced Data Envelopment Analysis (DEA) model is developed that provides a single summary measure of countries’ environmental performance, based on the aggregation of the indicators that underlie the estimation of the Environmental Performance Index (EPI).
Abstract: Environmental performance assessments are often conducted using environmental indicators. Although these indicators provide a starting point for performance assessments, they do not provide guidelines that countries should follow to improve performance. This paper develops an enhanced Data Envelopment Analysis (DEA) model that provides a single summary measure of countries’ environmental performance, based on the aggregation of the indicators that underlie the estimation of the Environmental Performance Index (EPI). The DEA model used is based on a novel specification of weight restrictions. The main contribution of the methodology used in this paper is to enable benchmarking in such a way that it becomes possible to identify the strengths and weaknesses of each country, as well as the peers with similar features to the country under assessment. These peers provide examples of good environmental practices that countries with worse performance should follow to improve performance.
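To make the DEA machinery concrete, the sketch below solves the plain input-oriented CCR model in multiplier form with SciPy. The paper's enhanced model adds a novel set of weight restrictions and uses the EPI indicators, none of which is reproduced here; the small input/output data are made up.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, k):
    """Input-oriented CCR efficiency of DMU k (multiplier form):
    max u.y_k  s.t.  v.x_k = 1,  u.y_j - v.x_j <= 0 for all j,  u, v >= 0."""
    n, m = X.shape                                   # n DMUs, m inputs
    _, s = Y.shape                                   # s outputs
    c = np.concatenate([-Y[k], np.zeros(m)])         # maximise u.y_k
    A_ub = np.hstack([Y, -X])                        # u.y_j - v.x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[k]]).reshape(1, -1)   # v.x_k = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m), method="highs")
    return -res.fun

X = np.array([[4.0, 3.0], [7.0, 3.0], [8.0, 1.0], [4.0, 2.0]])   # inputs
Y = np.array([[1.0], [1.0], [1.0], [1.0]])                       # outputs
print([round(ccr_efficiency(X, Y, k), 3) for k in range(len(X))])
```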

Journal ArticleDOI
TL;DR: A general decision-making framework based on the intuitionistic fuzzy rough set model over two universes with a constructive approach is proposed, and a new approach to decision making in an uncertain environment using intuitionistic fuzzy rough sets over two universes is given.

Abstract: Rough set theory has been combined with intuitionistic fuzzy sets in dealing with uncertainty in decision making. This paper proposes a general decision-making framework based on the intuitionistic fuzzy rough set model over two universes. We first present the intuitionistic fuzzy rough set model over two universes with a constructive approach and discuss the basic properties of this model. We then give a new approach to decision making in an uncertain environment using intuitionistic fuzzy rough sets over two universes. Further, the principal steps of the decision method established in this paper are presented in detail. Finally, an example of handling a medical diagnosis problem illustrates this approach.

Journal ArticleDOI
TL;DR: The empirical findings show that management contract technology has achieved the potential best practice within the four groups and that there exists a significant gap between the potentialbest practice and present performance in the domestic, franchise, and membership chain technologies.
Abstract: The non-convex metafrontier model was introduced by O’Donnell et al (2008) with a brief sketch and interpreted by Tiedemann et al (2011) through a graphical illustration. However, the specific function was not presented in former studies. In order to strengthen the non-parametric metafrontier approach, this study proposes the specific modelling function and computational procedure for the non-convex metafrontier model and applies the developed model to investigate the technology gaps between the four operating types of Taiwan’s international tourist hotels. The empirical findings show that management contract technology has achieved the potential best practice within the four groups and that there exists a significant gap between the potential best practice and present performance in the domestic, franchise, and membership chain technologies.

Journal ArticleDOI
TL;DR: A prediction model is presented here that combines both airport layout and historic taxi time information within a multiple linear regression analysis, identifying the most relevant factors affecting the variability of taxi times for both arrivals and departures.
Abstract: With the expected continued increases in air transportation, the mitigation of the consequent delays and environmental effects is becoming more and more important, requiring increasingly sophisticated approaches for airside airport operations. Improved on-stand time predictions (for improved resource allocation at the stands) and take-off time predictions (for improved airport-airspace coordination) both require more accurate taxi time predictions, as do the increasingly sophisticated ground movement models which are being developed. Calibrating such models requires historic data showing how long aircraft will actually take to move around the airport, but recorded data usually includes significant delays due to contention between aircraft. This research was motivated by the need to both predict taxi times and to quantify and eliminate the effects of airport load from historic taxi time data, since delays and re-routing are usually explicitly considered in ground movement models. A prediction model is presented here that combines both airport layout and historic taxi time information within a multiple linear regression analysis, identifying the most relevant factors affecting the variability of taxi times for both arrivals and departures. The promising results for two different European hub airports are compared against previous results for US airports.
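A toy version of the regression idea with scikit-learn is shown below. The feature set (taxi distance, number of turns, amount of concurrent traffic) and all numbers are invented for illustration; they are not the factors or coefficients identified in the paper.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical regressors: taxi distance (m), number of turns, aircraft moving at the same time.
X = np.array([[1200, 3, 2],
              [2500, 6, 5],
              [1800, 4, 3],
              [3100, 8, 9],
              [ 900, 2, 1],
              [2700, 7, 6]])
y = np.array([4.5, 9.0, 6.5, 12.0, 3.5, 10.0])   # taxi time in minutes (made up)

model = LinearRegression().fit(X, y)
print(model.coef_, model.intercept_)
print(model.predict([[2000, 5, 4]]))             # predicted taxi time for a new movement
```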

Journal ArticleDOI
TL;DR: David Snowden's Cynefin framework is explored, particularly in its ability to help recognise which analytic and modelling methodologies are most likely to offer appropriate support in a given context.
Abstract: David Snowden's Cynefin framework, introduced to articulate discussions of sense-making, knowledge management and organisational learning, has much to offer discussion of statistical inference and decision analysis. I explore its value, particularly in its ability to help recognise which analytic and modelling methodologies are most likely to offer appropriate support in a given context. The framework also offers a further perspective on the relationship between scenario thinking and decision analysis in supporting decision makers.

Journal ArticleDOI
TL;DR: This work proposes mathematical programming models that represent different variants of the master physician scheduling problem, together with a heuristic algorithm to generate good solutions for large-scale instances that could not be solved by the exact method.
Abstract: We study a real-world problem arising from the operations of a hospital service provider, which we term the master physician scheduling problem. It is a planning problem of assigning physicians’ full range of day-to-day duties (including surgery, clinics, scopes, calls, administration) to the defined time slots/shifts over a time horizon, incorporating a large number of constraints and complex physician preferences. The goals are to satisfy as many physicians’ preferences and duty requirements as possible while ensuring optimum usage of available resources. We propose mathematical programming models that represent different variants of this problem. The models were tested on a real case from the Surgery Department of a local government hospital, as well as on randomly generated problem instances. The computational results are reported together with analysis on the optimal solutions obtained. For large-scale instances that could not be solved by the exact method, we propose a heuristic algorithm to generate good solutions.
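A toy duty-assignment model in the spirit of the paper's mathematical programmes, written with PuLP. All sets, preference scores and constraints below are invented for illustration; the real models cover the full range of duties (surgery, clinics, scopes, calls, administration) and many more constraints.

```python
import pulp

physicians = ["P1", "P2"]
slots = ["Mon-AM", "Mon-PM", "Tue-AM"]
pref = {("P1", "Mon-AM"): 3, ("P1", "Mon-PM"): 1, ("P1", "Tue-AM"): 2,
        ("P2", "Mon-AM"): 1, ("P2", "Mon-PM"): 3, ("P2", "Tue-AM"): 2}

x = pulp.LpVariable.dicts("x", (physicians, slots), cat="Binary")
prob = pulp.LpProblem("master_physician_toy", pulp.LpMaximize)
prob += pulp.lpSum(pref[p, s] * x[p][s] for p in physicians for s in slots)  # satisfied preferences
for s in slots:                                   # exactly one physician per slot
    prob += pulp.lpSum(x[p][s] for p in physicians) == 1
for p in physicians:                              # workload cap per physician
    prob += pulp.lpSum(x[p][s] for s in slots) <= 2
prob.solve(pulp.PULP_CBC_CMD(msg=False))
print([(p, s) for p in physicians for s in slots if x[p][s].value() == 1])
```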

Journal ArticleDOI
TL;DR: Experimental results showed that the proposed ABC-based algorithm outperformed three state-of-the-art metaheuristic-based algorithms from the literature and can serve as a new benchmark approach for future research on the OAS problem addressed in this study.

Abstract: The order acceptance and scheduling (OAS) problem is an important topic for make-to-order production systems with limited production capacity and tight delivery requirements. This paper proposes a new algorithm based on Artificial Bee Colony (ABC) for solving the single machine OAS problem with release dates and sequence-dependent setup times. The performance of the proposed ABC-based algorithm was validated by a benchmark problem set of test instances with up to 100 orders. Experimental results showed that the proposed ABC-based algorithm outperformed three state-of-the-art metaheuristic-based algorithms from the literature. It is believed that this study successfully demonstrates a high-performance algorithm that can serve as a new benchmark approach for future research on the OAS problem addressed in this study.

Journal ArticleDOI
TL;DR: This paper addresses the balancing problem for straight assembly lines where task times are not known exactly but given by intervals of their possible values and a breadth-first search procedure is developed and evaluated on benchmark instances.
Abstract: This paper addresses the balancing problem for straight assembly lines where task times are not known exactly but given by intervals of their possible values. The objective is to assign the tasks to workstations minimizing the number of workstations while respecting precedence and cycle-time constraints. An adaptable robust optimization model is proposed to hedge against the worst-case scenario for task times. To find the optimal solution(s), a breadth-first search procedure is developed and evaluated on benchmark instances. The results obtained are analysed and some practical recommendations are given.

Journal ArticleDOI
TL;DR: Two truncated learning models are proposed for single-machine scheduling problems and two-machine flowshop scheduling problems with ordered job processing times, respectively, where the actual processing time of a job is a function of its position and a control parameter.

Abstract: Scheduling with learning effects has received growing attention in recent years. A well-known learning model is called ‘position-based learning’, in which the actual processing time of a job is a non-increasing function of its position to be processed. However, the actual processing time of a given job drops to zero precipitously as the number of jobs increases. Motivated by this observation, we propose two truncated learning models in single-machine scheduling problems and two-machine flowshop scheduling problems with ordered job processing times, respectively, where the actual processing time of a job is a function of its position and a control parameter. Under the proposed learning models, we show that some scheduling problems can be solved in polynomial time. In addition, we further analyse the worst-case error bounds for the problems to minimize the total weighted completion time, discounted total weighted completion time and maximum lateness.
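For context, the standard position-based learning model and a truncated variant of the kind the paper studies can be written as follows; the exact functional form and parameter names used in the paper are assumptions here, with $a \le 0$ the learning index and $\beta \in (0,1)$ the truncation (control) parameter:

$$p_{j[r]} = p_j \, r^{a} \;\;\text{(untruncated)}, \qquad p_{j[r]} = p_j \, \max\{r^{a}, \beta\} \;\;\text{(truncated)},$$

so the actual processing time of job $j$ scheduled in position $r$ can never fall below the fraction $\beta$ of its normal processing time $p_j$, avoiding the unrealistic drop towards zero noted above.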

Journal ArticleDOI
TL;DR: Numerical examples are examined to show the importance and necessity of using relative importance weights for cross-efficiency aggregation, and that the most efficient DMU can be significantly affected by taking the relative importance of cross-efficiencies into consideration.

Abstract: Cross-efficiency evaluation has been widely used for identifying the most efficient decision making unit (DMU) or ranking DMUs in data envelopment analysis (DEA). Most existing approaches for cross-efficiency evaluation are focused on how to determine input and output weights uniquely, but pay little attention to the aggregation process of cross-efficiencies and simply aggregate them equally without considering their relative importance. This paper focuses on aggregating cross-efficiencies by taking into consideration their relative importance and proposes three alternative approaches to determining the relative importance weights for cross-efficiency aggregation. Numerical examples are examined to show the importance and necessity of using relative importance weights for cross-efficiency aggregation, and to show that the most efficient DMU can be significantly affected by taking the relative importance weights of cross-efficiencies into consideration.
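The aggregation step itself is simple to state: given a cross-efficiency matrix, each DMU's score is a weighted average of the efficiencies it receives under every DMU's weights. The Python sketch below shows equal-weight versus non-uniform aggregation; the example matrix and weights are invented, and the paper's three approaches for deriving the weights are not reproduced.

```python
import numpy as np

def aggregate_cross_efficiencies(E, w=None):
    """E[d, j] = efficiency of DMU j evaluated with DMU d's optimal weights.
    With w=None this is the usual simple average over evaluating DMUs;
    otherwise w gives their relative importance."""
    E = np.asarray(E, dtype=float)
    w = np.full(E.shape[0], 1.0 / E.shape[0]) if w is None else np.asarray(w, dtype=float)
    return w @ E

E = [[1.00, 0.80, 0.60],
     [0.90, 1.00, 0.70],
     [0.85, 0.75, 1.00]]
print(aggregate_cross_efficiencies(E))                    # equal weights
print(aggregate_cross_efficiencies(E, [0.5, 0.3, 0.2]))   # importance weights
```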

Journal ArticleDOI
TL;DR: The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed, and the Wakeby models provided the best distribution fits and were used to calculate schedule overrun probabilities by contract size.
Abstract: The probability of schedule overruns for construction and engineering projects can be ascertained using a ‘best fit’ probability distribution from an empirical distribution. The statistical characteristics of schedule overruns occurring in 276 Australian construction and engineering projects were analysed. Skewness and kurtosis values revealed that schedule overruns are non-Gaussian. Theoretical probability distributions were then fitted to the schedule overrun data, using the Kolmogorov–Smirnov, Anderson–Darling and Chi-Squared non-parametric tests to determine the ‘Goodness of Fit’. A Four Parameter Burr probability function best described the behaviour of schedule overruns, provided the best overall distribution fit and was used to calculate the probability of a schedule overrun being experienced. The statistical characteristics of contract size and schedule overruns were also analysed, and the Wakeby (AU$101 m) models provided the best distribution fits and were used to calculate schedule overrun probabilities by contract size.
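As a runnable illustration of the fit-then-test workflow described above, the sketch below fits a Burr XII distribution and applies a Kolmogorov–Smirnov test with SciPy. The data are synthetic, and the Wakeby distribution is not available in SciPy, so only the Burr part of the analysis is mirrored here.

```python
import numpy as np
from scipy import stats

# Synthetic relative schedule overruns (fractions of contract duration), for illustration only.
overruns = stats.burr12.rvs(c=2.0, d=1.5, scale=0.2, size=300, random_state=1)

params = stats.burr12.fit(overruns, floc=0)          # fit Burr XII with location fixed at 0
D, p_value = stats.kstest(overruns, "burr12", args=params)
print(params)
print(f"KS statistic = {D:.3f}, p-value = {p_value:.3f}")

# Probability of an overrun exceeding 30% of the contract duration under the fitted model:
print(round(stats.burr12.sf(0.30, *params), 3))
```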

Journal ArticleDOI
TL;DR: Findings indicated that private universities with higher levels of QM efficiency on stakeholder-focus indicators achieved better performance in terms of fulfilling the expectations of their stakeholders, while public universities were more successful in managing QM practices for a superior teaching and research performance.
Abstract: Using data envelopment analysis (DEA) in conjunction with stochastic frontier analysis (SFA), the aim of this study was to measure the relative efficiency of quality management (QM) practices in Turkish public and private universities. Based on the extant literature, a set of nine critical QM factors and seven performance indicators for Turkish universities were identified as input and output variables, respectively. SFA confirmed the existence of significant relationships between QM factors and performance indicators. DEA findings indicated that private universities with higher levels of QM efficiency on stakeholder-focus indicators achieved better performance in terms of fulfilling the expectations of their stakeholders. In contrast, public universities were more successful in managing QM practices for a superior teaching and research performance. Finally, after eliminating the managerial discrepancies, no significant structural efficiency difference was found between these two groups of universities through the stakeholder-focus model, though some significant variation was noted in both the factor-efficiency and total-efficiency models. As for the total-efficiency model, we may infer that the structural differences found in favour of public universities for factor-efficiencies are counterbalanced by private universities, which tend to focus more on their stakeholders in managing QM applications.

Journal ArticleDOI
TL;DR: The TSPPD is solved by a metaheuristic algorithm based on Iterated Local Search with Variable Neighbourhood Descent and Random neighbourhood ordering; computational experience shows that the algorithm finds or improves the best known results reported in the literature within reasonable computational time.

Abstract: The Travelling Salesman Problem with Pickups and Deliveries (TSPPD) consists in designing a minimum cost tour that starts at the depot, provides either a pickup or delivery service to each of the customers and returns to the depot, in such a way that the vehicle capacity is not exceeded in any part of the tour. In this paper, the TSPPD is solved by considering a metaheuristic algorithm based on Iterated Local Search with Variable Neighbourhood Descent and Random neighbourhood ordering. Our aim is to propose a fast, flexible and easy to code algorithm, also capable of producing high quality solutions. The results of our computational experience show that the algorithm finds or improves the best known results reported in the literature within reasonable computational time.
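A small Python helper illustrating the capacity condition that every candidate tour must satisfy. It assumes the common convention that all delivery goods are loaded at the depot and pickup goods are carried back; the function name and data are illustrative, and the ILS/VND search itself is not reproduced.

```python
def tsppd_route_feasible(route, load_change, capacity):
    """Check that the running vehicle load stays within [0, capacity] along the tour.
    load_change[i] is +q for a pickup customer and -q for a delivery customer."""
    load = sum(-q for q in load_change.values() if q < 0)   # delivery goods loaded at the depot
    if load > capacity:
        return False
    for customer in route:
        load += load_change[customer]
        if load < 0 or load > capacity:
            return False
    return True

route = ["c1", "c2", "c3"]
load_change = {"c1": +2, "c2": -3, "c3": +1}   # c2 is a delivery customer
print(tsppd_route_feasible(route, load_change, capacity=5))
```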

Journal ArticleDOI
TL;DR: This paper applies quantile regression to two problems in financial portfolio construction, index tracking and enhanced indexation and presents a mixed-integer linear programming formulation of these problems based onquantile regression.
Abstract: Quantile regression differs from traditional least-squares regression in that one constructs regression lines for the quantiles of the dependent variable in terms of the independent variable. In this paper we apply quantile regression to two problems in financial portfolio construction, index tracking and enhanced indexation. Index tracking is the problem of reproducing the performance of a stock market index, but without purchasing all of the stocks that make up the index. Enhanced indexation deals with the problem of out-performing the index. We present a mixed-integer linear programming formulation of these problems based on quantile regression. Our formulation includes transaction costs, a constraint limiting the number of stocks that can be in the portfolio and a limit on the total transaction cost that can be incurred. Numeric results are presented for eight test problems drawn from major world markets, where the largest of these test problems involves over 2000 stocks.
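A miniature of the quantile-regression idea with statsmodels: regress index returns on the returns of a few candidate stocks at the median (q = 0.5), so the fitted coefficients act as approximate tracking weights. The returns are synthetic, and the paper's mixed-integer layer (cardinality, transaction-cost and budget constraints) is not reproduced here.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
stock_returns = rng.normal(0, 0.01, size=(250, 3))                 # 3 stocks, 250 days
index_returns = stock_returns @ np.array([0.5, 0.3, 0.2]) + rng.normal(0, 0.002, 250)

model = sm.QuantReg(index_returns, sm.add_constant(stock_returns))
fit = model.fit(q=0.5)                                              # median regression
print(fit.params)                                                   # intercept and tracking weights
```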

Journal ArticleDOI
TL;DR: A new procedure is proposed which, while maintaining the essence of DEA-inspired procedures, determines a common weighting vector to evaluate the complete set of alternatives and includes the discrimination between alternatives as an objective of the procedure.
Abstract: Composite indicators (CIs), constructed on the basis of the ‘benefit of the doubt’ criterion, are characterized by endogenously determined vectors of weights. In this paper, a new procedure to construct CIs is proposed in an attempt to overcome the main weakness of this family of procedures: the determination of an individual weighting vector for each alternative, which makes comparisons between alternatives difficult and leads to multiple ties. The new procedure, while maintaining the essence of DEA-inspired procedures, determines a common weighting vector to evaluate the complete set of alternatives and includes the discrimination between alternatives as an objective of the procedure.

Journal ArticleDOI
Lu Zhen1
TL;DR: A mixed integer programming model is formulated to minimize the expected value of the route length of container transshipping flows in the yard and a heuristic algorithm is developed for solving the problem in large-scale realistic environments.
Abstract: This paper is concerned with yard management in transshipment hubs, where a consignment strategy is often used to reduce reshuffling and vessel turnaround time. This strategy groups unloaded containers according to their destination vessels. Under this strategy, the yard template determines the assignment of the spaces (sub-blocks) in the yard to the vessels. This paper studies how to make a good yard template under an uncertain environment, for example, uncertain berthing times and berthing positions of the arriving vessels. To reduce the potential traffic congestion of prime movers, the workload distribution of sub-blocks within the yard is considered. A mixed integer programming model is formulated to minimize the expected value of the route length of container transshipping flows in the yard. Moreover, a heuristic algorithm is developed for solving the problem in large-scale realistic environments. Numerical experiments are conducted to validate the efficiency of the proposed algorithm.

Journal ArticleDOI
TL;DR: It is demonstrated that only in the near or complete absence of defaulters should semi-supervised OCC algorithms be used instead of supervised two-class classification algorithms, and for data sets whose class labels are unevenly distributed that optimising the threshold value on classifier output yields an improvement in classification performance.
Abstract: In credit scoring, low-default portfolios (LDPs) are those for which very little default history exists. This makes it problematic for financial institutions to estimate a reliable probability of a customer defaulting on a loan. Banking regulation (Basel II Capital Accord), and best practice, however, necessitate an accurate and valid estimate of the probability of default. In this article the suitability of semi-supervised one-class classification (OCC) algorithms as a solution to the LDP problem is evaluated. The performance of OCC algorithms is compared with the performance of supervised two-class classification algorithms. This study also investigates the suitability of oversampling, which is a common approach to dealing with LDPs. Assessment of the performance of one- and two-class classification algorithms using nine real-world banking data sets, which have been modified to replicate LDPs, is provided. Our results demonstrate that only in the near or complete absence of defaulters should semi-supervised OCC algorithms be used instead of supervised two-class classification algorithms. Furthermore, we demonstrate for data sets whose class labels are unevenly distributed that optimising the threshold value on classifier output yields, in many cases, an improvement in classification performance. Finally, our results suggest that oversampling produces no overall improvement to the best performing two-class classification algorithms.
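To illustrate the one-class idea in a low-default setting, the sketch below trains a one-class model on non-defaulters only and flags unusual applicants, using scikit-learn's OneClassSVM as a stand-in for the OCC algorithms compared in the paper. The two-feature data are synthetic and the parameter values are arbitrary.

```python
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(42)
non_defaulters = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(500, 2))   # training class only
new_applicants = np.array([[0.1, -0.2], [3.0, 3.0]])                    # typical vs. atypical profile

occ = OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(non_defaulters)
print(occ.predict(new_applicants))   # +1 = resembles the non-defaulter class, -1 = outlier
```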