
Showing papers in "Journal of Industrial and Production Engineering in 2016"


Journal ArticleDOI
TL;DR: In this article, a framework of strategies to guide designers and business strategists in the move from a linear to a circular economy is developed, where the terminology of slowing, closing, and narrowing resource loops is introduced.
Abstract: The transition within business from a linear to a circular economy brings with it a range of practical challenges for companies. The following question is addressed: What are the product design and business model strategies for companies that want to move to a circular economy model? This paper develops a framework of strategies to guide designers and business strategists in the move from a linear to a circular economy. Building on Stahel, the terminology of slowing, closing, and narrowing resource loops is introduced. A list of product design strategies, business model strategies, and examples for key decision-makers in businesses is introduced, to facilitate the move to a circular economy. This framework also opens up a future research agenda for the circular economy.

1,702 citations


Journal ArticleDOI
TL;DR: The proposed model determines an optimal or near-optimal assignment of tasks to workstations in a U-shaped assembly line, and two multi-objective evolutionary algorithms (MOEAs) are used to solve the problem.
Abstract: Nowadays, robots are used extensively in assembly line balancing systems because of their capabilities. Robotic assembly lines are used to manufacture high-volume products under customized and specialized production. In this paper, type II robotic mixed-model assembly line balancing is considered. The goals are to minimize robot purchasing costs, robot setup costs, sequence-dependent setup costs, and cycle time. The proposed model determines an optimal or near-optimal assignment of tasks to workstations in U-shaped assembly line balancing. In this model, two types of tasks exist: special tasks for a single product model and common tasks shared by several product models. The problem under these conditions is NP-hard, so two different multi-objective evolutionary algorithms (MOEAs) are used to solve it. The first algorithm is the non-dominated sorting genetic algorithm (NSGA-II) and the second is multi-objective particle swarm optimization...
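As an illustration of the Pareto-based search that both NSGA-II and multi-objective PSO rely on, the minimal Python sketch below extracts the first non-dominated front from a set of candidate line balances scored on two minimization objectives; the objective values and encoding are hypothetical, not the paper's model.

# Minimal sketch (not the paper's implementation): extracting the first
# non-dominated front for two minimization objectives, the core step that
# NSGA-II and MOPSO repeat each generation. Solution encoding and operators
# for the robotic U-shaped line are assumed away here.

def dominates(a, b):
    """True if solution a is at least as good as b in every objective
    and strictly better in at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def first_front(objectives):
    """Return indices of solutions not dominated by any other solution."""
    front = []
    for i, a in enumerate(objectives):
        if not any(dominates(b, a) for j, b in enumerate(objectives) if j != i):
            front.append(i)
    return front

# Hypothetical population: (total robot/setup cost, cycle time) per solution.
population = [(120.0, 55.0), (100.0, 60.0), (130.0, 50.0), (125.0, 58.0)]
print(first_front(population))   # -> [0, 1, 2]; solution 3 is dominated by solution 0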

55 citations


Journal ArticleDOI
TL;DR: In this paper, a data mining approach is presented to identify the key parameters and the key steps in the manufacturing process, and the AdaBoost classifier with PCA is shown to be the most effective in identifying key parameters for fault detection.
Abstract: In this paper, a data mining approach is presented to identify the key parameters and the key steps in the manufacturing process. For key parameters, a principal component analysis (PCA) algorithm is first used to filter the data before classification models for fault detection are established using SVM and AdaBoost algorithms. A decision tree is then used to locate the key step for root cause identification. In a preliminary study on a set of real wafer fabrication profile data from semiconductor manufacturing, the AdaBoost classifier with PCA was shown to be the most effective in identifying key parameters for fault detection. Subsequently, these key parameters, along with their associated reading values at different timings, were used to build a decision tree yielding a set of empirical rules that best identify problematic timing. It was further verified that the critical timing among this set of empirical rules occurred in the same manufacturing phase.
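The abstract does not give the dataset or model settings, but the overall flow can be sketched with scikit-learn: PCA to compress the parameter space, AdaBoost for fault classification, and a shallow decision tree whose splits can be read as candidate root-cause rules. The feature matrix and labels below are synthetic stand-ins.

# Sketch only (the paper's data and tuning are not given): PCA to filter the
# parameter space, AdaBoost for fault classification, and a decision tree to
# expose rule-like splits for root-cause analysis. Requires scikit-learn.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import AdaBoostClassifier
from sklearn.tree import DecisionTreeClassifier, export_text
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 40))                       # stand-in for wafer parameter readings
y = (X[:, 3] + 0.5 * X[:, 17] > 0.8).astype(int)     # stand-in fault labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# PCA + AdaBoost classifier for fault detection
clf = make_pipeline(PCA(n_components=10), AdaBoostClassifier(n_estimators=100))
clf.fit(X_tr, y_tr)
print("fault-detection accuracy:", clf.score(X_te, y_te))

# Decision tree on the raw parameters to read off candidate key steps/rules
tree = DecisionTreeClassifier(max_depth=3).fit(X_tr, y_tr)
print(export_text(tree))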

34 citations


Journal ArticleDOI
TL;DR: A new bi-objective mixed integer programming model is presented for the two-stage assembly flow shop scheduling problem with preventive maintenance activities, in which the reliability/availability approach is employed to model the maintenance concepts of a problem.
Abstract: This paper presents a new bi-objective mixed integer programming model for the two-stage assembly flow shop scheduling problem with preventive maintenance (PM) activities, in which a reliability/availability approach is employed to model the maintenance concepts of the problem. PM activities are carried out on machines and tools before a breakdown takes place, which helps to prevent failures before they happen. After developing the new bi-objective model, an epsilon-constraint method is proposed to solve the problem. The problem is known to be NP-hard; therefore, three multi-objective optimization methods, namely the fast non-dominated sorting genetic algorithm, the multi-objective imperialist competitive algorithm, and the non-dominated ranking genetic algorithm (NRGA), are employed to find the Pareto-optimal front for large-sized problems. The parameters of the proposed algorithms are calibrated using an artificial neural network (ANN) and the performances of the proposed algorithms on the problems...
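The epsilon-constraint idea mentioned above can be illustrated independently of the paper's MIP: optimize one objective while bounding the other by a threshold epsilon, then sweep epsilon to trace a Pareto front. The toy "schedules" and objective values below are hypothetical.

# Illustrative sketch of the epsilon-constraint idea (not the paper's MIP):
# optimize objective f1 while constraining f2 <= epsilon, then sweep epsilon
# to trace a Pareto front. Here the "model" is a tiny enumerable toy problem.

# Hypothetical candidate schedules with (makespan, unavailability) objectives.
candidates = {
    "S1": (30.0, 0.20),
    "S2": (34.0, 0.12),
    "S3": (40.0, 0.05),
    "S4": (33.0, 0.18),
}

def epsilon_constraint(cands, eps):
    """Minimize makespan subject to unavailability <= eps."""
    feasible = {k: v for k, v in cands.items() if v[1] <= eps}
    return min(feasible.items(), key=lambda kv: kv[1][0]) if feasible else None

pareto = []
for eps in (0.20, 0.15, 0.10, 0.05):        # sweep the epsilon bound
    best = epsilon_constraint(candidates, eps)
    if best and best not in pareto:
        pareto.append(best)

for name, (makespan, unavail) in pareto:
    print(f"{name}: makespan={makespan}, unavailability={unavail}")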

32 citations


Journal ArticleDOI
TL;DR: In this paper, a two-warehouse inventory model with fuzzy deterioration rate and fuzzy demand rate under conditionally permissible delay in payment is developed, where the deterioration and demand rates are considered as right-shaped fuzzy numbers.
Abstract: Classical inventory models usually consider the deterioration rate and demand rate as constant quantities, but in practical situations the deterioration rate and the demand rate are uncertain in nature. In this paper, a new two-warehouse inventory model with fuzzy deterioration rate and fuzzy demand rate under conditionally permissible delay in payment is developed. The deterioration rate and demand rate are considered as right-shaped fuzzy numbers. The objective is to obtain the optimum value of the fuzzy total cost. Numerical examples are also considered to compare the crisp model and its corresponding fuzzy model. Finally, a sensitivity analysis is carried out to assess the sensitivity of the model to tolerances in the different input parameters.
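For readers unfamiliar with fuzzy inventory costing, the sketch below shows the general mechanics with triangular fuzzy numbers and graded mean integration defuzzification; the cost expression is a generic placeholder, not the paper's two-warehouse formulation, and the paper itself uses right-shaped fuzzy numbers.

# Illustrative sketch (not the paper's model): representing an uncertain rate
# as a triangular fuzzy number and defuzzifying a fuzzy cost with the graded
# mean integration representation (a + 4b + c) / 6. The cost expression below
# is a placeholder, not the two-warehouse formulation.

def gmir(tfn):
    """Graded mean integration representation of a triangular fuzzy number."""
    a, b, c = tfn
    return (a + 4 * b + c) / 6.0

# Hypothetical fuzzy inputs: deterioration rate and annual demand.
theta = (0.04, 0.05, 0.07)        # fuzzy deterioration rate
demand = (900.0, 1000.0, 1200.0)  # fuzzy demand rate (units/year)

def crisp_cost(theta_c, demand_c, order_qty, holding=2.0, ordering=50.0):
    """Placeholder EOQ-style cost with a deterioration surcharge."""
    return (ordering * demand_c / order_qty
            + holding * order_qty / 2.0
            + theta_c * demand_c)

q = 150.0
fuzzy_cost = tuple(crisp_cost(t, d, q) for t, d in zip(theta, demand))
print("defuzzified total cost:", round(gmir(fuzzy_cost), 2))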

31 citations


Journal ArticleDOI
TL;DR: A genetic algorithm and simulated annealing are proposed to compute the best sequence and scheduling for a two-stage assembly hybrid shop problem and numerical results demonstrate the effectiveness of the presented model and the proposed solution approach.
Abstract: We address the two-stage assembly scheduling problem where there are m machines at the first stage and n assembly machines at the second stage under a lot-sizing environment. Lot streaming (lot sizing) means breaking a lot into sublots, where each sublot is transferred to the next machine for continuing operations. This problem can be considered as a production system model consisting of a production stage and an assembly stage. Different production operations are performed independently on the parallel machines; the manufactured parts transferred to the next stage are then assembled with purchased parts on the n machines according to the operation process chart to produce the final products. Here, work-in-process inventories, work shifts, and sequence-dependent setup times are also considered as three important assumptions in order to make the problem more realistic. The objective is to minimize the sum of weighted completion times of products in each shift in order to furnish better machine utilization for the nex...
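The abstract does not detail the metaheuristics, so the sketch below shows only a generic simulated-annealing loop over job permutations with a swap neighborhood; the single-machine weighted-completion-time objective is a stand-in for the full two-stage assembly model with sublots, shifts, and setups.

# Generic simulated annealing over job permutations (a stand-in, not the
# paper's algorithm): swap-neighborhood moves and a toy single-machine
# total weighted completion time objective.
import math
import random

random.seed(1)
proc = [4, 2, 7, 3, 5, 6]          # hypothetical processing times
weight = [1, 3, 2, 2, 1, 4]        # hypothetical job weights

def total_weighted_completion(seq):
    t, total = 0, 0
    for j in seq:
        t += proc[j]
        total += weight[j] * t
    return total

def anneal(n, temp=100.0, cooling=0.995, iters=5000):
    current = list(range(n))
    random.shuffle(current)
    best = current[:]
    for _ in range(iters):
        i, j = random.sample(range(n), 2)
        neighbor = current[:]
        neighbor[i], neighbor[j] = neighbor[j], neighbor[i]
        delta = total_weighted_completion(neighbor) - total_weighted_completion(current)
        if delta < 0 or random.random() < math.exp(-delta / temp):
            current = neighbor
            if total_weighted_completion(current) < total_weighted_completion(best):
                best = current[:]
        temp *= cooling
    return best, total_weighted_completion(best)

print(anneal(len(proc)))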

29 citations


Journal ArticleDOI
TL;DR: In this article, the authors analyzed the deciding factors for battery post-vehicle applications and their potential impacts on EV business models, emphasizing the importance of battery ownership, inter-industry partnerships, and policy support.
Abstract: The universal market adoption of electric vehicles (EVs) is still impeded by the high cost of batteries. Repurposing EV batteries in secondary applications could recoup a portion of that initial cost and reduce upfront EV costs. Further, when integrated in energy storage systems for renewables, second-life batteries could clean the electricity mix for EV charging and alleviate environmental concerns over battery disposal. This paper presents business models of different EV stakeholders that facilitate battery reuse. Based on interviews with various stakeholders, as well as industry reports and academic literature, we analyze the deciding factors for battery “post-vehicle” applications and their potential impacts on EV business models. The findings emphasize the importance of battery ownership, inter-industry partnerships, and policy support. In this early stage, government support constitutes the most important trigger for battery reuse. The results also suggest the potential of battery reuse as a catalys...

27 citations


Journal ArticleDOI
TL;DR: Lean six sigma (LSS) is integrated with a strategic control system that has helped organizations achieve better organizational performance through high-level continuous improvement activity and greater savings in operation and quality costs.
Abstract: The automotive industry is one of the most active industries involved in quality efforts, low production cost, and continuous improvement activities. Today, lean six sigma (LSS) is integrated with strategic control systems that have helped organizations achieve better organizational performance through high-level continuous improvement activity and greater savings in operation and quality costs. The analytic hierarchy process methodology was used to develop a lean six sigma performance improvement tool for measuring the performance of automotive companies. The tool was developed as an Excel-based system. Five case studies were prepared to validate the tool. The results of these case studies suggested that the tool was timely and suitable for determining the organizational performance of automotive companies. Through these case studies, the tool was able to identify strengths and weaknesses that indicated where and how improvement should be made.

27 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the interrelationships between pillars and constructs of a proposed framework of lean supply chain management in Indian manufacturing sector, and established a mental model of framework of Lean supply chain using interpretive structural modeling and path analysis.
Abstract: Manufacturing industries all around the world have embraced the philosophy of lean thinking. The realm of “lean” is no longer limited to the shop floor; it has propagated to almost every aspect of business. Thus, lean thinking is now applicable to the complete supply chain. This paper investigates the interrelationships between the pillars and constructs of a proposed framework of lean supply chain management in the Indian manufacturing sector. The results establish a mental model of the lean supply chain framework using interpretive structural modeling and path analysis.
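One core step of interpretive structural modeling can be illustrated without the paper's data: compute the final reachability matrix from an initial relation matrix by Boolean transitive closure and read off each construct's driving and dependence power. The four constructs and relations below are hypothetical.

# Sketch of one ISM step (illustrative; the paper's actual constructs and
# relations are not reproduced): derive the final reachability matrix from a
# hypothetical initial relation matrix by Boolean transitive closure, then
# compute driving and dependence power for each construct.

def transitive_closure(m):
    n = len(m)
    r = [row[:] for row in m]
    for k in range(n):               # Warshall's algorithm on a 0/1 matrix
        for i in range(n):
            for j in range(n):
                r[i][j] = r[i][j] or (r[i][k] and r[k][j])
    return r

# Hypothetical self-interaction matrix for 4 lean-supply-chain constructs.
initial = [
    [1, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 1, 1],
    [0, 0, 0, 1],
]
final = transitive_closure(initial)
for i, row in enumerate(final):
    driving = sum(row)
    dependence = sum(final[j][i] for j in range(len(final)))
    print(f"construct {i + 1}: driving={driving}, dependence={dependence}")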

27 citations


Journal ArticleDOI
TL;DR: This study demonstrates that the combination of the RST model and flow graphs assists in identifying the needs of customers, determining their characteristics, and facilitating the development of an improvement strategy.
Abstract: This study differs from previous studies by applying multivariate statistical analysis and multi-criterion decision-making methods to the improvement of service quality. We use rough set theory (RST) with a flow graph approach to determine customer attitudes regarding service quality, which can assist managers in developing strategies to improve service quality and thus satisfy the needs of customers. A set of rules is derived from a large sample of airline customers, and its predictive ability is evaluated. The flow graph and the cause-and-effect relationships of the decision rules are heavily exploited to analyze service quality characteristics. Compared with the results of other data-mining analyses, our results are encouraging. This study demonstrates that the combination of the RST model and flow graphs assists in identifying the needs of customers, determining their characteristics, and facilitating the development of an improvement strategy.
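A minimal example of the rough-set machinery referred to above: given the indiscernibility relation induced by condition attributes, compute the lower and upper approximations of a decision class. The toy records below are illustrative, not the airline sample used in the study.

# Minimal rough-set sketch (illustrative data, not the airline sample):
# lower/upper approximation of the "satisfied" decision class under the
# indiscernibility relation induced by the condition attributes.
from collections import defaultdict

# Each record: (condition attributes) -> decision
table = [
    (("ontime", "clean"), "satisfied"),
    (("ontime", "clean"), "satisfied"),
    (("ontime", "dirty"), "satisfied"),
    (("ontime", "dirty"), "unsatisfied"),   # inconsistent with the row above
    (("late",   "dirty"), "unsatisfied"),
]

# Indiscernibility classes: records sharing identical condition attributes.
classes = defaultdict(set)
for idx, (cond, _) in enumerate(table):
    classes[cond].add(idx)

target = {i for i, (_, d) in enumerate(table) if d == "satisfied"}
lower = set().union(*(c for c in classes.values() if c <= target))
upper = set().union(*(c for c in classes.values() if c & target))

print("lower approximation:", sorted(lower))   # certainly satisfied: [0, 1]
print("upper approximation:", sorted(upper))   # possibly satisfied: [0, 1, 2, 3]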

21 citations


Journal ArticleDOI
TL;DR: This study develops production inventory models for perishable items with backordering and a rework process under a coordination strategy, and proves that the coordination strategy can achieve system optimization.
Abstract: This study develops production inventory models for perishable items with backordering and a rework process. In the absence of coordination, shortages are allowed for the buyer and damaged products are returned to the vendor; the vendor himself arranges for the screening or disposal of the damaged products. Under the coordination strategy, shortages are not allowed, a quantity discount is provided to the buyer, and the damaged products are not returned to the vendor; the buyer himself arranges for the screening or disposal of the damaged products. In both cases, the optimum multiple of orders and the optimum order quantity that minimize the total inventory cost are determined by the vendor and buyer. It is proved that the coordination strategy can achieve system optimization. A numerical example is given to illustrate the solution procedure for the model. Finally, based on the example, we conduct a sensitivity analysis of the model.

Journal ArticleDOI
TL;DR: A novel framework is presented to seek critical service attributes (SAs) that can enhance customer satisfaction and customer retention and uses the identified key predictors to forecast customer retention.
Abstract: The global economic recession as well as emerging low-cost carriers have led to declining revenues for worldwide airlines. A novel framework is presented to seek critical service attributes (SAs) that can enhance customer satisfaction and customer retention. Initially, fuzzy Kano model is employed to capture customer perceptions of SAs and convert them into quantitative degrees of customer satisfaction. Then, multiple regression and logistic regression are used to extract the weights of SAs and identify the key SAs for forecasting customer retention, respectively. Finally, the importance-performance analysis is conducted to offer managerial insights. Furthermore, support vector machine is used to justify the validity of customer retention. In summary, the main contributions are described as follows: (1) capturing passenger perceptions of airline services, (2) indicating which SAs should be improved first to enhance passenger satisfaction, and (3) using the identified key predictors to forecast cus...

Journal ArticleDOI
TL;DR: An overview of some well-known and more applicable tools and methods that have been developed and are available today and a scoring model is proposed as a new tool for sustainable PD.
Abstract: Systematic consideration of environmental aspects within the early stages of product development (PD) can be considered highly significant in order for the overall environmental performance of the product to be improved. Many methods and tools have been developed aiming to enable this consideration and provide the properties that need to be considered and improved. This article provides an overview of some well-known and more applicable tools and methods that have been developed and are available today. The identified tools are generally classified in two groups: Guidelines and Analytical tools. The limitations and barriers of current tools are assessed and categorized and two areas for work are proposed in order to address current limitations in the existing literature. One of the areas is followed and a scoring model is proposed as a new tool for sustainable PD.

Journal ArticleDOI
TL;DR: Two ant-colony optimization algorithms based on distinct procedures are presented and analyzed, and numerical results show that both proposed algorithms are capable of solving industrial-dimensioned problems within reasonable computation time and accuracy.
Abstract: This paper addresses the problem of scheduling jobs on unrelated parallel machines with eligibility constraints, where job-processing times are controllable through the allocation of a nonrenewable common resource and can be modeled by a linear resource consumption function. The objective is to assign the jobs to machines and to allocate resources so that the makespan is minimized. We provide an exact formulation of the addressed problem as a mixed integer programming model. Since the computational complexity associated with the formulation makes it difficult for standard solvers to deal with large-sized problems in reasonable solution time, two ant-colony optimization algorithms based on distinct procedures are also presented and analyzed. Numerical results show that both of the proposed algorithms are capable of solving industrial-dimensioned problems within reasonable computation time and accuracy.

Journal ArticleDOI
TL;DR: In this paper, a two-stage stochastic program with recourse model is proposed to address the hub network design problem considering demand uncertainty and hub congestion effects, which provides a consistent set of hub locations, while adjusting network configuration in response to different demand realizations.
Abstract: This study addresses the hub network design problem considering demand uncertainty and hub congestion effects. The problem is formulated as a two-stage stochastic program with recourse. The model provides a consistent set of hub locations while adjusting the network configuration in response to different demand realizations. A case study built from real-world data was used to test the proposed model, and a sensitivity analysis was performed to examine how several important parameters affect the solution.
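The two-stage structure can be illustrated with a toy example: hub-opening decisions are fixed in the first stage, and the recourse cost of routing each demand scenario, including a simple congestion penalty, is averaged over scenarios. All costs, capacities, and probabilities below are hypothetical, and the recourse step is a greedy stand-in for the paper's second-stage model.

# Toy illustration of the two-stage structure (hypothetical numbers, not the
# paper's case study): choose which hubs to open (first stage), then evaluate
# the expected recourse (routing + congestion) cost over demand scenarios.
from itertools import combinations

hubs = {"A": 100.0, "B": 120.0, "C": 90.0}            # fixed opening costs
unit_cost = {"A": 1.0, "B": 0.8, "C": 1.2}            # routing cost per unit
capacity = {"A": 150.0, "B": 180.0, "C": 120.0}

# Demand scenarios with probabilities.
scenarios = [(0.3, 120.0), (0.5, 200.0), (0.2, 300.0)]

def recourse(open_hubs, demand):
    """Greedy routing to the cheapest open hubs; quadratic congestion penalty."""
    cost, remaining = 0.0, demand
    for h in sorted(open_hubs, key=lambda name: unit_cost[name]):
        flow = min(remaining, capacity[h])
        load = flow / capacity[h]
        cost += unit_cost[h] * flow + 50.0 * load ** 2   # congestion term
        remaining -= flow
    return cost + 10.0 * remaining                       # penalty for unmet demand

best = None
for k in range(1, len(hubs) + 1):
    for subset in combinations(hubs, k):
        expected = sum(hubs[h] for h in subset) + sum(
            p * recourse(subset, d) for p, d in scenarios)
        if best is None or expected < best[1]:
            best = (subset, expected)

print("open hubs:", best[0], "expected total cost:", round(best[1], 1))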

Journal ArticleDOI
TL;DR: This study proposes a novel PSO algorithm, namely, the binary PSO based on surrogate information with proportional acceleration coefficients (BPSOSIPAC), which was tested on 135 benchmark problems from the OR-Library to validate and demonstrate the efficiency in solving multidimensional knapsack problems.
Abstract: The 0-1 multidimensional knapsack problem (MKP) has been proven to belong to the class of difficult NP-hard combinatorial optimization problems. There are various population-based search algorithms for solving such problems. The particle swarm optimization (PSO) technique is adapted in our study, which proposes a novel PSO algorithm, namely the binary PSO based on surrogate information with proportional acceleration coefficients (BPSOSIPAC). The proposed algorithm was tested on 135 benchmark problems from the OR-Library to validate and demonstrate its efficiency in solving multidimensional knapsack problems. The results were then compared with those of nine other existing PSO algorithms. The simulation and evaluation results showed that the proposed algorithm, BPSOSIPAC, is superior to the other methods according to success rate, average number of function evaluations, average number of function evaluations of successful runs, average error (AE), mean absolute deviation, mean absolute percentage error, ...
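The surrogate-information and proportional-acceleration details of BPSOSIPAC are not reproduced here, but a plain binary PSO with a sigmoid transfer function on a tiny single-constraint knapsack shows the basic mechanics the paper builds on; the instance data and parameters are arbitrary.

# Plain binary PSO on a tiny knapsack (illustration only; the surrogate
# information and proportional acceleration coefficients of BPSOSIPAC are
# not reproduced here). A sigmoid transfer function maps velocities to
# bit probabilities; infeasible solutions are penalized.
import math
import random

random.seed(0)
values  = [10, 13, 7, 8, 12, 6]
weights = [ 5,  7, 3, 4,  6, 2]
CAP, N, SWARM, ITERS = 15, 6, 20, 200

def fitness(bits):
    v = sum(b * x for b, x in zip(bits, values))
    w = sum(b * x for b, x in zip(bits, weights))
    return v if w <= CAP else v - 100 * (w - CAP)   # penalty if overweight

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

pos = [[random.randint(0, 1) for _ in range(N)] for _ in range(SWARM)]
vel = [[0.0] * N for _ in range(SWARM)]
pbest = [p[:] for p in pos]
gbest = max(pbest, key=fitness)[:]

for _ in range(ITERS):
    for i in range(SWARM):
        for d in range(N):
            r1, r2 = random.random(), random.random()
            vel[i][d] = (0.7 * vel[i][d]
                         + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                         + 1.5 * r2 * (gbest[d] - pos[i][d]))
            pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
        if fitness(pos[i]) > fitness(pbest[i]):
            pbest[i] = pos[i][:]
            if fitness(pbest[i]) > fitness(gbest):
                gbest = pbest[i][:]

print("best subset:", gbest, "value:", fitness(gbest))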

Journal ArticleDOI
TL;DR: In this paper, the authors explore the impact of correlation and data translation on the average data envelopment analysis (DEA) efficiency score of a data set and find that high correlation between inputs and outputs tend to be associated with high average efficiency scores.
Abstract: This article explores with extensive computational and statistical analyses the impact that correlation and data translation have, independently, on the average data envelopment analysis (DEA) efficiency score of a data set. The article explores three types of correlations, (a) between inputs and outputs, (b) among inputs only, and (c) among outputs only. The results suggest that the degree of correlation between the inputs and outputs tends to affect average efficiency scores. High correlations between inputs and outputs tend to be associated with high efficiency scores on the average. If correlation between inputs and outputs is relatively high, the degree of correlation between inputs or between outputs is not relevant; at the most, higher correlation among inputs or among outputs tends to be associated with slightly lower average efficiency scores. When correlation between inputs and outputs is close to zero, the average efficiency score of a data set is usually small, independently of how much correl...

Journal ArticleDOI
TL;DR: It is concluded that the scenario tree reduced by the EU-E criterion with a larger trade-off coefficient λ has fewer possible paths, less uncertainty, and a longer expected project duration than the tree obtained with a smaller trade-off coefficient λ.
Abstract: The aim of this study is to propose a scenario-based approach with a utility-entropy decision model to measure the uncertainty related to the evolution of a resource-constrained project scheduling problem with uncertain activity durations (a stochastic RCPSP). The approach consists of two stages. The first is to apply the approach proposed by Tseng and Ko to convert a stochastic RCPSP into a full scenario tree. In stage two, we introduce the Expected Utility–Entropy (EU-E) decision model, a weighted linear average of expected utility and entropy, to establish an EU-E criterion. We then apply the criterion to prune the worse branch(es) and obtain a reduced scenario tree. Based on an illustrative example, it is concluded that the scenario tree reduced by the EU-E criterion with a larger trade-off coefficient λ has fewer possible paths, less uncertainty, and a longer expected project duration than the tree obtained with a smaller trade-off coefficient λ. Thus, this has demonstrated that not only can the who...
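A minimal reading of the EU-E criterion, assuming a simple linear utility and Shannon entropy (the paper's exact normalization and utility function may differ): each scenario-tree branch is scored by a λ-weighted combination of expected utility and entropy, and the weaker branch is pruned. The branches below are hypothetical.

# Minimal sketch of a weighted expected-utility/entropy criterion for
# comparing scenario-tree branches (illustrative; the paper's exact
# normalization and utility function may differ). Higher score = preferred.
import math

def entropy(probs):
    return -sum(p * math.log(p) for p in probs if p > 0)

def eu_e_score(outcomes, probs, lam, utility=lambda d: -d):
    """Trade off expected utility (here: shorter duration is better)
    against the entropy (uncertainty) of the branch."""
    eu = sum(p * utility(x) for p, x in zip(probs, outcomes))
    return (1 - lam) * eu - lam * entropy(probs)

# Hypothetical branches: possible project durations and their probabilities.
branch_a = ([40, 44, 52], [0.5, 0.3, 0.2])
branch_b = ([43, 47],     [0.6, 0.4])

for lam in (0.2, 0.8):
    sa = eu_e_score(*branch_a, lam)
    sb = eu_e_score(*branch_b, lam)
    keep = "A" if sa >= sb else "B"
    print(f"lambda={lam}: score A={sa:.2f}, score B={sb:.2f}, keep branch {keep}")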

Journal ArticleDOI
TL;DR: A mathematical formulation for the preemptive resource-constrained project scheduling problem (PRCPSP) with fuzzy random durations and resource availabilities is built, and a proactive strategy, i.e. allowing activities to be split one or more times during scheduling, is presented.
Abstract: The aim of this paper is to present a proactive strategy for resource-constrained project scheduling problems under a fuzzy random environment with activity splitting, i.e. activities can be split one or more times during the scheduling process. After giving the motivation and justification for employing fuzzy random variables, a mathematical formulation for the preemptive resource-constrained project scheduling problem (PRCPSP) with fuzzy random durations and resource availabilities is built. In this model, the objective is to maximize the sum of free slack, which contains two parts: a time buffer by which the start times of some activities can be delayed, and an allowance for resource shortage. Owing to the difficulties of dealing with fuzzy random variables, the model is transformed into an equivalent crisp one. A numerical example is finally used to demonstrate the efficiency of the proposed model and proactive strategy, and the generated results verify its robustness for the PRCPSP.

Journal ArticleDOI
TL;DR: A multi-period, multi-echelon, bi-objective closed-loop supply chain network model in a fuzzy environment is proposed to minimize risks and total costs; fuzzy set theory and a posteriori method are used to solve it.
Abstract: Risk control has become a hot research area in the supply chain management field. Risks imply reduced revenue and profit or additional costs, and should therefore be avoided as far as possible. This paper integrates risks into the design of a closed-loop supply chain network and proposes a multi-period, multi-echelon, bi-objective closed-loop supply chain network model in a fuzzy environment. The forward supply chain network includes component suppliers, assemblers, distribution centers, and customer zones, while the reverse supply chain network is composed of customer zones, recovery centers, and dismantling centers. The objectives of this model are to minimize risks and total costs. In order to handle the risks caused by fuzziness in this model, fuzzy set theory and a posteriori method (a compromise programming approach) are used. Computational experiments verify the practicability and validity of the fuzzy model and the solution method. The model and the method in this paper provide decis...

Journal ArticleDOI
TL;DR: A redesign method based on modular product architecture is developed to support a sustainable product, considering its materials, assembly sequence, and line balance at the initial design phase, so that the product can be more easily maintained during usage and recycled at the end of its life.
Abstract: Nowadays, the pursuit of sustainability obligates manufacturers to redesign products in order to reduce negative environmental impacts. However, only a few studies have simultaneously considered environmental sustainability and assemblability. To bridge this research gap, this study aimed to develop a redesign method based on modular product architecture. The method supports a sustainable product by considering its materials, assembly sequence, and line balance at the initial design phase. It begins with an analysis of the current product based on economic and environmental performance (i.e., total cost and CO2 emissions). Additionally, new materials and assembly methods are incorporated into redesigning a more sustainable product without compromising production performance. To ensure assemblability, a line balance of 60% serves as one of the constraints. This study applies the particle swarm optimization algorithm to calculate an optimal module organization along with assembly methods and ass...

Journal ArticleDOI
TL;DR: The research is focused on a plastic component previously unexplored and analyzed using tools that have not been employed for this application; changing the material to high-density polyethylene would yield approximately a 30% reduction in carbon footprint, a 24% reduction in air acidification, a 26% reduction in water eutrophication, and a 15% reduction ... Injection molding is found to be the most sustainable manufacturing process.
Abstract: Recent literature in automotive research indicates that studies of environmental impact mostly concern metal-based components. Environmental effects are mainly analyzed using “environmental performance indicators” and “life cycle assessment” techniques. A knowledge gap therefore exists for automotive plastic components, which should be studied by analyzing material and manufacturing process selection at the design stage. The research is focused on a plastic component previously unexplored and analyzed using tools that have not been employed for this application. A computer-aided tool was used to model the part, and its associated sustainability function was used to analyze the part's environmental impact. The component was analyzed using different materials and manufacturing processes, then redesigned to be more ergonomic. The improved component design was manufactured using rapid prototyping and a consumer preference survey was conducted to determine which component was preferred. The resea...

Journal ArticleDOI
TL;DR: In this paper, a modified Darwish model with quality loss is presented for obtaining the optimal process mean, number of shipments, and production lot size; numerical results show that the modified model yields a larger process mean and expected total relevant cost of product.
Abstract: In 2009, Darwish developed a single-vendor single-buyer supply chain model with process mean integration for determining the optimal process mean, production lot size, and number of shipments of product to the buyer. Taguchi proposed the quadratic quality loss function for measuring product quality and redefined product quality as the loss to society when the product is shipped and sold to the customer. The quality cost of conforming products is excluded in Darwish’s model, which is inconsistent with Taguchi’s quality concept. In this paper, the author presents a modified Darwish model with quality loss for obtaining the optimal process mean, number of shipments, and production lot size. The numerical results show that the modified model has a larger process mean and expected total relevant cost of product, a smaller production lot size, and the same number of shipments as the original model. The original Darwish model will underestimate the expected total relevant cost of product because he doesn’t con...
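Taguchi's quadratic loss is standard: L(y) = k(y − m)², with expected loss k[σ² + (μ − m)²] for a process with mean μ and standard deviation σ. The short sketch below (hypothetical constants) shows how shifting the process mean changes the expected quality loss; the remaining Darwish-type supply chain cost terms are not modeled.

# Taguchi's quadratic loss L(y) = k (y - m)^2 has expected value
# k * (sigma^2 + (mu - m)^2) for a process with mean mu and std sigma.
# Quick sketch (hypothetical constants) of how shifting the process mean
# changes expected quality loss; the full Darwish-type supply chain
# cost terms are not included here.

def expected_quality_loss(k, mu, sigma, target):
    return k * (sigma ** 2 + (mu - target) ** 2)

k, sigma, target = 2.5, 0.8, 10.0      # hypothetical loss coefficient, std, target
for mu in (10.0, 10.5, 11.0):
    print(f"process mean {mu}: expected loss per unit = "
          f"{expected_quality_loss(k, mu, sigma, target):.2f}")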

Journal ArticleDOI
TL;DR: A strategic balanced scorecard tool (SBST) for the Malaysian automotive industry was developed using the Analytic Hierarchy Process (AHP) methodology, which involves building a hierarchy for the SBST, calculating relative weights by pairwise comparison, rating SBST measures, calculating performance scores, and ranking companies based on their scores.
Abstract: In the world of information technology, business processes require a decision-making tool that is effective and systematic in measuring and assessing the current performance of the organization and enhancing its competitive edge. In this study, a strategic balanced scorecard tool (SBST) for the Malaysian automotive industry was developed using the Analytic Hierarchy Process (AHP) methodology, which involves building a hierarchy for the SBST, calculating relative weights by pairwise comparison, rating SBST measures, calculating performance scores, and ranking companies based on their scores. Further to that, an Excel-based tool was developed to automate the SBST calculations for the automotive industry. The SBST assists practitioners in evaluating internal performance and vendor performance, and in the decision-making process of selecting suppliers or rewarding the best supplier.
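The pairwise-comparison step of AHP can be sketched generically: derive priority weights with the geometric-mean approximation and check Saaty's consistency ratio. The judgments below are illustrative, not the SBST hierarchy from the study.

# Standard AHP step (illustrative judgments, not the paper's data): derive
# priority weights from a pairwise comparison matrix via the geometric-mean
# approximation and check the consistency ratio with Saaty's random index.
import math

# Hypothetical pairwise comparisons of three balanced-scorecard perspectives.
A = [
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
]
n = len(A)

# Geometric-mean priority weights.
geo = [math.prod(row) ** (1 / n) for row in A]
weights = [g / sum(geo) for g in geo]

# Approximate lambda_max, consistency index and ratio (RI = 0.58 for n = 3).
Aw = [sum(A[i][j] * weights[j] for j in range(n)) for i in range(n)]
lambda_max = sum(Aw[i] / weights[i] for i in range(n)) / n
CI = (lambda_max - n) / (n - 1)
CR = CI / 0.58

print("weights:", [round(w, 3) for w in weights])
print("consistency ratio:", round(CR, 3))   # acceptable when CR < 0.1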

Journal ArticleDOI
TL;DR: A mathematical programming model for scheduling the shifts of the physicians and residents of a medical department is developed; the results show that the optimal schedules are much better than the previous ones and can be generated more efficiently, within a few seconds.
Abstract: Hospitals provide a 24-hour service under high working pressure. Nowadays, physician scheduling faces a staff shortage problem. This situation makes it difficult for physicians to provide a non-stop service all day long and results in frequent overtime. Therefore, a workable shift schedule becomes all the more important. In this study, we developed a mathematical programming model for scheduling the shifts of the physicians and residents of the medical department. Besides the rules that must be enforced, the preferences of the people choosing the shifts are incorporated into an objective function that maximizes the satisfaction of all physicians and residents in a real case. The results show that the optimal schedules are much better than the previous ones and can be generated far more efficiently, within a few seconds.

Journal ArticleDOI
TL;DR: Two mathematical models are constructed that increase the efficiency of nurse shift and day-off scheduling, thereby improving the quality of nursing and of nurses' leisure life; the models can be integrated to create optimal results.
Abstract: In most hospitals in Taiwan, nurse shift and day-off scheduling is done by hand. It takes several hours and cannot account for various constraints very well. The scheduling problem usually troubles head nurses. Thus, the purpose of this study was to construct two mathematical models that increase the efficiency of nurse shift and day-off scheduling and thereby improve the quality of nursing and of nurses' leisure life. We adopt Hsia's model (2009) to find an optimal shift scheduling table and then establish a day-off scheduling model to search for an optimal day-off assignment. In this way, the paper integrates the nurse shift and day-off scheduling models to create optimal results. The models also improve the quality of both nurse shift and day-off scheduling tasks and save a tremendous amount of time.

Journal ArticleDOI
TL;DR: In this article, a branch-and-bound algorithm is proposed to solve the problem of scheduling preemptive jobs on identical parallel machines to minimize total tardiness, which is known to be NP-hard.
Abstract: This article examines the problem of scheduling preemptive jobs on identical parallel machines to minimize total tardiness. The problem is known to be NP-hard. An efficient heuristic is developed for solving large-sized problems. A lower bound scheme is also presented. Both the proposed heuristic and the lower bound are incorporated into a branch-and-bound algorithm to optimally solve small-sized problems. Computational results are reported. The branch-and-bound algorithm can handle problems of up to 16 jobs and 5 machines within a reasonable amount of time. The solution obtained by the heuristic has an average percentage deviation of 4.01% from the optimal value, while the initial lower bound has an average percentage deviation of 41.55% from the optimal value. Moreover, the heuristic finds optimal solutions for more than 45% of the problem instances tested.

Journal ArticleDOI
TL;DR: This paper deals with a two-warehouse inventory control problem for non-instantaneous deteriorating items in a stochastic framework, wherein shortages are permissible and are a mixture of partial backlogging and lost sales.
Abstract: This paper deals with a two-warehouse inventory control problem for non-instantaneous deteriorating items in a stochastic framework, wherein shortages are permissible and are a mixture of partial backlogging and lost sales. The paper tackles a situation in which the retailer procures a larger quantity than his/her own warehouse (OW) storage capacity, so a rented warehouse (RW) is needed to hold the excess purchase. The stochastic framework captures the situation of a random planning horizon of trading. The variability of the random planning horizon is discussed by considering two special cases, namely uniform and truncated normal distributions. Depending upon the consumption times t_o and t_r of the OW and RW inventories, respectively, and the preservation time t_p after which the product starts to deteriorate, we formulate mathematical models for three cases: (i) t_p ≤ t_r ≤ t_o, (ii) t_r ≤ t_p ≤ t_o, and (iii) t_r ≤ t_o ≤ t_p. The discussion is further elongated by presenting some numerical illustrations with co...

Journal ArticleDOI
TL;DR: An evaluation index system and an evaluation model based on triangular fuzzy entropy and the Choquet integral are constructed for emergency logistics support capability, and the results show that the proposed model can evaluate emergency logistics support capability feasibly and effectively.
Abstract: A comprehensive evaluation of emergency logistics support capability can effectively promote the construction of the emergency logistics system. This paper constructs an evaluation index system and an evaluation model based on triangular fuzzy entropy and the Choquet integral for emergency logistics support capability. In view of the imperfection of subjectively determined index weights in traditional evaluation, we compute the index weights based on triangular fuzzy weights and information entropy, combining subjective judgment with objective evaluation. Owing to the interactions among indexes, we evaluate the emergency logistics support capability using a complexity-based discrete Choquet integral. We present a case study applying the evaluation method to a natural disaster in a particular area. The results show that the proposed model can evaluate the emergency logistics support capability feasibly and effectively.
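The discrete Choquet integral used for aggregation can be shown in a few lines: sort the index scores, and weight each increment by the fuzzy measure of the coalition of indexes at or above that score. The three indexes and the fuzzy measure below are hypothetical.

# Minimal discrete Choquet integral (hypothetical indexes and fuzzy measure,
# not the paper's index system): aggregate index scores with respect to a
# non-additive measure so that interactions among indexes are reflected.

def choquet(scores, mu):
    """scores: {index: value in [0,1]}; mu: {frozenset of indexes: measure}."""
    items = sorted(scores.items(), key=lambda kv: kv[1])      # ascending values
    total, prev = 0.0, 0.0
    for i, (_, value) in enumerate(items):
        coalition = frozenset(name for name, _ in items[i:])  # indexes >= value
        total += (value - prev) * mu[coalition]
        prev = value
    return total

scores = {"transport": 0.6, "storage": 0.8, "information": 0.5}
mu = {  # hypothetical fuzzy measure (monotone, mu(full set) = 1)
    frozenset(["transport", "storage", "information"]): 1.0,
    frozenset(["transport", "storage"]): 0.75,
    frozenset(["storage", "information"]): 0.7,
    frozenset(["transport", "information"]): 0.6,
    frozenset(["transport"]): 0.4,
    frozenset(["storage"]): 0.45,
    frozenset(["information"]): 0.3,
    frozenset(): 0.0,
}
print("Choquet-aggregated capability score:", round(choquet(scores, mu), 3))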

Journal ArticleDOI
TL;DR: In this article, different heuristics are proposed for the problem of minimizing the number of tardy jobs in assembly flow shop scheduling, which is directly related to the percentage of on-time shipments.
Abstract: The objective of minimizing the number of tardy jobs is important as it is directly related to the percentage of on-time shipments, which is often used to rate managers’ performance in many manufacturing environments. To the best of our knowledge, the assembly flowshop scheduling problem with this objective has not been addressed so far, and thus is addressed in this paper. Given that the problem is NP-hard, different heuristics are proposed for the problem in this paper. The proposed heuristics are genetic algorithm (GA), improved genetic algorithm (IGA), simulated annealing algorithm with three different neighborhood structures (SA-1, SA-2, SA-3), Dhouib et al.’s simulated annealing algorithm (DSA), and an improved cloud theory-based simulated annealing algorithm (CSA). The heuristics are evaluated based on extensive computational experiments and all the heuristics were run for the same computational time for a fair comparison. The experiments reveal that the overall average errors of DSA, GA, IGA, CSA,...