
Showing papers in "Journal of The Chinese Institute of Industrial Engineers in 2004"


Journal ArticleDOI
TL;DR: In this paper, the authors proposed an evaluation model that integrates triangular fuzzy numbers and the analytical hierarchy process to develop a fuzzy multiple-attribute decision-making (FMADM) model for key quality performance evaluation.
Abstract: This research proposes an evaluation model that integrates triangular fuzzy numbers and the analytical hierarchy process to develop a fuzzy multiple-attribute decision-making (FMADM) model for key quality-performance evaluation. Using the proposed model, decision-makers can determine, in advance, critical quality elements that might significantly affect quality performance. A practical example of an electronic product is used to illustrate the proposed evaluation model. Results demonstrate that decision-makers can exploit the flexibility of the proposed model by adjusting the confidence coefficient to express their degree of understanding with respect to the importance of each component. Moreover, the result of the FMADM analysis is a significant contribution to quality improvement.
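The abstract does not reproduce the model's arithmetic, so the sketch below illustrates, under assumed forms, how triangular fuzzy weights can be defuzzified through an alpha-cut playing the role of the confidence coefficient mentioned above. The quality elements and numbers are hypothetical, not taken from the paper.

    # Minimal sketch (assumed formulation): triangular fuzzy weights and an
    # alpha-cut "confidence coefficient" for defuzzification.
    def alpha_cut(tfn, alpha):
        """Interval of a triangular fuzzy number (l, m, u) at confidence level alpha."""
        l, m, u = tfn
        return (l + alpha * (m - l), u - alpha * (u - m))

    def defuzzify(tfn, alpha, optimism=0.5):
        """Crisp score: convex combination of the alpha-cut endpoints."""
        lo, hi = alpha_cut(tfn, alpha)
        return optimism * hi + (1 - optimism) * lo

    # Hypothetical fuzzy weights of three quality elements (not from the paper).
    fuzzy_weights = {"reliability": (0.2, 0.4, 0.6),
                     "conformance": (0.3, 0.5, 0.7),
                     "durability":  (0.1, 0.2, 0.4)}

    for alpha in (0.3, 0.6, 0.9):          # adjusting the confidence coefficient
        scores = {k: defuzzify(v, alpha) for k, v in fuzzy_weights.items()}
        total = sum(scores.values())
        print(alpha, sorted(((s / total, k) for k, s in scores.items()), reverse=True))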

123 citations


Journal ArticleDOI
TL;DR: This survey reveals that while makespan minimization has been fairly widely studied, problems that include processing characteristics such as release times, sequence dependent setups, and preemptions remain largely unstudied.
Abstract: This paper surveys the literature related to solving traditional unrelated parallel-machine scheduling problems. It compiles algorithms for the makespan, total weighted sum of completion times, maximum tardiness, total tardiness, total earliness and tardiness, and multiple criteria performance measures. The review of the existing algorithms is restricted to the deterministic problems without setups, preemptions, or side conditions on the problem. Even for such traditional problems, this survey reveals that while makespan minimization has been fairly widely studied, problems that include processing characteristics such as release times, sequence dependent setups, and preemptions remain largely unstudied. Research in solving unrelated parallel-machine scheduling problems involving the minimization of the number of tardy jobs, weighted number of tardy jobs, total tardiness, and total weighted tardiness is quite limited.
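As a point of reference for the surveyed problem class, the sketch below evaluates the makespan of a job-to-machine assignment when processing times are machine-dependent (unrelated machines). The tiny instance and brute-force enumeration are illustrative only and are not an algorithm from the survey.

    from itertools import product

    # p[j][i]: processing time of job j on machine i (illustrative values).
    p = [[4, 6], [3, 9], [8, 2], [5, 5]]          # 4 jobs, 2 unrelated machines

    def makespan(assignment, p, n_machines):
        """assignment[j] = machine chosen for job j; makespan = largest machine load."""
        load = [0] * n_machines
        for j, i in enumerate(assignment):
            load[i] += p[j][i]
        return max(load)

    # Enumerate all assignments (sensible only for tiny instances).
    best = min(product(range(2), repeat=len(p)), key=lambda a: makespan(a, p, 2))
    print(best, makespan(best, p, 2))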

88 citations


Journal ArticleDOI
TL;DR: A meta-model of the relationships between key inputs and performance measures of an apparel retail operation is developed using neural network technology and trained with the EM method, which simulates the electromagnetism theory of physics by treating each weight connection in a neural network as an electrical charge.
Abstract: This paper applies a heuristic algorithm, called the "Electromagnetism Algorithm" (EM) [3], to neural network training. We develop a meta-model of the relationships between key inputs and performance measures of an apparel retail operation using neural network technology. The method simulates the electromagnetism theory of physics by treating each weight connection in a neural network as an electrical charge. Through the attraction and repulsion of the charges, the weights move toward the optimum without being trapped in local optima, unlike other algorithms such as the genetic algorithm and the gradient descent method. The computational results show that the EM algorithm not only converges much faster than genetic algorithms and back-propagation algorithms in terms of CPU time, but also uses less memory than both.
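The following is a simplified sketch of an electromagnetism-like move of the kind the abstract describes, with each candidate weight vector carrying a charge derived from its training loss. The charge and force formulas follow the generic EM heuristic and are not necessarily the paper's exact training scheme.

    import numpy as np

    def em_step(population, losses, step=0.1):
        """One electromagnetism-like move over a population of weight vectors."""
        pop = np.asarray(population, dtype=float)
        losses = np.asarray(losses, dtype=float)
        gap = losses - losses.min()
        charge = np.exp(-len(pop) * gap / (gap.sum() + 1e-12))   # better points get larger charge
        forces = np.zeros_like(pop)
        for i in range(len(pop)):
            for j in range(len(pop)):
                if i == j:
                    continue
                diff = pop[j] - pop[i]
                f = diff * charge[i] * charge[j] / (np.dot(diff, diff) + 1e-12)
                # Attracted toward better points, repelled from worse ones.
                forces[i] += f if losses[j] < losses[i] else -f
        return pop + step * forces

    rng = np.random.default_rng(0)
    pop = rng.normal(size=(5, 3))          # toy "weight vectors"
    losses = rng.random(5)                 # toy training losses
    print(em_step(pop, losses))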

63 citations


Journal ArticleDOI
TL;DR: Wang et al. as discussed by the authors used stepwise regression analysis to identify the key variables that affect the trend of the stock market index significantly, according to the identified variables, three models, i.e. a multiple regression analysis model, a backpropagation neural network, and an autoregressive integrated moving average model were built.
Abstract: The investigation of the stock market index in the literature can be divided into two aspects. The first is technical analysis, which uses the historical trend of the stock market index to predict future fluctuations. The other is basic analysis, which analyzes the macroeconomic factors to forecast the stock market index. In this research, both technical analysis and basic analysis were applied to investigate the trend of the weighted stock market index in Taiwan. Stepwise regression analysis was first used to identify the key variables that significantly affect the trend of the stock market index. Based on the identified variables, three models, i.e. a multiple regression analysis model, a backpropagation neural network, and an autoregressive integrated moving average model, were built. A hybrid model that integrates the technical and basic analyses was then developed and expected to forecast the stock market index more accurately. The result showed that the ...

38 citations


Journal ArticleDOI
TL;DR: The Mahalanobis-Taguchi System (MTS) as discussed by the authors is a pattern information technology developed by Dr. Taguchi to provide a better prediction for multivariate data through the construction of a multivariate measurement scale.
Abstract: The Mahalanobis-Taguchi System (MTS) is a pattern information technology developed by Dr. Taguchi. This technology is aimed at providing a better prediction for multivariate data through the construction of a multivariate measurement scale. In this study, two sets of data are analyzed to demonstrate the effectiveness of the MTS. Implementation results reveal that the MTS outperforms traditional discriminant analysis methods. In addition, several important issues regarding the MTS are summarized in the conclusion section.
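A minimal sketch of the scaled Mahalanobis distance underlying the MTS measurement scale is given below; the orthogonal-array variable screening and threshold selection steps of the full MTS are omitted, and the data are synthetic.

    import numpy as np

    def mahalanobis_scale(reference, observation):
        """Scaled Mahalanobis distance of an observation against the 'normal' group."""
        X = np.asarray(reference, dtype=float)          # rows = normal samples
        mean, std = X.mean(axis=0), X.std(axis=0, ddof=1)
        Z = (X - mean) / std                            # standardize the reference group
        R = np.corrcoef(Z, rowvar=False)                # correlation matrix
        z = (np.asarray(observation, dtype=float) - mean) / std
        return float(z @ np.linalg.inv(R) @ z) / X.shape[1]   # roughly 1 for normal items

    normal_group = np.random.default_rng(1).normal(size=(50, 3))
    print(mahalanobis_scale(normal_group, [0.1, -0.2, 0.05]))   # near 1
    print(mahalanobis_scale(normal_group, [4.0, 4.0, 4.0]))     # much larger (abnormal)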

28 citations



Journal ArticleDOI
TL;DR: In this paper, the authors develop a complete set of performance indexes for semiconductor manufacturing management and apply data mining techniques to production data collected in a semiconductor fab in Taiwan to validate the approach.
Abstract: The indexes for semiconductor manufacturing management are complicated and interrelated. Therefore, it is hard to clarify the relationships among the indexes and to derive useful rules for production management. Existing approaches rely on following individual indexes without considering the production system as a whole. This study aims to fill the gap by reviewing the related studies on semiconductor manufacturing management and developing a complete hierarchy of performance indexes. In addition, we apply data mining techniques to production data collected in a semiconductor fab in Taiwan to validate this approach. The empirically derived patterns among the critical indexes were useful for supporting production management decisions. The results demonstrate the practical viability of this approach. This study concludes with results and a discussion of future research.

24 citations


Journal ArticleDOI
TL;DR: A heuristic scheduling algorithm with the complexity of O(n^3) to solve a single machine scheduling problem in which the objective function is to minimize the total weighted tardiness with different release dates and sequence-dependent setup times is proposed.
Abstract: This paper attempts to solve a single machine scheduling problem (n|1|r(subscript i), s(subscript ij)|∑w(subscript i)T(subscript i)), in which the objective is to minimize the total weighted tardiness with distinct release dates and sequence-dependent setup times. To date, no mathematical programming formulation or heuristic method has been reported for this type of problem. In this study, we propose a heuristic scheduling algorithm with complexity O(n^3) to solve it. To validate the performance of the proposed heuristic, a mathematical programming model with logical constraints is also formulated. Experimental results show that the algorithm finds 10321 optimal solutions out of 12600 randomly generated problems, with a total average solution quality of 98.03%. An 11-job case (large problem) requires, on average, only 0.00065 seconds to obtain a final solution. The results demonstrate that the heuristic scheduling algorithm can efficiently solve this kind of problem.
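For concreteness, the sketch below evaluates the objective ∑w(subscript i)T(subscript i) for a given sequence under release dates and sequence-dependent setups, using one common convention that a setup starts only after the job is released. The instance is illustrative and the O(n^3) heuristic itself is not reproduced.

    # Evaluate total weighted tardiness of a sequence on a single machine with
    # release dates r, processing times p, due dates d, weights w and
    # sequence-dependent setup times s (illustrative data).
    def weighted_tardiness(seq, r, p, d, w, s):
        t, prev, total = 0, None, 0
        for j in seq:
            setup = s[prev][j] if prev is not None else 0
            t = max(t, r[j]) + setup + p[j]      # wait for release, then set up and process
            total += w[j] * max(0, t - d[j])
            prev = j
        return total

    r = [0, 2, 4]; p = [3, 2, 4]; d = [5, 6, 12]; w = [2, 1, 3]
    s = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
    print(weighted_tardiness([0, 1, 2], r, p, d, w, s))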

15 citations


Journal ArticleDOI
TL;DR: In this article, the authors present a neural-network-based approach to the QFD process, prescribing a new methodology to generate a conceptual design baseline; a generalized neural-network-oriented conceptual design process is introduced, and a hybrid intelligent system combining neural networks and expert systems for conceptual design is illustrated as well.
Abstract: Quality Function Deployment (QFD) with applied statistics techniques is employed to facilitate the translation of a prioritized set of customer requirements into a set of system-level requirements during conceptual design. Engineering systems have become increasingly complex to design and build, while the demand for quality and effective development at lower cost and in shorter time continues. The aim of this research is to present a neural-network-based approach to the QFD process and prescribe a new methodology to generate a conceptual design baseline. A generalized neural-network-oriented conceptual design process is introduced, and a hybrid intelligent system combining neural networks and expert systems for conceptual design is illustrated as well.

13 citations


Journal ArticleDOI
TL;DR: In this article, a probability model for predicting fatigue failures of aluminum disc wheels is presented, which aims to better link predictions based on simulation results with historical test data and is applied to fatigue prediction for both the dynamic cornering and dynamic radial fatigue tests.
Abstract: Disc wheels intended for normal use on passenger cars have to pass three tests before going into production: the dynamic cornering fatigue test, the dynamic radial fatigue test, and the impact test. This paper describes a probability model for predicting fatigue failures of aluminum disc wheels, which aims to better link predictions based on simulation results with historical test data. Finite element models of 54 aluminum wheels that have already been physically tested are constructed to simulate the dynamic cornering fatigue test. Their mean stresses and stress amplitudes during the fatigue loading cycle are calculated and plotted on a two-dimensional plane. By matching these with historical test data, the failure probability contour can be drawn. For a new wheel, the failure probability for the dynamic cornering fatigue test can be read directly from this probability contour. The test result of the new wheel can then be added to the set of historical test data and the failure probability contour updated. The same procedure is applied to fatigue prediction for the dynamic radial fatigue test, for which only 20 historical test data points are currently available to construct the failure contour. The prediction will become more and more reliable as the number of historical test data increases.

13 citations


Journal ArticleDOI
TL;DR: In this article, the authors used the maximum likelihood estimation (MLE) method to estimate the starting time of a step change disturbance and showed, through a series of simulations, that the proposed approaches are effective for monitoring a variability change.
Abstract: The main function of SPC charts is to monitor the process. Once an out-of-control signal is triggered, a search for the root cause of the process disturbance is initiated. Process quality is regained more easily and quickly when the assignable causes have been eliminated. This study uses the maximum likelihood estimation (MLE) method to estimate the starting time of a step change disturbance. Statistical properties of these estimators are discussed. Through a series of simulations, this study shows the effectiveness of the proposed approaches for monitoring a variability change.
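A minimal sketch of an MLE change-point estimator of the kind described is shown below, assuming normally distributed observations with known mean and a single step change in the variance; the exact estimator and charting context used in the paper may differ.

    import numpy as np

    def mle_variance_change_point(x, mu=0.0):
        """Profile-likelihood MLE of the time of a step change in variance."""
        x = np.asarray(x, dtype=float) - mu
        n = len(x)
        best_tau, best_ll = None, -np.inf
        for tau in range(5, n - 5):                  # avoid very short segments
            s1 = np.mean(x[:tau] ** 2)               # variance MLE before the change
            s2 = np.mean(x[tau:] ** 2)               # variance MLE after the change
            ll = -0.5 * (tau * np.log(s1) + (n - tau) * np.log(s2))
            if ll > best_ll:
                best_tau, best_ll = tau, ll
        return best_tau

    rng = np.random.default_rng(2)
    data = np.concatenate([rng.normal(0, 1, 60), rng.normal(0, 2, 40)])  # true change at 60
    print(mle_variance_change_point(data))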

Journal ArticleDOI
TL;DR: A grey prediction model with a factor analysis technique is proposed to deal with multi-factor series prediction problems; three stages are conducted in this study: factor evaluation is performed in stage one, the transformation of the qualitative measure into the quantitative measure of the grey relational grade is conducted in stage two, and the prediction procedure based on the results of stage one is performed in the third stage.
Abstract: Traditionally, grey theory is widely employed for dealing with short-term prediction problems with scarce data. However, only one factor is considered in the conventional model, whereas most prediction problems involve more than one factor. Therefore, a grey prediction model with a factor analysis technique is proposed in this study to deal with multi-factor series prediction problems. Three stages are conducted in this study. Factor evaluation is performed in stage one. The transformation of the qualitative measure into the quantitative measure of the grey relational grade is conducted in stage two. Finally, the prediction procedure based on the results of stage one is performed in the third stage. Numerical results show that the proposed model outperforms the other existing models.
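The sketch below implements the conventional single-factor GM(1,1) model that the abstract contrasts with; the paper's multi-factor extension via grey relational grades is not reproduced, and the data are illustrative.

    import numpy as np

    def gm11_predict(x0, steps=1):
        """Classical GM(1,1) grey prediction from a short positive series."""
        x0 = np.asarray(x0, dtype=float)
        x1 = np.cumsum(x0)                                  # accumulated generating operation
        z1 = 0.5 * (x1[1:] + x1[:-1])                       # background values
        B = np.column_stack([-z1, np.ones(len(z1))])
        a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]    # developing coefficient, grey input
        k = np.arange(1, len(x0) + steps)
        x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
        x0_hat = np.diff(np.concatenate([[x0[0]], x1_hat])) # restore by differencing
        return x0_hat[-steps:]

    print(gm11_predict([100, 104, 109, 115, 122], steps=2))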

Journal ArticleDOI
TL;DR: This paper attempts to use fuzzy numbers to represent the parameters' values, and then develops a Fuzzy GERT model, and one numerical example based on an existing GERT network has been adopted.
Abstract: GERT networks have been widely applied in different domains. The parameters of GERT networks are usually determined by the decision makers' subjective opinions or experts' suggestions. It implies that the values of parameters in GERT networks are imprecise and uncertain. However, traditional GERT networks cannot reflect the characteristics of real-world network problems. To overcome the above weaknesses, this paper attempts to use fuzzy numbers to represent the parameters' values, and then develops a Fuzzy GERT model. The developed fuzzy GERT model includes three procedures: (1) draw the GERT network based on system analysis, (2) use triangular fuzzy numbers to represent the parameters' values in the GERT network and (3) apply the operations of triangular fuzzy numbers to formulate the Fuzzy GERT model. To validate the proposed fuzzy GERT model, one numerical example based on an existing GERT network has been adopted.
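Procedures (2) and (3) rest on triangular fuzzy number arithmetic; the sketch below shows only the simplest series case (fuzzy transmission probabilities multiply, fuzzy durations add) with illustrative values, using the usual approximate product of positive triangular fuzzy numbers.

    # Triangular fuzzy numbers are written (l, m, u).
    def tfn_add(a, b):
        return tuple(a[i] + b[i] for i in range(3))

    def tfn_mul(a, b):          # approximate product of positive TFNs
        return tuple(a[i] * b[i] for i in range(3))

    # Two arcs in series: (fuzzy transmission probability, fuzzy duration).
    arc1 = ((0.7, 0.8, 0.9), (2.0, 3.0, 4.0))
    arc2 = ((0.5, 0.6, 0.7), (1.0, 2.0, 3.0))

    print("fuzzy path probability:", tfn_mul(arc1[0], arc2[0]))
    print("fuzzy path duration:", tfn_add(arc1[1], arc2[1]))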

Journal ArticleDOI
TL;DR: A method that can extract and classify turning and non-turning features that are concave, convex, or complex for rotational parts taking a 3D data file as input is proposed.
Abstract: Feature extraction and classification is considered as the bridge between Computer-Aided Design (CAD) and Computer-Aided Process Planning (CAPP). This paper proposes a method that can extract and classify turning (including symmetric and non-symmetric) and non-turning features that are concave, convex, or complex for rotational parts taking a 3D data file as input. In addition, feature interactions are also taken into consideration in this methodology. The proposed feature extraction and classification method consists of three basic procedures. The first procedure extracts concave, convex and complex features from a 3D CAD data file. The second procedure classifies the extracted features. The third procedure merges and decomposes extracted features. Two sample application descriptions are presented for demonstration purposes. The system has been implemented in C on a PC-based system.

Journal ArticleDOI
TL;DR: Results of this study showed that symbol- and wording-color and training were significant factors for subjects' visual identification performance on labels for Flammable gases, Non-flammable gases, and Dangerous when wet substances.
Abstract: This study investigated the effects of the symbol- and wording-color (black and white) of three hazardous material labels, surround color (white and brown), and training (before and after training) on subjects' visual identification performance under different ambient illuminances (100 and 800 lx). Results showed that symbol- and wording-color and training were significant factors for subjects' visual identification performance on the labels for Flammable gases (red background), Non-flammable gases (green background), and Dangerous when wet substances (blue background). Subjects' visual identification performance was significantly better when the symbol- and wording-color was black rather than white, and was also significantly better after subjects received label training. Consequently, training and black symbol- and wording-color are recommended when using the hazard labels for Flammable gases, Non-flammable gases, and Dangerous when wet substances to increase individuals' visual identificat...

Journal Article
TL;DR: In this article, a modified fuzzy c means (M-FCM) algorithm along with clustering quality index (ψ) is applied to cluster the characteristic values of low-yield wafers and therefore an optimal solution is obtained.
Abstract: For the past few years, semiconductor manufacturing has emerged as one of the most important industries in Taiwan. The complex manufacturing processes, expensive raw materials and machines, and the near particle-free environment have made semiconductor manufacturing a high-cost industry. If low yield occurs in certain lots, the yield loss results in high manufacturing cost. For this reason, all semiconductor manufacturing companies have devoted huge efforts to analyzing yield-related data in the hope of reducing the occurrence of process variations and achieving the goal of yield enhancement and cost reduction. However, many of the yield enhancement methods used by semiconductor manufacturing companies lack integration and are therefore inefficient. Engineers from different departments usually analyze large amounts of engineering data. They also spend much time tracing possible variations by analyzing defect maps, wafer bin maps, process parameters, machines, and WAT parameters with simple statistical analysis techniques and tools. In this situation, the efficiency and effectiveness of locating the root causes are strongly associated with the experience of the engineers. Based on the above considerations, this research constructs a model to analyze the characteristic values of low-yield wafers in semiconductor manufacturing. First, the characteristic values of low-yield wafers are retrieved. Second, the proposed modified fuzzy c-means (M-FCM) algorithm, along with a clustering quality index (ψ), is applied to cluster the characteristic values of low-yield wafers, and an optimal solution is thereby obtained. With the proposed model, the possible root causes can be identified more easily and decisions can be made to correct manufacturing problems more efficiently.
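For reference, the sketch below is the standard fuzzy c-means iteration applied to synthetic wafer "characteristic values"; the paper's modified M-FCM and its clustering quality index ψ are not reproduced.

    import numpy as np

    def fuzzy_c_means(X, c, m=2.0, iters=100, seed=0):
        """Standard FCM: returns cluster centers and the fuzzy membership matrix."""
        X = np.asarray(X, dtype=float)
        U = np.random.default_rng(seed).random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)               # memberships sum to 1 per sample
        for _ in range(iters):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / (d ** (2 / (m - 1)))
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Two synthetic groups of wafer characteristic values (2 features each).
    X = np.vstack([np.random.default_rng(1).normal(0, 0.3, (20, 2)),
                   np.random.default_rng(2).normal(3, 0.3, (20, 2))])
    centers, U = fuzzy_c_means(X, c=2)
    print(centers)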

Journal ArticleDOI
TL;DR: This study evaluated the performance of Taiwan IC design houses by Data Envelopment Analysis (DEA) and Return on Invested Capital (ROIC) to provide suggestions and a performance-improvement strategy for each company.
Abstract: This study first analyzed the factors behind performance variations among Taiwan IC design houses, and then evaluated the performance of each house by Data Envelopment Analysis (DEA) and Return on Invested Capital (ROIC) to provide suggestions to each company. Through ROIC, a company can decompose its performance into return on sales (ROS) and capital turnover to clarify where its performance efforts lie. DEA helps a company understand how to improve its performance. Integrating ROIC and DEA into a performance measurement procedure, and developing a performance measurement matrix from them, helps a company trace its performance trajectory. Finally, a performance-improvement strategy for Taiwan IC design houses is provided.

Journal ArticleDOI
TL;DR: In this paper, error data collected by observing workers' operation status, instead of experts' judgments, are used as the source information for analyzing human error and developing a safety index.
Abstract: Human error has been recognized as the major cause of industrial accidents. Because of the changing role of humans in the modern manufacturing environment and the lack of understanding of the human cost, the severity of the human error problem is worsened by increasing mental workload and insufficient allocated company resources. To prevent accidents from happening, managers need to recognize potential human error threats and their likelihood of creating loss in daily operations. This study proposes that error data collected by observing workers' operation status, instead of experts' judgments, can be used as the source information for analyzing human error and developing a safety index. The safety index is used as a monitoring tool to control human errors. An industrial case is carried out to illustrate the implementation of this approach. It shows that operators' behaviors can be incorporated into the human error assessment process and that factory managers can gain a closer understanding of their manufacturing risk. The problem of traditional Human Reliability Assessment techniques, which cannot provide managers with daily operational risk information, is thereby addressed.

Journal ArticleDOI
TL;DR: This research models a fuzzy-critical-chain-based project algorithm on the basis of fuzzy theory and presents a quantitative comparison among fuzzy PERT, critical chain, and fuzzy critical chain.
Abstract: The critical chain algorithm, developed by Goldratt, differs from traditional project management methods such as PERT. The main difference between the two approaches is that the former considers the psychological factors of project personnel, whereas the latter does not. Because the critical chain algorithm covers these psychological factors, project management is improved. This research models a fuzzy-critical-chain-based project algorithm on the basis of fuzzy theory. The algorithm is developed for the single-project case. The determination of buffer time in the critical chain is improved in order to obtain reasonable feeding buffers and project buffers. An example is presented for illustration and to enhance practical application. Finally, a quantitative comparison among fuzzy PERT, critical chain, and fuzzy critical chain is made.

Journal ArticleDOI
TL;DR: In this article, a different viewpoint is proposed in comparison with the unit-time-demand hypothesis of the lead-time- and ordering-quantity-based inventory model with reorder point r = μL + Kσ√L proposed by Ben-Daya and Raouf and by Ouyang.
Abstract: Traditionally, the "lead time" in inventory models is regarded as given or as a random variable; in other words, the lead time is not controllable. A different viewpoint is proposed in comparison with the unit-time-demand hypothesis of the lead-time- and ordering-quantity-based inventory model with reorder point r = μL + Kσ√L proposed by Ben-Daya and Raouf and by Ouyang. In this paper we suggest utilizing a negative exponential function for the crash cost, and then constructing an inventory model based on ordering quantities and lead time under the assumption that demand frequency and demand quantity follow Poisson and normal distributions, respectively. The results indicate that, under the reorder point specified in this paper and the same expected demand, the probability of stock insufficiency is lower than that assumed by the above scholars. After simulating the actual ordering case, the variance of the total expected cost (EAC) is relatively small. This finding validates the corr...
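The sketch below evaluates the cited reorder point r = μL + Kσ√L and the implied lead-time stockout probability under normally distributed demand; the numbers are illustrative and the crash-cost model is not reproduced.

    from math import sqrt
    from statistics import NormalDist

    mu, sigma = 20.0, 5.0        # demand mean and standard deviation per unit time
    L, K = 4.0, 1.645            # lead time and safety factor (illustrative)

    r = mu * L + K * sigma * sqrt(L)
    # Lead-time demand ~ Normal(mu*L, sigma*sqrt(L)); a stockout occurs if it exceeds r.
    stockout = 1 - NormalDist(mu * L, sigma * sqrt(L)).cdf(r)
    print(f"reorder point r = {r:.1f}, P(stockout during lead time) = {stockout:.3f}")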

Journal ArticleDOI
TL;DR: The rating system using total-involved-quality-costs analysis, as proposed in this study, could be beneficial for manufacturers in selecting the best suppliers and driving operational quality improvements.
Abstract: This research establishes a cost-effectiveness based performance rating system for suppliers and operations. The purpose is to provide a methodology for integrating supplier and manufacturer capabilities through a common goal, "profitability improvement", based on lowering the cost of purchased materials. The merits of measuring supplier quality performance using Total-Involved-Quality-Costs (TIQC) analysis include: (1) a common measurement language, money; (2) very simple and visible numbers along with direct and indirect loss ratios to help management and employees understand the importance of "doing things right the first time". This study investigated the interactions and mutual movements among the three groups in the supply chain (supplier, manufacturer, customer), and integrated the results from different stages (incoming inspection, internal customers, external customers) and severity levels of quality events (rejects, sorting, rework, shutdown, scrap, and customer returns). A total-involved-quality-costs analysis, along with a pre-determined cost structure and the Management By Objectives (MBO) principle, was developed and utilized in planning and establishing this rating system for supplier performance. The rating system using total-involved-quality-costs analysis, as proposed in this study, could be beneficial for manufacturers in selecting the best suppliers and driving operational quality improvements.

Journal ArticleDOI
TL;DR: A hybrid heuristic procedure is proposed, which includes the Hough transformation, point feature segmentation, Vector Quantization, and a genetic algorithm, to derive the minimal solutions of the energy function.
Abstract: This paper presents a method that converts the line-matching problem in image space into a simple point-matching problem in Hough space. By means of solving this point-matching problem, the correspondence between lines is derived. In this study, line features are first extracted in image space and transformed into Hough space as points. Then, a point-feature based stereo matching problem is formulated and further constructed as an energy function for minimization. We propose a hybrid heuristic procedure, which includes the Hough transformation, point feature segmentation, Vector Quantization, and a genetic algorithm, to derive the minimal solutions of the energy function. Two experiments were conducted to verify the proposed method. Experimental results show that the proposed method effectively and efficiently solves the correspondence problem in Hough space.
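The core idea, mapping a line to a single (ρ, θ) point in Hough space so that line matching becomes point matching, can be sketched as follows; the endpoints are illustrative, and the segmentation, vector quantization, and genetic-algorithm stages are not reproduced.

    from math import atan2, cos, sin, pi

    def line_to_hough(p1, p2):
        """Map the line through p1, p2 to its normal form x*cos(theta) + y*sin(theta) = rho."""
        (x1, y1), (x2, y2) = p1, p2
        theta = atan2(x2 - x1, -(y2 - y1)) % pi      # angle of the line's normal, in [0, pi)
        rho = x1 * cos(theta) + y1 * sin(theta)
        return rho, theta

    left_line  = line_to_hough((10, 20), (40, 80))
    right_line = line_to_hough((12, 20), (42, 80))   # same line seen in the other image
    print(left_line, right_line)                     # two nearby points in Hough space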

Journal ArticleDOI
TL;DR: This work proposes two new simple heuristic algorithms and empirically compares their effectiveness and efficiency with several existing algorithms.
Abstract: We consider the problem of scheduling n jobs on m identical parallel machines. An optimal schedule for the proposed problem is defined as one that gives the smallest makespan (the completion time of the last job on any of the parallel machines) among the set of all schedules with optimal total flowtime (the sum of the completion times of all jobs). We propose two new simple heuristic algorithms and empirically compare their effectiveness and efficiency with several existing algorithms.
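For context on the problem definition, the sketch below builds a total-flowtime-optimal schedule by the classical SPT list-scheduling rule on identical machines and reports its makespan; the paper's new heuristics are not reproduced.

    def spt_schedule(proc_times, m):
        """SPT list scheduling: optimal total flowtime on identical machines."""
        loads = [0.0] * m
        flowtime = 0.0
        for p in sorted(proc_times):                 # shortest processing time first
            i = loads.index(min(loads))              # least-loaded machine
            loads[i] += p
            flowtime += loads[i]                     # completion time of this job
        return flowtime, max(loads)                  # (total flowtime, makespan)

    jobs = [2, 7, 3, 9, 4, 6]                        # illustrative processing times
    print(spt_schedule(jobs, m=2))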

Journal ArticleDOI
TL;DR: In this article, a tabu search heuristic, TSDL, which combines a dynamic tabu tenure with a long-term memory mechanism, is presented to solve the generalized assignment problem (GAP), which determines the maximum profit or minimum cost assignment of n jobs to m agents.
Abstract: The generalized assignment problem (GAP) determines the maximum profit or minimum cost assignment of n jobs to m agents and is a problem embedded in the cell formation problem. In this paper, a tabu search heuristic, TSDL, which combines a dynamic tabu tenure with a long-term memory mechanism, is presented to solve GAPs. A standard set of 84 test problems adopted from the literature is used to evaluate the performance of the proposed algorithm and to compare it with other existing methods. The TSDL finds solutions of good quality very efficiently. The proposed algorithm should thus be useful to practitioners and researchers.
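The move-evaluation core of any tabu search for the GAP is checking the cost and capacity feasibility of a candidate assignment; a sketch with illustrative data follows, while the TSDL tenure and long-term memory scheme itself is not reproduced.

    # cost[i][j]: cost of agent i doing job j; resource[i][j]: capacity consumed.
    cost     = [[4, 7, 3], [6, 2, 5]]
    resource = [[3, 5, 2], [4, 3, 3]]
    capacity = [6, 7]

    def evaluate(assign, cost, resource, capacity):
        """assign[j] = agent handling job j; returns (total cost, feasible?)."""
        total = sum(cost[assign[j]][j] for j in range(len(assign)))
        used = [0] * len(capacity)
        for j, i in enumerate(assign):
            used[i] += resource[i][j]
        return total, all(u <= c for u, c in zip(used, capacity))

    print(evaluate([0, 1, 0], cost, resource, capacity))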

Journal ArticleDOI
TL;DR: In this article, a generalized Bayes-type approach is presented for the construction of a lower interval limit for the process measure Cpm, based on the integration of a gamma distribution.
Abstract: The main focus of this paper is a particular type of process capability index, termed Cpm in the quality control community, which considers the proximity of the process center to the pre-specified target value as well as the inherent process variability when investigating process capability. In order to take into account the probabilistic properties of the point estimator of Cpm, the key element of this study is a generalized Bayes-type approach that allows the construction of a lower interval limit for the process measure Cpm. In comparison with the classical analogue for Cpm (built upon traditional frequentist theory), which entails a non-central chi-square distribution that practitioners are generally unfamiliar with, the Bayesian method relies simply on the integration of a gamma distribution. The lower Bayesian interval estimate of Cpm is also compared with three other lower confidence bounds posed in the literature, and then various experimental...
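The sketch below computes the Cpm point estimate around which the lower interval limit is constructed, Cpm = (USL - LSL) / (6·sqrt(σ² + (μ - T)²)); the data are illustrative and the Bayesian bound itself is not reproduced.

    from math import sqrt

    def cpm(data, lsl, usl, target):
        """Point estimate of Cpm from a sample, specification limits and target."""
        n = len(data)
        mu = sum(data) / n
        var = sum((x - mu) ** 2 for x in data) / n
        return (usl - lsl) / (6 * sqrt(var + (mu - target) ** 2))

    sample = [9.8, 10.1, 10.0, 9.9, 10.2, 10.05, 9.95, 10.1]
    print(cpm(sample, lsl=9.5, usl=10.5, target=10.0))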

Journal ArticleDOI
TL;DR: In this paper, the authors describe how to manage enterprises to integrate business processes, reduce costs, create innovation and collaborate with dispersed divisions, partners, and customers is one of the important objective of our manufacturers' chasing.
Abstract: As various requirements of coming needs for globalizing, internet and customers satisfaction, how to manage enterprises to integrate business processes, reduce costs, create innovation and collaborate with dispersed divisions, partners, and customers is one of the important objective of our manufacturers' chasing. Under this situation, speed is key successful factor for the enterprise competitive. To achieve to take this advantage, enterprises which make design-to-order products must reduce time-to-market of a product and produce those products more efficiently and profitably as well. For running the global logistics and e-enterprise operation, product collaborative design has appeared the important position in the internal and external of the enterprise Owing to international enterprises focus on the kernel ability of product development, they outsource product to overseas manufacturers whom is named contract manufacturers or called Original Equipment Manufacturer (OEM). Hence, this trend of out...

Journal ArticleDOI
TL;DR: In this article, a production planning system for a semiconductor enterprise with multi-site fabs is proposed, where profit achievement of the whole enterprise and quick response mechanism for the due date setting are the major considerations.
Abstract: This paper proposes a production planning system for a semiconductor enterprise with multi-site fabs. Profit achievement for the whole enterprise and a quick-response mechanism for due date setting are the major considerations of the proposed system. Two modules are included in the system. The throughput planning module considers the achievement of the enterprise's profit target, the operating characteristics of each fab site, and the cycle time impact on each level of orders. With this module, the proper production quantity and product mix for each fab are derived so as to achieve the enterprise's profit target. The job-order planning module is then applied for order allocation and due date setting. Rapidly distributing customers' orders to each proper site and responding with reliable due dates to customers are the main functions of this module. A simulation experiment is performed to demonstrate the effectiveness and efficiency of the proposed system.

Journal ArticleDOI
TL;DR: In this paper, a fuzzy multi-attribute decision-making algorithm for evaluating flexibility in a manufacturing system development is presented, where the evaluation problem is formulated as a multidimensional decision making model in a fuzzy environment and solved by a fusion method based on the MEOWA operators.
Abstract: This paper presents a fuzzy multi-attribute decision-making algorithm for evaluating flexibility in manufacturing system development. The evaluation problem is formulated as a multi-attribute decision-making model in a fuzzy environment and solved by a fusion method based on the MEOWA operators. While evaluating the degree of manufacturing flexibility (MF), one may identify the need to improve MF and determine which dimensions of MF offer the best directions for improvement, until an acceptable level is reached. We also show that the higher the combination of importance grade and performance rating, the higher the degree of MF.

Journal ArticleDOI
TL;DR: In this article, two new heuristics were proposed with the use of decision indexes that assign priorities to jobs in the sequence, and a dominance rule was also developed to eliminate nodes whose partially scheduled sequences are dominated.
Abstract: In this study, we considered a single-machine scheduling problem with release times in which the objective is to minimize the total weighted completion time. Two new heuristics were proposed that use decision indexes to assign priorities to jobs in the sequence. The first decision index is based on a rearrangement of the objective function, whereas the second is based on a decomposition procedure that generates a better lower bound. A dominance rule was also developed to eliminate nodes whose partially scheduled sequences are dominated by a simple heuristic developed in this study. Experimental results showed that both heuristics yielded near-optimal solutions in a very short time.
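The sketch below shows a generic decision-index dispatching rule for this setting (start the released job with the largest w_j/p_j whenever the machine is free), which is not the paper's pair of indexes but makes the objective concrete; the instance is illustrative.

    def dispatch(release, proc, weight):
        """Greedy dispatching for 1|r_j|sum w_j C_j using the w/p decision index."""
        n = len(proc)
        unscheduled, t, total = set(range(n)), 0, 0
        while unscheduled:
            ready = [j for j in unscheduled if release[j] <= t]
            if not ready:                                 # idle until the next release
                t = min(release[j] for j in unscheduled)
                continue
            j = max(ready, key=lambda k: weight[k] / proc[k])
            t += proc[j]
            total += weight[j] * t
            unscheduled.remove(j)
        return total

    print(dispatch(release=[0, 1, 3], proc=[4, 2, 3], weight=[1, 3, 2]))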

Journal ArticleDOI
TL;DR: In this article, the authors considered the group replacement problem for an M/M/N production/service system and developed a specific class of m-failure group replacement policy where the repair is started as soon as the number of failed machines reached a predetermined level m. In the repair process, they assume the positive repair time and allow server failures during replacement.
Abstract: In this research we consider the group replacement problem for an M/M/N production/service system. The servers are unreliable, with identically exponentially distributed failure times. The repair cost consists of a fixed cost and a variable cost proportional to the number of repaired machines. In addition, there is a holding cost for each customer in the system per unit of time. We develop a specific class of m-failure group replacement policies in which the repair is started as soon as the number of failed machines reaches a predetermined level m. In the repair process, we assume positive repair times and allow server failures during replacement. Finally, we formulate a matrix-geometric model to perform the steady-state analysis and to obtain the expected average number of customers and the expected average cost. Besides the mathematical analysis, we numerically demonstrate the properties of the optimal policy for various sets of parameter values.