
Showing papers in "Journal of Industrial Engineering, International in 2013"


Journal ArticleDOI
TL;DR: Results show that the ANN that uses the most influential features is able to forecast the daily direction of S&P 500 significantly better than the traditional logit model and indicate that ANN could significantly improve the trading profit as compared with the buy-and-hold strategy.
Abstract: The main objective of this research is to forecast the daily direction of the Standard & Poor's 500 (S&P 500) index using an artificial neural network (ANN). In order to select the most influential features (factors) of the proposed ANN that affect the daily direction of S&P 500 (the response), a design of experiments is conducted to determine the statistically significant factors among 27 potential financial and economic variables along with a feature defined as the number of nodes of the ANN. The results of employing the proposed methodology show that the ANN that uses the most influential features is able to forecast the daily direction of S&P 500 significantly better than the traditional logit model. Furthermore, experimental results of employing the proposed ANN on the trades in a test period indicate that the ANN could significantly improve the trading profit as compared with the buy-and-hold strategy.

152 citations


Journal ArticleDOI
TL;DR: In this paper, an interpretive structural-based model has been presented, and variables have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis.
Abstract: The role of customers in green supply chain management needs to be identified and recognized as an important research area. This paper is an attempt to explore the involvement aspect of customers towards greening of the supply chain (SC). An empirical research approach has been used to collect primary data to rank different variables for effective customer involvement in green concept implementation in SC. An interpretive structural-based model has been presented, and variables have been classified using matrice d'impacts croisés-multiplication appliquée à un classement (MICMAC) analysis. Contextual relationships among variables have been established using experts' opinions. The research may help practicing managers to understand the interaction among variables affecting customer involvement. Further, this understanding may be helpful in framing the policies and strategies to green the SC. Analyzing the interaction among variables for effective customer involvement in greening the SC to develop the structural model in the Indian perspective is an effort towards promoting environmental consciousness.

146 citations


Journal ArticleDOI
TL;DR: In this article, a deterministic inventory model with time-dependent demand and time-varying holding cost where deterioration is time proportional is considered, and the model is solved analytically by minimizing the total inventory cost.
Abstract: In this paper, we consider a deterministic inventory model with time-dependent demand and time-varying holding cost where deterioration is time proportional. The model considered here allows for shortages, and the demand is partially backlogged. The model is solved analytically by minimizing the total inventory cost. The result is illustrated with a numerical example. The model can be applied to optimize the total inventory cost for business enterprises where both the holding cost and deterioration rate are time dependent.

95 citations
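The optimal policy in models of this kind is found by minimizing the average inventory cost per unit time over the cycle length. As a rough, self-contained illustration (not the paper's formulation), the sketch below grid-searches the cycle length for a simplified cost rate with a time-varying holding cost; all parameter values, and the simplified demand and deterioration terms, are hypothetical.

```python
# Sketch: minimize total inventory cost per unit time over the cycle length T.
# Demand, holding, ordering, and deterioration parameters are hypothetical,
# not taken from the paper.
def total_cost_rate(T, a=100.0, b=2.0, h0=0.5, h1=0.1, K=50.0, theta=0.01):
    """Approximate average cost per unit time for cycle length T:
    ordering cost K spread over T, holding cost with time-varying rate
    h0 + h1*t applied to a linearly depleting inventory, plus a crude
    deterioration cost proportional to theta."""
    n = 1000
    dt = T / n
    demand = a + b * (T / 2.0)          # average demand rate over the cycle
    q = demand * T                       # order quantity
    holding = 0.0
    for i in range(n):
        t = (i + 0.5) * dt
        inv = max(q - demand * t, 0.0)   # inventory remaining at time t
        holding += (h0 + h1 * t) * inv * dt
    deterioration = theta * q * T        # crude time-proportional loss
    return (K + holding + deterioration) / T

# Grid search for the cost-minimizing cycle length.
candidates = [0.05 * k for k in range(1, 200)]
T_star = min(candidates, key=total_cost_rate)
```

A closed-form analysis would set the derivative of the cost rate to zero instead; the grid search is used here only to keep the sketch dependency-free.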


Journal ArticleDOI
TL;DR: A fuzzy multi-criteria decision-making (FMCDM) model is presented that integrates both subjective and objective weights for ranking and evaluating service quality in hotels, using a combination of the weights obtained by both approaches.
Abstract: This paper presents a fuzzy multi-criteria decision-making (FMCDM) model that integrates both subjective and objective weights for ranking and evaluating the service quality in hotels. The objective method selects weights of criteria through mathematical calculation, while the subjective method uses the judgments of decision makers. In this paper, we use a combination of the weights obtained by both approaches to evaluate service quality in the hotel industry. A real case study ranking five hotels is presented, and examples are shown to indicate the capabilities of the proposed method.

47 citations
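A common way to obtain the objective weights mentioned above is Shannon entropy over the decision matrix, after which the subjective and objective weights can be blended. The crisp sketch below uses invented hotel scores and decision-maker weights and omits the paper's fuzzy treatment.

```python
import math

# Sketch: combine subjective weights with entropy-based objective weights
# and rank alternatives by a weighted score. Scores and subjective weights
# are hypothetical, not the paper's data.
scores = [  # rows: hotels, columns: criteria (higher is better)
    [7.0, 8.0, 6.0],
    [9.0, 6.0, 7.0],
    [8.0, 7.0, 9.0],
]
w_subj = [0.5, 0.3, 0.2]  # decision makers' judgments (hypothetical)

m, n = len(scores), len(scores[0])
col_sums = [sum(row[j] for row in scores) for j in range(n)]
k = 1.0 / math.log(m)
entropy = []
for j in range(n):
    p = [scores[i][j] / col_sums[j] for i in range(m)]
    entropy.append(-k * sum(pij * math.log(pij) for pij in p))
div = [1.0 - e for e in entropy]             # degree of diversification
w_obj = [d / sum(div) for d in div]          # objective (entropy) weights

# Combined weight: normalized product of subjective and objective weights.
prod = [ws * wo for ws, wo in zip(w_subj, w_obj)]
w = [p / sum(prod) for p in prod]

ranking = sorted(range(m), key=lambda i: -sum(w[j] * scores[i][j] for j in range(n)))
```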


Journal ArticleDOI
TL;DR: In this paper, the authors propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for subfunctional efficiencies, such as teaching quality and research productivity, as well as the overall efficiency.
Abstract: As governmental subsidies to universities have been declining in recent years, sustaining excellence in academic performance and more efficient use of resources have become important issues for university stakeholders. To assess academic performance and resource utilization, two important issues need to be addressed, namely a capable methodology and a set of good performance indicators, both of which we consider in this paper. We propose a set of performance indicators to enable efficiency analysis of academic activities and apply a novel network DEA structure to account for subfunctional efficiencies, such as teaching quality and research productivity, as well as the overall efficiency. We tested our approach on the efficiency analysis of academic colleges at Alzahra University in Iran.

37 citations


Journal ArticleDOI
TL;DR: The fuzzy DEMATEL method was extended into type-2 fuzzy sets in order to obtain the weights of dependent criteria based on the words and the application of the proposed method is presented for knowledge management evaluation criteria.
Abstract: Most decision making methods used to evaluate a system or demonstrate its weak and strong points are based on fuzzy sets and evaluate the criteria with words that are modeled with fuzzy sets. The ambiguity and vagueness of the words and different perceptions of a word are not considered in these methods. For this reason, decision making methods that consider the perceptions of decision makers are desirable. Perceptual computing is a subjective judgment method that considers that words mean different things to different people. This method models words with interval type-2 fuzzy sets that consider the uncertainty of the words. Also, there are interrelations and dependency between the decision making criteria in the real world; therefore, using decision making methods that cannot consider these relations is not feasible in some situations. The Decision-Making Trial and Evaluation Laboratory (DEMATEL) method considers the interrelations between decision making criteria. The current study used the combination of DEMATEL and perceptual computing in order to improve the decision making methods. For this reason, the fuzzy DEMATEL method was extended into type-2 fuzzy sets in order to obtain the weights of dependent criteria based on the words. The application of the proposed method is presented for knowledge management evaluation criteria.

30 citations
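For reference, the crisp core of DEMATEL normalizes a direct-relation matrix N and computes the total-relation matrix T = N(I - N)^(-1), from which prominence (D+R) and relation (D-R) values are read off. The sketch below uses a hypothetical 3x3 influence matrix and omits the paper's interval type-2 fuzzy extension.

```python
# Sketch of crisp DEMATEL: normalize the direct-relation matrix and compute
# the total-relation matrix T = N (I - N)^(-1). The 0-3 influence scores
# below are hypothetical.
def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def mat_inv(A):
    """Gauss-Jordan inverse for a small matrix."""
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for col in range(n):
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        p = M[col][col]
        M[col] = [x / p for x in M[col]]
        for r in range(n):
            if r != col:
                f = M[r][col]
                M[r] = [x - f * y for x, y in zip(M[r], M[col])]
    return [row[n:] for row in M]

D = [[0, 3, 2], [1, 0, 3], [2, 1, 0]]    # direct influence of criterion i on j
s = max(max(sum(row) for row in D),       # DEMATEL normalization factor
        max(sum(D[i][j] for i in range(3)) for j in range(3)))
N = [[x / s for x in row] for row in D]
I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
T = mat_mul(N, mat_inv([[I[i][j] - N[i][j] for j in range(3)] for i in range(3)]))

# Prominence (D+R) per criterion from the row and column sums of T.
row_sum = [sum(T[i][j] for j in range(3)) for i in range(3)]
col_sum = [sum(T[i][j] for i in range(3)) for j in range(3)]
prominence = [r + c for r, c in zip(row_sum, col_sum)]
```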


Journal ArticleDOI
TL;DR: The define-measure-analyze-improve-control (DMAIC) approach is employed; using this approach, the standard deviation is reduced, the values of the process potential and performance capability indices are increased, and the mean values are analyzed with an analysis of variance (ANOVA).
Abstract: Statistical process control (SPC) is an excellent quality assurance tool to improve the quality of manufacture and, ultimately, end-customer satisfaction. SPC uses process monitoring charts to record the key quality characteristics (KQCs) of the component in manufacture. This paper elaborates on one such KQC of the manufacturing of a connecting rod of an internal combustion engine. Here the journey to attain process potential capability index (Cp) and process performance capability index (Cpk) values greater than 1.33 is elaborated by identifying the root cause through quality control tools like the cause-and-effect diagram and examining each cause one after another. In this paper, the define-measure-analyze-improve-control (DMAIC) approach is employed. The definition phase starts with process mapping and identifying the KQC. The next phase is the measurement phase, comprising the cause-and-effect diagram and data collection of KQC measurements. Then follows the analysis phase, where the process potential and performance capability indices are calculated, followed by an analysis of variance (ANOVA) of the mean values. Finally, process monitoring charts are used to control the process and prevent any deviations. By using this DMAIC approach, the standard deviation is reduced from 0.48 to 0.048, while the Cp value is increased from 0.12 to 1.72 and the Cpk value from 0.12 to 1.37.

30 citations
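The capability indices driving the improvement story are straightforward to compute: Cp compares the tolerance width to six process standard deviations, and Cpk additionally penalizes off-center processes. The sketch below uses hypothetical specification limits and process statistics, not the paper's connecting-rod data.

```python
# Sketch of the capability indices used in the analysis phase. Spec limits
# and process statistics are hypothetical.
def cp(usl, lsl, sigma):
    """Process potential capability index."""
    return (usl - lsl) / (6.0 * sigma)

def cpk(usl, lsl, mu, sigma):
    """Process performance capability index (accounts for centering)."""
    return min(usl - mu, mu - lsl) / (3.0 * sigma)

# A near-centered process whose sigma is small relative to the tolerance.
c_p = cp(10.3, 9.7, 0.05)            # tolerance 0.6, sigma 0.05
c_pk = cpk(10.3, 9.7, 10.05, 0.05)   # mean slightly above target
```

With the hypothetical numbers above, c_p is 2.0 and c_pk is about 1.67; both exceed the 1.33 threshold the paper targets.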


Journal ArticleDOI
TL;DR: In this paper, the authors use linear mixed models to account for autocorrelation within observations gathered in phase II of the monitoring process, and develop Hotelling's T2, multivariate exponentially weighted moving average (MEWMA), and multivariate cumulative sum (MCUSUM) control charts to monitor the process.
Abstract: In many circumstances, the quality of a process or product is best characterized by a given mathematical function between a response variable and one or more explanatory variables, typically referred to as a profile. There have been some investigations into monitoring auto-correlated linear and nonlinear profiles in recent years. In the present paper, we use linear mixed models to account for autocorrelation within observations gathered in phase II of the monitoring process. We assume that the structure of correlated linear profiles simultaneously has both random and fixed effects. We develop a Hotelling's T2 statistic, a multivariate exponentially weighted moving average (MEWMA) control chart, and a multivariate cumulative sum (MCUSUM) control chart to monitor the process. We also compare their performances in terms of the average run length criterion and show that the proposed control chart schemes can effectively detect shifts in process parameters. Finally, the results are applied to a real case study in an agricultural field.

27 citations
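For a single multivariate observation, Hotelling's T2 statistic is the quadratic form (x - mu)' S^(-1) (x - mu); large values signal an out-of-control point. The bivariate sketch below uses hypothetical in-control parameters and omits the MEWMA/MCUSUM variants and the mixed-model profile structure.

```python
# Sketch: Hotelling's T^2 statistic for one bivariate observation,
# T^2 = (x - mu)' S^(-1) (x - mu), with hypothetical in-control parameters.
def hotelling_t2(x, mu, S):
    d = [x[0] - mu[0], x[1] - mu[1]]
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    inv = [[S[1][1] / det, -S[0][1] / det],
           [-S[1][0] / det, S[0][0] / det]]
    # Quadratic form d' * inv * d.
    return sum(d[i] * inv[i][j] * d[j] for i in range(2) for j in range(2))

mu = [0.0, 0.0]
S = [[1.0, 0.3], [0.3, 1.0]]               # in-control covariance (hypothetical)
t2_in = hotelling_t2([0.1, -0.2], mu, S)   # a typical in-control point
t2_out = hotelling_t2([3.0, 3.0], mu, S)   # a strongly shifted observation
```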


Journal ArticleDOI
TL;DR: In this article, the process parameters affecting shock absorber quality are identified, namely welding (squeeze, heat control, wheel speed, and air pressure), damper sealing (load, hydraulic pressure, air pressure, and fixture height), washing (total alkalinity, temperature, pH value of rinsing water, and timing), and painting (flowability, coating thickness, pointage, and temperature) parameters; the washing and painting parameters are optimized by the Taguchi method, and, in order to achieve zero defects during the processes, a genetic algorithm is applied to the optimized parameters obtained by the Taguchi method.
Abstract: The various process parameters affecting the quality characteristics of the shock absorber during the process were identified using the Ishikawa diagram and failure mode and effect analysis. The identified process parameters are welding process parameters (squeeze, heat control, wheel speed, and air pressure), damper sealing process parameters (load, hydraulic pressure, air pressure, and fixture height), washing process parameters (total alkalinity, temperature, pH value of rinsing water, and timing), and painting process parameters (flowability, coating thickness, pointage, and temperature). In this paper, the painting and washing process parameters are optimized by the Taguchi method. Though the defects are reasonably minimized by the Taguchi method, a genetic algorithm is applied to the optimized parameters obtained by the Taguchi method in order to achieve zero defects during the processes.

26 citations
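Taguchi analysis compares parameter settings through signal-to-noise ratios; for defect-type responses the smaller-the-better form S/N = -10 log10(mean of y^2) is used, and the setting with the larger S/N is preferred. The replicate measurements below are invented for illustration.

```python
import math

# Sketch: Taguchi smaller-the-better signal-to-noise ratio for comparing
# two hypothetical parameter settings.
def sn_smaller_the_better(ys):
    """S/N = -10 log10(mean of y^2); larger S/N means smaller responses."""
    return -10.0 * math.log10(sum(y * y for y in ys) / len(ys))

trial_a = [0.12, 0.10, 0.11]   # defect measurements under setting A
trial_b = [0.30, 0.28, 0.33]   # defect measurements under setting B
better = 'A' if sn_smaller_the_better(trial_a) > sn_smaller_the_better(trial_b) else 'B'
```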


Journal ArticleDOI
TL;DR: A robust decision support tool for detailed production planning based on statistical multivariate methods, including principal component analysis and logistic regression, is proposed and has been used in a real case in the Iranian automotive industry.
Abstract: Production planning and control (PPC) systems have to deal with rising complexity and dynamics. The complexity of planning tasks is due to multiple existing variables and dynamic factors derived from uncertainties surrounding the PPC. Although the literature on exact scheduling algorithms, simulation approaches, and heuristic methods in production planning is extensive, these methods seem to be inefficient because of daily fluctuations in real factories. Decision support systems can provide productive tools for production planners to offer feasible and prompt decisions for effective and robust production planning. In this paper, we propose a robust decision support tool for detailed production planning based on statistical multivariate methods, including principal component analysis and logistic regression. The proposed approach has been used in a real case in the Iranian automotive industry. In the presence of existing multisource uncertainties, the results of applying the proposed method in the selected case show that the accuracy of daily production planning increases in comparison with the existing method.

24 citations


Journal ArticleDOI
TL;DR: In this article, a bi-objective stochastic mixed-integer nonlinear programming model is proposed to solve the single-sourcing network design problem for a three-level supply chain, which considers risk-pooling, the existence of inventory at distribution centers (DCs) under demand uncertainty, the existence of several alternatives to transport the product between facilities, and routing of vehicles from distribution centers to customers.
Abstract: This paper considers a single-sourcing network design problem for a three-level supply chain. For the first time, a novel mathematical model is presented that simultaneously considers risk-pooling, the existence of inventory at distribution centers (DCs) under demand uncertainty, the existence of several alternatives to transport the product between facilities, and routing of vehicles from distribution centers to customers in a stochastic supply chain system. This problem is formulated as a bi-objective stochastic mixed-integer nonlinear programming model. The aim of this model is to determine the number of distribution centers to open, their locations and capacity levels, and the allocation of customers to distribution centers and of distribution centers to suppliers. It also determines the inventory control decisions on the amount of ordered products and the amount of safety stock at each opened DC, and selects a type of vehicle for transportation. Moreover, it determines routing decisions, such as the determination of vehicle routes starting from an opened distribution center to serve its allocated customers and returning to that distribution center. All are done in a way that the total system cost and the total transportation time are minimized. The Lingo software is used to solve the presented model, and the computational results are illustrated in this paper.

Journal ArticleDOI
TL;DR: This study demonstrates that the developed procedure is a useful optimization approach that can be applied in practice to the injection molding process.
Abstract: Product quality for the plastic injection molding process is highly related to the settings of its process parameters. Additionally, the product quality is not simply based on a single quality index, but on multiple interrelated quality indices. Finding the settings of the process parameters such that the multiple quality indices are simultaneously optimized is becoming a research issue and is now known as finding the efficient frontier of the process parameters. This study considers three quality indices in plastic injection molding: warpage, shrinkage, and volumetric shrinkage at ejection. A digital camera thin cover is taken as an investigation example to show the method of finding the efficient frontier. SolidWorks and Moldflow are utilized to create the part's geometry and to simulate the injection molding process, respectively. Nine process parameters are considered in this research: injection time, injection pressure, packing time, packing pressure, cooling time, cooling temperature, mold open time, melt temperature, and mold temperature. Taguchi's orthogonal array L27 is applied to run the experiments, and analysis of variance is then used to find the significant process factors at a significance level of 0.05. In the example case, four process factors are found significant. The four significant factors are further used to generate 3^4 experiments by complete experimental design. Each of the experiments is run in Moldflow. The collected experimental data with three quality indices and four process factors are further used to generate three multiple regression equations, one for each quality index. Then, the three multiple regression equations are applied to generate 1,225 theoretical datasets. Finally, data envelopment analysis is adopted to find the efficient frontier of the 1,225 theoretical datasets. The datasets found on the efficient frontier have optimal quality.
The process parameters on the efficient frontier are further validated by Moldflow. This study demonstrates that the developed procedure is a useful optimization approach that can be applied in practice to the injection molding process.

Journal ArticleDOI
TL;DR: A new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for processes where the controllable factors are smaller-the-better (STB)-type variables and the analyzer desires an optimal solution with smaller amounts of the controllable factors.
Abstract: In this paper, a new method is proposed to optimize a multi-response optimization problem based on the Taguchi method for processes where the controllable factors are smaller-the-better (STB)-type variables and the analyzer desires an optimal solution with smaller amounts of the controllable factors. In such processes, the overall output quality of the product should be maximized while the usage of the process inputs, the controllable factors, should be minimized. Since all possible combinations of factor levels are not considered in the Taguchi method, the response values of the possible unpracticed treatments are estimated using an artificial neural network (ANN). The neural network is tuned by the central composite design (CCD) and the genetic algorithm (GA). Then data envelopment analysis (DEA) is applied to determine the efficiency of each treatment. Although the important issue for implementation of DEA is its philosophy, which is maximization of outputs versus minimization of inputs, this issue has been neglected in previous similar studies on multi-response problems. Finally, the most efficient treatment is determined using the maximin weight model approach. The performance of the proposed method is verified in a plastic molding process. Moreover, a sensitivity analysis is performed with an efficiency-estimator neural network. The results show the efficiency of the proposed approach.

Journal ArticleDOI
TL;DR: A new approach is developed to design a multi-echelon, multi-facility, and multi-product supply chain in fuzzy environment and a non-linear programming model is formulated through fuzzy goal programming using minimum operator in the tactical level.
Abstract: Nowadays, customer expectations are increasing and organizations are prone to operate in an uncertain environment. Under this uncertain environment, the ultimate success of the firm depends on its ability to integrate business processes among supply chain partners. Supply chain management emphasizes cross-functional links to improve the competitive strategy of organizations. Now, companies are moving from decoupled decision processes towards more integrated design and control of their components to achieve the strategic fit. In this paper, a new approach is developed to design a multi-echelon, multi-facility, and multi-product supply chain in a fuzzy environment. At the strategic level, a mixed-integer programming problem is formulated through fuzzy goal programming, with supply chain cost and volume flexibility as fuzzy goals; these fuzzy goals are aggregated using the minimum operator. At the tactical level, a continuous review policy for controlling raw material inventories at the supplier echelon and finished-product inventories at the plant and distribution center echelons is considered as a set of fuzzy goals, and a non-linear programming model is formulated through fuzzy goal programming using the minimum operator. The proposed approach is illustrated with a numerical example.

Journal ArticleDOI
TL;DR: This study is concerned with threshold F-policy and N-policy for controlling the arrivals and service in the queueing scenario of a machining system, having active and redundant components.
Abstract: This study is concerned with threshold F-policy and N-policy for controlling the arrivals and service in the queueing scenario of a machining system, having active and redundant components. For both F-policy and N-policy models, the queue size distributions are determined by the recursive method. Various performance measures, namely the average number of failed units in the system, probability that the server is busy or idle in the system, etc., are established using the queue size distribution.
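The recursive method referred to above builds the queue-size distribution from birth-death balance equations. The sketch below does this for a basic finite-source machine-repair model (M active machines, one repairman) with hypothetical rates; the paper's F-policy and N-policy thresholds and redundant components are omitted.

```python
# Sketch: recursive queue-size distribution for a basic finite-source
# machine-repair model: M machines failing at rate lam each, one repairman
# working at rate mu. Parameter values are hypothetical.
def queue_size_distribution(M, lam, mu):
    # Unnormalized recursion: p[n] = p[n-1] * (M - n + 1) * lam / mu.
    p = [1.0]
    for n in range(1, M + 1):
        p.append(p[-1] * (M - n + 1) * lam / mu)
    total = sum(p)
    return [x / total for x in p]

probs = queue_size_distribution(M=5, lam=0.1, mu=1.0)
avg_failed = sum(n * pn for n, pn in enumerate(probs))  # mean failed units
prob_idle = probs[0]                                    # server idle probability
```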

Journal ArticleDOI
TL;DR: This paper presents a multi-objective mixed-integer nonlinear programming model to design a group layout of a cellular manufacturing system in a dynamic environment, in which the number of cells to be formed is variable.
Abstract: This paper presents a multi-objective mixed-integer nonlinear programming model to design a group layout of a cellular manufacturing system in a dynamic environment, in which the number of cells to be formed is variable. Cell formation (CF) and group layout (GL) are concurrently made in a dynamic environment by the integrated model, which incorporates extensive coverage of important manufacturing features used in the design of CMSs. Additionally, there are some features that make the presented model different from the previous studies. These features include the following: (1) the variable number of cells, (2) the integrated CF and GL decisions in a dynamic environment by a multi-objective mathematical model, and (3) two conflicting objectives that minimize the total costs (i.e., the costs of intra- and inter-cell material handling, machine relocation, purchasing new machines, machine overhead, machine processing, and forming cells) and minimize the imbalance of workload among cells. Furthermore, the presented model considers some limitations, such as machine capability, machine capacity, part demand satisfaction, cell size, material flow conservation, and location assignment. Four numerical examples are solved by the GAMS software to illustrate the promising results obtained by the incorporated features.

Journal ArticleDOI
TL;DR: The development of a model-based decision support system is presented with a case study on solving the supplier selection problem in a chemical processing industry, and an appropriate platform for process industries in selecting suppliers is proposed.
Abstract: This paper presents the development of a model-based decision support system with a case study on solving the supplier selection problem in a chemical processing industry. For the evaluation and selection of suppliers, the analytical hierarchy process (AHP) and grey relational analysis (GRA) were used. The intention of the study is to propose an appropriate platform for process industries in selecting suppliers, which was tested with an electroplating industry during the course of development. A sensitivity analysis was performed in order to improve the robustness of the results with regard to the relative importance of the evaluation criteria and the parameters of the evaluation process. Finally, a practical implementation study was carried out to reveal the procedure of the proposed system and identify the most suitable supplier, with detailed discussions about the benefits and limitations.
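Grey relational analysis itself is a short computation: normalize the decision matrix, measure each alternative's deviation from the ideal sequence, convert deviations to grey relational coefficients, and aggregate them with the criteria weights (which AHP would supply). The supplier scores and weights below are invented for illustration.

```python
# Sketch of grey relational analysis (GRA) for supplier ranking. Scores and
# weights are hypothetical; in the paper, AHP supplies the criteria weights.
def gra_grades(matrix, weights, zeta=0.5):
    m, n = len(matrix), len(matrix[0])
    # Normalize each criterion to [0, 1] (all treated as benefit criteria).
    norm = []
    for i in range(m):
        row = []
        for j in range(n):
            col = [matrix[k][j] for k in range(m)]
            lo, hi = min(col), max(col)
            row.append((matrix[i][j] - lo) / (hi - lo) if hi > lo else 1.0)
        norm.append(row)
    # Deviation from the ideal (reference) sequence of all ones.
    dev = [[1.0 - x for x in row] for row in norm]
    dmax = max(max(row) for row in dev)
    dmin = min(min(row) for row in dev)
    # Grey relational coefficient and weighted grade per supplier.
    grades = []
    for row in dev:
        coeffs = [(dmin + zeta * dmax) / (d + zeta * dmax) for d in row]
        grades.append(sum(w * c for w, c in zip(weights, coeffs)))
    return grades

suppliers = [[80, 7, 90], [95, 6, 85], [70, 9, 95]]   # benefit-type scores
grades = gra_grades(suppliers, weights=[0.4, 0.3, 0.3])
best = max(range(3), key=lambda i: grades[i])
```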

Journal ArticleDOI
TL;DR: In this article, a new control chart to monitor multi-binomial processes is first proposed based on a transformation method, and the maximum likelihood estimators of change points designed for both step changes and linear-trend disturbances are derived.
Abstract: In this paper, a new control chart to monitor multi-binomial processes is first proposed based on a transformation method. Then, the maximum likelihood estimators of change points designed for both step changes and linear-trend disturbances are derived. At the end, the performances of the proposed change-point estimators are evaluated and compared using some Monte Carlo simulation experiments, considering that the real change present in a process is either a step change or a linear-trend disturbance. According to the results obtained, the change-point estimator designed for step changes outperforms the change-point estimator designed for linear-trend disturbances when the real change type is a step change. In contrast, the change-point estimator designed for linear-trend disturbances outperforms the change-point estimator designed for step changes when the real change type is a linear-trend disturbance.
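The step-change estimator has a simple form: for each candidate change point, the likelihood of the data is evaluated with the known in-control rate before the candidate and the post-change sample mean after it, and the maximizer is reported. The sketch below applies this to a synthetic Bernoulli (conforming/nonconforming) stream rather than the paper's transformed multi-binomial statistic.

```python
import math

# Sketch: maximum likelihood estimation of a step-change point in a
# Bernoulli defect stream with known in-control rate p0. Data are synthetic.
def step_change_mle(x, p0):
    """Return the tau maximizing the likelihood that observations before
    index tau follow rate p0 and observations from tau onward follow an
    unknown rate p1 (estimated by the post-change sample mean)."""
    def bern_ll(xs, p):
        eps = 1e-12
        p = min(max(p, eps), 1.0 - eps)
        return sum(xi * math.log(p) + (1 - xi) * math.log(1 - p) for xi in xs)
    T = len(x)
    best_tau, best_ll = 0, -float('inf')
    for tau in range(T):
        tail = x[tau:]
        p1 = sum(tail) / len(tail)
        ll = bern_ll(x[:tau], p0) + bern_ll(tail, p1)
        if ll > best_ll:
            best_tau, best_ll = tau, ll
    return best_tau

# Defect rate jumps from about 0.1 to about 0.7 starting at observation 10.
stream = [0, 0, 0, 1, 0, 0, 0, 0, 0, 0] + [1, 1, 0, 1, 1, 1, 0, 1, 1, 1]
tau_hat = step_change_mle(stream, p0=0.1)
```

Here the estimator recovers the true change point, observation 10.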

Journal ArticleDOI
TL;DR: In this article, an iterative weighting method is used to modify both the outliers and the residuals that follow abnormal trends in variation, like descending or ascending trends, so they will have less effect on the coefficient estimation.
Abstract: In this paper, the main idea is to compute a robust regression model, derived by experimentation, in order to achieve a model with minimum effects of outliers and fixed variation among different experimental runs. Both outliers and nonequality of residual variation can affect the response surface parameter estimation. The common way to estimate the regression model coefficients is the ordinary least squares method; the weakness of this method is its sensitivity to outliers and specific residual behavior, so we pursue a modified robust method to solve this problem. Many papers have proposed different robust methods to decrease the effect of outliers, but trends in residual behavior pose another important issue that should be taken into account. Trends in the residuals can cause faulty estimations and thus faulty future decisions and outcomes. In this paper, therefore, an iterative weighting method is used to down-weight both the outliers and the residuals that follow abnormal trends in variation, such as descending or ascending trends, so that they have less effect on the coefficient estimation. Finally, a numerical example illustrates the proposed approach.
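Iterative weighting of this kind is usually implemented as iteratively reweighted least squares: fit, compute scaled residuals, down-weight large ones, and refit. The sketch below uses Huber's weight function on a simple linear fit with one gross outlier; the paper's modified scheme, which also targets trends in the residuals, is not reproduced here.

```python
# Sketch: iteratively reweighted least squares for a simple linear fit,
# down-weighting large residuals with Huber's weight function. The data and
# the choice of Huber weights are illustrative only.
def irls_line(xs, ys, c=1.345, iters=20):
    n = len(xs)
    w = [1.0] * n
    b0 = b1 = 0.0
    for _ in range(iters):
        # Weighted least squares for y = b0 + b1 * x.
        sw = sum(w)
        mx = sum(wi * xi for wi, xi in zip(w, xs)) / sw
        my = sum(wi * yi for wi, yi in zip(w, ys)) / sw
        sxx = sum(wi * (xi - mx) ** 2 for wi, xi in zip(w, xs))
        sxy = sum(wi * (xi - mx) * (yi - my) for wi, xi, yi in zip(w, xs, ys))
        b1 = sxy / sxx
        b0 = my - b1 * mx
        # Update Huber weights from residuals scaled by a MAD-type estimate.
        res = [yi - (b0 + b1 * xi) for xi, yi in zip(xs, ys)]
        scale = sorted(abs(r) for r in res)[n // 2] / 0.6745 or 1.0
        w = [1.0 if abs(r / scale) <= c else c / abs(r / scale) for r in res]
    return b0, b1

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0, 30.0, 16.1]   # one gross outlier at x=7
b0, b1 = irls_line(xs, ys)
```

Despite the outlier at x = 7, the recovered slope stays close to the underlying value of about 2, whereas ordinary least squares would be pulled noticeably upward.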

Journal ArticleDOI
TL;DR: In this paper, a multi-objective pricing-inventory model for a retailer is developed, where the retailer's profit and the service level are the objectives, and shortage is allowed.
Abstract: The integration of marketing and demand with logistics and inventories (supply side of companies) may cause multiple improvements; it can revolutionize the management of the revenue of rental companies, hotels, and airlines. In this paper, we develop a multi-objective pricing-inventory model for a retailer. Maximizing the retailer's profit and the service level are the objectives, and shortage is allowed. We present the model under stochastic lead time with uniform and exponential distributions. Since pricing is important and influences demand, the demand is considered as a general function of price. The multiple-objective optimization model is solved using the weighting method as well as the L-P metric method. Concerning the properties of a nonlinear model, a genetic algorithm is taken into account to find the optimal solutions for the selling price, lot size, and reorder point. Finally, numerical examples with sensitivity analysis regarding key parameters are provided.
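The two scalarizations named above are easy to state: the weighting method maximizes a weighted sum of the (scaled) objectives, while the L-p metric method minimizes the distance to the ideal point. The candidate settings and objective values below are invented for illustration.

```python
# Sketch: scalarizing two objectives (scaled profit and service level).
# The candidate settings and their objective values are hypothetical.
def weighted_sum(f1, f2, w1=0.5, w2=0.5):
    # Weighting method: larger is better (objectives pre-scaled to [0, 1]).
    return w1 * f1 + w2 * f2

def lp_metric(f1, f2, ideal=(1.0, 1.0), p=2):
    # L-p metric method: distance from the ideal point, smaller is better.
    return (abs(ideal[0] - f1) ** p + abs(ideal[1] - f2) ** p) ** (1.0 / p)

# Scaled (profit, service level) for three candidate price/lot-size settings.
candidates = {'A': (0.9, 0.55), 'B': (0.7, 0.8), 'C': (0.5, 0.95)}
best_ws = max(candidates, key=lambda k: weighted_sum(*candidates[k]))
best_lp = min(candidates, key=lambda k: lp_metric(*candidates[k]))
```

With these numbers both scalarizations select the balanced candidate B, though in general the two methods can disagree.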

Journal ArticleDOI
TL;DR: In this paper, a two-echelon model is presented to control the inventory of perishable goods in a supply chain; the model is based on real conditions and data, and constraints such as production time, storage capacity, inventory level, transportation methods, and sustainability time are considered.
Abstract: In this research, a new two-echelon model is presented to control the inventory of perishable goods. The model operates within a supply chain and is based on real conditions and data. Its main purpose is to minimize the maintenance cost of the entire chain. If the goods perish before reaching the customer (i.e., the expiration date has passed), this cost is added to the other costs, such as transportation, production, and maintenance costs, in the objective function. To reflect real conditions, limitations such as production time, storage capacity, inventory level, transportation methods, and sustainability time are considered in the model. Due to the complexity of the model, the solution approach is based on a genetic algorithm implemented in MATLAB, and tuning the algorithm's parameters allows the optimum point to be reached. Using real data from a food production facility, the model was applied with this approach, and the obtained results confirm the accuracy of the model's performance.

Journal ArticleDOI
TL;DR: The research presents a promising concept for calculating the risk of production systems, and its results show that under uncertainty, or in the case of a lack of knowledge, the selection of an appropriate method facilitates the decision-making process.
Abstract: Risk analysis of a production system performed without actual and appropriate data leads to wrong predictions of system parameters and wrong decision making. Under uncertainty, there are no appropriate measures for decision making, and under epistemic uncertainty we are confronted with a lack of data. Therefore, in calculating system risk, we encounter vagueness and must use methods that are more efficient in decision making. In this research, using the Dempster-Shafer method and a risk assessment diagram, the researchers have developed a better method of calculating tool failure risk. Traditional statistical methods for recognizing and evaluating systems are not always appropriate, especially when enough data is not available. The goal of this research was to present a more modern and applicable method for real-world organizations. The findings of this research were applied in a case study, and an appropriate framework and constraints for tool risk were provided. The research presents a promising concept for calculating the risk of production systems, and its results show that under uncertainty, or in the case of a lack of knowledge, the selection of an appropriate method facilitates the decision-making process.
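Dempster-Shafer evidence is combined with Dempster's rule: products of masses whose focal sets intersect are accumulated on the intersection, and the total is renormalized by one minus the conflict. The sketch below combines two hypothetical bodies of evidence about tool failure.

```python
# Sketch of Dempster's rule of combination over the frame {fail, ok}.
# The mass assignments are hypothetical.
def combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset) with
    Dempster's rule, renormalizing by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

FAIL, OK = frozenset({'fail'}), frozenset({'ok'})
BOTH = FAIL | OK                        # total ignorance
m1 = {FAIL: 0.6, BOTH: 0.4}             # sensor evidence of tool failure
m2 = {FAIL: 0.3, OK: 0.3, BOTH: 0.4}    # expert judgment
m = combine(m1, m2)
```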

Journal ArticleDOI
TL;DR: In this paper, the authors design a framework using quality function deployment (QFD), measure the service quality of the electricity utility sector with an artificial neural network (ANN), and identify the interrelationships among the design requirements using interpretive structural modeling (ISM).
Abstract: Competition in the electric service industry is highlighting the importance of a number of issues affecting the nature and quality of customer service. The quality of service(s) provided to electricity customers may be enhanced by competition, if doing so offers service suppliers a competitive advantage. On the other hand, service quality offered to some consumers could decline if utilities focus their attention on those customers most likely to exercise choice, while reducing effort and investment to serve customers less likely to choose alternatives. Service quality is defined as the way in which the utility interacts with and responds to the needs of its customers. To achieve maximum consumer satisfaction in electricity service, this paper designs a framework using quality function deployment (QFD), measures the service quality of the electricity utility sector with an artificial neural network (ANN), and identifies the interrelationships among the design requirements using interpretive structural modeling (ISM).

Journal ArticleDOI
TL;DR: In this paper, a robust regression approach based on M-estimator methods is proposed for multi-response problems, where the authors present a coincident outlier index (COI) criterion while considering a realistic number of outliers.
Abstract: A robust approach should be considered when estimating regression coefficients in multi-response problems. Many models are derived from the least squares method; however, because the presence of outliers is unavoidable in most real cases and the least squares method is sensitive to such points, robust regression appears to be a more reliable and suitable approach to this problem. Additionally, in many problems more than one response must be analyzed, so multi-response problems have broader application. The robust regression approach used in this paper is based on M-estimator methods, and one of the most widely used weighting functions in M-estimation is Huber’s function. In multi-response surfaces, estimating each response individually can cause problems in later inference because of separate outlier detection schemes. To address this obstacle, a simultaneous independent multi-response iterative reweighting (SIMIR) approach is suggested. The performance of the proposed method is illustrated by presenting a coincident outlier index (COI) criterion while considering a realistic number of outliers in a multi-response problem. Two well-known cases from the literature are presented as numerical examples. The results show that the proposed approach performs better than classic estimation, and the proposed index confirms its efficiency.
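
The M-estimation loop behind Huber weighting can be sketched as iteratively reweighted least squares (IRLS). This is a generic single-response illustration, not the paper's SIMIR scheme; the tuning constant c = 1.345 is the standard choice giving 95% efficiency under normal errors:

```python
import numpy as np

def huber_irls(X, y, c=1.345, tol=1e-8, max_iter=100):
    """Robust regression via IRLS with Huber's weight function:
    w(u) = 1 for |u| <= c, and c/|u| otherwise."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]           # OLS starting point
    for _ in range(max_iter):
        r = y - X @ beta
        s = np.median(np.abs(r - np.median(r))) / 0.6745  # MAD scale estimate
        s = max(s, 1e-12)                                 # guard a perfect fit
        u = np.abs(r / s)
        w = np.where(u <= c, 1.0, c / np.maximum(u, 1e-12))
        Xw = X * w[:, None]                               # row-weighted design
        beta_new = np.linalg.solve(X.T @ Xw, Xw.T @ y)    # weighted normal eqs.
        if np.max(np.abs(beta_new - beta)) < tol:
            return beta_new
        beta = beta_new
    return beta
```

On a line y = 1 + 2x with one gross outlier, this fit stays near the true coefficients while OLS is pulled far off, which is the sensitivity the abstract describes.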

Journal ArticleDOI
TL;DR: It is shown that the total profit of the SC is significantly enhanced using the developed continuous-review (S-1, S) model, for which a particle swarm optimization algorithm is adopted.
Abstract: In this paper, we apply a continuous-review (S-1, S) policy for inventory control in a three-echelon supply chain (SC) comprising r identical retailers, a central warehouse with limited storage space, and two independent manufacturing plants that offer two kinds of product to customers. The warehouse follows an (M/M/1) queue model in which customer demands arrive according to a Poisson process and customer serving times are exponentially distributed. To evaluate the effect of modelling the two products jointly, the solution of the developed model is compared with that of two (M/M/1) queue models developed separately for each product. Moreover, to cope with the computational complexity of the developed model, a particle swarm optimization algorithm is adopted. The numerical experiments show that the total profit of the SC is significantly enhanced using the developed model.
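
Under a continuous-review (S-1, S) base-stock policy with Poisson demand, a standard back-of-envelope service measure is the probability that lead-time demand stays below the base-stock level. A sketch under that textbook assumption — the rates below are hypothetical, and this is not the paper's full bi-product queueing model:

```python
import math

def ready_rate(S, lam, L):
    """P(lead-time demand < S): Poisson demand at rate lam per period,
    replenishment lead time L periods, base-stock level S."""
    mean_lt_demand = lam * L
    return sum(math.exp(-mean_lt_demand) * mean_lt_demand ** k / math.factorial(k)
               for k in range(S))

def smallest_base_stock(lam, L, target=0.95):
    """Smallest S whose ready rate meets the service-level target."""
    S = 0
    while ready_rate(S, lam, L) < target:
        S += 1
    return S
```

For example, with demand rate 2 per period and a one-period lead time, ready_rate(3, 2, 1) equals 5e⁻² ≈ 0.677, so a noticeably larger S is needed for a 95% service level.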

Journal ArticleDOI
TL;DR: The results indicate a significant increase in sales when the optimization approach is applied: the longer warranty raises sales revenue and, rather than eroding profit margins, increases them.
Abstract: A warranty is now an integral part of every product. Since its length is directly related to the cost of production, it should be set so as to maximize both revenue generation and customer satisfaction. Furthermore, based on customer behavior, it is assumed that lengthening the warranty period earns the trust of more customers and leads to more sales until the market is saturated. We should bear in mind that different groups of consumers have different consumption behaviors and that the performance of the product has a direct impact on the failure rate over its life, so the optimal duration differs from group to group. Since, in practice, we cannot offer different warranty periods to different customer groups, we use the cuckoo meta-heuristic optimization algorithm to find a single period for the entire population. The algorithm converges rapidly to a term length that maximizes the aforementioned goals simultaneously. The approach was tested using real data from an appliance company. The results indicate a significant increase in sales when the optimization approach is applied: the longer warranty raises sales revenue and, rather than eroding profit margins, increases them.
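
A minimal version of cuckoo search can be sketched for a one-dimensional decision such as the warranty length. The profit function, bounds, and parameters below are hypothetical placeholders, not the company data from the study:

```python
import math
import numpy as np

def cuckoo_search(f, lo, hi, n_nests=15, pa=0.25, n_iter=200, seed=0):
    """Minimal 1-D cuckoo search (maximisation) with Mantegna Levy flights."""
    rng = np.random.default_rng(seed)
    beta = 1.5
    sigma = (math.gamma(1 + beta) * math.sin(math.pi * beta / 2)
             / (math.gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    nests = rng.uniform(lo, hi, n_nests)
    fit = np.array([f(x) for x in nests])
    for _ in range(n_iter):
        best = nests[np.argmax(fit)]
        # Levy-flight moves scaled by the distance to the current best nest
        step = rng.normal(0, sigma, n_nests) / np.abs(rng.normal(0, 1, n_nests)) ** (1 / beta)
        trial = np.clip(nests + 0.01 * step * (nests - best), lo, hi)
        trial_fit = np.array([f(x) for x in trial])
        improved = trial_fit > fit
        nests[improved], fit[improved] = trial[improved], trial_fit[improved]
        # abandon a fraction pa of the worst nests and rebuild them at random
        n_drop = max(1, int(pa * n_nests))
        worst = np.argsort(fit)[:n_drop]
        nests[worst] = rng.uniform(lo, hi, n_drop)
        fit[worst] = np.array([f(x) for x in nests[worst]])
    i = np.argmax(fit)
    return nests[i], fit[i]

# Hypothetical concave profit curve in the warranty length w (months),
# peaking at w = 24; the real objective would come from sales and failure data.
w_opt, profit = cuckoo_search(lambda w: -(w - 24.0) ** 2, 0.0, 60.0)
```

The abandonment step is what keeps the search global: even if all nests cluster, a fraction is continually resampled across the whole feasible warranty range.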

Journal ArticleDOI
TL;DR: A comparison of the proposed method with goal programming solved by the weighted sum method is presented; a numerical example and applications to two industrial problems also enrich the paper.
Abstract: Goal programming is a very useful multi-objective technique with many variants, such as weighted goal programming, min-max goal programming, and lexicographic goal programming. In this paper, weighted goal programming is reformulated as goal programming with logarithmic deviation variables, and the proposed method is compared with goal programming solved by the weighted sum method. A numerical example and applications to two industrial problems enrich the paper.
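
The standard weighted goal program that the paper reformulates can be written with under- and over-achievement deviation variables and solved as a linear program. A hypothetical two-goal example — this sketches the classic weighted form, not the logarithmic-deviation variant proposed in the paper:

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables x1, x2 >= 0 with two goals:
#   G1 (profit): 3*x1 + 2*x2 should reach 12  -> penalise under-achievement n1
#   G2 (labour):   x1 +   x2 should stay at 5 -> penalise over-achievement  p2
# Column order: [x1, x2, n1, p1, n2, p2]
c = np.array([0, 0, 1, 0, 0, 1])           # weights on the penalised deviations
A_eq = np.array([[3, 2, 1, -1, 0, 0],      # 3*x1 + 2*x2 + n1 - p1 = 12
                 [1, 1, 0, 0, 1, -1]])     #   x1 +   x2 + n2 - p2 =  5
b_eq = np.array([12, 5])
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
# res.fun is the total weighted deviation; 0 means both goals are met exactly
```

In this toy instance both goals are simultaneously attainable (for instance x1 = 2, x2 = 3), so the optimal weighted deviation is zero.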

Journal ArticleDOI
TL;DR: In this paper, two stochastic models and a computation algorithm for the piston manufacturing plant are illustrated using state-space transition diagrams, the availability parameter is used for the behavioral study, and conclusions are drawn from the resulting computations.
Abstract: The piston plays a vital role in almost all types of vehicles. The present study discusses the behavior of a piston manufacturing plant. Manufacturing plants are complex repairable systems, so evaluating the performance of a piston manufacturing plant is difficult; stochastic models are efficient performance evaluators for such repairable systems. In this paper, two stochastic models and a computation algorithm for the piston manufacturing plant are illustrated using state-space transition diagrams, and the availability parameter is used for the behavioral study. Finally, conclusions are drawn from the resulting computations.
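
An availability analysis of this kind boils down to solving the balance equations πQ = 0 of the state-space transition diagram. A hedged two-state sketch with hypothetical failure and repair rates (the actual plant model would have many more states):

```python
import numpy as np

def steady_state(Q):
    """Stationary distribution of a continuous-time Markov chain from its
    generator matrix Q (each row sums to zero): solve pi @ Q = 0, sum(pi) = 1."""
    n = Q.shape[0]
    A = np.vstack([Q.T, np.ones(n)])   # balance equations plus normalisation
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

# Hypothetical single machine: state 0 = up, state 1 = down
lam, mu = 0.02, 0.5                    # failure and repair rates per hour
Q = np.array([[-lam, lam],
              [mu, -mu]])
availability = steady_state(Q)[0]      # long-run fraction of time in the up state
```

For this two-state case the result matches the closed form mu/(lam + mu); the same solver applies unchanged to a larger multi-state plant model.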

Journal ArticleDOI
TL;DR: A non-linear mixed integer programming model to design cellular manufacturing systems which assumes that the arrival rate of parts into cells and machine service rate are stochastic parameters and described by exponential distribution is presented.
Abstract: Clustering parts and machines into part families and machine cells is a major decision in the design of cellular manufacturing systems, known as cell formation. This paper presents a non-linear mixed-integer programming model for designing cellular manufacturing systems which assumes that the arrival rates of parts into cells and the machine service rates are stochastic parameters described by exponential distributions. Such uncertainty may create a queue behind each machine; therefore, the average waiting time of parts behind each machine is considered in order to obtain an efficient system. The objective function minimizes the sum of the machine idleness cost, the sub-contracting cost for exceptional parts, the non-utilized machine cost, and the holding cost of parts in the cells. Finally, the linearized model is solved with the CPLEX solver in GAMS, and a sensitivity analysis is performed to illustrate the effect of the parameters.
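
With Poisson arrivals and exponential service, the waiting time behind each machine that enters the objective follows from standard M/M/1 results. A small sketch with hypothetical rates:

```python
def mm1_queue_metrics(lam, mu):
    """Utilisation rho, expected queue length Lq, and expected waiting time Wq
    of an M/M/1 machine with arrival rate lam and service rate mu."""
    if lam >= mu:
        raise ValueError("unstable queue: need lam < mu")
    rho = lam / mu                 # fraction of time the machine is busy
    wq = rho / (mu - lam)          # = lam / (mu * (mu - lam))
    return rho, lam * wq, wq       # Lq = lam * Wq by Little's law

# Hypothetical cell: parts arrive at 2 per hour, the machine serves 5 per hour
rho, lq, wq = mm1_queue_metrics(2.0, 5.0)
# idleness fraction of the machine is 1 - rho; holding cost scales with Lq
```

These per-machine quantities are exactly what the idleness, non-utilization, and holding-cost terms of such an objective function would be built from.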

Journal ArticleDOI
TL;DR: The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with the well-known c-, Poisson exponentially weighted moving average, and Poisson cumulative sum control charts for different change-type scenarios.
Abstract: Precise identification of the time when a process has changed enables process engineers to search for a potential special cause more effectively. In this paper, we develop change point estimation methods for a Poisson process in a Bayesian framework. We apply Bayesian hierarchical models to formulate the change point where there exists a step change, a linear trend, or a known multiple number of changes in the Poisson rate. Markov chain Monte Carlo is used to obtain posterior distributions of the change point parameters and the corresponding probabilistic intervals and inferences. The performance of the Bayesian estimator is investigated through simulations, and the results show that precise estimates can be obtained when it is used in conjunction with the well-known c-chart, Poisson exponentially weighted moving average (EWMA), and Poisson cumulative sum (CUSUM) control charts for different change-type scenarios. We also apply the Deviance Information Criterion as a model selection criterion in the Bayesian context to find the best change point model for a given dataset when there is no prior knowledge about the change type in the process. In comparison with the built-in estimators of EWMA and CUSUM charts and with ML-based estimators, the Bayesian estimator performs reasonably well and remains a strong alternative. These advantages are reinforced when the probability quantification, flexibility, and generalizability of the Bayesian change point detection model are also considered.
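
The step-change case of such a hierarchy can be illustrated with a small Gibbs sampler: conjugate Gamma updates for the two Poisson rates and a discrete full conditional for the change point. A sketch on synthetic data — the priors, sample sizes, and rates are hypothetical, and the paper's MCMC also covers linear-trend and multiple-change models:

```python
import numpy as np

def poisson_step_change_gibbs(y, n_draws=2000, a=1.0, b=1.0, seed=0):
    """Gibbs sampler for a single step change in a Poisson rate:
    y[0:tau] ~ Poisson(lam1), y[tau:] ~ Poisson(lam2), Gamma(a, b) priors,
    tau uniform on the interior of the series. Returns draws of (tau, lam1, lam2)."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y)
    n = len(y)
    csum = np.concatenate([[0], np.cumsum(y)])   # csum[t] = sum of first t counts
    t_grid = np.arange(1, n)                     # candidate change points
    tau = n // 2
    draws = np.empty((n_draws, 3))
    for it in range(n_draws):
        # conjugate Gamma updates given the current split at tau
        lam1 = rng.gamma(a + csum[tau], 1.0 / (b + tau))
        lam2 = rng.gamma(a + csum[n] - csum[tau], 1.0 / (b + n - tau))
        # discrete full conditional for tau (log-likelihood up to a constant)
        logp = (csum[t_grid] * np.log(lam1) - t_grid * lam1
                + (csum[n] - csum[t_grid]) * np.log(lam2) - (n - t_grid) * lam2)
        p = np.exp(logp - logp.max())
        tau = int(rng.choice(t_grid, p=p / p.sum()))
        draws[it] = (tau, lam1, lam2)
    return draws
```

The retained draws give both a point estimate of the change time (e.g. the posterior median of tau) and the probabilistic intervals the abstract mentions, directly from posterior quantiles.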