
Showing papers in "Industrial Engineering and Management Systems in 2011"


Journal ArticleDOI
TL;DR: In this paper, the effectiveness of virtuality in SMEs' virtual R&D teams is explored, and four main themes are extracted from the experts' recommendations regarding the effectiveness and significance of virtual teams for the growth and performance of SMEs.
Abstract: The number of small and medium enterprises (SMEs), especially those involved in research and development (R&D) programs and employing virtual teams to extract the greatest competitive advantage from limited labor, is increasing. Global and localized virtual R&D teams are believed to have high potential for the growth of SMEs. Due to the fast-growing complexity of new products, coupled with the emerging opportunities of virtual teams, a collaborative approach is believed to be the future trend. This research explores the effectiveness of virtuality in SMEs' virtual R&D teams. Online questionnaires were emailed to Malaysian manufacturing SMEs and 74 usable questionnaires were received, representing a 20.8 percent return rate. To avoid the biases that pre-suggested answers may introduce, a series of open-ended questions was put to the experts. This study focused on analyzing one open-ended question, from which four main themes were extracted regarding the effectiveness of virtual teams for the growth and performance of SMEs. The findings should help product design managers of SMEs realize the key advantages and significance of virtual R&D teams during the new product development (NPD) process, which in turn leads to a more effective NPD procedure.

29 citations


Journal ArticleDOI
TL;DR: In this article, a reliability sampling plan under progressively type-1 interval censoring is proposed when the lifetime of products follows the Pareto distribution of second kind, using the maximum likelihood estimator for the median life and its asymptotic distribution.
Abstract: In this paper, a reliability sampling plan under progressively type-1 interval censoring is proposed when the lifetime of products follows the Pareto distribution of second kind. We use the maximum likelihood estimator for the median life and its asymptotic distribution. The cost model is proposed and the design parameters are determined such that the given producer's and the consumer's risks are satisfied. Tables are given and the results are explained with examples.

26 citations


Journal ArticleDOI
TL;DR: A one-stage differential evolution algorithm (1ST-DE) for job shop scheduling problem that employs random key representation and permutation of m-job repetition to generate active schedules and demonstrated that the proposed algorithm is able to provide good solutions.
Abstract: Job shop scheduling is well known as one of the hardest combinatorial optimization problems and has been demonstrated to be NP-hard. In past decades, several researchers have devoted their efforts to developing evolutionary algorithms such as the Genetic Algorithm (GA) and Particle Swarm Optimization (PSO) for the job shop scheduling problem. The Differential Evolution (DE) algorithm is a more recent evolutionary algorithm which has been widely applied and has shown its strength in many application areas. However, applications of DE to scheduling problems are still limited. This paper proposes a one-stage differential evolution algorithm (1ST-DE) for the job shop scheduling problem. The proposed algorithm employs random key representation and a permutation with m-job repetition to generate active schedules. The performance of the proposed method is evaluated on a set of benchmark problems and compared with results from an existing PSO algorithm. The numerical results demonstrate that the proposed algorithm provides good solutions, especially for large problems, with relatively fast computing time.

18 citations
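The random-key decoding that the 1ST-DE abstract above relies on can be sketched in a few lines: each position in a continuous vector is pre-assigned a job, and sorting the keys yields an operation sequence in which every job appears once per machine. This is a generic illustration of the representation, not code from the paper; the function name and tie-breaking convention are assumptions.

```python
def decode_random_keys(keys, num_jobs):
    """Decode a random-key vector into a job-repetition sequence.

    Position p is pre-assigned to job p % num_jobs, so each job occurs
    len(keys) // num_jobs times (once per machine). Sorting positions
    by key value yields the operation sequence used to build a schedule.
    """
    order = sorted(range(len(keys)), key=lambda p: keys[p])
    return [p % num_jobs for p in order]

# 3 jobs x 2 machines -> 6 keys; each job appears twice in the sequence
seq = decode_random_keys([0.9, 0.1, 0.5, 0.3, 0.7, 0.2], num_jobs=3)
# seq == [1, 2, 0, 2, 1, 0]
```

Mutating the continuous keys (as DE does) always decodes to a feasible operation sequence, which is the main appeal of this representation.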


Journal ArticleDOI
TL;DR: This research develops a framework for resolving time and worker conflicts in the Critical Chain Project Management (CCPM) method, expressed in the form of a Max-Plus Linear (MPL) system, and defines an adjacency matrix to resolve the detected conflicts.
Abstract: This research develops a framework for resolving time and worker conflicts in the Critical Chain Project Management (CCPM) method, expressed in the form of a Max-Plus Linear (MPL) system. Our previous work proposed a method for resolving time conflicts. However, in practical cases, both time and worker conflicts may occur. Hence, we propose a method for resolving both time and worker conflicts for a single project. We first consider how to detect a resource conflict. Then, we define an adjacency matrix to resolve the detected conflicts. Using the proposed method, we confirm that the resource conflict can be resolved through a numerical example.

13 citations
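For readers unfamiliar with Max-Plus Linear systems, the core operation behind the MPL formulation above is the max-plus matrix-vector product, in which ordinary addition is replaced by max and multiplication by +. A minimal sketch (the example matrix and the `EPS` convention are illustrative, not taken from the paper):

```python
EPS = float("-inf")  # the max-plus "zero": no arc / no precedence relation

def maxplus_matvec(A, x):
    # (A ⊗ x)_i = max_j (A[i][j] + x[j]): addition becomes max,
    # multiplication becomes +, so A can encode task durations/precedences
    return [max(a + b for a, b in zip(row, x)) for row in A]

# two tasks starting at times x = [0, 0]: task 0 takes 3 time units after
# x[0]; task 1 must wait 2 after x[0] or 4 after x[1], whichever is later
x1 = maxplus_matvec([[3, EPS], [2, 4]], [0, 0])
# x1 == [3, 4]: each entry is an earliest completion time
```

Iterating this product propagates completion times through a project network, which is why CCPM schedules can be written as MPL systems.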


Journal ArticleDOI
TL;DR: This paper proposes a procedure of optimally determining thresholds of the chosen variables for a decision tree using an adaptive particle swarm optimization (APSO) and shows that the proposed algorithm is promising for improving prediction accuracy.
Abstract: The decision tree as a classification tool is used successfully in many areas such as medical diagnosis, customer churn prediction, signal detection and so on. The main advantage of decision tree classifiers is their capability to break a complex structure down into a collection of simpler structures, thus providing a solution that is easy to interpret. Since a decision tree is built top-down using a divide-and-conquer induction process, there is a risk of reaching a locally optimal solution. This paper proposes a procedure for optimally determining the thresholds of the chosen variables of a decision tree using adaptive particle swarm optimization (APSO). The proposed algorithm consists of two phases. First, we construct a decision tree and choose the relevant variables. Second, we find the optimal thresholds for those selected variables simultaneously using APSO. To validate the proposed algorithm, several artificial and real datasets are used. We compare our results with the original CART results and show that the proposed algorithm is promising for improving prediction accuracy.

12 citations
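The threshold search described above can be illustrated with a plain particle swarm optimizer on a single variable. This is a generic PSO sketch, not the paper's APSO: the adaptive variant would also adjust the inertia and acceleration coefficients during the run, and the quadratic objective here is a toy stand-in for a split-quality measure.

```python
import random

def pso_minimize(f, lo, hi, n_particles=20, iters=100, seed=0):
    """Plain PSO on one bounded variable, e.g. a single split threshold."""
    rng = random.Random(seed)
    pos = [rng.uniform(lo, hi) for _ in range(n_particles)]
    vel = [0.0] * n_particles
    pbest = pos[:]                      # per-particle best position
    pbest_val = [f(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g], pbest_val[g]
    w, c1, c2 = 0.7, 1.5, 1.5           # inertia, cognitive, social weights
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vel[i] = (w * vel[i] + c1 * r1 * (pbest[i] - pos[i])
                      + c2 * r2 * (gbest - pos[i]))
            pos[i] = min(hi, max(lo, pos[i] + vel[i]))  # clamp to bounds
            v = f(pos[i])
            if v < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i], v
                if v < gbest_val:
                    gbest, gbest_val = pos[i], v
    return gbest

# recover the threshold minimizing a toy misclassification proxy
t = pso_minimize(lambda x: (x - 2.5) ** 2, lo=0.0, hi=5.0)
```

In the paper's setting the objective would score the tree's accuracy for a candidate vector of thresholds, one dimension per selected variable.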


Journal ArticleDOI
TL;DR: In this study, the evaluation choices of unknown customers are modeled with a uniform cumulative probability vector, and the model is tested for soundness and found fairly consistent with the existing Kano model and with a case survey on bicycle headlights.
Abstract: The functional and dysfunctional forms of the Kano model capture customer needs regarding a product attribute. Both forms use the answer categories Like, Must-be, Neutral, Live-with and Dislike. Customers' answers to the functional and dysfunctional questions are used to select customer needs regarding a product attribute (the Kano evaluation). Determining a Kano evaluation requires individuals to fill out and return the questionnaires, but many questionnaires are not returned, and many potential consumers never get the opportunity to fill one out. The opinions of these uncertain or unknown consumers are also essential for product development. The possible Kano evaluations are: Attractive, One-dimensional, Must-be, Indifferent and Reverse. In this study, the evaluation choices of unknown customers are modeled with a uniform cumulative probability vector (scenario 1). The study is based on the Monte Carlo simulation method, probability concepts and the Kano model. The model has also been tested for soundness and found fairly consistent with the existing Kano model (scenario 2) and with a case survey on bicycle headlights (scenario 3).

11 citations
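Scenario 1 above, in which unknown customers' Kano evaluations are drawn uniformly, can be illustrated with a small Monte Carlo sketch. The function name and sample size are illustrative assumptions; only the five evaluation categories come from the abstract.

```python
import random

CATEGORIES = ["Attractive", "One-dimensional", "Must-be",
              "Indifferent", "Reverse"]

def simulate_unknown_customers(n, seed=42):
    # scenario 1: each unknown respondent's Kano evaluation is drawn
    # with equal probability from the five categories
    rng = random.Random(seed)
    counts = {c: 0 for c in CATEGORIES}
    for _ in range(n):
        counts[rng.choice(CATEGORIES)] += 1
    return counts

counts = simulate_unknown_customers(10_000)
# each category receives roughly a fifth of the simulated evaluations
```

The simulated tallies can then be merged with the answered questionnaires before classifying the attribute, which is the gap the study addresses.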


Journal ArticleDOI
TL;DR: In this article, a study on product design optimization to reduce the environmental impact of machining is presented, in which the authors used a CAD model of a product with different design scenarios and analyzed their energy consumption using an environmental impact calculator method they developed.
Abstract: This paper presents a study on product design optimization to reduce the environmental impact of machining. The objective is to analyze the effect of changing product design parameters, such as dimensions and basic features, on the environmental impact of the machining process in terms of its energy consumption, the waste produced, and the chemicals and other consumables used up during the process. To realize this objective, we used a CAD model of a product with different design scenarios and analyzed their energy consumption using an environmental impact calculator method we developed. The waste produced and the consumables used up, such as lubricants and coolants, were analyzed using environmental emission factors. Optimization methods using a Genetic Algorithm and Goal Programming are applied to the product design parameters in order to obtain the product dimensions with the least environmental impact from machining.

10 citations


Journal ArticleDOI
TL;DR: A mixed integer programming (MIP) model is developed whose size does not increase even if a time period is divided into a number of micro time periods, and an efficient heuristic algorithm is developed by combining a decomposition scheme with a local search procedure.
Abstract: In this paper, we consider a new lot-sizing and scheduling problem (LSSP) that minimizes the sum of production cost, setup cost and inventory cost. Setup carry-over, setup overlapping, state-dependent setup times and demand splitting are considered. For this LSSP, we develop a mixed integer programming (MIP) model whose size does not increase even if we divide a time period into a number of micro time periods. We also develop an efficient heuristic algorithm by combining a decomposition scheme with a local search procedure. Test results show that the heuristic finds a good-quality (in practice, often better) feasible solution using far less computation time than CPLEX, a competitive MIP solver.

10 citations


Journal ArticleDOI
TL;DR: In this article, the authors proposed a method to provide the distribution of option price under local volatility model when market-provided implied volatility data are given, which is one of the most widely used smile-consistent models.
Abstract: In this paper, we propose a method to provide the distribution of the option price under the local volatility model when market-provided implied volatility data are given. The local volatility model is one of the most widely used smile-consistent models; in it, the volatility is a deterministic function of the random stock price. Before estimating the local volatility surface (LVS), we need to estimate the implied volatility surface (IVS) from market data, for which we use the local polynomial smoothing method. We then apply the Dupire formula to estimate the resulting LVS. However, the result depends on the bandwidth of the kernel function employed in local polynomial smoothing; to solve this problem, the proposed method uses a model-averaging approach by means of bandwidth priors and produces a robust LVS estimate with a confidence interval. After constructing the LVS, we price a barrier option with the LVS estimate through Monte Carlo simulation. To show the merits of the proposed method, we conducted experiments on simulated and market data relevant to KOSPI200 call equity linked warrants (ELWs). These experiments show that the results of the proposed method are quite reasonable and acceptable when compared to previous works.

9 citations
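The Dupire step mentioned above can be illustrated under simplifying assumptions. With zero interest rate and dividends, the local variance is the calendar derivative of the call price divided by half of K² times its strike convexity; the finite-difference sketch below is illustrative and omits the rate/dividend terms of the full formula.

```python
def dupire_local_vol(C, K, T, dK=1e-3, dT=1e-3):
    """Local volatility at (K, T) from a call-price surface C(K, T),
    assuming zero interest rate and dividends (a simplification)."""
    dC_dT = (C(K, T + dT) - C(K, T - dT)) / (2 * dT)               # calendar slope
    d2C_dK2 = (C(K + dK, T) - 2 * C(K, T) + C(K - dK, T)) / dK**2  # strike convexity
    return (dC_dT / (0.5 * K**2 * d2C_dK2)) ** 0.5

# sanity check on a synthetic surface C = T + K^2:
# dC/dT = 1 and d2C/dK2 = 2, so sigma at K = 1 is sqrt(1 / (0.5 * 1 * 2)) = 1
sigma = dupire_local_vol(lambda K, T: T + K**2, K=1.0, T=0.5)
```

In practice C(K, T) would come from the smoothed implied volatility surface, which is exactly why the paper's bandwidth-averaging matters: the derivatives above amplify any wiggle in the fitted surface.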


Journal ArticleDOI
TL;DR: In this paper, a multi-stage decision-based genetic algorithm (P-mdGA) was proposed to solve the human resource allocation problem (hRAP) in a hotel, where the managers usually need helpful automatic support for effectively allocating hotel staff to hotel tasks.
Abstract: The purpose of this study is to optimally allocate human resources to tasks while minimizing the total daily human resource costs and smoothing human resource usage. The human resource allocation problem (hRAP) under consideration contains two kinds of special constraints, i.e. operational precedence and skill constraints, in addition to the ordinary constraints. To deal with the multiple objectives and the special constraints, we first designed this hRAP as a network problem and then proposed a Pareto multistage decision-based genetic algorithm (P-mdGA). During the evolutionary process of P-mdGA, a Pareto evaluation procedure called the generalized Pareto-based scale-independent fitness function approach is used to evaluate solutions. Additionally, to improve the performance of P-mdGA, we use a fuzzy logic controller for fine-tuning the genetic parameters. Finally, to demonstrate the applicability and evaluate the performance of the proposed approach, P-mdGA is applied to a case study in a hotel, where managers usually need helpful automatic support for effectively allocating hotel staff to hotel tasks.

9 citations


Journal ArticleDOI
TL;DR: In this article, an adaptive genetic algorithm (AGA) with spontaneously adjusting crossover and mutation rate depending upon the status of current population was proposed to minimize total completion time of a single-machine scheduling problem.
Abstract: In this paper, we study a single-machine scheduling problem with deteriorating job processing times and multiple rate-modifying activities which reset deteriorated processing times to their original values. The objective is to minimize total completion time. First, we formulate an integer programming model. Since the model is difficult to solve when real problems are large, we design an improved genetic algorithm, called the adaptive genetic algorithm (AGA), which spontaneously adjusts the crossover and mutation rates depending on the status of the current population. Finally, we conduct computational experiments to compare the performance of AGA against conventional GAs with various combinations of crossover and mutation rates.

Journal ArticleDOI
TL;DR: In this article, the characteristics of an aeronautical product development program are analyzed to figure out the limitations of current DFSS methodology and the prerequisite to deployment of DFSS at the program level is suggested.
Abstract: Design for Six Sigma (DFSS) has been implemented in many companies to enhance their business performance and customer satisfaction. However, DFSS has not been widely applied to the aircraft industry which operates large, complex development programs. In this paper, the characteristics of an aeronautical product development program are analyzed to figure out the limitations of current DFSS methodology and the prerequisite to deployment of DFSS at the program level is suggested.

Journal ArticleDOI
TL;DR: This paper suggests an improved optimal algorithm for the minimum connected dominating set problem, and extensive computational results showed that the algorithm outperformed the previous exact algorithms.
Abstract: One of the critical issues in wireless sensor network is the design of a proper routing protocol. One possible approach is utilizing a virtual infrastructure, which is a subset of sensors to connect all the sensors in the network. Among the many virtual infrastructures, the connected dominating set is widely used. Since a small connected dominating set can help to decrease the protocol overhead and energy consumption, it is preferable to find a small sized connected dominating set. Although many algorithms have been suggested to construct a minimum connected dominating set, there have been few exact approaches. In this paper, we suggest an improved optimal algorithm for the minimum connected dominating set problem, and extensive computational results showed that our algorithm outperformed the previous exact algorithms. Also, we suggest a new heuristic algorithm to find the connected dominating set and computational results show that our algorithm is capable of finding good quality solutions quite fast.
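To make the connected dominating set concrete, here is a small greedy heuristic sketch: grow a connected set from a high-degree node, always adding the frontier node that dominates the most not-yet-dominated nodes. This is a textbook-style illustration, not the exact algorithm or the heuristic proposed in the paper.

```python
def greedy_cds(adj):
    """Greedy connected dominating set heuristic (illustrative only):
    adj maps each node to its neighbor list; the graph must be connected."""
    nodes = set(adj)
    start = max(nodes, key=lambda v: len(adj[v]))   # seed with a high-degree node
    cds = {start}
    dominated = {start} | set(adj[start])
    while dominated != nodes:
        # only nodes adjacent to the current set keep it connected
        frontier = {u for v in cds for u in adj[v]} - cds
        best = max(frontier, key=lambda u: len(set(adj[u]) - dominated))
        cds.add(best)
        dominated |= {best} | set(adj[best])
    return cds

# path graph 0-1-2-3-4: the interior nodes dominate everything
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
cds = greedy_cds(path)
```

In the routing context of the abstract, the nodes in the returned set would act as the virtual backbone relaying traffic for all other sensors.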

Journal ArticleDOI
TL;DR: In this article, the impact of interdependencies among handling time elements on the expectation and variance of the cycle times of a container crane has been investigated, and the authors suggest formulas for estimating the expectations and the variances of cycle times for various types of operations of a yard crane.
Abstract: During the design process of a terminal, the handling capacity of a container yard needs to be evaluated in advance. This study suggests formulas for estimating the expectations and the variances of cycle times for various types of operations of a yard crane. Statistical analysis is used to estimate the expectations and variances. The main focus of this study is to show the impact of interdependencies among handling time elements on the expectation and variance of the cycle times; these interdependencies have not been considered in previous studies. Numerical experiments are done for evaluating the difference in the variance of cycle times and the waiting of trucks between the cases with and without the consideration of interdependencies.

Journal ArticleDOI
TL;DR: The results show that the proposed algorithm outperforms the GA, PSO, SA, and TS algorithms, while being a good competitor to some other hybridized techniques in solving a selected number of benchmark Job Shop Scheduling problems.
Abstract: In solving the Job Shop Scheduling Problem, the best solution is rarely completely random; it follows one or more rules (heuristics). The Genetic Algorithm (GA), Particle Swarm Optimization (PSO), Simulated Annealing (SA), and Tabu Search (TS), which belong to the family of evolutionary computation (EC) algorithms, are not efficient enough for this problem on their own, as they neglect all conventional heuristics and hence need to be hybridized with them. In this paper a new algorithm, the "Shaking Optimization Algorithm," is proposed; it follows the common methodology of evolutionary computation while utilizing different heuristics during the evolution of the solution. The results show that the proposed algorithm outperforms the GA, PSO, SA, and TS algorithms, while being a good competitor to some other hybridized techniques, on a selected number of benchmark Job Shop Scheduling problems.

Journal ArticleDOI
TL;DR: In this paper, the authors developed a technology level quantification (TLQ) model by utilizing a learning curve, where the original learning curve shows the relationship between cumulative number of units and the required time for the unit.
Abstract: This paper develops a technology level quantification (TLQ) model by utilizing a learning curve. The original learning curve shows the relationship between the cumulative number of units and the time required per unit. In our model, by contrast, the technology level, such as speed of production and quality of the produced items, is expressed as a function not of the cumulative number of units but of time, for greater generality. Furthermore, to express learning that consists of both conceptual and operational learning, an S-curve is used. By fitting the S-curve and/or decomposing it into activities, the TLQ model can approximate organizational and complicated processes. Variations in times and levels, the parameters of the model, are shown, and a procedure to identify the model from these parameters is proposed. The factors influencing the parameters are also discussed, classified into technoware, infoware, humanware, and orgaware. The expected technology level is used to estimate the capacity of a production system, and the expected capacity can be utilized in predicting various changes in the organization and in making managerial decisions about TT. A case study in the manufacturing industry shows the effectiveness of the developed model.
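The S-curve used in the TLQ model above can be illustrated with the common logistic form, in which the technology level rises slowly, accelerates, and then saturates. The paper's exact functional form and parameters are not given in the abstract, so the logistic shape and parameter names here are assumptions.

```python
import math

def s_curve(t, level_max=1.0, rate=1.0, t_mid=0.0):
    # logistic S-curve: slow start, rapid improvement, saturation at level_max;
    # t_mid is the inflection point, rate controls how steep the middle phase is
    return level_max / (1.0 + math.exp(-rate * (t - t_mid)))

# at the inflection point the technology level is half its ceiling
mid = s_curve(0.0)    # 0.5
late = s_curve(10.0)  # close to the ceiling of 1.0
```

Summing two such curves with different midpoints is one simple way to represent the conceptual-then-operational learning the abstract describes.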

Journal ArticleDOI
TL;DR: In this article, the authors integrated objective measurements and subjective evaluation to predict car seat discomfort considering both static and dynamic conditions, including pressure distribution and vibration transmissibility, in a real car with the engine turned off.
Abstract: A driver interacts directly with the car seat at all times. There are ergonomic characteristics that have to be followed to produce comfortable seats. However, most previous research focused on either the static or the dynamic condition only. In addition, research on car seat development is critically lacking although Malaysia manufactures its own cars. Hence, this paper integrates objective measurements and subjective evaluation to predict seat discomfort, with objective measurements covering both static and dynamic conditions. Stevens' psychophysical power law has been used, which after expansion becomes ψ = a + b·φ_s^α + c·φ_v^β, where ψ is the discomfort sensation, φ_s is the static modality with exponent α, and φ_v is the dynamic modality with exponent β. The subjects in this study were locals and the cars used were Malaysian-made compact cars. The static objective measurement was the seat pressure distribution, recorded on the driver's seat in a real car with the engine turned off. The dynamic objective measurement was carried out in a moving car on real roads. During the pressure distribution and vibration transmissibility experiments, subjects were asked to rate their discomfort levels using a vehicle seat discomfort survey questionnaire together with a body map diagram. From the subjective evaluations, the exponents for the static modality (seat pressure, α = 1.51) and the dynamic modality (vibration dose value, β = 1.24) were obtained. The fitted equations showed a better R-sq value (99%) when both static and dynamic modalities were considered together, compared with a single-modality equation (static or dynamic only, R-sq = 95%). In conclusion, car seat discomfort prediction gives better results when seat development considers both static and dynamic modalities together, using an ergonomic approach.
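The expanded Stevens power law above is straightforward to evaluate. The exponents α = 1.51 and β = 1.24 come from the abstract; the coefficients a, b and c are fitted values not reported there, so the numbers used below are placeholders.

```python
def discomfort(phi_s, phi_v, a, b, c, alpha=1.51, beta=1.24):
    # psi = a + b*phi_s**alpha + c*phi_v**beta
    # phi_s: static stimulus (seat pressure), phi_v: dynamic stimulus
    # (vibration dose value); a, b, c are placeholder coefficients here
    return a + b * phi_s**alpha + c * phi_v**beta

# raising the static stimulus raises the predicted discomfort sensation
psi_hi = discomfort(phi_s=2.0, phi_v=1.0, a=0.5, b=1.0, c=1.0)
psi_lo = discomfort(phi_s=1.0, phi_v=1.0, a=0.5, b=1.0, c=1.0)
```

Because both exponents exceed 1, the model predicts that discomfort grows faster than linearly in either stimulus, which matches the supra-linear growth typical of Stevens-law fits for pressure and vibration.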

Journal ArticleDOI
TL;DR: In this article, a "Tournament Tree" showing the structure of an assembly sequence is used to generate efficient assembly sequences regardless of the number of parts; applied to a water valve and a 38-part drum cartridge, the best assembly sequence is generated by using the Tournament Tree.
Abstract: In seeking further efficiency in production preparation, it is common to examine assembly sequences using digital manufacturing. Assembly sequences affect the product evaluation, so it is necessary to test several assembly sequences before actual production. However, because the selection and testing of assembly sequences depend on the operator's personal experience and intuition, only a small number of assembly sequences are actually tested. There is a systematic method for generating assembly sequences using a contact-related figure; however, the larger the number of parts, the larger the number of geometrically feasible assembly sequences becomes. The purpose of this study is to establish a systematic method of generating efficient assembly sequences regardless of the number of parts. To generate such assembly sequences selectively, a "Tournament Tree," which shows the structure of an assembly sequence, is formulated. Applying the method to the assembly sequences of a water valve, good assembly sequences with the same structure as the Tournament Tree are identified; this structure tends to have fewer steps than the others. As a test, the structure is then applied to a drum cartridge with 38 parts. Of all the assembly sequences generated from the contact-related figures, the best assembly sequence is generated by using the Tournament Tree.

Journal ArticleDOI
TL;DR: In this paper, a skill transfer system for self-tapping screw-tightening operations is proposed, which consists of screwdriver operation training and screw tightening training with feedback information about the results of the operation.
Abstract: Self-tapping screws have some operational peculiarities. In spite of their economical advantage of requiring no prior tapping operation, a weakness of self-tapping screw-tightening operations is that screws can easily be tightened at a non-right angle, resulting in improper tightening strength. Increases in outsourced workers have reduced labor costs, but the accompanying high worker fluidity means that new workers are more frequently introduced into factories. It is necessary to train new workers for self-tapping screw-tightening operations, which occupy a considerable portion of ordinary assembly work. The purpose of this study is to develop and implement a skill transfer system for the operation. This study (1) proposes a set of characteristic values for evaluating the quality of the operation and develops a device that can measure these values; (2) proposes criteria for evaluating the resultant quality of the tightening; and (3) develops a skill training system for better work performance. Firstly, sets of characteristic values for evaluating the quality of the operation, namely torque, vertical pressure forces and horizontal vibration forces, are proposed, and a device that can measure these values is developed. Secondly, criteria for evaluating the resultant quality of the tightening are identified, involving tightening torque, maximum vertical pressure and timing, vibration area during the processing and tightening period, and work angle. By using such parameters, workers with the proper aptitude can be identified. Thirdly, a skill training system for the operation is developed. It consists of screwdriver operation training and screw-tightening training with feedback information about the results of the operation. Finally, the validity of the training system is experimentally verified using new operators and actual workers.

Journal ArticleDOI
TL;DR: In this article, the authors propose a product innovative function development procedure based on TRIZ (theory of inventive problem solving) to transform the voice of customers into product design and to create novel functions.
Abstract: Recently, the fast development of communication technologies has brought great convenience to human life. Many commercial services and transactions can be conducted using mobile communication equipment such as smart phones. Consequently, the potential market growth of smart phones has attracted many companies to invest in them. Compared with a basic feature phone, a smart phone offers more advanced computing ability and connectivity. However, according to customer responses, there remain many defects needing improvement, such as unfriendly and unsmooth operation, short battery standby time, and the threat of virus infection. Therefore, this study proposes a product innovative function development procedure based on TRIZ (theory of inventive problem solving) to transform the voice of customers into product design and to create novel functions. A case study of smart phones is provided to illustrate the effectiveness of the proposed method.

Journal ArticleDOI
TL;DR: In this article, a survey was conducted on 20 respondents consisting of designers involved in product development in various industries, to identify the factors involving users and design practices in the design process of new product development.
Abstract: The purpose of this paper is to study users' involvement in new product development (NPD). It seeks to identify the factors involving users and design practices in the design process of new product development. A survey was conducted on 20 respondents, consisting of designers involved in product development in various industries. The study focused on the early activities of the product design process, called product specification. The analysis considers the importance of involving users in design decisions. The outcome of this research is the significance of involving users and its effect on product development activities. The research also provides a model for integrated user, designer and product knowledge activity in the product development process.

Journal ArticleDOI
TL;DR: In this paper, an electronics company that had applied 12 ECM programs tries to choose one of those programs using 6 criteria: total cost involved, quality, recyclable material, process waste reduction, packaging waste reduction and regulation compliance.
Abstract: Nowadays, green purchasing, stopping global warming, loving mother earth, and other environment-related topics have become hot issues. Manufacturing industries tend to be more active and responsive to these issues by adopting green strategies or programs like Environmentally Conscious Manufacturing (ECM). In this article, an electronics company that had applied 12 ECM programs tries to choose one of those programs using 6 criteria: total cost involved, quality, recyclable material, process waste reduction, packaging waste reduction, and regulation compliance. Using multi-criteria decision making models, i.e. the Analytical Hierarchy Process (AHP), the Technique for Order Preference by Similarity to Ideal Solution (TOPSIS), and Modified TOPSIS, ECM Program 9 (Open pit) is found to be the best option.
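The TOPSIS step used in the comparison above can be sketched compactly: vector-normalize and weight the decision matrix, locate the ideal and anti-ideal solutions per criterion, and score each alternative by its relative closeness to the ideal. The two-alternative example data are invented for illustration; only the method itself is from the article.

```python
def topsis(matrix, weights, benefit):
    """Rank alternatives by relative closeness to the ideal solution.
    benefit[j] is True when larger values of criterion j are better
    (e.g. quality), False for cost-type criteria (e.g. total cost)."""
    m, n = len(matrix), len(matrix[0])
    # vector-normalize each criterion column, then apply its weight
    norms = [sum(matrix[i][j] ** 2 for i in range(m)) ** 0.5 for j in range(n)]
    V = [[weights[j] * matrix[i][j] / norms[j] for j in range(n)]
         for i in range(m)]
    ideal = [(max if benefit[j] else min)(V[i][j] for i in range(m))
             for j in range(n)]
    worst = [(min if benefit[j] else max)(V[i][j] for i in range(m))
             for j in range(n)]
    def dist(row, ref):
        return sum((row[j] - ref[j]) ** 2 for j in range(n)) ** 0.5
    return [dist(V[i], worst) / (dist(V[i], worst) + dist(V[i], ideal))
            for i in range(m)]

# invented example: alternative 0 is cheaper AND higher quality, so it wins
scores = topsis([[100, 9], [200, 5]], weights=[0.5, 0.5],
                benefit=[False, True])
```

A score near 1 means the alternative sits close to the ideal on every weighted criterion; in the article this ranking is what singles out ECM Program 9.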

Journal ArticleDOI
TL;DR: Recommendations for older-driver-friendly automobile interior design have been determined by taking into account older people's physical and cognitive characteristics; these will facilitate automobile interior designs that better fit older drivers' visual, cognitive, and manual capabilities.
Abstract: Recommendations for older-driver-friendly automobile interior design have been determined by taking into account older people's physical and cognitive characteristics. Twenty-three older people (aged 54 to 78) and five younger people (aged 20 to 29) performed several tasks in actual driving conditions, in which their reaction times and performance errors were recorded. Some design factors were found to be related to older drivers' visibility and controllability. Several design recommendations were proposed in terms of cluster color and font, display location, and HVAC control type. The proposed recommendations are expected to satisfy a wider range of older drivers, as they will facilitate automobile interior designs that better fit older drivers' visual, cognitive, and manual capabilities.

Journal ArticleDOI
TL;DR: In this article, the impact of technology components on SMEs' business performance is analyzed using a Structural Equation Model (SEM) with Partial Least Squares (PLS) software.
Abstract: Technology is one of the major competitive advantages for small and medium enterprises (SMEs), especially ones operating in the manufacturing sector. Increasing technological capability is the basis for improving SMEs' business performance. The main problem of SMEs is limitations in production facilities, technology, and human resources; some of these constraints cause a decline in SMEs' business performance and competitiveness. In this case, an analysis of technology components has to be carried out to determine the effect of technology on improving SMEs' business performance. This study aims to measure the components of technology and to analyze the influence of each component on the business performance of rattan processing SMEs in South Kalimantan. The assessment covers the technoware, humanware, inforware, and orgaware components using the technometric method (UNESCAP). Business performance is measured through a combination of financial and non-financial aspects derived from financial and marketing figures. The influence of technology components on business performance is analyzed using a Structural Equation Model (SEM) with Partial Least Squares (PLS) software. Data were collected through interviews and questionnaires from 21 rattan processing SMEs in South Kalimantan that produce rattan furniture. The results show that the value of the technology contribution (TCC) of rattan processing SMEs in South Kalimantan is still quite low. The analysis shows a direct influence of technoware and humanware on business performance, while orgaware influences business performance indirectly through humanware.

Journal ArticleDOI
TL;DR: In this article, the authors focus on ways to decrease cycle time in a self-balancing production line, in which each worker is assigned work dynamically and, when specific conditions are satisfied, production remains balanced.
Abstract: In traditional production lines, such as assembly lines, each worker is usually assigned a particular fixed task, and reducing the time needed to master the assigned task is valued. However, when an imbalance exists between workers' speeds, a worker who delays the overall work also decreases the production rate of the entire line. To avoid this problem, the "Self-Balancing Production Line" was introduced. In this type of production line, each worker is assigned work dynamically, and when specific conditions are satisfied, production remains balanced. Characteristics of such lines, in which work can be preempted at any place, have already been analyzed by several researchers. A previous paper examined the situation in which only a single worker can process one machine and cannot preempt processing, together with an improved policy over the ordinary self-balancing production line that specifies which stations workers can process and how workers should behave. This policy achieved a high production rate with only four stations and two workers (Buzacott, 2002). In that paper, worker processing stations and the behavior of a specific worker were limited, and the paper focused only on specific stations and workers; it is therefore not applicable to an arbitrary worker sequence. In this paper, we focus on other ways to decrease cycle time. In this kind of line, each worker processes at his or her own speed; therefore, if workers are assigned stations according to their speeds, the line can decrease its cycle time. To do so, we relax the assumptions of this type of line and set a new condition. Under these conditions, we compare our results to those of previous papers.
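The self-balancing dynamics described above can be illustrated with a minimal bucket-brigade simulation on a line of normalized total work content 1. This is an illustrative sketch, not the authors' model: workers are assumed ordered slowest to fastest (so no blocking occurs), walk-back time is ignored, and the cycle time converges to 1/Σvᵢ regardless of the starting positions:

```python
def bucket_brigade_cycle_time(speeds, iters=200):
    """Simulate a self-balancing (bucket-brigade) line of total work
    content 1. `speeds` must be ordered slowest to fastest so that
    workers never block each other. Returns the cycle time (time
    between successive item completions) after `iters` resets."""
    n = len(speeds)
    x = [i / n for i in range(n)]          # arbitrary initial positions on [0, 1)
    dt = 0.0
    for _ in range(iters):
        dt = (1.0 - x[-1]) / speeds[-1]    # time until the last worker finishes
        x = [xi + vi * dt for xi, vi in zip(x, speeds)]
        x = [0.0] + x[:-1]                 # each worker takes over the predecessor's item
    return dt

# With speeds 1, 2, 3 the cycle time converges to 1 / (1 + 2 + 3).
```

The self-balancing property shows up as the cycle time settling at the reciprocal of the total work speed, independent of where the workers started on the line.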

Journal ArticleDOI
TL;DR: In this paper, an improved binomial method for pricing financial derivatives by using cell averages was proposed, where non-overlapping cells are introduced around each node in the binomial tree, the proposed method calculates cell averages of payoffs at expiry and then performs the backward valuation process.
Abstract: We present an improved binomial method for pricing financial derivatives by using cell averages. After non-overlapping cells are introduced around each node in the binomial tree, the proposed method calculates cell averages of payoffs at expiry and then performs the backward valuation process. The price of the derivative and its hedging parameters such as Greeks on the valuation date are then computed using the compact scheme and Richardson extrapolation. The simulation results for European and American barrier options show that the proposed method gives much more accurate price and Greeks than other recent lattice methods with less computational effort.
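For context, the backward valuation process mentioned above builds on the standard Cox-Ross-Rubinstein binomial lattice. A minimal sketch of that plain baseline (not the authors' cell-average scheme, which additionally averages payoffs over cells around each node):

```python
import math

def crr_price(S0, K, r, sigma, T, n, call=True):
    """Plain CRR binomial price of a European option via backward
    induction (baseline lattice without cell averaging)."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))    # up factor
    d = 1.0 / u                            # down factor
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # Payoffs at expiry for terminal prices S0 * u^j * d^(n-j)
    v = [max((S0 * u**j * d**(n - j) - K) if call
             else (K - S0 * u**j * d**(n - j)), 0.0)
         for j in range(n + 1)]
    # Backward valuation through the tree
    for step in range(n, 0, -1):
        v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(step)]
    return v[0]
```

The payoff at expiry is non-smooth at the strike, which is exactly what slows the plain lattice's convergence; replacing nodal payoffs with cell averages, as the paper does, smooths this kink before the backward pass begins.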

Journal ArticleDOI
Abstract: The problem of scheduling in permutation flowshops has been extensively investigated by many researchers. Recently, attempts have been made to consider more than one objective simultaneously and to develop algorithms that obtain a set of Pareto-optimal solutions. Varadharajan et al. (2005) presented a multi-objective simulated-annealing algorithm (MOSA) for the problem of permutation-flowshop scheduling with the objectives of minimizing the makespan and the total flowtime of jobs. The MOSA uses two initial sequences obtained using heuristics, and seeks non-dominated solutions through a probability function that probabilistically selects the objective of minimizing either the makespan or the total flowtime of jobs. In this paper, the same problem of heuristically developing non-dominated sequences is considered. We propose an effective heuristic based on simulated annealing (SA), in which the weighted sum of the makespan and the total flowtime is used. The essence of the heuristic lies in selecting the initial sequence, setting the weight, and generating solutions in the search process. Using the benchmark problems provided by Taillard (1993), which were used for the MOSA, these conditions are extracted in a large-scale experiment. The non-dominated sets obtained from the existing algorithms and the proposed heuristic are compared. The proposed heuristic was found to drastically improve the performance of finding the non-dominated frontier.
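The weighted-sum SA idea described above can be sketched as follows. The objective evaluation is the standard permutation-flowshop recursion; the neighborhood (random swap), cooling schedule, and weight are illustrative choices, not the paper's tuned settings:

```python
import math
import random

def objectives(seq, p):
    """Makespan and total flowtime of permutation `seq` in a
    permutation flowshop with processing times p[job][machine]."""
    m = len(p[0])
    comp = [0.0] * m          # completion time on each machine
    flow = 0.0
    for j in seq:
        comp[0] += p[j][0]
        for k in range(1, m):
            comp[k] = max(comp[k], comp[k - 1]) + p[j][k]
        flow += comp[-1]      # job's completion time on the last machine
    return comp[-1], flow

def sa_weighted_sum(p, w=0.5, iters=3000, t0=50.0, alpha=0.995, seed=0):
    """Simulated annealing on w*makespan + (1-w)*flowtime with a
    random-swap neighborhood (illustrative parameters)."""
    rng = random.Random(seed)
    n = len(p)
    cur = list(range(n))
    rng.shuffle(cur)
    def cost(s):
        cmax, flow = objectives(s, p)
        return w * cmax + (1 - w) * flow
    cur_c = cost(cur)
    best, best_c = cur[:], cur_c
    t = t0
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        nb = cur[:]
        nb[i], nb[j] = nb[j], nb[i]
        nb_c = cost(nb)
        # accept improvements, and worse moves with Metropolis probability
        if nb_c <= cur_c or rng.random() < math.exp((cur_c - nb_c) / t):
            cur, cur_c = nb, nb_c
            if cur_c < best_c:
                best, best_c = cur[:], cur_c
        t *= alpha
    return best, best_c
```

Running this for several values of w between 0 and 1 and collecting the non-dominated (makespan, flowtime) pairs is one simple way to approximate the Pareto frontier the abstract refers to.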

Journal ArticleDOI
TL;DR: In this paper, the authors consider a batch scheduling problem with due-date constraints and construct an algorithm to find the optimal schedule that satisfies the due-date, batch-size, and inventory-time constraints while minimizing total flow time.
Abstract: This paper addresses a batch scheduling problem. In food production, the lead time from production to sale should be decreased because the freshness of the product is important. Products are shipped at diverse times depending on sellers' demand, because the types of sellers have become diversified, including supermarkets and convenience stores. Production of the quantity demanded must therefore be completed by the shipping time. The authors consider a problem with due-date constraints and construct an algorithm to find the optimal schedule that satisfies the due-date constraint, batch-size constraint, and inventory-time constraint while minimizing total flow time.
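As a point of reference for the constraints above, a minimal earliest-due-date (EDD) sequencing check on a single machine can be sketched as follows. This is an illustrative baseline, not the authors' algorithm, and it ignores the batch-size and inventory-time constraints:

```python
def edd_schedule(batches):
    """Sequence batches by earliest due date on a single machine.

    batches: list of (processing_time, due_date) pairs.
    Returns (sequence of batch indices, total flow time, feasible),
    where feasible is False if any batch finishes after its due date.
    """
    order = sorted(range(len(batches)), key=lambda i: batches[i][1])
    t = 0.0
    total_flow = 0.0
    feasible = True
    for i in order:
        pt, dd = batches[i]
        t += pt                 # batch i completes at time t
        total_flow += t
        if t > dd:
            feasible = False    # due-date constraint violated
    return order, total_flow, feasible
```

EDD is a natural feasibility screen here because, on a single machine, if EDD violates some due date then no sequence can meet all of them; the paper's algorithm additionally has to respect batch sizes and bound the time products sit in inventory.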

Journal ArticleDOI
TL;DR: A visualization technique in preference transition analysis based on recency and frequency is proposed, which ensures that the semantic meaning of each item and its transition can be clearly identified by its different types of node size, color and edge style.
Abstract: Given a directed graph, we can determine how a user's preference moves from one product item to another. In this graph, called a "preference transition network", each node represents a product item, while an edge pointing to another node represents a transition of the user's preference. However, a large number of items makes the network complex, unclear, and difficult to interpret. To address this problem, this paper proposes a visualization technique for preference transition analysis based on recency and frequency. By adopting these two elements, the semantic meaning of each item and its transitions can be clearly identified through different node sizes, colors, and edge styles. An experiment on sales data demonstrates the results of the proposed approach.
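The recency and frequency scores underlying the visualization above can be computed directly from ordered purchase sequences. A minimal sketch, assuming sessions are given oldest first (mapping the scores to node size, color, and edge style is a rendering choice not shown here):

```python
from collections import defaultdict

def preference_transitions(sessions):
    """Build edge statistics for a preference transition network.

    sessions: list of item sequences, ordered oldest to newest.
    Returns a dict mapping each directed edge (a, b) to a
    (frequency, recency) pair, where frequency counts how often the
    transition occurred and recency is the index of the latest
    session containing it.
    """
    freq = defaultdict(int)
    recency = {}
    for t, seq in enumerate(sessions):
        for a, b in zip(seq, seq[1:]):   # consecutive items form an edge
            freq[(a, b)] += 1
            recency[(a, b)] = t
    return {edge: (freq[edge], recency[edge]) for edge in freq}

# Example: edge ('a', 'b') occurs twice, last in session 1;
# edge ('b', 'c') occurs once, only in session 0.
stats = preference_transitions([['a', 'b', 'c'], ['a', 'b']])
```

A renderer would then draw high-frequency edges thicker and recently seen edges in a stronger color, so stale transitions fade visually instead of cluttering the network.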

Journal ArticleDOI
TL;DR: In this paper, a computer game "Ecopoly" was developed based on the board game, which enables players to learn the relationship between environmental problems and economic activities and to learn more about environmental problems.
Abstract: Humans, who have been facing environmental problems, need to build a sustainable society in which economic activities coexist with nature. To realize such goals, it is essential to promote and enhance environmental education and to raise global awareness of environmental issues. In a preceding study, a board game "Ecopoly", based on the Kyoto Protocol and a real-estate dealing game, was developed and its validity for environmental education was verified. This study further aims to develop a computer game "Ecopoly" based on the board game, which enables players to learn the relationship between environmental problems and economic activities and to learn more about environmental problems. The computer game automates complicated calculations, allowing players to concentrate on play, and provides stronger visual effects than the board game. Experimental testing of the game was conducted with 13 college students, and the validity of the game was verified.