
Showing papers in "International Journal of Computer Integrated Manufacturing in 2011"


Journal ArticleDOI
TL;DR: An up-to-date review of the CAPP research works, a critical analysis of journals that publish CAPP research works, and an understanding of the future direction in the field are provided.
Abstract: For the past three decades, computer-aided process planning (CAPP) has attracted a large amount of research interest, and a huge volume of literature has been published on the subject. Today, CAPP research faces new challenges owing to dynamic markets and business globalisation, so there is an urgent need to ascertain the current status and identify future trends of CAPP. Covering articles published on CAPP in roughly the past 10 years, this article aims to provide an up-to-date review of CAPP research works, a critical analysis of the journals that publish CAPP research, and an understanding of the future direction of the field. First, general information on CAPP is provided and past reviews are summarised. Recent CAPP research is then discussed in a number of categories, i.e. feature-based technologies, knowledge-based systems, artificial neural networks, genetic algorithms, fuzzy set theory and fuzzy logic, Petri nets, agent-based technology, Internet-based technology, STEP-compliant CAPP and other emerging technologies. Research on some specific aspects of CAPP is also covered. Discussions and analysis of the methods are then presented based on data gathered from Elsevier's Scopus abstract and citation database. The concepts of 'Subject Strength' of a journal and 'technology impact factor' are introduced and used in discussions based on the publication data. The former gauges the level of focus of a journal on a particular research subject/domain, whereas the latter assesses the level of impact of a particular technology in terms of citation counts. Finally, a discussion of future developments is presented.
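
As an illustration of the two bibliometric concepts above, a minimal sketch with assumed formulas (the abstract does not give the authors' exact definitions) might look like this:

```python
# Illustrative (assumed) definitions -- the paper's exact formulas are not
# given in the abstract, so these are plausible stand-ins, not the authors' own.

def subject_strength(papers_on_subject: int, total_papers: int) -> float:
    """Fraction of a journal's output devoted to one subject (e.g. CAPP)."""
    return papers_on_subject / total_papers

def technology_impact_factor(citation_counts: list[int]) -> float:
    """Average citations received by papers applying a given technology."""
    return sum(citation_counts) / len(citation_counts)

# Example: a journal with 48 CAPP papers out of 600, and a technology
# (say, genetic algorithms) whose papers drew these citation counts.
print(subject_strength(48, 600))                   # 0.08
print(technology_impact_factor([30, 12, 55, 8]))   # 26.25
```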

304 citations


Journal ArticleDOI
TL;DR: Three common integration approaches, the non-linear approach, the closed-loop approach and the distributed approach, are discussed with their relative advantages and disadvantages, and reported research is classified accordingly.
Abstract: Process planning and scheduling functions strongly influence the profitability of manufacturing a product, resource utilisation and product delivery time. Several researchers have addressed the need for integration of process planning and scheduling (IPPS) functions to increase flexibility, improve the profitability of manufacturing a product, shorten delivery time and create realistic process plans that can be executed readily on the shop floor. This article presents a state-of-the-art review on IPPS. Three common integration approaches, the non-linear approach, the closed-loop approach and the distributed approach, are discussed with their relative advantages and disadvantages, and reported research is classified accordingly. The article also identifies several future research directions.

116 citations


Journal ArticleDOI
TL;DR: This article discusses the key features of cooperating robotic cells in automotive assembly, highlighting the key elements in the engineers' decision-making process when designing and implementing an assembly line for the Body In White.
Abstract: This article discusses the key features of cooperating robotic cells in automotive assembly, highlighting the key elements in the engineers' decision-making process when designing and implementing an assembly line for the Body In White. The main issues affecting the performance of cooperating robotic cells are discussed with the aid of a case study in which two different scenarios are compared. The first scenario uses a conventional fixture-based configuration of a robotic cell for performing a welding operation, while the second features the use of cooperating robots. The cases are compared with the aid of a simulation platform, and potential future developments are also discussed.

68 citations


Journal ArticleDOI
TL;DR: A lean pull system implementation procedure based on combining a supermarket supply with two constant work-in-process (CONWIP) structures, which can concurrently consider manufacturing system variability and demand uncertainty in multi-product, multi-stage processes, is proposed.
Abstract: Lean philosophy is a systematic approach for identifying and eliminating waste through continuous improvement in pursuit of perfection, using a pull-control strategy derived from customers' requirements. However, not all lean implementations have produced the desired results, because of the lack of a clear implementation procedure and execution guide. This article proposes a lean pull system implementation procedure based on combining a supermarket supply with two constant work-in-process (CONWIP) structures, which can concurrently consider manufacturing system variability and demand uncertainty in multi-product, multi-stage processes. The study uses a multiple criteria decision-making (MCDM) method, a hybrid Taguchi technique for order preference by similarity to ideal solution (TOPSIS), that treats customer demand uncertainty as a noise factor. This allowed the most robust production control strategy to be identified and an optimal scenario to be selected from the alternative designs. Value stream mapping (VSM) was applied to visualise which conditions would work when improvements are introduced. Finally, a real-world thin film transistor-liquid crystal display (TFT-LCD) manufacturing case study under demand uncertainty is used to demonstrate and test the findings. After comparing the current-state map and the future-state map of the case study, the simulation results indicate that the average cycle time was reduced from 15.4 days to 4.82 days without any loss of throughput.
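
A minimal sketch of the standard TOPSIS ranking core in Python; the paper's hybrid Taguchi-TOPSIS treatment of demand noise, and all data below, are illustrative rather than taken from the study:

```python
import numpy as np

def topsis(X, weights, benefit):
    """X: alternatives x criteria matrix; benefit[j] True if higher is better."""
    X = np.asarray(X, dtype=float)
    R = X / np.linalg.norm(X, axis=0)          # vector-normalise each criterion
    V = R * weights                            # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.linalg.norm(V - ideal, axis=1)  # distance to the ideal solution
    d_neg = np.linalg.norm(V - anti,  axis=1)  # distance to the anti-ideal
    return d_neg / (d_pos + d_neg)             # closeness: higher is better

# Three candidate pull-control designs scored on cycle time (lower is better)
# and throughput (higher is better), equally weighted.
scores = topsis([[15.4, 900], [6.1, 880], [4.82, 900]],
                weights=[0.5, 0.5], benefit=[False, True])
print(scores.argsort()[::-1])  # ranking of alternatives, best first
```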

67 citations


Journal ArticleDOI
Qing X. Li1, Cheng Wang1, Jing Wu1, Jun Li1, Ze-Yuan Wang1 
TL;DR: A CP-based business process collaboration modelling technique is developed to improve the feasibility of IS in the CCE, and a framework to integrate applications and services deployed in public clouds with intra-IS is designed.
Abstract: With the development of the extended enterprise and of information technologies (IT), a new business pattern with its own infrastructure, cloud computing, is emerging. More and more small and medium enterprises do not implement significant parts of their information systems (IS) in-house. Instead, they prefer to use the software services, and even infrastructure services, provided by professional information service companies. Their business strategy, IT strategy, business processes and information technologies must therefore be re-aligned. Furthermore, no cloud computing service vendor can satisfy the complete functional information system requirements of an enterprise. Sometimes, enterprises have to simultaneously use software services distributed across different clouds in conjunction with their intra-IS. These issues pose great challenges for the business-IT alignment of an enterprise in the cloud computing environment (CCE). This study reviews business-IT alignment problems and models in the CCE. The concept of the collaboration point (CP) is proposed, and a CP-based business process collaboration modelling technique is developed to improve the feasibility of IS in the CCE. A framework to integrate applications and services deployed in public clouds with intra-IS is designed, and a run-time platform with the collaboration agent technique is developed to realise the concept of the CP. A case study illustrates the implementation of the techniques developed in this article.

63 citations


Journal ArticleDOI
TL;DR: The derived algorithmic clustering proved to be more efficient than the empirical one, and thus it can be used by design engineers as an effective tool for the derivation of product clustering alternatives.
Abstract: The clustering of a product's components into modules is an effective means of creating modular architectures. This article first links clustering efficiency with the interactions of a product's components, from which interesting observations are extracted. A novel clustering method utilising neural network algorithms and design structure matrices (DSMs) is then introduced. The method is capable of reorganising the components of a product into clusters so that interactions are maximised inside and minimised outside the clusters. In addition, a multi-criteria decision-making approach is used to evaluate the efficiency of the different clustering alternatives derived by the network. Finally, a case study is presented to demonstrate and assess the application of the method. The derived algorithmic clustering proved to be more efficient than the empirical one, and thus it can be used by design engineers as an effective tool for the derivation of product clustering alternatives.
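
The clustering objective can be illustrated with a small sketch: given a symmetric DSM and a candidate cluster assignment, measure the share of interaction captured inside clusters. The exact efficiency measure and the neural network algorithm used by the authors are not reproduced here:

```python
import numpy as np

def intra_cluster_ratio(dsm, labels):
    """Fraction of total component interaction that falls inside clusters."""
    dsm = np.asarray(dsm, dtype=float)
    labels = np.asarray(labels)
    same = labels[:, None] == labels[None, :]       # True for intra-cluster pairs
    off_diag = ~np.eye(len(labels), dtype=bool)     # ignore self-interactions
    inside = dsm[same & off_diag].sum()
    total = dsm[off_diag].sum()
    return inside / total if total else 1.0

# Four components; clustering {0,1} and {2,3} keeps most interaction internal.
dsm = [[0, 3, 0, 0],
       [3, 0, 1, 0],
       [0, 1, 0, 2],
       [0, 0, 2, 0]]
print(intra_cluster_ratio(dsm, [0, 0, 1, 1]))  # ~0.83: mostly internal
```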

57 citations


Journal ArticleDOI
TL;DR: Two of the main models for tolerance analysis, the Jacobian and the torsor, are introduced and briefly described, and then compared, showing their analogies and differences.
Abstract: The dimensional and geometrical variations of each part of an assembly have to be limited by tolerances able to ensure both a standardised production and a certain level of quality, defined by the satisfaction of functional requirements. The appropriate allocation of tolerances among the different parts of an assembly is the fundamental tool for ensuring assemblies that work correctly at lower cost. Therefore, there is a strong need for a tolerance analysis that verifies the requirements of the assembly against the tolerances imposed on the single parts. Such a tool has to be based on a mathematical model able to evaluate the cumulative effect of the single tolerances. Several models have been used or proposed in the literature for the tolerance analysis of assemblies of rigid parts, but none of them is completely and universally accepted. Some authors focus their attention on solving single problems found in these models or on their practical application in computer-aided tolerancing systems, but no objective and complete comparison has been carried out that analyses their advantages and weaknesses and furnishes a criterion for their choice and application. This paper briefly introduces two of the main models for tolerance analysis, the Jacobian and the torsor. These models are described and then compared, showing their analogies and differences. The evolution of these two models, known as the unified Jacobian-torsor model, is also presented.
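
Both models propagate small part-level deviations linearly to an assembly-level functional requirement. A deliberately simplified 1-D stack-up sketch (not the full 6-degree-of-freedom Jacobian or torsor formulation) conveys the idea; the sensitivities below play the role of a Jacobian row:

```python
import numpy as np

# 1-D linearised tolerance stack-up for a 3-part dimension chain.
sens = np.array([+1.0, -1.0, +1.0])   # d(FR)/d(dimension_i), i.e. a Jacobian row
tol  = np.array([0.05, 0.02, 0.03])   # symmetric tolerances on each dimension

worst_case = np.abs(sens) @ tol               # arithmetic (worst-case) stack
statistical = np.sqrt((sens**2) @ (tol**2))   # root-sum-square stack
print(worst_case, round(statistical, 4))      # 0.10  0.0616
```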

57 citations


Journal ArticleDOI
TL;DR: A work-in-progress management framework based on smart objects, such as radio frequency identification/Auto-ID devices and web service technologies, in a ubiquitous manufacturing (UM) environment is proposed to support optimal planning and control of the entire shop floor.
Abstract: Recent developments in wireless technologies have created opportunities for developing next-generation manufacturing systems with real-time traceability, visibility and interoperability in shop-floor planning, execution and control. This article proposes a work-in-progress management framework based on smart objects, such as radio frequency identification/Auto-ID devices and web service technologies, in a ubiquitous manufacturing (UM) environment. Under this framework, two types of services (a data source service and a gateway data service) and a work-in-progress agent (WIPA) are designed and developed to manage and control real-time material and information flows and to support optimal planning and control of the entire shop floor. During production execution, real-time visibility explorers are provided for operators and supervisors to reflect the real-time situation of the current manufacturing environment. The framework follows a simple but effective principle: what you see is what you do, and what you do is what you see. Production disturbances can thus be detected and fed back to decision makers for implementing closed-loop shop-floor control. In addition, important standards such as ISA 95 and the business-to-manufacturing markup language (B2MML) are adopted to establish the information model and schemas of WIP, called wipML (work-in-progress markup language). Based on B2MML and wipML, real-time manufacturing information can be effectively encapsulated, shared and exchanged between gateways, the WIPA and heterogeneous enterprise information systems. The presented framework is studied and demonstrated using a near real-life, simplified shop floor that consists of typical manufacturing objects.

53 citations


Journal ArticleDOI
TL;DR: Comprehensive performance evaluation metrics for a service-oriented manufacturing network are proposed, combining the key performance indicators of services at the business, service and implementation levels, together with a performance evaluation model to analyse local and global performance.
Abstract: The management of services is the kernel of service-oriented manufacturing. However, it is difficult to realise the integration and optimisation of services in an open environment, which contains large amounts of randomness and uncertainty. The key problem is how to realise optimal service selection and composition. In this article, comprehensive performance evaluation metrics for a service-oriented manufacturing network are proposed, which combine the key performance indicators of services at the business, service and implementation levels. A performance evaluation model is brought forward to analyse local and global performance. An uncertainty- and genetic algorithm-based method is developed to realise optimal service selection and composition in an effective and efficient way.
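
A bare-bones sketch of genetic-algorithm-based service selection, choosing one candidate service per task to maximise an aggregate QoS score; the QoS data are invented, and the paper's uncertainty handling is not reproduced:

```python
import random

qos = [[0.7, 0.9, 0.6],    # task 0: score of each candidate service
       [0.8, 0.5],         # task 1
       [0.4, 0.6, 0.95]]   # task 2

def fitness(ch):                      # ch[i] = chosen candidate for task i
    return sum(qos[i][g] for i, g in enumerate(ch))

def ga(pop_size=20, gens=50, pm=0.2):
    pop = [[random.randrange(len(t)) for t in qos] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]           # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)
            cut = random.randrange(1, len(qos))   # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < pm:              # random-reset mutation
                i = random.randrange(len(qos))
                child[i] = random.randrange(len(qos[i]))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

print(ga())   # expected best composition: [1, 0, 2]
```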

50 citations


Journal ArticleDOI
TL;DR: The use of enterprise engineering (EE) to achieve strategic alignment between business and IT is proposed, and new building blocks and life-cycle phases have been defined for their use in an enterprise architecture context.
Abstract: Information systems and information technology (IS/IT, hereafter just IT) strategies usually depend on a business strategy. The alignment of both strategies improves their strategic plans. From an external perspective, business and IT alignment is the extent to which the IT strategy enables and drives the business strategy. This article reviews strategic alignment between business and IT, and proposes the use of enterprise engineering (EE) to achieve this alignment. The EE approach facilitates the definition of a formal dialog in the alignment design. In relation to this, new building blocks and life-cycle phases have been defined for their use in an enterprise architecture context. This proposal has been adopted in a critical process of a ceramic tile company for the purpose of aligning a strategic business plan and IT strategy, which are essential to support this process.

49 citations


Journal ArticleDOI
TL;DR: A robust bi-objective evaluation function was defined to obtain a robust, effective solution that is only slightly sensitive to data uncertainty, and the resulting algorithm can generate a trade-off between effectiveness and robustness for various degrees of uncertainty.
Abstract: Most scheduling methods assume a deterministic environment in which the data of the problem are known. In reality, however, several kinds of uncertainty should be considered, and robust scheduling allows uncertainty to be taken into account. In this article, we consider a scheduling problem under uncertainty. Our case study is a hybrid flow shop scheduling problem, and the processing time of each job on each machine at each stage is the source of uncertainty. To solve this problem, we developed a genetic algorithm. A robust bi-objective evaluation function was defined to obtain a robust, effective solution that is only slightly sensitive to data uncertainty. This bi-objective function simultaneously minimises the makespan of the initial scenario and the deviation between the makespan of all the disrupted scenarios and the makespan of the initial scenario. We validated our approach with a simulation in order to evaluate the robustness of the solutions in the face of uncertainty. The computational results show that our algorithm can generate a trade-off between effectiveness and robustness for various degrees of uncertainty.
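
The robust evaluation described above can be sketched as follows; the equal weighting of the two terms and the function signatures are assumptions, not the paper's exact formulation:

```python
# Sketch of a robust bi-objective fitness: nominal makespan plus average
# deviation over disrupted scenarios. `makespan` is a user-supplied function;
# scenarios carry the (uncertain) processing times.

def robust_fitness(schedule, makespan, nominal, disrupted, w=0.5):
    c0 = makespan(schedule, nominal)
    deviation = sum(abs(makespan(schedule, s) - c0)
                    for s in disrupted) / len(disrupted)
    return w * c0 + (1 - w) * deviation   # lower is better
```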

Journal ArticleDOI
TL;DR: An integrated method to build a decision support system for supplier evaluation and selection that incorporates quantitative and qualitative calculations together to deal with vague and uncertain data available to decision makers is constructed.
Abstract: In recent decades, supply chain management (SCM) has become a significant issue in practice and in the literature owing to increasing globalisation. Moreover, supplier selection and periodic evaluation have become important tools for companies seeking to maintain effective SCM. The main goal of this study is to construct an integrated method to build a decision support system for supplier evaluation and selection that incorporates quantitative and qualitative calculations together to deal with the vague and uncertain data available to decision makers. A methodology capable of evaluating and monitoring suppliers' performance is constructed, using fuzzy analytic hierarchy process (AHP) to weight the established decision criteria and ELECTRE III to evaluate, rank and classify the performance of suppliers with respect to the relevant criteria. The proposed methodology is applied to a real-life supplier selection and classification problem of a pharmaceutical company.
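
A sketch of one common fuzzy-AHP weighting scheme (Buckley's fuzzy geometric mean with centroid defuzzification); the authors' exact fuzzy-AHP variant and the ELECTRE III outranking step are not reproduced:

```python
import numpy as np

# Pairwise comparisons as triangular fuzzy numbers (l, m, u) for 3 criteria.
M = [[(1, 1, 1),          (2, 3, 4),       (4, 5, 6)],
     [(1/4, 1/3, 1/2),    (1, 1, 1),       (1, 2, 3)],
     [(1/6, 1/5, 1/4),    (1/3, 1/2, 1),   (1, 1, 1)]]

# Fuzzy geometric mean of each row (component-wise).
gm = [tuple(np.prod([row[j][k] for j in range(3)]) ** (1 / 3) for k in range(3))
      for row in M]
tot = [sum(g[k] for g in gm) for k in range(3)]
# Fuzzy division: lower/sum-of-uppers, middle/sum-of-middles, upper/sum-of-lowers.
weights = [tuple(g[k] / tot[2 - k] for k in range(3)) for g in gm]
crisp = [sum(w) / 3 for w in weights]          # centroid defuzzification
print([round(c / sum(crisp), 3) for c in crisp])  # normalised crisp weights
```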

Journal ArticleDOI
TL;DR: A novel methodology that has been developed to help designers trace, analyse and evaluate engineering changes occurring in the product design phase and a knowledge-based method has been proposed to resolve design conflicts by reusing previous design change knowledge.
Abstract: Engineering design change management is very important to the success of engineering product development. It has been recognised that the earlier change issues are addressed, the greater the product lifecycle cost savings. In practice, however, most engineering changes happen in the manufacturing phase, a later phase of product development. Change issues arising in the design phase, especially between the functional and structural domains, have been a research focus in recent years, and significant research work has been carried out to resolve early engineering change issues from different perspectives. This article presents a novel methodology developed to help designers trace, analyse and evaluate engineering changes occurring in the product design phase. A modelling method is employed to enhance the traceability of potential design changes occurring between the functional and structural domains of design. Based on functional and physical models, a matrix-based method is developed to analyse change propagation between components and to help identify design conflicts arising from design changes. A knowledge-based method is proposed to resolve design conflicts by reusing previous design change knowledge. An industrial example concerning changes to a wind turbine cooling system is used to illustrate the methodology and demonstrate its usefulness.
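
The matrix-based propagation analysis can be sketched as a reachability search over component dependency links; the component names below are illustrative, not from the wind-turbine case:

```python
from collections import deque

# links[a] = components directly affected by a change in component a
# (equivalently, the non-zero entries in row a of a change-propagation DSM).
links = {
    "pump":           ["pipework"],
    "pipework":       ["heat_exchanger", "mounting"],
    "heat_exchanger": ["fan"],
    "fan":            [],
    "mounting":       [],
}

def propagate(changed):
    """Breadth-first search for every component a change could reach."""
    affected, queue = set(), deque([changed])
    while queue:
        c = queue.popleft()
        for nxt in links.get(c, []):
            if nxt not in affected:
                affected.add(nxt)
                queue.append(nxt)
    return affected

print(propagate("pump"))  # {'pipework', 'heat_exchanger', 'mounting', 'fan'}
```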

Journal ArticleDOI
TL;DR: An overview of the Core Manufacturing Simulation Data, a standardised, computer-interpretable representation that allows for the efficient exchange of manufacturing shop floor-related data in a manner that it can be used in the creation and execution of manufacturing simulations, is presented.
Abstract: Standard representations for information entities common to manufacturing simulation could help reduce the costs associated with simulation model construction and data exchange between simulation and other manufacturing applications. This would make simulation technology more affordable and accessible to a wide range of potential industrial users. To foster the more widespread use of manufacturing simulation technology through the reduction of data interoperability issues, the Core Manufacturing Simulation Data (CMSD) specification was created. CMSD is a standardised, computer-interpretable representation that allows the efficient exchange of manufacturing shop floor-related data in a form that can be used in the creation and execution of manufacturing simulations. The work has been standardised under the auspices of the Simulation Interoperability Standards Organization (SISO). CMSD defines an information model that describes the characteristics of, and relationships between, the core manufacturing entities that define shop floor operations. This enables greater integration and data exchange possibilities for manufacturing simulations and other manufacturing applications. This article presents an overview of CMSD: its motivation, structure and content. Descriptions of case studies using CMSD to integrate real-world manufacturing applications are also presented.

Journal ArticleDOI
TL;DR: The authors present an extended version of the LH-model for the multi-criteria supplier selection problem, and an illustrative example is presented to compare the authors' model with the LH-model.
Abstract: Selecting an appropriate supplier is now one of the most important decisions of the purchasing department. Liu and Hai (Liu, F.H.F. and Hai, H.L., 2005. The voting analytic hierarchy process method for selecting supplier. International Journal of Production Economics, 97, 308-317) proposed a voting analytic hierarchy process method for selecting suppliers. Despite its many advantages, Liu and Hai's model (LH-model) has some shortcomings. In this article, the authors present an extended version of the LH-model for the multi-criteria supplier selection problem. An illustrative example is presented to compare the authors' model with the LH-model.

Journal ArticleDOI
TL;DR: This article presents current developments and applications of the Evolvable Production Systems (EPS), a next generation of production systems that have distributed control and are composed of integrated intelligent modules.
Abstract: Current major road-mapping efforts, such as ManuFuture, FutMan and EUPASS, have all clearly underlined that true industrial sustainability will require far higher levels of system autonomy and adaptability. In accordance with these recommendations, the Evolvable Production Systems (EPS) initiative has aimed at developing such technological solutions and support mechanisms. Since its inception in 2002 (Onori, M., 2002. In: ISR2002-33rd International Symposium on Robotics, Stockholm, 617-621) as a next generation of production systems, the concept has been further developed and tested to emerge as a production system paradigm. The essence of evolvability resides not only in the ability of system components to adapt to changing conditions of operation but also in assisting the evolution of these components over time, such that processes may become self-X, where X stands for one or more desirable properties of a system subjected to variable operating conditions, such as self-evolvable, self-reconfigurable, self-tuning and self-diagnosing. Characteristically, evolvable systems have distributed control and are composed of integrated intelligent modules. To assist with development and life cycle issues, a comprehensive methodological framework is being developed. A concerted effort is being exerted through European research projects in collaboration with European manufacturers, technology/equipment suppliers and universities. After briefly stating the fundamental concepts of EPS, this article presents current developments and applications.

Journal ArticleDOI
TL;DR: The results of this study showed that both of the proposed integration systems managed to estimate the optimal cutting conditions, leading to the minimum value of machining performance when compared to the result of real experimental data.
Abstract: In this study, simulated annealing (SA) and genetic algorithm (GA) soft computing techniques are integrated to search for the set of optimal cutting condition values that leads to the minimum value of a machining performance measure. Two integration systems are proposed: integrated SA-GA-type1 and integrated SA-GA-type2. The machining performance considered is surface roughness (Ra) in end milling. The results of this study showed that both of the proposed integration systems managed to estimate the optimal cutting conditions, leading to the minimum value of machining performance when compared with the results of real experimental data. The proposed integration systems also reduced the number of iterations in searching for the optimal solution compared with conventional GA and conventional SA, respectively. In other words, the search for the optimal solution can be made faster by using the integrated SA-GA.
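
A minimal simulated-annealing sketch over cutting conditions; the Ra model below is an invented surrogate (the paper fits its model to end-milling experiments), and the specific SA-GA hybridisation schemes (type1/type2) are not reproduced:

```python
import math, random

def ra(v, f, d):   # hypothetical surface-roughness surrogate, for illustration
    return 0.8 + 0.004*f**2 + 0.002*d**2 - 0.0008*v + 0.00001*v**2

def anneal(steps=5000, t0=1.0, alpha=0.999):
    x = [120.0, 0.2, 1.0]   # speed (m/min), feed (mm/tooth), depth of cut (mm)
    best, t = list(x), t0
    for _ in range(steps):
        # Gaussian perturbation, clamped to assumed machine limits.
        cand = [min(max(x[0] + random.gauss(0, 5), 60.0), 300.0),
                min(max(x[1] + random.gauss(0, 0.02), 0.05), 0.5),
                min(max(x[2] + random.gauss(0, 0.1), 0.2), 3.0)]
        delta = ra(*cand) - ra(*x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand                       # Metropolis acceptance
            if ra(*x) < ra(*best):
                best = list(x)
        t *= alpha                         # geometric cooling schedule
    return best, ra(*best)

print(anneal())
```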

Journal ArticleDOI
TL;DR: This work proposes a generic and comprehensive methodology that puts ontology engineering on a firm scientific foundation and at the same time provides a collaborative environment for effective knowledge sharing and reuse and provides a way for automatically extracting frequent terms from the data to construct an ontology in a bottom-up fashion.
Abstract: Data inconsistency and data mismatch are critical problems that limit data interoperability and hinder the smooth operation of a distributed business. An ontology is a semantic model that explicitly describes the entities of a domain of discourse and their properties, and acts as a vehicle for seamless data integration and exchange. Existing methodologies for ontology development fail to provide comprehensive coverage of the different steps, e.g. pre-development, development and post-development, that are necessary for successfully developing ontologies. We propose a generic and comprehensive methodology that puts ontology engineering on a firm scientific foundation and at the same time provides a collaborative environment for effective knowledge sharing and reuse. Furthermore, our approach provides a way of automatically extracting frequent terms from the data to construct an ontology in a bottom-up fashion. The performance of our methodology has been evaluated by developing ontologies for real-life applications, e.g. fault diagnosis, root cause investigation and spare parts maintenance.
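
The bottom-up step can be sketched as frequent-term mining over domain documents; the records, stop-word list and frequency threshold below are illustrative:

```python
import re
from collections import Counter

# Mine frequent domain terms from maintenance records as seed concepts
# for a bottom-up ontology.
STOP = {"the", "a", "of", "on", "in", "and", "was", "to", "is"}

records = [
    "Bearing failure on spindle motor, bearing replaced",
    "Spindle motor overheating, cooling fan replaced",
    "Hydraulic pump leak, seal replaced",
]

tokens = [w for r in records
            for w in re.findall(r"[a-z]+", r.lower()) if w not in STOP]
frequent = [(t, n) for t, n in Counter(tokens).most_common() if n >= 2]
print(frequent)  # [('replaced', 3), ('bearing', 2), ('spindle', 2), ('motor', 2)]
```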

Journal ArticleDOI
TL;DR: In this study, six heuristic algorithms are used to solve a no-wait two stage flexible flow shop with minimising makespan and an adaptive neuro fuzzy inference system (ANFIS) with neural network and fuzzy theory is applied for estimating the makespan.
Abstract: In flow shop scheduling, no-wait processing is becoming an important subject, where subsequent operations are performed with no waiting time in between. In addition, it is important to estimate schedule performance in terms of given criteria. This enables managers to conduct reliable logistics planning while estimating robust job completion times. In this study, six heuristic algorithms are used to solve a no-wait two-stage flexible flow shop with the objective of minimising makespan. Furthermore, an adaptive neuro-fuzzy inference system (ANFIS), combining neural networks and fuzzy theory, is applied to estimate the makespan. The robustness of the proposed ANFIS using the six heuristic scheduling algorithms is verified by comparing its performance with the actual schedule performance. This is followed by concluding remarks and potential areas for further research.
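
For intuition, the no-wait makespan of a given job sequence can be computed in closed form for the special case of one machine per stage (the paper's flexible shop with parallel machines at each stage is more involved):

```python
def nowait_makespan(seq, p1, p2):
    """p1[j], p2[j]: processing times of job j at stages 1 and 2."""
    start = 0.0
    for cur, nxt in zip(seq, seq[1:]):
        # The next job starts on M1 only when M1 is free AND when starting
        # then lets it move straight onto M2 with no waiting in between.
        start += p1[cur] + max(0.0, p2[cur] - p1[nxt])
    last = seq[-1]
    return start + p1[last] + p2[last]

p1, p2 = [3, 2, 4], [2, 5, 1]
print(nowait_makespan([1, 0, 2], p1, p2))   # 12: one heuristic sequence evaluated
```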

Journal ArticleDOI
TL;DR: A particle swarm optimisation (PSO)-based optimisation scheme is developed to generate a series of cutter locations that produce a minimised error on the machined surface and demonstrates a novel application of GPU on Computer Aided Manufacturing/Computer Numerical Control.
Abstract: Multi-axis machining offers higher machining efficiency and superior shaping capability compared with 3-axis machining. Machining error control is a critical issue in 5-axis flank milling of complex geometries, and solutions are still lacking. Previous studies have shown that optimisation-based tool path planning is a feasible approach to reducing machining error. However, error estimation is time-consuming in the optimisation process, thus limiting the practicality of this approach. In this work, we apply graphics processing unit (GPU) computing technology to solve this problem. A particle swarm optimisation (PSO)-based optimisation scheme is developed to generate a series of cutter locations (CLs) that produce a minimised error on the machined surface. The error amount induced by each CL is calculated simultaneously by the parallel processing units of the GPU. The PSO search process driven by the aggregated result is effectively accelerated. Test results show that our approach outperforms previous optimisation methods in both solution quality and computational efficiency. This work demonstrates a novel application of GPU computing to Computer-Aided Manufacturing/Computer Numerical Control.
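
A generic PSO core loop, with the machining-error objective, cutter-location parametrisation and GPU-parallel evaluation all abstracted into a single cost function:

```python
import random

def pso(cost, dim, n=30, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-1.0, hi=1.0):
    X = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    V = [[0.0] * dim for _ in range(n)]
    P = [list(x) for x in X]                 # personal bests
    g = min(P, key=cost)                     # global best
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                V[i][d] = (w * V[i][d] + c1 * r1 * (P[i][d] - X[i][d])
                                       + c2 * r2 * (g[d] - X[i][d]))
                X[i][d] = min(max(X[i][d] + V[i][d], lo), hi)
            if cost(X[i]) < cost(P[i]):
                P[i] = list(X[i])
        g = min(P, key=cost)
    return g

# Toy objective standing in for the machined-surface error of a CL series.
print(pso(lambda x: sum(v * v for v in x), dim=5))
```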

Journal ArticleDOI
TL;DR: The aim of this article is to present the importance of knowledge formalisation for strategic alignment, based on knowledge contained in a well-known reference model for supply chain: Supply Chain Operations Reference (SCOR) model, which is transformed into ontology.
Abstract: Firms cannot be competitive if their business and information technology strategies are not aligned. Yet achieving strategic alignment continues to be a major concern for business executives. A number of alignment models have been proposed in the literature. Enterprise modelling (EM) can deliver models that are understandable by all participants and formalised enough to map Enterprise Engineering and Reengineering activities directly onto business process execution. However, models need terms (names, verbs, etc.) to identify and describe the constructs modelled in the EM language used. To share business knowledge, a common modelling language is not sufficient: a common business language is required to share the understanding of the constructs used in the modelling language at a semantic level. The aim of this article is to present the importance of knowledge formalisation for strategic alignment. Our work is based on the knowledge contained in a well-known reference model for the supply chain, the Supply Chain Operations Reference (SCOR) model. To analyse this knowledge, we transform the model into an ontology. Finally, we explore the respective advantages of the different representations of the SCOR model (original text, a business modelling language, and an ontology) and, more generally, the contribution of ontologies, which are becoming a major issue in business modelling.

Journal ArticleDOI
TL;DR: The definition, common errors, programming approaches and verification methods of NC programs are introduced, and four categories of NC machining simulation methods are discussed.
Abstract: Since the first numerical control (NC) machine tool was created at the Massachusetts Institute of Technology in the 1950s, the productivity and quality of machined parts have been increased through the use of NC and, later, computer numerical control (CNC) machine tools. Like any other computer program, a CNC program may contain errors, which can lead to scrap or even accidents. Therefore, NC programs need to be verified before actual machining commences. Computer-based NC machining simulation is an economical and safe verification method. So far, much research effort concerning NC machining simulation has been made. This paper aims to provide a comprehensive review of such research work and a clear understanding of the direction of the field. First, the definition, common errors, programming approaches and verification methods of NC programs are introduced. Then, the definitions of geometric and physical NC machining simulation are presented, and four categories of NC machining simulation methods are discussed: solid-based, object space-based, image space-based and Web-based NC machining simulations. Finally, future trends and concluding remarks are presented.

Journal ArticleDOI
TL;DR: The aim of this work is to develop a soft computing tool for surface roughness prediction of laser-polished components; ensembles of regression trees, which combine several models, proved better than single methods.
Abstract: Laser polishing of steel components is an emerging process in the automation of finishing operations in industry. The aim of this work is to develop a soft computing tool for surface roughness prediction of laser-polished components. The laser polishing process depends primarily on three factors: surface material, initial topography and energy density. Although the first two factors can be reasonably estimated, the third is often unknown under real industrial conditions. The modelling tool developed here overcomes this limitation. The application is composed of four stages: a data-acquisition system; a data set generated from the inputs; a soft computing model trained and validated with the data set; and, finally, the use of the resulting model to generate different plots of industrial interest. Owing to the highly complex phenomena that influence surface roughness generation in laser polishing, different prediction models are tested until the most accurate one is selected for the soft computing model. Ensembles of regression trees yield the best results among the methods under consideration (multilayer perceptrons, radial basis function networks and support vector machines); it has been shown in many applications that the results of an ensemble, which combines several models, are better than those of single methods.
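
The ensemble-versus-single-model comparison can be sketched with scikit-learn on synthetic stand-in data; the paper's real data set and exact ensemble configuration may differ:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.tree import DecisionTreeRegressor
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data: (material class, initial roughness, energy density)
# mapped to a toy final-Ra target with noise.
rng = np.random.default_rng(0)
X = rng.uniform([0, 1.0, 5.0], [3, 10.0, 50.0], size=(200, 3))
y = 0.2 * X[:, 1] - 0.01 * X[:, 2] + rng.normal(0, 0.1, 200)

for name, model in [("single tree", DecisionTreeRegressor(random_state=0)),
                    ("ensemble", RandomForestRegressor(n_estimators=100,
                                                       random_state=0))]:
    r2 = cross_val_score(model, X, y, cv=5, scoring="r2").mean()
    print(f"{name}: mean R^2 = {r2:.3f}")   # the ensemble should score higher
```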

Journal ArticleDOI
TL;DR: This article investigates flow shop group scheduling with limited buffers to minimise the total completion time (makespan) and proposes two tabu search algorithms, one for each sub-problem.
Abstract: The scheduling problem in a cellular manufacturing system (CMS) is known as group scheduling in the literature. Owing to the similarities in the processing routes of the parts within a group, it is mostly treated as flow shop group scheduling. This problem consists of two interrelated sub-problems, namely intra-group scheduling and inter-group scheduling. Moreover, there are usually limited buffers between successive machines in which work-in-process inventories can be stored. This article investigates flow shop group scheduling with limited buffers to minimise the total completion time (makespan). Given the NP-hardness of this problem, two tabu search algorithms are proposed, one for each sub-problem. The effectiveness of the proposed algorithms is evaluated on 270 randomly generated problems classified into 27 categories. The results of the proposed algorithm are compared with those of the heuristic published by Solimanpur, Vrat and Shankar (SVS). Computational results demonstrate a significant reduction in average makespan over the SVS algorithm.
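
A generic tabu search skeleton over job permutations with a swap neighbourhood; the limited-buffer makespan evaluation is left abstract, and the tenure and iteration counts are illustrative:

```python
import random
from collections import deque

def tabu_search(n_jobs, evaluate, iters=200, tenure=7):
    cur = list(range(n_jobs))
    random.shuffle(cur)
    best, best_val = list(cur), evaluate(cur)
    tabu = deque(maxlen=tenure)               # recently swapped index pairs
    for _ in range(iters):
        move, move_val = None, float("inf")
        for i in range(n_jobs - 1):           # scan the swap neighbourhood
            for j in range(i + 1, n_jobs):
                cand = list(cur)
                cand[i], cand[j] = cand[j], cand[i]
                val = evaluate(cand)
                # Aspiration: accept a tabu move if it beats the global best.
                if ((i, j) not in tabu or val < best_val) and val < move_val:
                    move, move_val = (i, j), val
        if move is None:
            break
        i, j = move
        cur[i], cur[j] = cur[j], cur[i]
        tabu.append(move)
        if move_val < best_val:
            best, best_val = list(cur), move_val
    return best, best_val

# Toy objective standing in for a real limited-buffer makespan evaluation.
print(tabu_search(5, lambda p: sum(abs(p[i] - i) for i in range(5))))
```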

Journal ArticleDOI
TL;DR: This paper proposes an integrated decision support system (IDSS) that can help manufacturing managers make more efficient and effective global co-ordination decisions.
Abstract: Global manufacturing increasingly faces decision challenges of how to better manage the dependencies between different activities that take place either locally or across different locations. Co-ordination decision making not only requires the right information to be provided in the right place at the right time, but also requires the right level of support from models for decision analysis and decision evaluation. Furthermore, the alignment of co-ordination decisions with a global firm's environment and its operations performance has been identified as crucial to the firm's success, but remains a challenge to decision makers. This paper proposes an integrated decision support system (IDSS) that can help manufacturing managers make more efficient and effective global co-ordination decisions. A combination of qualitative and quantitative analysis and assessment functions is provided through the system's four key components (a global context modeller, a multi-criteria scoring modeller, a configurator and a co-ordinator). The decision system has been evaluated through a case study within the automotive industry, which demonstrates the applicability of the system for providing decision support for realistic global manufacturing co-ordination problems.

Journal ArticleDOI
TL;DR: This article assumes that several unreliable resources may fail simultaneously and develops the properties that a supervisory controller must possess and develops supervisory controllers that satisfy these properties.
Abstract: Supervisory control for deadlock-free resource allocation has been an active area of manufacturing systems research. To date, most work assumes that allocated resources do not fail; little research has addressed allocating resources that may fail. In our previous work (Lawley, M., 2002. Control of deadlock and blocking for production systems with unreliable resources. International Journal of Production Research, 40 (17), 4563-4582; Lawley, M. and Sulistyono, W., 2002. Robust supervisory control policies for manufacturing systems with unreliable resources. IEEE Transactions on Robotics and Automation, 18 (3), 346-359), we assumed a single unreliable resource and developed supervisory controllers to ensure robust deadlock-free operation in the event of resource failure. In this article, we assume that several unreliable resources may fail simultaneously. In this case, the controller must guarantee that a set of resource failures does not propagate through blocking to stall other portions of the system; that is, it must ensure that every part type not requiring any of the failed resources continues to be produced smoothly without disruption. To do this, the controller must constrain the system to states that serve as feasible initial states for (i) a reduced system when resource failures occur and (ii) an upgraded system when failed resources are repaired. We develop the properties that such a controller must possess and then develop supervisory controllers that satisfy these properties.

Journal ArticleDOI
TL;DR: A generic system dynamics simulation model for strategic partnering in supply networks that addresses the whole supply chain starting from the suppliers to the final customers and including production and distribution actors is described.
Abstract: Owing to the highly competitive business environment, individual businesses no longer compete as autonomous entities but rather as members of supply chain alliances. Supply chain coordination is truly a transformational business strategy that has a profound effect on competitive success and strategic partnering. This paper conceptually integrates supply chain coordination and strategic partnering and describes a generic system dynamics simulation model for strategic partnering in supply networks. Our model addresses the whole supply chain, from the suppliers to the final customers, including production and distribution actors. It is generic and can adapt to various network structures. Finally, some scenarios based on costs and benefits are designed and the results are analysed, which can assist decision makers in a supply network.
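
In the system-dynamics spirit, a minimal stock-and-flow sketch of one echelon (a single inventory replenished through a delivery delay) illustrates the modelling style; the parameters are illustrative, and the paper's model covers the full supplier-production-distribution network:

```python
def simulate(weeks=20, target=100.0, adjust_time=4.0, delay=2.0, demand=20.0):
    """One inventory stock fed by a first-order supply-line delay."""
    inventory, supply_line, history = 100.0, 0.0, []
    for _ in range(weeks):
        # Order to cover demand plus a fraction of the inventory gap.
        orders = max(0.0, demand + (target - inventory) / adjust_time)
        arrivals = supply_line / delay      # first-order delivery delay
        supply_line += orders - arrivals
        inventory += arrivals - demand
        history.append(round(inventory, 1))
    return history

print(simulate())   # inventory trajectory as it settles toward the target
```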

Journal ArticleDOI
TL;DR: This study considers the mutations related to the eXtended Manufacturing Integrated System approach, from the manufacturing system level to the industrial enterprise as a whole, and presents a validation platform implemented on industrial CNC manufacturing equipment.
Abstract: Computer Numerical Control (CNC) feature-based programming with STandard for the Exchange of Product model data (STEP)-compliant Numerical Control extends the collaborative model of manufacturing data exchange all along the numerical data chain. This study considers the mutations brought about by this approach, from the manufacturing system level to the industrial enterprise as a whole. The eXtended Manufacturing Integrated System concept is introduced to bridge the current manufacturing data exchange bottleneck. It is composed of eXtended Computer Aided Design (CAD) and eXtended CNC systems, linking the CAD model to the real machined part through the Manufacturing Information Pipeline. The contributions associated with these concepts are demonstrated through a validation platform implemented on industrial CNC manufacturing equipment.

Journal ArticleDOI
TL;DR: A PnSS framework that expands the scope of production services to include component services, i.e. suppliers provide not only the component entities but also a series of services related to the components' stocking, assembling and consuming, to their customers along the supply chain is proposed.
Abstract: The production service system (PnSS) is a new business mode in which a manufacturer obtains manufacturing resources in the form of continuous production services instead of resource entities. This new mode helps reduce the setup costs and the technical/financial risks of production resources while ensuring their life cycle service levels. The industrial product service system advocates the servitisation of production facilities and has evolved into a successful and mature type of PnSS. This article proposes a PnSS framework that expands the scope of production services to include component services, i.e. suppliers provide their customers along the supply chain not only with the component entities but also with a series of services related to stocking, assembling and even consuming the components. A systematic PnSS configuration (PnSSC) methodology and its enabling platform are developed based on a newly extended analytical target cascading (ATC) method. As ATC accommodates heterogeneous sub-system integration and multi-level problem solving, the methodology is able to address the typical challenges that a practical component-service PnSSC process normally faces, such as distributed decision rights, uncertain decision structure and short decision periods. Owing to similar and normally simpler configuration requirements, the methodology and platform are also generic enough to be applied or adapted for the PnSS of other production resources.

Journal ArticleDOI
TL;DR: The concept of ubiquitous technology (UT) is applied to the existing PLM and the basis of a u-PLM environment is established to seamlessly and accurately exchange information in a distributed environment and to collaborate and accumulate or use the related knowledge.
Abstract: Ubiquitous computing refers to an environment that enables people to use a variety of information and communication services through networking anywhere, anytime, without any interruption. It has already been applied in various industries and in our daily lives. However, studies related to the manufacturing sector have been restricted to production using radio frequency identification at the shop-floor level. Meanwhile, the scope of products that manufacturing companies should manage has been extending from just design and production to maintenance, repair and disposal, as environmentally conscious production has been emerging as an important topic all over the world. That is, product life cycle management (PLM), as a manufacturing paradigm for managing all information during the product life cycle, should cover this extended scope as well. To this end, in this article, the concept of ubiquitous technology (UT) is applied to existing PLM; the result is called u-PLM, an abbreviation for ubiquitous PLM. First, we define the concepts and characteristics of PLM and ubiquitous computing. Then, an approach for the implementation of u-PLM is explained with 5C (Computing, Communication, Connectivity, Contents and Calm) as conditional elements and 5Any (Any-time, Any-where, Any-network, Any-device and Any-service) as characteristic elements, based on the existing 5C and 5Any viewpoints that are the aim of UT. Finally, a conceptual architecture is suggested for u-PLM. As a case study, the proposed concept is practically applied to an automotive press die shop. Based on the results, the basis of a u-PLM environment for manufacturing industries is established to seamlessly and accurately exchange information in a distributed environment and to collaborate and to accumulate and use the related knowledge.