Showing papers in "Engineering Optimization in 2008"


Journal ArticleDOI
TL;DR: In this article, a grey-based Taguchi method is proposed to solve the multi-response simulation problem; it adopts grey relational analysis (GRA) to transform multi-response problems into single-response problems.
Abstract: Simulation modelling is a widely accepted tool in system design and analysis, particularly when the system or environment has stochastic and nonlinear behaviour. However, it does not provide a method for optimization. In general, problems contain more than one response, and these responses are often in conflict with each other. This article proposes a grey-based Taguchi method to solve the multi-response simulation problem. The grey-based Taguchi method is based on the optimizing procedure of the Taguchi method, and adopts grey relational analysis (GRA) to transform multi-response problems into single-response problems. A practical case study from an integrated-circuit packaging company illustrates that differences in performance between the proposed grey-based Taguchi method and other methods found in the literature were not significant. The grey-based Taguchi method thus provides a new option when solving a multi-response simulation-optimization problem.

164 citations
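Grey relational analysis reduces several responses to a single grade by normalizing each response, measuring its deviation from an ideal reference, and averaging the resulting grey relational coefficients. The sketch below illustrates that aggregation step only; the data, the equal response weights and the distinguishing coefficient zeta = 0.5 are hypothetical choices, not taken from the article.

```python
import numpy as np

def grey_relational_grade(responses, larger_is_better, zeta=0.5, weights=None):
    """responses: (n_runs, n_responses) array of measured responses."""
    responses = np.asarray(responses, dtype=float)
    n_runs, n_resp = responses.shape
    norm = np.empty_like(responses)
    for j in range(n_resp):
        col = responses[:, j]
        span = col.max() - col.min()
        # Normalize to [0, 1]; direction depends on the response type.
        norm[:, j] = (col - col.min()) / span if larger_is_better[j] else (col.max() - col) / span
    delta = np.abs(1.0 - norm)                     # deviation from the ideal sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())
    weights = np.full(n_resp, 1.0 / n_resp) if weights is None else np.asarray(weights)
    return coeff @ weights                         # one grey relational grade per run

# Hypothetical runs with two conflicting responses:
# throughput (larger-the-better) and cycle time (smaller-the-better).
runs = [[92, 14.2], [88, 12.5], [95, 16.0], [90, 13.1]]
grades = grey_relational_grade(runs, larger_is_better=[True, False])
print(grades, "best run:", int(np.argmax(grades)))
```

The run with the highest grade would then be analysed with the usual Taguchi signal-to-noise machinery as a single-response problem.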


Journal ArticleDOI
TL;DR: In this article, an improved harmony search algorithm is proposed which is found to be more efficient than the original harmony search for slope stability analysis, and the effectiveness of the proposed algorithm is examined by considering several published cases.
Abstract: An improved harmony search algorithm is proposed which is found to be more efficient than the original harmony search algorithm for slope stability analysis. The effectiveness of the proposed algorithm is examined by considering several published cases. The improved harmony search method is applied to slope stability problems with five types of procedure for generating trial slip surfaces. It is demonstrated that the improved harmony search algorithm is efficient and effective for the minimization of factors of safety for various difficult problems, and the method of generating the trial failure surfaces can be important in the minimization process.

131 citations
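For readers unfamiliar with the underlying heuristic, the following is a minimal harmony search sketch for unconstrained minimization, showing the memory-consideration, pitch-adjustment and random-selection steps that the improved variant builds on. The sphere objective and all parameter values are placeholders; in the article the objective would be the factor of safety of a trial slip surface.

```python
import random

def objective(x):                      # placeholder objective (sphere function)
    return sum(xi * xi for xi in x)

def harmony_search(dim=5, lower=-5.0, upper=5.0, hms=20, hmcr=0.9, par=0.3,
                   bw=0.1, iters=5000, seed=0):
    rng = random.Random(seed)
    memory = [[rng.uniform(lower, upper) for _ in range(dim)] for _ in range(hms)]
    costs = [objective(h) for h in memory]
    for _ in range(iters):
        new = []
        for d in range(dim):
            if rng.random() < hmcr:                      # memory consideration
                value = rng.choice(memory)[d]
                if rng.random() < par:                   # pitch adjustment
                    value += rng.uniform(-bw, bw)
            else:                                        # random selection
                value = rng.uniform(lower, upper)
            new.append(min(max(value, lower), upper))
        worst = max(range(hms), key=costs.__getitem__)
        c = objective(new)
        if c < costs[worst]:                             # replace worst harmony
            memory[worst], costs[worst] = new, c
    best = min(range(hms), key=costs.__getitem__)
    return memory[best], costs[best]

print(harmony_search())
```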


Journal ArticleDOI
Byeong Hyeon Ju, Byung Chai Lee
TL;DR: In this paper, the authors considered difficulties in implementing the moment method into RBDO; these are resolved using a kriging metamodel with an active constraint strategy. Three numerical examples are tested, and the results show that the proposed method is efficient and accurate.
Abstract: Reliability-based design optimization (RBDO) has been used for optimizing engineering systems with uncertainties in design variables and system parameters. RBDO involves reliability analysis, which requires a large amount of computational effort, so it is important to select an efficient method for reliability analysis. Of the many methods for reliability analysis, a moment method, which is called the fourth moment method, is known to be less expensive for moderate size problems and requires neither iteration nor the computation of derivatives. Despite these advantages, previous research on RBDO has been mainly based on the first-order reliability method and relatively little attention has been paid to moment-based RBDO. This article considers difficulties in implementing the moment method into RBDO; they are solved using a kriging metamodel with an active constraint strategy. Three numerical examples are tested and the results show that the proposed method is efficient and accurate.

82 citations


Journal ArticleDOI
TL;DR: In this paper, the authors propose and demonstrate a method, the inductive design exploration method (IDEM), which facilitates robust design in the presence of model structure uncertainty, which is a form of uncertainty that is often hard to quantify.
Abstract: Model structure uncertainty, originating from assumptions and idealizations in modelling processes, is a form of uncertainty that is often hard to quantify. In this article, the authors propose and demonstrate a method, the inductive design exploration method (IDEM), which facilitates robust design in the presence of model structure uncertainty. The approach achieves robustness by compromising between the degree of system performance and the degree of reliability, based on the structure uncertainty associated with system models (i.e. models for performances and constraints). The main strategies in the IDEM include: (i) identifying feasible ranged sets of the design space instead of a single (optimized) design solution, considering all types of quantifiable uncertainties, and (ii) systematically compromising target achievement with provision for potential uncertainty. The IDEM is successfully demonstrated in a clay-filled polyethylene cantilever beam design example, which is a simple but represen...

66 citations


Journal ArticleDOI
TL;DR: Embedded constraint handling methods, which include the gradient repair method and constraint fitness priority-based ranking method, are proposed as a special operator in NM-PSO for dealing with constraints.
Abstract: Constrained optimization problems (COPs) are very important in that they frequently appear in the real world. A COP, in which both the function and constraints may be nonlinear, consists of the optimization of a function subject to constraints. Constraint handling is one of the major concerns when solving COPs with particle swarm optimization (PSO) combined with the Nelder–Mead simplex search method (NM-PSO). This article proposes embedded constraint handling methods, which include the gradient repair method and constraint fitness priority-based ranking method, as a special operator in NM-PSO for dealing with constraints. Experiments using 13 benchmark problems are explained and the NM-PSO results are compared with the best known solutions reported in the literature. Comparison with three different meta-heuristics demonstrates that NM-PSO with the embedded constraint operator is extremely effective and efficient at locating optimal solutions.

64 citations
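As a rough illustration of constraint handling inside a swarm, the sketch below uses a generic feasibility-first comparison based on total constraint violation. It is a simplified stand-in, not the article's gradient repair or constraint fitness priority-based ranking operators, and the toy problem is hypothetical.

```python
# Generic feasibility-first ranking for constrained minimization:
# minimize f(x) subject to g_i(x) <= 0.
def total_violation(x, constraints):
    """Sum of positive parts of g_i(x); zero means the point is feasible."""
    return sum(max(0.0, g(x)) for g in constraints)

def better(x, y, f, constraints):
    """Return True if x should be ranked ahead of y."""
    vx, vy = total_violation(x, constraints), total_violation(y, constraints)
    if vx == 0.0 and vy == 0.0:
        return f(x) < f(y)          # both feasible: compare objective values
    if vx == 0.0 or vy == 0.0:
        return vx == 0.0            # a feasible point beats an infeasible one
    return vx < vy                  # both infeasible: less violation wins

# Hypothetical example: minimize x0 + x1 subject to 1 - x0*x1 <= 0.
f = lambda x: x[0] + x[1]
cons = [lambda x: 1.0 - x[0] * x[1]]
print(better([1.0, 1.0], [0.5, 0.5], f, cons))   # True: feasible beats infeasible
```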


Journal ArticleDOI
TL;DR: This variant of the particle swarm optimization algorithm makes use of a discrete version of PSO, which overcomes one of the PSO's main drawbacks, namely its difficulty in maintaining acceptable levels of population diversity and in balancing local and global searches.
Abstract: The design of water distribution networks (WDNs) is addressed by using a variant of the particle swarm optimization (PSO) algorithm. This variant, which makes use of a discrete version of PSO already considered by the authors, overcomes one of the PSO's main drawbacks, namely its difficulty in maintaining acceptable levels of population diversity and in balancing local and global searches. The performance of the variant proposed here is investigated by applying the model to solve two standard benchmark problems: the Hanoi new water distribution network and the New York Tunnel water supply system. The results obtained show considerable improvements in both convergence characteristics and the quality of the final solutions, and near-optimal results are consistently achieved at reduced computational cost.

53 citations
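A discrete PSO for pipe sizing can encode each particle as a vector of indices into a catalogue of commercial diameters. The sketch below shows one generic way to do this (standard velocity update, with rounding and clipping of positions); it is not the authors' specific variant, and the cost function is a placeholder for a hydraulic simulation with penalty terms.

```python
import random

DIAMETERS = [100, 150, 200, 250, 300, 400, 500]   # mm, hypothetical catalogue

def cost(indices):
    # Placeholder: pipe cost grows with diameter; a real model would add a
    # penalty term whenever the hydraulic simulation reports head violations.
    return sum(DIAMETERS[i] ** 1.5 for i in indices)

def discrete_pso(n_pipes=8, swarm=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    hi = len(DIAMETERS) - 1
    pos = [[rng.randint(0, hi) for _ in range(n_pipes)] for _ in range(swarm)]
    vel = [[0.0] * n_pipes for _ in range(swarm)]
    pbest = [p[:] for p in pos]
    pbest_cost = [cost(p) for p in pos]
    g = min(range(swarm), key=pbest_cost.__getitem__)
    gbest, gbest_cost = pbest[g][:], pbest_cost[g]
    for _ in range(iters):
        for i in range(swarm):
            for d in range(n_pipes):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                # Round and clip back onto valid catalogue indices.
                pos[i][d] = min(hi, max(0, round(pos[i][d] + vel[i][d])))
            c = cost(pos[i])
            if c < pbest_cost[i]:
                pbest[i], pbest_cost[i] = pos[i][:], c
                if c < gbest_cost:
                    gbest, gbest_cost = pos[i][:], c
    return gbest, gbest_cost

print(discrete_pso())
```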


Journal ArticleDOI
TL;DR: In this paper, the adaptive moving mesh method was combined with a level set structure topology optimization method, which automatically maintains a high nodal density around the structural boundaries of the material domain, whereas the mesh topology remains unchanged.
Abstract: The level set method is a promising approach to provide flexibility in dealing with topological changes during structural optimization. Normally, the level set surface, which depicts a structure's topology by a level contour set of a continuous scalar function embedded in space, is interpolated on a fixed mesh. The accuracy of the boundary positions is therefore largely dependent on the mesh density, a characteristic of any Eulerian expression when using a fixed mesh. This article combines the adaptive moving mesh method with a level set structure topology optimization method. The finite element mesh automatically maintains a high nodal density around the structural boundaries of the material domain, whereas the mesh topology remains unchanged. Numerical experiments demonstrate the effect of the combination of a Lagrangian expression for a moving mesh and a Eulerian expression for capturing the moving boundaries.

52 citations


Journal ArticleDOI
TL;DR: In this paper, an optimization approach is discussed for the problem of designing light distributions for luminaires for tunnel and street lighting which satisfy luminance-based and glare-based requirements set by the International Commission on Illumination (CIE) and the European Committee for Standardization (CEN) while consuming minimal power.
Abstract: An optimization approach is discussed for the problem of designing light distributions for luminaires for tunnel and street lighting which satisfy luminance-based and glare-based requirements set by the International Commission on Illumination (CIE) and the European Committee for Standardization (CEN) while consuming minimal power. The problem is formulated as a linear optimization problem that incorporates the geometrical parameters of the lighting installation and the reflective properties of the road surface. A polynomial representation for the light intensities is used in order to construct smooth light distribution curves, so that the luminaires can be manufactured with existing technology. Computational experiments indicate that optimization models can substantially improve the lighting parameters of luminaires, and make lighting installations more energy-efficient.

51 citations
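The core of such a formulation is a linear program in the light intensities. The toy example below, with a hypothetical 3 x 3 contribution matrix, minimizes total emitted intensity (a proxy for power) subject to minimum luminance levels at a few road grid points; the article's full model would add glare constraints and many more grid points.

```python
import numpy as np
from scipy.optimize import linprog

# contribution[i, j]: luminance produced at grid point i per unit intensity j
contribution = np.array([[0.8, 0.3, 0.1],
                         [0.4, 0.7, 0.2],
                         [0.1, 0.5, 0.9]])
L_min = np.array([1.0, 1.2, 1.0])          # required luminance at each grid point

c = np.ones(3)                              # minimize the sum of the intensities
# linprog expects A_ub @ x <= b_ub, so flip signs for "contribution @ I >= L_min".
res = linprog(c, A_ub=-contribution, b_ub=-L_min, bounds=[(0, None)] * 3)
print(res.x, res.fun)
```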


Journal ArticleDOI
TL;DR: In this article, a new robust design optimization model is proposed to solve design problems involving multiple responses of several different types, and the results of the experiment are optimized using a new approach that is formulated as a nonlinear goal programming problem.
Abstract: Robust design is an efficient process improvement methodology that combines experimentation with optimization to create systems that are tolerant to uncontrollable variation. Most traditional robust design models, however, consider only a single quality characteristic, yet customers judge products simultaneously on a variety of scales. Additionally, it is often the case that these quality characteristics are not of the same type. To address these issues, a new robust design optimization model is proposed to solve design problems involving multiple responses of several different types. In this new approach, noise factors are incorporated into the robust design model using a combined array design, and the results of the experiment are optimized using a new approach that is formulated as a nonlinear goal programming problem. The results obtained from the proposed methodology are compared with those of other robust design methods in order to examine the trade-offs between meeting the objectives associated w...

47 citations


Journal ArticleDOI
TL;DR: In this paper, a new method for projecting the reliability of a structural system is presented, based on evidence theory, which is a natural tool for uncertainty quantification and risk assessment, especially in the design optimization of future aerospace structures where new technologies are being applied.
Abstract: Uncertainty quantification and risk assessment in the optimal design of structural systems has always been a critical consideration for engineers. When new technologies are developed or implemented and budgets are limited for full-scale testing, the result is insufficient datasets for construction of probability distributions. Making assumptions about these probability distributions can potentially introduce more uncertainty to the system than it quantifies. Evidence theory represents a method to handle epistemic uncertainty that represents a lack of knowledge or information in the numerical optimization process. Therefore, it is a natural tool to use for uncertainty quantification and risk assessment especially in the optimization design cycle for future aerospace structures where new technologies are being applied. For evidence theory to be recognized as a useful tool, it must be efficiently applied in a robust design optimization scheme. This article demonstrates a new method for projecting the reliabi...

46 citations


Journal ArticleDOI
TL;DR: An algorithm which combines two techniques for the numerical treatment of multi-objective optimization problems—a continuation method and a particle swarm optimizer—are proposed, some convergence results for continuous models are provided, and some numerical results are presented indicating the strength of this novel approach.
Abstract: Two techniques for the numerical treatment of multi-objective optimization problems—a continuation method and a particle swarm optimizer—are combined in order to unite their particular advantages. Continuation methods can be applied very efficiently to perform the search along the Pareto set, even for high-dimensional models, but are of local nature. In contrast, many multi-objective particle swarm optimizers tend to have slow convergence, but instead accomplish the ‘global task’ well. An algorithm which combines these two techniques is proposed, some convergence results for continuous models are provided, possible realizations are discussed, and finally some numerical results are presented indicating the strength of this novel approach.

Journal ArticleDOI
TL;DR: In this article, a bi-criteria no-wait flow shop scheduling problem (FSSP) is considered, in which weighted mean completion time and weighted mean tardiness are to be minimized simultaneously.
Abstract: The flow shop problem, as a typical manufacturing challenge, has gained wide attention in academic fields. This article considers a bi-criteria no-wait flow shop scheduling problem (FSSP) in which weighted mean completion time and weighted mean tardiness are to be minimized simultaneously. Since the FSSP has been proved to be NP-hard in the strong sense, a new multi-objective scatter search (MOSS) is designed for finding the locally Pareto-optimal frontier of the problem. To prove the efficiency of the proposed algorithm, various test problems are solved and the reliability of the proposed algorithm, based on some comparison metrics, is compared with a well-known multi-objective genetic algorithm (GA), SPEA-II. The computational results show that the proposed MOSS performs better than the above GA, especially for the large-sized problems.
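The two criteria can be evaluated for any job sequence by exploiting the no-wait property: the start of each job is delayed just enough that it never has to wait between machines. The sketch below computes weighted mean completion time and weighted mean tardiness for a given permutation; the job data are hypothetical, and a scatter search such as MOSS would search over such permutations.

```python
def evaluate(sequence, p, due, w):
    """p[j][m]: processing time of job j on machine m (no-wait flow shop)."""
    m = len(p[0])
    start = {}
    prev = None
    for j in sequence:
        if prev is None:
            start[j] = 0.0
        else:
            # Minimum delay so that job j never waits on any machine behind job prev.
            delay = max(sum(p[prev][:k + 1]) - sum(p[j][:k]) for k in range(m))
            start[j] = start[prev] + max(delay, 0.0)
        prev = j
    completion = {j: start[j] + sum(p[j]) for j in sequence}
    total_w = sum(w)
    wm_completion = sum(w[j] * completion[j] for j in sequence) / total_w
    wm_tardiness = sum(w[j] * max(0.0, completion[j] - due[j]) for j in sequence) / total_w
    return wm_completion, wm_tardiness

p   = [[3, 2, 4], [2, 5, 1], [4, 1, 3]]     # 3 jobs x 3 machines (hypothetical)
due = [8, 10, 12]
w   = [1.0, 2.0, 1.0]
print(evaluate([0, 1, 2], p, due, w))
```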

Journal ArticleDOI
TL;DR: An improved ant colony optimization-power plant maintenance scheduling optimization (ACO-PPMSO) formulation that allows maintenance tasks to be shortened or deferred during optimization is introduced and is shown to be capable of shortening maintenance durations in the event of expected demand shortfalls.
Abstract: It is common practice in the hydropower industry to either shorten the maintenance duration or to postpone maintenance tasks in a hydropower system when there is expected unserved energy based on current water storage levels and forecast storage inflows. It is therefore essential that a maintenance scheduling optimizer can incorporate the options of shortening the maintenance duration and/or deferring maintenance tasks in the search for practical maintenance schedules. In this article, an improved ant colony optimization-power plant maintenance scheduling optimization (ACO-PPMSO) formulation that considers such options in the optimization process is introduced. As a result, both the optimum commencement time and the optimum outage duration are determined for each of the maintenance tasks that need to be scheduled. In addition, a local search strategy is presented in this article to boost the robustness of the algorithm. When tested on a five-station hydropower system problem, the improved formula...

Journal ArticleDOI
TL;DR: In this paper, a mixed integer programming model is provided for the considered quay crane scheduling problem and a genetic algorithm (GA) is proposed to obtain near-optimal solutions; the computational results show that the proposed GA is effective and efficient in solving the problem.
Abstract: The quay crane scheduling problem studied in this article is to determine a handling sequence of ship bays for quay cranes assigned to a container ship considering handling priority of every ship bay. This article provides a mixed integer programming model for the considered problem. A genetic algorithm (GA) is proposed to obtain near-optimal solutions. Computational experiments to examine the proposed model and solution algorithm are described. The computational results show that the proposed GA is effective and efficient in solving the considered quay crane scheduling problem.

Journal ArticleDOI
Yakup Kara
TL;DR: In this paper, a mixed, zero-one, nonlinear mathematical programming formulation for balancing and sequencing MMULs simultaneously with the objective of reducing work overload is presented and its performance compared with existing approaches.
Abstract: Mixed-model U-lines (MMULs) are important elements of just-in-time production systems. For successful implementation of MMULs, a smoothed workload distribution among workstations is important. This requires that line balancing and model sequencing problems are solved simultaneously. This article presents a mixed, zero–one, nonlinear mathematical programming formulation for balancing and sequencing MMULs simultaneously with the objective of reducing work overload. Since the problem is NP-hard, an effective simulated annealing approach is also presented and its performance compared with existing approaches. The results show that the proposed simulated annealing algorithm outperforms existing approaches.
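A generic simulated annealing skeleton over sequences looks as follows; the swap neighbourhood and the placeholder cost are illustrative only, whereas the article's cost would measure the work overload of the resulting U-line balance and model sequence.

```python
import math
import random

def cost(seq):
    # Hypothetical stand-in: count out-of-order adjacent pairs.
    return sum(1 for i in range(len(seq) - 1) if seq[i] > seq[i + 1])

def simulated_annealing(n=12, t0=10.0, cooling=0.995, iters=20000, seed=3):
    rng = random.Random(seed)
    current = list(range(n))
    rng.shuffle(current)
    cur_cost = cost(current)
    best, best_cost, t = current[:], cur_cost, t0
    for _ in range(iters):
        i, j = rng.sample(range(n), 2)
        neighbour = current[:]
        neighbour[i], neighbour[j] = neighbour[j], neighbour[i]   # swap move
        delta = cost(neighbour) - cur_cost
        # Accept improvements always, worsenings with a temperature-dependent probability.
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current, cur_cost = neighbour, cur_cost + delta
            if cur_cost < best_cost:
                best, best_cost = current[:], cur_cost
        t *= cooling
    return best, best_cost

print(simulated_annealing())
```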

Journal ArticleDOI
TL;DR: The proposed model is the first fuzzy multi-objective decision-making approach to the SULB problem with multiple objectives which aims at simultaneously optimizing several conflicting goals and provides increased flexibility for the decision-maker(s) to determine different alternatives.
Abstract: A fuzzy goal programming model for the simple U-line balancing (SULB) problem with multiple objectives is presented. In real life applications of the SULB problem with multiple objectives, it is difficult for the decision-maker(s) to determine the goal value of each objective precisely as the goal values are imprecise, vague, or uncertain. Therefore a fuzzy goal programming model is developed for this purpose. The proposed model is the first fuzzy multi-objective decision-making approach to the SULB problem with multiple objectives which aims at simultaneously optimizing several conflicting goals. The proposed model is illustrated using an example. A computational study is conducted by solving a large number of test problems to investigate the relationship between the fuzzy goals and to compare them with the goal programming model proposed by Gokcen and Agpak (Gokcen, H. and Agpak, K., European Journal of Operational Research, 171, 577–585, 2006). The results of the computational study show that the propo...

Journal ArticleDOI
TL;DR: In this article, the authors proposed a deductive top-down estimation methodology, which combines intuitionistic fuzzy set (IFS) and ordered weighted averaging (OWA) operators to evaluate system reliability.
Abstract: In conventional system reliability analysis, the failure probabilities of components of a system are treated as exact values when the failure probability of the entire system is estimated. However, it may be difficult or even impossible to precisely determine the failure probabilities of components as early as the product design phase. Therefore, an efficient and simplified algorithm to assess system reliability is needed. This article proposes a deductive top-down estimation methodology, which combines intuitionistic fuzzy set (IFS) and ordered weighted averaging (OWA) operators to evaluate system reliability. A case of an aircraft propulsion system from an aerospace company is presented to further illustrate the proposed approach. After comparing results from the proposed method and two other approaches, this research found that the proposed approach provides a more accurate and reasonable reliability assessment.
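The OWA part of the aggregation is straightforward to illustrate: values are weighted by their rank rather than by their source. The weights and the expert failure estimates below are hypothetical, and the intuitionistic fuzzy set machinery of the article is omitted.

```python
import numpy as np

def owa(values, weights):
    """Aggregate values by weighting them in descending order of magnitude."""
    values = np.sort(np.asarray(values, dtype=float))[::-1]   # descending order
    weights = np.asarray(weights, dtype=float)
    assert len(weights) == len(values) and np.isclose(weights.sum(), 1.0)
    return float(values @ weights)

estimates = [0.02, 0.05, 0.01, 0.03]   # expert failure estimates (hypothetical)
weights   = [0.4, 0.3, 0.2, 0.1]       # emphasis on the more pessimistic values
print(owa(estimates, weights))
```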

Journal ArticleDOI
TL;DR: In this article, an interval-parameter robust quadratic programming (IRQP) method is developed by incorporating techniques of robust programming and interval-quadratic programming within a general optimization framework.
Abstract: Effective planning of water quality management is important for facilitating sustainable socio-economic development in watershed systems. An interval-parameter robust quadratic programming (IRQP) method is developed by incorporating techniques of robust programming and interval quadratic programming within a general optimization framework. The IRQP improves upon existing quadratic programming methods, and can tackle uncertainties presented as interval numbers and fuzzy sets as well as their combinations. Moreover, it can deal with nonlinearities in the objective function such that economies-of-scale effects can be reflected. The developed method is applied to a case study of water quality management under uncertainty. A number of decision alternatives are generated based on the interval solutions as well as the projected applicable conditions. They represent multiple decision options with various environmental and economic considerations. Willingness to accept a low economic revenue will guarantee satis...

Journal ArticleDOI
TL;DR: This article analyses the use of a grid-based genetic algorithm (GrEA) to solve a real-world instance of a problem from the telecommunication domain and shows that the search capability of GrEA clearly outperforms that of the equivalent non-grid algorithm.
Abstract: This article analyses the use of a grid-based genetic algorithm (GrEA) to solve a real-world instance of a problem from the telecommunication domain. The problem, known as automatic frequency planning (AFP), is used in a global system for mobile communications (GSM) networks to assign a number of fixed frequencies to a set of GSM transceivers located in the antennae of a cellular phone network. Real data instances of the AFP are very difficult to solve owing to the NP-hard nature of the problem, so combining grid computing and metaheuristics turns out to be a way to provide satisfactory solutions in a reasonable amount of time. GrEA has been deployed on a grid with up to 300 processors to solve an AFP instance of 2612 transceivers. The results not only show that significant running time reductions are achieved, but that the search capability of GrEA clearly outperforms that of the equivalent non-grid algorithm.

Journal ArticleDOI
TL;DR: A value-of-information based approach is presented for determining the appropriate extent of refinement of simulation models, focused on multi-objective compromise decisions modelled using the compromise decision support problem construct, which is a hybrid formulation based on traditional optimization and goal programming.
Abstract: The appropriateness of a simulation model for engineering design is dependent on the trade-off between model accuracy and the computational expense for its development and execution. Since no simulation model is perfect, any simulation model for a system's physical behaviour can be refined further, although likely at an increased computational cost. Hence, the question faced by a designer is ‘How much refinement of a simulation model is appropriate for a particular design problem?’ The simplified nature of simulation models results in two types of uncertainty—variability, which can be modelled using probability distribution functions and imprecision, best modelled using intervals. Value-of-information has been used in the engineering design literature to decide whether to make a decision using the available information or to gather more information before making a decision. However, the main drawback of applying existing value-of-information based metrics for model refinement problems is that exi...

Journal ArticleDOI
TL;DR: In this article, a multicriteria maximum-entropy approach to the joint layout, pipe size, and reliability optimization of water distribution systems is presented, where the capital cost of the system is taken as the principal criterion, and so the trade-offs between cost, entropy, reliability and redundancy are examined sequentially in a large population of optimal solutions.
Abstract: A multicriteria maximum-entropy approach to the joint layout, pipe size and reliability optimization of water distribution systems is presented. The capital cost of the system is taken as the principal criterion, and so the trade-offs between cost, entropy, reliability and redundancy are examined sequentially in a large population of optimal solutions. The novelty of the method stems from the use of the maximum-entropy value as a preliminary filter, which screens out a large proportion of the candidate layouts at an early stage of the process before the designs and their reliability values are actually obtained. This technique, which is based on the notion that the entropy is potentially a robust hydraulic reliability measure, contributes greatly to the efficiency of the proposed method. The use of head-dependent modelling for simulating pipe failure conditions in the reliability calculations also complements the method in locating the Pareto-optimal front. The computational efficiency, robustness, accura...
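As a much-simplified illustration of why entropy can serve as a reliability surrogate, the sketch below computes the Shannon entropy of the flow split supplying a single node: the more evenly demand is shared among pipes, the higher the entropy and the less the node depends on any one pipe. This is not the full network entropy formulation used in the article, and the flow values are hypothetical.

```python
import math

def flow_entropy(flows):
    """Shannon entropy of the fractions of total flow carried by each pipe."""
    total = sum(flows)
    fractions = [q / total for q in flows if q > 0]
    return -sum(f * math.log(f) for f in fractions)

print(flow_entropy([50.0, 50.0]))    # even split: higher entropy
print(flow_entropy([95.0, 5.0]))     # dominated by one pipe: lower entropy
```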

Journal ArticleDOI
TL;DR: In this article, an optimization methodology is developed for determining the most cost-effective maintenance and rehabilitation (M&R) activities for each pavement section in a highway pavement network, along an extended planning horizon.
Abstract: An optimization methodology is developed for determining the most cost-effective maintenance and rehabilitation (M&R) activities for each pavement section in a highway pavement network, along an extended planning horizon. A multi-dimensional 0–1 knapsack problem with M&R strategy-selection and precedence-feasibility constraints is formulated to maximize the total dollar value of benefits associated with the selected pavement improvement activities. The solution approach is a hybrid dynamic programming and branch-and-bound procedure. The imbedded-state approach is used to reduce multi-dimensional dynamic programming to a one-dimensional problem. Bounds at each stage are determined by using Lagrangian optimization to solve a relaxed problem by means of a sub-gradient optimization method. Tests for the proposed solution methodology are conducted using typical data obtained from the Texas Department of Transportation.
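The selection core of the model is a knapsack-type recursion. The sketch below shows a plain one-dimensional 0-1 knapsack dynamic program for choosing M&R activities under a single budget; the article's model is multi-dimensional with strategy-selection and precedence constraints and is solved by the hybrid DP/branch-and-bound described above. Costs and benefits are hypothetical.

```python
def knapsack(costs, benefits, budget):
    """Maximum total benefit from activities whose total cost fits the budget."""
    best = [0.0] * (budget + 1)
    for c, b in zip(costs, benefits):
        # Iterate the budget downwards so each activity is selected at most once.
        for remaining in range(budget, c - 1, -1):
            best[remaining] = max(best[remaining], best[remaining - c] + b)
    return best[budget]

costs    = [4, 3, 5, 2]                      # cost of each candidate M&R activity
benefits = [10.0, 7.0, 12.0, 4.0]            # benefit of each candidate activity
print(knapsack(costs, benefits, budget=9))   # best total benefit within the budget
```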

Journal ArticleDOI
TL;DR: The reliability-based optimum design of laminated composites is modelled and solved using the improved PSO, and the maximization of structural reliability and the minimization of the total weight of laminates are analysed.
Abstract: A new approach to the particle swarm optimization (PSO) is proposed for the solution of non-linear optimization problems with constraints, and is applied to the reliability-based optimum design of laminated composites. Special mutation-interference operators are introduced to increase swarm variety and improve the convergence performance of the algorithm. The reliability-based optimum design of laminated composites is modelled and solved using the improved PSO. The maximization of structural reliability and the minimization of total weight of laminates are analysed. The stacking sequence optimization is implemented in the improved PSO by using a special coding technique. Examples show that the improved PSO has high convergence and good stability and is efficient in dealing with the probabilistic optimal design of composite structures.

Journal ArticleDOI
TL;DR: In this article, a sequential robust design-tolerance design optimization procedure is proposed to minimize the total cost incurred by both the customer and manufacturer by balancing quality loss due to variations in product performance and the cost of controlling these variations.
Abstract: Many practitioners and researchers have implemented robust design and tolerance design as quality improvement and process optimization tools for more than two decades. Robust design is an enhanced process/product design methodology for determining the best settings of control factors while minimizing process bias and variability. Tolerance design is aimed at determining the best tolerance limits for minimizing the total cost incurred by both the customer and manufacturer by balancing quality loss due to variations in product performance and the cost of controlling these variations. Although robust design and tolerance design have received much attention from researchers and practitioners, there is ample room for improvement. First, most researchers consider robust design and tolerance design as separate research fields. Second, most research work is based on a single quality characteristic. The primary goal of this paper is to integrate a sequential robust design–tolerance design optimization procedure wi...

Journal ArticleDOI
TL;DR: In this paper, a multiobjective optimization method for structural problems based on multi-objective particle swarm optimization (MOPSO) is proposed, which combines a gradient-based optimization method with MOPSO to alleviate constraint-handling difficulties.
Abstract: This article proposes a new multiobjective optimization method for structural problems based on multiobjective particle swarm optimization (MOPSO). A gradient-based optimization method is combined with MOPSO to alleviate constraint-handling difficulties. In this method, a group of particles is divided into two groups—a dominated solution group and a non-dominated solution group. The gradient-based method, utilizing a weighting coefficient method, is applied to the latter to conduct local searching that yields superior non-dominated solutions. In order to enhance the efficiency of exploration in a multiple objective function space, the weighting coefficients are adaptively assigned considering the distribution of non-dominated solutions. A linear optimization problem is solved to determine the optimal weighting coefficients for each non-dominated solution at each iteration. Finally, numerical and structural optimization problems are solved by the proposed method to verify the optimization efficiency.

Journal ArticleDOI
TL;DR: In this paper, a hybrid evolutionary algorithm consisting of a genetic algorithm (GA) and particle swarm optimization (PSO) was proposed to solve the problem of diesel engine combustion chamber optimization, where GA maintains diverse solutions of good quality in multi-objective problems while PSO shows fast convergence to the optimum solution.
Abstract: A hybrid evolutionary algorithm, consisting of a genetic algorithm (GA) and particle swarm optimization (PSO), is proposed. Generally, GAs maintain diverse solutions of good quality in multi-objective problems, while PSO shows fast convergence to the optimum solution. By coupling these algorithms, the GA compensates for the low diversity of PSO, while PSO compensates for the high computational costs of the GA. The hybrid algorithm was validated using standard test functions. The results showed that the hybrid algorithm has better performance than either a pure GA or pure PSO. The method was applied to an engineering design problem: the geometry of a diesel engine combustion chamber was optimized to reduce exhaust emissions such as NOx, soot and CO. The results demonstrated the usefulness of the present method for this engineering design problem. To identify the relation between exhaust emissions and combustion chamber geometry, data mining was performed with a self-organising map (SOM). The results indicate th...

Journal ArticleDOI
TL;DR: The IFIP method is applied to a solid waste management system to illustrate its performance in supporting decision-making and is capable of addressing uncertainties arising from not only the imprecise information but also complex relations to external impact factors.
Abstract: An interval full-infinite programming (IFIP) method is developed by introducing a concept of functional intervals into an optimization framework. Since the solutions of the problem should be ‘globally’ optimal under all possible levels of the associated impact factors, the number of objectives and constraints is infinite. To solve the IFIP problem, it is converted to two interactive semi-infinite programming (SIP) submodels that can be solved by conventional SIP solution algorithms. The IFIP method is applied to a solid waste management system to illustrate its performance in supporting decision-making. Compared to conventional interval linear programming (ILP) methods, the IFIP is capable of addressing uncertainties arising from not only the imprecise information but also complex relations to external impact factors. Compared to SIP that can only handle problems containing infinite constraints, the IFIP approaches are useful for addressing inexact problems with infinite objectives and constraints.

Journal ArticleDOI
TL;DR: In this paper, a study of the relationship between the statistical entropy and hydraulic reliability of water distribution systems (WDS) was carried out by assessing the effects of layout, flow direction, and pipe costs.
Abstract: A study of the relationship between the statistical entropy and hydraulic reliability of water distribution systems (WDS) was carried out by assessing the effects of layout, flow direction, and pipe costs. Because of an invariance property of the entropy function, different WDS layouts can have identical maximum entropy values. The properties of designs that have identical maximum entropy values were also compared. The results reinforce previous observations that entropy is a potential surrogate measure for hydraulic reliability and suggest that any influences due to the design factors mentioned above are negligible. Maximum entropy designs are shown to be more reliable than other designs, while designs with different layouts but equal maximum entropy values have very similar levels of reliability. The head-dependent analysis method was used and revealed the correlation between entropy and reliability more clearly than hitherto achieved using demand-driven analysis.

Journal ArticleDOI
TL;DR: In this paper, a mixed integer programming model is developed to determine which recycling strategy and which production pattern should be selected with what quantity of waste generated by a typical class of industrial waste recycling process in order to maximize profit.
Abstract: Manufacturers have a legal accountability to deal with industrial waste generated from their production processes in order to avoid pollution. Along with advances in waste recovery techniques, manufacturers may adopt various recycling strategies in dealing with industrial waste. With reuse strategies and technologies, byproducts or wastes will be returned to production processes in the iron and steel industry, and some waste can be recycled back to base material for reuse in other industries. This article focuses on a recovery strategies optimization problem for a typical class of industrial waste recycling process in order to maximize profit. There are multiple strategies for waste recycling available to generate multiple byproducts; these byproducts are then further transformed into several types of chemical products via different production patterns. A mixed integer programming model is developed to determine which recycling strategy and which production pattern should be selected with what quantity of...

Journal ArticleDOI
TL;DR: In this article, a model for the optimal long-term design and upgrading of new and existing water distribution networks is presented, which reduces the size of the problem and enables the application of linear programming for pipe size optimization.
Abstract: Given a limited budget, the choice of the best water distribution network upgrading strategy is a complex optimization problem. A model for the optimal long-term design and upgrading of new and existing water distribution networks is presented. A key strength of the methodology is the use of maximum entropy flows, which reduces the size of the problem and enables the application of linear programming for pipe size optimization. It also ensures the reliability level is high. The capital and maintenance costs and hydraulic performance are considered simultaneously for a predefined design horizon. The timing of upgrading over the entire planning horizon is obtained by dynamic programming. The deterioration over time of the structural integrity and hydraulic capacity of every pipe are explicitly considered. The upgrading options considered include pipe paralleling and replacement. The effectiveness of the model is demonstrated using the water supply network of Wobulenzi town in Uganda.