
Showing papers on "Heuristic published in 1990"


Book
01 Jan 1990
TL;DR: Introduction; Resources and Inspirations; Heuristic Concepts, Processes, and Validation; Research Design and Methodology; Examples of Heuristic Research; Applications of Heuristic Research
Abstract: Introduction; Resources and Inspirations; Heuristic Concepts, Processes, and Validation; Research Design and Methodology; Examples of Heuristic Research; Applications of Heuristic Research

1,304 citations


Proceedings ArticleDOI
11 Nov 1990
TL;DR: In this article, the authors propose a method based on transition relations that only requires the ability to compute the binary decision diagram for f_i and outperforms Coudert's (1990) algorithm for most examples.
Abstract: The authors propose a novel method based on transition relations that only requires the ability to compute the BDD (binary decision diagram) for f_i and outperforms O. Coudert's (1990) algorithm for most examples. The method offers a simple notational framework to express the basic operations used in BDD-based state enumeration algorithms in a unified way and a set of techniques that can speed up range computation dramatically, including a variable ordering heuristic and a method based on transition relations.
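As a rough illustration of what range computation over a transition relation means, the sketch below enumerates reachable states as a fixed point of image computations, with plain Python sets standing in for the BDDs the paper manipulates symbolically; the names are mine, and nothing here reflects the authors' variable-ordering or speed-up techniques.

```python
# Illustrative sketch (not the authors' BDD-based implementation): reachable-state
# enumeration as a fixed point of image computations over a transition relation.
# In the paper, the state sets and the relation would be BDDs; plain sets stand in
# for them here to show the structure of the loop.

def image(transition_relation, states):
    """Range computation: all successors of `states` under the relation."""
    return {t for (s, t) in transition_relation if s in states}

def reachable(initial_states, transition_relation):
    """Least fixed point: states reachable from the initial set."""
    reached = set(initial_states)
    frontier = set(initial_states)
    while frontier:
        new = image(transition_relation, frontier) - reached
        reached |= new
        frontier = new
    return reached

# Tiny example: a 3-state machine 0 -> 1 -> 2, with a self-loop on 2.
T = {(0, 1), (1, 2), (2, 2)}
print(reachable({0}, T))   # {0, 1, 2}
```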

371 citations


Journal ArticleDOI
TL;DR: This paper presents an efficient scheduling algorithm for dynamic scheduling in real-time systems that focuses its attention on a small subset of tasks with the shortest deadlines and is shown to be very effective when the maximum allowable scheduling overhead is fixed.
Abstract: Efficient scheduling algorithms based on heuristic functions are developed for scheduling a set of tasks on a multiprocessor system. The tasks are characterized by worst-case computation times, deadlines, and resource requirements. Starting with an empty partial schedule, each step of the search extends the current partial schedule by including one of the tasks yet to be scheduled. The heuristic functions used in the algorithm actively direct the search for a feasible schedule, i.e. they help choose the task that extends the current partial schedule. Two scheduling algorithms are evaluated by simulation. To extend the current partial schedule, one of the algorithms considers, at each step of the search, all the tasks that are yet to be scheduled as candidates. The second focuses its attention on a small subset of tasks with the shortest deadlines. The second algorithm is shown to be very effective when the maximum allowable scheduling overhead is fixed. This algorithm is hence appropriate for dynamic scheduling in real-time systems.
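The search structure described above is easy to sketch. The following is a minimal illustration, not the authors' algorithm: it keeps only the focused candidate set and a single stand-in heuristic (minimum laxity), checks deadlines against worst-case times on identical processors, and omits resource constraints and backtracking; all names and parameters are illustrative.

```python
def schedule(tasks, num_procs, k=5):
    """Greedy sketch of the focused search: tasks are (name, worst_case_time, deadline);
    at each step only the k unscheduled tasks with the shortest deadlines are candidates,
    and the candidate with the least laxity (deadline minus computation time) extends the
    partial schedule on the earliest-free processor.  Returns a schedule or None if the
    extension would miss a deadline (a real scheduler would backtrack here)."""
    remaining = sorted(tasks, key=lambda t: t[2])          # order by deadline
    proc_free = [0.0] * num_procs                          # next free time of each processor
    partial = []
    while remaining:
        candidates = remaining[:k]                         # focus on the shortest deadlines
        task = min(candidates, key=lambda t: t[2] - t[1])  # heuristic: minimum laxity
        name, wcet, deadline = task
        p = min(range(num_procs), key=proc_free.__getitem__)
        finish = proc_free[p] + wcet
        if finish > deadline:
            return None
        proc_free[p] = finish
        partial.append((name, p, finish - wcet, finish))
        remaining.remove(task)
    return partial

print(schedule([("a", 2.0, 4.0), ("b", 3.0, 6.0), ("c", 1.0, 3.0)], num_procs=2))
```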

349 citations


Proceedings Article
29 Jul 1990
TL;DR: A theoretical analysis is presented to explain why the heuristic method for solving large-scale constraint satisfaction and scheduling problems works so well on certain types of problems and to predict when it is likely to be most effective.
Abstract: This paper describes a simple heuristic method for solving large-scale constraint satisfaction and scheduling problems. Given an initial assignment for the variables in a problem, the method operates by searching through the space of possible repairs. The search is guided by an ordering heuristic, the min-conflicts heuristic, that attempts to minimize the number of constraint violations after each step. We demonstrate empirically that the method performs orders of magnitude better than traditional backtracking techniques on certain standard problems. For example, the one million queens problem can be solved rapidly using our approach. We also describe practical scheduling applications where the method has been successfully applied. A theoretical analysis is presented to explain why the method works so well on certain types of problems and to predict when it is likely to be most effective.
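A minimal sketch of the min-conflicts repair loop on the n-queens problem (an illustration of the method described above, not the authors' code): start from a complete random assignment, repeatedly pick a conflicted queen, and move it to the row that minimizes its number of conflicts.

```python
import random

def min_conflicts_queens(n, max_steps=100000):
    """Min-conflicts repair for n-queens: queen c sits in column c, rows[c] is its row."""
    rows = [random.randrange(n) for _ in range(n)]

    def conflicts(col, row):
        # number of other queens attacking square (row, col)
        return sum(1 for c in range(n) if c != col and
                   (rows[c] == row or abs(rows[c] - row) == abs(c - col)))

    for _ in range(max_steps):
        conflicted = [c for c in range(n) if conflicts(c, rows[c]) > 0]
        if not conflicted:
            return rows                       # feasible assignment found
        col = random.choice(conflicted)       # repair a randomly chosen violated variable
        # move it to the row that minimizes the number of constraint violations
        rows[col] = min(range(n), key=lambda r: conflicts(col, r))
    return None

print(min_conflicts_queens(64))
```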

331 citations


Journal ArticleDOI
Fayer F. Boctor
TL;DR: Some multi-heuristic procedures employing parallel rules as well as serial rules are suggested to solve the well-known NP-hard resource-constrained project scheduling problem.

272 citations


Journal ArticleDOI
01 Jul 1990
TL;DR: Lower bounds on the optimal cost-to-go from the information-theoretic concepts of Huffman coding and entropy are derived and have made it possible to obtain optimal test sequences for problems that are intractable with traditional dynamic programming techniques.
Abstract: The problem of constructing optimal and near-optimal test sequences to diagnose permanent faults in electronic and electromechanical systems is considered. The test sequencing problem is formulated as an optimal binary AND/OR decision tree construction problem, whose solution is known to be NP-complete. The approach used is based on integrated concepts from information theory and heuristic AND/OR graph search methods to subdue the computational explosion of the optimal test-sequencing problem. Lower bounds on the optimal cost-to-go from the information-theoretic concepts of Huffman coding and entropy are derived. These lower bounds ensure that an optimal solution is found using the heuristic AND/OR graph search algorithms; they have made it possible to obtain optimal test sequences for problems that are intractable with traditional dynamic programming techniques. In addition, a class of test-sequencing algorithms that provides a tradeoff between solution quality and complexity has been derived using the ε-optimal and limited search strategies.
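The information-theoretic lower bounds mentioned above can be illustrated directly. Assuming unit test costs (the paper handles general test costs), the expected number of binary tests needed to isolate a fault with prior probabilities p_i is bounded below by the entropy of that distribution, and the Huffman-code expected length is the tightest bound achievable by any binary splitting. A small sketch of both follows; the function names are my own.

```python
import heapq, math

def entropy_bound(probs):
    """Shannon entropy H(p) in bits: a lower bound on the expected number of binary tests."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def huffman_expected_length(probs):
    """Expected codeword length of a Huffman code over the fault distribution,
    i.e. the expected depth of the best binary decision tree when every test may
    split the remaining candidate set arbitrarily (unit test costs assumed)."""
    heap = list(probs)
    heapq.heapify(heap)
    total = 0.0
    while len(heap) > 1:
        a = heapq.heappop(heap)
        b = heapq.heappop(heap)
        total += a + b          # expected length equals the sum of merge weights
        heapq.heappush(heap, a + b)
    return total

faults = [0.5, 0.25, 0.125, 0.125]          # prior fault probabilities
print(entropy_bound(faults), huffman_expected_length(faults))   # 1.75 1.75
```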

237 citations


Journal ArticleDOI
TL;DR: In this article, the authors extend Kohli and Krishnamurti's 1987 dynamic-programming heuristic for selecting a single item maximizing share to structure product lines maximizing share, seller's return, or buyers' utilitarian welfare.
Abstract: Recently proposed methods for product-line selection use the total utilities of candidate items to construct product lines maximizing seller's return or buyers' welfare. For conjoint and hybrid-conjoint data, enumerating the utilities of candidate items can be computationally infeasible if the number of attributes and attribute levels is large and most multi-attribute alternatives are feasible. For such problems, constructing product lines directly from part-worths data is preferable. We propose such methods, extending Kohli and Krishnamurti's 1987 dynamic-programming heuristic for selecting a single item maximizing share to structure product lines maximizing share, seller's return, or buyers' utilitarian welfare. The computational performance of the heuristics and their approximation of product-line solutions is evaluated using simulated data. Across problem instances, the dynamic-programming heuristics identify solutions that are no worse, in terms of approximating optimal solutions, than the solutions of heuristics for the current two-step approaches to product-line design. An application using hybrid-conjoint data for a consumer-durable product is described.
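As background for the share objective, here is a minimal sketch of the buyer-choice computation the heuristics optimize: a respondent's utility for an item is the sum of the part-worths of its attribute levels, and the item's share of choices is the fraction of respondents for whom that utility beats their status-quo utility. This is only the evaluation step, not the dynamic-programming heuristic itself; the data layout and names are assumptions.

```python
def item_utility(partworths, item):
    """partworths: one list of level part-worths per attribute;
    item: the chosen level index for each attribute."""
    return sum(pw[level] for pw, level in zip(partworths, item))

def share_of_choices(respondents, status_quo, item):
    """Fraction of respondents whose utility for `item` exceeds their status-quo utility."""
    wins = sum(1 for pw, sq in zip(respondents, status_quo)
               if item_utility(pw, item) > sq)
    return wins / len(respondents)

# Two respondents, two attributes with 2 and 3 levels.
respondents = [
    [[0.0, 1.0], [0.2, 0.5, 0.9]],
    [[0.5, 0.1], [0.8, 0.3, 0.0]],
]
status_quo = [1.2, 0.9]
print(share_of_choices(respondents, status_quo, item=(1, 2)))  # 0.5
```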

235 citations


Proceedings ArticleDOI
11 Nov 1990
TL;DR: In this paper, the main idea pursued is to derive a minimal feedback vertex set of the so-called S-graphs, which is then used to determine flip-flops to be scanned in partial-scan designs for sequential circuits.
Abstract: A report is presented on procedures investigated to determine flip-flops to be scanned in partial-scan designs for sequential circuits. The main idea pursued is to derive a minimal feedback vertex set of the so-called S-graphs. Results of applying optimal and heuristic procedures on a set of benchmark circuits indicate that heuristic methods give fast and near minimal solutions.
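For flavor, here is a greedy feedback-vertex-set heuristic of the generic kind (not necessarily the procedure used in the paper): vertices that cannot lie on a cycle are stripped, self-loops are cut, and otherwise the vertex with the largest in-degree times out-degree product is moved into the cut set. In the partial-scan setting the cut vertices would be the flip-flops selected for scan; all names are illustrative.

```python
def greedy_feedback_vertex_set(vertices, edges):
    """Greedy heuristic for a (near-)minimal feedback vertex set of a digraph."""
    succ = {v: set() for v in vertices}
    pred = {v: set() for v in vertices}
    for u, v in edges:
        succ[u].add(v)
        pred[v].add(u)

    def delete(v):
        for w in succ.pop(v):
            pred[w].discard(v)
        for w in pred.pop(v):
            succ[w].discard(v)

    fvs = set()
    while succ:
        # strip vertices that cannot be on a cycle (no successors or no predecessors)
        trivial = [v for v in succ if not succ[v] or not pred[v]]
        if trivial:
            for v in trivial:
                delete(v)
            continue
        loops = [v for v in succ if v in succ[v]]   # self-loops must be cut
        if loops:
            v = loops[0]
        else:
            v = max(succ, key=lambda x: len(succ[x]) * len(pred[x]))
        delete(v)
        fvs.add(v)
    return fvs

# Flip-flop dependency graph with one cycle a -> b -> c -> a plus a chain to d.
print(greedy_feedback_vertex_set("abcd", [("a","b"),("b","c"),("c","a"),("c","d")]))
# one vertex breaking the cycle, e.g. {'a'}
```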

229 citations


Book ChapterDOI
01 Oct 1990
TL;DR: In this paper the optimal parameter setting of Genetic Algorithms (GAs) is investigated and a heuristic comprising these results is presented.
Abstract: In this paper the optimal parameter setting of Genetic Algorithms (GAs) is investigated. Particular attention has been paid to the dependence of the mutation probability P_M upon two parameters, the dimension of the configuration space l and the population size M. Assuming strict conditions on both the problem to be optimized and the GA, P_M converges to 0 as the population size M or the dimension of the configuration space l converges to infinity. For direct application, a heuristic comprising these results is presented. The parameter settings obtained by applying this heuristic are in accordance with those which have been obtained earlier by experiment.
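A commonly cited reading of this result is a mutation rate that shrinks with both the population size M and the string length l. The 1/(M·sqrt(l)) form and the constant in the sketch below are an assumption for illustration only, not the paper's exact expression.

```python
import math

def mutation_probability(population_size, chromosome_length, c=1.0):
    """Illustrative mutation-rate heuristic: P_M decreases as the population size M
    and the dimension of the configuration space l grow.  The 1/(M*sqrt(l)) form and
    the constant c are assumptions, not the paper's formula."""
    return c / (population_size * math.sqrt(chromosome_length))

print(mutation_probability(population_size=50, chromosome_length=100))  # 0.002
```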

223 citations


Journal ArticleDOI
TL;DR: It is concluded that a simple and fast heuristic algorithm, such as HNF, may be sufficient to achieve adequate performance in terms of program execution time and processors' idle time.

209 citations


Journal ArticleDOI
TL;DR: Two algorithms for the k-satisfiability problem are presented, a probabilistic analysis is performed, and it is shown that the first algorithm finds a solution with probability approaching one for a wide range of parameter values.

Proceedings Article
29 Jul 1990
TL;DR: This paper presents a projection algorithm for incremental control rule synthesis that synthesizes an initial set of goal-achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic to achieve a computationally effective balance between the limited robustness of triangle tables and the absolute robustnessof universal plans.
Abstract: This paper presents a projection algorithm for incremental control rule synthesis. The algorithm synthesizes an initial set of goal-achieving control rules using a combination of situation probability and estimated remaining work as a search heuristic. This set of control rules has a certain probability of satisfying the given goal. The probability is incrementally increased by synthesizing additional control rules to handle "error" situations the execution system is likely to encounter when following the initial control rules. By using situation probabilities the algorithm achieves a computationally effective balance between the limited robustness of triangle tables and the absolute robustness of universal plans.

Journal ArticleDOI
TL;DR: In this article, the authors present a comparative analysis of fourteen heuristic rules, examining the impact of the number of different item types in a load on the loading efficiency achieved.

Journal ArticleDOI
TL;DR: For this three-dimensional cutting-stock problem, various suboptimal solutions are generated using the proposed computer-based heuristic for packing rectangular boxes of different sizes in a shipping container of known dimensions.

Journal ArticleDOI
TL;DR: A very general, yet powerful backtracking procedure for solving the duration minimization and net present value maximization problems in a precedence and resource-constrained network of the PERT/CPM variety.

Book ChapterDOI
01 Apr 1990
TL;DR: This article argues that wisdom is recognized not so much by the problem solution as by the question or problem that is found, and that wisdom lies more in the particular questions that are posed than in the solutions that are given form by those questions.
Abstract: Whereas there can be wisdom in answers that are given or in problem solutions, wisdom is not simply defined by solutions or by answers. Wisdom may be more a matter of interrogatives rather than of declaratives. Answers and problems with their solutions are parts of a larger whole that includes the formulation of the problem and the question(s) that drove that formulation. Wisdom is found more in particular questions that are posed than in the solutions that are given form by those questions. Wertheimer (1945, p. 123) argued that the “ … function of thinking is not just solving an actual problem but discovering, envisaging, going into deeper questions. Often, in great discovery the most important thing is that a certain question is found.” The same may be said of wisdom. Wisdom may be recognized not so much by the problem solution but rather by the question or problem that is found. Wisdom may be the means by which one discovers, envisages, or goes into deeper questions. Some of these deeper questions are the productive questions of Wertheimer (1945), the generic questions of Mackworth (1965), and the new problems of Piaget (1980): “Clearly though the modest facts assembled may have permitted us to answer a few minor outstanding questions, they continue to pose a host of problems. … [but] new problems are often more important than the accepted solutions” (Piaget, 1980, p. 304).

Journal ArticleDOI
TL;DR: In this paper, the critical path method is used to assign project activities to specific days so that the final resource histogram approaches a rectangle and its moment approaches a minimum value, and the resulting leveled histogram is the same as or very close to, that produced by other optimization or heuristic methods.
Abstract: A new heuristic for resource leveling based upon the critical path method is developed. The minimum moment of the resource histogram is used to measure the level of resources. The heuristic assigns project activities to specific days so that the final resource histogram approaches a rectangle and its moment approaches a minimum value. Activities are listed in a priority order and all possible assignments for each are determined. Incremental moments contributed by the activity's resource rate and penalties that recognize network interactions are calculated. Each activity is positioned in the time span where the sum of these quantities is a minimum. The histogram is thereby built step‐by‐step until all activities have been positioned within the constraints of a CPM or PERT network. The resulting leveled histogram is the same as, or very close to, that produced by other optimization or heuristic methods. The method is clear, logical, and computationally efficient whether the leveling is done manually or by c...
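The core placement step is easy to sketch: for one activity with a constant daily resource rate, try every start day inside its float window and keep the start that minimizes the moment (sum of squared daily totals) of the resulting histogram. The priority ordering and the penalty terms for network interactions described above are omitted, and all names are mine.

```python
def histogram_moment(daily_totals):
    """Moment of the resource histogram about the time axis: sum of squared daily totals."""
    return sum(r * r for r in daily_totals)

def best_start(histogram, rate, duration, earliest, latest):
    """Place one activity (constant daily `rate` for `duration` days) at the start day,
    between `earliest` and `latest`, that minimizes the resulting histogram moment."""
    best_day, best_m = None, float("inf")
    for start in range(earliest, latest + 1):
        trial = histogram[:]
        for d in range(start, start + duration):
            trial[d] += rate
        m = histogram_moment(trial)
        if m < best_m:
            best_day, best_m = start, m
    return best_day, best_m

# 10-day histogram already containing other activities' resource demands.
hist = [4, 4, 6, 6, 2, 2, 2, 0, 0, 0]
print(best_start(hist, rate=3, duration=3, earliest=2, latest=7))
```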

Journal ArticleDOI
TL;DR: This work investigates the feasibility of applying connectionist networks with hidden units to forecasting and process control and develops a particular approach which embeds input-output pairs in a state space using delay coordinates.

Journal ArticleDOI
TL;DR: It appears that one can achieve a high degree of equity by modestly increasing the total risk and by embarking on different routes to evenly spread the risk among the zones, and it appears that the heuristic procedure is excellent in terms of computational requirements as well as solution quality.
Abstract: In this paper, we develop and analyze a model to generate an equitable set of routes for hazardous material shipments. The objective is to determine a set of routes that will minimize the total risk of travel and spread the risk equitably among the zones of the geographical region in which the transportation network is embedded, when several trips are necessary from origin to destination. An integer programming formulation for the problem is proposed. We develop and test a heuristic that repeatedly solves single-trip problems: a Lagrangian dual approach with a gap-closing procedure is used to optimally solve single-trip problems. We report a sampling of our computational experience, based on a real-life routing scenario in the Albany district of New York State. Our findings indicate that one can achieve a high degree of equity by modestly increasing the total risk and by embarking on different routes to evenly spread the risk among the zones. Furthermore, it appears that our heuristic procedure is excellent in terms of computational requirements as well as solution quality. We also suggest some directions for future research.
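A much-simplified sketch of the equity idea (not the paper's integer program or its Lagrangian dual with gap closing): route the trips one at a time with a shortest-path computation in which an edge's cost is its risk plus a penalty proportional to the risk already accumulated in that edge's zone, so that later trips are pushed onto different routes. The graph encoding, penalty weight, and names are all assumptions.

```python
import heapq

def cheapest_path(nodes, edges, src, dst, zone_load, weight):
    """Dijkstra with edge cost = risk + weight * (risk already placed on the edge's zone)."""
    adj = {v: [] for v in nodes}
    for u, v, risk, zone in edges:
        adj[u].append((v, risk, zone))
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, risk, zone in adj[u]:
            nd = d + risk + weight * zone_load.get(zone, 0.0)
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, (u, risk, zone)
                heapq.heappush(heap, (nd, v))
    path, v = [], dst
    while v != src:
        u, risk, zone = prev[v]
        path.append((u, v, risk, zone))
        v = u
    return list(reversed(path))

def equitable_routes(nodes, edges, src, dst, trips, weight=1.0):
    """Route several trips, spreading risk across zones via the load penalty."""
    zone_load, routes = {}, []
    for _ in range(trips):
        path = cheapest_path(nodes, edges, src, dst, zone_load, weight)
        for _, _, risk, zone in path:
            zone_load[zone] = zone_load.get(zone, 0.0) + risk
        routes.append(path)
    return routes, zone_load

# Two parallel corridors between s and t through different zones:
# the first trip takes the lower-risk east corridor, the second is pushed west.
nodes = ["s", "a", "b", "t"]
edges = [("s","a",2.0,"east"), ("a","t",2.0,"east"),
         ("s","b",3.0,"west"), ("b","t",3.0,"west")]
routes, load = equitable_routes(nodes, edges, "s", "t", trips=2, weight=2.0)
print([[(u, v) for u, v, _, _ in r] for r in routes], load)
```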

Journal ArticleDOI
TL;DR: The recursive allocation scheme is shown to be effective on a number of large test task graphs-its solution quality is nearly as good as that produced by simulated annealing, and its computation time is several orders of magnitude less.

Journal ArticleDOI
TL;DR: This paper examines computational complexity issues and develops and analyzes heuristic algorithms for a class of "shoreline" single-vehicle routing and scheduling problems with release time constraints.
Abstract: In this paper we examine computational complexity issues and develop algorithms for a class of "shoreline" single-vehicle routing and scheduling problems with release time constraints. Problems in this class are interesting for both practical and theoretical reasons. From a practical perspective, these problems arise in several transportation environments. For instance, in the routing and scheduling of cargo ships, the routing structure is "easy" because the ports to be visited are usually located along a shoreline. However, because release times of cargoes at ports generally complicate the routing structure, the combined routing and scheduling problem is nontrivial. For the straight-line case, a restriction of the shoreline case, our analysis shows that the problem of minimizing the maximum completion time can be solved exactly in quadratic time by dynamic programming. For the shoreline case we develop and analyze heuristic algorithms. We derive data-dependent worst-case performance ratios for these heuristics that are bounded by a constant. We also discuss how these algorithms perform on practical data.
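To make the problem concrete, here is a tiny sketch of a route evaluation under the shoreline structure: visit the ports in shoreline order, wait whenever a cargo's release time has not yet passed, and report the completion time of the last port. The unit travel speed, the fixed visiting order, and the names are assumptions, and this is not one of the heuristics analyzed in the paper.

```python
def shoreline_makespan(ports):
    """ports: list of (position_along_shoreline, release_time), visited left to right
    starting from position 0 at time 0.  Returns the completion time of the last port."""
    time, position = 0.0, 0.0
    for pos, release in sorted(ports):           # shoreline order
        time += abs(pos - position)              # unit-speed travel
        time = max(time, release)                # wait until the cargo is released
        position = pos
    return time

print(shoreline_makespan([(2, 0.0), (5, 9.0), (7, 3.0)]))  # 11.0
```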

Journal ArticleDOI
TL;DR: In this paper, a heuristic method is developed for optimal design of a general linear consecutive-k-ou-of-n system using the concept of component reliability importance, and a binary search method is presented to find optimal designs of a linear consecutive k-out-ofn:F system with n ≥ 2k components.
Abstract: This study identifies invariant optimal designs for linear and circular consecutive-k-out-of-n systems and completes the theories on invariant optimal design of consecutive-k-out-of-n systems. The component reliability importance patterns of consecutive-2-out-of-n systems with i.i.d. components is identified. A heuristic method is developed for optimal design of a general linear consecutive-k-ou-of-n system using the concept of component reliability importance. A binary search method is presented to find optimal designs of a linear consecutive-k-out-of-n:F system with n≤2k
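For reference, the system being designed fails exactly when some k consecutive components all fail. The standard dynamic program below evaluates the reliability of a given linear arrangement, which is the building block any design heuristic must call repeatedly; it is an evaluation sketch, not the paper's design or binary search procedure, and the names are mine.

```python
def consecutive_k_out_of_n_F_reliability(p, k):
    """Reliability of a linear consecutive-k-out-of-n:F system.

    p[i] is the reliability of the component in position i; the system fails iff
    at least k consecutive components fail.  state[j] = probability that the
    components seen so far contain no k consecutive failures and exactly the last
    j of them are failed (0 <= j <= k-1)."""
    state = [0.0] * k
    state[0] = 1.0
    for pi in p:
        qi = 1.0 - pi
        new = [0.0] * k
        new[0] = sum(state) * pi          # this component works: the failure run resets
        for j in range(1, k):
            new[j] = state[j - 1] * qi    # this component extends a run of j-1 failures
        state = new
    return sum(state)

# 4 components of reliability 0.9; the system fails on 2 consecutive failures.
print(consecutive_k_out_of_n_F_reliability([0.9] * 4, k=2))   # 0.972
```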

Journal ArticleDOI
TL;DR: A decision support system used to help administrators is described; it includes several heuristic procedures to assign edges of the network to the schools.
Abstract: For a school board with several schools in its territory, the School Districting Problem is to specify the groups of children attending each school. A decision support system used to help administrators is described in this paper. It includes several heuristic procedures to assign edges of the network to the schools. The color graphics display is extensively used to assess the quality of the solution and to provide interactive functions for modifying the solution.

Journal ArticleDOI
01 Jan 1990
TL;DR: The nearest-neighbor strategy is shown to be more effective on hypercube systems with high communication start-up costs, especially for finite element graphs; the recursive partitioning heuristic is generally better on hypercubes with lower communication start-up costs and is more effective on random task graphs.
Abstract: The task-to-processor mapping problem is addressed in the context of a local-memory multicomputer with a hypercube interconnection topology. Two heuristic cluster-based mapping strategies are compared: a nearest-neighbor approach and a recursive-clustering scheme. The nearest-neighbor strategy is shown to be more effective on hypercube systems with high communication start-up costs, especially for finite element graphs; the recursive partitioning heuristic is generally better on hypercubes with lower communication start-up costs and is more effective on random task graphs.
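A toy sketch of the recursive-partitioning idea follows. The real heuristic minimizes the cut at each bisection; here a simple breadth-first ordering keeps communicating tasks together before the set is halved, and each level of the recursion fixes one bit of the hypercube address so that tasks in the same half share a subcube. All names are mine.

```python
from collections import deque

def bfs_order(tasks, adj):
    """Order tasks by breadth-first traversal so tightly connected tasks stay adjacent."""
    order, seen = [], set()
    for root in tasks:
        if root in seen:
            continue
        queue = deque([root]); seen.add(root)
        while queue:
            t = queue.popleft(); order.append(t)
            for u in adj.get(t, ()):
                if u in tasks and u not in seen:
                    seen.add(u); queue.append(u)
    return order

def map_to_hypercube(tasks, adj, dim, prefix=0):
    """Recursively bisect the task set and assign the halves to subcubes.

    Returns {task: processor_address} for a 2**dim node hypercube; each recursion
    level fixes one address bit."""
    if dim == 0:
        return {t: prefix for t in tasks}
    ordered = bfs_order(set(tasks), adj)
    half = len(ordered) // 2
    mapping = {}
    mapping.update(map_to_hypercube(ordered[:half], adj, dim - 1, prefix << 1))
    mapping.update(map_to_hypercube(ordered[half:], adj, dim - 1, (prefix << 1) | 1))
    return mapping

# A 6-task chain mapped onto a 4-processor (2-dimensional) hypercube.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(map_to_hypercube(list(adj), adj, dim=2))
```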

Proceedings Article
29 Jul 1990
TL;DR: The probabilistic network technology is a knowledge-based technique which focuses on reasoning under uncertainty and is finding increasing application in fields such as medical diagnosis, machine vision, military situation assessment, petroleum exploration, and information retrieval.
Abstract: The probabilistic network technology is a knowledge-based technique which focuses on reasoning under uncertainty. Because of its well-defined semantics and solid theoretical foundations, the technology is finding increasing application in fields such as medical diagnosis, machine vision, military situation assessment, petroleum exploration, and information retrieval. However, like other knowledge-based techniques, acquiring the qualitative and quantitative information needed to build these networks can be highly labor-intensive. CONSTRUCTOR integrates techniques and concepts from probabilistic networks, artificial intelligence, and statistics in order to induce Markov networks (i.e., undirected probabilistic networks). The resulting networks are useful both qualitatively for concept organization and quantitatively for the assessment of new data. The primary goal of CONSTRUCTOR is to find qualitative structure from data. CONSTRUCTOR finds structure first by modeling each feature in a data set as a node in a Markov network and second by finding the neighbors of each node in the network. In Markov networks, the neighbors of a node have the property of being the smallest set of nodes which "shield" the node from being affected by other nodes in the graph. This property is used in a heuristic search to identify each node's neighbors. The traditional χ² test for independence is used to test if a set of nodes "shield" another node. Cross-validation is used to estimate the quality of alternative structures.
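The primitive CONSTRUCTOR builds on can be sketched as a conditional-independence check: a candidate neighbor set "shields" a node from another variable if, within every configuration of the candidate variables, a χ² test fails to reject independence. The sketch below uses pandas and scipy for the contingency tables and test; the data, threshold, and function names are assumptions, and the surrounding heuristic search and cross-validation are omitted.

```python
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency

def shields(data, node, other, candidates, alpha=0.05):
    """Return True if `candidates` appear to shield `node` from `other`: within every
    configuration of the candidate variables, a chi-square test fails to reject
    independence of `node` and `other`.

    data: pandas DataFrame of discrete features; node/other: column names;
    candidates: list of column names (the tentative neighbor set)."""
    groups = [data] if not candidates else [g for _, g in data.groupby(candidates)]
    for g in groups:
        table = pd.crosstab(g[node], g[other])
        if table.shape[0] < 2 or table.shape[1] < 2:
            continue                      # no variation in this stratum
        chi2, p, dof, _ = chi2_contingency(table)
        if p < alpha:
            return False                  # dependence remains: not shielded
    return True

# Toy data: B is a noisy copy of A, C is a noisy copy of B, so {B} shields C from A.
rng = np.random.default_rng(0)
a = rng.integers(0, 2, 5000)
b = a ^ (rng.random(5000) < 0.2).astype(int)
c = b ^ (rng.random(5000) < 0.2).astype(int)
df = pd.DataFrame({"A": a, "B": b, "C": c})
print(shields(df, "C", "A", ["B"]), shields(df, "C", "A", []))   # True False (typically)
```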

Journal ArticleDOI
TL;DR: A new heuristic based on simulated annealing that schedules part families, as well as jobs within each part family, in a flow-line manufacturing cell is proposed that outperforms the other procedures not only in solution quality but also by requiring substantially less computation time.
Abstract: This article proposes a new heuristic based on simulated annealing that schedules part families, as well as jobs within each part family, in a flow-line manufacturing cell. The new scheduling approach is compared to a branch and bound algorithm as well as two other family-based scheduling heuristics for different cell configurations. The results reveal that all the heuristics provide comparable solutions to the optimal procedure for small problems. However, when the problem size increases, the simulated annealing heuristic outperforms the other procedures not only in solution quality but also by requiring substantially less computation time.
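A compact sketch of a simulated-annealing scheduler of the same general kind: the paper anneals both the family sequence and the job sequence within families, while the sketch below anneals a single job permutation against the flow-shop makespan, accepting worse swaps with probability exp(-delta/T) under a geometric cooling schedule. The neighborhood, cooling parameters, and names are illustrative.

```python
import math, random

def makespan(perm, proc):
    """Permutation flow-shop makespan; proc[j][m] = processing time of job j on machine m."""
    machines = len(proc[0])
    completion = [0.0] * machines
    for j in perm:
        completion[0] += proc[j][0]
        for m in range(1, machines):
            completion[m] = max(completion[m], completion[m - 1]) + proc[j][m]
    return completion[-1]

def anneal(proc, temperature=50.0, cooling=0.995, steps=20000):
    """Simulated annealing over job permutations: swap two jobs, accept worse moves
    with probability exp(-delta / T), and cool T geometrically."""
    current = list(range(len(proc)))
    random.shuffle(current)
    best, best_cost = current[:], makespan(current, proc)
    cost = best_cost
    for _ in range(steps):
        i, j = random.sample(range(len(current)), 2)
        candidate = current[:]
        candidate[i], candidate[j] = candidate[j], candidate[i]
        cand_cost = makespan(candidate, proc)
        if cand_cost <= cost or random.random() < math.exp((cost - cand_cost) / temperature):
            current, cost = candidate, cand_cost
            if cost < best_cost:
                best, best_cost = current[:], cost
        temperature *= cooling
    return best, best_cost

proc = [[4, 3, 2], [2, 5, 1], [3, 2, 4], [1, 4, 3]]   # 4 jobs, 3 machines
print(anneal(proc))
```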

Journal ArticleDOI
TL;DR: In this article, the idea of drawing colored balls from an urn is generalized to allow mutually incompatible experiments to be represented, thereby providing a device for thinking about quantum logic and other non-classical statistical situations in a concrete way.
Abstract: This heuristic article introduces a generalization of the idea of drawing colored balls from an urn so as to allow mutually incompatible experiments to be represented, thereby providing a device for thinking about quantum logic and other non-classical statistical situations in a concrete way. Such models have proven valuable in generating examples and counterexamples and in making abstract definitions in quantum logic seem more intuitive.

Journal ArticleDOI
TL;DR: This paper investigates optimal lot-splitting policies in a multiprocess flow shop environment with the objective of minimizing either mean flow time or makespan and indicates those conditions in which managers should implement the repetitive lots scheme and where other lot- Splitting schemes should work better.
Abstract: This paper investigates optimal lot-splitting policies in a multiprocess flow shop environment with the objective of minimizing either mean flow time or makespan. Using a quadratic programming approach to the mean flow time problem, we determine the optimal way of splitting a job into smaller sublots under various setup-time-to-run-time ratios, numbers of machines in the flow shop, and numbers of allowed sublots. Our results come from a deterministic flow shop environment, but also provide insights into the repetitive lots scheme using equal lot splits for job shop scheduling in a stochastic environment. We indicate those conditions in which managers should implement the repetitive lots scheme and where other lot-splitting schemes should work better.
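The effect being exploited is easy to see numerically: splitting a lot into transfer sublots lets downstream machines start before the whole lot clears upstream machines. The sketch below computes the makespan of one job split into equal sublots on an m-machine line, ignoring setups and the mean-flow-time objective treated by the quadratic program; names and data are illustrative.

```python
def makespan_equal_sublots(unit_times, lot_size, sublots):
    """Makespan of a single job of `lot_size` units, split into `sublots` equal
    transfer batches, on machines with per-unit processing times `unit_times`."""
    batch = lot_size / sublots
    completion = [0.0] * len(unit_times)
    for _ in range(sublots):                      # each sublot flows through the line
        completion[0] += unit_times[0] * batch
        for m in range(1, len(unit_times)):
            completion[m] = max(completion[m], completion[m - 1]) + unit_times[m] * batch
    return completion[-1]

# 100 units on 3 machines: no splitting vs. 4 equal sublots.
print(makespan_equal_sublots([0.5, 0.8, 0.6], 100, 1))   # 190.0
print(makespan_equal_sublots([0.5, 0.8, 0.6], 100, 4))   # 107.5
```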

Journal ArticleDOI
TL;DR: A new single pass forward heuristic is proposed, and it is proved that it has a uniformly bounded worst case performance and a lower bound on the cost of the optimal solution is obtained once the heuristic has been used.
Abstract: The joint replenishment problem involves the lot sizing of several items with nonstationary demand in discrete time. The items have individual ordering costs and linear inventory holding costs. In addition, a joint ordering cost is incurred whenever one or more items are ordered together. This problem often arises when economies can be effected by coordinated ordering or setup of the items, both in distribution and in manufacturing environments. This problem is known to be NP-complete. In this paper, we analyze the worst case performance of an existing multipass heuristic for the problem. Then a new single pass forward heuristic is proposed, and it is proved that it has a uniformly bounded worst case performance. Furthermore, a lower bound on the cost of the optimal solution is obtained once the heuristic has been used. We then discuss a number of related heuristic algorithms and their worst case performance. The behavior of our heuristics for a randomly generated set of problems is also studied.
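Here is a hedged sketch of a single-pass forward heuristic in the same spirit (not necessarily the paper's rule): sweep the periods forward, place a joint order whenever some item has uncovered demand, and let each included item absorb future periods' demand as long as the added holding cost stays below that item's ordering cost. The cost rule, data layout, and names are assumptions.

```python
def forward_joint_replenishment(demand, joint_cost, item_cost, hold_cost):
    """Single-pass forward heuristic sketch for the joint replenishment problem.

    demand[i][t] : demand of item i in period t
    joint_cost   : fixed cost of placing any order in a period
    item_cost[i] : additional ordering cost if item i is included
    hold_cost[i] : cost of holding one unit of item i for one period

    Returns (total_cost, orders) where orders[t] maps item -> quantity ordered in t."""
    n_items, horizon = len(demand), len(demand[0])
    covered = [0] * n_items              # first period whose demand is not yet ordered
    orders = [dict() for _ in range(horizon)]
    total = 0.0
    for t in range(horizon):
        if not any(covered[i] <= t and demand[i][t] > 0 for i in range(n_items)):
            continue                     # nothing due: no joint order this period
        total += joint_cost
        for i in range(n_items):
            if covered[i] > t:
                continue
            qty, cost_ahead, last = 0, 0.0, t
            for s in range(t, horizon):
                extra_hold = hold_cost[i] * (s - t) * demand[i][s]
                if s > t and cost_ahead + extra_hold > item_cost[i]:
                    break                # cheaper to leave this demand for a later order
                qty += demand[i][s]
                cost_ahead += extra_hold
                last = s
            if qty > 0:
                total += item_cost[i] + cost_ahead
                orders[t][i] = qty
            covered[i] = last + 1
    return total, orders

demand = [[10, 0, 15, 0, 20],
          [ 5, 5,  5, 5,  5]]
print(forward_joint_replenishment(demand, joint_cost=20.0,
                                  item_cost=[10.0, 4.0], hold_cost=[1.0, 0.5]))
# total cost 107.0 for this small instance
```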

Journal ArticleDOI
TL;DR: The proposed unit commitment expert system consists of a commitment schedule database, a dynamic load pattern matching process, and an inference optimization process that incorporates various operating constraints in obtaining a suboptimal solution and finally optimizing the schedule.