
Showing papers in "Artificial Intelligence Review in 2004"


Journal ArticleDOI
TL;DR: A survey of contemporary techniques for outlier detection is introduced; the techniques' respective motivations are identified and their advantages and disadvantages distinguished in a comparative review.
Abstract: Outlier detection has been used for centuries to detect and, where appropriate, remove anomalous observations from data. Outliers arise due to mechanical faults, changes in system behaviour, fraudulent behaviour, human error, instrument error or simply through natural deviations in populations. Their detection can identify system faults and fraud before they escalate with potentially catastrophic consequences. It can also identify errors and remove their contaminating effect on the data set, and as such purify the data for processing. The original outlier detection methods were arbitrary, but now principled and systematic techniques are used, drawn from the full gamut of Computer Science and Statistics. In this paper, we introduce a survey of contemporary techniques for outlier detection. We identify their respective motivations and distinguish their advantages and disadvantages in a comparative review.

3,235 citations
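The survey spans statistical, neural and machine-learning detectors; as a minimal, hedged illustration of the statistical family only (not any specific method from the survey), the Python sketch below flags values lying more than k standard deviations from the sample mean. The data and the threshold k are invented for the example.

from statistics import mean, pstdev

def zscore_outliers(values, k=3.0):
    """Return (index, value) pairs whose z-score exceeds k (illustrative only)."""
    mu, sigma = mean(values), pstdev(values)
    if sigma == 0:
        return []
    return [(i, v) for i, v in enumerate(values) if abs(v - mu) / sigma > k]

data = [10.1, 9.8, 10.0, 10.2, 9.9, 10.0, 10.1, 9.9, 10.2, 10.0, 42.0]
print(zscore_outliers(data))   # the reading 42.0 (index 10) is flagged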


Journal ArticleDOI
TL;DR: A novel approach to using context in Web search that seeks to personalize the results of a generic search engine for the needs of a specialist community of users is described.
Abstract: As the search engine arms-race continues, search engines are constantly looking for ways to improve the manner in which they respond to user queries. Given the vagueness of Web search queries, recent research has focused on ways to introduce context into the search process as a means of clarifying vague, under-specified or ambiguous query terms. In this paper we describe a novel approach to using context in Web search that seeks to personalize the results of a generic search engine for the needs of a specialist community of users. In particular we describe two separate evaluations in detail that demonstrate how the collaborative search method has the potential to deliver significant search performance benefits to end-users while avoiding many of the privacy and security concerns that are commonly associated with related personalization research.

110 citations


Journal ArticleDOI
TL;DR: A range of experiments is presented that demonstrates the benefit of Progressive RL relative to a basic RL approach in which each puzzle is solved from scratch; the experiments also illustrate how domain knowledge may be incorporated and show that Progressive Reinforcement Learning may be used to solve complex puzzles more quickly.
Abstract: This paper describes an extension to reinforcement learning (RL), in which a standard RL algorithm is augmented with a mechanism for transferring experience gained in one problem to new but related problems. In this approach, named Progressive RL, an agent acquires experience of operating in a simple environment through experimentation, and then engages in a period of introspection, during which it rationalises the experience gained and formulates symbolic knowledge describing how to behave in that simple environment. When subsequently experimenting in a more complex but related environment, it is guided by this knowledge until it gains direct experience. A test domain with 15 maze environments, arranged in order of difficulty, is described. A range of experiments in this domain is presented that demonstrates the benefit of Progressive RL relative to a basic RL approach in which each puzzle is solved from scratch. The experiments also analyse the knowledge formed during introspection, illustrate how domain knowledge may be incorporated, and show that Progressive Reinforcement Learning may be used to solve complex puzzles more quickly.

67 citations
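The mechanism is described only at a high level, so the sketch below is a hedged illustration rather than the paper's Progressive RL: plain tabular Q-learning on a toy corridor, with an optional advice function standing in for the symbolic knowledge rationalised from a simpler environment, consulted when a state has not yet been visited. The environment, the advice rule and the hyper-parameters are assumptions.

import random

def q_learning(n_states=10, episodes=200, advice=None,
               alpha=0.5, gamma=0.95, eps=0.1):
    """Tabular Q-learning on a toy corridor: start at 0, reward at the far end.
    `advice(state)` optionally suggests an action for unvisited states,
    standing in for the knowledge Progressive RL carries over."""
    q = [[0.0, 0.0] for _ in range(n_states)]        # actions: 0 = left, 1 = right
    for _ in range(episodes):
        s = 0
        while s != n_states - 1:
            if random.random() < eps:
                a = random.randrange(2)
            elif q[s] == [0.0, 0.0] and advice:
                a = advice(s)                         # fall back on prior knowledge
            else:
                a = 0 if q[s][0] > q[s][1] else 1
            s2 = max(0, s - 1) if a == 0 else min(n_states - 1, s + 1)
            r = 1.0 if s2 == n_states - 1 else 0.0
            q[s][a] += alpha * (r + gamma * max(q[s2]) - q[s][a])
            s = s2
    return q

random.seed(0)
q_plain = q_learning()                                # learn from scratch
q_guided = q_learning(advice=lambda s: 1)             # "move right" rule from a simpler maze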


Journal ArticleDOI
TL;DR: The paper briefly reviews the field of computational linguistics and the problems it addresses, describes the special difficulties inherent to Hebrew (as well as to other Semitic languages), and attempts to characterize future needs and possible solutions.
Abstract: This paper reviews the current state of the art in Natural Language Processing for Hebrew, both theoretical and practical. The Hebrew language, like other Semitic languages, poses special challenges for developers of programs for natural language processing: the writing system, rich morphology, the unique word-formation process of roots and patterns, and the lack of linguistic corpora documenting language usage all contribute to making computational approaches to Hebrew challenging. The paper briefly reviews the field of computational linguistics and the problems it addresses, describes the special difficulties inherent to Hebrew (as well as to other Semitic languages), surveys a wide variety of past and ongoing works and attempts to characterize future needs and possible solutions.

52 citations


Journal ArticleDOI
Donald Kerr
TL;DR: An analysis of existing literature and practical problems associated with the adoption of a developed knowledge-based decision support system within small rural businesses suggests that system developers need to have a good working knowledge of the target industry and to understand the types of decisions that are made by managers in order to develop systems that will be used.
Abstract: This paper provides an analysis of existing literature and practical problems associated with the adoption of a developed knowledge-based decision support system (KBDSS) within small rural businesses. The rural small businesses selected for this study were individual farms within the Australian dairy industry and the developed KBDSS was called DairyPro. The objective was to determine the factors that could help with future KBDSS development and improve adoption rates. These factors were tested against DairyPro to determine their effectiveness. This analysis indicates that system developers need to have a good working knowledge of the target industry and to understand the types of decisions that are made by managers in order to develop systems that will be used. A review of the literature also suggests that adoption rates can be influenced by cultural, political, educational and age factors as well as individual characteristics of information technology itself. Small business managers needed more ownership in the process of KBDSS development. The author suggests that the factors affecting KBDSS adoption by dairy farmers can be equally applicable to other small, owner-operated rural businesses. This approach advocates the use of domain experts to provide estimates of expected production levels rather than the traditional approach of using the results from mathematical or simulation models to make these estimates.

43 citations


Journal ArticleDOI
TL;DR: An approach to the construction of adaptive tutoring systems, based on techniques from the research area of Reasoning about Actions and Change, is described; it leads to a prototype with a multi-agent architecture whose kernel is a set of rational agents programmed in the logic programming language DyLOG.
Abstract: In this paper we describe an approach to the construction of adaptive tutoring systems, based on techniques from the research area of Reasoning about Actions and Change. This approach leads to the implementation of a prototype system, having a multi-agent architecture, whose kernel is a set of rational agents, programmed in the logic programming language DyLOG. In the prototype that we implemented the reasoning capabilities of the agents are exploited both to dynamically build study plans and to verify the correctness of user-given study plans with respect to the competence that the user wants to acquire.

43 citations


Journal ArticleDOI
TL;DR: A robust parsing algorithm is presented which is applied after a conventional bottom–up parsing algorithm has failed and combines a rule from the error grammar with rules from the normal grammar to arrive at a parse for an ungrammatical sentence.
Abstract: This paper presents a robust parsing approach which is designed to address the issue of syntactic errors in text. The approach is based on the concept of an error grammar which is a grammar of ungrammatical sentences. An error grammar is derived from a conventional grammar on the basis of an analysis of a corpus of observed ill-formed sentences. A robust parsing algorithm is presented which is applied after a conventional bottom–up parsing algorithm has failed. This algorithm combines a rule from the error grammar with rules from the normal grammar to arrive at a parse for an ungrammatical sentence. This algorithm is applied to 50 test sentences, with encouraging results.

38 citations
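As a hedged illustration of the two-pass idea only (the grammar, the sample sentences and the error rule below are invented, not taken from the paper's corpus-derived error grammar), the sketch runs a toy CKY parser with a normal CNF grammar and, only when that fails, retries with an error rule added so the ill-formed sentence still receives an analysis.

LEX = {"the": {"Det"}, "dog": {"N"}, "cat": {"N"}, "chased": {"V"}}
NORMAL = {("NP", ("Det", "N")), ("VP", ("V", "NP")), ("S", ("NP", "VP"))}
ERROR  = {("NP", ("N", "Det"))}          # invented rule: covers a determiner/noun swap

def cky(words, rules):
    n = len(words)
    chart = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        chart[i][i + 1] = set(LEX.get(w, set()))
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for lhs, (b, c) in rules:
                    if b in chart[i][k] and c in chart[k][j]:
                        chart[i][j].add(lhs)
    return "S" in chart[0][n]

def robust_parse(sentence):
    words = sentence.split()
    if cky(words, NORMAL):
        return "grammatical"
    if cky(words, NORMAL | ERROR):       # second pass with error rules added
        return "parsed via error grammar"
    return "no parse"

print(robust_parse("the dog chased the cat"))   # grammatical
print(robust_parse("the dog chased cat the"))   # parsed via error grammar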


Journal ArticleDOI
TL;DR: A neighbourhood filtering mechanism for filtering false profiles from the neighbourhood in order to improve the robustness of the system is proposed.
Abstract: Personalisation features are key to the success of many web applications and collaborative recommender systems have been widely implemented. These systems assist users in finding relevant information or products from the vast quantities that are frequently available. In previous work, we have demonstrated that such systems are vulnerable to attack and that recommendations can be manipulated. We introduced the concept of robustness as a performance measure, which is defined as the ability of a system to provide consistent predictions in the presence of noise in the data. In this paper, we expand on our previous work by examining the effects of several neighbourhood formation schemes and similarity measures on system performance. We propose a neighbourhood filtering mechanism for filtering false profiles from the neighbourhood in order to improve the robustness of the system.

38 citations
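A hedged sketch of the general shape of such a scheme, not the authors' algorithm: Pearson-similarity neighbours are formed for the target user, neighbours whose overall rating level deviates sharply from the neighbourhood consensus are dropped as possible false profiles (the deviation rule, the threshold and the toy data are assumptions), and the prediction is averaged from the survivors.

import math

def pearson(u, v):
    common = [i for i in u if i in v]
    if len(common) < 2:
        return 0.0
    mu = sum(u[i] for i in common) / len(common)
    mv = sum(v[i] for i in common) / len(common)
    num = sum((u[i] - mu) * (v[i] - mv) for i in common)
    den = math.sqrt(sum((u[i] - mu) ** 2 for i in common) *
                    sum((v[i] - mv) ** 2 for i in common))
    return num / den if den else 0.0

def predict(target, others, item, k=3, max_dev=0.8):
    neigh = sorted(others.values(), key=lambda v: -pearson(target, v))[:k]
    consensus = sum(sum(v.values()) / len(v) for v in neigh) / len(neigh)
    neigh = [v for v in neigh                  # drop profiles far from the consensus
             if abs(sum(v.values()) / len(v) - consensus) <= max_dev]
    rated = [v[item] for v in neigh if item in v]
    return sum(rated) / len(rated) if rated else None

users = {"a": {"i1": 5, "i2": 4, "i3": 1},
         "b": {"i1": 4, "i2": 5, "i3": 2},
         "shill": {"i1": 5, "i2": 5, "i3": 5}}   # an injected false profile
target = {"i1": 5, "i2": 4}
print(predict(target, users, "i3"))              # 1.5; the shill is filtered out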


Journal ArticleDOI
TL;DR: A collaborative recommender is presented that uses a user-based model to predict user ratings for specified items and is shown to be very efficient: predictions can be made in time that grows independently of the number of ratings and items and only logarithmically in the number of users.
Abstract: We present a collaborative recommender that uses a user-based model to predict user ratings for specified items. The model comprises summary rating information derived from a hierarchical clustering of the users. We compare our algorithm with several others. We show that its accuracy is good and its coverage is maximal. We also show that the algorithm is very efficient: predictions can be made in time that grows independently of the number of ratings and items and only logarithmically in the number of users.

30 citations


Journal ArticleDOI
TL;DR: A novel false colouring-based visual saliency algorithm is presented and how it is used in the situated language interpreter (SLI) system to ground a reference resolution framework for natural language interfaces to 3-D simulated environments is illustrated.
Abstract: In this paper we present a novel false colouring-based visual saliency algorithm and illustrate how it is used in the situated language interpreter (SLI) system to ground a reference resolution framework for natural language interfaces to 3-D simulated environments. The visual saliency algorithm allows us to dynamically maintain a model of the evolving visual context. The visual saliency scores associated with the elements in the context model can be used to resolve underspecified references.

29 citations


Journal ArticleDOI
TL;DR: A temporal model, TemPro, based on the Allen interval algebra, is developed to express and manage time information in terms of qualitative and quantitative temporal constraints; experiments demonstrate the efficiency of the MCRW approximation method on under-constrained and middle-constrained problems, while Tabu Search and SDRW are the methods of choice for over-constrained problems.
Abstract: Representing and reasoning about time is fundamental in many applications of Artificial Intelligence as well as of other disciplines in computer science, such as scheduling, planning, computational linguistics, database design and molecular biology. The development of a domain-independent temporal reasoning system is then practically important. An important issue when designing such systems is the efficient handling of qualitative and metric time information. We have developed a temporal model, TemPro, based on the Allen interval algebra, to express and manage such information in terms of qualitative and quantitative temporal constraints. TemPro translates an application involving temporal information into a Constraint Satisfaction Problem (CSP). Constraint satisfaction techniques are then used to manage the different time information by solving the CSP. In order for the system to deal with real time applications or those applications where it is impossible or impractical to solve these problems completely, we have studied different methods capable of trading search time for solution quality when solving the temporal CSP. These methods are exact and approximation algorithms based respectively on constraint satisfaction techniques and local search. Experimental tests were performed on randomly generated temporal constraint problems as well as on scheduling problems in order to compare and evaluate the performance of the different methods we propose. The results demonstrate the efficiency of the MCRW approximation method in dealing with under-constrained and middle-constrained problems, while Tabu Search and SDRW are the methods of choice for over-constrained problems.
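TemPro builds on Allen's interval algebra; as a minimal, hedged aid to reading the abstract, the helper below merely names the basic Allen relation between two concrete intervals given as (start, end) pairs with start < end. TemPro itself manages whole networks of such qualitative relations plus metric constraints as a CSP, which this does not attempt.

def allen_relation(a, b):
    """Basic Allen relation between intervals a and b, each a (start, end) pair."""
    (s1, e1), (s2, e2) = a, b
    if e1 < s2:  return "before"
    if e2 < s1:  return "after"
    if e1 == s2: return "meets"
    if e2 == s1: return "met-by"
    if s1 == s2 and e1 == e2: return "equal"
    if s1 == s2: return "starts" if e1 < e2 else "started-by"
    if e1 == e2: return "finishes" if s1 > s2 else "finished-by"
    if s2 < s1 and e1 < e2: return "during"
    if s1 < s2 and e2 < e1: return "contains"
    return "overlaps" if s1 < s2 else "overlapped-by"

print(allen_relation((1, 3), (3, 6)))   # meets
print(allen_relation((2, 5), (4, 9)))   # overlaps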

Journal ArticleDOI
TL;DR: Rights allow the agents enough freedom and at the same time constrain them (prohibiting specific actions), and can be understood as the basic concept underneath open normative systems where the agents reason about the code they must abide by.
Abstract: As utility calculus cannot account for an important part of agents' behaviour in Multi-Agent Systems, researchers have progressively adopted a more normative approach. Unfortunately, social laws have turned out to be too restrictive in real-life domains where autonomous agents' activity cannot be completely specified in advance. It seems that a halfway concept between anarchic and off-line constrained interaction is needed. We think that the concept of right suits this idea. Rights improve coordination and facilitate social action in Multi-Agent domains. Rights allow the agents enough freedom, and at the same time constrain them (prohibiting specific actions). Besides, rights can be understood as the basic concept underneath open normative systems where the agents reason about the code they must abide by. Typically, in such systems this code is underspecified. On the other hand, the agents might not have complete knowledge about the rules governing their interaction. Conflict situations arise, thus, when the agents have different points of view as to how to apply the code. We have extended Parsons's et al. argumentation protocol (Parsons et al. 1998a, b) to normative systems to deal with this problem.

Journal ArticleDOI
TL;DR: The result demonstrates that the proposed procedure is effective for defection prevention and efficiently detects potential defectors without deterioration of prediction accuracy when compared to that of the MLP (Multi-Layer Perceptron) neural networks.
Abstract: Customer retention is an increasingly pressing issue in today's competitive environment. This paper proposes a personalized defection detection and prevention procedure based on the observation that potential defectors have a tendency to take a couple of months or weeks to gradually change their behaviour (i.e., trim-out their usage volume) before their eventual withdrawal. For this purpose, we suggest a SOM (Self-Organizing Map) based procedure to determine the possible states of customer behaviour from past behaviour data. Based on this state representation, potential defectors are detected by comparing their monitored trajectories of behaviour states with frequent and confident trajectories of past defectors. Also, the proposed procedure is extended to prevent the defection of potential defectors by recommending the desirable behaviour state for the next period so as to lower the likelihood of defection. For the evaluation of the proposed procedure, a case study has been conducted for a Korean online game site. The result demonstrates that the proposed procedure is effective for defection prevention and efficiently detects potential defectors without deterioration of prediction accuracy when compared to that of the MLP (Multi-Layer Perceptron) neural networks.
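The procedure involves SOM training and trajectory mining that the abstract does not detail; the fragment below shows only the final detection step, under illustrative assumptions: behaviour has already been discretised into named states and frequent defector trajectories have already been mined, and a customer is flagged when the tail of their monitored trajectory matches one of those patterns. The states, patterns and matching rule are invented for the example.

DEFECTOR_PATTERNS = [("heavy", "medium", "light"),    # gradual trim-out of usage
                     ("medium", "light", "light")]

def likely_defector(trajectory, patterns=DEFECTOR_PATTERNS):
    """Flag a customer whose recent state trajectory ends like a known defector's."""
    for p in patterns:
        if tuple(trajectory[-len(p):]) == p:
            return True
    return False

print(likely_defector(["heavy", "heavy", "medium", "light"]))   # True
print(likely_defector(["medium", "heavy", "heavy", "heavy"]))   # False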

Journal ArticleDOI
TL;DR: Real-world data is never perfect and can often suffer from corruptions (noise) that may impact interpretations of the data, models created from the data and decisions made based on the data.
Abstract: Real-world data is never perfect and can often suffer from corruptions (noise) that may impact interpretations of the data, models created from the data and decisions made based on the data. Noise ...

Journal ArticleDOI
TL;DR: This paper presents an alternative to the ‘speech acts with STRIPS’ approach to implementing dialogue: a fully implemented AI planner which generates and analyses the semantics of utterances using a single linguistic act for all contexts.
Abstract: This paper presents an alternative to the ‘speech acts with STRIPS’ approach to implementing dialogue: a fully implemented AI planner which generates and analyses the semantics of utterances using a single linguistic act for all contexts. Using this act, the planner can model problematic conversational situations, including felicitous and infelicitous instances of bluffing, lying, sarcasm, and stating the obvious. The act has negligible effects, and its precondition can always be proved. ‘Speaker maxims’ enable the speaker to plan to deceive, as well as to generate implicatures, while ‘hearer maxims’ enable the hearer to recognise deceptions, and interpret implicatures. The planner proceeds by achieving parts of the constructive proof of a goal. It incorporates an epistemic theorem prover, which embodies a deduction model of belief, and a constructive logic.

Journal ArticleDOI
TL;DR: Experimental evidence is provided that, despite common belief to the contrary, it is not always necessary for a good arc-consistency algorithm to have an optimal worst-case time-complexity, and that MAC-2001 was slower than MAC-3d for easy and hard random problems.
Abstract: Arc-consistency algorithms are the workhorse of backtrackers that maintain arc-consistency (MAC). This paper will provide experimental evidence that, despite common belief to the contrary, it is not always necessary for a good arc-consistency algorithm to have an optimal worst-case time-complexity. Sacrificing this optimality allows MAC solvers that (1) do not need additional data structures during search, (2) have an excellent average time-complexity, and (3) have a space-complexity that improves significantly on that of MAC solvers that have optimal arc-consistency components. Results will be presented from an experimental comparison between MAC-2001, MAC-3d and related algorithms. MAC-2001 has an arc-consistency component with an optimal worst-case time-complexity, whereas MAC-3d does not. MAC-2001 requires additional data structures during search, whereas MAC-3d does not. MAC-3d has a space-complexity of O(e+nd), where n is the number of variables, d the maximum domain size, and e the number of constraints. We shall demonstrate that MAC-2001's space-complexity is O(ed min(n,d)). Our experimental results indicate that MAC-2001 was slower than MAC-3d for easy and hard random problems. For real-world problems things were not as clear.
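For readers unfamiliar with the trade-off being measured, the sketch below is textbook AC-3 on explicitly listed binary constraints: the classical, non-optimal arc-consistency scheme in the MAC-3 family, which avoids the per-arc support bookkeeping that gives MAC-2001 its optimal worst-case bound. It is illustrative only, not the authors' implementation, and the toy constraint network is invented.

from collections import deque

def ac3(domains, constraints):
    """domains: var -> set of values; constraints: (x, y) -> relation(vx, vy)."""
    arcs = deque(constraints)
    while arcs:
        x, y = arcs.popleft()
        rel = constraints[(x, y)]
        pruned = {vx for vx in domains[x]
                  if not any(rel(vx, vy) for vy in domains[y])}
        if pruned:
            domains[x] -= pruned
            if not domains[x]:
                return False                      # domain wipe-out
            arcs.extend(a for a in constraints if a[1] == x)
    return True

doms = {"x": {1, 2, 3}, "y": {1, 2, 3}, "z": {1, 2, 3}}
cons = {("x", "y"): lambda a, b: a < b, ("y", "x"): lambda a, b: a > b,
        ("y", "z"): lambda a, b: a < b, ("z", "y"): lambda a, b: a > b}
print(ac3(doms, cons), doms)   # True; x becomes {1}, y {2}, z {3}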

Journal ArticleDOI
TL;DR: By integrating the constraints of the Constraint theory of conceptual combination, PUNC can not only generate interpretations that reflect those produced by people, but also mirror the differences in processing times for understanding familiar, similar and novel word combinations.
Abstract: Noun-noun compounds play a key role in the growth of language. In this article we present a system for producing and understanding noun-noun compounds (PUNC). PUNC is based on the Constraint theory of conceptual combination and the C3 model. The new model incorporates the primary constraints of the Constraint theory in an integrated fashion, creating a cognitively plausible mechanism for interpreting noun-noun phrases. It also tries to overcome algorithmic limitations of the C3 model by being more efficient in its computational complexity, and to deal with a wider span of empirical phenomena, such as dimensions of word familiarity. We detail the model, including knowledge representation and interpretation production mechanisms. We show that by integrating the constraints of the Constraint theory of conceptual combination and prioritizing the knowledge available within a concept's representation, PUNC can not only generate interpretations that reflect those produced by people, but also mirror the differences in processing times for understanding familiar, similar and novel word combinations.

Journal ArticleDOI
TL;DR: A new algorithm for counting truth assignments of a clausal formula using inverse propositional resolution and its associated normalization rules is presented; it works by constructing a computation graph in a bottom-up manner.
Abstract: We present a new algorithm for counting truth assignments of a clausal formula using inverse propositional resolution and its associated normalization rules. The idea is the opposite of classical resolution, and is realised by constructing a computation graph in a bottom-up manner. This means that we successively add complementary literals to generate new, bigger clauses instead of resolving them. Next, we make a comparison between classical and inverse resolution, followed by a new algorithm which combines these two techniques for solving the SAT problem.
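As a point of reference only, and emphatically not the paper's inverse-resolution method, the sketch below is a brute-force model counter for a CNF formula written in DIMACS-style integer notation.

from itertools import product

def count_models(clauses, n_vars):
    """Count satisfying assignments of a CNF formula; literal v means variable v
    is true, -v means it is false. Exponential brute force, for reference only."""
    count = 0
    for bits in product([False, True], repeat=n_vars):
        if all(any(bits[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            count += 1
    return count

# (x1 or x2) and (not x1 or x3) over three variables has 4 models.
print(count_models([[1, 2], [-1, 3]], 3))   # 4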

Journal ArticleDOI
TL;DR: A Constraint Programming approach that can be used in a more general context, where nothing is assumed about the nature of the constraints that must be satisfied or the structure of the underlying problem, is proposed.
Abstract: Ensuring truthfulness amongst self-interested agents bidding against one another in an auction can be computationally expensive when prices are determined using the Vickrey–Clarke–Groves (VCG) mechanism. This mechanism guarantees that each agent's dominant strategy is to tell the truth, but it requires solving n + 1 optimization problems when the overall optimal solution involves n agents. This paper first examines a case-study example demonstrating how Operations Research techniques can be used to compute Vickrey prices efficiently. In particular, the case-study focuses on the Assignment Problem. We show how, in this case, Vickrey prices can be computed in the same asymptotic time complexity as that of the original optimization problem. This case-study can be seen as serving a pedagogical role in the paper illustrating how Operations Research techniques can be used for fast Vickrey pricing. We then propose a Constraint Programming approach that can be used in a more general context, where nothing is assumed about the nature of the constraints that must be satisfied or the structure of the underlying problem. In particular, we demonstrate how nogood learning can be used to improve the efficiency of constraint-based Vickrey pricing in combinatorial auctions.
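A hedged sketch of the Assignment Problem case study's flavour, with an invented bid matrix: the welfare-maximising assignment is found with the Hungarian method (SciPy's linear_sum_assignment), and each winner then pays the externality it imposes on the others, i.e. the others' optimal welfare without that agent minus their welfare in the chosen assignment. This is generic VCG bookkeeping, not the authors' faster pricing scheme or their Constraint Programming approach.

import numpy as np
from scipy.optimize import linear_sum_assignment

def optimal_welfare(values):
    """Welfare-maximising assignment of agents (rows) to items (columns)."""
    rows, cols = linear_sum_assignment(-values)          # negate to maximise value
    return values[rows, cols].sum(), dict(zip(map(int, rows), map(int, cols)))

def vcg_payments(values):
    total, assignment = optimal_welfare(values)
    payments = {}
    for agent, item in assignment.items():
        others_with = total - values[agent, item]        # others' welfare as things stand
        others_without, _ = optimal_welfare(np.delete(values, agent, axis=0))
        payments[agent] = float(others_without - others_with)   # externality imposed
    return assignment, payments

bids = np.array([[10.0, 6.0, 1.0],       # invented bid matrix: bids[i][j] is
                 [ 9.0, 2.0, 1.0],       # agent i's value for item j
                 [ 1.0, 8.0, 7.0]])
print(vcg_payments(bids))                # agents 0, 1, 2 pay 1.0, 5.0 and 0.0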

Journal ArticleDOI
TL;DR: The test results indicate that the method is effective in trading speed with the quality of solutions and that it is efficient in producing solutions for p = 0.
Abstract: This paper presents an effective near-optimal search method for state-space problems. The method, LTA* (Learning Threshold A*), accepts a threshold parameter, p, as an input and finds a solution within that range of the optimum. The larger the parameter, the faster the method finds a solution. LTA* is based on a combination of recursion and dynamic memory and, like A*, keeps information about all states in memory. In contrast to A*, however, which represents each node as a complete state, LTA* represents each node using an operator. This representation of the nodes makes LTA* dramatically efficient with respect to memory usage. Another advantage of LTA* is that it eliminates any need for computational effort to maintain a priority queue, and this elimination significantly increases speed. To test the effectiveness and efficiency of the method we have applied it to NP-hard problems in scheduling. The test results indicate that the method is effective in trading speed with the quality of solutions and that it is efficient in producing solutions for p = 0.
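The abstract does not give enough detail to reproduce LTA*'s operator-based node representation, so the sketch below shows a standard alternative way of trading solution quality for speed through a threshold parameter: weighted A*, where a weight of 1 + p bounds the returned cost at (1 + p) times the optimum (with an admissible heuristic) and p = 0 recovers ordinary A*. The toy graph and heuristic are invented.

import heapq

def weighted_astar(start, goal, neighbours, h, p=0.0):
    """Weighted A*: inflate the heuristic by w = 1 + p to find solutions faster."""
    w = 1.0 + p
    frontier = [(w * h(start), 0.0, start, [start])]
    best_g = {start: 0.0}
    while frontier:
        _, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return g, path
        for nxt, cost in neighbours(node):
            g2 = g + cost
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(frontier, (g2 + w * h(nxt), g2, nxt, path + [nxt]))
    return None

# Toy graph on integer points of a line: move +/-1 at unit cost, goal is 5.
neighbours = lambda n: [(n - 1, 1.0), (n + 1, 1.0)]
h = lambda n: abs(5 - n)
print(weighted_astar(0, 5, neighbours, h, p=0.5))   # (5.0, [0, 1, 2, 3, 4, 5])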

Journal ArticleDOI
TL;DR: The temporal relations within verb semantics, particularly ordered pairs of verb entailment, are studied using Allen's interval-based temporal formalism and their application to the compositional visual definitions in the intelligent storytelling system, CONFUCIUS, is presented.
Abstract: Numerous temporal relations of verbal actions have been analysed in terms of various grammatical means of expressing verbal temporalisation such as tense, aspect, duration and iteration. Here the temporal relations within verb semantics, particularly ordered pairs of verb entailment, are studied using Allen's interval-based temporal formalism. Their application to the compositional visual definitions in our intelligent storytelling system, CONFUCIUS, is presented, including the representation of procedural events, achievement events and lexical causatives. In applying these methods we consider both language modalities and visual modalities since CONFUCIUS is a multimodal system.

Journal ArticleDOI
TL;DR: The present article points to mistakes made by Entemann in three different areas of fuzzy logic.
Abstract: Entemann (2002) defends fuzzy logic by pointing to what he calls "misconceptions" concerning fuzzy logic. However, some of these "misconceptions" are in fact truths, and it is Entemann who has the misconceptions. The present article points to mistakes made by Entemann in three different areas. It closes with a discussion of what sort of general considerations it would take to motivate fuzzy logic.

Journal ArticleDOI
TL;DR: A generic architecture for critic systems is put forward, with its important aspects being analyzed, and two case studies are given to illustrate critic systems.
Abstract: Human--computer collaboration is extremely necessary for solving ill-structured problems and critic systems can effectively facilitate human--computer collaborative problem solving. This paper conducts a systematic study on critic systems. First, the concepts of critic systems are presented. Then, a literature review is presented on critic systems. Afterwards, a generic architecture is put forward for critic systems, with its important aspects being analyzed. Finally, two case studies are given to illustrate critic systems.

Journal ArticleDOI
TL;DR: It is suggested that use of the dictionary, thesaurus and knowledge reconciliation techniques has the potential to increase the diagnostic capabilities of intelligent database design tools by facilitating detection and resolution of design inconsistencies that would remain undiscovered in situations where such system-held domain knowledge was not available.
Abstract: Techniques for representing and exploiting domain knowledge (such as the dictionary, thesaurus and knowledge reconciliation techniques) have long been used by intelligent database design tools when performing the task of design synthesis. However, the capacity of these techniques to enhance the diagnostic capabilities of intelligent database design tools has yet to be explored and evaluated. This paper presents such an evaluation, focusing upon the aforementioned techniques of dictionary, thesaurus and knowledge reconciliation. Results obtained from this investigation suggest that use of these techniques has the potential to increase the diagnostic capabilities of intelligent database design tools by facilitating detection and resolution of design inconsistencies that would remain undiscovered in situations where such system-held domain knowledge was not available.

Journal ArticleDOI
TL;DR: This paper takes a novel approach to the problem of representing repeated (or recurrent) entities as a class of temporal entities with well-defined properties, derives some general properties for this important class of temporal entities, and derives some properties for an interesting subclass, namely the class of repetition of concatenable entities.
Abstract: Temporal entities are assertions (e.g. events, states) whose truth can be associated with time. An interesting problem in temporal reasoning is the problem of representing the fact that such entities are repeated over time. The problem has attracted some interest lately in the knowledge representation community. In this paper, we take a novel approach to the problem, which allows us to recognize repeated (or recurrent) entities as a class of temporal entities with well-defined properties. We derive some general properties for this important class of temporal entities, and some properties for an interesting subclass, namely the class of repetition of concatenable entities. Concatenable entities have been called unrepeatable in the literature. As such we take a special interest in investigating their properties. The logical theory used here is a reified theory, which admits entity types into its ontology, as opposed to tokens, and uses Allen's interval logic. Finally, we relate the new class of repetitive entities to existing classes in Shoham's taxonomy.

Journal ArticleDOI
TL;DR: This paper examined two well-known meta-heuristics and carefully combined the short-term and long-term memory-like mechanisms of both methods to achieve better results, and showed the prototype to compare favorably against the original search methods and other related search hybrids on Solomon's test cases.
Abstract: The vehicle routing problems with time windows are challenging delivery problems in which instances involving 100 customers or more can be difficult to solve. There were many interesting heuristics proposed to handle these problems effectively. In this paper, we examined two well-known meta-heuristics and carefully combined the short-term and long-term memory-like mechanisms of both methods to achieve better results. Our prototype was shown to compare favorably against the original search methods and other related search hybrids on the Solomon's test cases. More importantly, our proposal of integration opens up many exciting directions for further investigation.
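One of the memory-based meta-heuristics in this space is tabu search; the fragment below is a bare-bones tabu loop over a single-route customer ordering, showing only the short-term memory (tabu list) idea. The coordinates, swap neighbourhood and tenure are invented, and time windows, capacities and the paper's long-term memory and hybridisation are all omitted, so this is an illustration rather than the authors' method.

import math, random

def route_cost(order, coords):
    pts = [(0.0, 0.0)] + [coords[c] for c in order] + [(0.0, 0.0)]   # depot at origin
    return sum(math.dist(a, b) for a, b in zip(pts, pts[1:]))

def tabu_search(coords, iters=200, tenure=7, seed=0):
    rng = random.Random(seed)
    order = list(coords)
    rng.shuffle(order)
    best, best_cost = order[:], route_cost(order, coords)
    tabu = {}                                     # swap -> iteration it stops being tabu
    for it in range(iters):
        candidates = []
        for i in range(len(order) - 1):
            for j in range(i + 1, len(order)):
                move = tuple(sorted((order[i], order[j])))
                if tabu.get(move, -1) >= it:
                    continue                      # short-term memory: skip recent swaps
                trial = order[:]
                trial[i], trial[j] = trial[j], trial[i]
                candidates.append((route_cost(trial, coords), move, trial))
        cost, move, order = min(candidates)       # best non-tabu neighbour (may be worse)
        tabu[move] = it + tenure
        if cost < best_cost:
            best, best_cost = order[:], cost
    return best, best_cost

coords = {"c1": (2, 1), "c2": (5, 4), "c3": (1, 5), "c4": (6, 1), "c5": (3, 6)}
print(tabu_search(coords))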

Journal ArticleDOI
TL;DR: This paper proposes an approach for integrating OLAP and fuzzy logic to form an intelligent system, capitalizing on the merits and at the same time offsetting the drawbacks of the involved technologies.
Abstract: To cope with the issue of "brain drain" in today's competitive industrial environment, it is important to capture relevant experience and knowledge in order to sustain the continual growth of company business. In this respect, the study of knowledge learning is of paramount importance for the capture and reuse of tacit and explicit knowledge. To support the process of knowledge learning, a methodology to establish an intelligent system, which consists of both On-Line Analytical Processing (OLAP) and fuzzy logic principles, is suggested. This paper proposes this approach for integrating OLAP and fuzzy logic to form an intelligent system, capitalizing on the merits and at the same time offsetting the drawbacks of the involved technologies. In this system, the values and positions of related fuzzy sets are modified to suit the industrial environment, supporting smoother operation with less error. To validate the feasibility of the proposed system, a case study related to the monitoring of the chemical concentration in a PCB electroplating process is covered in the paper.
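As a hedged, minimal illustration of the fuzzy-logic side only (the set boundaries and the rule are invented, and the OLAP side is omitted entirely), the sketch below defines triangular low/normal/high membership functions over a monitored concentration reading and a toy rule that flags corrective action when the reading is not clearly normal.

def triangular(x, a, b, c):
    """Membership of x in a triangular fuzzy set rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def classify(concentration):
    degrees = {
        "low":    triangular(concentration, 0.0, 10.0, 20.0),
        "normal": triangular(concentration, 15.0, 25.0, 35.0),
        "high":   triangular(concentration, 30.0, 40.0, 50.0),
    }
    action = "adjust dosing" if degrees["normal"] < 0.5 else "no action"
    return degrees, action

print(classify(18.0))   # weakly "normal" -> adjust dosing
print(classify(24.0))   # clearly "normal" -> no action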

Journal ArticleDOI
TL;DR: In this article, an over-constrained scheduling problem where constraints cannot be relaxed is studied, and the problem originates from a local defense agency where activities to be scheduled are strongly constrained.
Abstract: In this work we study an over-constrained scheduling problem where constraints cannot be relaxed. This problem originates from a local defense agency where activities to be scheduled are strongly r...