
Showing papers presented at "Computer Aided Systems Theory" in 2009


Book ChapterDOI
30 Sep 2009
TL;DR: The developed application uses only a laser radar which provides information to sort objects according to their shape and movement, and the subsequent detection and classification provide higher-level tracking.
Abstract: This paper describes the detection of moving obstacles using laser radar in road environments. This application is designed to be implemented in further research on data fusion technologies. The developed application uses only a laser radar which provides information to sort objects according to their shape and movement. The subsequent detection and classification provide higher level tracking.

29 citations


Book ChapterDOI
30 Sep 2009
TL;DR: The 0-1 knapsack problem itself, the background of the HSA, the Baldwin and Lamarck effects, and the numerical tests are described, and the results of the tests performed are somewhat surprising.
Abstract: In the paper we carry out an analysis of the properties of the Harmony Search Algorithm (HSA) on the well-known one-dimensional binary knapsack problem. Binary knapsack problems are among the most widely studied problems in discrete optimization. Since the optimization versions of these problems are NP-hard, practical solution techniques do not ask for optimality, but are heuristics that generate feasible, suboptimal solutions. In this paper we describe the 0-1 knapsack problem itself, the background of the HSA, the Baldwin and Lamarck effects, and the numerical tests. The results of the tests performed are somewhat surprising.

22 citations
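The Harmony Search idea applied to the binary knapsack can be sketched in a few lines. The following is a minimal, illustrative Python sketch, not the paper's implementation; the parameter names (`hms`, `hmcr`, `par`) follow standard HSA terminology, but all values and the infeasibility handling (zero fitness) are assumptions:

```python
import random

def harmony_search_knapsack(values, weights, capacity,
                            hms=10, hmcr=0.9, par=0.3, iters=5000, seed=1):
    """Minimal Harmony Search sketch for the 0-1 knapsack problem.

    hms  - harmony memory size
    hmcr - harmony memory consideration rate
    par  - pitch adjustment (bit-flip) rate
    """
    rng = random.Random(seed)
    n = len(values)

    def fitness(x):
        weight = sum(w for w, bit in zip(weights, x) if bit)
        if weight > capacity:            # infeasible harmonies score 0
            return 0
        return sum(v for v, bit in zip(values, x) if bit)

    # harmony memory: hms random bit-vectors
    memory = [[rng.randint(0, 1) for _ in range(n)] for _ in range(hms)]
    for _ in range(iters):
        new = []
        for j in range(n):
            if rng.random() < hmcr:      # take bit j from a stored harmony
                bit = rng.choice(memory)[j]
                if rng.random() < par:   # pitch adjustment: flip the bit
                    bit = 1 - bit
            else:                        # pure random choice
                bit = rng.randint(0, 1)
            new.append(bit)
        worst = min(memory, key=fitness)
        if fitness(new) > fitness(worst):
            memory[memory.index(worst)] = new
    return max(memory, key=fitness)
```

On a tiny instance (values 60/100/120, weights 10/20/30, capacity 50) the sketch reliably finds the optimal value 220.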


Book ChapterDOI
30 Sep 2009
TL;DR: A method to obtain a near-optimal neuro-controller for autonomous helicopter flight by means of an ad hoc evolutionary reinforcement learning method, using a helicopter hovering simulator created at Stanford University.
Abstract: In this paper we present a method to obtain a near-optimal neuro-controller for autonomous helicopter flight by means of an ad hoc evolutionary reinforcement learning method. The method presented here was developed for the Second Annual Reinforcement Learning Competition (RL2008) held in Helsinki, Finland. The present work uses a helicopter hovering simulator created at Stanford University that simulates a Radio Control XCell Tempest helicopter in a flight regime close to hover. The objective of the controller is to hover the helicopter by manipulating four continuous control actions based on a 12-dimensional state space.

22 citations


Book ChapterDOI
30 Sep 2009
TL;DR: A new technique is proposed that allows early termination of an under-approximation refinement loop, although the original formula is unsatisfiable, and it is shown how over-Approximation and under- approximation techniques can be combined.
Abstract: Recently, it has been proposed to use approximation techniques in the context of decision procedures for the quantifier-free theory of fixed-size bit-vectors. We discuss existing and novel variants of under-approximation techniques. Under-approximations produce smaller models and may reduce solving time significantly. We propose a new technique that allows early termination of an under-approximation refinement loop, although the original formula is unsatisfiable. Moreover, we show how over-approximation and under-approximation techniques can be combined. Finally, we evaluate the effectiveness of our approach on array and bit-vector benchmarks of the SMT library.

21 citations


Book ChapterDOI
30 Sep 2009
TL;DR: A construction heuristic based on Kruskal's algorithm for finding a minimum cost spanning tree which eliminates some drawbacks of existing heuristic methods and seems to be a better starting point for subsequent improvement methods.
Abstract: The rooted delay-constrained minimum spanning tree problem is an NP-hard combinatorial optimization problem arising, for example, in the design of centralized broadcasting networks where quality of service constraints are of concern. We present a construction heuristic based on Kruskal's algorithm for finding a minimum cost spanning tree which eliminates some drawbacks of existing heuristic methods. To improve the solution we introduce a greedy randomized adaptive search procedure (GRASP) and a variable neighborhood descent (VND) using two different neighborhood structures. Experimental results indicate that our approach produces solutions of better quality in shorter runtime under strict delay bounds, compared to an existing centralized construction method based on Prim's algorithm. Especially when testing on Euclidean instances, our Kruskal-based heuristic outperforms the Prim-based approach in all scenarios. Moreover, our construction heuristic seems to be a better starting point for subsequent improvement methods.

21 citations
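The base algorithm the heuristic builds on is plain Kruskal with a union-find structure, sketched below; the paper's delay-feasibility check, which is its actual contribution, is deliberately omitted here:

```python
def kruskal_mst(n, edges):
    """Plain Kruskal: `edges` is a list of (cost, u, v) over nodes 0..n-1.

    Returns (total_cost, tree_edges). The paper's heuristic additionally
    checks a root-to-node delay bound before accepting an edge; that check
    is omitted in this sketch.
    """
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    total, tree = 0, []
    for cost, u, v in sorted(edges):        # cheapest edges first
        ru, rv = find(u), find(v)
        if ru != rv:                        # edge joins two components
            parent[ru] = rv
            total += cost
            tree.append((u, v))
    return total, tree
```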


Book ChapterDOI
30 Sep 2009
TL;DR: In this article, a real-time vision-based system that detects vehicles approaching from the rear in order to anticipate possible rear-end collisions is described, where a camera mounted on the rear of the vehicle provides images which are analyzed by means of computer vision techniques.
Abstract: This paper describes a real-time vision-based system that detects vehicles approaching from the rear in order to anticipate possible rear-end collisions. A camera mounted on the rear of the vehicle provides images which are analyzed by means of computer vision techniques. The detection of candidates is carried out using the top-hat transform in combination with intensity and edge-based symmetries. The candidates are classified by using a Support Vector Machine (SVM) classifier with Histograms of Oriented Gradients (HOG) features. Finally, the position of each vehicle is tracked using a Kalman filter and template matching techniques. The proposed system is tested using image data collected in real traffic conditions.

19 citations
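The tracking stage relies on a Kalman filter. As an illustration of the filtering idea only (the paper tracks 2D image positions; this is a scalar stand-in with assumed noise variances `q` and `r`), a minimal one-dimensional Kalman filter:

```python
def kalman_1d(zs, q=1e-3, r=0.5, x0=0.0, p0=1.0):
    """Scalar Kalman filter: a random-walk state observed through noise.

    q  - process-noise variance (how fast the true state may drift)
    r  - measurement-noise variance
    """
    x, p = x0, p0
    estimates = []
    for z in zs:
        p += q                # predict: random-walk model, uncertainty grows
        k = p / (p + r)       # Kalman gain
        x += k * (z - x)      # correct with the measurement residual
        p *= 1.0 - k          # posterior variance shrinks
        estimates.append(x)
    return estimates
```

Fed a constant noisy measurement, the estimate converges toward it, with the gain settling to a small steady-state value.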


Book ChapterDOI
30 Sep 2009
TL;DR: It is seen that the use of enhanced genetic programming yields models for emissions that are valid not only in certain parts of the parameter space but can be used as global virtual sensors.
Abstract: In this paper we discuss the generation of models for emissions of a Diesel engine, produced by genetic programming based evolutionary system identification: models for the formation of NOx and particulate matter emissions are identified and analyzed. We compare these models to models designed by experts applying variable selection and the identification of local polynomial models. Analyzing the results summarized in the empirical part of this paper, we see that the use of enhanced genetic programming yields models for emissions that are valid not only in certain parts of the parameter space but can be used as global virtual sensors.

15 citations


Book ChapterDOI
30 Sep 2009
TL;DR: A top-down methodology to detect drusen in initial stages to prevent age-related macular degeneration is proposed, which has several stages where the key issues are the detection and characterization of suspect areas.
Abstract: Age-related macular degeneration (AMD) is the main cause of blindness among people over 50 years in developed countries, and there are 150 million people affected worldwide. This disease can lead to severe loss of central vision and adversely affect the patient's quality of life. The appearance of drusen is associated with early AMD, so we propose a top-down methodology to detect drusen in initial stages to prevent AMD. The proposed methodology has several stages, where the key issues are the detection and characterization of suspect areas. We test our method with a set of 1280×1024 images, obtaining a system with a high sensitivity in the localization of drusen, while rejecting false lesions.

15 citations


Book ChapterDOI
30 Sep 2009
TL;DR: New local search algorithms for the Probabilistic Traveling Salesman Problem (PTSP) are presented using sampling and ad-hoc approximation to improve both runtime and solution quality of state-of-the-art local search algorithm for the PTSP.
Abstract: In this paper we present new local search algorithms for the Probabilistic Traveling Salesman Problem (PTSP) using sampling and ad-hoc approximation. These algorithms improve both runtime and solution quality of state-of-the-art local search algorithms for the PTSP.

15 citations


Book ChapterDOI
30 Sep 2009
TL;DR: An abstract model is proposed for the estimation of sequence dependent setup costs and subsequently dispatching and scheduling strategies are applied to generate optimized production sequences.
Abstract: Setup costs are a crucial factor in many branches of industry and frequently sequence dependent. However, the empirical acquisition of setup costs is inaccurate and not practicable for companies with large product portfolios operating in volatile markets. We therefore propose an abstract model for the estimation of such sequence dependent setup costs and subsequently apply dispatching and scheduling strategies to generate optimized production sequences. Both approaches are tested on randomly generated test instances and a real-world production scenario.

14 citations


Book ChapterDOI
30 Sep 2009
TL;DR: A new real-time hierarchical (topological/metric) Visual SLAM system focusing on the localization of a vehicle in large-scale outdoor urban environments is presented, exclusively based on the visual information provided by both a low-cost wide-angle stereo camera and a low-cost GPS.
Abstract: In this paper we present a new real-time hierarchical (topological/metric) Visual SLAM system focusing on the localization of a vehicle in large-scale outdoor urban environments. It is exclusively based on the visual information provided by both a low-cost wide-angle stereo camera and a low-cost GPS. Our approach divides the whole map into local sub-maps identified by the so-called fingerprint (reference poses). At the sub-map level (Low Level SLAM), 3D sequential mapping of natural landmarks and the vehicle location/orientation are obtained using a top-down Bayesian method to model the dynamic behavior. A higher topological level (High Level SLAM) based on reference poses has been added to reduce the global accumulated drift, keeping real-time constraints. Using this hierarchical strategy, we keep local consistency of the metric sub-maps by means of the EKF, and global consistency by using the topological map and the MultiLevel Relaxation (MLR) algorithm. GPS measurements are integrated at both levels, improving global estimation. Some experimental results for different large-scale urban environments are presented, showing an almost constant processing time.

Book ChapterDOI
30 Sep 2009
TL;DR: This paper examines the problem of motivating drivers to cooperate and contribute to packet forwarding in Vehicle-to-Vehicle and Vehicle-to-Roadside communications, analyzes the drawbacks of known schemes, and proposes a new secure incentive scheme to stimulate cooperation in VANETs.
Abstract: Vehicular Ad-hoc NETworks (VANETs) will provide many interesting services in the near future. One of the most promising is commercial applications. In such a case, it will be necessary to motivate drivers to cooperate and contribute to packet forwarding in Vehicle-to-Vehicle and Vehicle-to-Roadside communications. This paper examines the problem, analyzes the drawbacks of known schemes, and proposes a new secure incentive scheme to stimulate cooperation in VANETs.

Book ChapterDOI
30 Sep 2009
TL;DR: This paper presents the part of STOWL which has to do with the definition of n-ary relations, and uses this model as the common model for defining the schemes of the data sources in order to ease their integration.
Abstract: There are many issues to overcome when integrating different data sources due to the number of variables that are involved in the integration phase. However, we are interested in the integration of temporal and spatial information due to the nature of modern Information Systems. We have previously developed a model, called STOWL, which is a spatio-temporal extension of OWL. We use this model as the common model for defining the schemas of the data sources in order to ease their integration. This paper presents the part of STOWL which deals with the definition of n-ary relations.

Book ChapterDOI
30 Sep 2009
TL;DR: A solution procedure that hybridizes a Variable Neighborhood Search (VNS) and a Greedy Randomized Adaptive Search Procedure (GRASP) for the corresponding optimization problem.
Abstract: We consider the Vehicle Routing Problem with time windows where travel times are triangular fuzzy numbers. The weighted possibility and necessity measure of fuzzy relations is used to specify a confidence level at which it is desired that the travel times to reach each customer fall into their time windows. In this paper we propose and analyze a solution procedure that hybridizes a Variable Neighborhood Search (VNS) and a Greedy Randomized Adaptive Search Procedure (GRASP) for the corresponding optimization problem.

Book ChapterDOI
30 Sep 2009
TL;DR: The paper deals with the design of a robust controller via algebraic μ-synthesis for a two-tank system, which is a well-known benchmark problem; the robustness is measured by the structured singular value denoted μ.
Abstract: The paper deals with the design of a robust controller via algebraic μ-synthesis for a two-tank system, which is a well-known benchmark problem. The controller is obtained by decoupling the two-input two-output system into two identical SISO (Single-Input Single-Output) plants. The task of robust controller design is then performed by finding a suitable pole placement for the SISO systems. The robustness is measured by the structured singular value denoted μ. The final controller is verified through simulation for plants perturbed by worst-case perturbations.

Book ChapterDOI
30 Sep 2009
TL;DR: This paper presents an open-source tool which generates synthesizable HDL code from assertions specified in the Property Specification Language (PSL) by first reducing the PSL formulas into base cases, called PSLmin, and then generating automata which can be transformed to synthesizable HDL code and therefore into hardware.
Abstract: The effort of verifying state-of-the-art hardware designs undeviatingly increases with the complexity of those designs. The design's state space, directly related to its complexity, grows exponentially, while the computational performance for verifying the design grows only linearly. This so-called verification gap can, for example, be met by using methods such as assertion-based verification (ABV), which can be used for both specifying the system's properties and verifying the related implementation during the simulation phase. In this paper, we present an open-source tool which generates synthesizable HDL code from assertions specified in the Property Specification Language (PSL). This is done by first reducing the PSL formulas into base cases, called PSLmin, and then generating automata which can be transformed to synthesizable HDL code and therefore into hardware.

Book ChapterDOI
30 Sep 2009
TL;DR: This paper shows how to use a Bounded Model Checker for C programs as an automatic test generator for the Coverage Analysis, and it is shown how its use can substantially reduce the costs of the testing phase.
Abstract: Testing is the most used technique for software verification: it is easy to use, and even if no error is found, it can release a set of tests certifying the (partial) correctness of the compiled system. Moreover, in order to increase confidence in the correctness of the compiled system, it is often required that the provided set of tests covers 100% of the code. This requirement, however, substantially increases the costs associated with the testing phase, since it may involve the manual generation of tests. In this paper we show how to use a Bounded Model Checker for C programs (CBMC) as an automatic test generator for coverage analysis, and we show how its use can substantially reduce the costs of the testing phase.
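CBMC explores C programs symbolically; as a toy analogy only (not CBMC's algorithm, and in Python rather than C), coverage-driven test generation can be pictured as a bounded search for one input per branch. The function `f` and the branch-id convention are assumptions of this sketch:

```python
def bounded_test_generation(f, n_branches, bound=100):
    """Search the bounded input range [-bound, bound] for one test input
    per branch of f; f(x) must return the id of the branch it takes.

    Returns a dict mapping branch id -> covering input. CBMC does the
    analogous search symbolically, without enumerating inputs.
    """
    tests = {}
    for x in range(-bound, bound + 1):
        b = f(x)
        if b not in tests:
            tests[b] = x            # first input reaching this branch
        if len(tests) == n_branches:
            break                   # full branch coverage achieved
    return tests
```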

Book ChapterDOI
30 Sep 2009
TL;DR: This paper presents an initial approach to a performance modelling framework, based on the SAE standardised modelling and analysis language AADL, to integrate performance analysis into the toolchain of this practical context.
Abstract: The new paradigm of Integrated Modular Avionics (IMA) [1] necessitates the analysis and validation of non-functional requirements for IMA systems. This includes the analysis of their performability. In this paper we present an initial approach to a performance modelling framework, based on the SAE standardised modelling and analysis language AADL [2,3], to integrate performance analysis into the toolchain of this practical context. The proposed framework is a hybrid of static and dynamic systems analysis and includes aspects of performance evaluation.

Book ChapterDOI
30 Sep 2009
TL;DR: Methods for the automatic analysis and classification of polysomnographic recordings, which encompass a set of heterogeneous biological signals recorded simultaneously that are very complex and exhibit nonstationarity and stochasticity.
Abstract: This paper describes methods for the automatic analysis and classification of biological signals. Polysomnographic (PSG) recordings encompass a set of heterogeneous biological signals (e.g. EEG, EOG, EMG, ECG, PNG) recorded simultaneously. These signals, especially the EEG, are very complex and exhibit nonstationarity and stochasticity. Thus their processing represents a challenging multilevel procedure composed of several methods. The methods used are illustrated on examples of PSG recordings of newborns and sleep recordings of adults, and can be applied to similar tasks in other problem domains. The analysis was performed using real clinical data.

Book ChapterDOI
30 Sep 2009
TL;DR: The performance and features of WSNSim, a WSN Simulator and the immediate advantages that can be experienced by researchers working on various realms in this area are presented.
Abstract: Wireless Sensor Networks (WSNs) are fast becoming the holy grail of digital surveillance, data monitoring and analysis. Being relatively cheap, low-powered and easy to deploy, WSNs are being adopted in a range of different fields. Currently, the focus is on optimizing the techniques used to form sensor clusters and to route data within a network. To satisfy these goals a significant amount of research is being done globally, and what tends to be lacking at times are the right tools to make such research work less time-consuming and less expensive. The use of simulation tools is one such adaptation that can help researchers closely analyse a particular aspect of WSNs while sticking with a known environment that is applicable to other scenarios in a similar way. This paper presents the performance and features of WSNSim, a WSN simulator, and the immediate advantages that can be experienced by researchers working on various realms in this area.

Book ChapterDOI
30 Sep 2009
TL;DR: This chapter deals with an application of the algebraic formalism in nonlinear control systems to the nonlinear controller design for a fluid tank system, whose discrete-time model and transfer function are derived.
Abstract: This chapter deals with an application of the algebraic formalism in nonlinear control systems to the nonlinear controller design for a fluid tank system. A nonlinear discrete-time model of a fluid tank system and its transfer function are derived. Then, nonlinear continuous- and discrete-time controllers are designed using the transfer function formalism for nonlinear systems which was developed recently. Verification on the real plant is also included; it suggested modifying the original one-tank model to achieve better performance on the real plant.

Book ChapterDOI
30 Sep 2009
TL;DR: A newly proposed Morphotronic System paradigm that can be used as a general computation model for the construction of software; it allows the definition of very flexible software prototypes, in which a processing path can be interpreted as an extremum principle.
Abstract: This paper discusses a newly proposed Morphotronic System paradigm that can be used as a general computation model for the construction of software. The Morphotronic System allows the definition of very flexible software prototypes, in which a processing path can be interpreted as an extremum principle. This approach offers a significant improvement over traditional software practices. The system purpose is stated as a conceptual input to the computer system at the starting point of the computation process, while the local machine state is completely ignored. The system context and its rules are generated as resources to allocate the computational components. The morphotronic system applies non-Euclidean geometry, which makes it possible to shape the context and to define the projection operators for an ideal network of forms.

Book ChapterDOI
30 Sep 2009
TL;DR: This work proposes the use of Fuzzy Rule-based Classification in order to obtain the robot position during the estimation stage, after a short training stage where only a few significant WiFi measures are needed.
Abstract: The framework of this paper is robot localization inside buildings using WiFi signal strength measurements. This localization is usually made up of two phases: the training and estimation stages. In the former, the WiFi signal strengths of all visible Access Points (APs) are collected and stored in a database or WiFi map, while in the latter the signal strengths received from all APs at a certain position are compared with the WiFi map to estimate the robot location. This work proposes the use of Fuzzy Rule-based Classification to obtain the robot position during the estimation stage, after a short training stage where only a few significant WiFi measurements are needed. As a result, the proposed method is easily adaptable to new environments where triangulation algorithms cannot be applied since the AP physical locations are unknown. It has been tested in a real environment using our own robotic platform. Experimental results are better than those achieved by other classical methods.
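The paper's estimation stage uses a fuzzy rule-based classifier; a much simpler baseline for the same WiFi-map matching idea is nearest-neighbour fingerprinting, sketched here under the assumption that fingerprints are dicts mapping AP identifiers to dBm values:

```python
def wifi_locate(wifi_map, reading, missing=-100.0):
    """Nearest-neighbour fingerprinting: return the stored position whose
    AP signal strengths (dict of AP id -> dBm) best match `reading`.

    APs unseen on one side are treated as very weak (`missing` dBm).
    This is a stand-in baseline, not the paper's fuzzy classifier.
    """
    def dist(stored):
        aps = set(stored) | set(reading)
        return sum((stored.get(ap, missing) - reading.get(ap, missing)) ** 2
                   for ap in aps)
    return min(wifi_map, key=lambda pos: dist(wifi_map[pos]))
```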

Book ChapterDOI
30 Sep 2009
TL;DR: The aim of the present work is to study the first exit time problem for the resulting stochastic process of tumor growth based on the Gompertz law.
Abstract: A stochastic model describing tumor growth based on the Gompertz law is considered. We focus on the tumor size at detection time. We assume the initial state is a random variable, since it may suffer from errors due to measurement and diagnostics. The aim of the present work is to study the first exit time problem for the resulting stochastic process. A numerical analysis is also performed for particular choices of the initial distribution.
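A first exit time for a stochastic Gompertz model can be estimated numerically by Euler-Maruyama simulation. The sketch below assumes multiplicative noise of intensity `sigma` and a fixed starting size `x0`; it is illustrative only and does not reproduce the paper's analytical treatment or its random initial state:

```python
import math
import random

def gompertz_exit_time(x0, a, b, threshold, sigma=0.05, dt=1e-3,
                       tmax=50.0, seed=0):
    """Euler-Maruyama simulation of a stochastic Gompertz growth model,
    dX = X (a - b ln X) dt + sigma X dW, started at X(0) = x0 > 0.

    Returns the first time X crosses `threshold`, or None within tmax.
    """
    rng = random.Random(seed)
    x, t = x0, 0.0
    sqdt = math.sqrt(dt)
    while t < tmax:
        drift = x * (a - b * math.log(x))       # Gompertz drift term
        x = x + drift * dt + sigma * x * rng.gauss(0.0, 1.0) * sqdt
        x = max(x, 1e-9)                        # keep the size positive
        t += dt
        if x >= threshold:
            return t
    return None
```

With a = 1, b = 0.5 the deterministic asymptote is exp(a/b) ≈ 7.39, so a threshold of 5 is crossed in finite time with high probability.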

Book ChapterDOI
30 Sep 2009
TL;DR: A generic workflow model for describing fire fighting operations in different scenarios is introduced and heuristics for calculating the similarity of workflows which can be used for searching and clustering are described.
Abstract: Workflows are used nowadays in different areas of application. Emergency services are one of these areas where explicitly defined workflows help to increase traceability, control, efficiency, and quality of rescue missions. In this paper, we introduce a generic workflow model for describing fire fighting operations in different scenarios. Based on this model we also describe heuristics for calculating the similarity of workflows which can be used for searching and clustering.

Book ChapterDOI
30 Sep 2009
TL;DR: The Morphotronic approach postulates a significant improvement over traditional system design thinking based on the Turing Machine model, and the paper presents a range of important concepts and definitions supporting this proposition.
Abstract: The Morphotronic approach postulates a significant improvement over traditional system design thinking based on the Turing Machine model. The paper presents a range of important concepts and definitions supporting this proposition. The Morphotronic system represents an abstract universe of objects. This universe of objects has two interpretations, as in the case of the voltages and currents in an electrical circuit: for the space of the voltages, the objects are the voltages at the edges of the electrical circuit; for the space of the currents, the objects are the currents in any edge. The dimension of the object space is equal to the number of edges in the electrical circuit. Such a space allows a dual interpretation of the currents and voltages. Other dual variables can be used in the morphotronic system, such as forces and fluxes in mechanics or dissipative thermodynamics; in general, the dual interpretation of the object space will be denoted as causes and effects. The morphogenetic system can be modelled by samples of the causes and effects. The morphotronic system with the samples generates the algorithm to implement the purpose in the system. Provided that the samples of the effect and the purpose denote a virtual cause, the vector E can be computed so that it represents the effective origin of the causes inside the purpose map. With the cause-effect rule the effective causes can be computed, obtaining results that are coherent with the samples. Provided that the virtual cause is given by the purpose, the effective causes can be generated in agreement with the samples. The described algorithm is denoted as the projection operator that transforms a virtual cause (purpose) into an effective cause.

Book ChapterDOI
30 Sep 2009
TL;DR: In this paper, modeling approaches to signal generation and processing in single neurons and to spatiotemporal activity patterns in neuronal ensembles are discussed.
Abstract: In Computational Neuroscience, mathematical and computational modeling are differentiated. In this paper, both kinds of modeling are considered. In particular, modeling approaches to signal generation and processing in single neurons (i.e., membrane excitation dynamics, spike propagation, and dendritic integration) and to spatiotemporal activity patterns in neuronal ensembles are discussed.

Book ChapterDOI
30 Sep 2009
TL;DR: This work presents a new approach to feature selection that combines advantages of both wrapper and filter approaches, by using logistic regression and the area under the ROC curve (AUC) to evaluate pairs of features.
Abstract: The process of feature selection is an important first step in building machine learning models. Feature selection algorithms can be grouped into wrappers and filters; the former use machine learning models to evaluate feature sets, the latter use other criteria to evaluate features individually. We present a new approach to feature selection that combines advantages of both wrapper and filter approaches, by using logistic regression and the area under the ROC curve (AUC) to evaluate pairs of features. After choosing as starting feature the one with the highest individual discriminatory power, we incrementally rank features by choosing as next feature the one that achieves the highest AUC in combination with an already chosen feature. To evaluate our approach, we compared it to standard filter and wrapper algorithms. Using two data sets from the biomedical domain, we are able to demonstrate that the performance of our approach exceeds that of filter methods, while being comparable to wrapper methods at smaller computational cost.
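The ranking criterion here is the area under the ROC curve, which can be computed directly from classifier scores via the Mann-Whitney rank statistic (this sketch shows only the AUC evaluation, not the paper's logistic-regression pairing):

```python
def auc(scores, labels):
    """Area under the ROC curve via the Mann-Whitney identity:
    the fraction of (positive, negative) pairs whose scores are
    ranked correctly, counting ties as one half."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

Perfectly separating scores give an AUC of 1.0; scores carrying no class information give 0.5.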

Book ChapterDOI
30 Sep 2009
TL;DR: A 3D reconstruction algorithm is used for the purpose of defining no-fly zones and an expert system has been built in cooperation with surgeons that, based on simple rules, can assess the risk of the trainee's actions.
Abstract: The discussed training system employs several means of encouraging safe behavior during laparoscopic surgery procedures. The elements of no-fly zones, magnetic position sensing and expert systems are tied together to form a complex system that provides guidance and performance assessment. A 3D reconstruction algorithm is used for the purpose of defining no-fly zones and has been tested in a simulator developed for the purpose of this work. An expert system has been built in cooperation with surgeons that, based on simple rules, can assess the risk of the trainee's actions. Despite the shortcomings of the 3D reconstruction process, the training system performed as expected during experiments. Simple exercises like touching points in 3D space were performed and scored according to whether a no-fly zone had been breached or not. Simple advice could also be provided to the trainee in order to help improve the results.

Book ChapterDOI
30 Sep 2009
TL;DR: In this work, it is shown how to obtain a digital filter from a given netlist of an analog filter by skipping the transfer function description.
Abstract: Traditionally, IIR digital filters are designed by describing analog filters in the time or transform domain and then converting the analog filters to digital filters using an appropriate transformation from the s-domain to the z-domain. For many engineers, analog filters mean certain circuits or a netlist of components, and digital filters are a set of statements in certain software. In this work, we show how to obtain a digital filter from a given netlist of an analog filter, skipping the transfer function description.