
Showing papers by "Ivan Petrović published in 2011"


Journal ArticleDOI
TL;DR: This work introduces a novel path planning and replanning algorithm, the two-way D* (TWD*) algorithm, based on a two-dimensional occupancy grid map of the environment, which can find optimal paths in weighted occupancy grid maps.

40 citations


Journal ArticleDOI
TL;DR: A new online path-smoothing algorithm smooths a path consisting of straight line segments; it is primarily intended for differential drive mobile robots and allows non-zero initial path curvature, which is important for replanning.

37 citations


Proceedings ArticleDOI
03 Jul 2011
TL;DR: The Model Predictive Control (MPC) strategy is used to solve the mobile robot trajectory tracking problem, where the controller must ensure that the robot follows a pre-calculated trajectory.
Abstract: In this paper the Model Predictive Control (MPC) strategy is used to solve the mobile robot trajectory tracking problem, where the controller must ensure that the robot follows a pre-calculated trajectory. The so-called explicit optimal controller design and implementation are described. The MPC solution is calculated off-line and expressed as a piecewise affine function of the current state of the mobile robot. A linearized kinematic model of a differential drive mobile robot is used for the controller design. The optimal controller, which has the form of a look-up table, is tested in simulation and experimentally.
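As a rough illustration of the look-up-table form mentioned above, the hedged Python sketch below evaluates a piecewise affine control law stored as polyhedral state-space regions; the region data, state dimension and gains are invented placeholders, not the controller designed in the paper.

# Hedged sketch (not the paper's implementation): evaluating an explicit MPC
# law stored as a look-up table of state-space regions. Each region is a
# polyhedron {x : H x <= k} with an affine control law u = F x + g.
import numpy as np

regions = [
    # (H, k, F, g): placeholder numbers for a 3-state tracking-error model
    (np.array([[1.0, 0.0, 0.0], [-1.0, 0.0, 0.0]]), np.array([0.5, 0.5]),
     np.array([[-2.0, 0.0, 0.0], [0.0, -1.5, -0.8]]), np.array([0.0, 0.0])),
    # ... further regions computed off-line by the multi-parametric solver
]

def explicit_mpc_control(x):
    """Return u = F x + g for the region containing the tracking-error state x."""
    for H, k, F, g in regions:
        if np.all(H @ x <= k + 1e-9):          # point-location test
            return F @ x + g
    raise ValueError("state outside the explored state-space partition")

u = explicit_mpc_control(np.array([0.1, -0.05, 0.02]))   # velocity-command correction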

36 citations


Proceedings Article
23 May 2011
TL;DR: A modified Asano's algorithm is implemented for determining visibility polygons and visibility graphs; the main advantage of this method is that it can be applied in dynamic environments (environments that change over time).
Abstract: Computational geometry is very important for solving motion planning problems. Visibility graphs are very useful in determining the shortest path. In this work, a modified Asano's algorithm is implemented for determining the visibility polygons and visibility graphs. The implementation is done using the CGAL library. Although the principle for determining visibility graphs is rather simple, the procedure is very time and space consuming, and the goal is to achieve lower algorithm complexity. The algorithm consists of two steps: first, angular sorting of points is done using the dual transformation, and second, visibility between the points is determined. Testing of the algorithm is done on two polygonal test sets. The first is made of squares, uniformly and densely distributed. The second is made of triangles, randomly and sparsely distributed. Results show a cubic complexity of the algorithm, depending on the number of reflex points. The main advantage of this method is that it can be applied in dynamic environments (environments that change over time). It is not required to perform the calculation for all points on the map; instead, the graph can be refreshed locally, so it is very practical for online use.
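For orientation only, the hedged sketch below builds a visibility graph by the brute-force method (testing every vertex pair against every obstacle edge, ignoring collinear and interior-passing cases); it merely illustrates the structure being computed, whereas the modified Asano's algorithm in the paper reaches it far more efficiently through angular sorting in the dual plane and is implemented with CGAL.

# Hedged sketch: brute-force visibility graph over polygon vertices,
# only to illustrate the structure the paper computes.
def segments_cross(p, q, a, b):
    """True if segment pq properly crosses segment ab."""
    def orient(o, u, v):
        return (u[0]-o[0])*(v[1]-o[1]) - (u[1]-o[1])*(v[0]-o[0])
    d1, d2 = orient(a, b, p), orient(a, b, q)
    d3, d4 = orient(p, q, a), orient(p, q, b)
    return (d1*d2 < 0) and (d3*d4 < 0)

def visibility_graph(vertices, obstacle_edges):
    """Connect every vertex pair whose joining segment crosses no obstacle edge."""
    edges = []
    for i in range(len(vertices)):
        for j in range(i + 1, len(vertices)):
            p, q = vertices[i], vertices[j]
            if not any(segments_cross(p, q, a, b) for a, b in obstacle_edges):
                edges.append((i, j))
    return edges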

36 citations


Journal ArticleDOI
TL;DR: A novel complete coverage algorithm, called the complete coverage D* (CCD*) algorithm, is developed based on the D* search of a two-dimensional occupancy grid map of the environment, with emphasis on safety of motion and reduction of path length and search time.

34 citations


Proceedings ArticleDOI
03 Jul 2011
TL;DR: In this article, a dynamic model of the mobile robot that accounts for robot actuator constraints is derived and used to find an optimal velocity profile along a predefined path in order to traverse the path in the shortest time.
Abstract: This paper is concerned with the problem of finding an optimal velocity profile along a predefined path in order to traverse the path in the shortest time. A dynamic model of the mobile robot that accounts for robot actuator constraints is derived. It is shown how to use this dynamic model to express the acceleration limits and the velocity limit curve required by the optimal time-scaling algorithm. The developed trajectory planning algorithm is demonstrated on the Pioneer 3DX mobile robot. The method is very flexible and can be extended to account for other constraints, such as limited grip between the robot wheels and the ground.
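The velocity limit curve and acceleration limits enter a time-scaling pass of the following flavour; this hedged sketch uses the textbook forward/backward propagation with constant acceleration bounds and a made-up limit curve, whereas the paper derives both from the actuator-constrained dynamic model.

# Hedged sketch of the classic forward/backward time-scaling pass: given a
# velocity limit curve v_limit(s) sampled along the path and (for illustration)
# constant tangential acceleration limits, compute the fastest feasible profile.
import numpy as np

def time_optimal_profile(s, v_limit, a_max, a_min):
    v = v_limit.copy()
    v[0] = 0.0                           # start at rest
    for i in range(1, len(s)):           # forward pass: acceleration limit
        ds = s[i] - s[i-1]
        v[i] = min(v[i], np.sqrt(v[i-1]**2 + 2.0 * a_max * ds))
    v[-1] = 0.0                          # stop at the end of the path
    for i in range(len(s) - 2, -1, -1):  # backward pass: deceleration limit
        ds = s[i+1] - s[i]
        v[i] = min(v[i], np.sqrt(v[i+1]**2 + 2.0 * abs(a_min) * ds))
    return v

s = np.linspace(0.0, 5.0, 501)               # arc-length samples [m]
v_lim = 0.8 - 0.3 * np.sin(1.2 * s)**2       # made-up velocity limit curve [m/s]
profile = time_optimal_profile(s, v_lim, a_max=0.5, a_min=-0.5)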

30 citations


Proceedings ArticleDOI
15 Dec 2011
TL;DR: A comparative study of three algorithms that use an occupancy grid map of the environment to find a path for a mobile robot from the given start to the goal — the D∗, Two Way D∗ (TWD∗) and E∗; all three algorithms have the ability of dynamic replanning in case of changes in the environment.
Abstract: This paper presents a comparative study of three algorithms that use an occupancy grid map of the environment to find a path for a mobile robot from the given start to the goal — the D∗, Two Way D∗ (TWD∗) and E∗. All three algorithms have the ability of dynamic replanning in case of changes in the environment. The D∗ algorithm produces a path consisting of line segments with discrete heading transitions between cell edges — multiples of 45°. Such a path is hard for a mobile robot to follow. The TWD∗ and E∗ algorithms produce more natural paths with continuous headings of the path line segments. The criteria for comparison were the path characteristics, the execution time and the number of iterations of the algorithms' main while loop. The algorithms were verified both by simulation and experimentally on a Pioneer 3DX mobile robot equipped with a laser range finder.
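To see where the 45° restriction comes from, the hedged sketch below runs a plain Dijkstra search over an 8-connected occupancy grid; it is none of the three compared algorithms, but its paths show exactly the discrete-heading behaviour attributed to D∗, which TWD∗ and E∗ remove by interpolating costs across cells.

# Hedged sketch: Dijkstra over an 8-connected occupancy grid; resulting paths
# turn only in multiples of 45 degrees. Illustration only, not D*, TWD* or E*.
import heapq, math

def grid_path(occ, start, goal):
    """occ: 2D list, 1 = occupied; start/goal: (row, col). Returns cell path."""
    rows, cols = len(occ), len(occ[0])
    moves = [(dr, dc) for dr in (-1, 0, 1) for dc in (-1, 0, 1) if (dr, dc) != (0, 0)]
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, math.inf):
            continue
        for dr, dc in moves:
            nxt = (cell[0] + dr, cell[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and not occ[nxt[0]][nxt[1]]:
                nd = d + math.hypot(dr, dc)          # 1 or sqrt(2) per step
                if nd < dist.get(nxt, math.inf):
                    dist[nxt], prev[nxt] = nd, cell
                    heapq.heappush(pq, (nd, nxt))
    path, cell = [], goal
    while cell in prev:
        path.append(cell)
        cell = prev[cell]
    return [start] + path[::-1] if path or start == goal else []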

25 citations


Journal ArticleDOI
TL;DR: The FHD* algorithm guarantees the optimality of the global path, requires considerably less time for path replanning operations, and can be easily extended to the problem of path planning between different floors or buildings.
Abstract: Inspired by the Hierarchical D* (HD*) algorithm of D. Cagigas (Cagigas, 2005), in this paper we introduce a novel hierarchical path planning algorithm called Focussed Hierarchical D* (FHD*). Unlike the original HD* algorithm, the FHD* algorithm guarantees the optimality of the global path and requires considerably less time for path replanning operations. This is achieved by several modifications: (i) optimal placement of the so-called bridge nodes needed for hierarchy creation, (ii) focusing the search around the optimal path, which reduces the search area without loss of optimality, and (iii) introduction of partial starts and partial goals, which further reduce the computational time of replanning operations. The FHD* algorithm was tested in a multiroom indoor environment and compared to the original HD* algorithm, the non-hierarchical D* algorithm, and the Focussed D* algorithm under the same conditions. The FHD* algorithm significantly outperforms the other algorithms with respect to computational time. Furthermore, it can be easily extended to the problem of path planning between different floors or buildings.
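The two-level idea can be pictured with the hedged sketch below, which plans on a coarse graph of rooms joined by doorway ("bridge") nodes using the networkx library; node names, weights and the library choice are illustrative assumptions, and the paper's actual contributions (optimal bridge-node placement, search focusing, partial starts and goals) are not reproduced.

# Hedged sketch of the two-level idea behind hierarchical planners: search a
# coarse room/doorway graph first, then refine each leg on the fine grid.
import networkx as nx

coarse = nx.Graph()
# rooms A..C connected through doorway ("bridge") nodes d1, d2, with rough costs
coarse.add_edge("room_A", "d1", weight=4.0)
coarse.add_edge("d1", "room_B", weight=3.0)
coarse.add_edge("room_B", "d2", weight=5.0)
coarse.add_edge("d2", "room_C", weight=2.0)

route = nx.shortest_path(coarse, "room_A", "room_C", weight="weight")
# -> ['room_A', 'd1', 'room_B', 'd2', 'room_C']; a grid planner (e.g. D*)
# would then refine each room-to-doorway leg on the occupancy grid.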

15 citations


Proceedings Article
01 Jan 2011
TL;DR: This paper proposes to extend the joint probabilistic data association filter (JPDAF) with an entropy-based track management scheme, and shows that all the required data come from a running filter and that the scheme can be readily utilized for an arbitrary type of filter.
Abstract: In this paper we study the problem of tracking an arbitrary number of people with multiple heterogeneous sensors. To solve the problem, we start with a Bayesian derivation of the multiple-hypothesis tracking (MHT), and, under certain assumptions, we arrive at the joint probabilistic data association filter (JPDAF). In their original derivation, both the MHT and the JPDAF assume a multiple-sensor scenario, which enables us to fuse the sensor measurements by asynchronously updating the tracking filters. To solve the data association problem, instead of using the optimal MHT with complex hypothesis branching, we choose the JPDAF since we are interested only in local observations by a mobile robot for people detection, tracking, and avoidance. However, the JPDAF assumes a constant and known number of objects in the scene, and therefore we propose to extend it with an entropy-based track management scheme. The benefits of the proposed approach are that all the required data come from a running filter, and that it can be readily utilized for an arbitrary type of filter, as long as the entropy of the underlying distribution is tractable. The proposed algorithm is implemented for the Kalman and particle filter, and the performance is verified by simulation and experiment. For the simulation purposes, we analyze two generic sensors, a location and a bearing sensor, while in the experiments we use a laser range scanner, a microphone array and an RGB-D camera.
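For the Kalman filter case, an entropy-based track management rule can look like the hedged sketch below, which uses the closed-form differential entropy of a Gaussian posterior; the confirmation and deletion thresholds are illustrative assumptions, not the values used in the paper.

# Hedged sketch of entropy-based track management for a Kalman filter track:
# the differential entropy of a Gaussian posterior N(x, P) is
#   H = 0.5 * ln((2*pi*e)^n * det(P)).
# A track whose entropy stays low is confirmed; one whose entropy grows past a
# threshold (e.g. after repeated missed detections) is deleted.
import numpy as np

def gaussian_entropy(P):
    n = P.shape[0]
    return 0.5 * np.log(((2.0 * np.pi * np.e) ** n) * np.linalg.det(P))

def manage_track(P, confirm_thresh=-2.0, delete_thresh=2.0):
    H = gaussian_entropy(P)
    if H < confirm_thresh:
        return "confirmed"
    if H > delete_thresh:
        return "deleted"
    return "tentative"

P = np.diag([0.05, 0.05, 0.2, 0.2])     # illustrative position/velocity covariance
print(gaussian_entropy(P), manage_track(P))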

11 citations


Journal ArticleDOI
TL;DR: An exploration and mapping algorithm is proposed that extends Ekman's exploration algorithm by removing rigid constraints on the range sensor and robot localization, and includes the line extraction algorithm developed by Pfister, which incorporates noise models of the range sensor and the robot's pose uncertainty.
Abstract: We consider the problem of exploration and mapping of unknown indoor environments using a laser range finder. We assume a setup with a resolved localization problem and known uncertainty sensor models. Most exploration algorithms are based on detection of a boundary between explored and unexplored regions. They are, however, not efficient in practice due to uncertainties in measurement, localization and map building. An exploration and mapping algorithm is proposed that extends Ekman's exploration algorithm by removing rigid constraints on the range sensor and robot localization. The proposed algorithm includes the line extraction algorithm developed by Pfister, which incorporates noise models of the range sensor and the robot's pose uncertainty. A line representation of the range data is used to create a polygon that represents the explored region from each measurement pose. The polygon edges that do not correspond to real environmental features are candidates for a new measurement pose. A general polygon clipping algorithm is used to obtain the total explored region as the union of polygons from different measurement poses. The proposed algorithm is tested and compared to Ekman's algorithm by simulations and experimentally on a Pioneer 3DX mobile robot equipped with a SICK LMS-200 laser range finder.
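The bookkeeping of the explored region can be pictured with the hedged sketch below, which maintains the union of per-pose visibility polygons using the shapely library (an assumption; the paper uses a general polygon clipping algorithm), with the candidate-frontier logic left as a comment.

# Hedged sketch: the explored region as a running union of per-pose polygons.
from shapely.geometry import Polygon

explored = Polygon()                                   # empty explored region
for pose_polygon in [
    Polygon([(0, 0), (4, 0), (4, 3), (0, 3)]),         # visibility polygon, pose 1
    Polygon([(3, 2), (7, 2), (7, 6), (3, 6)]),         # visibility polygon, pose 2
]:
    explored = explored.union(pose_polygon)            # total explored region
    # edges of pose_polygon that do not match extracted environment lines would
    # be candidates for the next measurement pose (frontier selection).

print(explored.area)    # grows as new measurement poses are added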

10 citations


Proceedings Article
23 May 2011
TL;DR: A tracking algorithm based on the CAMSHIFT (Continuously Adaptive Mean Shift) algorithm which operates on the stereo images and the disparity image to increase the tracking quality with an acceptable loss in execution time.
Abstract: Stereo matching is a powerful method of image segmentation for applications such as moving object tracking. The result of a stereo matching algorithm is a disparity image, which shows the difference in object location between the images produced by the left and right cameras. In this paper, we present a tracking algorithm based on the CAMSHIFT (Continuously Adaptive Mean Shift) algorithm. Our algorithm operates on the stereo images and the disparity image. By extending the CAMSHIFT algorithm with scene depth information from the disparity image, we are able to increase the tracking quality with an acceptable loss in execution time. The algorithm is implemented and evaluated in a people tracking system, where the experimental results show that our algorithm outperforms the conventional CAMSHIFT algorithm in tracking accuracy.
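A hedged OpenCV-style sketch of the general idea follows: the hue back-projection used by CAMSHIFT is gated by a disparity window around the tracked person's depth before cv2.CamShift is run; the gating rule and parameters are assumptions for illustration, not the authors' exact formulation.

# Hedged sketch: depth-gated CAMSHIFT step (illustrative only).
import cv2
import numpy as np

def camshift_with_disparity(frame_bgr, disparity, hue_hist, window, d_ref, d_tol=8):
    """One tracking step on a hue back-projection masked by a disparity window."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    backproj = cv2.calcBackProject([hsv], [0], hue_hist, [0, 180], 1)
    # keep only pixels whose disparity is close to the target's current disparity
    depth_mask = (disparity > d_ref - d_tol) & (disparity < d_ref + d_tol)
    backproj = backproj * depth_mask.astype(np.uint8)
    criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
    rot_rect, window = cv2.CamShift(backproj, window, criteria)
    return rot_rect, window       # rotated rect for drawing, window for next frame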

Proceedings Article
23 May 2011
TL;DR: This work proposes to geometrically map the optical flow vectors onto a sphere centered at the center of the central panoramic camera projection, since the space angle was observed to be invariant to errors of the planar-scene assumption.
Abstract: We consider the problem of optical flow analysis in an omnidirectional camera image, in order to segment out dynamic objects from the ego-motion-induced optical flow. The aim is to use the camera on a mobile robot, which makes it reasonable to assume knowledge of the camera movement in the horizontal plane. Estimation of the optical flow is performed in the omnidirectional image. Knowing the ego-motion, it is possible to calculate the resulting optical flow field under the necessary assumption of planarity of the surrounding scene (the mobile robot is in an open, flat space). However, comparing the calculated and estimated flow fields under this assumption, either through the flow vectors directly or through their winding, did not result in satisfactory estimates. Therefore, we propose to geometrically map the optical flow vectors onto a sphere centered at the center of the central panoramic camera projection, since we observed the space angle to be invariant to errors of the planar-scene assumption. In the experiments we show that the proposed approach is more robust than comparing the flows in the image plane. In addition, properties of the observed angle for the inverse problem of determining camera ego-motion are considered.
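The comparison on the sphere can be summarised by the hedged sketch below, where a placeholder lifting function stands in for the calibrated omnidirectional camera model and the measured and ego-motion-predicted flow endpoints are compared through the angle between their unit rays.

# Hedged sketch: compare measured vs. predicted flow through the space angle
# between unit rays on a sphere centred at the single projection centre.
import numpy as np

def lift_to_sphere(u, v, cam_model):
    """Placeholder: map pixel (u, v) to a unit ray using the calibrated model."""
    ray = cam_model(u, v)                 # assumed to return a 3-vector
    return ray / np.linalg.norm(ray)

def space_angle(p_meas_end, p_pred_end, cam_model):
    """Angle between the measured and ego-motion-predicted flow endpoints."""
    r_meas = lift_to_sphere(*p_meas_end, cam_model)
    r_pred = lift_to_sphere(*p_pred_end, cam_model)
    cosang = np.clip(np.dot(r_meas, r_pred), -1.0, 1.0)
    return np.arccos(cosang)              # large angle suggests independent motion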

Proceedings Article
23 May 2011
TL;DR: This paper describes the design concept of an embedded computer system for service mobile robots; the system is made modular, with several different modules used to control different subsystems of the robot.
Abstract: This paper describes the design concept of an embedded computer system for service mobile robots. The system is made modular, so there are several different modules that are used to control different subsystems of the robot. These are a power management module, a sonar management module with twelve ultrasonic sensors, a motor control module, and a battery charging module which is part of a docking station. The design of all modules is based on the PIC16F midrange microcontroller family, and they are connected via an I2C system bus. To connect the modules to the main computer, another module is developed which connects the RS232 port of the main computer to the I2C system bus. The main computer is the master device on this bus and the modules are slave devices. For testing purposes the main computer is an ordinary notebook, and a simple test application is developed so that the whole system can be controlled from it. A simple communication protocol is designed, and its format is also described in this paper.

Proceedings Article
17 Oct 2011
TL;DR: A novel method is proposed which iteratively generates optical flow vectors for different possible real-world coordinates of the objects in the scene, in order to incorporate motion estimates given by the motor encoders for safe navigation and collision avoidance.
Abstract: Mobile robots equipped with an omnidirectional camera have gained considerable attention over the last decade. Having an entire view of the scene can be very advantageous in numerous applications, as all information is stored in a single frame. This paper is primarily concerned with the detection of moving objects from the optical flow field in cluttered indoor environments, which is necessary for safe navigation and collision avoidance. The algorithm is based on the comparison of the measured optical flow vectors with generated ones. As depth information is not available, a novel method is proposed which iteratively generates optical flow vectors for different possible real-world coordinates of the objects in the scene; this is necessary in order to incorporate motion estimates given by the motor encoders. Back-projecting into the image is then used to generate the synthetic optical flow vectors needed for comparison. The algorithm was tested on a real system and was able to successfully localize a moving object under both artificial and natural lighting. The proposed algorithm can be implemented in real time on any system with a known calibrated model of the omnidirectional sensor and reliable motion estimation.

Journal ArticleDOI
TL;DR: The feasibility of integrating legacy software processes and tools into the paradigm of model-based development of industrial real-time embedded systems is investigated by using legacy assembly code in an automatic code generation scheme inside the MATLAB/Simulink environment.
Abstract: This paper investigates the feasibility of integrating legacy software processes and tools into the paradigm of model-based development of industrial real-time embedded systems. Research has been conducted on the example of using legacy assembly code in an automatic code generation scheme inside the MATLAB/Simulink environment. A sample Simulink model has been presented, code has been generated from it, and its correctness has been validated by back-to-back comparison with the simulation results.

Proceedings ArticleDOI
03 Jul 2011
TL;DR: A strategy for deadlock avoidance of mobile robots in narrow passages of environments populated with other moving objects is presented, based on the random multi-access algorithm for network congestion avoidance.
Abstract: This paper presents a strategy for deadlock avoidance of mobile robots in narrow passages of environments populated with other moving objects. The proposed strategy detects deadlocks in narrow passages using only the robot's perceptive sensors, i.e., no other means of communication with the moving objects is assumed. The strategy is based on the random multi-access algorithm for network congestion avoidance. The strategy is implemented within our existing motion planning and control system for mobile robots and thoroughly tested by simulation and experimentally on the Pioneer 3DX mobile robot equipped with a SICK LMS-200 laser range finder. The test results illustrate the appropriateness of the proposed strategy for resolving deadlocks in narrow passages.
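The borrowed networking idea can be pictured with the hedged sketch below: when the passage is sensed as occupied, the robot retreats and waits for a random back-off time drawn from a window that doubles after each failed attempt, which breaks the symmetry between two robots that would otherwise block each other. The function names and parameters are illustrative assumptions, not the paper's.

# Hedged sketch of a random (binary exponential) back-off applied to a narrow passage.
import random

def passage_negotiation(passage_blocked, max_attempts=6, slot_time=2.0):
    """passage_blocked(): sensor-based predicate, True while the passage is occupied."""
    for attempt in range(max_attempts):
        if not passage_blocked():
            return True                        # enter the passage
        window = 2 ** attempt                  # back-off window grows exponentially
        wait = random.uniform(0, window) * slot_time
        print(f"passage occupied, backing off for {wait:.1f} s")
        # here the robot would retreat to a waiting position for `wait` seconds
    return False                               # give up and replan an alternative route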

01 Jan 2011
TL;DR: This paper deduces a general observability condition for all types of SLAM solutions regardless of which states they consider directly in estimation, i.e. whether the SLAM is feature based or pose (graph) based, an approach which avoids extra non-singularity conditions.
Abstract: In this paper, we deduce a general observability condition for all types of SLAM solutions regardless of which states they consider directly in estimation, i.e. whether the SLAM is feature based or pose (graph) based. This result comes from a nonlinear observability analysis of pose-based SLAM. We consider a general vehicle motion model whose control inputs are the translational and rotational velocities of the vehicle's body, and relative vehicle pose measurements that come, for example but not necessarily, from stereo camera image registration. We conclude that, for the SLAM to be observable, one vehicle pose must be known a priori, or must be exactly reconstructable from observed features. This approach avoids extra non-singularity conditions, i.e. coupling of the localization algorithm with a robot control law in the observability analysis. Finally, we demonstrate the theory on the common pose estimation problem solved with the Extended Kalman Filter.
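For reference, a minimal LaTeX restatement of the assumed model class: standard unicycle kinematics with translational velocity v and rotational velocity ω as inputs, and a relative pose measurement written with the usual pose compounding operators. This only sketches the setup the abstract refers to, not the paper's observability derivation.

% Assumed unicycle-type motion model (pose x, y, theta; inputs v, omega)
\dot{x} = v\cos\theta, \qquad \dot{y} = v\sin\theta, \qquad \dot{\theta} = \omega,
% and a relative pose measurement between vehicle poses i and j:
z_{ij} = h(\mathbf{x}_i, \mathbf{x}_j) = \ominus\mathbf{x}_i \oplus \mathbf{x}_j .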