CoSLAM: Collaborative Visual SLAM in Dynamic Environments
Citations
333 citations
Cites background from "CoSLAM: Collaborative Visual SLAM i..."
...Robots may also maintain the position uncertainty of each point in the map for handling dynamic objects [190]....
[...]
298 citations
Cites background from "CoSLAM: Collaborative Visual SLAM i..."
...Tan et al. [152] also use a similar projection principle to detect dynamic features....
[...]
...With the proliferation of mobile and wearable devices, this natural extension of visual SLAM in dynamic environments will benefit many applications, including obstacle avoidance [63], human-robot interaction [51], people following [183], path planning [19], cooperative robotics [46], collaborative mapping [28], driverless cars [102], augmented reality (e....
[...]
...Zou and Tan [28] project features from the previous frame into the current frame and measure the distance from the tracked features....
[...]
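The projection test described in the excerpt above — project features from the previous frame into the current frame and measure the distance to the tracked features — can be sketched roughly as follows. This is a generic illustration, not the authors' implementation; the function name, threshold, and pinhole-model assumptions are all hypothetical.

```python
import numpy as np

def flag_dynamic(K, R, t, points3d, tracked2d, thresh_px=3.0):
    """Flag tracked features that disagree with the static-scene projection.

    K: (3,3) intrinsics; (R, t): current camera pose (world -> camera);
    points3d: (N,3) map points; tracked2d: (N,2) tracked feature positions.
    A large reprojection residual is taken as evidence that the point moved.
    """
    Xc = points3d @ R.T + t           # world -> camera coordinates
    x = Xc @ K.T                      # homogeneous image coordinates
    proj = x[:, :2] / x[:, 2:3]       # perspective divide
    residual = np.linalg.norm(proj - tracked2d, axis=1)
    return residual > thresh_px       # True => likely a dynamic point
```

In practice such a test is only as good as the pose estimate: a pose error shifts all projections, so the threshold must exceed the expected static residual.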
235 citations
Cites methods from "CoSLAM: Collaborative Visual SLAM i..."
...[16] A. Kawewong, N. Tongprasit, S. Tangruamsub, and O. Hasegawa....
[...]
...[43] D. Zou and P. Tan....
[...]
...In the CoSLAM system of Zou and Tan [43], all cameras can move freely in the scene; each camera works independently with intra-camera pose estimation, while both static and dynamic points are used to obtain inter-camera pose estimation for all cameras....
[...]
...[37] A. Taneja, L. Ballan, and M. Pollefeys....
[...]
...Taneja et al. [37] argue that changes in image appearance may not lead to changes in the geometry, and propose a graph based method....
[...]
234 citations
204 citations
Cites background from "CoSLAM: Collaborative Visual SLAM i..."
...This research was supported by the Swiss National Science Foundation through project number 200021-143607 ("Swarm of Flying Cameras") and the National Centre of Competence in Research Robotics. …robots allows the computation of the relative configuration of the agents, which forms a basis for multi-robot path planning and cooperative behaviors....
[...]
References
8,432 citations
Additional excerpts
...Manuscript received 5 Oct. 2011; revised 31 Mar. 2012; accepted 14 Apr. 2012; published online 27 Apr. 2012....
[...]
4,091 citations
"CoSLAM: Collaborative Visual SLAM i..." refers background or methods in this paper
...When measuring the uncertainty in map point positions, we only consider the uncertainty in feature detection and triangulation....
[...]
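The excerpt above restricts map-point uncertainty to feature detection and triangulation. A generic way to compute such an uncertainty is first-order covariance propagation: triangulate, differentiate the triangulated point numerically with respect to the image measurements, and propagate an assumed isotropic pixel noise. The sketch below is not the CoSLAM implementation; the DLT triangulation, numerical Jacobian, and noise model are illustrative assumptions.

```python
import numpy as np

def triangulate(P1, P2, m1, m2):
    # Linear (DLT) triangulation of one point from two views.
    A = np.vstack([
        m1[0] * P1[2] - P1[0],
        m1[1] * P1[2] - P1[1],
        m2[0] * P2[2] - P2[0],
        m2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                        # null vector of A (homogeneous point)
    return X[:3] / X[3]

def point_covariance(P1, P2, m1, m2, sigma_px=1.0, eps=1e-5):
    # First-order propagation of isotropic pixel noise through triangulation:
    # Sigma_X = sigma^2 * J J^T, with J the numerical Jacobian of the
    # triangulated point w.r.t. the four image coordinates.
    m = np.hstack([m1, m2]).astype(float)
    X0 = triangulate(P1, P2, m[:2], m[2:])
    J = np.zeros((3, 4))
    for k in range(4):
        d = np.zeros(4); d[k] = eps
        J[:, k] = (triangulate(P1, P2, (m + d)[:2], (m + d)[2:]) - X0) / eps
    return X0, (sigma_px ** 2) * J @ J.T
```

For a forward-facing stereo pair, the resulting covariance is strongly anisotropic: depth uncertainty grows roughly with the square of the depth, while lateral uncertainty grows only linearly.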
...Experimental results demonstrate that our system can work robustly in highly dynamic environments and produce more accurate results in static environments....
[...]
...If the camera intrinsic parameters are known, the camera pose θ = (R, t) can be computed by minimizing the reprojection error (the distance between the image projections of the 3D map points and their corresponding image feature points), namely θ* = arg min_θ Σᵢ ‖mᵢ − P(Mᵢ; θ)‖, (1), where P(Mᵢ; θ) is the…...
[...]
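The objective in Eq. (1) can be written out as a short numerical sketch: evaluate the distance between each projected map point P(Mᵢ; θ) and its tracked feature mᵢ, and sum. A minimal sketch assuming a pinhole model with known intrinsics K; function names are illustrative, and a real pose solver would minimize this quantity over (R, t), e.g. with Gauss-Newton.

```python
import numpy as np

def project(K, R, t, M):
    # P(M_i; theta): pinhole projection of map points M (N,3) under pose theta = (R, t).
    Xc = M @ R.T + t                  # world -> camera coordinates
    x = Xc @ K.T
    return x[:, :2] / x[:, 2:3]       # perspective divide

def reprojection_error(K, R, t, M, m):
    # Objective of Eq. (1): sum over i of ||m_i - P(M_i; theta)||.
    return np.linalg.norm(project(K, R, t, M) - m, axis=1).sum()
```

By construction the objective is zero at the pose that generated the features and grows as the pose is perturbed, which is what makes it usable as a cost for pose estimation.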
...10, for each camera, its poses at neighboring frames are connected....
[...]
...We will adjust all camera poses from frame 2 to F, as well as the map points generated within these frames; this consists of two successive steps, described in the following section....
[...]
3,772 citations
"CoSLAM: Collaborative Visual SLAM i..." refers background in this paper
...These cameras move independently and can be mounted on different platforms....
[...]
3,760 citations
1,967 citations
"CoSLAM: Collaborative Visual SLAM i..." refers background in this paper
...The difference between the “intracamera pose estimation” and the “intercamera pose estimation” lies in the second term of (3), where the dynamic points are included in the objective function....
[...]
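The distinction quoted above — the inter-camera objective differs from the intra-camera one only in a second term over dynamic points — can be sketched as two cost functions sharing the same reprojection term. The helper names and the weight on the dynamic term are hypothetical, not the paper's code.

```python
import numpy as np

def reproj_cost(K, R, t, pts3d, feats2d):
    # Sum of reprojection distances for one set of point correspondences.
    Xc = pts3d @ R.T + t
    x = Xc @ K.T
    proj = x[:, :2] / x[:, 2:3]
    return np.linalg.norm(proj - feats2d, axis=1).sum()

def intra_camera_cost(K, R, t, static3d, static2d):
    # Intra-camera pose estimation: static map points only.
    return reproj_cost(K, R, t, static3d, static2d)

def inter_camera_cost(K, R, t, static3d, static2d, dyn3d, dyn2d, w=1.0):
    # Inter-camera pose estimation: same static term, plus a second term
    # over dynamic points (observable only jointly across cameras).
    return intra_camera_cost(K, R, t, static3d, static2d) \
        + w * reproj_cost(K, R, t, dyn3d, dyn2d)
```

Including dynamic points couples the cameras: a single camera cannot triangulate a moving point over time, but multiple cameras observing it at the same instant can, which is why the extra term appears only in the inter-camera objective.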