3D Reconstruction of Bird Flight Using a Single Video Camera
Summary
INTRODUCTION
- Conventional motion-tracking approaches attach markers to the animal; these markers can potentially disturb natural movement and behaviour, especially when used on small animals.
- This bioRxiv preprint (doi: https://doi.org/10.1101/340232; this version posted June 6, 2018) has not been certified by peer review; the copyright holder is the author/funder.
- This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera.
Derivation of method
- The authors' method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed.
- Essentially, the approach involves combining knowledge of the bird's wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor.
- Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird's 3D flight trajectory, as described below.
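The scale-factor idea can be sketched with a simple pinhole-camera model. The function and values below are illustrative assumptions, not the paper's notation: by similar triangles, the bird's distance from the camera follows from its known wingspan and its apparent wingspan in the image, and its height above the floor is the camera height minus that distance.

```python
# Sketch of the wingspan scale-factor idea under a pinhole-camera model.
# All names and values are illustrative assumptions, not the paper's notation.

def bird_height(camera_height_m, focal_px, wingspan_m, wingspan_px):
    """Estimate the bird's height above the floor.

    By similar triangles, the bird's distance from the camera is
    distance = focal_px * wingspan_m / wingspan_px; its height above
    the floor is the camera height minus that distance.
    """
    distance_from_camera = focal_px * wingspan_m / wingspan_px
    return camera_height_m - distance_from_camera

# Example: camera 3 m above the floor, focal length 1000 px,
# a 0.30 m wingspan that appears 150 px wide in the image.
h = bird_height(3.0, 1000.0, 0.30, 150.0)
print(round(h, 2))  # 3.0 - 1000*0.30/150 = 3.0 - 2.0 = 1.0
```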
Procedural steps
- Based on the theory described above, the step-by-step procedure for reconstructing the 3D trajectory of the head of a bird from a video sequence captured by a single overhead camera can be described as follows: (i) Construct the floor grid and acquire an image of the grid from the video camera.
- The grid is used only once for the camera calibration, and does not need to be present in the experiments.
- (viii) Obtain the height profile of the head for the entire video sequence by temporally interpolating the heights calculated for the Wex frames.
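Step (viii), the temporal interpolation of head heights between wing-extension frames, can be sketched as follows. The frame indices and heights below are illustrative values, not measured data:

```python
# Sketch of step (viii): heights are known only at wing-extension frames;
# intermediate frames are filled in by linear temporal interpolation.
# Frame numbers and heights below are illustrative, not measured data.

def interpolate_heights(ext_frames, ext_heights, n_frames):
    """Linearly interpolate the head height for every frame of the sequence."""
    heights = []
    for f in range(n_frames):
        # clamp to the first/last known height outside the extension range
        if f <= ext_frames[0]:
            heights.append(ext_heights[0])
        elif f >= ext_frames[-1]:
            heights.append(ext_heights[-1])
        else:
            # find the pair of wing-extension frames that brackets frame f
            for (f0, h0), (f1, h1) in zip(
                zip(ext_frames, ext_heights),
                zip(ext_frames[1:], ext_heights[1:]),
            ):
                if f0 <= f <= f1:
                    t = (f - f0) / (f1 - f0)
                    heights.append(h0 + t * (h1 - h0))
                    break
    return heights

# Wing extensions detected at frames 0, 10, 20 with heights 1.0, 1.2, 1.1 m.
hs = interpolate_heights([0, 10, 20], [1.0, 1.2, 1.1], 21)
print(round(hs[5], 2), round(hs[15], 2))  # midpoints: 1.1 1.15
```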
Test of accuracy
- The precision of the 3D trajectory reconstruction procedure was evaluated by placing a small test target at 44 different, known 3D locations within the tunnel, of which 39 were within the boundary of the grid.
- The test target was a model bird with a calibrated wingspan of 30 cm.
- This assumption does not affect the generality of the results, as discussed above.
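An accuracy evaluation of this kind amounts to comparing the reconstructed positions with the known test-target positions and summarising the discrepancies. A minimal sketch, with invented coordinates rather than the paper's data:

```python
import math

# Sketch of the accuracy evaluation: compare reconstructed 3D positions
# with the known test-target positions. All coordinates below are invented.

known = [(0.0, 0.0, 1.0), (1.0, 0.5, 1.2), (2.0, 1.0, 0.8)]
reconstructed = [(0.02, -0.01, 1.03), (0.97, 0.52, 1.18), (2.05, 0.98, 0.79)]

errors = [
    math.dist(k, r)  # Euclidean distance between true and estimated points
    for k, r in zip(known, reconstructed)
]
mean_err = sum(errors) / len(errors)
rms_err = math.sqrt(sum(e * e for e in errors) / len(errors))
print(f"mean error: {mean_err:.3f} m, RMS error: {rms_err:.3f} m")
```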
Examples of flight tracking and reconstruction
- A downward-facing video camera, placed at the centre of the ceiling of the tunnel, was used to film the flights and reconstruct the trajectories in 3D.
- This is also clear from Figure 5, which shows two 3D views of the same flight trajectory: the blue circles represent the centre of the body at each wing extension, and the red curve shows the reconstructed 3D position of the head in every frame, as described in the text above and in the legend.
- Here it is clear that the wingbeat cycle is interrupted when the bird passes through the aperture: the distance between successive wing extensions is dramatically larger during the passage.
DISCUSSION
- This study has described a simple, inexpensive method for reconstructing the flight trajectories of birds in 3D, using a single video camera.
- When a bird glides with its wings outstretched, its height (and therefore the 3D coordinates of the wingtips and the head) can be reconstructed in every frame without requiring any interpolation.
- The calibration grid on the floor must cover a sufficiently large area to enable projection of the wingtips onto the floor at all possible bird positions.
FIGURE LEGENDS Figure 1
- Schematic view of image of the flight chamber from an overhead video camera, showing the calibration grid on the floor, and the instantaneous position of a bird with its wings extended.
- The origin of the pixel co-ordinates is taken to be the centre of the image, i.e. the point corresponding to the direction of the camera's optic axis.
- The origin of the calibration grid is taken to be the point directly beneath the camera, i.e. the position where the optic axis of the camera intersects the floor.
Figure 2
- Schematic view of experimental chamber, showing the variables used for computing the instantaneous 3D position of the bird and its wingtips.
- E is the point on the floor that is directly beneath the camera, i.e. the point where the camera's optic axis intersects the floor.
Figure 4
- Example of a video sequence showing superimposed images of the bird in successive frames.
- Successive wing extensions are marked by the crosses.
Figure 5
- The red circles show the wingtip positions at the time of each wing extension, the black circles show the inferred position of the centre of the body at these instants, and the blue asterisks depict the position of the head at these instants.
- The red lines show the wing extension trajectories, interpolated between wing extensions.
- The arrow in this and other figures shows the direction of flight.
Figure 6
- The blue circles show the inferred position of the centre of the body at the time of each wing extension, the blue lines show the linearly interpolated body centre positions between successive wing extensions, and the red asterisks show the head position at the time of each wing extension.
- The image coordinates of the head, which were digitized in every video frame, were used to calculate the 3D trajectory of the head in every frame by linearly interpolating the image lengths of the extended wingspans across the frames between successive wing extensions, as described in the text.
Figure 10
- Speed profiles during flight through the narrow aperture (top panel), the wide aperture (middle panel), and the empty tunnel (bottom panel).
- In each case, the black curve shows the speed profile of the head, computed from the frame-to-frame X positions of the head.
- The height of each grey bar depicts the mean forward speed of the body between successive wing extensions, computed as the ratio of the X distance between successive wing extensions to the time interval between them.
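The two speed computations described for Figure 10 can be sketched as follows. The frame rate and the X positions below are illustrative assumptions, not data from the paper: the head-speed profile is a frame-to-frame difference of X positions, and the mean body speed is the X distance between wing extensions divided by the elapsed time.

```python
# Sketch of the Figure 10 speed computations. The frame rate and the
# X positions below are illustrative values, not data from the paper.

FPS = 100.0  # assumed video frame rate (frames per second)

def head_speed_profile(x_positions):
    """Frame-to-frame forward speed of the head (m/s)."""
    return [(x1 - x0) * FPS for x0, x1 in zip(x_positions, x_positions[1:])]

def mean_body_speed(x0, x1, frame0, frame1):
    """Mean forward speed between two successive wing extensions:
    X distance covered divided by the elapsed time."""
    return (x1 - x0) / ((frame1 - frame0) / FPS)

xs = [0.00, 0.02, 0.05, 0.09, 0.14]  # head X position (m) in each frame
print([round(v, 1) for v in head_speed_profile(xs)])  # [2.0, 3.0, 4.0, 5.0]
print(round(mean_body_speed(0.00, 0.14, 0, 4), 2))  # 0.14 m over 0.04 s = 3.5
```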