
3D Reconstruction of Bird Flight Using a Single Video Camera

06 Jun 2018 · bioRxiv (Cold Spring Harbor Laboratory) · https://doi.org/10.1101/340232
TL;DR: This work presents an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of flight trajectories in three dimensions. It combines prior knowledge of the bird's wingspan with a camera calibration procedure that needs to be performed only once in the lifetime of the system.
Abstract: Video cameras are finding increasing use in the study and analysis of bird flight over short ranges. However, reconstruction of flight trajectories in three dimensions typically requires the use of multiple cameras and elaborate calibration procedures. We present an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of such trajectories. The technique combines prior knowledge of the bird's wingspan with a camera calibration procedure that needs to be used only once in the system's lifetime. The system delivers the exact 3D coordinates of the bird at the time of every full wing extension, and uses interpolated height estimates to compute the 3D positions of the bird in the video frames between successive wing extensions. The system is inexpensive, compact and portable, and can be easily deployed in the laboratory as well as the field.

Summary (3 min read)

INTRODUCTION

  • Reflective markers attached to the tracked animal can potentially disturb natural movement and behaviour, especially when used on small animals.
  • This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera.

Derivation of method

  • The authors' method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed.
  • Essentially, the approach involves combining knowledge of the bird's wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor.
  • Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird's 3D flight trajectory, as described below.

Procedural steps

  • Based on the theory described above, the step-by-step procedure for reconstructing the 3D trajectory of the head of a bird from a video sequence captured by a single overhead camera can be described as follows: (i) Construct the floor grid and acquire an image of the grid from the video camera.
  • The grid is used only once for the camera calibration, and does not need to be present in the experiments.
  • (viii) Obtain the height profile of the head for the entire video sequence by temporally interpolating the heights calculated for the Wex frames (see the sketch below).
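To make step (viii) concrete, here is a minimal sketch of the temporal interpolation, assuming the heights at the Wex frames have already been computed. The frame indices, heights, and the choice of simple linear interpolation are illustrative assumptions; the paper's own implementation was written in Matlab.

```python
import numpy as np

# Illustrative assumptions: indices of the frames in which the wings are
# fully extended (the "Wex" frames) and the head heights (m) computed there.
wex_frames = np.array([3, 10, 17, 25])
wex_heights = np.array([1.20, 1.15, 1.12, 1.10])

# Step (viii): linearly interpolate a height for every intervening frame.
all_frames = np.arange(wex_frames[0], wex_frames[-1] + 1)
head_heights = np.interp(all_frames, wex_frames, wex_heights)
```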

Test of accuracy

  • The precision of the 3D trajectory reconstruction procedure was evaluated by placing a small test target at 44 different, known 3D locations within the tunnel, of which 39 were within the boundary of the grid.
  • The test target was a model bird with a calibrated wingspan of 30 cm.
  • This assumption does not affect the generality of the results, as discussed above.

Examples of flight tracking and reconstruction

  • A downward-facing video camera, placed at the centre of the ceiling of the tunnel, was used to film the flights and reconstruct the trajectories in 3D.
  • This is also clear from Figure 5, which shows two 3D views of the same flight trajectory, where the blue circles represent the centre of the body at each wing extension and the red curve shows the reconstructed 3D position of the head for every frame, as described in the text above and in the legend.
  • Here it is clear that the wingbeat cycle is interrupted when the bird passes through the aperture: the distance between successive wing extensions is dramatically larger during the passage.

DISCUSSION

  • This study has described a simple, inexpensive method for reconstructing the flight trajectories of birds in 3D, using a single video camera.
  • When a bird glides with its wings outstretched, its height (and therefore the 3D coordinates of the wingtips and the head) can be reconstructed in every frame without requiring any interpolation.
  • The calibration grid on the floor must cover a sufficiently large area to enable projection of the wingtips on to the floor at all possible bird positions.

FIGURE LEGENDS

Figure 1

  • Schematic view of image of the flight chamber from an overhead video camera, showing the calibration grid on the floor, and the instantaneous position of a bird with its wings extended.
  • The origin of the pixel co-ordinates is taken to be the center of the image, i.e. corresponding to the direction of the camera's optic axis.
  • The origin of the calibration grid is taken to be the point directly beneath the camera, i.e. the position where the optic axis of the camera intersects the floor.

Figure 2

  • Schematic view of experimental chamber, showing the variables used for computing the instantaneous 3D position of the bird and its wingtips.
  • E is the point on the floor that is directly beneath the camera, i.e. the point where the camera's optic axis intersects the floor.

Figure 4

  • Example of a video sequence showing superimposed images of the bird in successive frames.
  • Successive wing extensions are marked by the crosses.

Figure 5

  • The red circles show the wingtip positions at the time of each wing extension, the black circles show the inferred position of the center of the body at these instants, and the blue asterisks depict the position of the head at these instants.
  • The red lines show the wing extension […] trajectories interpolated between wing extensions.
  • The arrow in this and other figures shows the direction of flight.

Figure 6

  • The blue circles show the inferred position of the center of the body at the time of each wing extension, the blue lines show the linearly interpolated body center positions between successive wing extensions, and the red asterisks show the head position at the time of each wing extension.
  • The image coordinates of the head, which were digitized in every video frame, were used to calculate the 3D trajectory of the head in every frame by linearly interpolating the image lengths of the extended wingspans across the frames between successive wing extensions, as described in the text.

Figure 10

  • Forward speed profiles during flight through the narrow aperture (top panel), the wide aperture (middle panel), and the empty tunnel (bottom panel).
  • In each case, the black curve shows the speed profile of the head, computed from the frame-to-frame X positions of the head.
  • The height of each grey bar depicts the mean forward speed of the body between successive wing extensions, computed as the ratio of the X distance between successive wing extensions to the time interval between them (a code sketch follows below).
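As a concrete illustration of these two speed measures, the following sketch computes both the frame-to-frame speed profile and the per-wingbeat mean forward speed. The frame rate, head positions and Wex frame indices are invented for the example.

```python
import numpy as np

fps = 300.0                                 # assumed camera frame rate (Hz)
x_head = np.linspace(0.0, 2.0, 61)          # head X position (m) in 61 frames
wex_frames = np.array([0, 15, 30, 45, 60])  # assumed wing-extension frames

# Black curve: speed profile from the frame-to-frame X positions of the head.
head_speed = np.diff(x_head) * fps          # m/s between consecutive frames

# Grey bars: mean forward speed between successive wing extensions,
# i.e. the X distance between Wex frames divided by the elapsed time.
dx = np.diff(x_head[wex_frames])
dt = np.diff(wex_frames) / fps
mean_speed = dx / dt
```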


3D RECONSTRUCTION OF BIRD FLIGHT USING A SINGLE VIDEO CAMERA

M.V. Srinivasan, H.D. Vo and I. Schiffner

ABSTRACT
Video cameras are finding increasing use in the study and analysis of bird flight over short ranges. However, reconstruction of flight trajectories in three dimensions typically requires the use of multiple cameras and elaborate calibration procedures. We present an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of such trajectories. The technique combines prior knowledge of the bird's wingspan with a camera calibration procedure that needs to be used only once in the system's lifetime. The system delivers the exact 3D coordinates of the bird at the time of every full wing extension, and uses interpolated height estimates to compute the 3D positions of the bird in the video frames between successive wing extensions. The system is inexpensive, compact and portable, and can be easily deployed in the laboratory as well as the field.

INTRODUCTION
The increasing use of high-speed video cameras is offering new opportunities as well as challenges for tracking three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016; Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015; Deetjen et al., 2017).
Stereo-based approaches that use two (or more) cameras are popular; however, they require (a) synchronisation of the cameras; (b) elaborate calibration procedures (e.g. Hedrick, 2008; Hartley and Zisserman, 2003; Theriault et al., 2014; Jackson et al., 2016); (c) collection of large amounts of data, particularly when using high frame rates; and (d) substantial post-processing that entails frame-by-frame tracking of individual features in all of the video sequences, and establishing the correct correspondences between these features across the video sequences (e.g. Cavagna et al., 2008). This is particularly complicated when tracking highly deformable objects, such as flying birds.
Vicon-based stereo trackers simplify the problem of feature tracking by using special reflective markers or photodiodes attached to the tracked animal (e.g. Ros et al., 2017; Goller and Altshuler, 2014; Tobalske et al., 2007; Troje, 2002). However, these markers can potentially disturb natural movement and behaviour, especially when used on small animals.
A novel recent approach uses structured light illumination produced by a laser system in combination with a high-speed video camera to reconstruct the wing kinematics of a freely flying parrotlet at 3200 frames/second (Deetjen et al., 2017). However, this impressive capability comes at the cost of some complexity, and works best if the bird possesses a highly reflective plumage of a single colour (preferably white).

GPS-based tracking methods (e.g. Bouten et al., 2013) are useful for mapping long-range flights of birds, for example, but are not feasible in indoor laboratory settings, where GPS signals are typically unavailable or do not provide sufficiently accurate positioning. Furthermore, they require the animal to carry a GPS receiver, which can affect the flight of a small animal.
A simple technique for reconstructing 3D flight trajectories of insects from a single overhead video camera involves tracking the position of the insect as well as the shadow that it casts on the ground (e.g. Zeil, 1993; Srinivasan et al., 2000). However, this technique requires the presence of the unobscured sun in the sky, or a strong artificial indoor light, which in itself could affect the animal's behaviour. (The latter problem could be overcome, in principle, by using an infrared source of light and an infrared-sensitive camera.)
This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera. The procedure for calibrating the camera is uncomplicated, and is an exercise that needs to be carried out only once in the lifetime of the lens/camera combination, irrespective of where the system is used in subsequent applications.
The system was used in a recent study of bird flight (Vo et al., 2016), but that paper provided only a cursory description of the technique. This paper provides a comprehensive description of the underlying technique and procedure, which will enable it to be used in other laboratories and field studies.

METHODOLOGY
Derivation of method
Our method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed. The camera must have a field of view that is large enough to cover the entire volume of space within which the bird's flight trajectories are to be reconstructed.
Essentially, the approach involves combining knowledge of the bird's wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor. This calibration provides a means of accounting for all of the imaging distortions that are introduced by the wide-angle optics of the camera lens.
A square grid of known mesh dimensions is laid out on the floor. The 2D locations (X,Y) of each of the intersection points are therefore known. Figure 1 illustrates, schematically, a camera view of the grid on the floor, and of a bird in flight above it, as imaged in a video frame in which the wings are fully extended. In general, the image of the grid will not be square, but distorted by the non-linear off-axis imaging produced by the wide-angle lens, as shown in the real image of Figure 3. The intersection points of the grid in the camera image are digitised (manually, or by using specially developed image analysis software), and their pixel locations are recorded. Thus, each grid location (Xi,Yi) on the floor is tagged with its corresponding pixel co-ordinates (pxi,pyi) in the image. These data are used to compute a function that characterises a two-dimensional mapping between the grid locations on the floor and their corresponding pixel co-ordinates in the image.
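To illustrate this calibration step, the sketch below builds such a mapping from digitised correspondences. The paper's implementation uses Matlab's TriScatteredInterp (see below); scipy's LinearNDInterpolator performs the analogous Delaunay-based linear interpolation over scattered points, and the grid and pixel coordinates here are invented for the example.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Illustrative calibration data: floor locations (m) of nine grid
# intersections, and the pixel coordinates at which each intersection
# was digitised in the camera image (invented values, slightly sheared
# to mimic lens distortion).
floor_xy = np.array([[x, y] for y in (0.0, 0.5, 1.0) for x in (0.0, 0.5, 1.0)])
pixel_xy = np.array([[320 + 80 * gx + 2 * gy, 240 + 80 * gy - 2 * gx]
                     for gy in (0, 1, 2) for gx in (0, 1, 2)], dtype=float)

# Two-dimensional mapping from image pixel co-ordinates to floor
# positions, built once from the calibration image.
pixel_to_floor = LinearNDInterpolator(pixel_xy, floor_xy)
```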

Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird's 3D flight trajectory, as described below. Two examples of such footage are provided in the Supplementary videos SV1 and SV2. The positions of the wingtips are digitised in every frame in which the wings are fully extended, i.e. when the distance between the wingtips is equal to the wingspan and attains a maximum in the video image. In the Budgerigar this occurs once during each wingbeat cycle, roughly halfway through the downstroke. We denote the pixel co-ordinates of the wingtips in these frames, which we call the Wex frames, by (pxL,pyL) (left wingtip) and (pxR,pyR) (right wingtip). The projected locations of the two wingtips on the floor are determined by using the mapping function, described above, to carry out an interpolation. Essentially, the projected location of each wingtip on the floor is obtained by computing the position of the point on the floor that has the same location, relative to its four surrounding grid points, as does the position of the wingtip (in image pixel co-ordinates) in relation to the positions of the four surrounding grid locations (in image pixel co-ordinates). Thus, in the case of the left wingtip, for example, this computation effectively uses the locations of the four grid points 1, 2, 3 and 4 (see Figure 1) with locations (X1,Y1), (X2,Y2), (X3,Y3) and (X4,Y4) on the floor, and their corresponding image pixel co-ordinates (px1,py1), (px2,py2), (px3,py3) and (px4,py4) respectively, to interpolate the projected position of the pixel co-ordinate (pxL,pyL) on the floor. A similar procedure is used to project the position of the right wingtip (pxR,pyR) on the floor. The construction of the two-dimensional mapping function and the interpolation are accomplished by using the Matlab function TriScatteredInterp. (Equivalent customized code could be written in any language.)
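Continuing the sketch above, a pair of digitised wingtip pixels from one Wex frame can be projected on to the floor with the same interpolant, and the separation of the projections measured; the coordinate values are again invented for illustration.

```python
import numpy as np  # continues the previous sketch (pixel_to_floor defined there)

# Pixel co-ordinates of the wingtips in one Wex frame (illustrative).
pxL, pyL = 352.0, 275.0   # left wingtip
pxR, pyR = 412.0, 301.0   # right wingtip

# Project both wingtips on to the floor via the calibration mapping.
floor_L = pixel_to_floor(pxL, pyL)
floor_R = pixel_to_floor(pxR, pyR)

# Separation of the projected wingtips on the floor. Because the bird
# flies above the floor, this exceeds the true wingspan; the ratio of
# the two is what yields the bird's height (next sketch).
D = float(np.linalg.norm(floor_R - floor_L))
```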
Once the positions of the two wingtips have been projected on to the floor, this information can be used to determine the instantaneous position of the bird in three dimensions, as illustrated in Figure 2. In Figure 2, the 3D positions of the left and right wingtips are denoted by M, with co-ordinates (xL,yL,z), and N, with co-ordinates (xR,yR,z), respectively. Their […]
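The scraped text breaks off at this point, so the remainder of the derivation is not reproduced here. The geometry of Figure 2, with the camera at a height H above the point E on the floor, suggests a similar-triangles relationship between the known wingspan and the separation of the projected wingtips; the sketch below is a reconstruction under that assumption, not the paper's own equations, and H, W and the floor points are illustrative values.

```python
import numpy as np

H = 2.4    # assumed camera height above the floor (m)
W = 0.30   # bird's known wingspan (m)

# Floor projections of the wingtips (m), relative to E, the grid origin
# directly beneath the camera (illustrative values).
floor_L = np.array([0.30, 0.20])
floor_R = np.array([0.68, 0.24])
D = np.linalg.norm(floor_R - floor_L)

# Similar triangles: a ray from the camera through a wingtip at height z
# spreads out by a factor H / (H - z) on reaching the floor, so
# D = W * H / (H - z), which rearranges to:
z = H * (1.0 - W / D)

# The true horizontal wingtip positions lie on the segments from E to the
# projections, scaled by (H - z) / H; the body centre is their midpoint.
scale = (H - z) / H
body_xyz = np.append(0.5 * (floor_L + floor_R) * scale, z)
```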

Citations
01 Nov 2017
TL;DR: A custom 3D surface reconstruction method is developed, which uses a high-speed camera to identify spatially encoded binary striped patterns that are projected on a flying bird to analyze wing geometry and aerodynamic variables time-resolved.
Abstract: Birds fly effectively and maneuver nimbly by dynamically changing the shape of their wings during each wingbeat. These shape changes have yet to be quantified automatically at high temporal and spatial resolution. Therefore, we developed a custom 3D surface reconstruction method, which uses a high-speed camera to identify spatially encoded binary striped patterns that are projected on a flying bird. This non-invasive structured-light method allows automated 3D reconstruction of each stand-alone frame and can be extended to multiple views. We demonstrate this new technique by automatically reconstructing the dorsal surface of a parrotlet wing at 3200 frames s−1 during flapping flight. From this shape we analyze key parameters such as wing twist and angle of attack distribution. While our binary ‘single-shot’ algorithm is demonstrated by quantifying dynamic shape changes of a flying bird, it is generally applicable to moving animals, plants and deforming objects. Summary: Demonstration of a new high-speed structured-light technique that can automatically record the 3D surface of a bird taking off to analyze wing geometry and aerodynamic variables time-resolved.

20 citations

References
Journal Article · DOI
TL;DR: A workflow and associated software for performing calibration of cameras placed in a field setting and estimating the accuracy of the resulting stereoscopic reconstructions that other researchers may use to calibrate their own cameras.
Abstract: Stereo videography is a powerful technique for quantifying the kinematics and behavior of animals, but it can be challenging to use in an outdoor field setting. We here present a workflow and associated software for performing calibration of cameras placed in a field setting and estimating the accuracy of the resulting stereoscopic reconstructions. We demonstrate the workflow through example stereoscopic reconstructions of bat and bird flight. We provide software tools for planning experiments and processing the resulting calibrations that other researchers may use to calibrate their own cameras. Our field protocol can be deployed in a single afternoon, requiring only short video clips of light, portable calibration objects.

152 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...…however they require (a) synchronisation of the cameras (b) elaborate calibration procedures (e.g. Hedrick, 2008; Hartley and Zisserman, 2003; Theriault et al., 2014; Jackson et al., 2016) (b) collection of large amounts of data, particularly when using high frame rates; and (c) substantial…...

    [...]

Journal Article · DOI
TL;DR: Studies of collective animal behaviour in three dimensions with small groups highlight the difficulty of obtaining high-quality 3D data for bird and mosquito studies, where the number of animals is very low compared to natural conditions.

133 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...…data, particularly when using high frame rates; and (c) substantial post-processing that entails frame-by-frame tracking of individual features in all of the video sequences, and establishing the correct correspondences between these features across the video sequences (e.g. Cavagna et al., 2008)....

    [...]

Journal Article · DOI
TL;DR: Experiments with pigeons suggest that the ability to isolate the visual and vestibular systems is critical to controlled flapping flight: birds wearing collars that prevented the neck from isolating the head from the angular accelerations of induced rolls frequently exhibited a loss of vestibular and/or visual horizon and were unable to maintain controlled flight.
Abstract: While useful in describing the efficiency of maneuvering flight, steady-state (i.e., fixed wing) models of maneuvering performance cannot provide insight to the efficacy of maneuvering, particularly during low-speed flapping flight. Contrasted with airplane-analogous gliding/high speed maneuvering, the aerodynamic and biomechanical mechanisms employed by birds at low flight speeds are violent, with rapidly alternating forces routinely being developed. The saltatory nature of this type of flight results in extreme linear and angular displacements of the bird's body; however, birds isolate their heads from these accelerations with cervical reflexes. Experiments with pigeons suggest this ability to isolate the visual and vestibular systems is critical to controlled flapping flight: birds wearing collars that prohibited the neck from isolating the head from the angular accelerations of induced rolls frequently exhibited (50% of flights) a loss of vestibular and/or visual horizon and were unable to maintain controlled flight.

103 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...During flight, the head is the most stable part of the bird’s anatomy- it maintains a horizontal orientation that is largely independent of the pitch and roll attitude of the body (Warrick et al., 2002; Frost, 2009; Bhagavatula, 2011)....

    [...]

Journal Article · DOI
TL;DR: These sequences show that Drosophila melanogaster do not utilize clap and fling during take-off and are able to modify their wing kinematics from one wingstroke to the next.
Abstract: The fruit fly Drosophila melanogaster is a widely used model organism in studies of genetics, developmental biology and biomechanics. One limitation for exploiting Drosophila as a model system for behavioral neurobiology is that measuring body kinematics during behavior is labor intensive and subjective. In order to quantify flight kinematics during different types of maneuvers, we have developed a visual tracking system that estimates the posture of the fly from multiple calibrated cameras. An accurate geometric fly model is designed using unit quaternions to capture complex body and wing rotations, which are automatically fitted to the images in each time frame. Our approach works across a range of flight behaviors, while also being robust to common environmental clutter. The tracking system is used in this paper to compare wing and body motion during both voluntary and escape take-offs. Using our automated algorithms, we are able to measure stroke amplitude, geometric angle of attack and other parameters important to a mechanistic understanding of flapping flight. When compared with manual tracking methods, the algorithm estimates body position within 4.4±1.3% of the body length, while body orientation is measured within 6.5±1.9 deg. (roll), 3.2±1.3 deg. (pitch) and 3.4±1.6 deg. (yaw) on average across six videos. Similarly, stroke amplitude and deviation are estimated within 3.3 deg. and 2.1 deg., while angle of attack is typically measured within 8.8 deg. comparing against a human digitizer. Using our automated tracker, we analyzed a total of eight voluntary and two escape take-offs. These sequences show that Drosophila melanogaster do not utilize clap and fling during take-off and are able to modify their wing kinematics from one wingstroke to the next. Our approach should enable biomechanists and ethologists to process much larger datasets than possible at present and, therefore, accelerate insight into the mechanisms of free-flight maneuvers of flying insects.

102 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...…as challenges for tracking three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016); Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015;…...

    [...]

Journal Article · DOI
TL;DR: Argus is a free and open source toolset for using consumer grade cameras to acquire 3D kinematic data in field settings and will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts.
Abstract: Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts.

68 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...…(a) synchronisation of the cameras (b) elaborate calibration procedures (e.g. Hedrick, 2008; Hartley and Zisserman, 2003; Theriault et al., 2014; Jackson et al., 2016) (b) collection of large amounts of data, particularly when using high frame rates; and (c) substantial post-processing that…...

    [...]

  • ...…three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016); Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015; Deetjen et al., 2017)....

    [...]