
Posted Content

3D Reconstruction of Bird Flight Using a Single Video Camera

06 Jun 2018-bioRxiv (Cold Spring Harbor Laboratory)-pp 340232

TL;DR: This work presents an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of flight trajectories in three dimensions and combines prior knowledge of the wingspan of the bird with a camera calibration procedure that needs to be used only once in the lifetime of the system.

Abstract: Video cameras are finding increasing use in the study and analysis of bird flight over short ranges. However, reconstruction of flight trajectories in three dimensions typically requires the use of multiple cameras and elaborate calibration procedures. We present an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of such trajectories. The technique combines prior knowledge of the bird's wingspan with a camera calibration procedure that needs to be used only once in the system's lifetime. The system delivers the exact 3D coordinates of the bird at the time of every full wing extension, and uses interpolated height estimates to compute the 3D positions of the bird in the video frames between successive wing extensions. The system is inexpensive, compact and portable, and can be easily deployed in the laboratory as well as the field.

Topics: Video camera (59%), Camera resectioning (58%)

Summary (3 min read)

INTRODUCTION

  • These markers can potentially disturb natural movement and behaviour, especially when used on small animals.
  • This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera.

Derivation of method

  • The authors' method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed.
  • Essentially, the approach involves combining knowledge of the bird's wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor.
  • Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird's 3D flight trajectory, as described below.

Procedural steps

  • Based on the theory described above, the step-by-step procedure for reconstructing the 3D trajectory of the head of a bird from a video sequence captured by a single overhead camera can be described as follows: (i) Construct the floor grid and acquire an image of the grid from the video camera.
  • The grid is used only once for the camera calibration, and does not need to be present in the experiments.
  • (viii) Obtain the height profile of the head for the entire video sequence by temporally interpolating the heights calculated for the Wex frames.
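Step (viii) amounts to a one-line temporal interpolation. The sketch below is in Python rather than the paper's Matlab, and the Wex frame indices and heights are invented for illustration:

```python
import numpy as np

# Hypothetical example: frame indices of the detected full wing
# extensions (Wex frames) and the bird heights (in metres) computed
# at those frames from the wingspan geometry.
wex_frames = np.array([0, 12, 25, 37])
wex_heights = np.array([1.10, 1.05, 0.98, 0.95])

# Step (viii): obtain a height for every frame in the sequence by
# linear temporal interpolation between successive Wex frames.
all_frames = np.arange(wex_frames[-1] + 1)
heights = np.interp(all_frames, wex_frames, wex_heights)
```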

Test of accuracy

  • The precision of the 3D trajectory reconstruction procedure was evaluated by placing a small test target at 44 different, known 3D locations within the tunnel, of which 39 were within the boundary of the grid.
  • The test target was a model bird with a calibrated wingspan of 30 cm.
  • This assumption does not affect the generality of the results, as discussed above.

Examples of flight tracking and reconstruction

  • A downward-facing video camera, placed at the centre of the ceiling of the tunnel, was used to film the flights and reconstruct the trajectories in 3D.
  • This is also clear from Figure 5 , which shows two 3D views of the same flight trajectory, where the blue circles represent the centre of the body at each wing extension and the red curve shows the reconstructed 3D position of the head for every frame, as described in the text above and in the legend.
  • Here it is clear that the wingbeat cycle is interrupted when the bird passes through the aperture: the distance between successive wing extensions is dramatically larger during the passage.

DISCUSSION

  • This study has described a simple, inexpensive method for reconstructing the flight trajectories of birds in 3D, using a single video camera.
  • When a bird glides with its wings outstretched, its height (and therefore the 3D coordinates of the wingtips and the head) can be reconstructed in every frame without requiring any interpolation.
  • The calibration grid on the floor must cover a sufficiently large area to enable projection of the wingtips on to the floor at all possible bird positions.

FIGURE LEGENDS Figure 1

  • Schematic view of image of the flight chamber from an overhead video camera, showing the calibration grid on the floor, and the instantaneous position of a bird with its wings extended.
  • The origin of the pixel co-ordinates is taken to be the center of the image, i.e. corresponding to the direction of the camera's optic axis.
  • The origin of the calibration grid is taken to be the point directly beneath the camera, i.e. the position where the optic axis of the camera intersects the floor.

Figure 2

  • Schematic view of experimental chamber, showing the variables used for computing the instantaneous 3D position of the bird and its wingtips.
  • E is the point on the floor that is directly beneath the camera, i.e. the point where the camera's optic axis intersects the floor.

Figure 4

  • Example of a video sequence showing superimposed images of the bird in successive frames.
  • Successive wing extensions are marked by the crosses.

Figure 5

  • The red circles show the wingtip positions at the time of each wing extension, the black circles show the inferred position of the center of the body at these instants, and the blue asterisks depict the position of the head at these instants.
  • The red lines show the wing extension trajectories interpolated between wing extensions.
  • The arrow in this and other figures shows the direction of flight.

Figure 6

  • The blue circles show the inferred position of the center of the body at the time of each wing extension, the blue lines show the linearly interpolated body center positions between successive wing extensions, and the red asterisks show the head position at the time of each wing extension.
  • The image coordinates of the head, which were digitized in every video frame, were used to calculate the 3D trajectory of the head in every frame by linearly interpolating the image lengths of the extended wingspans across the frames between successive wing extensions, as described in the text.

Figure 10

  • Speed profiles during flight through the narrow aperture (top panel), the wide aperture (middle panel), and the empty tunnel (bottom panel).
  • In each case, the black curve shows the speed profile of the head, computed from the frame-to-frame X positions of the head.
  • The height of each grey bar depicts the mean forward speed of the body between successive wing extensions, computed as the ratio of the X distance between successive edges, to the time interval between these edges.
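The two speed measures in this figure can be sketched as follows (frame rate, head positions and Wex frame indices are invented for illustration):

```python
import numpy as np

fps = 120.0  # hypothetical frame rate
# Hypothetical frame-by-frame X positions (m) of the head.
x = np.array([0.00, 0.02, 0.05, 0.09, 0.14, 0.20, 0.27, 0.35])

# Black curve: frame-to-frame forward speed of the head.
head_speed = np.diff(x) * fps

# Grey bars: mean forward speed between successive wing extensions,
# assumed here to be detected at frames 0, 3 and 7: the X distance
# between successive Wex frames divided by the time between them.
wex = np.array([0, 3, 7])
mean_speed = np.diff(x[wex]) / (np.diff(wex) / fps)
```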


3D RECONSTRUCTION OF BIRD FLIGHT USING A SINGLE VIDEO CAMERA

M.V. Srinivasan, H.D. Vo and I. Schiffner

ABSTRACT

Video cameras are finding increasing use in the study and analysis of bird flight over short ranges. However, reconstruction of flight trajectories in three dimensions typically requires the use of multiple cameras and elaborate calibration procedures. We present an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of such trajectories. The technique combines prior knowledge of the bird’s wingspan with a camera calibration procedure that needs to be used only once in the system’s lifetime. The system delivers the exact 3D coordinates of the bird at the time of every full wing extension, and uses interpolated height estimates to compute the 3D positions of the bird in the video frames between successive wing extensions. The system is inexpensive, compact and portable, and can be easily deployed in the laboratory as well as the field.

This preprint (bioRxiv, doi: https://doi.org/10.1101/340232; this version posted June 6, 2018) has not been certified by peer review. The copyright holder for this preprint is the author/funder. All rights reserved. No reuse allowed without permission.

INTRODUCTION

The increasing use of high-speed video cameras is offering new opportunities as well as challenges for tracking three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016; Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015; Deetjen et al., 2017).

Stereo-based approaches that use two (or more) cameras are popular; however, they require (a) synchronisation of the cameras, (b) elaborate calibration procedures (e.g. Hedrick, 2008; Hartley and Zisserman, 2003; Theriault et al., 2014; Jackson et al., 2016), (c) collection of large amounts of data, particularly when using high frame rates, and (d) substantial post-processing that entails frame-by-frame tracking of individual features in all of the video sequences, and establishing the correct correspondences between these features across the video sequences (e.g. Cavagna et al., 2008). This is particularly complicated when tracking highly deformable objects, such as flying birds.

Vicon-based stereo trackers simplify the problem of feature tracking by using special reflective markers or photodiodes attached to the tracked animal (e.g. Ros et al., 2017; Goller and Altshuler, 2014; Tobalske et al., 2007; Troje, 2002). However, these markers can potentially disturb natural movement and behaviour, especially when used on small animals.

A novel recent approach uses structured-light illumination produced by a laser system in combination with a high-speed video camera to reconstruct the wing kinematics of a freely flying parrotlet at 3200 frames/second (Deetjen et al., 2017). However, this impressive capability comes at the cost of some complexity, and works best if the bird possesses a highly reflective plumage of a single colour (preferably white).

GPS-based tracking methods (e.g. Bouten et al., 2013) are useful for mapping long-range flights of birds, for example, but are not feasible in indoor laboratory settings, where GPS signals are typically unavailable or do not provide sufficiently accurate positioning. Furthermore, they require the animal to carry a GPS receiver, which can affect the flight of a small animal.

A simple technique for reconstructing 3D flight trajectories of insects from a single overhead video camera involves tracking the position of the insect as well as the shadow that it casts on the ground (e.g. Zeil, 1993; Srinivasan et al., 2000). However, this technique requires the presence of the unobscured sun in the sky, or a strong artificial indoor light, which in itself could affect the animal’s behaviour. (The latter problem could be overcome, in principle, by using an infrared source of light and an infrared-sensitive camera.)

This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera. The procedure for calibrating the camera is uncomplicated, and is an exercise that needs to be carried out only once in the lifetime of the lens/camera combination, irrespective of where the system is used in subsequent applications.

The system was used in a recent study of bird flight (Vo et al., 2016), but that paper provided only a cursory description of the technique. This paper provides a comprehensive description of the underlying technique and procedure, which will enable it to be used in other laboratories and field studies.

METHODOLOGY

Derivation of method

Our method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed. The camera must have a field of view that is large enough to cover the entire volume of space within which the bird’s flight trajectories are to be reconstructed.

Essentially, the approach involves combining knowledge of the bird’s wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor. This calibration provides a means of accounting for all of the imaging distortions that are introduced by the wide-angle optics of the camera lens.
A square grid of known mesh dimensions is laid out on the floor. The 2D locations (X,Y) of each of the intersection points are therefore known. Figure 1 illustrates, schematically, a camera view of the grid on the floor, and of a bird in flight above it, as imaged in a video frame in which the wings are fully extended. In general, the image of the grid will not be square, but distorted by the non-linear off-axis imaging produced by the wide-angle lens, as shown in the real image of Figure 3. The intersection points of the grid in the camera image are digitised (manually, or by using specially developed image analysis software), and their pixel locations are recorded. Thus, each grid location (Xi,Yi) on the floor is tagged with its corresponding pixel co-ordinates (pxi,pyi) in the image. These data are used to compute a function that characterises a two-dimensional mapping between the grid locations on the floor and their corresponding pixel co-ordinates in the image.
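This calibration mapping can be sketched in Python using SciPy's LinearNDInterpolator as a rough stand-in for the Matlab routine the paper employs; the grid spacing, pixel coordinates and synthetic lens distortion below are invented for illustration:

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Hypothetical calibration data: floor coordinates (m) of the grid
# intersections, and their digitised pixel coordinates in the image.
# A real calibration would use the full distorted grid imaged by the
# camera; a small 3x3 grid with a mild synthetic distortion stands
# in for it here.
X, Y = np.meshgrid(np.arange(3) * 0.5, np.arange(3) * 0.5)
floor_pts = np.column_stack([X.ravel(), Y.ravel()])
px = 100 + 200 * floor_pts[:, 0] + 5 * floor_pts[:, 1] ** 2
py = 80 + 195 * floor_pts[:, 1] + 4 * floor_pts[:, 0] ** 2
pixel_pts = np.column_stack([px, py])

# Two scattered-data interpolants map pixel coordinates back to
# floor X and Y, playing the role of the 2D mapping function.
to_floor_x = LinearNDInterpolator(pixel_pts, floor_pts[:, 0])
to_floor_y = LinearNDInterpolator(pixel_pts, floor_pts[:, 1])

# Project a digitised wingtip pixel position on to the floor.
wingtip_px = (180.0, 170.0)
wingtip_floor = (float(to_floor_x(wingtip_px)),
                 float(to_floor_y(wingtip_px)))
```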

Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird’s 3D flight trajectory, as described below. Two examples of such footage are provided in the Supplementary videos SV1 and SV2. The positions of the wingtips are digitised in every frame in which the wings are fully extended, i.e. when the distance between the wingtips is equal to the wingspan, and attains a maximum in the video image. In the Budgerigar this occurs once during each wingbeat cycle, roughly halfway through the downstroke. We denote the pixel co-ordinates of the wingtips in these frames, which we call the Wex frames, by (pxL,pyL) (left wingtip) and (pxR,pyR) (right wingtip). The projected locations of the two wingtips on the floor are determined by using the mapping function, described above, to carry out an interpolation. Essentially, the projected location of each wingtip on the floor is obtained by computing the position of the point on the floor that has the same location, relative to its four surrounding grid points, as does the position of the wingtip (in image pixel co-ordinates) in relation to the positions of the four surrounding grid locations (in image pixel co-ordinates). Thus, in the case of the left wingtip, for example, this computation effectively uses the locations of the four grid points 1, 2, 3 and 4 (see Figure 1) with locations (X1,Y1), (X2,Y2), (X3,Y3) and (X4,Y4) on the floor, and their corresponding image pixel co-ordinates (px1,py1), (px2,py2), (px3,py3) and (px4,py4) respectively, to interpolate the projected position of the pixel co-ordinate (pxL,pyL) on the floor. A similar procedure is used to project the position of the right wingtip (pxR,pyR) on the floor. The construction of the two-dimensional mapping function, and the interpolation, are accomplished by using the Matlab function TriScatteredInterp. (Equivalent customized code could be written in any language.)

Once the positions of the two wingtips have been projected on to the floor, this information can be used to determine the instantaneous position of the bird in three dimensions, as illustrated in Figure 2. In Figure 2, the 3D positions of the left and right wingtips are denoted by M, with co-ordinates (xL,yL,z), and N, with co-ordinates (xR,yR,z), respectively. Their
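Although the derivation continues beyond this excerpt, the core geometric idea is a similar-triangles argument: rays from the camera through the wingtips spread apart as they continue down to the floor, so the projected wingtip separation D exceeds the true wingspan W by the factor H/(H − z), where H is the camera's height above the floor and z the bird's height. The sketch below (our notation and our rearrangement, not necessarily the paper's) solves this relation for z:

```python
import math

def bird_height(tip1_floor, tip2_floor, wingspan, camera_height):
    """Height of the bird above the floor, from the floor projections
    of its two fully extended wingtips.

    By similar triangles through the camera's optical centre, the
    projected separation D on the floor relates to the true wingspan
    W by D = W * H / (H - z), hence z = H * (1 - W / D).
    """
    d = math.hypot(tip2_floor[0] - tip1_floor[0],
                   tip2_floor[1] - tip1_floor[1])
    return camera_height * (1.0 - wingspan / d)

# A 0.30 m wingspan seen from a camera 2.0 m above the floor:
# projections 0.40 m apart place the bird 0.5 m above the floor.
z = bird_height((0.0, 0.0), (0.40, 0.0), wingspan=0.30, camera_height=2.0)
```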

Citations

01 Nov 2017
TL;DR: A custom 3D surface reconstruction method is developed, which uses a high-speed camera to identify spatially encoded binary striped patterns that are projected on a flying bird to analyze wing geometry and aerodynamic variables time-resolved.
Abstract: ABSTRACT Birds fly effectively and maneuver nimbly by dynamically changing the shape of their wings during each wingbeat. These shape changes have yet to be quantified automatically at high temporal and spatial resolution. Therefore, we developed a custom 3D surface reconstruction method, which uses a high-speed camera to identify spatially encoded binary striped patterns that are projected on a flying bird. This non-invasive structured-light method allows automated 3D reconstruction of each stand-alone frame and can be extended to multiple views. We demonstrate this new technique by automatically reconstructing the dorsal surface of a parrotlet wing at 3200 frames s−1 during flapping flight. From this shape we analyze key parameters such as wing twist and angle of attack distribution. While our binary ‘single-shot’ algorithm is demonstrated by quantifying dynamic shape changes of a flying bird, it is generally applicable to moving animals, plants and deforming objects. Summary: Demonstration of a new high-speed structured-light technique that can automatically record the 3D surface of a bird taking off to analyze wing geometry and aerodynamic variables time-resolved.

20 citations


References

Book
01 Jan 2000
Abstract: From the Publisher: A basic problem in computer vision is to understand the structure of a real world scene given several images of it. Recent major developments in the theory and practice of scene reconstruction are described in detail in a unified framework. The book covers the geometric principles and how to represent objects algebraically so they can be computed and applied. The authors provide comprehensive background material and explain how to apply the methods and implement the algorithms directly.

15,158 citations


Journal ArticleDOI
Abstract: Researchers studying aspects of locomotion or movement in biological and biomimetic systems commonly use video or stereo video recordings to quantify the behaviour of the system in question, often with an emphasis on measures of position, velocity and acceleration. However, despite the apparent simplicity of video analysis, it can require substantial investment of time and effort, even when performed with adequate software tools. This paper reviews the underlying principles of video and stereo video analysis as well as its automation and is accompanied by fully functional and freely available software implementation.

902 citations


Journal ArticleDOI
TL;DR: A framework is developed that transforms biological motion into a representation allowing for analysis using linear methods from statistics and pattern recognition, and reveals that the dynamic part of the motion contains more information about gender than motion-mediated structural cues.
Abstract: Biological motion contains information about the identity of an agent as well as about his or her actions, intentions, and emotions. The human visual system is highly sensitive to biological motion and capable of extracting socially relevant information from it. Here we investigate the question of how such information is encoded in biological motion patterns and how such information can be retrieved. A framework is developed that transforms biological motion into a representation allowing for analysis using linear methods from statistics and pattern recognition. Using gender classification as an example, simple classifiers are constructed and compared to psychophysical data from human observers. The analysis reveals that the dynamic part of the motion contains more information about gender than motion-mediated structural cues. The proposed framework can be used not only for analysis of biological motion but also to synthesize new motion patterns. A simple motion modeler is presented that can be used to visualize and exaggerate the differences in male and female walking patterns.

768 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...Vicon-based stereo trackers simplify the problem of feature tracking by using special reflective markers or photodiodes attached to the tracked (e.g. Ros et al., 2017; Goller and Altshuler, 2014; Tobalske et al., 2007; Troje, 2002)....


  • ...…three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016); Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015; Deetjen et al., 2017)....



Journal ArticleDOI
TL;DR: It is shown that, during landing, the bee decelerates continuously and in such a way as to keep the projected time to touchdown constant as the surface is approached, which reflects a surprisingly simple and effective strategy for achieving a smooth landing.
Abstract: Freely flying bees were filmed as they landed on a flat, horizontal surface, to investigate the underlying visuomotor control strategies. The results reveal that (1) landing bees approach the surface at a relatively shallow descent angle; (2) they tend to hold the angular velocity of the image of the surface constant as they approach it; and (3) the instantaneous speed of descent is proportional to the instantaneous forward speed. These characteristics reflect a surprisingly simple and effective strategy for achieving a smooth landing, by which the forward and descent speeds are automatically reduced as the surface is approached and are both close to zero at touchdown. No explicit knowledge of flight speed or height above the ground is necessary. A model of the control scheme is developed and its predictions are verified. It is also shown that, during landing, the bee decelerates continuously and in such a way as to keep the projected time to touchdown constant as the surface is approached. The feasibility of this landing strategy is demonstrated by implementation in a robotic gantry equipped with vision.

230 citations


"3D Reconstruction of Bird Flight Us..." refers methods in this paper

  • ...A simple technique for reconstructing 3D flight trajectories of insects from a single overhead video camera involves tracking the position of the insect as well as the shadow that it casts on the ground (e.g. Zeil, 1993; Srinivasan et al., 2000)....



Journal ArticleDOI
TL;DR: An 'adverse-scaling' hypothesis is proposed in which it is proposed that the ability to reduce metabolic and mechanical power output using flap-bounding flight at fast flight speeds is scaled negatively with body mass.
Abstract: To investigate how birds that differ in morphology change their wing and body movements while flying at a range of speeds, we analyzed high-speed (60 Hz) video tapes of black-billed magpies (Pica pica) flying at speeds of 4-14 m s-1 and pigeons (Columba livia) flying at 6-20 m s-1 in a wind-tunnel. Pigeons had higher wing loading and higher-aspect-ratio wings compared with magpies. Both species alternated phases of steady-speed flight with phases of acceleration and deceleration, particularly at intermediate flight speeds. The birds modulated their wingbeat kinematics among these phases and frequently exhibited non-flapping phases while decelerating. Such modulation in kinematics during forward flight is typical of magpies but not of pigeons in the wild. The behavior of the pigeons may have been a response to the reduced power costs for flight in the closed wind-tunnel relative to those for free flight at similar speeds. During steady-speed flight, wingbeat frequency did not change appreciably with increasing flight speed. Body angle relative to the horizontal, the stroke-plane angles of the wingtip and wrist relative to the horizontal and the angle describing tail spread at mid-downstroke all decreased with increasing flight speed, thereby illustrating a shift in the dominant function of wing flapping from weight support at slow speeds to positive thrust at fast speeds. Using wingbeat kinematics to infer lift production, it appeared that magpies used a vortex-ring gait during steady-speed flight at all speeds whereas pigeons used a vortex-ring gait at 6 and 8 m s-1, a transitional vortex-ring gait at 10 m s-1, and a continuous-vortex gait at faster speeds. Both species used a vortex-ring gait for acceleration and a continuous-vortex gait or a non-flapping phase for deceleration during flight at intermediate wind-tunnel speeds. Pigeons progressively flexed their wings during glides as flight speed increased but never performed bounds. 
Wingspan during glides in magpies did not vary with flight speed, but the percentage of bounds among non-flapping intervals increased with speed from 10 to 14 m s-1. The use of non-flapping wing postures seemed to be related to the gaits used during flapping and to the aspect ratio of the wings. We develop an 'adverse-scaling' hypothesis in which it is proposed that the ability to reduce metabolic and mechanical power output using flap-bounding flight at fast flight speeds is scaled negatively with body mass. This represents an alternative to the 'fixed-gear' hypothesis previously suggested by other authors to explain the use of intermittent flight in birds. Future comparative studies in the field would be worthwhile, especially if instantaneous flight speeds and within-wingbeat kinematics were documented; new studies in the laboratory should involve simultaneous recording of wing kinematics and aerodynamic forces on the wing.

221 citations


"3D Reconstruction of Bird Flight Us..." refers background in this paper

  • ...This also appears to be the case in pigeons and magpies (Tobalske and Dial, 1996)....


  • ...This also appears to be the case in pigeons and magpies (Tobalske and Dial, 1996)....
