
3D RECONSTRUCTION OF BIRD FLIGHT USING A SINGLE VIDEO CAMERA

M.V. Srinivasan, H.D. Vo and I. Schiffner
ABSTRACT

Video cameras are finding increasing use in the study and analysis of bird flight over short ranges. However, reconstruction of flight trajectories in three dimensions typically requires the use of multiple cameras and elaborate calibration procedures. We present an alternative approach that uses a single video camera and a simple calibration procedure for the reconstruction of such trajectories. The technique combines prior knowledge of the bird’s wingspan with a camera calibration procedure that needs to be used only once in the system’s lifetime. The system delivers the exact 3D coordinates of the bird at the time of every full wing extension, and uses interpolated height estimates to compute the 3D positions of the bird in the video frames between successive wing extensions. The system is inexpensive, compact and portable, and can be easily deployed in the laboratory as well as the field.
bioRxiv preprint doi: https://doi.org/10.1101/340232; this version posted June 6, 2018. The copyright holder for this preprint (which was not certified by peer review) is the author/funder. All rights reserved. No reuse allowed without permission.

INTRODUCTION

The increasing use of high-speed video cameras is offering new opportunities as well as challenges for tracking three-dimensional motions of humans and animals, and of their body parts (e.g. Shelton et al., 2014; Straw et al., 2011; Fontaine et al., 2009; Dakin et al., 2016; Ros et al., 2017; Troje, 2002; de Margerie et al., 2015; Jackson et al., 2016; Macfarlane et al., 2015; Deetjen et al., 2017).
Stereo-based approaches that use two (or more) cameras are popular; however, they require (a) synchronisation of the cameras, (b) elaborate calibration procedures (e.g. Hedrick, 2008; Hartley and Zisserman, 2003; Theriault et al., 2014; Jackson et al., 2016), (c) collection of large amounts of data, particularly when using high frame rates, and (d) substantial post-processing that entails frame-by-frame tracking of individual features in all of the video sequences, and establishing the correct correspondences between these features across the video sequences (e.g. Cavagna et al., 2008). This is particularly complicated when tracking highly deformable objects, such as flying birds.
Vicon-based stereo trackers simplify the problem of feature tracking by using special reflective markers or photodiodes attached to the tracked animal (e.g. Ros et al., 2017; Goller and Altshuler, 2014; Tobalske et al., 2007; Troje, 2002). However, these markers can potentially disturb natural movement and behaviour, especially when used on small animals.
A novel recent approach uses structured-light illumination produced by a laser system in combination with a high-speed video camera to reconstruct the wing kinematics of a freely flying parrotlet at 3200 frames/second (Deetjen et al., 2017). However, this impressive capability comes at the cost of some complexity, and works best if the bird possesses a highly reflective plumage of a single colour (preferably white).

GPS-based tracking methods (e.g. Bouten et al., 2013) are useful for mapping long-range flights of birds, for example, but are not feasible in indoor laboratory settings, where GPS signals are typically unavailable or do not provide sufficiently accurate positioning. Furthermore, they require the animal to carry a GPS receiver, which can affect the flight of a small animal.
A simple technique for reconstructing 3D flight trajectories of insects from a single overhead video camera involves tracking the position of the insect as well as the shadow that it casts on the ground (e.g. Zeil, 1993; Srinivasan et al., 2000). However, this technique requires the presence of the unobscured sun in the sky, or a strong artificial indoor light, which in itself could affect the animal’s behaviour. (The latter problem could be overcome, in principle, by using an infrared source of light and an infrared-sensitive camera.)
This paper presents a simple, inexpensive, compact, field-deployable technique for reconstructing the flight trajectories of birds in 3D, using a single video camera. The procedure for calibrating the camera is uncomplicated, and is an exercise that needs to be carried out only once in the lifetime of the lens/camera combination, irrespective of where the system is used in subsequent applications.
The system was used in a recent study of bird flight (Vo et al., 2016), but that paper provided only a cursory description of the technique. This paper provides a comprehensive description of the underlying technique and procedure, which will enable it to be used in other laboratories and field studies.

METHODOLOGY

Derivation of method

Our method uses a single, downward-looking camera positioned at the ceiling of the experimental arena in which the birds are filmed. The camera must have a field of view that is large enough to cover the entire volume of space within which the bird’s flight trajectories are to be reconstructed.
Essentially, the approach involves combining knowledge of the bird’s wingspan (which provides a scale factor that determines the absolute distance of the bird from the camera) with a calibration of the camera that uses a grid of known geometry drawn on the floor. This calibration provides a means of accounting for all of the imaging distortions that are introduced by the wide-angle optics of the camera lens.
A square grid of known mesh dimensions is laid out on the floor. The 2D locations (X,Y) of each of the intersection points are therefore known. Figure 1 illustrates, schematically, a camera view of the grid on the floor, and of a bird in flight above it, as imaged in a video frame in which the wings are fully extended. In general, the image of the grid will not be square, but distorted by the non-linear off-axis imaging produced by the wide-angle lens, as shown in the real image of Figure 3. The intersection points of the grid in the camera image are digitised (manually, or by using specially developed image analysis software), and their pixel locations are recorded. Thus, each grid location (Xi,Yi) on the floor is tagged with its corresponding pixel co-ordinates (pxi,pyi) in the image. This data is used to compute a function that characterises a two-dimensional mapping between the grid locations on the floor and their corresponding pixel co-ordinates in the image.
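Such a mapping function could be built with any scattered-data interpolator. The sketch below is a minimal illustration in Python using SciPy's LinearNDInterpolator; the grid size, spacing and sinusoidal "distortion" are invented stand-ins for a real calibration, and SciPy's linear interpolant works over a triangulation of the calibration points rather than the four-point scheme used in the paper.

```python
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# Illustrative calibration data: a 5 x 5 grid of floor intersections
# (in metres) and their digitised pixel locations. The sinusoidal term
# is a toy stand-in for the wide-angle lens distortion; a real
# calibration would use the measured pixel co-ordinates.
grid_xy = np.array([[x, y] for x in range(5) for y in range(5)], float)
pix = 100.0 * grid_xy + 5.0 * np.sin(grid_xy[:, ::-1])

# Two scattered-data interpolants mapping image pixels to floor
# co-ordinates (the role played by TriScatteredInterp in Matlab).
to_X = LinearNDInterpolator(pix, grid_xy[:, 0])
to_Y = LinearNDInterpolator(pix, grid_xy[:, 1])

def pixel_to_floor(px, py):
    """Project an image pixel onto the floor plane, in metres."""
    return float(to_X(px, py)), float(to_Y(px, py))
```

Evaluating the interpolant at a digitised grid pixel returns that grid point's known floor location; pixels in between are interpolated from the surrounding calibration points.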

Video footage of a bird flying in the chamber, as captured by the overhead camera, is then analysed to reconstruct the bird’s 3D flight trajectory, as described below. Two examples of such footage are provided in the Supplementary videos SV1 and SV2. The positions of the wingtips are digitised in every frame in which the wings are fully extended, i.e. when the distance between the wingtips is equal to the wingspan and attains a maximum in the video image. In the Budgerigar this occurs once during each wingbeat cycle, roughly halfway through the downstroke. We denote the pixel co-ordinates of the wingtips in these frames, which we call the Wex frames, by (pxL,pyL) (left wingtip) and (pxR,pyR) (right wingtip). The projected locations of the two wingtips on the floor are determined by using the mapping function, described above, to carry out an interpolation. Essentially, the projected location of a wingtip on the floor is obtained by computing the position of the point on the floor that has the same location, relative to its four surrounding grid points, as does the position of the wingtip (in image pixel co-ordinates) in relation to the positions of the four surrounding grid locations (in image pixel co-ordinates). Thus, in the case of the left wingtip, for example, this computation effectively uses the locations of the four grid points 1, 2, 3 and 4 (see Figure 1), with locations (X1,Y1), (X2,Y2), (X3,Y3) and (X4,Y4) on the floor and corresponding image pixel co-ordinates (px1,py1), (px2,py2), (px3,py3) and (px4,py4) respectively, to interpolate the projected position of the pixel co-ordinate (pxL,pyL) on the floor. A similar procedure is used to project the position of the right wingtip (pxR,pyR) on the floor. The construction of the two-dimensional mapping function and the interpolation are accomplished by using the Matlab function TriScatteredInterp. (Equivalent customised code could be written in any language.)
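Locating the Wex frames can also be automated: since the wingtip separation attains a maximum once per wingbeat cycle, the frames of full extension are the local maxima of the per-frame pixel distance between the digitised wingtips. A small sketch, assuming per-frame wingtip tracks are already available (the synthetic tracks, frame count and wingbeat period below are invented for illustration):

```python
import numpy as np

def wex_frames(left_px, right_px):
    """Return indices of frames in which the pixel distance between the
    wingtips attains a local maximum, i.e. the wings are fully extended
    (one such 'Wex' frame per wingbeat cycle)."""
    left = np.asarray(left_px, dtype=float)
    right = np.asarray(right_px, dtype=float)
    sep = np.linalg.norm(left - right, axis=1)   # wingtip separation per frame
    # Interior frames whose separation exceeds both neighbours.
    interior = (sep[1:-1] > sep[:-2]) & (sep[1:-1] >= sep[2:])
    return np.flatnonzero(interior) + 1

# Synthetic wingbeat: separation oscillates with an 8-frame period,
# peaking once per cycle (mid-downstroke).
t = np.arange(40)
sep = 80.0 + 20.0 * np.sin(2 * np.pi * t / 8)
left = np.stack([-sep / 2, np.zeros_like(sep)], axis=1)
right = np.stack([sep / 2, np.zeros_like(sep)], axis=1)
peaks = wex_frames(left, right)
```

In practice a real separation signal is noisy, so one would typically smooth it or impose a minimum inter-peak spacing before picking maxima.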
Once the positions of the two wingtips have been projected on to the floor, this information can be used to determine the instantaneous position of the bird in three dimensions, as illustrated in Figure 2. In Figure 2, the 3D positions of the left and right wingtips are denoted by M, with co-ordinates (xL,yL,z), and N, with co-ordinates (xR,yR,z), respectively. Their
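The step from the floor projections to the 3D wingtip positions is a similar-triangles calculation. The sketch below is one plausible formulation under a pinhole assumption, with the camera's optical centre at height H above the floor, directly over the point (cx, cy); these symbols, and the numerical values, are illustrative assumptions, and the paper's own derivation accompanies Figure 2.

```python
import numpy as np

def wingtips_3d(Lp, Rp, wingspan, H, cx=0.0, cy=0.0):
    """Recover the 3D wingtip positions M = (xL, yL, z) and N = (xR, yR, z)
    from their floor projections Lp and Rp (metres), the known wingspan,
    and the camera height H above the floor (optical centre over (cx, cy))."""
    Lp, Rp = np.asarray(Lp, float), np.asarray(Rp, float)
    D = np.linalg.norm(Lp - Rp)      # projected wingtip separation on the floor
    z = H * (1.0 - wingspan / D)     # similar triangles: wingspan/D = (H - z)/H
    scale = wingspan / D             # shrink projections back toward the nadir
    c = np.array([cx, cy])
    M = np.append(c + (Lp - c) * scale, z)   # left wingtip (xL, yL, z)
    N = np.append(c + (Rp - c) * scale, z)   # right wingtip (xR, yR, z)
    return M, N

# A bird with a 0.30 m wingspan and a camera 3 m above the floor:
# floor projections 0.60 m apart imply the bird is at half the camera
# height, with true wingtip positions scaled back toward the nadir.
M, N = wingtips_3d((-0.3, 0.4), (0.3, 0.4), wingspan=0.3, H=3.0)
```

The sanity check here is the degenerate case: if the bird flies at floor level the projected separation equals the wingspan and z comes out as zero, while a projected separation of twice the wingspan places the bird halfway up to the camera.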

REFERENCES

Bouten, W., Baaij, E.W., Shamoun-Baranes, J. and Camphuysen, C.J. (2013). A flexible GPS tracking system for studying bird behaviour at multiple scales. Journal of Ornithology 154, 571–580.

Deetjen, M.E., Biewener, A.A. and Lentink, D. (2017). High-speed surface reconstruction of flying birds using structured light. Journal of Experimental Biology 220, 1956–1961.

Hartley, R. and Zisserman, A. (2003). Multiple View Geometry in Computer Vision. Cambridge University Press.

Hedrick, T.L. (2008). Software techniques for two- and three-dimensional kinematic measurements of biological and biomimetic systems. Bioinspiration & Biomimetics 3, 034001.

Srinivasan, M.V., Zhang, S.W., Chahl, J.S., Barth, E. and Venkatesh, S. (2000). How honeybees make grazing landings on flat surfaces. Biological Cybernetics 83, 171–183.

Troje, N.F. (2002). Decomposing biological motion: A framework for analysis and synthesis of human gait patterns. Journal of Vision 2, 371–387.