arXiv:1807.08812v4 [physics.bio-ph] 15 Dec 2018

An autonomous robot for continuous tracking of millimetric-sized walkers

A. Serrano-Muñoz, S. Frayle-Pérez, A. Reyes, Y. Almeida, E. Altshuler, and G. Viera-López a)
1) Group of Complex Systems and Statistical Physics, Physics Faculty, University of Havana, 10400 Havana, Cuba
2) DSIC, Technical University of Valencia, 46022 Valencia, Spain
3) Faculty of Mathematics and Computer Science, University of Havana, 10400 Havana, Cuba

(Dated: 18 December 2018)

The precise and continuous tracking of millimetric-sized walkers –such as ants– is quite important in behavioral studies. However, due to technical limitations, most studies either concentrate on trajectories within areas no more than 100 times bigger than the size of the walker, or obtain longer trajectories at the expense of either accuracy or continuity. Our work describes a scientific instrument designed to push the boundaries of precise and continuous motion analysis of walkers over areas up to 1000 body lengths or more. It consists of a mobile robotic platform that tracks the walker in real time by calculating its position, and afterwards uses Digital Image Processing techniques to track the targets in the stored images with higher spatial precision. During the experiments, all the images are stored, allowing the reconstruction of the path traced by the walkers. Some preliminary results achieved using the proposed tracking system are presented.

a) gvieralopez@fisica.uh.cu

I. INTRODUCTION

Animal tracking has been a continuous challenge for scientists and engineers alike. Several approaches such as Computational Tomography, X-Rays, SONAR, SODAR, RFID, and Computer Vision have been used in this field of research1,2.

Fixed Camera (FC) tracking is the most commonly used approach to record and analyze the motion of arthropods, other species or moving objects5. In FC, one or more image acquisition devices are statically deployed at a given height, covering the surface to be studied6. This surface is, of course, limited by the camera's field of view. The experimental setup is quite simple, and the results are not affected by vibrations or displacements, given that the image acquisition device remains in the same position. However, the major drawback is the limited observation area and the duration of the experiments. Research efforts have been carried out using this technique not only to study the motion of single individuals, but collective motion as well3,4. The region of study could be further increased by raising the height of the cameras, but doing so lowers the image resolution, and the precision of the estimated positions may therefore suffer substantially.

The work of H. Dahmen et al.7 stands out as an example of mobile tracking procedures applied to insect tracking. The instrument consists of an air-cushioned lightweight spherical treadmill that registers the path of an animal walking on top of the sphere. Long trajectories are obtained with a high degree of precision. However, this system may be invasive to the insect, potentially altering its natural behavior.

Following another approach, A. Narendra et al.8–10 used a differential GPS mounted on a support being held by one human operator throughout their experiment. This approach allows one to obtain longer trajectories, but the uncertainty of the data can be as big as 1.0 cm, which easily exceeds the size of the insect body. The researchers must indeed chase the insect while carrying the load of the instrument, a very labor intensive task, especially in hot environments.

In this work we study the motion of single individuals in non-confined areas. We have designed and implemented a system able to track many species of millimetric-sized walkers –such as insects and even crustaceans– moving over a few meters on a flat surface. This opens new possibilities in the field of behavioral ecology.

Scientific instruments built using automated computer control and robots are becoming more widespread11–13. Current methods of tracking walkers have limitations in accuracy or tracking length. The complexity and hard work required to track walkers in areas over 1000 times larger than the size of the individual –"non-confined" areas– have led us to design an instrument that accomplishes this task in a fully automated way. Our instrument exploits the accuracy of the fixed camera approaches, but changes the position of the camera based on the animal's location in order to explore bigger areas.

This paper is organized as follows: in Section II the tracking instrument is fully described. Section III presents the procedure of a typical experiment based on the proposed apparatus. In Section IV the variables being measured are described. Section V includes the evaluation of the uncertainty of the tracking based on artificial walkers, and also the tracking of real individuals belonging to two different species of arthropods.

II. THE TRACKING INSTRUMENT

The instrument is a differential drive mobile robot able to track millimetric-sized walkers for long time intervals without human interaction. We chose this particular kind of robot because it is easy to localize it using the motion of its wheels. In principle, other robots able to keep track of their position and equipped with a camera could be used with our techniques, such as flying robots or other kinds of wheeled mobile robots. With minor changes, other kinds of robots can be used depending on the desired working environment. In Figure 1 a sketch of the instrument and all its components is shown.

An infrared camera is the primary sensor used to detect the position of the walker. It was placed on the robot to capture images of the ground covering an area of 0.1054 m2 at approximately 30 cm in front of the robot. The camera type can be varied, but it is important that the resolution of the camera be high enough to resolve the walker. Using an infrared camera allows the instrument to work at night, which is convenient for the study of nocturnal species. The images are scaled via GPU at 512 × 512 pixels and processed to find the position of the targeted individual in each frame. The decision of whether it is necessary to move the robot in order to keep the target in the field of view of the camera is made based on the position of the walker relative to the camera.

The robot was designed using open source technologies. Motors with 360 steps-per-revolution Hall effect encoders and a custom motor driver based on Arduino were chosen. The core of the system is the single board computer Raspberry Pi model 3. It handles the communications via WiFi or Ethernet, the motion control and localization of the robot, and the camera sampling, as well as the logging and processing of images. However, in most cases an external computer is used to process the images in real time in order to obtain a lower latency in the determination of further movements.

The temporal evolution of the position of the robot is stored along with the images acquired during the experiments. By processing these images it is possible to locate the walker relative to the camera. Using the information of the position of the robot in the map, it is then possible to locate the walker relative to the area.

III. TRACKING PROCEDURE

In our previous work14, several tracking algorithms based on image processing, designed to capture the trajectory of a single insect in a sequence of frames, were presented and discussed, focusing on the ones involving a mobile camera. The instrument proposed here is able to work with all of these algorithms. In this section we explain the procedure of a typical experiment, covering the details of every step of the process.

A. Work-flow

At first, the robotic platform is carried near the target individual, placing the camera right above it. At this point the trail left by the walker begins to be traced.
The robot, and therefore the camera, remain static, hence the Frame Differencing algorithm14 may be applied, although other algorithms can be used. When the target moves, altering its current location, it will be detected and a Region of Interest (ROI) around it will be selected.

As the walker explores the area, it will eventually escape the field of view of the camera. The robot must act right before this happens, rapidly moving in such a way that the target occupies again a spot near the center of the visual field of the camera, as it did at the beginning of the experiment. To be able to determine when to move, it is necessary to track the walker in real time. This Real-Time Tracking does not need to be extremely accurate, but it needs to work as fast as possible.

Once the data from the whole experiment is gathered, another tracking process is performed in order to obtain an accurate estimation of the position of the walker relative to the camera. The trajectory of the individual is reconstructed relative to the area using the images and the position of the robot.

FIG. 1. Sketch of the instrument based on a Differential Drive mobile robot used to track non-flying arthropods. The infrared camera on top, attached at the end of the horizontal arm, allows working in low illumination conditions. The outer diameter of the mobile robot is 28 cm and the bar supporting the camera is 80 cm long.

B. Real-Time Tracking

High accuracy in the real-time tracking is not important, because all of the camera frames captured and the robot positions are stored for further processing in order to obtain a highly accurate trajectory. The goal at this stage is to keep the target walker inside the field of view of the camera at all times. To accomplish this task, each frame has been divided into three regions, labeled as 1, 2 and 3 in Figure 2. The radii of the circumferences enclosing regions 1 and 2 can be easily modified, since not all walkers move at the same speed.

The outermost region is used as a trigger zone, to indicate that the walker is near the edge of the field of view. The robot must then move in such a way that the target is centered. It will keep moving, thereby transitioning the target from the region labeled 3, to the one labeled 2 and finally 1. Once it has been positioned in this inner zone, the robot stops, until a new escaping threat is detected, and the procedure then repeats.
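The region-based trigger logic described above can be sketched as follows. This is an illustrative sketch only: the radii `R1` and `R2`, the frame center, and the hysteresis behavior are assumptions for the example, not the instrument's actual parameters.

```python
import math

# Illustrative radii (in pixels) for the inner and middle regions of a
# 512 x 512 frame; the paper notes these radii are tunable per species.
R1, R2 = 60, 150
CENTER = (256, 256)

def region_of(pos):
    """Classify a walker position into region 1 (inner), 2 (middle) or 3 (trigger)."""
    dist = math.hypot(pos[0] - CENTER[0], pos[1] - CENTER[1])
    if dist <= R1:
        return 1
    if dist <= R2:
        return 2
    return 3

def should_move(pos, moving):
    """Region 3 starts a re-centering move; reaching region 1 stops it."""
    region = region_of(pos)
    if region == 3:
        return True      # trigger zone: start moving toward the target
    if region == 1:
        return False     # target re-centered in the inner zone: stop
    return moving        # region 2: keep the current state (hysteresis)
```

The hysteresis between regions 1 and 3 keeps the robot from oscillating: once triggered, it keeps moving through region 2 until the target is back in the inner zone.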
C. Trajectory Reconstruction

At this point, the purpose is to obtain the actual trajectory of the individual being tracked with high precision, regardless of the processing time. This is better done after gathering the data in the field, in order to avoid excessive CPU usage that may slow down the data acquisition process or allow the target to escape the field of view of the camera.

The tracking process of the individual in this stage is highly accurate, but may require human assistance when the tracked object cannot be auto-detected accurately. All the images gathered are reprocessed to acquire the position of the target arthropod relative to the camera on every frame. Next, we write the position of the arthropod relative to the area, in which the origin is the arthropod's initial position on the ground. The problem of reconstructing the trajectory of the walker on the area can be solved if the trajectory of the camera relative to the area can be obtained.

D. Camera Positioning

As the camera is located at a fixed angular and linear distance relative to the robot, both will move together. So, localizing one of them will automatically provide the position of the other. The localization of the camera can be achieved by several methods15–18. We implemented two different methods to localize the camera that are useful under different circumstances.

1. Robot Odometry

The first localization method aims to estimate the camera's position via Robot Odometry (RO). This procedure integrates the rotation of the wheels in order to compute a relative location of the robot following the equations:

∆S = (r/2) (∆φ_R + ∆φ_L)    (1)

∆Θ = (r/d) (∆φ_R − ∆φ_L)    (2)

where ∆S is the linear displacement of the robot, ∆Θ is the angular displacement of the robot, r is the radius of the wheels, d is the distance between the wheels, and ∆φ_R and ∆φ_L are the angular displacements of the right and left wheels, respectively.

Based on the incremental magnitudes and the initial coordinates of the robot relative to the area, it is possible to estimate its current localization (x, y, Θ) for each time step k in a coordinate system centered in the initial position of the robot. The localization can be obtained using a second order Runge-Kutta integration through the following equations:

X_k = X_{k−1} + ∆S cos(Θ_{k−1} + ∆Θ/2)    (3)

Y_k = Y_{k−1} + ∆S sin(Θ_{k−1} + ∆Θ/2)    (4)

Θ_k = Θ_{k−1} + ∆Θ    (5)

Once the robot is localized, the camera position is known through Equation 6, where w is the camera's distance from the center of the robot:

X_c = X_r + w cos Θ_r
Y_c = Y_r + w sin Θ_r    (6)
Θ_c = Θ_r
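The odometry update of Equations (1)–(6) can be sketched in a few lines. The wheel radius, wheel separation and camera offset below are placeholder values (the 30 cm offset mirrors the camera placement mentioned earlier), not the instrument's calibration:

```python
import math

R_WHEEL = 0.03   # wheel radius r in meters (placeholder value)
D_WHEELS = 0.25  # distance d between the wheels in meters (placeholder value)
W_CAM = 0.30     # camera distance w from the robot center (placeholder value)

def odometry_step(pose, dphi_r, dphi_l):
    """Advance the robot pose (x, y, theta) from one pair of wheel rotations.

    Implements Eqs. (1)-(5): displacement and rotation from the wheel angles,
    then a second order (midpoint) integration of the pose.
    """
    x, y, theta = pose
    ds = 0.5 * R_WHEEL * (dphi_r + dphi_l)           # Eq. (1)
    dtheta = R_WHEEL * (dphi_r - dphi_l) / D_WHEELS  # Eq. (2)
    x += ds * math.cos(theta + 0.5 * dtheta)         # Eq. (3)
    y += ds * math.sin(theta + 0.5 * dtheta)         # Eq. (4)
    return (x, y, theta + dtheta)                    # Eq. (5)

def camera_pose(robot_pose):
    """Offset the robot pose along its heading by the camera arm (Eq. 6)."""
    x, y, theta = robot_pose
    return (x + W_CAM * math.cos(theta), y + W_CAM * math.sin(theta), theta)
```

Equal wheel rotations give a straight displacement (`dtheta = 0`), while opposite rotations turn the robot in place, as expected for a differential drive.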
This method is an effective way of estimating the localization of the camera in environments with uniform floors, where the wheels are not likely to slip. The computed localization is independent of the illumination of the environment. However, like any relative localization method, it carries an accumulative error due to the numerical integration.
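As an illustration of how that error accumulates, the sketch below integrates a straight run with a slightly miscalibrated wheel radius and compares it against the true displacement; the 1% calibration error is an arbitrary assumption for the example:

```python
# Illustration only: a systematic 1% error in the assumed wheel radius makes
# the integrated odometry position drift linearly with the distance traveled.
r_true, r_assumed = 0.0300, 0.0303   # meters; 1% calibration error (assumed)
steps, dphi = 1000, 0.1              # equal rotations on both wheels per step

true_x = est_x = 0.0
for _ in range(steps):
    true_x += r_true * dphi     # actual straight-line displacement
    est_x += r_assumed * dphi   # what the odometry believes

drift = est_x - true_x          # grows with path length: here 3 cm over 3 m
```

Because the error is never corrected by an absolute reference, the drift grows with the length of the path, which motivates the alternative localization method below.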
2. Monocular Visual Odometry
The other approach used was Monocular Visual Odometry (MVO). This method directly estimates the position of the camera based on the detection of features in the captured images19. The procedure extracts features from a first image and tries to find the same features in a second image. Afterwards, a linear transformation matrix (rotation, scale and translation) is calculated for these features. Finally, the position of the camera is estimated from the calculated transformation matrix by a numerical integration of the relative displacements on each frame.

FIG. 2. Classification of frames in regions labeled 1, 2 and 3 based on the possibility of the target to escape from the camera's field of view in the following frames.

The scale value of the calculated matrix can be used to evaluate the quality of the computed transformation, because the scale has to be constant as the camera is not changing its height relative to the ground. In case the scale changes significantly, new features have to be acquired.

This method tracks the position of the camera directly and does not depend on the amount of slippage of the wheels. However, it requires that the floor has enough landmarks to be used as features for the algorithm; it still carries a cumulative error due to the numerical integration process.

Generally, in our experiments we use MVO unless the floor is too homogeneous, because it has fewer sources of error. However, other methods of localization can be tested to increase the accuracy in this stage.

IV. MEASURED VARIABLES

Different parameters can be used to quantify the motion of animals during free exploration; some of them have been used in very diverse fields. These parameters characterize the walker trajectory when interacting with its surroundings. For example, Escherichia coli bacteria can be tracked in a three-dimensional liquid solution medium20,21, and insects such as ants in a bi-dimensional medium22.

Two of the parameters useful to characterize the trajectory of walkers are the diffusivity and turn symmetries23. Both are briefly described next.

A. Diffusivity

Diffusivity relates the average change in position of a particle or set of particles in a given time using their Mean Squared Displacement (MSD) ≡ ⟨r²⟩, where r is the distance moved and the brackets refer to an ensemble average. Normal diffusion (or Brownian motion) follows a ⟨r²⟩ ∼ t law24,25. More generally, we can define the law ⟨r²⟩ ∼ t^γ, where the exponent γ is the parameter used to define the diffusivity. The MSD is calculated as the difference between the present position and the initial position for each instant of time. Following the classification of Viswanathan et al.25, the different regimes of motion can be classified, based on the value of γ, as super-ballistic, ballistic, super-diffusive, normal diffusive and sub-diffusive.

B. Turn Symmetry

Turn Symmetry is used to characterize the angular changes in a trajectory. It finds turning patterns by means of a rotation histogram25. To generate the data for the histograms, it is necessary to compute all the angular positions of the walker relative to a fixed coordinate system. Consecutive angles are subtracted to obtain relative rotations at each sample. Finally, the relative angles are ordered, forming a rotation histogram that shows how likely the walker is to rotate at a given angle.

In order to analyze these variables correctly, it is necessary to obtain, for all time steps, the position of the individual with high precision in a sufficiently long-lasting experiment. Useful trajectories should be, at least, three orders of magnitude larger than the size of the individual.

V. RESULTS AND DISCUSSION

Validating this kind of system is a challenging task. When using individuals of the same species, or even the exact same individual, it is possible to get different results when quantifying the variables of interest, due to fluctuations in the animal behavior. Getting the ground truth of the walker's actual trajectory is not easy. Next, we present different results in the tracking of artificial and real walkers.

A. Tracking of virtual walkers

In order to validate our tracking system, we designed an experiment in which a virtual walker is generated and projected on an LCD screen placed face-up on the ground. We then had the robot track the walker. This configuration allows comparing a precisely generated trajectory of a walker with the tracked trajectory obtained using our system. Even in the limited region of the LCD screen, it is possible to generate long trajectories that make the walker move 1000 times its size, causing the robot to move several times, thus accumulating error in the localization.

The virtual walker was programmed to rotate a random angle on each time step following a Gaussian distribution. If the walker was not at the borders of the screen, it would move forward a constant distance. That simple automaton produces very complex trajectories. Figure 3A shows a sample generated trajectory, around 7 m long, that was used to validate the system. The robot was placed at different initial positions and then tracked the virtual walker 10 times. In Figure 3B it is possible to observe a reconstructed trajectory obtained as a result of one of the tracking processes performed.

There are some qualitative differences in the trajectories generated and estimated. To quantify the accuracy, we use the variables of interest, which are computed and shown in Figures 4 and 5. Figure 4 shows the statistics of turn symmetry computed after tracking 10 trajectories of the generated walker. It is possible to see how consistent the measurements are. Figure 5 shows the diffusivity analysis based on the same 10 trajectories, also demonstrating the good repeatability of the measurements.

The results shown suggest that our instrument is performing well in several trackings of an artificial walker. Our data show that even with accumulative errors due to the relative localization systems, the variables of interest