The Robotanist: A Ground-Based Agricultural Robot for High-Throughput Crop Phenotyping

Tim Mueller-Sim, Merritt Jenkins, Justin Abel, and George Kantor*

Abstract— The established processes for measuring physiological and morphological traits (phenotypes) of crops in outdoor test plots are labor intensive and error-prone. Low-cost, reliable, field-based robotic phenotyping will enable geneticists to more easily map genotypes to phenotypes, which in turn will improve crop yields. In this paper, we present a novel robotic ground-based platform capable of autonomously navigating below the canopy of row crops such as sorghum or corn. The robot is also capable of deploying a manipulator to measure plant stalk strength and gathering phenotypic data with a modular array of non-contact sensors. We present data obtained from deployments to Sorghum bicolor test plots at various sites in South Carolina, USA.

I. INTRODUCTION

Plant phenotyping is a critical step in the process of breeding crops for higher yield, disease resistance, drought tolerance, and other desirable traits. Plant genome researchers must empirically confirm that new cross-breeds exhibit associated phenotypes, such as stalk width, leaf area, leaf angle, and color. Unfortunately, the rate at which these associations are measured and analyzed is slower than the rate of plant genome research.

Fig. 1. The Robotanist in sorghum breeding plots near Clemson, SC.

This deficiency is well-recognized by the scientific community, which has deemed it the Phenotyping Bottleneck [1]. This bottleneck is caused by a variety of factors, including labor-intensive processes, their associated costs, and the necessity of replicated trials. The laborious process of plant phenotyping is currently performed by highly skilled plant scientists and breeders who must assess thousands of plants under field conditions. Unless the rate of plant phenotyping is accelerated, the agricultural promise of plant genomics will not be fully realized.

In this paper, we outline the design and testing of a novel ground robot capable of autonomously navigating within sorghum rows. The robot gathers phenotypic data using a custom manipulator and non-contact sensors such as a custom side-facing stereo camera, and offers significantly higher throughput than manual measurements performed on plant structure beneath the crop canopy. The robot was tested in fields located in Clemson, SC and Florence, SC in July and August of 2016. These tests demonstrate that the platform is capable of navigating fields exhibiting a variety of soil conditions and phenotyping a wide array of sorghum accessions.

The development and deployment of this novel mobile system has yielded the following contributions to the field of agricultural robotics:

  • A platform capable of navigating between row crops and deploying phenotyping sensors for sub-canopy data collection
  • What we believe is the first-ever demonstration of outdoor contact-based automated phenotyping

The rest of this paper is organized as follows: Section 2 describes related work on this topic and the current state of the art. Section 3 provides an overview of the system and validation in the field. Future work and conclusions are presented in Sections 4 and 5.

*This work was supported by the ARPA-E TERRA Program. T. Mueller-Sim, M. Jenkins, J. Abel, and G. Kantor are with The Robotics Institute, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh PA, USA. [email protected]

II. RELATED WORK

A. High-Throughput Phenotyping

Lemnatec has developed an outdoor phenotyping platform, the Scanalyzer Field, which is an overhead gantry with a sensor payload consisting of multi-spectral cameras, fluorescence imaging, and LiDAR [2]. Two of these gantry systems are commercially deployed, each capable of measuring up to 0.6 hectares of crops. While gantry systems will yield very detailed information, they are expensive and constrained to a relatively small plot size.

Researchers associated with the University of Arizona and the United States Department of Agriculture (USDA) developed a tractor-based platform for phenotyping Pima cotton. The system consists of sonar proximity sensors, GPS, infrared radiometers, and NIR cameras to measure canopy height, temperature, and reflectance [3]. However, the data collected by the system is restricted to overhead views, and the tractor's maximum boom height (1.93 m) limits the height at which plants can be phenotyped. Texas A&M University is also developing an overhead phenotyping platform [4]. Both of these systems require a trained operator.

Aerial research platforms are easy to deploy and can collect data over large distances in relatively short periods of time. However, they are limited by sensor resolution, payload, and flight time. Drone-based sensors are also unable to directly measure sub-canopy phenotypes such as stalk strength, stalk count, and leaf angle. Rotorcraft such as the DJI Inspire or Yamaha RMAX have flight times that range from 8–120 minutes, depending on payload, and maximum payloads range from 0.8–8.0 kg [5].

B. Plant Manipulation

Few intelligent agricultural manipulation systems have been deployed in field conditions. A team from the Netherlands describes a recently-developed system for autonomous harvesting of sweet peppers, but operation is constrained to the controlled conditions of a greenhouse [6]. A Belgian team developed an apple-picking robot that retrieves 80% of fruit at an average rate of one apple every 9 seconds, but the system requires a shade covering the entire tree when deployed outdoors [7]. Several commercial systems provide mobile platforms for harvesting, but laborers must still pick the fruit [8]. To the best of our knowledge, the only automated system that manipulates vegetation, rather than fruit, is a grapevine pruning system developed by Vision Robotics Corp [9]. However, this system also requires a shade positioned over the entire plant.

C. Unmanned Ground Vehicles

Ground-based research platforms have been developed for applications such as precision pesticide spraying, soil sampling [10] and weeding [11], for a wide variety of field conditions and crops [12]. Several commercial ground-based robotic platforms were also investigated for their viability. Clearpath Robotics Inc. has developed a family of field-tested all-terrain robots which have been used under a wide range of conditions, from mapping underground mines to navigating in dense forests [13]. Robotnik Automation S.L.L. has developed several robotic platforms that have been used to deploy sensors within an agricultural setting […] by military and police organizations for explosive ordnance disposal and reconnaissance [15], and Omron Adept Technologies, Inc. offers a variety of wheeled platforms used primarily for robotics research [16]. Rowbot Systems has developed a platform [17] that is designed to travel between crop rows autonomously. While a variety of ground-based robotic vehicles are currently available, none meet the specific functional, quality, performance, and modularity requirements of this project.

D. Perception, Localization, and Navigation

Image-based plant phenotyping using commodity sensors is a growing field within the computer vision and bioinformatics communities. Researchers associated with the University of Bonn developed a method of segmenting plant stalks from plant leaves using indoor laser scan data [18]. Another team reconstructed corn plants indoors using a time-of-flight camera, a turntable, and 3D holographic reconstruction techniques [19]. While a significant amount of research focuses on image-based plant segmentation indoors, very few of these methods have been applied in field conditions.

Fig. 2. The Robotanist enters a row of sorghum.

Localization and navigation within agricultural settings has been the focus of significant research. A group from Carnegie Mellon University (CMU) used a monocular camera [20] and data from a LiDAR sensor [21] to navigate between rows of an apple orchard, while a group from the University of Illinois investigated the use of a variable field-of-view camera to navigate within corn [22]. Most previous work was performed in monoculture fields, which do not contain the wide phenotypic variation inherent in sorghum breeding plots. This phenotypic variation, such as stalk height and leaf angle, causes significant visible clutter within rows (see Figure 2). Prior work does not address reliable navigation from early-season through late-season growth stages.

III. SYSTEM OVERVIEW

The state-of-the-art systems outlined above exhibit limitations ranging from payload capacity to geometric constraints to weather rating to cost. For this reason, we have developed our own custom intra-row autonomous mobile sensor platform: the Robotanist.

The Robotanist is a wheeled, skid-steer, electrically powered ground vehicle that is capable of autonomously navigating within sorghum rows. It can travel at speeds up to 2 m/s for more than 8 hours per charge. The robot is equipped with LiDAR, RTK GPS, RGB cameras, inertial measurement units, and the computing power necessary to run perception and localization algorithms in real time. The system houses a custom manipulator capable of taking contact measurements from sorghum stalks, and is capable of deploying a wide range of non-contact phenotyping sensors.

A. Robot Base

1) Requirements: System requirements were driven by the need to reliably traverse a typical breeding plot (1–2 hectares) within a few days in order to avoid significant plant
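The traversal requirement stated above can be sanity-checked with a back-of-envelope calculation. The sketch below is our own illustration, not a calculation from the paper: the 0.76 m row spacing (typical 30-inch sorghum rows) and the 1.0 m/s average speed (half the platform's 2 m/s top speed, allowing for headland turns and stops at plants) are assumed values.

```python
# Hypothetical coverage-time estimate for a rectangular breeding plot.
# Assumptions (not from the paper): 0.76 m row spacing, 1.0 m/s average speed.

def coverage_hours(plot_area_m2, row_spacing_m=0.76, avg_speed_mps=1.0):
    """Hours of driving needed to cover every row of the plot once."""
    total_row_length_m = plot_area_m2 / row_spacing_m  # total in-row path
    return total_row_length_m / avg_speed_mps / 3600.0

for ha in (1, 2):
    print(f"{ha} ha -> {coverage_hours(ha * 10_000):.1f} h of driving")
```

Under these assumptions even a 2-hectare plot requires roughly seven hours of driving, which fits within the platform's stated 8-hour battery endurance, and comfortably within the "few days" budget once multiple charges are allowed.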
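To make the intra-row navigation problem concrete, the sketch below shows one generic way a ground robot can servo along a crop row from side-looking LiDAR returns: fit a line to the stalk hits on one side and steer to hold a fixed lateral standoff. This is our own minimal illustration of the idea, not the navigation method used on the Robotanist; the standoff distance, gains, and synthetic scan are all hypothetical.

```python
# Illustrative row-following from 2D stalk hits (robot frame: x forward, y left).
# Not the paper's algorithm; a generic least-squares line fit + P-control sketch.
import math
import random

def row_heading_and_offset(points_xy):
    """Least-squares fit y = m*x + b to stalk hits.
    Returns (heading_error_rad, lateral_offset_m) of the fitted row line."""
    n = len(points_xy)
    mx = sum(p[0] for p in points_xy) / n
    my = sum(p[1] for p in points_xy) / n
    sxx = sum((p[0] - mx) ** 2 for p in points_xy)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points_xy)
    m = sxy / sxx
    b = my - m * mx
    return math.atan(m), b

# Synthetic scan: stalks 0.5 m to the robot's left, robot yawed 5 degrees
# off the row direction, with 1 cm of measurement noise.
rng = random.Random(0)
true_yaw = math.radians(5.0)
pts = [(x / 10.0, 0.5 + math.tan(true_yaw) * (x / 10.0) + rng.gauss(0.0, 0.01))
       for x in range(5, 41)]  # hits from 0.5 m to 4.0 m ahead

heading_err, offset = row_heading_and_offset(pts)
# Simple proportional steering command toward a 0.5 m standoff (gains arbitrary):
steer = -1.5 * heading_err - 0.8 * (offset - 0.5)
```

A real implementation would additionally have to reject clutter from leaves between the stalks (the visible clutter discussed in Section II-D), typically with a robust fit such as RANSAC rather than plain least squares.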
