Developed by the University of Illinois, the TerraSentia robot, which autonomously monitors crops, earned the best systems paper award at Robotics: Science and Systems, the preeminent robotics conference held in Pittsburgh. Credit: TERRA-MEPP Project

Today's crop breeders are trying to boost yields while also preparing crops to withstand severe weather and changing climates. To succeed, they must locate genes for high-yielding, hardy traits in crop plants' DNA. A robot developed by the University of Illinois to find these proverbial needles in the haystack was recognized with the best systems paper award at Robotics: Science and Systems, the preeminent robotics conference held last week in Pittsburgh.
"There's a real need to accelerate breeding to meet global food demand," said principal investigator Girish Chowdhary, an assistant professor of field robotics in the Department of Agricultural and Biological Engineering and the Coordinated Science Lab at Illinois. "In Africa, the population will more than double by 2050, but today the yields are only a quarter of their potential."
Crop breeders run massive experiments comparing thousands of different cultivars, or varieties, of crops over hundreds of acres and measure key traits, like plant emergence or height, by hand. The task is expensive, time-consuming, inaccurate, and ultimately inadequate: a team can manually measure only a fraction of the plants in a field.
"The lack of automation for measuring plant traits is a bottleneck to progress," said first author Erkan Kayacan, now a postdoctoral researcher at the Massachusetts Institute of Technology. "But it's hard to make robotic systems that can count plants autonomously: the fields are vast, the data can be noisy (unlike benchmark datasets), and the robot has to stay within the tight rows in the challenging under-canopy environment."
Illinois' 13-inch-wide, 24-pound TerraSentia robot is transportable, compact, and autonomous. It captures each plant from top to bottom using a suite of sensors (cameras), algorithms, and deep learning. Using a transfer learning method, the researchers taught TerraSentia to count corn plants with just 300 images, as reported at this conference.
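The conference paper describes that approach only at a high level; as a rough sketch of how such transfer learning is typically done (the network, framework, and training details below are illustrative assumptions, not the authors' published setup), a backbone pretrained on a large generic image dataset can be frozen and only a small counting head trained on a few hundred labelled frames:

```python
# Illustrative sketch only: the published work does not specify this architecture
# or training setup. It shows the general transfer-learning idea of adapting a
# pretrained backbone to a small (~300 image) plant-counting dataset.
import torch
import torch.nn as nn
from torchvision import models

def build_count_model(num_outputs: int = 1) -> nn.Module:
    # Start from an ImageNet-pretrained backbone so only a small head must be learned.
    model = models.resnet18(weights=models.ResNet18_Weights.IMAGENET1K_V1)
    for param in model.parameters():
        param.requires_grad = False                               # freeze pretrained features
    model.fc = nn.Linear(model.fc.in_features, num_outputs)      # new head: plants per frame
    return model

model = build_count_model()
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)     # train only the new head
loss_fn = nn.MSELoss()

def train_step(images: torch.Tensor, counts: torch.Tensor) -> float:
    # images: (B, 3, H, W) frames from the under-canopy camera
    # counts: (B, 1) hand-labelled plant counts for those frames
    optimizer.zero_grad()
    loss = loss_fn(model(images), counts)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the pretrained features are reused rather than learned from scratch, a few hundred labelled images can be enough to fit the small counting head.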
"One challenge is that plants aren't equally spaced, so just assuming that a single plant is in the camera frame is not good enough," said co-author ZhongZhong Zhang, a graduate student in the College of Agricultural Consumer and Environmental Science (ACES). "We developed a method that uses the camera motion to adjust to varying inter-plant spacing, which has led to a fairly robust system for counting plants in different fields, with different and varying spacing, and at different speeds."
More information: Erkan Kayacan et al, High-precision control of tracked field robots in the presence of unknown traction coefficients, Journal of Field Robotics (2018). DOI: 10.1002/rob.21794