Computer vision has its eyes on precision


Cornell University tree fruit physiologist Terence Robinson said precision crop load management is critical to orchard profitability, and he hopes to see developing technology that will eventually help growers do it on a tree-by-tree basis. (TJ Mullinax/Good Fruit Grower)

An optimized crop load delivers optimal profitability.

Even the researchers and extension specialists who have promoted precision crop load management for the past decade agree that it is far easier said than done.

“To do that, we’ve developed manual approaches that are time-consuming, and no one likes to do them,” said Terence Robinson, applied fruit crop physiologist at Cornell University. “Growers love the information but hate to do it for their farm.” 

From that frustration grew an idea: Emerging computer vision technology could replace hand measurements with images and algorithms. Robinson began gathering a team of leading horticulturists and a technology partner to seek grant funding in 2019, eventually landing $5 million from the U.S. Department of Agriculture’s Specialty Crop Research Initiative in late 2020. 

But as that work got underway, more tech startups began turning their attention to the same problems, using tools ranging from complex camera arrays to smartphone apps — all promising to deliver data-driven crop load management. 

“Over the past three years, I’ve learned of at least five other companies trying to do something similar,” Robinson said. “We’re trying to engage and interact with all of them.” 

The research team’s first tech partner, Moog, is a New York-based military contractor that was looking to apply its automation and computer vision expertise to problems closer to home. Moog built an Apple Data Rover that gathers image data in orchards and is working on technology to relay instructions to workers, via headset, on how much to prune or thin.

“If I can map every tree, geolocate that information and store it in the tree, and communicate that to the human worker standing in front of that tree, then every tree is managed to the optimum fruit number,” Robinson said. “Some people are not convinced that growers want that level of detail. … I don’t know if it will ever happen, but it’s my goal in life.”

Robinson hopes to bring the Moog rover to Washington for trials this year, but growers there are gathering similar imagery data with the Cartographer technology developed by Australia-based Green Atlas. It’s now available as a service from two Washington companies. 

Both technologies (which were featured in the pages of Good Fruit Grower last August) use multiple cameras and light arrays mounted on ATVs, making the technology an expensive investment. 

User-friendly products with lower barriers to entry are also starting to appear in the marketplace, said Michigan State University Extension apple production specialist Anna Wallis, a member of the research team. At a Michigan field day last summer, she introduced growers to several technologies she’s been evaluating, including FarmVision (see “A vision in hand”), FruitScout (see “Phoning in precision”) and Vivid (see “Computer eyes”). 

Robinson, Wallis and other researchers planned a meeting with “all the players in the field” this month to discuss how to incorporate each distinct technological approach into the larger research project. Robinson wants to learn how they handle the inherent error of occlusion — the fruit or flowers blocked from camera view — and ground-truth their results.

The computer vision industry remains in the early stages of development, Wallis said, but startup companies are already capturing a “tremendous amount” of helpful information about crop load management.

Even as research continues, though, Robinson recommended that growers test out these technologies if they have the opportunity. 

“Some of the newer approaches seem very promising to me, and I think it’s worth a try,” he said. “I like the (FruitScout) approach, I like the Vivid approach. If folks in Washington can try Green Atlas, I think they should jump at the opportunity.” 

Meanwhile, the research team aims to identify the optimum crop load, both from a physiological and economic perspective, Robinson said. In Washington, Michigan, New York and North Carolina, physiologists are looking at the fundamental relationships between climate and crop load.

Researchers have always believed that Washington, with its high light environment, can support larger crops than Eastern growing regions, but they have never studied why. Data collected last year supports that theory, with larger crops and fruit size found in Washington, but more data is needed to make regional recommendations on optimum crop load, he said.

The horticulturists have a few ag tech ideas of their own when it comes to making the fruitlet growth model more accessible, Robinson said. Tom Kon, a collaborator from North Carolina State University Extension, is researching whether fruitlets that are still developing show a different spectral signature than those that are destined to fall off, and how a multispectral camera could be used to distinguish them with no sizing needed. 

MSU’s Todd Einhorn demonstrated last year that harvesting a large enough sample of fruitlets and photographing them, so that computer vision algorithms can size them, can also provide data for the fruitlet growth model. And this spring, Hectre, a New Zealand-based orchard management platform that already developed a fruit-sizing tool for bins, is releasing a sizing tool for a sample of harvested fruitlets. 
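The fruitlet growth model mentioned above works by comparing each fruitlet’s diameter growth rate over a few days to that of the fastest-growing fruitlets in the sample; fruitlets growing well below that benchmark are predicted to drop. A minimal sketch of that comparison follows — the 50 percent threshold, the top-quartile benchmark and the sample diameters are illustrative assumptions, not measurements or code from the research described here:

```python
# Hypothetical sketch of a fruitlet growth-rate comparison.
# Inputs are diameters (mm) for the same fruitlets at two dates a few days apart.

def predict_persisting(diam_start, diam_end, threshold=0.5):
    """Flag fruitlets likely to persist: growth rate at or above
    `threshold` times the mean rate of the fastest-growing cohort."""
    rates = [b - a for a, b in zip(diam_start, diam_end)]
    # Use the top ~25% of growth rates as the "fast-growing" benchmark.
    fastest = sorted(rates, reverse=True)[: max(1, len(rates) // 4)]
    benchmark = sum(fastest) / len(fastest)
    return [r >= threshold * benchmark for r in rates]

# Illustrative diameters, not field data: three fruitlets grow quickly,
# three barely grow and are flagged as likely to abscise.
day0 = [6.1, 6.3, 5.9, 6.0, 6.2, 5.8]
day4 = [8.0, 8.2, 6.2, 7.9, 6.4, 6.1]
print(predict_persisting(day0, day4))  # → [True, True, False, True, False, False]
```

Whether the diameters come from calipers or, as in Einhorn’s approach, from computer vision sizing of photographed samples, the model’s logic is the same: only the measurement step changes.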

by Kate Prengaman


