Rod Farrow gives a demonstration of a prototype vision system developed for assessing crop load in apple orchards. The system will specifically focus on counting buds so that growers can use the data to improve precision crop load management. Tech developer Moog is collaborating on a Cornell University-led precision crop load management project that includes leading apple researchers from around the country. (Amanda Morrison/for Good Fruit Grower)

When can emerging technology offer the biggest benefit to a grower’s bottom line? A team of leading tree fruit horticulturists is betting that it begins even before bloom — and with a new $4.8 million federal grant for precision crop load management, the researchers aim to prove it.

“This is an attempt to tackle the amount of value a crop generates for a particular acre, rather than reduce input costs,” said Terence Robinson, Cornell University horticulture professor and leader of the new project. 

The best available precision crop load management strategies today are tedious, he said, and it’s hard to convince growers that it’s worth their time to count buds before pruning or use the fruitlet growth model and carbohydrate model to bring more precision to chemical thinning. Enter vision-based technology that could gather that data for every tree in the block. 

Vision-based technology underpins robotics, but it’s advancing more quickly than the hardware side, said Karen Lewis, director of Washington State University Extension’s Agriculture and Natural Resources Program. Putting that information into the hands (or ears, or eyes) of skilled farmworkers could go a long way toward increasing the precision of crop load management, she said.

“With computer vision, we think we can develop data on every single tree, store it in the cloud using GPS maps and communicate it to the worker to guide his actions on each tree,” Robinson said, perhaps using something such as smart glasses or via audio instructions.
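
As a rough sketch of the kind of per-tree record such a system might keep (the field names, coordinates and message format below are invented for illustration, not drawn from the project), the idea is a GPS-keyed entry that pairs what the vision system counted with a short instruction relayed to the worker:

```python
# Hypothetical sketch of a GPS-keyed, per-tree record; field names and the
# message format are invented for illustration, not taken from the project.
from dataclasses import dataclass

@dataclass
class TreeRecord:
    tree_id: str
    lat: float
    lon: float
    bud_count: int         # what the vision system counted on this tree
    target_bud_count: int  # the crop load target assigned to this tree

    def worker_message(self) -> str:
        # The kind of short instruction that could be read aloud or shown
        # on smart glasses as the worker reaches this tree.
        excess = self.bud_count - self.target_bud_count
        if excess <= 0:
            return f"Tree {self.tree_id}: leave as is."
        return f"Tree {self.tree_id}: remove about {excess} buds."

record = TreeRecord("R12-T045", 43.33, -78.19, 140, 95)
print(record.worker_message())  # Tree R12-T045: remove about 45 buds.
```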

Such technology may be closer than growers realize. Two teams of engineers participating in the project say their vision systems can already count developing buds and track crop development on each tree.

The project includes a who’s who of apple researchers from Cornell, WSU, Penn State University, Michigan State University, North Carolina State University, University of Massachusetts, Virginia Tech and the Washington Tree Fruit Research Commission, along with engineering company Moog Inc. They will also tackle fundamental physiology research to understand the optimum crop load for key varieties in different regions and guide the application of the technology. Extension team members have gathered an industry advisory panel to consult on the project, and economists will help to quantify the value of careful crop load management.

“What I am most excited about is this is bud to bin,” Lewis said. The increasing cost of labor is a top concern for apple growers, and precision crop load management is one of the best ways to ensure a positive return on investment for those payroll dollars. “This really is the topic of our time,” she said.

Technology

Robinson’s vision for the project got a boost from an unlikely partnership with a Buffalo, New York-based engineering firm better known for military defense contracts. 

“If I can count apples on a tree or buds on a tree, I can count soldiers on the battlefield: It’s all the same technology,” said Chris Layer, principal engineer of the technology and advanced pursuits division at Moog. 

Prior to the grant funding, Layer was looking for opportunities in agriculture where his team could gain experience in autonomous vehicle navigation, vision technology and robust, high-precision GPS systems they expect to eventually use in military robotics. A conversation with Cornell extension specialist Mario Miranda Sazo and Western New York grower Rod Farrow convinced him that bud counting for apple growers was a problem worth solving. 

Last summer, Moog tested a prototype in Farrow’s super spindle orchards. Driving at 4 or 5 mph, the system uses bright lights to normalize the variation in sunlight as it photographs the canopy. Counting buds gets easier as they swell, Layer said, but the prototype works fairly well during the month from green tip to full bloom. Eventually, they envision imaging buds, blooms and fruitlets.
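
As a loose illustration of the counting idea (this is not Moog’s pipeline; the color thresholds and size limits below are hypothetical placeholders), a canopy photo taken under bright, controlled lighting could be reduced to a count of bud-like blobs with off-the-shelf image processing:

```python
# A loose, hypothetical sketch (not Moog's system): count bud-like blobs in a
# canopy photo taken under bright, controlled lighting. All thresholds are
# placeholder guesses.
import cv2
import numpy as np

def count_bud_candidates(image_path: str) -> int:
    img = cv2.imread(image_path)
    if img is None:
        raise FileNotFoundError(image_path)
    # Work in HSV so small green-tip buds can be isolated by hue and brightness.
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, (30, 60, 120), (90, 255, 255))  # hypothetical range
    # Remove speckle noise before counting connected blobs.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only blobs in a plausible bud-size range (pixel areas are guesses).
    return sum(1 for c in contours if 20 < cv2.contourArea(c) < 500)
```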

Another team of engineers collaborating on the project has developed drone-based vision systems that can precisely map bloom progression. 

“It’s a challenge because buds are so small and for a ground vehicle, it’s hard to zoom to bud level, so we are using drones,” said Penn State agricultural engineer Daeun Dana Choi. 

While Choi focuses on vision systems, her colleagues Paul Heinemann and Long He will work toward the long-term goal of developing robotic systems that could eventually do the crop load management work. Long He also leads a U.S. Department of Agriculture-funded project looking at robotics for green fruit thinning, while Choi has a National Science Foundation grant to map bloom stages to eventually guide smart frost protection.

When it comes to the crop load collaboration, Choi said the real challenge is processing the enormous amount of data the imaging generates through machine learning and then translating it into actionable insights.

“It (provides) the number of fruits on the trees, but machine learning doesn’t know what to do with it, whether it’s too much or too little,” she said. 

Even with today’s computing power, processing all that visual information takes time. Robots would need to do it in real time, but workers could make pruning or thinning decisions using information gathered on drone flights flown the day before and processed overnight, Heinemann said. He likened the technology’s evolution to the development of self-driving cars.

“What do we have now? We have lane assist and adaptive cruise control and other driver-assist safety features,” he said. “We’re taking those steps and putting those pieces together to use anything we can prior to full automation.” 

Physiology

Technology is only half of the project. 

To implement the technology successfully, growers will need to know what the optimum crop load is and, ideally, how to select the best developing buds.

“If the engineers can come up with vision systems and the appropriate technology to take that information and reduce crop load, we would love to bud thin,” said Todd Einhorn, horticulture professor at MSU. “If we could precisely narrow our flower load down by choosing the buds we want, how would we choose them?”

Growers already know to favor setting fruit at king bloom, which results in larger fruit than side bloom, Einhorn said. He wants to understand whether that advantage in fruit growth rate goes all the way back to the bud, and if so, how early the difference can be detected.

To understand how optimum crop load varies by region, Robinson and Einhorn, along with WSU’s Stefano Musacchi and NCSU’s Tom Kon, will set up trials looking at Gala and Honeycrisp in each region, along with regionally key varieties such as WA 38 in Washington. 

“For each climate, there is a potential number of fruit the tree should have to be sized for the market to maximize crop value,” Robinson said. “We want to know what’s the climate potential for each variety, and then the economists will help us focus on what’s the maximum crop value.”
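
As a simple, hypothetical illustration of that kind of target, a per-tree fruit number can be worked back from a yield goal, tree density and the fruit size the market wants. The figures below are examples only, not recommendations from the project:

```python
# Hypothetical back-of-the-envelope calculation: how many fruit one tree
# should carry to hit a yield goal at a given fruit size. Numbers are
# examples only, not project recommendations.
def target_fruit_per_tree(bins_per_acre: float, lbs_per_bin: float,
                          trees_per_acre: int, avg_fruit_weight_lb: float) -> float:
    lbs_per_tree = (bins_per_acre * lbs_per_bin) / trees_per_acre
    return lbs_per_tree / avg_fruit_weight_lb

# Example: 60 bins/acre at 875 lb per bin, 1,300 trees/acre and 0.5 lb fruit
# (roughly an 80-count apple) works out to about 81 fruit per tree.
print(round(target_fruit_per_tree(60, 875, 1300, 0.5)))
```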

Another physiology question raised by the potential technology relates to the fruitlet growth model growers can use to predict which fruitlets will drop — so they can more precisely apply thinners. The model works by repeatedly measuring the growth of the same flagged fruitlets to get growth rate over time. But a vision system would look at the size distribution of far more fruitlets, Einhorn said. 

“Small fruit can be small and have a high growth rate,” he said, indicating that it’s not likely to drop. Physiologists will need to figure out how to adjust the approach to use the data that vision systems can provide. “We need to figure out how (vision data) can help growers get their reapplications on at the optimum time,” he said.
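
For readers unfamiliar with the model, the sketch below shows the basic growth-rate comparison it relies on. The 50 percent cutoff is a commonly cited rule of thumb and is used here only as an example, not as the project’s method:

```python
# Sketch of the growth-rate comparison described above: fruitlets whose growth
# between two measurements falls below a fraction of the fastest-growing
# fruitlets are flagged as likely to drop. The 0.5 threshold is a commonly
# cited rule of thumb, used here only as an example.
def flag_likely_drops(diam_day1: list[float], diam_day2: list[float],
                      threshold: float = 0.5) -> list[bool]:
    growth = [d2 - d1 for d1, d2 in zip(diam_day1, diam_day2)]
    # Reference rate: average of the fastest-growing fruitlets (top 3 here).
    fastest = sorted(growth, reverse=True)[:3]
    reference = sum(fastest) / len(fastest)
    return [g < threshold * reference for g in growth]

# Example: fruitlet diameters in millimeters, measured a few days apart.
day1 = [6.1, 6.4, 5.9, 6.0, 6.2]
day2 = [8.3, 8.8, 6.4, 7.9, 6.6]
print(flag_likely_drops(day1, day2))  # [False, False, True, False, True]
```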

Combining these fundamental physiology research questions with technology to make the information actionable makes this a “pioneering project,” according to Musacchi, a WSU tree fruit physiologist.

“Right now, you go to the field at bloom and everything looks homogenous, but it’s not,” he said. “From the physiological point of view, thinning is the most uncertain activity you carry out in the orchard.” 

There’s a common saying in the industry that growers “spray and pray” because so many factors influence thinning outcomes, Musacchi said. This project will provide a better understanding of how thinning works. “If you bring more science, you can mitigate the risk,” he said. •

by Kate Prengaman