Conventional orchard scouting methods may not be rapid enough to be practical for growers, especially when conditions affecting the crop are too complex to be identified visually or through field sampling or laboratory analysis.
The emergence of small and mid-sized unmanned aircraft systems (UAS, or drones) has raised hopes in the farming community that “everything” seems possible with this technology. At least, that is the impression the drone industry gives growers at trade shows and expos.
So where do we really stand when it comes to using UAS imagery for crop monitoring? What sensor and data analytic tools do growers really have at their disposal for real or near-real time crop monitoring?
As of today, UAS are a reasonably established technology, with variants of consumer- and industry-grade platforms available on the market. Depending on the budget, one can procure a sophisticated UAS platform that can perform several types of within-line-of-sight autonomous missions (region-of-interest or ROI, point-of-interest, follow-me, etc.) with flight endurance of about 25 minutes or longer.
An entire orchard can be mapped repeatedly throughout the season by flying waypoint-guided (GPS-based) missions. Flight missions can now be divided into smaller ROIs, or a large ROI mission can be resumed from the last waypoint after swapping the UAS batteries.
Regulations have also eased to the extent that one can fly a UAS over most agricultural areas, as long as teams comply with FAA Part 107 (http://bit.ly/2Dhyne5) or, in certain conditions, hold job-specific certificates of authorization (COAs).
When it comes to integrating sensor payloads with UAS for meaningful agricultural use, we are still limited by sensor weight (payload), form factor, image capture rate for adequate overlaps (more so for fixed-wing UAS) and field ruggedness. An added challenge is calibrating optical sensors to account for changing illumination (e.g., varied cloud cover over mapped fields). In general, both active (using their own illumination source) and passive optical imaging sensors can be integrated with UAS.
Passive multi- and hyperspectral optical sensors are being promoted by the industry for mapping agricultural fields and orchards. A wide range of commercial multispectral sensors (3 to 10 bands) that measure reflectance at specific bands (up to 1,000 nanometers) is available for use with UAS. The cost of such sensors, ranging between $3,000 and $6,000, is primarily governed by the number of spectral bands and the spectral range of each band.
One of the major limitations of such multispectral sensors is that changes in ambient light conditions can influence the response.
Several ways exist to partly compensate for ambient light changes, including the use of: (1) a reference calibration panel for data normalization; (2) commercial incident light (or downwelling) sensors integrated with the UAS to account for variation in sunlight intensity; and (3) spectral ratios or vegetation indices (VIs), rather than absolute reflectance values, during data interpretation.
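The first option, panel-based normalization, can be sketched as a one-point linear calibration. The sketch below is illustrative only; the digital numbers (DN), panel reflectance and the `panel_normalize` helper are hypothetical, and commercial sensor vendors supply their own calibration workflows:

```python
import numpy as np

def panel_normalize(band_dn, panel_dn_mean, panel_reflectance=0.5):
    """Convert raw digital numbers (DN) to approximate reflectance using
    a single reference panel of known reflectance (hypothetical helper).

    band_dn: 2-D array of raw DN values for one spectral band
    panel_dn_mean: mean DN measured over the panel in the same scene
    panel_reflectance: the panel's known reflectance in this band
    """
    # One-point empirical calibration: reflectance scales linearly with DN
    return band_dn * (panel_reflectance / panel_dn_mean)

# Hypothetical values: a red-band image and a 50%-reflectance gray panel
red_dn = np.array([[1200.0, 1800.0], [2400.0, 3000.0]])
red_reflectance = panel_normalize(red_dn, panel_dn_mean=3000.0)
```

A pixel whose DN equals the panel's mean DN maps to the panel's known reflectance; darker pixels scale down proportionally.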
In terms of data use, certain vegetation indices can differentiate crop vigor changes at spatiotemporal scales and can be loosely related to nutrient deficiencies and water stress.
One can identify problematic areas with such imagery maps but cannot pinpoint the specific reasons for crop stress unless additional field management or ground-truth data is available. Multispectral imagery thus needs to be used as one of many layers of information during diagnostic analysis.
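The vegetation-index idea can be illustrated with the most common index, the normalized difference vegetation index (NDVI), computed per pixel from near-infrared (NIR) and red reflectance. The reflectance values below are hypothetical, chosen only to contrast a vigorous canopy pixel with a bare-soil pixel:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red).
    Values near 1 suggest dense, vigorous canopy; values near 0 suggest
    soil or severely stressed vegetation."""
    nir = np.asarray(nir, dtype=float)
    red = np.asarray(red, dtype=float)
    # eps guards against division by zero on dark pixels
    return (nir - red) / (nir + red + eps)

# Hypothetical reflectance values: a canopy pixel and a bare-soil pixel
nir = np.array([0.60, 0.30])
red = np.array([0.08, 0.25])
vi = ndvi(nir, red)  # high value for canopy, low for soil
```

Because NDVI is a ratio, it partly self-compensates for overall illumination changes, which is why indices are preferred over absolute reflectance, as noted above.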
Note that multispectral imaging sensors will not identify the presence of specific pests in an orchard; they are generally engineered to measure only crop reflectance variations. If pest infestation damage results in crop vigor variation, it might be picked up by UAS multispectral imagery.
The orchard imaging window for quality data acquisition remains short: roughly two hours on either side of solar noon. Several factors can limit the quality of UAS imagery: hilly terrain, tree shadows on one side of the row, netting and other crop or ground covers, and wind gusts at certain times of the day.
Method advancements are needed to compensate for cloud cover and sunlight variation scenarios during data analytics.
Hyperspectral imaging, with hundreds of spectral bands and narrower spectral resolution (typically less than 30 nanometers), allows more detailed soil and crop analysis in the range of 380 to 2,500 nanometers.
Existing hyperspectral imagers used in proximal and aerial sensing are of the line-scan (push-broom) type, in which either the sensor or the object must move to capture an image. Such sensors are somewhat heavy to fit on small UAS (under 55 pounds).
Moreover, hyperspectral data outputs (also called “hypercubes”) need significant storage and computational power. In recent years, snapshot-type hyperspectral imagers, which work like digital cameras and generate an instantaneous image of the objects within the field of view (FOV), have been designed for UAS-based remote sensing applications.
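A back-of-the-envelope calculation shows why hypercubes strain storage. The sensor specifications below are hypothetical but representative of the scale involved:

```python
# Storage for one hyperspectral "hypercube" (hypothetical sensor specs):
# rows x cols x bands x bytes-per-sample
rows, cols = 1000, 1000      # spatial resolution of a single capture
bands = 200                  # number of spectral bands
bytes_per_sample = 2         # 16-bit radiometric depth

cube_bytes = rows * cols * bands * bytes_per_sample
cube_mb = cube_bytes / 1e6
print(f"{cube_mb:.0f} MB per cube")  # prints "400 MB per cube"
```

At that size, a mission with a few hundred captures quickly reaches tens of gigabytes, before any photogrammetric processing.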
In general, hyperspectral sensors can detect early signs of crop stress more reliably than general multispectral sensors. However, their form factor, payload and cost need to become commercially viable for broader adoption by the agricultural community.
A word on thermal infrared and RGB imaging
Thermal imaging sensors capture an object’s long-wave infrared radiation using thermal detectors. In terms of data output, radiometry-enabled sensors are needed to acquire per-pixel temperature within the FOV.
The design of such sensors is being simplified and their cost is decreasing rapidly, and options exist for integrating the technology with small UAS.
The sensors, however, still need improvement to acquire high-resolution, noise-free data. Temperature changes are by nature highly dynamic, quickly affected by wind or cloud cover, which can lead to unreliable data. The camera’s own temperature can also affect data accuracy.
Thermal data is harder to normalize and needs pertinent weather data at the orchard scale for reliable normalization and interpretation.
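As a sketch of how radiometric thermal data might be turned into a stress indicator, the snippet below converts raw counts to degrees Celsius and applies a simplified crop water stress index (CWSI). The centikelvin encoding, the pixel counts and the wet/dry reference temperatures are all assumptions for illustration; encodings and reference methods vary by sensor vendor and should be checked against the sensor manual and local weather data:

```python
import numpy as np

def centikelvin_to_celsius(counts):
    """Convert radiometric counts to deg C, assuming the sensor encodes
    pixel temperature in centikelvin (hundredths of a kelvin).
    This encoding is an assumption; check your sensor's documentation."""
    return counts / 100.0 - 273.15

def cwsi(t_canopy, t_wet, t_dry):
    """Simplified Crop Water Stress Index: 0 = well-watered canopy,
    1 = fully stressed. t_wet and t_dry are reference temperatures,
    e.g. from wet/dry reference surfaces or weather-based estimates."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Hypothetical radiometric counts over a small canopy patch
counts = np.array([[30240, 30310], [30275, 30400]])
t_c = centikelvin_to_celsius(counts)
stress = cwsi(t_c, t_wet=26.0, t_dry=36.0)
```

The index is only as good as its references, which is why orchard-scale weather data, as noted above, is needed for reliable interpretation.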
RGB imaging has evolved rapidly, and lightweight, miniaturized, high-resolution imagers are available commercially. Such sensors, also common in high-end smartphones, use complementary metal-oxide semiconductor (CMOS) readout structures to efficiently transfer charge from the detector array to the output data stream.
General crop scouting, inventory mapping and two-dimensional photogrammetry-based orchard canopy characterization are thus becoming viable with UAS-integrated, low-cost RGB sensing.
A range of commercial data analytics solutions is available, from offline, post-flight data processing using in-house technical expertise to cloud-based data upload for storage and online or offline external processing. Some software can be purchased for a one-time cost plus annual upgrades, or through an annual subscription.
Besides commercial software, ImageJ (from NIH.gov) and QGIS (QGIS.org) are open-source data analytics packages available for small UAS-based data processing. Concerns do exist about online cloud-based services due to issues of data ownership, privacy and unintentional or without-consent sharing of data by such providers.
As the agriculture industry explores this domain, sensors for specific agricultural applications will require major validation efforts under various scenarios using ground-reference methods.
Variation in sensor response during the crop season due to climatic and non-climatic conditions needs to be accounted for before such data is used for production management decision making.
Industry also needs to develop and offer robust data analytics solutions for processing sensor data, so that actionable intelligence is available to growers for real-time crop management decisions.
UAS imagery can be used effectively as one of many layers of remote sensing data for crop production decision making. Spatiotemporal data from high- and low-orbit satellites is also becoming more readily available each day.
Such “big data” can be used for high-level decision making with drones serving as a supplementary, site-specific diagnostic tool. •
Lav Khot, Ph.D., is an assistant professor of biological systems engineering at Washington State University.