
Multi-scale remote sensing data driven apple crop water use mapping


Written by Abhilash K. Chandel, Lav R. Khot, Nina Kilham, Lee Kalcsits, Steve Mantle, R. Troy Peters, Claudio O. Stöckle, June 2022


High throughput and timely mapping of crop water use or evapotranspiration (ET) at the tree level could be critical for site-specific irrigation management (Chandel et al., 2021). Conventionally, ET is estimated using generalized crop coefficients and weather data (Allen et al., 1998), soil water budgets, soil moisture, canopy stomatal conductance, stem water potential, and eddy covariance flux measurements. However, most of these are point sampling approaches and are too limited in scope to capture in-orchard spatial variability (Chandel et al., 2021). Satellite (e.g., Landsat 7/8) based remote sensing with energy balance models provides geospatial ET maps (Allen et al., 2007) but offers limited spatiotemporal resolution. High-resolution aerial imaging systems (e.g., drones and manned aircraft) have emerged as an alternative, offering on-demand, tree-level crop characterization. This article shares aspects of crop water use mapping for a modern apple orchard using multi-scale remote sensing (spatial resolution range: 7–3000 cm/pixel [2.8–1180 in/pixel]) combined with localized weather data as inputs to a modified energy balance model.

Energy balance modeling

We used a satellite imagery driven energy balance model, Mapping ET at high Resolution with Internalized Calibration (METRIC, Allen et al., 2007), to process the satellite imagery. This model was also modified to process multispectral and thermal infrared aerial imagery from drone and manned aircraft platforms. The model performs internal energy balance calibration using extremely stressed (hot, i.e., bare soil) and non-stressed (cold, i.e., well-irrigated vegetation) pixels (Chandel et al., 2020; 2021). This process compensates for biases in surface temperature and atmospheric uncertainties. The model inputs are: 1) surface reflectance, surface temperature, and digital elevation model maps, 2) flight metadata (flight date and time, sun azimuth and elevation angles), and 3) localized weather data (solar radiation, air temperature, relative humidity, wind speed, and precipitation). Using these inputs, the model computes net radiation, soil heat flux, and sensible heat flux, and obtains latent heat flux as the residual, which can then be converted to daily ET.
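The residual computation above can be sketched as follows. This is a simplified illustration of the METRIC-style energy balance closure, not the model's full implementation; the flux values and the assumed reference ET numbers are made-up examples for a single pixel.

```python
# Simplified sketch of a METRIC-style surface energy balance, where
# latent heat flux (LE) is the residual of the other three terms.
# All input values below are illustrative, not from the study.

def latent_heat_flux(rn, g, h):
    """Latent heat flux LE (W/m^2) as the energy balance residual:
    LE = Rn - G - H."""
    return rn - g - h

def instantaneous_et(le, lam=2.45e6):
    """Convert LE (W/m^2) to instantaneous ET (mm/hr).
    lam: latent heat of vaporization (~2.45 MJ/kg); water density
    ~1000 kg/m^3, so kg/m^2 is equivalent to mm of water depth."""
    return le / lam * 3600.0  # kg/m^2/s -> mm/hr

# Example pixel: Rn = 600, G = 80, H = 220 W/m^2 at overpass time
le = latent_heat_flux(600.0, 80.0, 220.0)  # 300 W/m^2
et_inst = instantaneous_et(le)             # ~0.44 mm/hr

# METRIC scales to daily ET via the reference ET fraction (ETrF),
# assumed constant over the day. Reference values are placeholders.
etr_inst, etr_daily = 0.9, 9.0             # mm/hr, mm/day (assumed)
etrf = et_inst / etr_inst
et_daily = etrf * etr_daily                # mm/day
```

The key design point is that sensible heat flux is the calibrated term (via the hot and cold pixels), so errors in absolute surface temperature largely cancel out of the residual.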

Orchard data collection and processing

Multispectral and thermal imagery data from three platforms, i.e., drone, manned aircraft, and Landsat-8 satellite, were used to estimate ET for the Smart Orchard site-2 at Grandview, WA. The drone was mounted with a five-band multispectral imaging sensor (RedEdge3, Micasense Inc., Seattle, WA) with blue, green, red, red-edge, and near-infrared wavebands and a radiometric thermal imaging sensor (Flir DUO Pro R, Flir Systems, Wilsonville, OR). A calibrated reflectance panel (CRP, Micasense Inc., Seattle, WA) was imaged before and after each mission. The CRP imagery and light irradiance sensor (mounted on top of the drone) data were used to correct the multispectral imagery for any changes in natural light conditions during flights. The drone imaging missions were conducted at an altitude of 330 ft above ground level (AGL) to realize spatial resolutions of 7 cm/pixel (2.8 in/pixel) for multispectral and 13 cm/pixel (5.1 in/pixel) for thermal imagery. The drone flight missions were conducted on 157, 98, 62, 26, and 13 days before harvest (DBH) and 11 and 54 days after harvest (DAH). The collected imagery for the respective campaigns was stitched in a photogrammetry and image stitching software platform (Pix4D mapper, Lausanne, Switzerland) to obtain seamless maps of the orchard block. Our industry collaborator, CERES Imaging (Oakland, CA), also conducted the manned aircraft-based imaging missions on 98, 82, 72, 62, 56, 41, and 26 DBH. The manned aircraft had a customized multispectral imaging sensor with red, green, red-edge, and near-infrared wavebands and a thermal imaging sensor. These flight missions were conducted at an altitude of 1000 m (3280 ft) AGL to capture multispectral imagery at 20 cm/pixel (7.9 in/pixel) and thermal imagery at 27 cm/pixel (10.7 in/pixel). The collected imagery was stitched and radiometrically calibrated to obtain the reflectance maps of the orchard block.
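The relationship between flight altitude and spatial resolution noted above follows from simple ground sample distance (GSD) geometry. The sketch below assumes nominal sensor parameters (pixel pitch and focal length) typical of a RedEdge-class multispectral camera; the study's exact sensor specifications are not quoted here.

```python
# Hedged sketch: ground sample distance (GSD) from flight altitude.
# The pixel pitch and focal length are assumed nominal values for a
# RedEdge-class camera, used only to show how altitude maps to cm/pixel.

def gsd_cm(altitude_m, pixel_pitch_um, focal_length_mm):
    """Ground sample distance (cm/pixel) by similar triangles:
    GSD = altitude * pixel_pitch / focal_length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3) * 100.0

# 330 ft AGL is ~100.6 m; assumed 3.75 um pitch, 5.4 mm focal length
multispectral_gsd = gsd_cm(100.6, 3.75, 5.4)  # ~7 cm/pixel
```

Doubling the altitude doubles the GSD, which is why the manned aircraft at 1000 m AGL yields coarser (20–27 cm/pixel) imagery than the drone at ~100 m.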
The Landsat-8 based thermal and multispectral imagery (~1180 in/pixel) was acquired for 13 cloud-free days in the growing season. The satellite overpass for the target orchard site occurs around 11:50 AM local time. Drone campaigns were conducted in the same timeframe to align with the satellite overpass and to avoid imaging canopies wetted by the overhead sprinklers that are operated to minimize apple sunburn risk. This is critical, as wet canopies would not reflect the true characteristics of the canopy. Auxiliary ground truth data of stem water potential (microtensiometers, FloraPulse, Davis, CA) were also acquired for two sample trees to validate the derived crop water use maps.

Result highlights

Water use maps derived from drone-based imagery had the highest correlation with water stress measurements on the ground (Figure 1, r: 0.9), followed by those from satellite-based imagery (r: 0.8) and manned aircraft-based imagery (r: 0.6). The correlation was strongest for the drone-based imagery most probably because imaging was conducted before actuation of the overhead evaporative cooling sprinkler system, thereby capturing the water stress variability in the orchard. The manned aircraft imaging often coincided with or immediately followed an evaporative cooling event and hence had lower correlation. Thus, if growers subscribe to a manned aircraft-based imaging service, they should request flyovers before noon to avoid imaging wet canopies.
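The platform comparison above boils down to a Pearson correlation between ground-truth stem water potential and image-derived daily ET. A minimal sketch, using made-up placeholder values rather than the study's data:

```python
# Minimal sketch of scoring a platform: Pearson correlation between
# stem water potential readings and image-derived daily ET.
# The two arrays below are placeholders, not the study's measurements.
import statistics

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / (sxx * syy) ** 0.5

swp = [-0.6, -0.8, -1.0, -1.2, -1.4]  # stem water potential (MPa), placeholder
et = [5.8, 5.1, 4.6, 3.9, 3.3]        # daily ET (mm/day), placeholder
r = pearson_r(swp, et)
```

In practice the ET value for each sample tree would be extracted from the map pixels covering that tree's canopy before correlating against its microtensiometer reading.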

Figure 1. Correlation between water stress measurements (stem water potential) and water use (daily ET) derived from (a) drone, (b) manned aircraft (MA), and (c) satellite (SAT) based imagery.


Overall, drone-based imagery enables up to leaf-scale water use mapping, while manned aircraft imagery enables tree-scale mapping and satellite imagery enables orchard-scale mapping (Figure 2).

Figure 2. Evapotranspiration or water use maps of the apple orchard on 98 and 62 DBH derived from drone (a, d), manned aircraft (b, e), and satellite (c, f) based imagery.

Acknowledgments: This study was funded in part by USDA NIFA (WNP0745, WNP0893 projects) and Washington Tree Fruit Research Commission. The authors would like to thank the grower cooperator (Washington Fruit) and Dr. Ines Hanrahan from Washington Tree Fruit Research Commission, Ms. Bernardita Sallato, Dr. Victor Blanco, Dr. Anura Rathnayake, Mr. Jake Schrader, and Mr. Gajanan Kothawade from Washington State University for their assistance in the data collection.


Allen, R.G., Pereira, L., Raes, D., Smith, M., 1998. Crop evapotranspiration—Guidelines for computing crop water requirements. Food and Agriculture Organization of the United Nations Irrigation and Drainage Paper 56.

Allen, R.G., Tasumi, M., Trezza, R., 2007. Satellite-based energy balance for mapping evapotranspiration with internalized calibration (METRIC)-Model. J. Irrig. Drain. Eng., 133, 380–394.

Chandel, A.K., Molaei, B., Khot, L.R., Peters, R.T., Stöckle, C.O., 2020. Small UAS-based multispectral and thermal infrared imagery driven energy balance model for high-resolution evapotranspiration estimation of irrigated field crops. Drones, 4(3), 52–69.

Chandel, A.K., Khot, L.R., Molaei, B., Peters, T.R., Stöckle, C.O., Jacoby, P.W., 2021. High spatiotemporal water use mapping of a surface and direct-root-zone irrigated vineyard using UAS based thermal and multispectral remote sensing. Remote Sens., 13(5), 954.

