Retrieval of Crop Canopy Chlorophyll: Machine Learning vs. Radiative Transfer Model
Published: 2024-06-07
Issue: 12
Volume: 16
Page: 2058
ISSN: 2072-4292
Container-title: Remote Sensing
Language: en
Short-container-title: Remote Sensing
Author:
Alam Mir Md Tasnim 1, Simic Milas Anita 1, Gašparović Mateo 2, Osei Henry Poku 1
Affiliation:
1. School of Earth, Environment and Society, Bowling Green State University, Bowling Green, OH 43403, USA
2. Faculty of Geodesy, University of Zagreb, 10000 Zagreb, Croatia
Abstract
In recent years, the utilization of machine learning algorithms and advancements in unmanned aerial vehicle (UAV) technology have caused significant shifts in remote sensing practices. In particular, the integration of machine learning with physical models and their application in UAV–satellite data fusion have emerged as two prominent approaches for the estimation of vegetation biochemistry. This study evaluates the performance of five machine learning regression algorithms (MLRAs) for the mapping of crop canopy chlorophyll at the Kellogg Biological Station (KBS) in Michigan, USA, across three scenarios: (1) application to Landsat 7, RapidEye, and PlanetScope satellite images; (2) application to UAV–satellite data fusion; and (3) integration with the PROSAIL radiative transfer model (hybrid methods, PROSAIL + MLRAs). The results indicate that the majority of the five MLRAs utilized in UAV–satellite data fusion perform better than the five PROSAIL + MLRAs. The general trend suggests that the integration of satellite data with UAV-derived information, including the normalized difference red-edge index (NDRE), canopy height model, and leaf area index (LAI), significantly enhances the performance of MLRAs. The UAV–RapidEye dataset exhibits the highest coefficient of determination (R²) and the lowest root mean square error (RMSE) when employing kernel ridge regression (KRR) and Gaussian process regression (GPR) (R² = 0.89 and 0.89, and RMSE = 8.99 µg/cm² and 9.65 µg/cm², respectively). Similar performance is observed for the UAV–Landsat and UAV–PlanetScope datasets (R² = 0.86 and 0.87 for KRR, respectively). For the hybrid models, the maximum performance is attained with the Landsat data using KRR and GPR (R² = 0.77 and 0.51, and RMSE = 33.10 µg/cm² and 42.91 µg/cm², respectively), followed by R² = 0.75 and RMSE = 39.78 µg/cm² for the PlanetScope data upon integrating partial least squares regression (PLSR) into the hybrid model. Across all hybrid models, the RapidEye data yield the most stable performance, with R² ranging from 0.45 to 0.71 and RMSE ranging from 19.16 µg/cm² to 33.07 µg/cm². The study highlights the importance of synergizing UAV and satellite data, which enables the effective monitoring of canopy chlorophyll in small agricultural lands.
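To make the data-fusion scenario concrete, the Python sketch below shows how two of the evaluated regressors, kernel ridge regression (KRR) and Gaussian process regression (GPR), could be trained on satellite reflectance bands fused with UAV-derived NDRE, canopy height, and LAI to predict canopy chlorophyll, and evaluated with R² and RMSE. This is a minimal illustration using scikit-learn, not the authors' code; the input file, column names, and kernel settings are hypothetical.

import numpy as np
import pandas as pd
from sklearn.kernel_ridge import KernelRidge
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import r2_score, mean_squared_error

# Hypothetical plot-level table: satellite reflectance bands, UAV-derived
# predictors (NDRE = (NIR - RedEdge) / (NIR + RedEdge), canopy height, LAI),
# and field-measured canopy chlorophyll in µg/cm².
df = pd.read_csv("plots_fused.csv")  # hypothetical file name
features = ["blue", "green", "red", "red_edge", "nir",    # satellite bands
            "uav_ndre", "uav_canopy_height", "uav_lai"]   # UAV-derived layers
X, y = df[features].values, df["canopy_chlorophyll"].values

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
scaler = StandardScaler().fit(X_train)
X_train, X_test = scaler.transform(X_train), scaler.transform(X_test)

models = {
    "KRR": KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1),
    "GPR": GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True),
}
for name, model in models.items():
    model.fit(X_train, y_train)
    pred = model.predict(X_test)
    rmse = np.sqrt(mean_squared_error(y_test, pred))
    print(f"{name}: R2 = {r2_score(y_test, pred):.2f}, RMSE = {rmse:.2f} ug/cm2")

In the hybrid (PROSAIL + MLRA) scenario, the same regressors would typically be trained on PROSAIL-simulated reflectance–chlorophyll pairs rather than on field measurements, and then applied to the satellite imagery.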
References: 107 articles