TY - JOUR
T1 - NeRF-LAI: A hybrid method combining neural radiance field and gap-fraction theory for deriving effective leaf area index of corn and soybean using multi-angle UAV images
AU - Yang, Qi
AU - Zhou, Junxiong
AU - Zhao, Liya
AU - Jin, Zhenong
N1 - Publisher Copyright:
© 2024
PY - 2025/10/1
Y1 - 2025/10/1
N2 - Methods based on upward canopy gap fractions are widely employed to measure in-situ effective LAI (Le) as an alternative to destructive sampling. However, these measurements are limited to the point level and are not practical for scaling up to larger areas. To address this point-to-landscape gap, this study introduces an innovative approach, named NeRF-LAI, for corn and soybean Le estimation that combines gap-fraction theory with neural radiance field (NeRF) technology, an emerging neural network-based method for implicitly representing 3D scenes from multi-angle 2D images. The trained NeRF-LAI can render photorealistic downward hemispherical depth images from an arbitrary viewpoint in the 3D scene and then calculate gap fractions to estimate Le. To investigate the intrinsic difference between upward and downward gap estimations, initial tests on virtual corn fields demonstrated that the downward Le matches well with the upward Le and that Le estimation is insensitive to viewpoint height in a homogeneous field. Furthermore, we conducted intensive real-world experiments at controlled plots and farmer-managed fields to test the effectiveness and transferability of NeRF-LAI in real-world scenarios, where multi-angle UAV oblique images were collected for corn and soybean at different phenological stages. Results showed that NeRF-LAI renders photorealistic synthetic images with an average peak signal-to-noise ratio (PSNR) of 18.94 for the controlled corn plots and 19.10 for the controlled soybean plots. We further explored three methods to estimate Le from the calculated gap fractions: the 57.5° method, the five-ring-based method, and the cell-based method. Among these, the cell-based method achieved the best performance, with r² ranging from 0.674 to 0.780 and RRMSE ranging from 1.95 % to 5.58 %. The Le estimates are sensitive to viewpoint height in heterogeneous fields due to differences in the observable foliage volume, but are less sensitive in relatively homogeneous fields. Additionally, cross-site testing for pixel-level LAI mapping showed that NeRF-LAI significantly outperforms VI-based models, with a small variation in RMSE (0.71 to 0.95 m²/m²) across spatial resolutions from 0.5 m to 2.0 m. This study extends the application of gap fraction-based Le estimation from a discrete point scale to a continuous field scale by leveraging the implicit 3D neural representations learned by NeRF. The NeRF-LAI method can map Le from raw multi-angle 2D images without prior information, offering a more flexible and efficient alternative to the traditional in-situ plant canopy analyzer.
KW - Deep learning
KW - Gap fraction
KW - Leaf area index
KW - Multi-angle
KW - Neural radiance fields
KW - Unmanned aerial vehicle
UR - https://www.scopus.com/pages/publications/105007322152
UR - https://www.scopus.com/inward/citedby.url?scp=105007322152&partnerID=8YFLogxK
U2 - 10.1016/j.rse.2025.114844
DO - 10.1016/j.rse.2025.114844
M3 - Article
AN - SCOPUS:105007322152
SN - 0034-4257
VL - 328
JO - Remote Sensing of Environment
JF - Remote Sensing of Environment
M1 - 114844
ER -