[1] Yang W, Feng H, Zhang X, Zhang J, Doonan JH, et al. 2020. Crop phenomics and high-throughput phenotyping: past decades, current challenges and future perspectives. Molecular Plant 13(2):187−214. doi: 10.1016/j.molp.2020.01.008
[2] Zhao C, Zhang Y, Du J, Guo X, Wen W, et al. 2019. Crop phenomics: current status and perspectives. Frontiers in Plant Science 10:714. doi: 10.3389/fpls.2019.00714
[3] Guo QH, Yang WC, Wu FF, Pang SX, Jin SC, et al. 2018. High-throughput crop phenotyping: accelerators for development of breeding and precision agriculture. Bulletin of Chinese Academy of Sciences 33(9):940−46. doi: 10.16418/j.issn.1000-3045.2018.09.007
[4] Fan J, Li Y, Yu S, Gou W, Guo X, et al. 2023. Application of Internet of Things to agriculture—the LQ-FieldPheno platform: a high-throughput platform for obtaining crop phenotypes in field. Research 6:0059. doi: 10.34133/research.0059
[5] Cai S, Gou W, Wen W, Lu X, Fan J, et al. 2023. Design and development of a low-cost UGV 3D phenotyping platform with integrated LiDAR and electric slide rail. Plants 12(3):483. doi: 10.3390/plants12030483
[6] Virlet N, Sabermanesh K, Sadeghi-Tehran P, Hawkesford MJ. 2016. Field Scanalyzer: an automated robotic field phenotyping platform for detailed crop monitoring. Functional Plant Biology 44(1):143−53. doi: 10.1071/FP16163
[7] Das J, Cross G, Qu C, Makineni A, Tokekar P, et al. 2015. Devices, systems, and methods for automated monitoring enabling precision agriculture. 2015 IEEE International Conference on Automation Science and Engineering, Gothenburg, Sweden, 2015. US: IEEE. pp. 462−69. doi: 10.1109/CoASE.2015.7294123
[8] Zhu F, Yan S, Sun L, He M, Zheng Z, et al. 2022. Estimation method of lettuce phenotypic parameters using deep learning multi-source data fusion. Transactions of the Chinese Society of Agricultural Engineering 38(9):195−204. doi: 10.11975/j.issn.1002-6819.2022.09.021
[9] Wang D. 2023. Corn yield estimation based on phenotypic features of UAV RGB images. Thesis. Jilin Agricultural University, China. pp. 3−4
[10] Zheng G, Moskal LM. 2012. Computational-geometry-based retrieval of effective leaf area index using terrestrial laser scanning. IEEE Transactions on Geoscience and Remote Sensing 50(10):3958−69. doi: 10.1109/TGRS.2012.2187907
[11] Gu Y, Wang Y, Wu Y, Warner TA, Guo T, et al. 2024. Novel 3D photosynthetic traits derived from the fusion of UAV LiDAR point cloud and multispectral imagery in wheat. Remote Sensing of Environment 311:114244. doi: 10.1016/j.rse.2024.114244
[12] He Y, Yu H, Liu X, Yang Z, Sun W, et al. 2025. Deep learning based 3D segmentation in computer vision: a survey. Information Fusion 115:102722. doi: 10.1016/j.inffus.2024.102722
[13] Hamuda E, Glavin M, Jones E. 2016. A survey of image processing techniques for plant extraction and segmentation in the field. Computers and Electronics in Agriculture 125:184−99. doi: 10.1016/j.compag.2016.04.024
[14] Teng H, Wang Y, Song X, Karydis K. 2023. Multimodal dataset for localization, mapping and crop monitoring in citrus tree farms. Proc. International Symposium on Visual Computing, Lake Tahoe, Nevada, USA, 2023. pp. 571−82. doi: 10.48550/arXiv.2309.15332
[15] Pire T, Mujica M, Civera J, Kofman E. 2019. The Rosario dataset: multisensor data for localization and mapping in agricultural environments. The International Journal of Robotics Research 38(6):633−41. doi: 10.1177/0278364919841437
[16] Yin J, Li A, Li T, Yu W, Zou D. 2021. M2DGR: a multi-sensor and multi-scenario SLAM dataset for ground robots. IEEE Robotics and Automation Letters 7(2):2266−73. doi: 10.1109/LRA.2021.3138527
[17] Maimaitijiang M, Sagan V, Sidike P, Hartling S, Esposito F, et al. 2020. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sensing of Environment 237:111599. doi: 10.1016/j.rse.2019.111599
[18] Xie P, Du R, Ma Z, Cen H. 2023. Generating 3D multispectral point clouds of plants with fusion of snapshot spectral and RGB-D images. Plant Phenomics 5:0040. doi: 10.34133/plantphenomics.0040
[19] Zhang T, Hu L, Sun Y, Li L, Navarro-Alarcon D. 2022. Computing thermal point clouds by fusing RGB-D and infrared images: from dense object reconstruction to environment mapping. 2022 IEEE International Conference on Robotics and Biomimetics, Jinghong, China, 2022. US: IEEE. pp. 1707−14. doi: 10.1109/ROBIO55434.2022.10011817
[20] Li Y, Wen W, Fan J, Gou W, Gu S, et al. 2023. Multi-source data fusion improves time-series phenotype accuracy in maize under a field high-throughput phenotyping platform. Plant Phenomics 5:0043. doi: 10.34133/plantphenomics.0043
[21] Sun G, Wang X, Sun Y, Ding Y, Lu W. 2019. Measurement method based on multispectral three-dimensional imaging for the chlorophyll contents of greenhouse tomato plants. Sensors 19(15):3345. doi: 10.3390/s19153345
[22] Correa ES, Calderon FC, Colorado JD. 2024. A novel multi-camera fusion approach at plant scale: from 2D to 3D. SN Computer Science 5(5):582. doi: 10.1007/s42979-024-02849-7
[23] Lin D, Jarzabek-Rychard M, Tong X, Maas HG. 2019. Fusion of thermal imagery with point clouds for building façade thermal attribute mapping. ISPRS Journal of Photogrammetry and Remote Sensing 151:162−75. doi: 10.1016/j.isprsjprs.2019.03.010
[24] Beltrán J, Guindel C, de la Escalera A, García F. 2022. Automatic extrinsic calibration method for LiDAR and camera sensor setups. IEEE Transactions on Intelligent Transportation Systems 23:17677−89. doi: 10.1109/TITS.2022.3155228
[25] Liao Q, Chen Z, Liu Y, Wang Z, Liu M. 2018. Extrinsic calibration of lidar and camera with polygon. 2018 IEEE International Conference on Robotics and Biomimetics (ROBIO), Kuala Lumpur, Malaysia, 2018. US: IEEE. pp. 200−5. doi: 10.1109/ROBIO.2018.8665256
[26] Ma T, Liu Z, Guo H, Li Y. 2021. CRLF: automatic calibration and refinement based on line feature for LiDAR and camera in road scenes. 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems, Prague, Czech Republic, 2021. doi: 10.48550/arXiv.2103.04558
[27] Shen Y, Li J, Shao X, Romillo BI, Jindal A, et al. 2024. FastSAM3D: an efficient segment anything model for 3D volumetric medical images. International Conference on Medical Image Computing and Computer Assisted Intervention, Marrakesh, Morocco, 2024. pp. 542−52. doi: 10.48550/arXiv.2403.09827
[28] Yang N, Zhou M, Chen H, Cao C, Du S, et al. 2023. Estimation of wheat leaf area index and yield based on UAV RGB images. Journal of Triticeae Crops 43(7):920−32. doi: 10.7606/j.issn.1009-1041.2023.07.13
[29] Louhaichi M, Borman MM, Johnson DE. 2001. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto International 16(1):65−70. doi: 10.1080/10106040108542184