References
  • [1]

    Beard JB, Green RL. 1994. The role of turfgrasses in environmental protection and their benefits to humans. Journal of Environmental Quality 23:452−60

    doi: 10.2134/jeq1994.00472425002300030007x


    [2]

    Chawla SL, Roshni A, Patel M, Patil S, Shah HP. 2018. Turfgrass: A billion dollar industry. In Proceedings of the National Conference on Floriculture for Rural and Urban Prosperity in the Scenario of Climate Change-2018. pp. 30−35

    [3]

    Milesi C, Running SW, Elvidge CD, Dietz JB, Tuttle BT, et al. 2005. Mapping and modeling the biogeochemical cycling of turf grasses in the United States. Environmental Management 36:426−38

    doi: 10.1007/s00267-004-0316-2


    [4]

    Haydu JJ, Hodges AW, Hall CR. 2009. Economic impacts of the turfgrass and lawncare industry in the United States. USDA, CES, University of Florida, IFAS. URL http://edis.ifas.ufl.edu/pdffiles/FE/FE63200.pdf (last accessed 14 March 2021)

    [5]

    Stier JC, Steinke K, Ervin EH, Higginson FR, McMaugh PE. 2013. Turfgrass benefits and issues. In Turfgrass: Biology, Use, and Management, eds. Stier JC, Horgan BP, Bonos SA. Agron. Monogr. 56. Madison, WI: ASA, CSSA, and SSSA. pp. 105–45 https://doi.org/10.2134/agronmonogr56.c3

    [6]

    Meyer WA, Funk CR. 1989. Progress and benefits to humanity from breeding cool-season grasses for turf. In Contributions from breeding forage and turf grasses, eds. Sleper et al. CSSA Spec. Publ. 15. CSSA, Madison, WI. pp. 31−48 https://doi.org/10.2135/cssaspecpub15.c4

    [7]

    Bonos SA, Huff DR. 2013. Cool-season grasses: biology and breeding. In Turfgrass: Biology, Use, and Management, eds. Stier JC, Horgan BP, Bonos SA. Agron. Monogr. 56. Madison, WI: ASA, CSSA, and SSSA. pp. 591–660 https://doi.org/10.2134/agronmonogr56.c17

    [8]

    Hanna W, Raymer P, Schwartz B. 2013. Warm-season grasses: biology and breeding. In Turfgrass: Biology, Use, and Management, eds. Stier JC, Horgan BP, Bonos SA. Agron. Monogr. 56. Madison, WI: ASA, CSSA, and SSSA. pp. 543–90 https://doi.org/10.2134/agronmonogr56.c16

    [9]

    Meyer WA, Hoffman L, Bonos SA. 2017. Breeding cool-season turfgrass cultivars for stress tolerance and sustainability in a changing environment. International Turfgrass Society Research Journal 13:3−10

    doi: 10.2134/itsrj2016.09.0806


    [10]

    Baxter LL, Schwartz BM. 2018. History of bermudagrass turfgrass breeding research in Tifton, GA. HortScience 53:1560−61

    doi: 10.21273/HORTSCI13257-18


    [11]

    Cobb JN, DeClerck G, Greenberg A, Clark R, McCouch S. 2013. Next-generation phenotyping: requirements and strategies for enhancing our understanding of genotype–phenotype relationships and its relevance to crop improvement. Theoretical and Applied Genetics. 126:867−87

    doi: 10.1007/s00122-013-2066-0


    [12]

    Jiang GL. 2013. Molecular markers and marker-assisted breeding in plants. In Plant Breeding from Laboratories to Fields, ed. Andersen SB. Rijeka, Croatia: InTech. pp. 45−83 https://doi.org/10.5772/52583

    [13]

    Chawade A, van Ham J, Blomquist H, Bagge O, Alexandersson E, et al. 2019. High-throughput field-phenotyping tools for plant breeding and precision agriculture. Agronomy 9:258

    doi: 10.3390/agronomy9050258


    [14]

    Mir RR, Reynolds M, Pinto F, Khan MA, Bhat MA. 2019. High-throughput phenotyping for crop improvement in the genomics era. Plant Science. 282:60−72

    doi: 10.1016/j.plantsci.2019.01.007


    [15]

    Furbank RT, Tester M. 2011. Phenomics – technologies to relieve the phenotyping bottleneck. Trends in Plant Science 16:635−44

    doi: 10.1016/j.tplants.2011.09.005


    [16]

    Araus JL, Cairns JE. 2014. Field high-throughput phenotyping: the new crop breeding frontier. Trends in Plant Science 19:52−61

    doi: 10.1016/j.tplants.2013.09.008


    [17]

    Fahlgren N, Gehan MA, Baxter I. 2015. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up. Current opinion in plant biology 24:93−99

    doi: 10.1016/j.pbi.2015.02.006


    [18]

    Cabrera-Bosquet L, Sánchez C, Rosales A, Palacios-Rojas N, Araus JL. 2011. Near-Infrared Reflectance Spectroscopy (NIRS) assessment of δ18O and nitrogen and ash contents for improved yield potential and drought adaptation in maize. Journal of agricultural and food chemistry 59:467−74

    doi: 10.1021/jf103395z


    [19]

    White JW, Andrade-Sanchez P, Gore MA, Bronson KF, Coffelt TA, et al. 2012. Field-based phenomics for plant genetics research. Field Crops Research 133:101−12

    doi: 10.1016/j.fcr.2012.04.003


    [20]

    Monneveux P, Ramírez DA, Pino MT. 2013. Drought tolerance in potato (S. tuberosum L.): Can we learn from drought tolerance research in cereals? Plant Science 205:76−86

    doi: 10.1016/j.plantsci.2013.01.011


    [21]

    Walter A, Studer B, Kölliker R. 2012. Advanced phenotyping offers opportunities for improved breeding of forage and turf species. Annals of Botany 110:1271−79

    doi: 10.1093/aob/mcs026


    [22]

    Deery D, Jimenez-Berni J, Jones H, Sirault X, Furbank R. 2014. Proximal remote sensing buggies and potential applications for field-based phenotyping. Agronomy 4:349−79

    doi: 10.3390/agronomy4030349


    [23]

    Li L, Zhang Q, Huang D. 2014. A review of imaging techniques for plant phenotyping. Sensors 14:20078−111

    doi: 10.3390/s141120078


    [24]

    Sankaran S, Khot LR, Espinoza CZ, Jarolmasjed S, Sathuvalli VR, et al. 2015. Low-altitude, high-resolution aerial imaging systems for row and field crop phenotyping: A review. European Journal of Agronomy 70:112−23

    doi: 10.1016/j.eja.2015.07.004


    [25]

    Zhang Y, Zhang N. 2018. Imaging technologies for plant high-throughput phenotyping: a review. Frontiers of Agricultural Science and Engineering 5:406−19

    doi: 10.15302/j-fase-2018242


    [26]

    Feng L, Chen S, Zhang C, Zhang Y, He Y. 2021. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Computers and Electronics in Agriculture 182:106033

    doi: 10.1016/j.compag.2021.106033


    [27]

    Mahner M, Kary M. 1997. What exactly are genomes, genotypes and phenotypes? And what about phenomes? Journal of Theoretical Biology 186:55−63

    doi: 10.1006/jtbi.1996.0335


    [28]

    Fiorani F, Schurr U. 2013. Future scenarios for plant phenotyping. Annual Review of Plant Biology 64:267−291

    doi: 10.1146/annurev-arplant-050312-120137


    [29]

    Rahaman M, Chen D, Gillani Z, Klukas C, Chen M. 2015. Advanced phenotyping and phenotype data analysis for the study of plant growth and development. Frontiers in Plant Science 6:619

    doi: 10.3389/fpls.2015.00619


    [30]

    Yano MM, Tuberosa R. 2009. Genome studies and molecular genetics — from sequence to crops: genomics comes of age. Current Opinion in Plant Biology 12:103−6

    doi: 10.1016/j.pbi.2009.01.001


    [31]

    Tester M, Langridge P. 2010. Breeding technologies to increase crop production in a changing world. Science 327:818−22

    doi: 10.1126/science.1183700


    [32]

    Schneeberger K, Weigel D. 2011. Fast-forward genetics enabled by new sequencing technologies. Trends in Plant Science 16:282−88

    doi: 10.1016/j.tplants.2011.02.006


    [33]

    Finkel E. 2009. With 'phenomics', plant scientists hope to shift breeding into overdrive. Science 325:380−81

    doi: 10.1126/science.325_380


    [34]

    Furbank RT. 2009. Plant phenomics: from gene to form and function. Functional Plant Biology 36:v−vi

    doi: 10.1071/fpv36n11_fo


    [35]

    Houle D, Govindaraju DR, Omholt S. 2010. Phenomics: the next challenge. Nature Reviews Genetics 11:855−66

    doi: 10.1038/nrg2897


    [36]

    Fiorani F, Rascher U, Jahnke S, Schurr U. 2012. Imaging plants dynamics in heterogenic environments. Current Opinion in Biotechnology 23:227−35

    doi: 10.1016/j.copbio.2011.12.010


    [37]

    Hartmann A, Czauderna T, Hoffmann R, Stein N, Schreiber F. 2011. HTPheno: an image analysis pipeline for high-throughput plant phenotyping. BMC Bioinformatics 12:148

    doi: 10.1186/1471-2105-12-148


    [38]

    Chen D, Neumann K, Friedel S, Kilian B, Chen M, et al. 2014. Dissecting the phenotypic components of crop plant growth and drought responses based on high-throughput image analysis. The Plant Cell 26:4636−55

    doi: 10.1105/tpc.114.129601


    [39]

    Naito H, Ogawa S, Valencia MO, Mohri H, Urano Y, et al. 2017. Estimating rice yield related traits and quantitative trait loci analysis under different nitrogen treatments using a simple tower-based field phenotyping system with modified single-lens reflex cameras. ISPRS Journal of Photogrammetry and Remote Sensing 125:50−62

    doi: 10.1016/j.isprsjprs.2017.01.010


    [40]

    Shafiekhani A, Kadam S, Fritschi FB, DeSouza GN. 2017. Vinobot and vinoculer: two robotic platforms for high-throughput field phenotyping. Sensors 17:214

    doi: 10.3390/s17010214


    [41]

    Yang G, Liu J, Zhao C, Li Z, Huang Y, et al. 2017. Unmanned aerial vehicle remote sensing for field-based crop phenotyping: current status and perspectives. Frontiers in Plant Science 8:1111

    doi: 10.3389/fpls.2017.01111


    [42]

    Berni JAJ, Zarco-Tejada PJ, Suarez L, Fereres E. 2009. Thermal and narrowband multispectral remote sensing for vegetation monitoring from an unmanned aerial vehicle. IEEE Transactions on Geoscience and Remote Sensing 47:722−38

    doi: 10.1109/tgrs.2008.2010457


    [43]

    Andrade-Sanchez P, Gore MA, Heun JT, Thorp KR, Carmo-Silva AE, et al. 2013. Development and evaluation of a field-based high-throughput phenotyping platform. Functional Plant Biology 41:68−79

    doi: 10.1071/fp13126


    [44]

    Montes JM, Technow F, Dhillon BS, Mauch F, Melchinger AE. 2011. High-throughput non-destructive biomass determination during early plant development in maize under field conditions. Field Crops Research 121:268−73

    doi: 10.1016/j.fcr.2010.12.017


    [45]

    Busemeyer L, Mentrup D, Möller K, Wunder E, Alheit K, et al. 2013. Breedvision − A multi-sensor platform for non-destructive field-based phenotyping in plant breeding. Sensors 13:2830−47

    doi: 10.3390/s130302830


    [46]

    Comar A, Burger P, de Solan B, Baret F, Daumard F, et al. 2012. A semi-automatic system for high throughput phenotyping wheat cultivars in-field conditions: Description and first results. Functional Plant Biology 39:914−924

    doi: 10.1071/FP12065


    [47]

    Crain JL, Wei Y, Barker J III, Thompson SM, Alderman PD, et al. 2016. Development and deployment of a portable field phenotyping platform. Crop Science 56:965−75

    doi: 10.2135/cropsci2015.05.0290


    [48]

    Ruckelshausen A, Biber P, Dorna M, Gremmes H, Klose R, et al. 2009. BoniRob–an autonomous field robot platform for individual plant phenotyping. Precision Agriculture 9:841−47


    [49]

    Jensen K, Nielsen SH, Joergensen RN, Boegild A, Jacobsen NJ, et al. 2013. A low cost, modular robotics tool carrier for precision agriculture research. Proceedings of the 11th International Conference on Precision Agriculture, Indianapolis, IN, USA, 15−18 Jul 2012. USA: International Society of Precision Agriculture

    [50]

    White JW, Conley MM. 2012. A flexible, low-cost cart for proximal sensing. Crop Science 53:1646−49

    doi: 10.2135/cropsci2013.01.0054


    [51]

    Stowell L, Gelernter W. 2008. Evaluation of a Geonics EM38 and NTech GreenSeeker sensor array for use in precision turfgrass management. Abstracts, GSA-SSSA-ASA-CSSA-GCAGS International Annual Meeting, Houston, TX. pp. 5–9 http://acs.confex.com/crops/2008am/webprogram/Paper45119.html

    [52]

    Carrow RN, Krum JM, Flitcroft I, Cline V. 2010. Precision turfgrass management: Challenges and field applications for mapping turfgrass soil and stress. Precision Agriculture 11:115−34

    doi: 10.1007/s11119-009-9136-y


    [53]

    Krum JM, Carrow RN, Karnok K. 2010. Spatial mapping of complex turfgrass sites: Site-specific management units and protocols. Crop Science 50:301−15

    doi: 10.2135/cropsci2009.04.0173


    [54]

    Booth JC, Sullivan D, Askew SA, Kochersberger K, McCall DS. 2021. Investigating targeted spring dead spot management via aerial mapping and precision-guided fungicide applications. Crop Science 61:3134−44

    doi: 10.1002/csc2.20623


    [55]

    Eisenbeiss H. 2004. A mini unmanned aerial vehicle (UAV): system overview and image acquisition. International Archives of Photogrammetry. Remote Sensing and Spatial Information Sciences 36:1−7


    [56]

    Boon MA, Drijfhout AP, Tesfamichael S. 2017. Comparison of a fixed-wing and multi-rotor uav for environmental mapping applications: A case study. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 42:47−54

    doi: 10.5194/isprs-archives-XLII-2-W6-47-2017


    [57]

    Cai G, Dias J, Seneviratne L. 2014. A survey of small scale unmanned aerial vehicles: Recent advances and future development trends. Unmanned Systems 2:175−99

    doi: 10.1142/S2301385014300017


    [58]

    Thamm FP, Brieger N, Neitzke KP, Meyer M, Jansen R, Mönninghof M. 2015. SONGBIRD-an innovative UAS combining the advantages of fixed wing and multi rotor UAS. The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences 40:345−49

    doi: 10.5194/isprsarchives-xl-1-w4-345-2015


    [59]

    Hayes DJ, Sader SA. 2001. Comparison of change-detection techniques for monitoring tropical forest clearing and vegetation regrowth in a time series. Photogrammetric Engineering and Remote Sensing. 67:1067−75


    [60]

    Richardson MD, Karcher DE, Purcell LC. 2001. Quantifying turfgrass cover using digital image analysis. Crop Science 41:1884−88

    doi: 10.2135/cropsci2001.1884


    [61]

    Karcher DE, Richardson MD. 2003. Quantifying turfgrass color using digital image analysis. Crop Science 43:943−51

    doi: 10.2135/cropsci2003.9430


    [62]

    Madec S, Baret F, De Solan B, Thomas S, Dutartre D, Jezequel S, Hemmerlé M, Colombeau G, Comar A. 2017. High-throughput phenotyping of plant height: comparing unmanned aerial vehicles and ground LiDAR estimates. Frontiers in Plant Science 8:2002

    doi: 10.3389/fpls.2017.02002


    [63]

    Zhang J, Maleski J, Schwartz B, Dunn D, Mailhot D, Ni X, Harris-Shultz K, Knoll J, Toews M. 2021. Assessing spatio-temporal patterns of sugarcane aphid (Hemiptera: Aphididae) infestations on silage sorghum yield using unmanned aerial systems (UAS). Crop Protection 146:105681

    doi: 10.1016/j.cropro.2021.105681


    [64]

    Karcher DE, Richardson MD. 2013. Digital image analysis in turfgrass research. In Turfgrass: Biology, Use, and Management, eds. Stier JC, Horgan BP, Bonos SA. Agron. Monogr. 56. Madison, WI: ASA, CSSA, and SSSA. p. 1133–49 https://doi.org/10.2134/agronmonogr56.c29

    [65]

    Shaver BR, Richardson MD, McCalla JH, Karcher DE, Berger PJ. 2006. Dormant seeding bermudagrass cultivars in a transition-zone environment. Crop Science 46:1787−92

    doi: 10.2135/cropsci2006.02-0078


    [66]

    Patton AJ, Volenec JJ, Reicher ZJ. 2007. Stolon growth and dry matter partitioning explain differences in zoysiagrass establishment rates. Crop Science 47:1237−45

    doi: 10.2135/cropsci2006.10.0633


    [67]

    Schnell RW, Vietor DM, White RH, Provin TL, Munster CL. 2009. Effects of composted biosolids and nitrogen on turfgrass establishment, sod properties, and nutrient export at harvest. HortScience 44:1746−1750

    doi: 10.21273/HORTSCI.44.6.1746


    [68]

    Herrmann M, Goatley JM Jr, McCall DS, Askew SD. 2021. Establishment of dormant 'Latitude 36' bermudagrass sprigs in the transition zone. Crop, Forage & Turfgrass Management. 7:e20087

    doi: 10.1002/cft2.20087


    [69]

    Karcher DE, Richardson MD, Hignight K, Rush D. 2008. Drought tolerance of tall fescue populations selected for high root/shoot ratios and summer survival. Crop Science 48:771−77

    doi: 10.2135/cropsci2007.05.0272


    [70]

    Richardson MD, Karcher DE, Hignight K, Rush D. 2008. Drought tolerance and rooting capacity of Kentucky bluegrass cultivars. Crop Science 48:2429−36

    doi: 10.2135/cropsci2008.01.0034


    [71]

    Githinji LJM, Dane JH, Walker RH. 2009. Water-use patterns of tall fescue and hybrid bluegrass cultivars subjected to ET-based irrigation scheduling. Irrigation Science 27:377−91

    doi: 10.1007/s00271-009-0153-4


    [72]

    McCall DS, Zhang X, Sullivan DG, Askew SD, Ervin EH. 2017. Enhanced soil moisture assessment using narrowband reflectance vegetation indices in creeping bentgrass. Crop Science 57:S-161−S-168

    doi: 10.2135/cropsci2016.06.0471


    [73]

    Badzmierowski MJ, McCall DS, Evanylo G. 2019. Using hyperspectral and multispectral indices to detect water stress for an urban turfgrass system. Agronomy 9:439

    doi: 10.3390/agronomy9080439


    [74]

    Sorochan JC, Karcher DE, Parham JM, Richardson MD. 2006. Segway and golf car wear on bermudagrass fairway turf. Applied Turfgrass Science 3:1−7


    [75]

    Trappe JM, Richardson MD, Patton AJ. 2012. Species selection, pre-plant cultivation, and traffic affect overseeding establishment in bermudagrass turf. Agronomy Journal 104:1130−35

    doi: 10.2134/agronj2011.0407


    [76]

    Ellram A, Horgan B, Hulke B. 2007. Mowing strategies and dew removal to minimize dollar spot on creeping bentgrass. Crop Science 47:2129−37

    doi: 10.2135/cropsci2006.10.0649


    [77]

    Tomaso-Peterson M. 2008. Controlling spring dead spot of bermudagrass: Scientists at Mississippi State University conduct research to unravel this mysterious turfgrass disease. USGA Green Section Record 46:16−19


    [78]

    Wong F, Chen CM, Stowell L. 2009. Effects of nitrogen and Primo Maxx on brown ring patch development: Best management practices are still being developed for brown ring patch, a recently discovered disease of Poa annua greens. Golf Course Management 77:117−121


    [79]

    Richardson MD, Hignight KW, Walker RH, Rodgers CA, Rush D, McCalla JH, Karcher DE. 2007. Meadow fescue and tetraploid perennial ryegrass – Two new species for overseeding dormant bermudagrass turf. Crop Sci. 47:83−90

    doi: 10.2135/cropsci2006.04.0221


    [80]

    Okeyo DO, Fry JD, Bremmer D, Rajashekar CB, Kennelly M, et al. 2011. Freezing Tolerance and Seasonal Color of Experimental Zoysiagrasses. Crop Science 51:2858−63

    doi: 10.2135/cropsci2011.01.0049


    [81]

    Xiang H, Tian L. 2011. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV). Biosystems Engineering 108:174−190

    doi: 10.1016/j.biosystemseng.2010.11.010


    [82]

    Zhang J, Virk S, Porter W, Kenworthy K, Sullivan D, Schwartz B. 2019. Applications of unmanned aerial vehicle based imagery in turfgrass field trials. Frontiers in Plant Science 10:279

    doi: 10.3389/fpls.2019.00279


    [83]

    Louhaichi M, Borman MM, Johnson DE. 2001. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto International 16:65−70

    doi: 10.1080/10106040108542184


    [84]

    Gitelson AA, Kaufman YJ, Stark R, Rundquist D. 2002. Novel algorithms for remote estimation of vegetation fraction. Remote sensing of Environment 80:76−87

    doi: 10.1016/S0034-4257(01)00289-9


    [85]

    Hong M, Bremer DJ, van der Merwe D. 2019. Using small unmanned aircraft systems for early detection of drought stress in turfgrass. Crop Science 59:2829−44

    doi: 10.2135/cropsci2019.04.0212


    [86]

    Blackmer TM, Schepers JS, Varvel GE. 1994. Light reflectance compared with other nitrogen stress measurements in corn leaves. Agronomy Journal 86:934−38

    doi: 10.2134/agronj1994.00021962008600060002x


    [87]

    Gausman HW. 1977. Reflectance of leaf components. Remote Sensing of Environment 6:1−9

    doi: 10.1016/0034-4257(77)90015-3


    [88]

    Horler DNH, Dockray M, Barber J. 1983. The red edge of plant leaf reflectance. International journal of remote sensing 4:273−88

    doi: 10.1080/01431168308948546


    [89]

    Peñuelas J, Filella I, Biel C, Serrano L, Save R. 1993. The reflectance at the 950-970 nm region as an indicator of plant water status. International Journal of Remote Sensing 14:1887−905

    doi: 10.1080/01431169308954010


    [90]

    Zarco-Tejada PJ, Rueda CA, Ustin SL. 2003. Water content estimation in vegetation with MODIS reflectance data and model inversion methods. Remote Sensing of Environment. 85:109−24

    doi: 10.1016/S0034-4257(02)00197-9


    [91]

    McCall DS, Sullivan DG, Zhang X, Martin SB, Wong A, et al. 2021. Influence of synthetic phthalocyanine pigments on light reflectance of creeping bentgrass. Crop Science 61:804−13

    doi: 10.1002/csc2.20335


    [92]

    Knipling EB. 1970. Physical and physiological basis for the reflectance of visible and near-infrared radiation from vegetation. Remote Sensing of Environment 1:155−59

    doi: 10.1016/S0034-4257(70)80021-9


    [93]

    Carter GA. 1991. Primary and secondary effects of water content on the spectral reflectance of leaves. American Journal of Botany 78:916−24

    doi: 10.1002/j.1537-2197.1991.tb14495.x


    [94]

    Kou L, Labrie D, Chylek P. 1993. Refractive indices of water and ice in the 0.65- to 2.5-μm spectral range. Applied Optics 32:3531−40

    doi: 10.1364/AO.32.003531


    [95]

    Munns R, James RA, Sirault XRR, Furbank RT, Jones HG. 2010. New phenotyping methods for screening wheat and barley for beneficial responses to water deficit. Journal of Experimental Botany 61:3499−3507

    doi: 10.1093/jxb/erq199


    [96]

    Govender M, Chetty K, Bulcock H. 2007. A review of hyperspectral remote sensing and its application in vegetation and water resource studies. Water SA 33:145−51

    doi: 10.4314/wsa.v33i2.49049


    [97]

    Fitz–Rodríguez E, Choi CY. 2002. Monitoring turfgrass quality using multispectral radiometry. Transactions of the ASAE 45:865−71

    doi: 10.13031/2013.8839


    [98]

    Resop JP, Cundiff JS, Heatwole CD. 2011. Spatial analysis to site satellite storage locations for herbaceous biomass In the piedmont of the southeast. Applied Engineering in Agriculture 27:25−32

    doi: 10.13031/2013.36221


    [99]

    Jiang Y, Duncan RR, Carrow RN. 2004. Assessment of low light tolerance of seashore paspalum and bermudagrass. Crop Science 44:587−94

    doi: 10.2135/cropsci2004.5870


    [100]

    Jiang Y, Carrow RN. 2007. Broadband spectral reflectance models of turfgrass species and cultivars to drought stress. Crop Science 47:1611−18

    doi: 10.2135/cropsci2006.09.0617


    [101]

    Zhang J, Unruh JB, Kenworthy K, Erickson J, Christensen CT, et al. 2016. Phenotypic plasticity and turf performance of zoysiagrass in response to reduced light intensities. Crop Science 56:1337−48

    doi: 10.2135/cropsci2015.09.0570


    [102]

    Bell GE, Howell BM, Johnson GV, Raun WR, Solie JB, et al. 2004. Optical sensing of turfgrass chlorophyll content and tissue nitrogen. HortScience 39:1130−32

    doi: 10.21273/HORTSCI.39.5.1130


    [103]

    Trenholm LE, Carrow RN, Duncan RR. 1999. Relationship of multispectral radiometry data to qualitative data in turfgrass research. Crop Science 39:763−69

    doi: 10.2135/cropsci1999.0011183X003900030025x


    [104]

    Jordan CF. 1969. Derivation of leaf-area index from quality of light on the forest floor. Ecology 50:663−66

    doi: 10.2307/1936256


    [105]

    Birth GS, McVey, GR. 1968. Measuring the color of growing turf with a reflectance spectrophotometer. Agronomy Journal 60:640−43

    doi: 10.2134/agronj1968.00021962006000060016x


    [106]

    Rouse JW, Haas RW, Schell JA, Deering DW. 1974. Monitoring the vernal advancement and retrogradation (green wave effect) of natural vegetation. Final Report, Remote Sensing Center, Texas A&M University, College Station, TX. Greenbelt, MD: NASA Goddard Space Flight Center

    [107]

    Deering DW, Rouse JW, Haas RW, Schell JA. 1975. Measuring "forage production" of grazing units from Landsat MSS data. In: Proceedings of the Tenth International Symposium of Remote Sensing of the Environment. Ann Arbor, MI. pp. 1169−78

    [108]

    Tucker CJ. 1979. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sensing of Environment 8:127−50

    doi: 10.1016/0034-4257(79)90013-0


    [109]

    Rondeaux G, Steven M, Baret F. 1996. Optimization of soil-adjusted vegetation indices. Remote Sensing of Environment 55:95−107

    doi: 10.1016/0034-4257(95)00186-7


    [110]

    Huete A, Didan K, Miura T, Rodriguez EP, Gao X, et al. 2002. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sensing of Environment 83:195−213

    doi: 10.1016/S0034-4257(02)00096-2


    [111]

    Gitelson AA, Gritz Y, Merzlyak MN. 2003. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. Journal of Plant Physiology 160:271−82

    doi: 10.1078/0176-1617-00887


    [112]

    Mutanga O, Skidmore AK. 2004. Narrow band vegetation indices overcome the saturation problem in biomass estimation. International Journal of Remote Sensing 25:3999−4014

    doi: 10.1080/01431160310001654923


    [113]

    Sripada RP, Heiniger RW, White JG, Weisz R. 2005. Aerial color infrared photography for determining late-season nitrogen requirements in corn. Agronomy Journal. 97:1443−51

    doi: 10.2134/agronj2004.0314


    [114]

    Bremer DJ, Lee H, Su K, Keeley SJ. 2011. Relationships between normalized difference vegetation index and visual quality in cool-season turfgrass: II. Factors affecting NDVI and its component reflectances. Crop Science 51:2219−27

    doi: 10.2135/cropsci2010.12.0729


    [115]

    Lee H, Bremer DJ, Su K, Keeley SJ. 2011. Relationships between normalized difference vegetation index and visual quality in turfgrasses: Effects of mowing height. Crop Science 51:323−32

    doi: 10.2135/cropsci2010.05.0296


    [116]

    Nanda MK, Giri U, Bera N. 2018. Canopy temperature-based water stress indices: Potential and limitations. In Advances in Crop Environment Interaction, eds. Bal SK, Mukherjee J, Choudhury BU, Dhawan AK. Singapore: Springer. pp. 365−85 https://doi.org/10.1007/978-981-13-1861-0_14

    [117]

    Jalali-Farahani HR, Slack DC, Kopec DM, Matthias AD. 1993. Crop water stress index models for Bermudagrass turf: a comparison. Agronomy Journal 85:1210−17

    doi: 10.2134/agronj1993.00021962008500060022x


    [118]

    Bijanzadeh E, Naderi R, Emam Y. 2013. Determination of crop water stress index for irrigation scheduling of turfgrass (Cynodon dactylon L. Pers.) under drought conditions. Journal of Plant Physiology and Breeding 3:13−22


    [119]

    Taghvaeian S, Chávez JL, Hattendorf MJ, Crookston MA. 2013. Optical and thermal remote sensing of turfgrass quality, water stress, and water use under different soil and irrigation treatments. Remote Sensing 5:2327−47

    doi: 10.3390/rs5052327


    [120]

    Foral JG. 2021. Using thermal imaging to measure water stress in creeping bentgrass putting greens. Master's thesis. University of Nebraska-Lincoln, USA. https://digitalcommons.unl.edu/agronhortdiss/221.

    [121]

    Payero JO, Neale CMU, Wright JL. 2005. Non-water-stressed baselines for calculating crop water stress index (CWSI) for alfalfa and tall fescue grass. Transactions of the ASAE 48:653−61

    doi: 10.13031/2013.18329


    [122]

    Zhang L, Niu Y, Zhang H, Han W, Li G, et al. 2019. Maize canopy temperature extracted from UAV thermal and RGB imagery and its application in water stress monitoring. Frontiers in Plant Science 10:1270

    doi: 10.3389/fpls.2019.01270


    [123]

    Butler WL. 1973. Primary photochemistry of photosystem II of photosynthesis. Accounts of Chemical Research 6:177−84

    doi: 10.1021/ar50066a001


    [124]

    Maxwell K, Johnson GN. 2000. Chlorophyll fluorescence—a practical guide. Journal of experimental botany 51:659−68

    doi: 10.1093/jexbot/51.345.659


    [125]

    Lee WS, Alchanatis V, Yang C, Hirafuji M, Moshou D, Li C. 2010. Sensing technologies for precision specialty crop production. Computers and electronics in agriculture. 74:2−33

    doi: 10.1016/j.compag.2010.08.005


    [126]

    Gorbe E, Calatayud A. 2012. Applications of chlorophyll fluorescence imaging technique in horticultural research: a review. Scientia Horticulturae 138:24−35

    doi: 10.1016/j.scienta.2012.02.002


    [127]

    Chaerle L, Van Der Straeten D. 2000. Imaging techniques and early detection of plant stress. Trends in Plant Science 5:495−501

    doi: 10.1016/S1360-1385(00)01781-7


    [128]

    Kalaji HM, Goltsev V, Bosa K, Allakhverdiev SI, Strasser RJ, et al. 2012. Experimental in vivo measurements of light emission in plants: a perspective dedicated to David Walker. Photosynthesis Research 114:69−96

    doi: 10.1007/s11120-012-9780-3


    [129]

    Bąba W, Kalaji HM, Kompała-Bąba A, Goltsev V. 2016. Acclimatization of photosynthetic apparatus of tor grass (Brachypodium pinnatum) during expansion. PLoS ONE 11:e0156201

    doi: 10.1371/journal.pone.0156201


    [130]

    Goltsev VN, Kalaji HM, Paunov M, Bąba W, Horaczek T, et al. 2016. Variable chlorophyll fluorescence and its use for assessing physiological condition of plant photosynthetic apparatus. Russian Journal of Plant Physiology 63:869−93

    doi: 10.1134/S1021443716050058


    [131]

    Balachandran VK, Gopinathan CP, Pillai VK, Nandakumar A, Valsala KK. 1997. Chlorophyll profile of the euphotic zone in the Lakshadweep Sea during the southwest monsoon season. Indian Journal of Fisheries 44:29−43


    [132]

    Lohaus G, Heldt HW, Osmond CB. 2000. Infection with phloem limited Abutilon mosaic virus causes localized carbohydrate accumulation in leaves of Abutilon striatum: relationships to symptom development and effects on chlorophyll fluorescence quenching during photosynthetic induction. Plant Biology 2:161−67

    doi: 10.1055/s-2000-9461


    [133]

    Swarbrick PJ, Schulze-Lefert P, Scholes JD. 2006. Metabolic consequences of susceptibility and resistance (race-specific and broad-spectrum) in barley leaves challenged with powdery mildew. Plant, Cell & Environment 29:1061−76

    doi: 10.1111/j.1365-3040.2005.01472.x


    [134]

    Chaerle L, Leinonen I, Jones HG, Van Der Straeten D. 2007. Monitoring and screening plant populations with combined thermal and chlorophyll fluorescence imaging. Journal of Experimental Botany 58:773−84

    doi: 10.1093/jxb/erl257


    [135]

    Rolfe SA, Scholes JD. 2010. Chlorophyll fluorescence imaging of plant–pathogen interactions. Protoplasma 247:163−75

    doi: 10.1007/s00709-010-0203-z


    [136]

    Barbagallo RP, Oxborough K, Pallett KE, Baker NR. 2003. Rapid, noninvasive screening for perturbations of metabolism and plant growth using chlorophyll fluorescence imaging. Plant physiology 132:485−93

    doi: 10.1104/pp.102.018093


    [137]

    Konishi A, Eguchi A, Hosoi F, Omasa K. 2009. 3D monitoring spatio-temporal effects of herbicide on a whole plant using combined range and chlorophyll a fluorescence imaging. Functional Plant Biology 36:874−79

    doi: 10.1071/FP09108


    [138]

    Baker NR, Rosenqvist E. 2004. Applications of chlorophyll fluorescence can improve crop production strategies: An examination of future possibilities. Journal of Experimental Botany 55:1607−21

    doi: 10.1093/jxb/erh196


    [139]

    Lenk S, Chaerle L, Pfündel EE, Langsdorf G, Hagenbeek D, et al. 2007. Multispectral fluorescence and reflectance imaging at the leaf level and its possible applications. Journal of Experimental Botany 58:807−14

    doi: 10.1093/jxb/erl207


    [140]

    Baker NR. 2008. Chlorophyll fluorescence: A probe of photosynthesis in vivo. Annual Review of Plant Biology 59:89−113

    doi: 10.1146/annurev.arplant.59.032607.092759


    [141]

    Harbinson J, Prinzenberg AE, Kruijer W, Aarts MG. 2012. High throughput screening with chlorophyll fluorescence imaging and its use in crop improvement. Current Opinion in Biotechnology 23:221−26

    doi: 10.1016/j.copbio.2011.10.006


    [142]

    Colaço AF, Molin JP, Rosell-Polo JR, Escolà A. 2018. Application of light detection and ranging and ultrasonic sensors to high-throughput phenotyping and precision horticulture: current status and challenges. Horticulture Research 5:35

    doi: 10.1038/s41438-018-0043-0


    [143]

    Lin Y. 2015. LiDAR: An important tool for next-generation phenotyping technology of high potential for plant phenomics? Computers and Electronics in Agriculture 119:61−73

    doi: 10.1016/j.compag.2015.10.011


    [144]

    García-Santillán ID, Montalvo M, Guerrero JM, Pajares G. 2017. Automatic detection of curved and straight crop rows from images in maize fields. Biosystems Engineering 156:61−79

    doi: 10.1016/j.biosystemseng.2017.01.013


    [145]

    Bao Y, Tang L, Breitzman MW, Salas Fernandez MG, Schnable PS. 2019. Field-based robotic phenotyping of sorghum plant architecture using stereo vision. Journal of Field Robotics 36:397−415

    doi: 10.1002/rob.21830


    [146]

    Pandey P, Ge Y, Stoerger V, Schnable JC. 2017. High throughput in vivo analysis of plant leaf chemical properties using hyperspectral imaging. Frontiers in Plant Science 8:1348

    doi: 10.3389/fpls.2017.01348


    [147]

    Sun S, Li C, Paterson AH, Jiang Y, Xu R, et al. 2018. In-field high-throughput phenotyping and cotton plant growth analysis using LiDAR. Frontiers in Plant Science 9:16

    doi: 10.3389/fpls.2018.00016


    [148]

    Nguyen P, Badenhorst PE, Shi F, Spangenberg GC, Smith KF, et al. 2021. Design of an Unmanned Ground Vehicle and LiDAR Pipeline for the High-Throughput Phenotyping of Biomass in Perennial Ryegrass. Remote Sensing 13:20

    doi: 10.3390/rs13010020


    [149]

    Pittman JJ, Arnall DB, Interrante SM, Moffet CA, Butler TJ. 2015. Estimation of biomass and canopy height in bermudagrass, alfalfa, and wheat using ultrasonic, laser, and spectral sensors. Sensors 15:2920−43

    doi: 10.3390/s150202920


    [150]

    Scotford IM, Miller PCH. 2004. Combination of spectral reflectance and ultrasonic sensing to monitor the growth of winter wheat. Biosystems Engineering 87:27−38

    doi: 10.1016/j.biosystemseng.2003.09.009


    [151]

    Yuan W, Li J, Bhatta M, Shi Y, Baenziger PS, et al. 2018. Wheat height estimation using LiDAR in comparison to ultrasonic sensor and UAS. Sensors 18:3731

    doi: 10.3390/s18113731


    [152]

    Lobet G, Pagès L, Draye X. 2011. A novel image-analysis toolbox enabling quantitative analysis of root system architecture. Plant Physiology 157:29−39

    doi: 10.1104/pp.111.179895


    [153]

    Pierret A, Gonkhamdee S, Jourdan C, Maeght JL. 2013. IJ_RHIZO: An open-source software to measure scanned images of root samples. Plant and Soil 373:531−39

    doi: 10.1007/s11104-013-1795-9


    [154]

    Clark RT, Famoso AN, Zhao K, Shaff JE, Craft EJ, et al. 2013. High-throughput two-dimensional root system phenotyping platform facilitates genetic analysis of root growth and development. Plant, Cell & Environment 36:454−466

    doi: 10.1111/j.1365-3040.2012.02587.x


    [155]

    Watt M, Moosavi S, Cunningham SC, Kirkegaard JA, Rebetzke GJ, et al. 2013. A rapid, controlled-environment seedling root screen for wheat correlates well with rooting depths at vegetative, but not reproductive, stages at two field sites. Annals of Botany 112:447−55

    doi: 10.1093/aob/mct122


    [156]

    Arsenault JL, Poulcur S, Messier C, Guay R. 1995. WinRHlZO™, a root measuring system with a unique overlap correction method. HortScience 30:906

    doi: 10.21273/hortsci.30.4.906d


    [157]

    Abramoff MD, Magelhaes PJ, Ram SJ. 2004. Image processing with ImageJ. Biophotonics International 11:36−42


    [158]

    Chloupek O. 1977. Evaluation of size of a plants-root system using its electrical capacitance. Plant and Soil 48:525−32

    doi: 10.1007/BF02187258


    [159]

    Messmer R, Fracheboud Y, Bänziger M, Stamp P, Ribaut JM. 2011. Drought stress and tropical maize: QTLs for leaf greenness, plant senescence, and root capacitance. Field Crops Research 124:93−103

    doi: 10.1016/j.fcr.2011.06.010


    [160]

    Metzner R, Eggert A, van Dusschoten D, Pflugfelder D, Gerth S, et al. 2015. Direct comparison of MRI and X-ray CT technologies for 3D imaging of root systems in soil: potential and challenges for root trait quantification. Plant Methods 11:1

    doi: 10.1186/s13007-015-0043-0


    [161]

    Flavel RJ, Guppy CN, Tighe M, Watt M, McNeill A, et al. 2012. Non-destructive quantification of cereal roots in soil using high-resolution X-ray tomography. Journal of Experimental Botany 63:2503−11

    doi: 10.1093/jxb/err421


    [162]

    Idso SB, Jackson RD, Reginato RJ. 1977. Remote-sensing of crop yields. Science 196:19−25

    doi: 10.1126/science.196.4285.19


    [163]

    González-Dugo MP, Moran MS, Mateos L, Bryant R. 2006. Canopy temperature variability as an indicator of crop water stress severity. Irrigation Science 24:233

    doi: 10.1007/s00271-005-0022-8


    [164]

    Idso SB, Jackson RD, Pinter PJ Jr, Reginato RJ, Hatfield JL. 1981. Normalizing the stress-degree-day parameter for environmental variability. Agricultural meteorology 24:45−55

    doi: 10.1016/0002-1571(81)90032-7


    [165]

    Woebbecke DM, Meyer GE, Von Bargen K, Mortensen DA. 1995. Color indices for weed identification under various soil, residue, and lighting conditions. Transactions of the ASAE 38:259−69

    doi: 10.13031/2013.27838


    [166]

    Zarco-Tejada PJ, Berjón A, López-Lozano R, Miller JR, Martín P, et al. 2005. Assessing vineyard condition with hyperspectral indices: Leaf and canopy reflectance simulation in a row-structured discontinuous canopy. Remote Sensing of Environment 99:271−87

    doi: 10.1016/j.rse.2005.09.002


    [167]

    Pérez AJ, López F, Benlloch JV, Christensen S. 2000. Colour and shape analysis techniques for weed detection in cereal fields. Computers and Electronics in Agriculture 25:197−212

    doi: 10.1016/S0168-1699(99)00068-X


    [168]

    Gardner BR, Blad BL, Garrity DP, Watts DG. 1981. Relationships between crop temperature, grain yield, evapotranspiration and phenological development in two hybrids of moisture stressed sorghum. Irrigation Science 2:213−24

    doi: 10.1007/BF00258375


    [169]

    Roman A, Ursu T. 2016. Multispectral satellite imagery and airborne laser scanning techniques for the detection of archaeological vegetation marks. In Landscape Archaeology On The Northern Frontier Of The Roman Empire At Porolissum-an Interdisciplinary Research Project, eds. Opreanu CH, Lăzărescu VA. Cluj-Napoca, Romania: Mega Publishing House. pp. 141−52.

    [170]

    Giannoni L, Lange F, Tachtsidis I. 2018. Hyperspectral imaging solutions for brain tissue metabolic and hemodynamic monitoring: past, current and future developments. Journal of Optics 20:044009

    doi: 10.1088/2040-8986/aab3a6


Cite this article

Vines PL, Zhang J. 2022. High-throughput plant phenotyping for improved turfgrass breeding applications. Grass Research 2:1 doi: 10.48130/GR-2022-0001


REVIEW   Open Access    

High-throughput plant phenotyping for improved turfgrass breeding applications

Grass Research 2, Article number: 1 (2022)

Abstract: Turfgrasses are used extensively throughout the world, and there is a steadfast demand to develop turfgrass varieties with improved abiotic and biotic stress tolerances that will perform well with limited management inputs. Modern breeding programs incorporate advanced breeding strategies such as DNA sequencing and high-throughput phenotyping with traditional breeding strategies to identify and select germplasm and genes of interest. Molecular biology methods and DNA sequencing technology have advanced rapidly in recent years, and, as a result, plant phenotyping is currently a bottleneck in the process of advancing breeding programs. Recent advances in remote sensing technology have offered improved, non-destructive plant phenotyping approaches such as visible light imaging, spectral imaging, infrared thermal imaging, range sensing, and fluorescence imaging. Integrated, mobile, and time-efficient platforms are being developed, coupling remote sensing with robotics and unmanned aerial systems technology for high-throughput plant phenotyping applications across large field spaces. Modern turfgrass breeding programs will continue to research, develop, and implement remote sensing technologies to assess larger numbers of genotypes and identify elite germplasm. Altogether, these efforts will improve cultivar development efficiency and aid plant breeders in developing improved turfgrass cultivars to meet current and future demands of the turfgrass industry. This review provides an overview of ground- and aerial-based plant phenotyping platforms, with particular emphasis placed on applications to turfgrass breeding practices. Similarly, imaging technologies that have been used in various plant breeding programs are discussed, with indications as to how those technologies could be applicable to turfgrass breeding programs.

    • Turfgrasses are utilized throughout inhabited regions of the world for home lawns, athletic fields, golf courses, parks and recreational areas, and roadside vegetation[1]. In addition, turfgrass seed and sod production contribute to the significant economic, ecological, and environmental value of the turf industry[2]. It is estimated that maintained turfgrass in the U.S. covers approximately 20 million hectares[3]. The annual economic value of the turfgrass industry is approximately $60 billion, making a large contribution to the national economy[4]. The value of turfgrass continues to grow due to strong demand for use in landscape, recreation, and sports areas, as well as the environmental and aesthetic benefits of turfgrasses, such as moderating temperatures, preventing soil erosion, reducing noise and air pollution, and increasing property values[1,5].

      Extensive efforts in turfgrass breeding have resulted in persistent, attractive varieties with improved turf quality characteristics, pest and stress tolerance, and reduced maintenance requirements[6]. Breeding objectives currently focus on improving tolerance to abiotic stress factors such as drought, heat, cold, and salinity and to biotic stresses such as diseases and insects[7−10]. Efforts are also aimed at developing grasses that will perform at high levels with limited inputs of fertility, irrigation, pesticides, and mowing[7−9]. Breeders of seed-propagated species continue to focus on improving seed yield characteristics and identifying grasses with resistance to seed production diseases such as stem rust, caused by Puccinia graminis, while breeders of sod-propagated species focus on developing varieties with improved sod-forming ability[7−9]. In addition to the aforementioned objectives, breeders are working to maintain the high turf quality characteristics that have been bred into all major turfgrass species to date[9].

      Modern breeding programs strive to incorporate advanced breeding strategies such as DNA sequencing and high-throughput phenotyping with traditional breeding strategies to identify and select germplasm and genes of interest[11−14]. In recent years, DNA sequencing and molecular biology methods have advanced rapidly, and, as a result, plant phenotyping is currently a bottleneck in the process of advancing breeding programs[15−17]. Recent advances in remote sensing have offered improved, non-destructive plant phenotyping approaches[18−20]. These advancements have been coupled with improvements in robotics and unmanned aerial systems (UAS) technology to provide mobile, time-efficient platforms for remote sensors that have contributed to high-throughput plant phenotyping applications across large fields (Fig. 1).

      Figure 1. 

      Overview of high-throughput phenotyping tools for modern turfgrass breeding programs. 1: UAS for remote sensing data collection on mowed turf plot trials; 2: ground robot for proximal sensing data collection on mowed turf plot trials; 3: turfgrass breeder for visual assessment and oversight of various data collection practices on mowed turf plot trials; 4: ground vehicle for proximal sensing data collection on mowed turf plot trials; 5: ground vehicle for proximal sensing data collection on turfgrass nursery trials; 6: turfgrass breeder for visual assessment and oversight of various data collection practices on turfgrass nursery trials; 7: ground robot for proximal sensing data collection on turfgrass nursery trials; 8: UAS for remote sensing data collection on turfgrass nursery trials; 9: weather station for environmental data collection. All data are stored and processed via cloud computing services.

      The use of ground- and aerial-based platforms and imaging technologies for high-throughput phenotyping applications has been thoroughly reviewed previously[13,16,17,19,21−26]. This review provides an overview of ground- and aerial-based plant phenotyping platforms, with particular emphasis placed on applications to turfgrass breeding practices. Similarly, imaging technologies that have been used in various plant breeding programs are discussed, with indications as to how those technologies could be applicable to turfgrass breeding programs.

    • A phenotype is the physical appearance of a plant; this includes complex traits related to architecture, growth, development, physiology, ecology, yield, and tolerance to abiotic and biotic stresses[23,27]. Plant phenotyping is the act of assessing phenotypic plant traits in order to rank or compare germplasm and identify elite lines for breeding purposes[28]. Traditionally, plant phenotyping has involved manual and visual assessments, which are labor intensive, time consuming, and variable due to observational bias and preference[29]. These limitations, in light of genotyping advancements[30−32], have led to a phenotyping bottleneck in plant breeding programs[15,33−36]. However, many breeding programs have combined efforts from biological science, computer science, mathematics, physics, data science, and statistics to develop more efficient phenotyping methods, an area of research commonly known as high-throughput plant phenotyping[37]. The high-throughput phenotyping approaches employed in breeding programs consist of both ground- and aerial-based platforms that are equipped with various remote sensors to efficiently collect quantitative and geospatial data across large geographic areas[38].

    • Ground-based plant phenotyping involves the assessment of plant phenotypes using proximal sensors, which are located close to the plants of interest[19]. For this application, sensors may be handheld or mounted on phenotyping platforms such as stationary towers, cable suspensions, and ground vehicles[39,40]. Handheld sensors are convenient to use but require long periods of time to phenotype large fields, which can result in significant environmental variation during the data collection process[23,24,41]. Moreover, data collection is not always consistent among different evaluators using handheld devices, which adds systematic error to the resulting datasets[42]. Another limitation of handheld sensors is that only one sensor may be used at a time, which restricts throughput and makes this approach a poor fit for time-efficient plant phenotyping[23]. Stationary towers and cable suspensions are also acceptable for certain phenotyping applications, but their use is limited by their inability to cover large field areas and by the angle distortion that arises when data are collected across a field from a single viewpoint[22].

      Several ground vehicle plant phenotyping platforms have been developed for various breeding applications in crops such as cotton (Gossypium barbadense L.)[43], maize (Zea mays L.)[44], triticale (× Triticosecale Wittmack L.)[45], and wheat (Triticum aestivum L.)[46,47]. These ground vehicle platforms range from simple pushcart designs to more sophisticated motor-driven buggies and are capable of accommodating multiple sensors and other data recording devices[43,45,46,48−50]. For turfgrass applications, there are different types of ground-based platforms with various sensors and cameras to be used in field phenotyping (Fig. 2). Researchers have demonstrated the usability of ground-based mobile platforms to accurately and precisely monitor characteristics such as soil moisture[51−53], turfgrass health[51−54], and turfgrass disease symptoms[54]. A major benefit of ground-based platforms is that they generate high spatial resolution data, which is required for plant science research and breeding programs. However, field-scale applications of these ground-based approaches are limited by the time required to phenotype large areas and by soil conditions immediately following irrigation or precipitation events, which can restrict field access for ground vehicles[22−24,41].

      Figure 2. 

      Examples of ground-based phenotyping devices used in turfgrass breeding and research. Left: pushcart with multispectral sensor; middle: light box with digital camera; right: hand-held NDVI meter. (Photo credit: Brian Schwartz).
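      As a concrete illustration of how georeferenced proximal sensing logs from a cart- or robot-mounted sensor can be reduced to plot-level phenotypes, the minimal sketch below averages NDVI readings that fall inside rectangular plot boundaries. The plot layout, coordinates, field names, and NDVI values are hypothetical placeholders, and the routine is a generic example rather than the software used by any platform cited above.

```python
from statistics import mean

# Hypothetical plot layout: plot id -> (min_easting, min_northing, max_easting, max_northing) in meters.
PLOTS = {
    "A101": (0.0, 0.0, 1.5, 3.0),
    "A102": (2.0, 0.0, 3.5, 3.0),
}

# Hypothetical GNSS-tagged sensor log from one cart pass: (easting, northing, ndvi).
readings = [
    (0.4, 0.5, 0.71), (0.8, 1.6, 0.69), (1.1, 2.4, 0.73),
    (2.3, 0.7, 0.62), (2.9, 1.9, 0.60), (3.2, 2.6, 0.64),
    (4.5, 1.0, 0.55),  # falls outside both plots and is ignored
]

def plot_means(readings, plots):
    """Average all readings that fall inside each (non-overlapping) plot rectangle."""
    grouped = {pid: [] for pid in plots}
    for x, y, value in readings:
        for pid, (x0, y0, x1, y1) in plots.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                grouped[pid].append(value)
                break
    return {pid: mean(vals) if vals else None for pid, vals in grouped.items()}

if __name__ == "__main__":
    for pid, value in plot_means(readings, PLOTS).items():
        print(f"{pid}: mean NDVI = {value:.3f}" if value is not None else f"{pid}: no readings")
```

      In practice, the same reduction step applies regardless of the sensor (spectral, thermal, or ultrasonic); only the plot geometry and the logged variable change.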

    • Aerial-based plant phenotyping involves the assessment of plant phenotypes using aerial remote sensors, which are located farther from the plants of interest than proximal sensors[26]. Aerial-based plant phenotyping efforts began with traditional, manned vehicles such as small airplanes, blimps, and parachutes, which all remain useful for certain phenotyping applications today[23]. However, UAS technology has advanced rapidly in recent years, and these platforms are now routinely used for remote sensing-based plant phenotyping applications.

      Traditional aerial vehicles such as small airplanes and blimps require a person to be onboard for operational purposes[23]. These vehicles have higher payloads than UASs but generally require relatively higher operational altitudes and speeds. Such limitations have given rise to a widespread use of UAS technology. By definition, a UAS consists of a vehicle that can travel through the air without a person onboard for operation[55]. UASs are typically categorized as either fixed wing or multicopter aircraft. The selection of one platform over the other depends upon the specific application and available resources, as these platforms vary widely in terms of maneuverability, initial costs, maintenance costs, run time, and payload[23].

      Fixed wing UASs, compared to multicopter UASs, have faster flight speeds, longer flight times, and can carry a heavier payload[56]. This means that fixed wing systems can cover more land area and can accommodate more sensors and other data recording devices onboard. The limitations to fixed wing UASs are also attributed to the fast travel speeds; operators must be aware of image blurring risks and ensure onboard sensors are compatible with the fast speeds of travel[56]. In addition, fixed wing aircraft cannot hover, and, with the exception of some fixed wing aircraft that have vertical takeoff and landing capabilities, they require relatively large areas for takeoff and landing[56]. Multicopter UASs, on the other hand, have slower flight speeds, shorter flight times, and cannot carry as heavy a payload as fixed wing systems[57]. The ability of multicopter platforms to maintain stable positions at slower travel speeds and lower altitudes gives them an advantage for use in plant science research and breeding programs[58].
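
      To give a rough sense of the image blurring tradeoff mentioned above, the following Python sketch computes ground sample distance and forward motion blur from flight and camera parameters using standard photogrammetry relationships. All parameter values (altitude, focal length, pixel pitch, speeds, exposure time) are hypothetical examples, not recommendations drawn from the studies cited here.

        # Minimal sketch: ground sample distance (GSD) and forward motion blur
        # for a nadir-pointing camera on a UAS. All parameter values are hypothetical.

        def ground_sample_distance(altitude_m, focal_length_mm, pixel_pitch_um):
            """Ground distance covered by one image pixel (m/pixel)."""
            return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

        def motion_blur_pixels(speed_m_s, exposure_s, gsd_m):
            """Forward motion during one exposure, expressed in pixels."""
            return (speed_m_s * exposure_s) / gsd_m

        gsd = ground_sample_distance(altitude_m=50, focal_length_mm=8.8, pixel_pitch_um=2.4)
        print(f"GSD: {gsd * 100:.2f} cm/pixel")
        print(f"Blur at 18 m/s (fixed wing):  {motion_blur_pixels(18, 1/1000, gsd):.2f} pixels")
        print(f"Blur at 5 m/s (multicopter):  {motion_blur_pixels(5, 1/1000, gsd):.2f} pixels")

      At the same exposure time, the faster platform smears roughly three to four times more ground distance across each pixel, which is why fixed wing operators must pair fast flight speeds with short exposures or faster sensors.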

    • Various remote sensing technologies have been explored for plant phenotyping applications. Many of these technologies are based on plant interactions with light at wavelengths that span much of the electromagnetic spectrum (Fig. 3). The following sections provide detailed descriptions of visible light imaging, spectral imaging, infrared thermal imaging, and fluorescence imaging technologies with particular emphasis on their usefulness in high-throughput plant phenotyping for turfgrass breeding applications. These remote sensing approaches are primarily used to assess two-dimensional plant characteristics but can be used to assess limited three-dimensional plant traits as well. However, light detection and ranging (LiDAR) and ultrasonic sensors represent much more appropriate options for assessing three-dimensional plant architecture and are also discussed herein.

      Figure 3. 

      Plant light reflectance curve at wavelengths ranging from 300 nm to 2,500 nm. Chlorophyll absorption, red edge, spongy mesophyll reflectance, and water absorption regions are shown[23,169].

    • Visible light imaging is based on plant interactions with light intensities in the 400 nm to 700 nm wavelength range (Fig. 4) and is meant to mimic human perception[23]. For phenotyping purposes, visible light imaging is primarily used to capture plant characteristics such as color, morphology, and architecture[23,25]. This is an affordable and convenient imaging solution and has been extensively used for plant phenotyping applications among various crop species[13,26]. Standard digital cameras are typically used for visible light imaging to capture raw data that correspond with photon fluxes in the red (~650 nm), green (~550 nm), and blue (~450 nm) spectral bands (Fig. 4); for this reason, these images are often called RGB images.

      Figure 4. 

      Light wavelengths along the electromagnetic spectrum captured by various optical sensors. Visible light imaging sensors for 400 nm to 700 nm, spectral imaging sensors for 400 nm to 2,500 nm, and infrared thermal imaging sensors for 7,500 nm to 13,000 nm[17].

      Once RGB data are captured, there are different approaches that can be taken to process the data depending on the objectives of a given project. One approach for analyzing these data is to convert RGB images into color indices such as excess green index, green index, green leaf index, greenblue, normalized difference index, or visible atmospherically resistant index (Table 1), each of which reduces the three-band image to a single gray-scale band. This approach can also be used to obtain measurements of percentage green cover by thresholding, which is a pixel classification procedure whereby pixels with index values above a threshold are classified as green and pixels below the threshold are classified as non-green[59]; a minimal example of this workflow is sketched below. A second approach is to convert RGB pixel values to hue, saturation, and brightness (HSB) pixel values, which can subsequently be used to generate measurements including percentage ground cover[60] and plant color[61]. The HSB data can also be used to calculate the dark green color index (Table 1). In addition to plant characteristics such as green cover and plant color, plant breeders can also obtain plant height information using the Structure-from-Motion technique, which combines computing algorithms, digital cameras, and aerial vehicles to reconstruct a three-dimensional digital surface model of the target[62,63]. This approach is challenging to use in mowed turfgrass research because of the low canopy height (< 10 cm) but does offer some promise in estimating yield for seed production research and breeding programs.
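
      The sketch below illustrates the index-then-threshold workflow described above: it computes an excess green image from an RGB array and estimates percentage green cover. It is a minimal example only; the zero threshold, the normalized-coordinate variant of the excess green index, and the synthetic image are assumptions for illustration and would need to be tuned to a specific camera and site.

        import numpy as np

        def percent_green_cover(rgb, threshold=0.0):
            """Estimate percentage green cover from an H x W x 3 RGB array.

            threshold: excess green value above which a pixel is called green
                       (hypothetical default; tune per camera, lighting, and turf species).
            """
            r = rgb[..., 0].astype(float)
            g = rgb[..., 1].astype(float)
            b = rgb[..., 2].astype(float)
            total = r + g + b + 1e-9                 # avoid division by zero on black pixels
            # Excess green index computed on normalized chromatic coordinates,
            # a common variant of the 2Green - Red - Blue form listed in Table 1.
            exg = (2 * g - r - b) / total
            green_mask = exg > threshold
            return 100.0 * green_mask.mean()

        # Synthetic 100 x 100 pixel example: left half green turf, right half bare soil
        img = np.zeros((100, 100, 3))
        img[:, :50] = [60, 120, 50]      # turf-colored pixels
        img[:, 50:] = [120, 100, 80]     # soil-colored pixels
        print(f"Percent green cover: {percent_green_cover(img):.1f}%")

      In practice, the same thresholded mask can be summarized per plot to track establishment rate or stress response over time.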

      Table 1.  Color, temperature, and vegetation indices used in plant remote sensing research and breeding applications.

      Index | Formula | Reference
      Canopy-Air Temperature Difference (CATD) | $T_L - T_A$ | [162]
      Canopy Temperature Variability (CTV) | $\sigma_{T_C}$ | [163]
      Crop Water Stress Index (CWSI) | $\dfrac{(T_C - T_A) - (T_C - T_A)_{ll}}{(T_C - T_A)_{ul} - (T_C - T_A)_{ll}}$ | [164]
      Dark Green Color Index (DGCI) | $\dfrac{\left(\dfrac{\text{Hue} - 60}{60}\right) + (1 - \text{Saturation}) + (1 - \text{Brightness})}{3}$ | [61]
      Difference Vegetation Index (DVI) | $\text{Near Infrared} - \text{Red}$ | [108]
      Enhanced Vegetation Index (EVI) | $2.5\,\dfrac{\text{Near Infrared} - \text{Red}}{\text{Near Infrared} + 6\,\text{Red} - 7.5\,\text{Blue} + 1}$ | [110]
      Excess Green Index (ExG) | $2\,\text{Green} - \text{Red} - \text{Blue}$ | [165]
      Green Chlorophyll Index (GCI) | $\dfrac{\text{Near Infrared}}{\text{Green}} - 1$ | [111]
      Green Difference Vegetation Index (GDVI) | $\text{Near Infrared} - \text{Green}$ | [113]
      Green Index (GI) | $\dfrac{\text{Green}}{\text{Red}}$ | [166]
      Green Leaf Index (GLI) | $\dfrac{2\,\text{Green} - \text{Red} - \text{Blue}}{2\,\text{Green} + \text{Red} + \text{Blue}}$ | [83]
      GreenBlue (GB) | $\dfrac{\text{Green} - \text{Blue}}{\text{Green} + \text{Blue}}$ | [85]
      Normalized Difference Index (NDI) | $\dfrac{\text{Green} - \text{Red}}{\text{Green} + \text{Red}}$ | [167]
      Normalized Difference Red Edge (NDRE) | $\dfrac{\text{Near Infrared} - \text{Red Edge}}{\text{Near Infrared} + \text{Red Edge}}$ | [112]
      Normalized Difference Vegetation Index (NDVI) | $\dfrac{\text{Near Infrared} - \text{Red}}{\text{Near Infrared} + \text{Red}}$ | [106]
      Optimized Soil Adjusted Vegetation Index (OSAVI) | $\dfrac{\text{Near Infrared} - \text{Red}}{\text{Near Infrared} + \text{Red} + 0.16}$ | [109]
      Ratio Vegetation Index (RVI) | $\dfrac{\text{Red}}{\text{Near Infrared}}$ | [104]
      Simple Ratio (SR) | $\dfrac{\text{Near Infrared}}{\text{Red}}$ | [105]
      Temperature Stress Day (TSD) | $T_{\text{stress}} - T_{\text{non-stress}}$ | [168]
      Transformed Vegetation Index (TVI) | $\dfrac{\text{Near Infrared} - \text{Red}}{\text{Near Infrared} + \text{Red}}$ | [107]
      Visible Atmospherically Resistant Index (VARI) | $\dfrac{\text{Green} - \text{Red}}{\text{Green} + \text{Red} - \text{Blue}}$ | [84]

      Visible light imaging has been widely used in turfgrass science research to date[64]. Since the early 2000s, researchers have routinely used RGB digital imagers attached to ground-based, enclosed lighting systems (Fig. 2) to collect phenotypic data for turf plot trials. Percentage ground cover measurements have been used to evaluate important turfgrass characteristics such as establishment rate[65−68] and turf performance during periods of drought[69−73] and traffic[74,75] stress, for example. Turfgrass color measurements, indicated by the dark green color index (Table 1), have been used to monitor turfgrass diseases[76−78] and seasonal turf performance[79,80].

      In recent years, studies have been conducted to assess the potential applications for RGB imagers mounted to aerial platforms. The first study to use a UAS-mounted RGB camera in turfgrass science research found only a 1.5% difference between digital image data and ground survey data when studying turfgrass response 40 d after herbicide application using an unmanned helicopter[81]. More recently, Zhang et al.[82] compared ground- and aerial-based measurements on small plot bermudagrass (Cynodon spp.) and zoysiagrass (Zoysia spp.) research field trials and found that both UAS-based green leaf index and visible atmospherically resistant index, introduced by Louhaichi et al.[83] and Gitelson et al.[84], respectively, adequately predicted ground-based percent green cover ratings. Hong et al.[85] evaluated the ability of UAS-based RGB imagery to detect early drought stress in creeping bentgrass (Agrostis stolonifera L.) and reported that the greenblue color index (Table 1) enabled drought stress detection 5 d before decreases in visual turf quality were observed. These studies offer foundational evidence that RGB digital imagery is an affordable, entry-level plant phenotyping tool, and it is anticipated that additional studies of UAS-based visible light imaging will be reported in the future to further characterize the usefulness and limitations of this technology for turfgrass breeding applications.

      Based on prior research in turfgrasses and other crops, some limitations of UAS-based RGB imagery have been identified. Current concerns include difficulty differentiating various plant stresses, processing datasets when sun and shade irregularities exist within plant canopies at the time of data collection, and distinguishing soil from vegetation in noncontinuous plant canopies. These and other issues are being further studied to search for solutions and enhance the usability of this technology for phenotyping applications. On a positive note, commercial UASs, fully integrated with RGB cameras and mapping mission software, are now available to plant breeders and require minimal technical training to operate compared to earlier platforms.

    • Spectral imaging sensors, also known as imaging spectrophotometers, collect data from the interaction of plants with light intensities that span much of the electromagnetic spectrum[28]. There are several key wavelengths (Fig. 3) along the spectrum that have been extensively studied in prior research. Light reflection from plant leaves is limited within the visible light range, as much of the light is absorbed by leaf pigments, particularly chlorophyll; there is a notably high reflectance at approximately 550 nm in the green region and low reflectance at approximately 450 nm and 680 nm in the blue and red regions, respectively[86]. As wavelengths extend from the red into the near-infrared region (690 nm to 730 nm), there is a marked increase in reflectance due primarily to light scattering within leaf cells[87]. This region has proven useful for assessing various plant characteristics, and because of the drastic increase in reflectance in this region, it is commonly called the 'red edge'[88]. Just beyond this region, there is a water absorbing band at 970 nm that has been used as an indirect assessment of plant leaf water content[89−91]. There are also additional regions of interest as wavelengths progress into the short-wave infrared region (1,000 nm to 2,500 nm). For example, strong water absorbing bands exist at 1,200 nm, 1,450 nm, 1,930 nm, and 2,500 nm, which could potentially be used for remote assessment of leaf water content[92−95].

      Spectral imaging can be further classified into multispectral imaging and hyperspectral imaging (Fig. 5). Multispectral imaging collects discrete light reflectance data from approximately 3 to 10 bands, where the bands are typically broader than those in hyperspectral sensing[96]. These bands are typically well characterized and often assigned descriptive titles. Hyperspectral imaging, on the other hand, collects continuous light reflectance data from tens to thousands of bands. In this case, the bands are much narrower than those in multispectral sensing, and they do not typically have descriptive titles.

      Figure 5. 

      Comparison of multispectral imaging and hyperspectral imaging. Discrete light reflectance data is generated from multispectral sensors whereas continuous light reflectance data is generated from hyperspectral sensors[170].

      Arguably the most noteworthy work to come from plant spectral imaging research to date has been the derivation of various vegetation indices (Table 1), which are calculated based on simple mathematical functions such as differences or ratios between spectral reflectance at two or more spectral bands[97]. Vegetation indices are found to be useful in assessing chlorophyll and biomass production[98], plant stress and health[99101], and nutritional status[102] in plants.
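
      To make this concrete, the sketch below computes two of the indices in Table 1, NDVI and NDRE, from plot-mean band reflectance values. The reflectance numbers are hypothetical and are included only to show the arithmetic; they are not measurements from the studies cited here.

        def ndvi(nir, red):
            """Normalized difference vegetation index (Table 1)."""
            return (nir - red) / (nir + red)

        def ndre(nir, red_edge):
            """Normalized difference red edge index (Table 1)."""
            return (nir - red_edge) / (nir + red_edge)

        # Hypothetical plot-mean reflectance values (fractions between 0 and 1)
        plots = {
            "healthy":  {"red": 0.05, "red_edge": 0.30, "nir": 0.55},
            "stressed": {"red": 0.12, "red_edge": 0.28, "nir": 0.40},
        }
        for name, p in plots.items():
            print(f"{name:9s} NDVI = {ndvi(p['nir'], p['red']):.2f}  "
                  f"NDRE = {ndre(p['nir'], p['red_edge']):.2f}")

      Stressed canopies absorb less red light and scatter less near-infrared light, so both indices decline, which is the behavior breeders exploit when ranking genotypes.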

      Trenholm et al.[103] used a hand-held Cropscan multispectral radiometer to measure turfgrass reflectance at seven wavelengths, which were subsequently used to calculate four vegetation indices as indicators of turfgrass visual quality, shoot density, and shoot tissue injury from traffic wear. This was one of the earlier studies to correlate turfgrass reflectance data with traditional visual qualitative estimates. Fitz–Rodríguez and Choi[97] found that normalized difference vegetation index, ratio vegetation index, and difference vegetation index (Table 1) correlated well with turfgrass visual quality under different irrigation treatments. Developing new and improved vegetation indices has been the focus of research projects for many years, and those types of studies are still actively being conducted at present[104−113]. However, the normalized difference vegetation index, which was first introduced by Rouse et al.[106], has been extensively studied and remains one of the most widely used vegetation indices of plant health across various plant species, including turfgrasses[100,101,114,115].

      Spectral imaging is a promising technology for high-throughput plant phenotyping applications. As mentioned above, this technology is adaptable to ground- or aerial-based platforms and offers the ability to investigate plant interactions with light intensities beyond the visible light range. Many plant responses are more readily detected outside the visible light range; therefore, spectral imaging in the near-infrared and short-wave infrared regions offers insights into many plant behaviors that are not detectable with visible light imaging platforms. Widespread implementation of spectral imaging technologies in plant science research and breeding programs has been slowed by a few difficulties that are being addressed in current research projects. Two of the most notable limitations to spectral imaging are the large quantities of data that are generated and the startup costs associated with purchasing these instruments. However, research and advancements in fields such as computer science and data science are offering solutions to these issues.

    • Infrared thermal imaging, also known as long-wave infrared imaging, thermal long-wave infrared imaging, or forward-looking infrared imaging, measures radiation emitted in the far-infrared and long-wave infrared range of wavelengths, which span from 7,500 nm to 13,000 nm (Fig. 4). Over the last few decades, there has been mounting interest in using infrared thermometers to characterize drought- and heat-induced plant water stress based on the concept that water-stressed canopies have higher temperatures than well-watered canopies[116]. However, in addition to plant physiological status, canopy temperature measurements can also be affected by factors such as surface soil exposure, solar radiation, and air temperature at the time of observation. Indices have been developed to normalize canopy temperature measurements and account for these environmental factors (Table 1).

      Among the indices listed in Table 1, the crop water stress index is one of the most commonly used in studies on turfgrass irrigation scheduling. Jalali-Farahani et al.[117] reported that midday estimates of crop water stress index in bermudagrass were related to the percentage of available extractable soil water. Bijanzadeh et al.[118] monitored the crop water stress index of bermudagrass subjected to deficit irrigation on a monthly basis in southern Iran and concluded that turfgrass quality can be maintained if the seasonal crop water stress index is kept at approximately 0.15. However, one of the challenges is to accurately measure the upper and lower limits of the temperature difference between canopy and air; these baseline values vary across soil and environmental conditions[119] and can be dynamic during the day[120]. A model was developed to predict those baselines in tall fescue from meteorological factors such as air temperature, solar radiation, vapor pressure deficit, and wind speed[121].
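
      The sketch below makes the crop water stress index calculation in Table 1 explicit: the measured canopy-air temperature difference is scaled between a lower (well-watered) and an upper (non-transpiring) baseline. All temperature and baseline values are hypothetical; as noted above, real baselines must be estimated for the local soil and weather conditions.

        def crop_water_stress_index(t_canopy, t_air, lower_baseline, upper_baseline):
            """Crop water stress index (Table 1).

            lower_baseline: canopy-air temperature difference expected for a well-watered canopy.
            upper_baseline: canopy-air temperature difference expected for a non-transpiring canopy.
            Returns values near 0 for well-watered turf and near 1 for severe water stress.
            """
            dt = t_canopy - t_air
            return (dt - lower_baseline) / (upper_baseline - lower_baseline)

        # Hypothetical midday readings in degrees C
        cwsi = crop_water_stress_index(t_canopy=33.5, t_air=32.0,
                                       lower_baseline=-2.0, upper_baseline=5.0)
        print(f"CWSI = {cwsi:.2f}")   # 0.50 for these example values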

      Another limitation regarding the application of canopy temperature is the dynamic nature of the measurement, which becomes highly variable if data collection stretches over too long a period. More valuable information about plant water status can be derived if data collection is completed within a few minutes. Infrared thermal cameras mounted on UASs would provide an option for collecting thermal imagery across turfgrass breeding trials within minutes. Moreover, exposed soil among vegetation could potentially be removed from the analysis if thermal data are combined with RGB and multispectral imagery. Several hurdles need to be overcome to use UAS-based thermal imagery, including temperature calibration, canopy temperature extraction, and establishment of a canopy temperature-based crop water stress indicator[122]. Early exploration of UAS-based thermal imagery to detect early drought stress in creeping bentgrass has been reported[85]. The researchers detected a rise in canopy temperature under 15% and 30% evapotranspiration (ET) replacement before visible turf decline, compared to 100% ET plots. More studies are needed to address these limitations associated with using UAS-based thermal imagers to detect drought stress in turfgrass.

    • Fluorescence is the emitted light generated during the absorption of short wavelength radiation, and in plants, the chlorophyll complex is the most common fluorescing machinery. As chloroplasts are irradiated with actinic or blue light, a portion of the light absorbed by chlorophyll will be reemitted as fluorescence[123]. The proportion of absorbed light that gets reemitted varies due to the plant's light metabolic capacity[124]. This fluorescence is a valuable indication of the plant's ability to assimilate actinic light[125]. Moreover, adding brief pulses of saturating blue light to the actinic light is useful to assess plant status for physiological parameters such as non-photochemical quenching and photo-assimilation[23].

      Fluorescence imaging, also known as chlorophyll fluorescence imaging, is the procedure of capturing images of fluorescence emitted by plants upon illumination with visible or ultraviolet light[126]. This technique commonly uses charge-coupled device cameras that are sensitive to fluorescence signals generated by light-emitting diodes, pulsed flashlights, or pulsed lasers[127]. Fluorescence imaging provides an efficient means for in vivo assessment of the electron transport rate, the extent of non-photochemical quenching, and the effective and potential quantum efficiency of photosystem II[128130]. Many uses of chlorophyll fluorescence imaging have been investigated including early detection of pathogen attack[131135], herbicide injury[136,137], and other abiotic and biotic stress factors[134,138140].
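
      To clarify the parameters referenced above, the sketch below computes the maximum quantum efficiency of photosystem II (Fv/Fm), the effective quantum efficiency in the light (PhiPSII), and non-photochemical quenching (NPQ) from standard dark- and light-adapted fluorescence readings. The numerical readings are hypothetical and are included only to show how these parameters are commonly derived.

        def fv_fm(f0, fm):
            """Maximum (potential) quantum efficiency of PSII from a dark-adapted leaf."""
            return (fm - f0) / fm

        def phi_psii(fs, fm_prime):
            """Effective quantum efficiency of PSII in the light-adapted state."""
            return (fm_prime - fs) / fm_prime

        def npq(fm, fm_prime):
            """Non-photochemical quenching (Stern-Volmer formulation)."""
            return (fm - fm_prime) / fm_prime

        # Hypothetical fluorescence readings (arbitrary units)
        f0, fm = 300.0, 1500.0          # dark-adapted minimal and maximal fluorescence
        fs, fm_prime = 500.0, 900.0     # steady-state and maximal fluorescence under actinic light

        print(f"Fv/Fm   = {fv_fm(f0, fm):.2f}")          # ~0.80 is typical for unstressed leaves
        print(f"PhiPSII = {phi_psii(fs, fm_prime):.2f}")
        print(f"NPQ     = {npq(fm, fm_prime):.2f}")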

      Although fluorescence imaging is a promising technique for assessing plant health status, there are several limitations that have inhibited its implementation for high-throughput plant phenotyping in field settings. Fluorescence imaging requires that plants be dark-adapted prior to light excitation, meaning data collection for each plant sample will take multiple minutes[126]. In addition, currently available fluorescence imaging systems are only capable of measuring fluorescence from single leaves; for high-throughput applications, the technology must be developed to assess multiple plants at once. Another complication is that substantial power sources are needed to operate various light and sensor components of fluorescence imaging systems[141]. For this technology to be applicable for high-throughput plant phenotyping, concerns around robustness, reproducibility, and fluorescence image processing must be addressed.

    • Plant traits related to height and canopy architecture are highly prioritized in breeding goals and can be obtained through three-dimensional reconstruction of plant canopies[28]. LiDAR and ultrasonic sensors are both classified as ranging sensors, meaning they measure the distance to the nearest object by emitting a signal (a laser pulse for LiDAR, a sound pulse for ultrasonic sensors) and calculating the time difference between emission and reception of the reflected signal[142]. For LiDAR, a laser beam is emitted toward the target and the reflected light is analyzed[143]. One advantage of LiDAR is its ability to supply plant structural information with high accuracy compared to other sensors, which can suffer from view obscuration when observing canopies from a nadir perspective. In theory, LiDAR-based plant phenotyping can provide information from the leaf level to the canopy level, potentially helping diagnosis of plant status and crop management[143]. A growing body of literature reports the use of LiDAR-based plant phenotyping in row crops such as maize[144], sorghum [Sorghum bicolor (L.) Moench.][145], soybean [Glycine max (L.) Merr.][146], and cotton[147], focusing on traits including plant height, row spacing, and biomass.

      Given the high cost and limited availability of integrated platforms, LiDAR has been explored less for plant phenotyping than other technologies, and few studies have investigated LiDAR-based phenotyping in turfgrass. Nguyen et al.[148] reported using an unmanned ground vehicle (DairyBioBot) and a LiDAR pipeline for high-throughput phenotyping of biomass in forage perennial ryegrass (Lolium perenne L.), achieving R² = 0.73 at the plot level when correlated with fresh biomass observations. Nonetheless, LiDAR holds promise for individual-plant-level phenotyping as the technology continues to develop and becomes more affordable and integrated into user-friendly platforms.

      Ultrasonic sensors are generally more affordable than LiDAR. Like LiDAR, ultrasonic sensors can be used to estimate geometrical parameters of plants (for instance, plant height and canopy volume) if appropriate acquisition and data processing are applied. Studies have used ultrasonic sensors to estimate plant height in cotton[43], alfalfa (Medicago sativa L.) and bermudagrass[149], and wheat[150]. Yuan et al.[151] compared a LiDAR, an ultrasonic sensor, and a UAS-mounted RGB camera for estimating plant height in wheat and concluded that the LiDAR and the UAS-mounted RGB camera provided the best results. Thus, while ultrasonic sensors are not the strongest option, they provide a ground-level alternative to LiDAR for estimating plant height when target plants are too small for UAS applications.
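
      As a simplified illustration of how ranging data translate to a height estimate, the sketch below computes plot-level canopy height from a cloud of elevation returns as the difference between an upper canopy percentile and the bare-ground elevation. The percentile choice, ground elevation, and simulated returns are hypothetical; they stand in for values that would come from an actual LiDAR or ultrasonic acquisition.

        import numpy as np

        def plot_canopy_height(z_returns, ground_elevation, percentile=95):
            """Estimate canopy height (m) for one plot from ranging-sensor returns.

            z_returns: elevations of returns falling within the plot boundary.
            ground_elevation: bare-ground elevation, e.g. from a pre-planting scan.
            percentile: upper percentile used to suppress noisy outlier returns
                        (hypothetical choice; tune to the sensor and canopy).
            """
            canopy_top = np.percentile(z_returns, percentile)
            return max(canopy_top - ground_elevation, 0.0)

        # Hypothetical returns from a ryegrass plot: ground at 101.20 m, canopy roughly 0.25 m tall
        rng = np.random.default_rng(0)
        z = 101.20 + np.clip(rng.normal(loc=0.22, scale=0.04, size=500), 0.0, None)
        print(f"Estimated canopy height: {plot_canopy_height(z, ground_elevation=101.20):.2f} m")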

    • Plant breeding programs have greatly benefited from recent advancements in DNA genotyping technologies. However, plant phenotype assessment has become the limiting factor in screening large numbers of plants in current plant breeding programs. Advancements in remote and proximal sensing technologies have led to the development and implementation of high-throughput plant phenotyping practices, which are beginning to increase the efficiency of plant phenotyping. Visible light imaging has been the most widely used remote sensing approach; it is a relatively inexpensive phenotyping solution for assessing plant traits such as ground cover and canopy architecture. Future research on visible light imaging for plant phenotyping should emphasize improved analysis approaches that account for shading and light variation and alleviate the difficulties associated with distinguishing soil from plant tissues.

      Spectral imaging technologies have expanded in recent years and are becoming increasingly prevalent in plant science research efforts. This technology is expected to continue to expand into additional plant phenotyping applications. Turfgrass breeders have already begun experimenting with this technology and have found promising results thus far. As the technology advances, the initial costs associated with purchasing equipment are expected to decrease, which will enable more plant breeding programs to utilize this technology. Research efforts should continue in developing improved data handling and processing options to better accommodate the large datasets generated using this imaging technology.

      Thermal imaging and fluorescence imaging are two technologies that are also being adapted to field applications. Although these technologies are not currently suited for in-field breeding applications, researchers are experimenting with these technologies to determine their usefulness in monitoring plant health and growth characteristics. As these technologies continue to be developed, it is anticipated that they will be more readily used in turfgrass breeding applications. Additionally, range sensors such as LiDAR will be further developed for use in assessing morphological characteristics such as leaf texture, leaf width, and plant height for turfgrass breeding programs.

      In addition to the phenotyping tools mentioned in this review, various other technologies are being explored to efficiently assess plant root phenotypes in both controlled environment and field conditions. Programs such as EZ Rhizo[152], IJ Rhizo[153], Root System Analyzer[154], Root Trace[155], Smart Root[156], and WinRHIZO[157] have been widely used for image-based analysis of root architecture. However, these approaches do not offer in situ root analyses, as they require roots to be cleaned of soil. Options for in situ root assessment include the use of mini-rhizotrons equipped with cameras or scanners to periodically gather root architecture data[158]. This approach is not well-suited for high-throughput applications and can only accommodate limited numbers of genotypes[159]. Other promising approaches currently being investigated include non-destructive methods such as magnetic resonance imaging and X-ray computed tomography[160,161]. High-throughput phenotyping tools for characterizing root performance under stresses such as drought, insect feeding, and disease will be valuable resources for plant breeding programs in the future.

      Modern turfgrass breeding programs will continue to research, develop, and implement remote sensing technologies for high-throughput plant phenotyping applications. These technologies will enable turfgrass breeders to assess larger numbers of genotypes to efficiently identify elite germplasm. All together, these efforts will improve cultivar development efficiency and aid plant breeders in developing improved turfgrass cultivars to meet current and future demands of the turfgrass industry.

      • The authors would like to acknowledge the Rutgers Center for Turfgrass Science and the USDA – NIFA Specialty Crop Research Initiative (grant number: 2019-51181-30472) for partial funding of this effort.

      • The authors declare that they have no conflict of interest.

      • Copyright: © 2022 by the author(s). Published by Maximum Academic Press, Fayetteville, GA. This article is an open access article distributed under Creative Commons Attribution License (CC BY 4.0), visit https://creativecommons.org/licenses/by/4.0/.
  • About this article
    Cite this article
    Vines PL, Zhang J. 2022. High-throughput plant phenotyping for improved turfgrass breeding applications. Grass Research 2:1 doi: 10.48130/GR-2022-0001