Current status and prospects of the visual detection and positioning technology for intelligent picking of famous tea
DOI: https://doi.org/10.25165/ijabe.v18i6.9245

Keywords: famous tea; detection; position; deep learning; intelligent picking

Abstract
The mechanization of famous tea harvesting is essential to the development of China's tea industry. This paper centers on the detection and positioning technologies used in famous tea harvesting and systematically reviews research progress in both domains. In tea detection, traditional methods rely on color space selection and image segmentation and exhibit limitations such as insufficient accuracy and poor generalization, whereas deep learning algorithms demonstrate superior detection accuracy and robustness; current research focuses on improving detection accuracy, inference speed, and multi-variety recognition. In picking positioning, depth measurement with RGB-D cameras provides foundational support, and positioning methods have evolved from traditional image processing techniques to deep learning and point cloud approaches that seek to overcome challenges such as occlusion and irregular growth patterns. Despite notable technological advances, existing methods face three primary limitations: difficulty adapting to the characteristics of different growth stages, reliance on large-scale annotated datasets, and inadequate handling of occlusion. Future research should concentrate on three directions: developing highly generalizable tea bud detection models, refining model training techniques for small-sample scenarios, and improving the positioning accuracy of tea-picking points under occlusion. This review aims to provide critical references for the development of high-end intelligent tea-picking machinery, thereby facilitating the mechanization and intelligentization of the tea industry.

Citation: Zhou Y J, He L Y, Chen J N, Jia J M, Wu C Y, Li Y T, et al. Current status and prospects of the visual detection and positioning technology for intelligent picking of famous tea. Int J Agric & Biol Eng, 2025; 18(6): 1–11.
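As a concrete illustration of the two stages the review surveys — color-based bud segmentation and RGB-D picking-point localization — the sketch below thresholds pixels in HSV space to isolate tender green regions, takes their centroid as a candidate picking point, and back-projects that pixel to camera coordinates with a pinhole model. This is a minimal sketch, not a method from any of the reviewed papers: the hue/saturation thresholds and the camera intrinsics (`fx`, `fy`, `cx`, `cy`) are illustrative assumptions, and a real system would use a learned detector and a calibrated RGB-D camera.

```python
import colorsys

def segment_green(image, h_range=(0.18, 0.45), s_min=0.25, v_min=0.2):
    """Return (row, col) pixels whose HSV values fall in a 'tender green' band.

    `image` is a nested list of (R, G, B) tuples in 0-255.
    Thresholds are illustrative assumptions, not values from the literature.
    """
    hits = []
    for r, row in enumerate(image):
        for c, (R, G, B) in enumerate(row):
            h, s, v = colorsys.rgb_to_hsv(R / 255, G / 255, B / 255)
            if h_range[0] <= h <= h_range[1] and s >= s_min and v >= v_min:
                hits.append((r, c))
    return hits

def centroid(pixels):
    """Mean (row, col) of the segmented pixels, used as a candidate picking point."""
    n = len(pixels)
    return (sum(p[0] for p in pixels) / n, sum(p[1] for p in pixels) / n)

def deproject(u, v, depth_m, fx, fy, cx, cy):
    """Pinhole back-projection of pixel (u, v) at a given depth (meters)
    to camera-frame XYZ, using assumed intrinsics fx, fy, cx, cy."""
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    return (x, y, depth_m)
```

For example, a pixel at the principal point (`u == cx`, `v == cy`) deprojects to `(0, 0, depth)`, which is one quick sanity check on the intrinsics; the depth value itself would come from the RGB-D camera's aligned depth map.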