Maize (Zea mays L.) seedling detection based on the fusion of a modified deep learning model and a novel Lidar points projecting strategy

Gang Wang, Dongyan Huang, Deyi Zhou, Huili Liu, Minghao Qu, Zhongyang Ma

Abstract


Accurate crop detection is a prerequisite for the operation of intelligent agricultural machinery. Image recognition usually lacks accurate orientation information, while different objects are difficult to distinguish in Lidar point clouds. Fortunately, the fusion of images and Lidar points allows the two modalities to complement each other. This research aimed to detect maize (Zea mays L.) seedlings by fusing Lidar data with images. By applying coordinate transformation and time stamps, the images and Lidar points were registered in both the spatial and temporal dimensions. Deep learning was used to develop a maize seedling recognition model, and the model labeled recognized seedlings with bounding boxes. Meanwhile, Lidar points were mapped into the bounding boxes. Only the one-third of points that fell into the middle of each bounding box were selected for clustering, and the calculated center of the cluster provided the spatial information of the target maize seedling. This study modified the classical single shot multi-box detector (SSD) by linking only the last feature map to the final output layer, since higher feature maps have the unique advantage of detecting relatively large objects. In the images, maize seedlings were always the largest objects because they were shot on purpose. This modification enabled the recognition model to process an image in around 60 ms, saving about 10 ms/image compared with the classical SSD model. The experiment was conducted in a maize field during the elongation stage of the maize. Experimental results demonstrated that the standard deviations of the maximum distance error and maximum angle error were 1.4 cm and 1.1°, respectively, which can be tolerated under current technical requirements.
Since agricultural fields involve staple-crop-oriented operations in a changeable ambient environment, the fusion of images and Lidar points can derive more precise information and make agricultural machinery smarter. This study can serve as an upstream technology for other research on intelligent agricultural machinery.
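The point-selection and clustering step described in the abstract can be sketched as follows. The bounding-box layout, the sample points, and the simple centroid used in place of a full clustering algorithm are illustrative assumptions, not the authors' implementation.

```python
# Sketch of the Lidar-point projecting strategy: keep only projected points in
# the central one-third of a detected bounding box, then take their 3D centroid
# as the seedling's position. All names and values here are hypothetical.

def select_central_points(points, box):
    """Keep points whose image-plane u lies in the central third of the box.

    points: list of (u, v, x, y, z) -- pixel position plus Lidar 3D coordinates
    box:    (u_min, v_min, u_max, v_max) bounding box from the detector
    """
    u_min, v_min, u_max, v_max = box
    third = (u_max - u_min) / 3.0
    lo, hi = u_min + third, u_max - third      # central one-third in u
    return [p for p in points
            if lo <= p[0] <= hi and v_min <= p[1] <= v_max]

def cluster_center(points):
    """Centroid of the selected points' 3D coordinates (stand-in for clustering)."""
    n = len(points)
    xs = sum(p[2] for p in points) / n
    ys = sum(p[3] for p in points) / n
    zs = sum(p[4] for p in points) / n
    return xs, ys, zs

# Example: one detector box and a few projected Lidar points
box = (100, 50, 220, 200)                      # (u_min, v_min, u_max, v_max)
points = [
    (110, 100, 1.9, 0.3, 0.4),                 # left third  -> rejected
    (155, 120, 2.0, 0.1, 0.5),                 # central third -> kept
    (165, 130, 2.2, 0.1, 0.6),                 # central third -> kept
    (215, 140, 2.5, 0.2, 0.4),                 # right third -> rejected
]
kept = select_central_points(points, box)
center = cluster_center(kept)                  # estimated 3D seedling position
```

Restricting the selection to the central third is what suppresses points that graze the leaf edges of neighboring plants or the ground at the box margins; a density-based clustering step (e.g. DBSCAN) could replace the centroid if stray points survive the selection.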
Keywords: maize seedling, detection, fusion, deep learning, Lidar
DOI: 10.25165/j.ijabe.20221505.7830

Citation: Wang G, Huang D Y, Zhou D Y, Liu H L, Qu M H, Ma Z Y. Maize (Zea mays L.) seedling detection based on the fusion of a modified deep learning model and a novel Lidar points projecting strategy. Int J Agric & Biol Eng, 2022; 15(5): 172–180.



References


Wang Y, Xu S, Li W, Kang F, Zheng Y. Identification and location of grapevine sucker based on information fusion of 2D laser scanner and machine vision. Int J Agric & Biol Eng, 2017; 10(2): 84–93.

Zhang R Y, Cao S Y. Extending reliability of mmWave radar tracking and detection via fusion with camera. IEEE Access, 2019; 7: 137065–137079.

Gonzalez R C, Woods R E. Digital image processing. 3rd ed. New Jersey: Prentice Hall, 2008; 976p.

Kamilaris A, Prenafeta-Boldu F X. Deep learning in agriculture: A survey. Computers and Electronics in Agriculture, 2018; 147: 70–90.

Lecun Y, Bengio Y, Hinton G. Deep learning. Nature, 2015; 521(7553): 436–444.

Girshick R, Donahue J, Darrell T, Malik J. Rich feature hierarchies for accurate object detection and semantic segmentation. 2014 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Columbus, OH: IEEE, 2014; pp.580–587.

Girshick R. Fast R-CNN. 2015 IEEE International Conference on Computer Vision (ICCV), Santiago: IEEE, 2015; pp.1440–1448.

Ren S, He K, Girshick R, Sun J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2017; 39(6): 1137–1149.

Redmon J, Farhadi A. YOLOv3: An incremental improvement. arXiv: 1804.02767, 2018.

Redmon J, Farhadi A. YOLO9000: Better, faster, stronger. In: 2017 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Honolulu: IEEE, 2017; pp.6517–6525. doi: 10.1109/CVPR.2017.690.

Redmon J, Divvala S, Girshick R, Farhadi A. You only look once: Unified, real-time object detection. In: 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas: IEEE, 2016; pp.779–788. doi: 10.1109/CVPR.2016.91.

Uijlings J R R, Van De Sande K E A, Gevers T, Smeulders A W M. Selective search for object recognition. International Journal of Computer Vision, 2013; 104(2): 154–171.

Zhao H, Li Z, Zhang T. Attention based single shot multibox detector. Journal of Electronics & Information Technology, 2021; 43(7): 2096–2104.

Yandún Narváez F J, Salvo Del Pedregal J, Prieto P A, Torres-Torriti M, Auat Cheein F A. Lidar and thermal images fusion for ground-based 3D characterisation of fruit trees. Biosystems Engineering, 2016; 151: 479–494.

Zhao X, Sun P, Xu Z, Min H, Yu H. Fusion of 3D Lidar and camera data for object detection in autonomous vehicle applications. IEEE Sensors Journal, 2020; 20(9): 4901–4913.

Xue J, Fan B, Yan J, Dong S, Ding Q. Trunk detection based on laser radar and vision data fusion. Int J Agric & Biol Eng, 2018; 11(6): 20–26.

Ji Y H, Li S C, Peng C, Xu H Z, Cao R Y, Zhang M. Obstacle detection and recognition in farmland based on fusion point cloud data. Computers and Electronics in Agriculture, 2021; 189: 106409. doi: 10.1016/j.compag.2021.106409.

Underwood J P, Hung C, Whelan B, Sukkarieh S. Mapping almond orchard canopy volume, flowers, fruit and yield using lidar and vision sensors. Computers and Electronics in Agriculture, 2016; 130: 83–96.

Ma Z, Tao Z, Du X, Yu Y, Wu C. Automatic detection of crop root rows in paddy fields based on straight-line clustering algorithm and supervised learning method. Biosystems Engineering, 2021; 211: 63–76.

Raja R, Nguyen T T, Slaughter D C, Fennimore S A. Real-time robotic weed knife control system for tomato and lettuce based on geometric appearance of plant labels. Biosystems Engineering, 2020; 194: 152–164.

Schirrmann M, Hamdorf A, Garz A, Ustyuzhanin A, Dammer K-H. Estimating wheat biomass by combining image clustering with crop height. Computers and Electronics in Agriculture, 2016; 121: 374–384.

Tharp B E, Kells J J, Bauman T T, Harvey R G, Johnson W G, Loux M M, et al. Assessment of weed control strategies for corn in the north-central united states. Weed Technology, 2004; 18(2): 203–210.

Cordill C, Grift T E. Design and testing of an intra-row mechanical weeding machine for corn. Biosystems Engineering, 2011; 110(3): 247–252.

Raja R, Thuy T N, Vuong V L, Slaughter D C, Fennimore S A. RTD-SEPs: Real-time detection of stem emerging points and classification of crop-weed for robotic weed control in producing tomato. Biosystems Engineering, 2020; 195: 152–171.

Jia H, Gu B, Ma Z, Liu H, Wang G, Li M, et al. Optimized design and experiment of spiral-type intra-row weeding actuator for maize (Zea mays L.) planting. Int J Agric & Biol Eng, 2021; 14(6): 54–60.

National Bureau of Statistics of China. China statistical yearbook. Beijing: China Statistics Press, 2021; 945p. (in Chinese)

Liu W, Anguelov D, Erhan D, Szegedy C, Reed S, Fu C-Y, et al. SSD: Single shot multibox detector. Computer Vision – ECCV 2016, Springer, 2016; 9905: 21–37.

Zhang Z Y. Flexible camera calibration by viewing a plane from unknown orientations. Proceedings of the Seventh IEEE International Conference on Computer Vision (ICCV), Kerkyra: IEEE, 1999; pp.666–673. doi: 10.1109/ICCV.1999.791289.

Budil D E, Lee S, Saxena S, Freed J H. Nonlinear-least-squares analysis of slow-motion EPR spectra in one and two dimensions using a modified Levenberg–Marquardt algorithm. Journal of Magnetic Resonance, Series A, 1996; 120(2): 155–189.

Zhang Y, Pan S, Xie Y, Chen K, Mo J. Detection of ridge in front of agricultural machinery by fusion of camera and millimeter wave radar. Transactions of the CSAE, 2021; 37(15): 169–178. (in Chinese)

Russakovsky O, Deng J, Su H, Krause J, Satheesh S, Ma S, et al. ImageNet large scale visual recognition challenge. International Journal of Computer Vision, 2015; 115(3): 211–252.

Shen C, Zhao X, Fan X, Lian X, Liu Z. Multi-receptive field graph convolutional neural networks for pedestrian detection. IET Intelligent Transport Systems, 2019; 13(9): 1319–1328.

Jia H, Qu M, Wang G, Walsh M J, Yao J, Guo H, et al. Dough-stage maize (Zea mays L.) ear recognition based on multiscale hierarchical features and multifeature fusion. Mathematical Problems in Engineering, 2020; 2020: 9825472. doi: 10.1155/2020/9825472.

Wang G, Jia H, Zhao J, Li C, Wang Y, Guo H. Design of corn high-stubble cutter and experiments of stubble retaining effects. Transactions of the CSAE, 2014; 30: 43–49. (in Chinese)

Jia H, Li S, Wang G, Liu H. Design and experiment of seedling avoidable weeding control device for intertillage maize (Zea mays L.). Transactions of the CSAE, 2018; 34(7): 15–22. (in Chinese)

Fu Y, Tian D, Duan X, Zhou J, Lang P, Lin C, et al. A camera–radar fusion method based on edge computing. In: 2020 IEEE International Conference on Edge Computing (EDGE), Beijing: IEEE, 2020; pp.9–14. doi: 10.1109/EDGE50951.2020.00009.




Copyright (c) 2022 International Journal of Agricultural and Biological Engineering

This work is licensed under a Creative Commons Attribution 4.0 International License.
