A Tactile Method for Rice Plant Recognition Based on Machine Learning
Figure 1: Diagram of the rice plant recognition sensor.
Figure 2: Diagram of the experimental device: (1) master computer program, (2) lifting platform, (3) flexible gasbag with air pressure measurement, (4) Arduino, (5) 24-bit AD data acquisition module (USB DAQ-580I), (6) controller, (7) driver, (8) DC power, and (9) linear guide rail.
Figure 3: Contact process of the gasbag with rice and weeds: (a) contact between the root of the gasbag and rice; (b) contact between the middle of the gasbag and rice; (c) contact between the end of the gasbag and rice; (d) contact between the gasbag and weeds.
Figure 4: Waveforms of tactile voltage signals: (a) contact between the gasbag and weeds; (b) contact between the gasbag end and rice; (c) contact between the middle of the gasbag and rice; (d) contact between the root of the gasbag and rice.
Figure 5: Results of five feature-selection runs using the genetic algorithm.
Figure 6: Training result of the back-propagation (BP) neural network.
Figure 7: Comparison of recognition rates before and after feature selection with the genetic algorithm.
Figure 8: Results of the experiment.
Abstract
1. Introduction
- A novel sensing method for identifying rice plants and weeds was proposed to address the poor performance of vision-based recognition in paddy fields. Unlike previous tactile perception methods based on hand-tuned recognition thresholds, this study built on the features of tactile perception data from rice plants and weeds.
- A flexible tactile sensor was designed. Its cantilever-beam gasbag structure showed good adaptability and barometric sensitivity, which helped capture the structural and mechanical differences between rice plants and weeds and provided a basis for deeper mining of the tactile identification data.
- A classification method for rice plants and weeds was proposed, comprising feature extraction (dimensional, dimensionless, and fractal-dimension parameters), feature selection with a genetic algorithm, and feature classification with a BP neural network. To some extent, this improved the accuracy of rice plant and weed identification.
2. Materials and Methods
2.1. Tactile Signals Acquisition and Processing
2.2. Plant Growth Conditions
2.3. Data Processing
2.3.1. Feature Extraction
Dimensional Parameter
Dimensionless Parameter
Fractal Dimension Feature
2.3.2. Feature Selection
2.3.3. Data Classification
2.4. Experimental Methods
- Case I: Touching rice plants and weeds with the end of the gasbag while moving the sensor.
- Case II: Touching rice plants and weeds with the middle of the gasbag while moving the sensor.
- Case III: Touching rice plants and weeds with the root of the gasbag while moving the sensor.
3. Results
3.1. Comparison of Tactile Signals
3.2. Feature Extraction Results
3.3. Results of the Tested Network Accuracy for Each Group
3.4. Feature Selection and Performance of Classifiers
3.5. Performance of Rice Machine Recognition Sensor
4. Discussion
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
Type | Mean Value | Variance | Standard Deviation | Root Mean Square | Peak-to-Peak Value |
---|---|---|---|---|---|
A | 4.4887085163 | 0.0000010871 | 0.0010409263 | 4.4897086375 | 0.0065800754 |
B | 4.4901966346 | 0.0000011615 | 0.0009045177 | 4.4901967198 | 0.0070144666 |
C | 4.5020162453 | 0.0000130044 | 0.0036061615 | 4.4941781713 | 0.0137633645 |
D | 4.5016362546 | 0.0000125733 | 0.0035458856 | 4.4937001083 | 0.0136222347 |
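The dimensional parameters in the table above are standard time-domain statistics of the sampled tactile voltage signal. As an illustration only (the function and key names below are not from the paper), they might be computed like this:

```python
import numpy as np

def dimensional_features(x):
    """Time-domain dimensional features of a sampled tactile voltage signal."""
    x = np.asarray(x, dtype=float)
    return {
        "mean": x.mean(),                   # mean value
        "variance": x.var(),                # variance
        "std": x.std(),                     # standard deviation
        "rms": np.sqrt(np.mean(x ** 2)),    # root mean square
        "peak_to_peak": x.max() - x.min(),  # peak-to-peak value
    }
```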
Type | Kurtosis | Skewness | Waveform Factor | PULSE FACTOR | Peak Factor | Margin Factor |
---|---|---|---|---|---|---|
A | 2.9811741532 | 0.5894722756 | 1.0000000269 | 0.0014659157 | 0.0014659156 | 0.0014659157 |
B | 4.5065444387 | 0.6108866395 | 1.0000000312 | 0.0015621864 | 0.0015621865 | 0.0015621864 |
C | 1.6274810909 | 0.2772723874 | 1.0000003219 | 0.0030624074 | 0.0030624064 | 0.0030624079 |
D | 1.8685779661 | 0.4941003920 | 1.0000003113 | 0.0030313559 | 0.0030313549 | 0.0030313564 |
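The dimensionless parameters above follow the conventions of rotating-machinery fault diagnosis. Note that the paper's pulse, peak, and margin factors appear to use a normalization different from the textbook definitions, so the sketch below (assuming a non-constant signal) is an illustration of the standard formulas, not a reproduction of the authors' computation:

```python
import numpy as np

def dimensionless_features(x):
    """Standard dimensionless time-domain features (textbook definitions)."""
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()           # assumes sigma > 0
    abs_mean = np.mean(np.abs(x))
    rms = np.sqrt(np.mean(x ** 2))
    peak = np.max(np.abs(x))
    return {
        "kurtosis": np.mean((x - mu) ** 4) / sigma ** 4,
        "skewness": np.mean((x - mu) ** 3) / sigma ** 3,
        "waveform_factor": rms / abs_mean,               # shape factor
        "pulse_factor": peak / abs_mean,                 # impulse factor
        "peak_factor": peak / rms,                       # crest factor
        "margin_factor": peak / np.mean(np.sqrt(np.abs(x))) ** 2,
    }
```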
Type | Box Dimension | Hurst Exponent |
---|---|---|
A | 1.5932511530 | 0.9627329822 |
B | 1.5657002652 | 0.9406033958 |
C | 1.5636990305 | 1.0030099281 |
D | 1.5647215164 | 1.0028526065 |
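The box dimension above can be estimated by covering the normalized signal graph with grids of shrinking cell size and fitting the slope of log N(ε) against log(1/ε). A minimal sketch under that standard box-counting definition (the grid scales are an assumption; the paper's exact procedure is not given here):

```python
import numpy as np

def box_dimension(x, scales=(2, 4, 8, 16, 32)):
    """Estimate the box-counting dimension of a sampled waveform's graph."""
    x = np.asarray(x, dtype=float)
    t = np.linspace(0.0, 1.0, len(x))                  # normalized time axis
    y = (x - x.min()) / (x.max() - x.min() + 1e-12)    # normalized amplitude
    counts, inv_eps = [], []
    for k in scales:
        # count the k x k grid cells that the sampled curve passes through
        cells = {(min(int(ti * k), k - 1), min(int(yi * k), k - 1))
                 for ti, yi in zip(t, y)}
        counts.append(len(cells))
        inv_eps.append(k)
    slope, _ = np.polyfit(np.log(inv_eps), np.log(counts), 1)
    return float(slope)
```

For a straight line the estimate is 1, as expected; rougher signals give values between 1 and 2, matching the range of the table above.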
Type | Training Set | Testing Set 1 | Testing Set 2 | Testing Set 3 |
---|---|---|---|---|
A | 300 | 50 | 57 | 43 |
B | 300 | 50 | 43 | 57 |
C | 300 | 50 | 50 | 50 |
D | 300 | 50 | 50 | 50 |
Total | 1200 | 200 | 200 | 200 |
Type | Testing Set 1 | Testing Set 2 | Testing Set 3 |
---|---|---|---|
Correct number of type A | 40 | 43 | 35 |
Correct number of type B | 28 | 26 | 33 |
Correct number of type C | 40 | 39 | 40 |
Correct number of type D | 39 | 36 | 37 |
Accuracy of weeds | 80% | 75.4% | 81.4% |
Accuracy of rice | 71.3% | 70.6% | 70.1% |
Type | Testing Set 1 | Testing Set 2 | Testing Set 3 |
---|---|---|---|
Correct number of type A | 38 | 45 | 33 |
Correct number of type B | 35 | 31 | 41 |
Correct number of type C | 38 | 39 | 37 |
Correct number of type D | 35 | 34 | 38 |
Accuracy of weeds | 76% | 78.9% | 76.7% |
Accuracy of rice | 72% | 72.7% | 73.9% |
Type | Testing Set 1 | Testing Set 2 | Testing Set 3 |
---|---|---|---|
Correct number of type A | 39 | 43 | 32 |
Correct number of type B | 40 | 33 | 44 |
Correct number of type C | 40 | 41 | 41 |
Correct number of type D | 42 | 43 | 41 |
Accuracy of weeds | 78% | 75.4% | 74.4% |
Accuracy of rice | 81.3% | 81.8% | 80.3% |
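The three tables above report the BP network's per-case accuracy on each testing set. The network's architecture and training settings are not specified in this excerpt; purely as a stand-in, a multilayer perceptron trained by back-propagation on synthetic 13-dimensional feature vectors (the hidden-layer size and class separation below are assumptions) can be sketched with scikit-learn:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# synthetic stand-in for the 13 extracted features of the four contact types
X = rng.normal(size=(400, 13))
y = np.repeat([0, 1, 2, 3], 100)   # types A, B, C, D
X += y[:, None] * 0.8              # shift class means so classes are separable

scaler = StandardScaler().fit(X)
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X), y)
acc = clf.score(scaler.transform(X), y)   # training accuracy on the toy data
```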
Type | Run 1 | Run 2 | Run 3 | Run 4 | Run 5 |
---|---|---|---|---|---|
a | 10 | 20 | 12 | 11 | 2 |
b | 100 | 150 | 249 | 290 | 124 |
c | 2 | 0 | 1 | 1 | 5 |
d | 5 | 1 | 3 | 1 | 5 |
e | 6 | 10 | 14 | 5 | 9 |
f | 40 | 37 | 153 | 11 | 37 |
g | 1 | 2 | 1 | 1 | 0 |
h | 220 | 370 | 349 | 460 | 324 |
i | 1 | 0 | 0 | 2 | 2 |
j | 2 | 1 | 5 | 1 | 2 |
k | 4 | 4 | 2 | 2 | 1 |
l | 33 | 137 | 18 | 20 | 177 |
m | 72 | 54 | 39 | 37 | 39 |
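The table above counts how often each of the 13 candidate features (a–m) was retained across the five genetic-algorithm runs. The paper's GA settings are not reproduced here; as an illustration only, a toy GA that evolves binary feature masks might look like the following (the nearest-centroid fitness function is an assumption, not the authors' choice):

```python
import numpy as np

def ga_select(features, labels, pop=20, gens=30, seed=0):
    """Toy genetic algorithm for feature selection over binary masks."""
    rng = np.random.default_rng(seed)
    n = features.shape[1]
    classes = np.unique(labels)

    def fitness(mask):
        # nearest-centroid classification accuracy on the selected columns
        if not mask.any():
            return 0.0
        sub = features[:, mask]
        cents = np.array([sub[labels == c].mean(axis=0) for c in classes])
        pred = classes[np.argmin(((sub[:, None, :] - cents) ** 2).sum(-1), axis=1)]
        return float((pred == labels).mean())

    popn = rng.random((pop, n)) < 0.5           # random initial binary masks
    for _ in range(gens):
        fit = np.array([fitness(m) for m in popn])
        popn = popn[np.argsort(fit)[::-1]]      # best first (elitism)
        children = []
        for _ in range(pop // 2):
            a, b = popn[rng.integers(0, pop // 2, size=2)]  # parents from fitter half
            cut = int(rng.integers(1, n))
            child = np.concatenate([a[:cut], b[cut:]])      # one-point crossover
            child ^= rng.random(n) < 0.05                   # bit-flip mutation
            children.append(child)
        popn = np.vstack([popn[: pop - len(children)], children])
    return popn[0]   # best mask of the final sorted generation
```

On toy data where only one column carries class information, the evolved mask reliably keeps that column, mirroring how the GA here concentrates on a few informative tactile features.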
Type | Testing Set 1 | Testing Set 2 | Testing Set 3 |
---|---|---|---|
Correct number of type A | 41 | 46 | 35 |
Correct number of type B | 46 | 39 | 52 |
Correct number of type C | 49 | 50 | 49 |
Correct number of type D | 48 | 47 | 48 |
Accuracy of weeds | 82% | 80.7% | 81.4% |
Accuracy of rice | 95.3% | 95.1% | 94.9% |
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, X.; Mao, Y.; Ma, X.; Qi, L. A Tactile Method for Rice Plant Recognition Based on Machine Learning. Sensors 2020, 20, 5135. https://doi.org/10.3390/s20185135