CN111553245A - Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion - Google Patents
Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion
- Publication number
- CN111553245A (application CN202010332782.4A)
- Authority
- CN
- China
- Prior art keywords
- remote sensing
- image
- machine learning
- learning algorithm
- sar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/188—Vegetation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
- G06F18/241—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06F18/2411—Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Abstract
The invention relates to the field of ecological environment monitoring and discloses a vegetation classification method based on a machine learning algorithm and multi-source remote sensing data fusion, which efficiently identifies and classifies the vegetation types in a target area. The method first acquires a low-altitude remote sensing image of the terrestrial plants in a sample area with an unmanned aerial vehicle and derives a digital ortho-image and a digital surface model of the sample area from that image; elevation information is then extracted from the digital surface model; an SAR image of the sample area corresponding to the time of the unmanned aerial vehicle aerial photography is acquired by satellite remote sensing; band and image fusion is then performed on the digital ortho-image, the elevation information and the SAR image; inversion model training and inversion model accuracy evaluation are performed on the fused image using measured data of the sample area and a machine learning algorithm to obtain an inversion model that meets the requirements; and finally the terrestrial plants in the target area are classified based on the inversion model. The invention is suitable for ecological environment monitoring of terrestrial plants.
Description
Technical Field
The invention relates to the field of ecological environment monitoring, in particular to a vegetation classification method based on a machine learning algorithm and multi-source remote sensing data fusion.
Background
In the prior art, monitoring methods such as satellite remote sensing inversion from a single data source (multispectral, hyperspectral, lidar or synthetic aperture radar) and field real-time survey suffer from poor applicability and low accuracy. Research on the fusion, classification and quantitative inversion of multi-source remote sensing data is key to improving and enhancing ecological environment monitoring technology.
With the expansion of satellite types in China, now carrying various sensors such as high-resolution, hyperspectral and Synthetic Aperture Radar (SAR) payloads, supplemented and enhanced by low-altitude remote sensing from unmanned aerial vehicles, an all-weather, all-directional, full-time-series space-air-ground integrated monitoring system has taken shape. The present method fuses the unmanned aerial vehicle low-altitude remote sensing image with the satellite Synthetic Aperture Radar (SAR) image, identifies and classifies the vegetation in the research area with a machine learning algorithm, fully exploits the potential of high-resolution unmanned aerial vehicle images and satellite SAR images for vegetation identification and classification, and further extends the application of multi-source remote sensing technology to the ecological monitoring of terrestrial plants. This lays a solid foundation for accurate, quantitative and digital ecological environment monitoring.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a vegetation classification method based on a machine learning algorithm and multi-source remote sensing data fusion that efficiently identifies and classifies the vegetation types in a target area.
In order to solve the above problem, the invention adopts the following technical scheme: a vegetation classification method based on a machine learning algorithm and multi-source remote sensing data fusion, comprising the following steps:
acquiring a low-altitude remote sensing image of the terrestrial plants in a sample area with an unmanned aerial vehicle, and generating a digital ortho-image (DOM) and a Digital Surface Model (DSM) of the sample area from the low-altitude remote sensing image;
extracting elevation information of the sample area from the Digital Surface Model (DSM);
acquiring, by satellite remote sensing, an SAR remote sensing image of the sample area corresponding to the time of the unmanned aerial vehicle aerial photography;
performing band and image fusion on the digital ortho-image DOM, the elevation information and the SAR remote sensing image of the sample area;
performing inversion model training and inversion model accuracy evaluation on the fused image using measured data of the sample area and a machine learning algorithm, to obtain an inversion model that meets the accuracy requirement and is consistent with actual terrestrial plant ecological environment monitoring and survey results;
and identifying and classifying the terrestrial plants in the target area based on the inversion model.
Further, the band and image fusion of the DOM, the elevation information and the SAR remote sensing image of the sample area specifically comprises:
extracting feature factors of the SAR remote sensing image;
and performing band and image fusion using the extracted feature factors, the DOM and the elevation information.
Further, the extracted feature factors include backscattering coefficients extracted from the SAR remote sensing image in the VV and VH polarization modes and texture indexes extracted with a gray-level co-occurrence matrix.
Further, the texture indexes include Mean, Variance, Homogeneity, Contrast, Dissimilarity, Entropy, Second Moment and Correlation.
Furthermore, in order to guarantee the quality of the SAR remote sensing image, before the feature factors are extracted the SAR remote sensing image is preprocessed by radiometric correction, atmospheric correction, orthorectification, geometric fine correction, image mosaicking and image clipping.
Further, the machine learning algorithm comprises a Support Vector Machine (SVM) and an Artificial Neural Network (ANN).
The invention has the following beneficial effects: it combines unmanned aerial vehicle low-altitude remote sensing with satellite SAR remote sensing, and offers strong timeliness, high spatial resolution, a wide monitoring range and comprehensive parameter indexes. The three-dimensional model constructed from unmanned aerial vehicle low-altitude remote sensing can accurately measure indexes such as volume, area and length, and can output data in common formats such as the DEM (digital elevation model), the DOM (digital ortho-image) and the DSM (digital surface model); it is highly operable, with low equipment cost, a high sampling rate, and high accuracy and resolution. At the same time, fusing the satellite SAR remote sensing image makes the feature indexes of the classified image more comprehensive and complete, adding information such as backscattering coefficients and texture indexes.
The two machine learning algorithms, the Support Vector Machine (SVM) and the Artificial Neural Network (ANN), are distinct from traditional remote sensing image classification methods. The Support Vector Machine (SVM) can solve high-dimensional problems (large feature spaces), handle interactions of non-linear features, and cope with machine learning under small samples; it does not need to rely on the whole data set and generalizes well. The Artificial Neural Network (ANN) offers high classification accuracy, strong parallel distributed processing, and strong distributed storage and learning capability; it is robust and fault-tolerant to noise, can closely approximate complex non-linear relations, and has associative-memory capability. Compared with traditional image classification algorithms, both the classification features and the classification accuracy are greatly improved, and the quantitative inversion model can be further tuned and refined as the measured samples are expanded and enriched, so that the quantitative inversion model with the highest classification accuracy and computational efficiency for the research area can be constructed. Using the Support Vector Machine (SVM) and the Artificial Neural Network (ANN) as the classification processing methods for terrestrial plant ecological environment monitoring therefore has obvious advantages.
Based on the fusion of unmanned aerial vehicle low-altitude remote sensing and satellite SAR remote sensing data and on the Support Vector Machine (SVM) and Artificial Neural Network (ANN) classification algorithms, the invention greatly improves the ability to acquire, analyse, calculate and process different vegetation monitoring index data accurately, efficiently and quantitatively in terrestrial plant ecological environment monitoring work, and fundamentally changes how monitoring results are displayed. Generally speaking, applying this terrestrial plant ecological monitoring method based on unmanned aerial vehicle low-altitude remote sensing, satellite SAR remote sensing data fusion, the Support Vector Machine (SVM) and the Artificial Neural Network (ANN) fills the technical gap that no conventional method or means exists for quantitatively monitoring such contents in ecological environment monitoring work, and greatly improves the degree of automation of field monitoring.
Drawings
FIG. 1 is a flow chart of an embodiment.
Detailed Description
The invention provides a terrestrial plant ecological monitoring method based on unmanned aerial vehicle low-altitude remote sensing, satellite SAR remote sensing data fusion and machine learning algorithms (SVM, ANN), addressing the poor applicability and low accuracy of the monitoring methods adopted in the prior art, such as satellite remote sensing inversion from a single data source (multispectral, hyperspectral, lidar or synthetic aperture radar) and field real-time survey.
Firstly, a ground vegetation type survey is carried out according to the extent of the sample area, the vegetation distribution characteristics and the required classification accuracy of the results; a plot layout scheme is designed, and parameters such as the type, position, quantity and height of the typical terrestrial vegetation at each survey point are measured and recorded. Vegetation types are recorded down to species, the basic unit of biological classification; position coordinates are measured with RTK to centimetre accuracy; vegetation height is measured with a laser range finder/laser altimeter to decimetre accuracy. Vegetation quantity is recorded for the arbors, shrubs and herbs visible in the unmanned aerial vehicle and satellite SAR images: for arbors, the number of trees with a diameter at breast height greater than 5 cm; for shrubs, the number of individual plants or clusters; for herbs, the coverage area. Records are kept per vegetation type, with no fewer than 50 survey units per type. Meanwhile, the ground survey points must be distributed uniformly across the research area in both position and vegetation type, and the area involved or covered by the points must exceed at least 20% of the research area.
Secondly, a multi-rotor or fixed-wing unmanned aerial vehicle carrying a visible-light camera photographs the sample area obliquely (multi-rotor) or vertically (fixed-wing) according to the designed flight strips, flight altitude and number of sorties, while control points uniformly distributed at different elevations are collected within the research area. It should be noted that either a multi-rotor or a fixed-wing unmanned aerial vehicle can achieve the target three-dimensional modelling accuracy; the choice depends on the working cost, the schedule and the size of the research area.
Thirdly, the number and distribution of the control points are designed in advance, and the control points are used to calibrate the unmanned aerial vehicle low-altitude remote sensing images. After the control point measurement, the system performs coordinate system registration, overall block adjustment of the area, and dense multi-view image matching on the original photos (RGB images) acquired over the sample area at different angles and elevations.
Fourthly, the system extracts dense point cloud data from the matched images, with the point cloud density determined by the required three-dimensional modelling accuracy and scale. A TIN triangulation network is generated from the dense point cloud data, photo (RGB image) texture mapping is applied, and a large-scale (1:500 or 1:1000) digital ortho-image (DOM) and Digital Surface Model (DSM) are generated. If the DOM or DSM shows artefacts such as floating debris or texture smearing, the three-dimensional model is re-edited and rebuilt to guarantee the accuracy of the three-dimensional model of the monitoring area.
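The point cloud gridding and surface generation described above are normally handled by the photogrammetry software itself. Purely as an illustration of the idea, the following is a minimal Python sketch (all names are hypothetical) that rasterises a dense point cloud into a DSM by keeping the highest elevation in each grid cell.

```python
import numpy as np

def points_to_dsm(points, cell_size):
    """Grid a dense point cloud (N x 3 array of x, y, z) into a DSM raster
    by keeping the maximum elevation that falls in each cell."""
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    cols = ((x - x.min()) / cell_size).astype(int)
    rows = ((y.max() - y) / cell_size).astype(int)   # row 0 at the northern edge
    dsm = np.full((rows.max() + 1, cols.max() + 1), np.nan)
    for r, c, h in zip(rows, cols, z):
        if np.isnan(dsm[r, c]) or h > dsm[r, c]:
            dsm[r, c] = h
    return dsm
```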
Fifthly, the elevation information of the research area is extracted from the generated, verified DSM and used as an index for image classification.
Sixthly, the system obtains, by satellite remote sensing, the SAR remote sensing image of the sample area corresponding to the aerial photography time of the multi-rotor or fixed-wing unmanned aerial vehicle, ensuring that the average cloud amount is less than 20% and the imaging quality is good. The SAR remote sensing image is preprocessed by radiometric correction, atmospheric correction, orthorectification, geometric fine correction, image mosaicking and image clipping.
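The exact preprocessing chain depends on the SAR product and the software used and is not specified further here. As a small illustration of one common step, the calibrated backscatter intensity (sigma naught) is usually converted to decibels before texture and classification features are computed; a minimal sketch, assuming the calibrated linear intensity is already available as a NumPy array:

```python
import numpy as np

def sigma0_to_db(sigma0_linear, floor=1e-6):
    """Convert calibrated linear backscatter (sigma naught) to decibels.
    A small floor avoids log(0) over radar shadow or no-data pixels."""
    return 10.0 * np.log10(np.maximum(sigma0_linear, floor))
```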
Seventhly, feature factors are extracted from the preprocessed SAR remote sensing image, comprising the backscattering coefficients extracted in the VV and VH polarization modes and 8 texture indexes extracted with a gray-level co-occurrence matrix: Mean, Variance, Homogeneity, Contrast, Dissimilarity, Entropy, Second Moment and Correlation.
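The patent does not name an implementation for the gray-level co-occurrence matrix. A minimal sketch using scikit-image is shown below, computing the 8 texture indexes for a single quantised image window; mean, variance and entropy are derived directly from the normalised GLCM, since older scikit-image releases only expose the remaining properties (and spell the functions greycomatrix/greycoprops before version 0.19).

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(window, levels=32):
    """Compute the 8 GLCM texture indexes for one image window.
    `window` is a 2-D integer array quantised to gray levels in [0, levels)."""
    glcm = graycomatrix(window, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    p = glcm[:, :, 0, 0]                      # normalised co-occurrence probabilities
    i, _ = np.indices(p.shape)
    mean = np.sum(i * p)
    variance = np.sum((i - mean) ** 2 * p)
    entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return {
        "mean": mean,
        "variance": variance,
        "homogeneity": graycoprops(glcm, "homogeneity")[0, 0],
        "contrast": graycoprops(glcm, "contrast")[0, 0],
        "dissimilarity": graycoprops(glcm, "dissimilarity")[0, 0],
        "entropy": entropy,
        "second_moment": graycoprops(glcm, "ASM")[0, 0],
        "correlation": graycoprops(glcm, "correlation")[0, 0],
    }
```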
Eighthly, the system performs band and image fusion on the backscattering coefficients, the 8 texture indexes, the digital ortho-image DOM and the elevation information of the research area. During fusion it must be ensured that the information of each band is neither lost nor altered and that the values of the bands within the same pixel correspond one to one.
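A minimal sketch of this band stacking, assuming all layers have already been resampled and co-registered to the same pixel grid (the names and layer order here are illustrative, not prescribed by the patent):

```python
import numpy as np

def stack_bands(dom_rgb, elevation, sigma0_vv, sigma0_vh, textures):
    """Stack co-registered layers into one multi-band feature image.
    dom_rgb: (rows, cols, 3) DOM bands; elevation, sigma0_vv, sigma0_vh: (rows, cols);
    textures: (8, rows, cols) GLCM indexes. All must share the same pixel grid."""
    layers = [dom_rgb[..., b] for b in range(dom_rgb.shape[-1])]
    layers += [elevation, sigma0_vv, sigma0_vh]
    layers += list(textures)
    ref = layers[0].shape
    assert all(l.shape == ref for l in layers), "bands must be pixel-aligned one to one"
    return np.stack(layers, axis=0)           # shape: (n_bands, rows, cols)
```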
Ninthly, according to the requirements and principles of actual terrestrial plant ecological environment monitoring and the vegetation classes to be monitored (herbaceous plants, shrubs and arbors, with the arbors further divided into several typical tree species), the system combines the spectral information, elevation data, backscattering coefficients and 8 texture indexes, selects sample point data in at least seven proportions such as 50%, 55%, 60%, 65%, 70%, 75% and 80%, and trains quantitative inversion models with a Support Vector Machine (SVM) and an Artificial Neural Network (ANN); other proportions may be added or removed as appropriate, and the method is not limited to these seven. Correspondingly, at least 50%, 45%, 40%, 35%, 30%, 25% and 20% of the sample point data are selected to test the quantitative inversion models. The models obtained with the two methods, the different training and test samples and the tuned classification parameter indexes are then ranked by prediction accuracy.
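The patent does not fix the SVM kernel or the network topology. The sketch below, using scikit-learn, simply loops over the listed training proportions and fits both classifiers on the per-pixel feature vectors; the hyperparameters shown are assumptions.

```python
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier

def train_over_proportions(X, y, proportions=(0.5, 0.55, 0.6, 0.65, 0.7, 0.75, 0.8)):
    """Fit an SVM and an ANN at each training proportion; keep the held-out
    split so the models can be evaluated afterwards."""
    runs = []
    for p in proportions:
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, train_size=p, stratify=y, random_state=0)
        svm = make_pipeline(StandardScaler(), SVC(kernel="rbf", gamma="scale"))
        ann = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(64, 32),
                                          max_iter=2000, random_state=0))
        runs.append((p, svm.fit(X_tr, y_tr), ann.fit(X_tr, y_tr), X_te, y_te))
    return runs
```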
Tenthly, the accuracy of the quantitative inversion models obtained with the Support Vector Machine (SVM) and the Artificial Neural Network (ANN) for the different numbers of training and test samples is calculated with a confusion matrix, the classification accuracy of each model is evaluated with the overall classification accuracy and the Kappa coefficient, and the quantitative inversion model with the highest classification accuracy that is consistent with actual terrestrial plant ecological environment monitoring and survey results is selected.
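Correspondingly, a minimal evaluation sketch with scikit-learn (confusion matrix, overall accuracy and Kappa coefficient); the selected model is then simply the run with the highest overall accuracy or Kappa:

```python
from sklearn.metrics import accuracy_score, cohen_kappa_score, confusion_matrix

def evaluate(model, X_test, y_test):
    """Confusion matrix, overall classification accuracy and Kappa for one model."""
    y_pred = model.predict(X_test)
    return {
        "confusion_matrix": confusion_matrix(y_test, y_pred),
        "overall_accuracy": accuracy_score(y_test, y_pred),
        "kappa": cohen_kappa_score(y_test, y_pred),
    }
```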
Eleventhly, the vegetation types monitored in the terrestrial plant ecological environment of the target area are classified into the specified classes according to the inversion model obtained in the tenth step, and the terrestrial plant information of the target area is extracted, including the average height, area and spatial distribution of each vegetation type. This classification method combining unmanned aerial vehicle low-altitude remote sensing, satellite SAR images and machine learning algorithms (SVM and ANN) offers strong operability, strong capacity for learning and improvement, low equipment cost, rich parameter indexes, a high sampling rate, and high accuracy and resolution, and therefore has obvious advantages as a monitoring and classification method for the terrestrial plant ecological environment. Generally speaking, applying this terrestrial plant ecological monitoring method based on unmanned aerial vehicle low-altitude remote sensing, satellite SAR images and machine learning algorithms (SVM and ANN) fills the technical gap that no conventional method or means exists for quantitatively monitoring vegetation in ecological environment monitoring work, and greatly improves the degree of automation of field monitoring.
Examples
The method of the present invention is further described with reference to the following drawings and examples, which are only for the purpose of helping the reader to better understand the method of the present invention, and are not intended to limit the scope of the claims of the present invention.
The method based on multi-source remote sensing data fusion, object-oriented classification and the like greatly improves the capability of accurately, efficiently and quantitatively acquiring, analyzing, calculating and processing different vegetation monitoring index data in the ecological environment monitoring work of terrestrial plants, and also revolutionarily changes the display effect and mode of monitoring results. The method fills the technical blank that no conventional monitoring method and means for quantitative monitoring content of vegetation exist in the ecological environment monitoring work, greatly improves the automation degree of field monitoring work, and lays a solid foundation for the ecological environment monitoring informatization of terrestrial plants.
As shown in FIG. 1, in this example an ecologically sensitive area of 5000 m by 5000 m is first selected as the monitoring target area. Field measurement of ground vegetation types is carried out according to the research area and the required classification accuracy, and the type, position, quantity, height and other information of the typical terrestrial vegetation at each point are investigated, measured and recorded. The ground vegetation survey must cover an area of at least 5 km², with survey locations and tree species types distributed uniformly within the research area.
Secondly, the topography, vegetation coverage, water system distribution and other conditions of the monitoring target area are studied, and the flight strips, number of sorties and flight altitude of the unmanned aerial vehicle aerial survey are planned in advance in combination with the requirements on three-dimensional model scale and result accuracy. A multi-rotor or fixed-wing unmanned aerial vehicle carrying a visible-light camera photographs the target area obliquely or vertically. At the same time, control point measurement is carried out with the points distributed uniformly and at different elevations, to guarantee the accuracy of the three-dimensional modelling after the images are obtained. The image acquisition tool can be an airborne single-lens reflex or digital camera whose resolution matches the flying height, output scale and accuracy; any image acquisition equipment that meets the mounting and image-processing requirements can be used.
2.1: The control point measurement requires 100 control point markers of 2 m size, placed uniformly across the monitoring area and at different elevations (more markers are needed where the terrain undulates strongly and where the elevation changes in steps). The absolute coordinates of the centre of each of the 100 control points are measured and recorded with RTK. The monitoring area is then surveyed by low-altitude unmanned aerial vehicle flight to obtain the initial image photos.
2.2: The aerial survey parameters for the monitoring area are: flying height within 200 m for the multi-rotor and 800-1000 m for the fixed wing; 12 flight strips for the multi-rotor and 6 for the fixed wing; ground resolution within 0.3 m for the multi-rotor and within 1 m for the fixed wing; scale 1:500 for the multi-rotor and 1:1000 for the fixed wing; shooting angle 30-90° for the multi-rotor and 90° for the fixed wing; forward (along-strip) overlap of at least 60% and side overlap of at least 30%.
Thirdly, the initial image photos are processed with oblique photography three-dimensional modelling software: coordinate system registration is performed using the measured ground control points, followed by overall block adjustment of the area and dense multi-view image matching.
Fourthly, a dense point cloud is generated, a TIN triangulation network is built from it, and a 1:500 or 1:1000 large-scale digital ortho-image (DOM) and Digital Surface Model (DSM) are generated through texture mapping. Elevation information is then extracted from the Digital Surface Model (DSM) as an index for image classification.
Fifthly, satellite SAR data consistent with the unmanned aerial vehicle aerial photography time are obtained, with an average cloud amount of less than 20% and good imaging quality, and the ground resolution at least meets the following requirements (which can be exceeded): interferometric wide-swath mode 5 m × 20 m; wave mode 5 m × 5 m; stripmap mode 5 m × 5 m; extra-wide swath mode 20 m × 40 m. The data are then preprocessed by radiometric correction, atmospheric correction, orthorectification, geometric fine correction, image mosaicking and image clipping.
And sixthly, extracting characteristic factors of the preprocessed SAR remote sensing image by the system, wherein the characteristic factors comprise backscattering coefficients extracted in a VV polarization mode and a VH polarization mode and 8 texture indexes extracted by utilizing a gray level co-occurrence matrix.
Seventhly, the system fuses the backscattering coefficients and 8 texture indexes (Mean, Variance, Homogeneity, Contrast, Dissimilarity, Entropy, Second Moment and Correlation) extracted from the satellite SAR remote sensing image with the digital ortho-image DOM generated by unmanned aerial vehicle low-altitude remote sensing three-dimensional modelling and the elevation information extracted from the Digital Surface Model (DSM). The fused image carries high spatial resolution, ground object height, backscattering coefficients and the 8 texture indexes.
Eighthly, according to the requirements and principles of actual terrestrial plant ecological environment monitoring and the classification of herbs, shrubs and arbors (including tree species 1, 2, 3 and 4), the system combines the spectral information, elevation data, backscattering coefficients and 8 texture indexes, selects sample point data in seven proportions such as 50%, 55%, 60%, 65%, 70%, 75% and 80%, and trains quantitative inversion models with a Support Vector Machine (SVM) and an Artificial Neural Network (ANN); other proportions may be added or removed as appropriate, and the method is not limited to these seven. Correspondingly, 50%, 45%, 40%, 35%, 30%, 25% and 20% of the sample point data are selected to test the quantitative inversion models. The training model accuracy and prediction model accuracy are obtained by tuning the different training samples, test samples and classification parameter indexes.
Table 1: Accuracy achieved with different training and test sample proportions.
Ninthly, the system uses a confusion matrix to calculate the accuracy of the quantitative inversion models of the Support Vector Machine (SVM) and the Artificial Neural Network (ANN) for the different numbers of training and test samples, evaluates the classification accuracy of each model with the overall classification accuracy and the Kappa coefficient, and selects the quantitative inversion model with the highest classification accuracy that is consistent with actual terrestrial plant ecological environment monitoring and survey results. The SVM model trained on 70% of the samples and tested on 30% achieved an overall classification accuracy of 79.01% and a Kappa coefficient of 0.769. In this example the SVM is more accurate, but in other cases the ANN may be; the two algorithms are complementary, each with its own strengths.
Finally, the vegetation types monitored in the terrestrial plant ecological environment of the target area are classified into the specified classes according to the inversion model. A second-phase image of the same target area is processed in the same way to obtain the average height, area and spatial distribution of each vegetation type in the second phase; the classification results of the two phases are overlaid and compared, and the extent and area of dynamic change of the different vegetation types are calculated. The overall classification accuracy of the second-phase image is 81.23% with a Kappa coefficient of 0.784, which meets the accuracy requirement. The specific two-phase change results are shown in Table 2:
Table 2: Dynamic change of terrestrial vegetation in the target area between the two phases (unit: hm²).
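A minimal sketch of the two-phase overlay comparison, assuming the two classification results are co-registered rasters of class labels with a known pixel size (the names are illustrative):

```python
import numpy as np

def change_area_matrix(class_t1, class_t2, pixel_area_m2, classes):
    """Cross-tabulate two co-registered classification rasters and return the
    from/to change-area matrix in hm2 (1 hm2 = 10 000 m2)."""
    n = len(classes)
    areas = np.zeros((n, n))
    for i, c1 in enumerate(classes):
        for j, c2 in enumerate(classes):
            areas[i, j] = (np.count_nonzero((class_t1 == c1) & (class_t2 == c2))
                           * pixel_area_m2 / 1e4)
    return areas   # row: phase-1 class, column: phase-2 class
```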
The classification method combining unmanned aerial vehicle low-altitude remote sensing, satellite SAR remote sensing data fusion and machine learning algorithms (SVM, ANN) is a comprehensive artificial-intelligence image processing method with strong operability, low cost and high accuracy. Point cloud data of the target area can be extracted from the photographs of the real scene, from which a TIN triangulation network, a DEM/DOM and other data are generated, achieving rapid three-dimensional model reconstruction. At the same time, fusing the satellite SAR remote sensing image makes the feature indexes of the classified image more comprehensive and complete, including spectral information, elevation data, backscattering coefficients and texture indexes. The machine learning algorithms (SVM and ANN) enable intelligent classification of the target images with high classification accuracy, so the changes of the different vegetation types in the monitored target area over different periods can be acquired completely and accurately. This new monitoring method offers a high sampling rate, high accuracy and resolution, and non-contact measurement. Therefore, the terrestrial plant ecological monitoring method based on unmanned aerial vehicle low-altitude remote sensing, satellite SAR remote sensing data fusion and machine learning (SVM, ANN) classification has good application prospects in ecological environment monitoring; it enables quantitative, efficient and digital processing, analysis and display of dynamic terrestrial plant monitoring, greatly improves the automation and informatization of terrestrial plant ecological environment monitoring work, and provides more detailed data for terrestrial plant surveys, the design of terrestrial plant environmental protection measures, and related work.
Claims (6)
1. A vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion is characterized by comprising the following steps:
acquiring a low-altitude remote sensing image of the terrestrial plants in a sample area with an unmanned aerial vehicle, and generating a digital ortho-image (DOM) and a Digital Surface Model (DSM) of the sample area from the low-altitude remote sensing image;
extracting elevation information of the sample area from the Digital Surface Model (DSM);
acquiring, by satellite remote sensing, an SAR remote sensing image of the sample area corresponding to the time of the unmanned aerial vehicle aerial photography;
performing band and image fusion on the digital ortho-image DOM, the elevation information and the SAR remote sensing image of the sample area;
performing inversion model training and inversion model accuracy evaluation on the fused image using measured data of the sample area and a machine learning algorithm, to obtain an inversion model that meets the accuracy requirement and is consistent with actual terrestrial plant ecological environment monitoring and survey results;
and classifying the terrestrial plants in the target area based on the inversion model.
2. The vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion of claim 1, wherein the band and image fusion of the DOM, the elevation information and the SAR remote sensing image of the sample area comprises:
extracting feature factors of the SAR remote sensing image;
and performing band and image fusion using the extracted feature factors, the DOM and the elevation information.
3. The vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion of claim 2, characterized in that the extracted feature factors comprise backscattering coefficients extracted from the SAR remote sensing image in the VV and VH polarization modes and texture indexes extracted with a gray-level co-occurrence matrix.
4. The method of claim 3, wherein the texture indexes comprise mean, variance, homogeneity, contrast, dissimilarity, entropy, second moment, and correlation.
5. The vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion of claim 2, characterized in that before the feature factors of the SAR remote sensing image are extracted, the SAR remote sensing image is preprocessed by radiometric correction, atmospheric correction, orthorectification, geometric fine correction, image mosaicking and image clipping.
6. The method for vegetation classification based on machine learning algorithm and multi-source remote sensing data fusion of claim 1, wherein the machine learning algorithm comprises Support Vector Machine (SVM) and Artificial Neural Network (ANN).
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010332782.4A CN111553245A (en) | 2020-04-24 | 2020-04-24 | Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010332782.4A CN111553245A (en) | 2020-04-24 | 2020-04-24 | Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111553245A true CN111553245A (en) | 2020-08-18 |
Family
ID=72002536
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010332782.4A Pending CN111553245A (en) | 2020-04-24 | 2020-04-24 | Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111553245A (en) |
Cited By (20)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112051226A (en) * | 2020-09-03 | 2020-12-08 | 山东省科学院海洋仪器仪表研究所 | Method for estimating total suspended matter concentration of offshore area based on unmanned aerial vehicle-mounted hyperspectral image |
CN112287892A (en) * | 2020-11-23 | 2021-01-29 | 中国电建集团成都勘测设计研究院有限公司 | Arbor biomass measurement and calculation method based on unmanned aerial vehicle hyperspectral and machine learning algorithm |
CN112330582A (en) * | 2020-12-24 | 2021-02-05 | 黑龙江省网络空间研究中心 | Unmanned aerial vehicle image and satellite remote sensing image fusion algorithm |
CN112329649A (en) * | 2020-11-09 | 2021-02-05 | 上海圣之尧智能科技有限公司 | Urban vegetation type identification method, system, equipment and medium |
CN112418075A (en) * | 2020-11-20 | 2021-02-26 | 北京艾尔思时代科技有限公司 | Corn lodging region detection method and system based on canopy height model |
CN112462756A (en) * | 2020-10-29 | 2021-03-09 | 久瓴(上海)智能科技有限公司 | Agriculture and forestry operation task generation method and device, computer equipment and storage medium |
CN112906645A (en) * | 2021-03-15 | 2021-06-04 | 山东科技大学 | Sea ice target extraction method with SAR data and multispectral data fused |
CN113009481A (en) * | 2021-01-15 | 2021-06-22 | 扬州哈工科创机器人研究院有限公司 | Forest surface feature imaging inversion method based on interferometric SAR radar |
CN113033714A (en) * | 2021-05-24 | 2021-06-25 | 华中师范大学 | Object-oriented automatic machine learning method and system for multi-mode multi-granularity remote sensing image |
CN113091599A (en) * | 2021-04-06 | 2021-07-09 | 中国矿业大学 | Surface three-dimensional deformation extraction method fusing unmanned aerial vehicle DOM and satellite-borne SAR images |
CN113269028A (en) * | 2021-04-07 | 2021-08-17 | 南方科技大学 | Water body change detection method and system based on deep convolutional neural network |
CN113705990A (en) * | 2021-08-17 | 2021-11-26 | 内蒙古申科国土技术有限责任公司 | Natural resource information processing method and device, electronic equipment and storage medium |
CN113885060A (en) * | 2021-09-26 | 2022-01-04 | 中国农业科学院草原研究所 | Grazing intensity monitoring method based on unmanned aerial vehicle remote sensing technology |
CN114548812A (en) * | 2022-03-01 | 2022-05-27 | 武汉鸟瞰天下科技有限公司 | Forest and grassland fire risk prediction decision method, device and computer equipment |
CN115830442A (en) * | 2022-11-11 | 2023-03-21 | 中国科学院空天信息创新研究院 | Machine learning-based remote sensing estimation method and system for wheat tiller density |
CN115965812A (en) * | 2022-12-13 | 2023-04-14 | 桂林理工大学 | Evaluation method for wetland vegetation species and ground feature classification by unmanned aerial vehicle image |
CN116029430A (en) * | 2022-12-27 | 2023-04-28 | 江苏师范大学科文学院 | Grassland ecological environment monitoring system based on aerial image |
CN116385883A (en) * | 2023-04-13 | 2023-07-04 | 珠江水利委员会珠江水利科学研究院 | Unmanned plane mountain shadow area vegetation coverage correction method, equipment and medium |
CN116399820A (en) * | 2023-06-07 | 2023-07-07 | 中国科学院空天信息创新研究院 | Method, device, equipment and medium for verifying authenticity of vegetation remote sensing product |
CN118366059A (en) * | 2024-06-20 | 2024-07-19 | 山东锋士信息技术有限公司 | Crop water demand calculating method based on optical and SAR data fusion |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108960300A (en) * | 2018-06-20 | 2018-12-07 | 北京工业大学 | A kind of urban land use information analysis method based on deep neural network |
CN109684929A (en) * | 2018-11-23 | 2019-04-26 | 中国电建集团成都勘测设计研究院有限公司 | Terrestrial plant ECOLOGICAL ENVIRONMENTAL MONITORING method based on multi-sources RS data fusion |
CN110427592A (en) * | 2019-06-17 | 2019-11-08 | 成都理工大学 | A kind of vegetation ecological water reserve evaluation method of optical remote sensing image collaboration polarization SAR image |
- 2020-04-24 CN CN202010332782.4A patent/CN111553245A/en active Pending
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108960300A (en) * | 2018-06-20 | 2018-12-07 | 北京工业大学 | A kind of urban land use information analysis method based on deep neural network |
CN109684929A (en) * | 2018-11-23 | 2019-04-26 | 中国电建集团成都勘测设计研究院有限公司 | Terrestrial plant ECOLOGICAL ENVIRONMENTAL MONITORING method based on multi-sources RS data fusion |
CN110427592A (en) * | 2019-06-17 | 2019-11-08 | 成都理工大学 | A kind of vegetation ecological water reserve evaluation method of optical remote sensing image collaboration polarization SAR image |
Non-Patent Citations (4)
Title |
---|
- Hu Deyong et al.: "Principles and Methods of Remote Sensing Image Processing", 30 November 2014 *
- Jia Kun et al.: "Improving the Accuracy of Crop Spectral Classification with Microwave Backscattering Data", Spectroscopy and Spectral Analysis *
- Chen Weili et al.: "Land Cover Classification by Fusing Multispectral Images and SAR Images Based on SVM", Journal of Anhui Agricultural Sciences *
- Wei Mengying et al.: "Forest Vegetation Distribution in the Nyingchi Area Based on Remote Sensing", Forest Science and Technology *
Cited By (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112051226B (en) * | 2020-09-03 | 2022-10-21 | 山东省科学院海洋仪器仪表研究所 | Method for estimating total suspended matter concentration of offshore area based on unmanned aerial vehicle-mounted hyperspectral image |
CN112051226A (en) * | 2020-09-03 | 2020-12-08 | 山东省科学院海洋仪器仪表研究所 | Method for estimating total suspended matter concentration of offshore area based on unmanned aerial vehicle-mounted hyperspectral image |
CN112462756A (en) * | 2020-10-29 | 2021-03-09 | 久瓴(上海)智能科技有限公司 | Agriculture and forestry operation task generation method and device, computer equipment and storage medium |
CN112462756B (en) * | 2020-10-29 | 2022-11-25 | 久瓴(上海)智能科技有限公司 | Agriculture and forestry operation task generation method and device, computer equipment and storage medium |
CN112329649A (en) * | 2020-11-09 | 2021-02-05 | 上海圣之尧智能科技有限公司 | Urban vegetation type identification method, system, equipment and medium |
CN112418075A (en) * | 2020-11-20 | 2021-02-26 | 北京艾尔思时代科技有限公司 | Corn lodging region detection method and system based on canopy height model |
CN112287892B (en) * | 2020-11-23 | 2022-12-06 | 中国电建集团成都勘测设计研究院有限公司 | Arbor biomass measurement and calculation method based on unmanned aerial vehicle hyperspectral and machine learning algorithm |
CN112287892A (en) * | 2020-11-23 | 2021-01-29 | 中国电建集团成都勘测设计研究院有限公司 | Arbor biomass measurement and calculation method based on unmanned aerial vehicle hyperspectral and machine learning algorithm |
CN112330582A (en) * | 2020-12-24 | 2021-02-05 | 黑龙江省网络空间研究中心 | Unmanned aerial vehicle image and satellite remote sensing image fusion algorithm |
CN113009481A (en) * | 2021-01-15 | 2021-06-22 | 扬州哈工科创机器人研究院有限公司 | Forest surface feature imaging inversion method based on interferometric SAR radar |
CN112906645A (en) * | 2021-03-15 | 2021-06-04 | 山东科技大学 | Sea ice target extraction method with SAR data and multispectral data fused |
CN112906645B (en) * | 2021-03-15 | 2022-08-23 | 山东科技大学 | Sea ice target extraction method with SAR data and multispectral data fused |
CN113091599B (en) * | 2021-04-06 | 2021-12-03 | 中国矿业大学 | Surface three-dimensional deformation extraction method fusing unmanned aerial vehicle DOM and satellite-borne SAR images |
CN113091599A (en) * | 2021-04-06 | 2021-07-09 | 中国矿业大学 | Surface three-dimensional deformation extraction method fusing unmanned aerial vehicle DOM and satellite-borne SAR images |
CN113269028A (en) * | 2021-04-07 | 2021-08-17 | 南方科技大学 | Water body change detection method and system based on deep convolutional neural network |
CN113033714A (en) * | 2021-05-24 | 2021-06-25 | 华中师范大学 | Object-oriented automatic machine learning method and system for multi-mode multi-granularity remote sensing image |
CN113705990A (en) * | 2021-08-17 | 2021-11-26 | 内蒙古申科国土技术有限责任公司 | Natural resource information processing method and device, electronic equipment and storage medium |
CN113885060A (en) * | 2021-09-26 | 2022-01-04 | 中国农业科学院草原研究所 | Grazing intensity monitoring method based on unmanned aerial vehicle remote sensing technology |
CN114548812A (en) * | 2022-03-01 | 2022-05-27 | 武汉鸟瞰天下科技有限公司 | Forest and grassland fire risk prediction decision method, device and computer equipment |
CN115830442A (en) * | 2022-11-11 | 2023-03-21 | 中国科学院空天信息创新研究院 | Machine learning-based remote sensing estimation method and system for wheat tiller density |
CN115830442B (en) * | 2022-11-11 | 2023-08-04 | 中国科学院空天信息创新研究院 | Remote sensing estimation method and system for wheat stem tiller density based on machine learning |
CN115965812A (en) * | 2022-12-13 | 2023-04-14 | 桂林理工大学 | Evaluation method for wetland vegetation species and ground feature classification by unmanned aerial vehicle image |
CN115965812B (en) * | 2022-12-13 | 2024-01-19 | 桂林理工大学 | Evaluation method for classification of unmanned aerial vehicle images on wetland vegetation species and land features |
CN116029430A (en) * | 2022-12-27 | 2023-04-28 | 江苏师范大学科文学院 | Grassland ecological environment monitoring system based on aerial image |
CN116385883A (en) * | 2023-04-13 | 2023-07-04 | 珠江水利委员会珠江水利科学研究院 | Unmanned plane mountain shadow area vegetation coverage correction method, equipment and medium |
CN116385883B (en) * | 2023-04-13 | 2024-01-05 | 珠江水利委员会珠江水利科学研究院 | Unmanned plane mountain shadow area vegetation coverage correction method, equipment and medium |
CN116399820A (en) * | 2023-06-07 | 2023-07-07 | 中国科学院空天信息创新研究院 | Method, device, equipment and medium for verifying authenticity of vegetation remote sensing product |
CN116399820B (en) * | 2023-06-07 | 2023-08-04 | 中国科学院空天信息创新研究院 | Method, device, equipment and medium for verifying authenticity of vegetation remote sensing product |
CN118366059A (en) * | 2024-06-20 | 2024-07-19 | 山东锋士信息技术有限公司 | Crop water demand calculating method based on optical and SAR data fusion |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111553245A (en) | Vegetation classification method based on machine learning algorithm and multi-source remote sensing data fusion | |
CN112287892B (en) | Arbor biomass measurement and calculation method based on unmanned aerial vehicle hyperspectral and machine learning algorithm | |
Torres-Sánchez et al. | Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards | |
Cunliffe et al. | Ultra-fine grain landscape-scale quantification of dryland vegetation structure with drone-acquired structure-from-motion photogrammetry | |
Kalisperakis et al. | Leaf area index estimation in vineyards from UAV hyperspectral data, 2D image mosaics and 3D canopy surface models | |
Liu et al. | LiDAR-derived high quality ground control information and DEM for image orthorectification | |
CN109684929A (en) | Terrestrial plant ECOLOGICAL ENVIRONMENTAL MONITORING method based on multi-sources RS data fusion | |
CN111709981A (en) | Registration method of laser point cloud and analog image with characteristic line fusion | |
Yin et al. | Estimation of grassland height based on the random forest algorithm and remote sensing in the Tibetan Plateau | |
CN117152371B (en) | Three-dimensional topographic mapping method and system | |
Wu et al. | Estimation of cotton canopy parameters based on unmanned aerial vehicle (UAV) oblique photography | |
Nizeyimana | Remote Sensing and GIS Integration | |
Lou et al. | An effective method for canopy chlorophyll content estimation of marsh vegetation based on multiscale remote sensing data | |
Liu et al. | Maize height estimation using combined unmanned aerial vehicle oblique photography and LIDAR canopy dynamic characteristics | |
Dong et al. | Drone-based three-dimensional photogrammetry and concave hull by slices algorithm for apple tree volume mapping | |
Zhang et al. | UAV‐derived imagery for vegetation structure estimation in rangelands: validation and application | |
Calou et al. | Estimation of maize biomass using unmanned aerial vehicles | |
Domazetovic et al. | Assessing the Vertical Accuracy of Worldview-3 Stereo-extracted Digital Surface Model over Olive Groves. | |
Saponaro et al. | Predicting the accuracy of photogrammetric 3D reconstruction from camera calibration parameters through a multivariate statistical approach | |
CN110580468B (en) | Single wood structure parameter extraction method based on image matching point cloud | |
Saponaro et al. | Influence of co-alignment procedures on the co-registration accuracy of multi-epoch SFM points clouds | |
Couteron et al. | Linking remote-sensing information to tropical forest structure: The crucial role of modelling | |
Marcaccio et al. | Potential use of remote sensing to support the management of freshwater fish habitat in Canada | |
CN112052720B (en) | High-space-time normalization vegetation index NDVI fusion model based on histogram clustering | |
Piermattei et al. | Analysis of glacial and periglacial processes using structure from motion. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200818 |