CN110852207A - Blue roof building extraction method based on object-oriented image classification technology
- Publication number
- CN110852207A CN110852207A CN201911037489.9A CN201911037489A CN110852207A CN 110852207 A CN110852207 A CN 110852207A CN 201911037489 A CN201911037489 A CN 201911037489A CN 110852207 A CN110852207 A CN 110852207A
- Authority
- CN
- China
- Prior art keywords
- blue
- segmentation
- buildings
- remote sensing
- brightness
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/176—Urban or other man-made structures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/24—Classification techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/30—Noise filtering
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/247—Aligning, centring, orientation detection or correction of the image by affine transforms, e.g. correction due to perspective effects; Quadrilaterals, e.g. trapezoids
Abstract
The invention provides a blue roof building extraction method based on an object-oriented image classification technology, which can extract blue roof buildings completely and accurately from complex remote sensing images. The method comprises the following steps: obtaining ground object objects by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings; obtaining extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of their remote sensing images; and extracting the blue roof buildings in the research area from the segmentation result using the extraction rules. The method is applicable to the technical field of remote sensing image classification.
Description
Technical Field
The invention relates to the technical field of remote sensing image processing, in particular to a blue roof building extraction method based on an object-oriented image classification technology.
Background
Remote sensing (RS) images are films or photographs that record the intensity of the electromagnetic radiation reflected or emitted by various ground features, and are mainly classified into aerial photographs and satellite photographs.
In the prior art, there are many pixel-based remote sensing image classification methods with high classification precision, such as U-Net among deep learning methods and SVM-based image classification among machine learning methods. However, pixel-based remote sensing image classification methods do not make full use of all the features of the image: they neglect the characteristics of the ground object as a whole and cannot clearly and completely retain its boundary information, so the extraction accuracy for ground objects is low.
Disclosure of Invention
The invention aims to provide a blue roof building extraction method based on an object-oriented image classification technology, so as to solve the problem that prior-art pixel-based remote sensing image classification methods ignore the characteristics of ground objects, resulting in low ground object extraction accuracy.
In order to solve the above technical problem, an embodiment of the present invention provides a blue roof building extraction method based on an object-oriented image classification technology, comprising:
obtaining ground object objects by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings;
obtaining extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of their remote sensing images;
and extracting the blue roof buildings in the research area from the segmentation result using the extraction rules.
Further, before obtaining the ground object obtained by segmenting the remote sensing image of the research area, the method further comprises:
and carrying out orthorectification on the remote sensing image of the research area, and removing Gaussian noise in the orthorectified image by utilizing Gaussian filtering.
Further, after orthorectifying the remote sensing image of the research area and removing gaussian noise in the orthorectified image by gaussian filtering, the method further comprises:
determining the optimal segmentation scale of the blue roof building by using a mean variance method;
and carrying out multi-scale segmentation on the remote sensing image of the research area according to the determined optimal segmentation scale to obtain a plurality of ground object objects.
Further, obtaining the extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of the ground objects comprises:
determining, according to the spectral features of the ground objects, the red-blue band difference ratio and a first threshold that it must exceed, yielding a first extraction rule: blue_ratio > first threshold;
determining, according to the spectral features of the ground objects, the brightness and a second threshold that it must exceed, yielding a second extraction rule: Brightness > second threshold;
and determining, according to the shape features of the ground objects, the aspect ratio and a third threshold that it must exceed, yielding a third extraction rule: L_W > third threshold.
Further, the red and blue band difference ratio is expressed as:
blue_ratio=(B-R)/R
wherein blue_ratio denotes the red-blue band difference ratio, B denotes the blue-band mean of the segmented ground object, and R denotes its red-band mean.
Further, the luminance is represented as:
Brightness=(B+R+G)/3
wherein Brightness denotes the brightness and G denotes the green-band mean of the segmented ground object.
Further, the aspect ratio is expressed as:
L_W=L/W
wherein L_W denotes the aspect ratio, and L and W denote the length and width, respectively, of the minimum bounding rectangle of the segmented ground object.
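As a minimal sketch (not part of the patent text), the three features defined above can be computed for one segmented object as follows; the function and argument names are illustrative:

```python
import numpy as np

def object_features(blue_px, red_px, green_px, rect_len, rect_wid):
    """Compute the three discrimination features for one segmented object.

    blue_px / red_px / green_px: per-pixel band values inside the object;
    rect_len / rect_wid: length and width of its minimum bounding rectangle.
    """
    b, r, g = np.mean(blue_px), np.mean(red_px), np.mean(green_px)
    blue_ratio = (b - r) / r          # red-blue band difference ratio
    brightness = (b + r + g) / 3.0    # mean of the three band means
    l_w = rect_len / rect_wid         # aspect ratio of the bounding rectangle
    return blue_ratio, brightness, l_w
```

For example, an object with blue-band mean 200, red-band mean 100 and green-band mean 150 gives blue_ratio = 1.0 and Brightness = 150.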
Further, extracting the blue roof buildings in the research area from the segmentation result using the extraction rules comprises:
determining blue_ratio, Brightness and L_W for all ground objects;
extracting a mixed set of blue roof buildings and shadows from the segmentation result according to the extraction rule blue_ratio > first threshold;
and removing the shadow parts from the mixed set according to the two extraction rules Brightness > second threshold and L_W > third threshold, thereby extracting the blue roof buildings from the segmentation result.
The technical scheme of the invention has the following beneficial effects:
in the above scheme, ground object objects are obtained by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings; extraction rules for distinguishing the blue roof buildings are obtained according to the spectral features and shape features of their remote sensing images; and the blue roof buildings in the research area are extracted from the segmentation result using the extraction rules. Thus, based on the characteristics of blue roofs in remote sensing images, blue roof buildings can be extracted completely and accurately from complex remote sensing images using the object-oriented image classification technology.
Drawings
Fig. 1 is a schematic flowchart of a blue rooftop building extraction method based on an object-oriented image classification technique according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of an original remote sensing image of a research area according to an embodiment of the present invention;
fig. 3 is a schematic diagram of a remote sensing image after gaussian filtering processing according to an embodiment of the present invention;
FIG. 4 is a diagram illustrating mean and variance corresponding to different segmentation scales according to an embodiment of the present invention;
FIG. 5 is a schematic flowchart of a multi-scale segmentation algorithm according to an embodiment of the present invention;
FIG. 6 is a diagram illustrating a multi-scale segmentation result according to an embodiment of the present invention;
fig. 7 is a schematic diagram of a part of extracted results of blue rooftop buildings in a research area according to an embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
Aiming at the problem that existing pixel-based remote sensing image classification methods ignore the characteristics of ground objects, resulting in low ground object extraction accuracy, the invention provides a blue roof building extraction method based on an object-oriented image classification technology.
As shown in fig. 1, the method for extracting a blue rooftop building based on an object-oriented image classification technology according to an embodiment of the present invention includes:
s101, obtaining ground object objects by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings;
s102, obtaining extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of their remote sensing images;
and S103, extracting the blue roof buildings in the research area from the segmentation result using the extraction rules.
The blue roof building extraction method based on the object-oriented image classification technology obtains ground object objects by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings; obtains extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of their remote sensing images; and extracts the blue roof buildings in the research area from the segmentation result using the extraction rules. Thus, based on the characteristics of blue roofs in remote sensing images, blue roof buildings can be extracted completely and accurately from complex remote sensing images using the object-oriented image classification technology.
In this embodiment, the processing unit of the object-oriented image classification technology is no longer the pixel; instead, a polygon entity serves as the minimum unit. A polygon entity carries not only the spectral information of its pixels but also shape and other information. The object-oriented remote sensing image classification technology can therefore make full use of the spectral features, shape features and other information of the image, exploiting the advantages of the imagery well while mitigating the problems caused by excessively high resolution.
In this embodiment, before performing S101, in order to complete the accurate extraction of the blue rooftop building and retain the relevant features of the blue rooftop building object, the following steps are further performed:
a1, preprocessing an original remote sensing image of an acquired research area, weakening the influence of atmosphere, light and the like to a certain extent, and improving the image quality, as shown in fig. 2 and 3; the method specifically comprises the following steps:
a11, performing orthorectification on the remote sensing image of the research area;
in this embodiment, the orthorectification generally includes selecting some ground control points on an image, performing tilt correction and projective aberration correction on the image simultaneously by using Digital Elevation Model (DEM) data in the image range that has been obtained originally, and resampling the image into an orthorectified image, thereby solving the problem of image deformation caused by various factors during shooting of the image to a certain extent.
And A12, removing Gaussian noise in the orthoimage by Gaussian filtering to eliminate the influence of the noise in the subsequent steps.
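The Gaussian-filter denoising of step A12 can be sketched in plain numpy as a separable convolution (an illustrative implementation, not the patent's; in practice the built-in filter of a remote sensing toolbox would be used):

```python
import numpy as np

def gaussian_kernel(sigma, radius=None):
    """1-D Gaussian kernel, normalized to sum to 1."""
    if radius is None:
        radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma) ** 2)
    return k / k.sum()

def gaussian_smooth_band(band, sigma=1.0):
    """Separable Gaussian filtering of one image band.

    'same'-mode convolution zero-pads at the borders, so edge pixels
    are slightly darkened; interior pixels are smoothed correctly.
    """
    k = gaussian_kernel(sigma)
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, mode="same"), 1, band)
    return np.apply_along_axis(lambda c: np.convolve(c, k, mode="same"), 0, rows)
```

Applying the filter band by band to the orthorectified RGB image suppresses high-frequency Gaussian noise while preserving large homogeneous regions.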
A2, performing multi-scale segmentation on the preprocessed remote sensing image to obtain a segmented object with a shape similar to a real ground object, which specifically includes the following steps:
and A21, determining the optimal segmentation scale of the blue roof building by using a mean variance method.
In this embodiment, determining the optimal segmentation scale is the most important part of the multi-scale segmentation process. Segmenting at the optimal scale yields segmented objects that match the real ground features more closely, which improves the extraction of the target ground feature (namely, the blue roof buildings).
The optimal segmentation scale makes the heterogeneity between objects of the same category as small as possible and the heterogeneity between objects of different categories as large as possible. When a poorly chosen scale produces more mixed objects in the image, the spectral heterogeneity between the mixed objects and their adjacent objects decreases, and the mean variance of all objects in the image decreases; when the objects are pure, the spectral heterogeneity between different objects and their adjacent objects increases, and the mean variance of all objects increases. That is to say, the optimal segmentation scale tends to occur at a peak of the mean variance, so in this embodiment the mean variance method can be used to determine the optimal segmentation scale of the blue roof buildings. To select it, the band weights, color parameter, compactness parameter and smoothness parameter were fixed, the segmentation scale was varied over the range 10-170 in steps of 5, and one multi-scale segmentation was performed at each scale. The segmentation results were exported to EXCEL, the mean variance of all segmented objects within the study area was calculated, and a line graph was used to describe how the mean variance of the segmented objects varies with the segmentation scale. The mean variance at different segmentation scales is shown in fig. 4.
As can be seen from fig. 4, the mean variance peaks at the segmentation scales 10, 30, 45, 55, 95, 125 and 150, which are therefore the candidate optimal segmentation scales for the blue roof buildings.
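The peak-picking step of the mean variance method can be sketched as follows (an illustrative helper, not from the patent); it returns the scales at local maxima of the mean variance curve:

```python
def candidate_scales(scales, mean_vars):
    """Return segmentation scales at local peaks of the mean variance curve.

    scales: tested segmentation scales, in increasing order;
    mean_vars: mean variance of all segmented objects at each scale.
    Endpoints are ignored, since a peak needs neighbors on both sides.
    """
    peaks = []
    for i in range(1, len(mean_vars) - 1):
        if mean_vars[i] > mean_vars[i - 1] and mean_vars[i] > mean_vars[i + 1]:
            peaks.append(scales[i])
    return peaks
```

The returned candidates would then be compared visually against the real blue roof buildings, as the next paragraphs describe.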
In this embodiment, the optimal segmentation scale of the blue roof buildings next needs to be selected from these candidate scales, specifically:
the research area is segmented with each candidate scale and the segmentation results are compared, checking whether the number, shape and boundaries of the patches segmented at the different scales agree with those of the real blue roof buildings; the scale that gives the most complete segmented objects in a moderate number is selected as the optimal segmentation scale for the blue roof building class. By comparison, in this embodiment 55 is selected as the optimal segmentation scale of the blue roof buildings, with the band weights set to 1:1:1, the color parameter set to 0.6, and the compactness and smoothness both set to 0.5.
And A22, performing multi-scale segmentation on the remote sensing image of the research area according to the determined optimal segmentation scale to obtain a plurality of ground object objects.
The research region is segmented at multiple scales using the optimal segmentation parameters determined in the previous step, namely the optimal segmentation scale 55 together with the other segmentation parameters (band weights 1:1:1, color parameter 0.6, compactness and smoothness both 0.5), obtaining a plurality of segmented objects.
In this embodiment, as shown in fig. 5, the multi-scale segmentation process comprises: first, the Gaussian-filtered remote sensing image is input into the multi-scale segmentation algorithm; then the segmentation parameters (segmentation scale, band weights, color parameter, compactness and smoothness) are set according to the characteristics of the image to be segmented, and the algorithm segments the remote sensing image according to these parameters. The multi-scale segmentation technology yields fairly regular objects (the segmentation result is shown in fig. 6), which makes the extraction result for the blue roof buildings more accurate. During segmentation, the algorithm judges whether the result meets the requirement according to whether the heterogeneity of the segmented objects satisfies a preset fourth threshold: the heterogeneity inside one object should be as small as possible, and the heterogeneity between different objects as large as possible. In the first pass, a single pixel is taken as the starting point and the heterogeneity after merging with an adjacent pixel is calculated; if the merged heterogeneity is greater than or equal to the preset fourth threshold, the new pixel does not belong to the object and the merging for this object ends; if it is smaller than the threshold, the new pixel belongs to the object and the segmentation operation continues.
In the second pass, the polygons obtained in the first pass are taken as starting points and the heterogeneity between adjacent objects is calculated; if it is greater than or equal to the preset fourth threshold, the heterogeneity between the adjacent objects meets the segmentation requirement and the segmentation ends; if it is smaller than the threshold, the heterogeneity between the adjacent objects does not meet the requirement, and the objects must be merged and segmented again. The operation is repeated: in the N-th pass, the polygons from the (N-1)-th pass are taken as starting points and the heterogeneity of adjacent objects is calculated; if it is smaller than the preset fourth threshold, the heterogeneity between different objects does not yet meet the requirement and the (N+1)-th pass is executed; if it is greater than or equal to the threshold, the heterogeneity between different objects meets the segmentation requirement and the segmentation ends.
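The merge decision in this loop can be sketched for a single band as follows. The patent does not specify the exact heterogeneity formula, so this sketch assumes the common area-weighted standard-deviation criterion used in multi-scale (multiresolution) segmentation; all names are illustrative:

```python
import numpy as np

def merge_heterogeneity(obj_a, obj_b):
    """Spectral heterogeneity increase caused by merging two objects:
    the size-weighted standard deviation of the merged object minus the
    size-weighted deviations of the two parts (single-band simplification;
    assumed criterion, not stated in the patent)."""
    merged = np.concatenate([obj_a, obj_b])
    return (merged.size * merged.std()
            - obj_a.size * obj_a.std()
            - obj_b.size * obj_b.std())

def try_merge(obj_a, obj_b, threshold):
    """Merge only while the heterogeneity increase stays below the
    preset threshold (the 'fourth threshold' in the text)."""
    if merge_heterogeneity(obj_a, obj_b) < threshold:
        return np.concatenate([obj_a, obj_b])  # accept: same object
    return None                                # reject: objects stay separate
```

Two spectrally similar objects merge (small heterogeneity increase), while spectrally distinct neighbors remain separate, which is exactly the stopping condition of the passes described above.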
In this embodiment, the original remote sensing image of the research area is segmented into different ground object objects by the multi-scale segmentation algorithm of step A2, and the ground objects obtained by the multi-scale segmentation are then classified using the object-oriented image classification technology, which specifically comprises:
and A31, obtaining an extraction rule for distinguishing the blue roof buildings according to the spectral characteristics and the shape characteristics of the remote sensing images of the blue roof buildings.
In the present embodiment, after the different ground objects have been obtained, the target ground feature (the blue roof buildings) needs to be distinguished from the other ground features according to the spectral features, shape features and the like of the ground objects. Observation of the blue, green and red bands of the ground objects shows that blue roof buildings have a high blue-band value, a low red-band value, and a marked difference between the red and blue band values, so blue roof buildings can be distinguished by this difference. Meanwhile, some building shadows are found to have similar spectral characteristics; shadow areas, however, have low brightness and are generally thin and elongated, with a large length-to-width ratio, so the shadow areas can be removed using the brightness and aspect-ratio features, leaving the blue roof building objects. To highlight the difference between the red and blue band values, this embodiment constructs a red-blue band difference ratio parameter from the spectral feature of the ground object:
blue_ratio=(B-R)/R
wherein blue_ratio denotes the red-blue band difference ratio, B denotes the blue-band mean of the segmented ground object, and R denotes its red-band mean.
In this embodiment, according to the spectral feature and shape feature of the ground objects, two further parameters, the brightness and the aspect ratio, are constructed for removing the shadow areas from the extraction result; the brightness parameter is expressed as:
Brightness=(B+R+G)/3
wherein Brightness denotes the brightness and G denotes the green-band mean of the segmented ground object.
The aspect ratio parameter is expressed as:
L_W=L/W
wherein L_W denotes the aspect ratio, and L and W denote the length and width, respectively, of the minimum bounding rectangle of the segmented ground object.
In this embodiment, the extraction rule blue_ratio > first threshold (for example, a first threshold of 0.395) is set according to the difference between the red-blue band difference ratio of blue roof buildings and that of other land types; meanwhile, the two extraction rules Brightness > second threshold (for example, a second threshold of 90) and L_W > third threshold (for example, a third threshold of 10) are set according to the differences in the brightness and aspect-ratio parameters between the shadow areas and the blue roof buildings.
A32, extracting the blue roof buildings in the research area from the segmentation result by using the extraction rule.
In this embodiment, the three parameters are calculated for all ground objects with the Calculate Field tool in ArcGIS according to the formulas for the red-blue band difference ratio, the brightness and the aspect ratio. After the calculation, the mixed set of blue roof buildings and shadows is extracted from the segmentation result using the Select By Attributes and data export functions in ArcGIS together with the rule blue_ratio > first threshold; the shadow parts are then removed from this mixed set using the same functions together with the two extraction rules Brightness > second threshold and L_W > third threshold, extracting the blue roof buildings from the segmentation result. With the three extraction rules, the blue roof building objects can be extracted well from the segmentation result; part of the extraction result is shown in fig. 7.
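Outside ArcGIS, the three-rule filtering can be sketched in plain Python. The default thresholds mirror the example values given above (0.395, 90 and 10), and both shadow-removal rules are applied with ">" exactly as the text states; the function and field names are illustrative:

```python
def extract_blue_roofs(objects, t1=0.395, t2=90.0, t3=10.0):
    """Apply the three extraction rules to a list of per-object feature dicts.

    Rule 1 (blue_ratio > t1) keeps the mixed set of blue roof buildings
    and shadows; rules 2 and 3 (Brightness > t2, L_W > t3, stated
    verbatim in the text) then remove the shadow objects.
    """
    candidates = [o for o in objects if o["blue_ratio"] > t1]
    return [o for o in candidates
            if o["brightness"] > t2 and o["l_w"] > t3]
```

A dark, elongated shadow object fails the brightness rule and is dropped, while a bright blue roof that passes all three rules is kept.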
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.
Claims (8)
1. A blue roof building extraction method based on an object-oriented image classification technology is characterized by comprising the following steps:
obtaining ground object objects by segmenting a remote sensing image of a research area, the ground objects including blue roof buildings;
obtaining extraction rules for distinguishing the blue roof buildings according to the spectral features and shape features of their remote sensing images;
and extracting the blue roof buildings in the research area from the segmentation result using the extraction rules.
2. The method for extracting blue rooftop buildings based on the object-oriented image classification technology as claimed in claim 1, wherein before the step of obtaining the ground object obtained by segmenting the remote sensing image of the research area, the method further comprises:
and carrying out orthorectification on the remote sensing image of the research area, and removing Gaussian noise in the orthorectified image by utilizing Gaussian filtering.
3. The method for extracting blue rooftop buildings based on the object-oriented image classification technology as claimed in claim 2, wherein after orthorectifying the remote sensing image of the research area and removing the gaussian noise in the orthorectified image by using gaussian filtering, the method further comprises:
determining the optimal segmentation scale of the blue roof building by using a mean variance method;
and carrying out multi-scale segmentation on the remote sensing image of the research area according to the determined optimal segmentation scale to obtain a plurality of ground object objects.
4. The method for extracting blue rooftop buildings based on object-oriented image classification technology according to claim 1, wherein the obtaining of the extraction rule for distinguishing the blue rooftop buildings according to the spectral features and the shape features of the surface feature objects comprises:
determining, according to the spectral features of the ground objects, the red-blue band difference ratio and a first threshold that it must exceed, yielding a first extraction rule: blue_ratio > first threshold;
determining, according to the spectral features of the ground objects, the brightness and a second threshold that it must exceed, yielding a second extraction rule: Brightness > second threshold;
and determining, according to the shape features of the ground objects, the aspect ratio and a third threshold that it must exceed, yielding a third extraction rule: L_W > third threshold.
5. The method for extracting blue rooftop buildings based on the object-oriented image classification technology as claimed in claim 4, wherein the red-blue band difference ratio is expressed as:
blue_ratio=(B-R)/R
wherein blue_ratio denotes the red-blue band difference ratio, B denotes the blue-band mean of the segmented ground object, and R denotes its red-band mean.
6. The object-oriented image classification technique-based blue rooftop building extraction method of claim 5, wherein the brightness is expressed as:
Brightness=(B+R+G)/3
wherein Brightness denotes the brightness and G denotes the green-band mean of the segmented ground object.
7. The object-oriented image classification technique-based blue rooftop building extraction method of claim 6, wherein the aspect ratio is expressed as:
L_W=L/W
wherein L_W denotes the aspect ratio, and L and W denote the length and width, respectively, of the minimum bounding rectangle of the segmented ground object.
8. The method for extracting blue rooftop buildings based on object-oriented image classification technology as claimed in claim 7, wherein the extracting the blue rooftop buildings in the research area from the segmentation result by using the extraction rule comprises:
determining blue_ratio, Brightness and L_W for all ground objects;
extracting a mixed set of blue roof buildings and shadows from the segmentation result according to the extraction rule blue_ratio > first threshold;
and removing the shadow parts from the mixed set according to the two extraction rules Brightness > second threshold and L_W > third threshold, thereby extracting the blue roof buildings from the segmentation result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911037489.9A CN110852207A (en) | 2019-10-29 | 2019-10-29 | Blue roof building extraction method based on object-oriented image classification technology |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110852207A true CN110852207A (en) | 2020-02-28 |
Family
ID=69599011
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911037489.9A Pending CN110852207A (en) | 2019-10-29 | 2019-10-29 | Blue roof building extraction method based on object-oriented image classification technology |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110852207A (en) |
Patent Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103279951A (en) * | 2013-05-13 | 2013-09-04 | Wuhan University of Technology | Object-oriented building and shadow extraction method for remote sensing images |
AU2018101336A4 (en) * | 2018-09-12 | 2018-10-11 | Hu, Yuan Miss | Building extraction application based on machine learning in Urban-Suburban-Integration Area |
Non-Patent Citations (3)
Title |
---|
LIU Dandan et al.: "Research on an object-oriented multi-scale building extraction method for high-resolution images", Geomatics & Spatial Information Technology * |
LYU Daoshuang et al.: "Object-oriented multi-scale, multi-feature building extraction from high-resolution remote sensing images", Beijing Surveying and Mapping * |
CHEN Yunhao et al.: "Research on object-oriented and rule-based remote sensing image classification", Journal of Wuhan University * |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111563448A (en) * | 2020-04-30 | 2020-08-21 | 北京百度网讯科技有限公司 | Method and device for detecting illegal building, electronic equipment and storage medium |
CN112033914A (en) * | 2020-09-01 | 2020-12-04 | 深圳市数字城市工程研究中心 | Color steel tile factory building extraction method based on remote sensing image |
CN112033914B (en) * | 2020-09-01 | 2021-04-20 | 深圳市数字城市工程研究中心 | Color steel tile factory building extraction method based on remote sensing image |
CN112927252A (en) * | 2021-04-12 | 2021-06-08 | 二十一世纪空间技术应用股份有限公司 | Newly-added construction land monitoring method and device |
CN112927252B (en) * | 2021-04-12 | 2023-09-22 | 二十一世纪空间技术应用股份有限公司 | Newly-added construction land monitoring method and device |
CN116258958A (en) * | 2022-12-22 | 2023-06-13 | 二十一世纪空间技术应用股份有限公司 | Building extraction method and device for homologous high-resolution images and DSM data |
CN116258958B (en) * | 2022-12-22 | 2023-12-05 | 二十一世纪空间技术应用股份有限公司 | Building extraction method and device for homologous high-resolution images and DSM data |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111415363B (en) | Image edge identification method | |
CN109271991B (en) | License plate detection method based on deep learning | |
CN110852207A (en) | Blue roof building extraction method based on object-oriented image classification technology | |
CN109740639B (en) | Wind cloud satellite remote sensing image cloud detection method and system and electronic equipment | |
CN107564017B (en) | Method for detecting and segmenting urban high-resolution remote sensing image shadow | |
US11804025B2 (en) | Methods and systems for identifying topographic features | |
CN110766708B (en) | Image comparison method based on contour similarity | |
CN111369605A (en) | Infrared and visible light image registration method and system based on edge features | |
CN109726649B (en) | Remote sensing image cloud detection method and system and electronic equipment | |
CN104657980A (en) | Improved multi-channel image partitioning algorithm based on Meanshift | |
CN111680704B (en) | Automatic and rapid extraction method and device for newly-increased human active plaque of ocean red line | |
WO2014004271A2 (en) | Method and system for use of intrinsic images in an automotive driver-vehicle-assistance device | |
CN113255452A (en) | Extraction method and extraction system of target water body | |
CN110245600B (en) | Unmanned aerial vehicle road detection method for self-adaptive initial quick stroke width | |
JP4747122B2 (en) | Specific area automatic extraction system, specific area automatic extraction method, and program | |
CN115272876A (en) | Remote sensing image ship target detection method based on deep learning | |
CN107992856A (en) | High score remote sensing building effects detection method under City scenarios | |
JPH05181411A (en) | Map information collation and update system | |
CN110310263B (en) | SAR image residential area detection method based on significance analysis and background prior | |
CN109635679B (en) | Real-time target paper positioning and loop line identification method | |
CN109389063B (en) | Remote sensing image strip noise removing method based on wave band correlation | |
CN112818983B (en) | Method for judging character inversion by using picture acquaintance | |
JP2005234603A (en) | Map information updating method and map updating device | |
CN110322454B (en) | High-resolution remote sensing image multi-scale segmentation optimization method based on spectrum difference maximization | |
CN111178175A (en) | Automatic building information extraction method and system based on high-view satellite image |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | Application publication date: 20200228 |