
CN110827339B - Method for extracting target point cloud - Google Patents

Method for extracting target point cloud

Info

Publication number
CN110827339B
Authority
CN
China
Prior art keywords
point cloud
cloud data
matrix
threshold value
normal vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911071871.1A
Other languages
Chinese (zh)
Other versions
CN110827339A (en)
Inventor
Zhu Xiang (朱翔)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Lianping Technology Co ltd
Original Assignee
Beijing Shenzhen Survey Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Shenzhen Survey Technology Co ltd
Priority to CN201911071871.1A
Publication of CN110827339A
Application granted
Publication of CN110827339B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/50: Depth or shape recovery
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10028: Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to a method for extracting a target point cloud, which comprises the following steps: acquiring first point cloud data, and performing denoising processing on the first point cloud data to generate second point cloud data; calculating a normal vector of the second point cloud data; selecting a reference normal vector; calculating an included angle between a normal vector of the second point cloud data and the reference normal vector; judging the second point cloud data with the included angle smaller than a first threshold value as external points; judging the second point cloud data with the included angle larger than or equal to the first threshold value as interior points, and generating third point cloud data; and extracting the target point cloud from the third point cloud data based on a random sample consensus (RANSAC) algorithm. The method for extracting the target point cloud provided by the embodiments of the invention can accurately and effectively extract target three-dimensional point cloud data from a complex scene.

Description

Method for extracting target point cloud
Technical Field
The invention relates to a data extraction method, in particular to a target point cloud extraction method.
Background
With the continuous development of measurement technology, and especially the rise in recent years of sensors for point cloud data acquisition, research on extracting three-dimensional point cloud data of scenes has flourished.
At present, research on extracting targets from three-dimensional point cloud data mainly relies on algorithms designed for specific scenes. The eigenvalue method and the least squares method are the most commonly used plane-fitting extraction algorithms. However, if the target scene is affected by environmental factors such as object occlusion and diffused light, the acquired point cloud data inevitably contain errors and outliers, and the effective extraction rate of the eigenvalue and least squares methods on point clouds with large deviations and outliers is low. A single random sample consensus (RANSAC) algorithm has also been used to extract the target point cloud, but it is prone to false extraction at boundaries.
Disclosure of Invention
The invention aims to address the defects of the prior art by providing a method for extracting a target point cloud that can accurately and effectively extract target three-dimensional point cloud data from a complex scene.
In order to achieve the above object, the present invention provides a method for extracting a target point cloud, the method comprising:
acquiring first point cloud data, and performing denoising processing on the first point cloud data to generate second point cloud data;
calculating a normal vector of the second point cloud data;
selecting a reference normal vector;
calculating an included angle between a normal vector of the second point cloud data and the reference normal vector;
judging the second point cloud data with the included angle smaller than a first threshold value as an external point; judging the second point cloud data with the included angle larger than or equal to a first threshold value as an interior point, and generating third point cloud data;
and extracting the target point cloud from the third point cloud data based on a random sample consensus (RANSAC) algorithm.
Further, the first point cloud data is three-dimensional coordinates captured by a time-of-flight sensor.
Further, the denoising processing of the first point cloud data to generate the second point cloud data specifically includes:
extracting depth data of the first point cloud data, and establishing a two-dimensional point cloud matrix;
extracting K 3 × 3 sub-matrices of the two-dimensional point cloud matrix;
establishing a position index of a central element of the 3 x 3 sub-matrix in the two-dimensional point cloud matrix;
summing the absolute values of the differences between the central element of each 3 × 3 sub-matrix and the other elements, and recording the sum as M;
if M is greater than a second threshold, judging that the central element is a noise point, finding the position of the noise point in the two-dimensional point cloud matrix according to the position index, and discarding the element corresponding to the noise point;
if M is smaller than or equal to the second threshold, retaining the element corresponding to the position of the central element in the two-dimensional point cloud matrix;
and generating the second point cloud data from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix.
Further, if M is greater than the second threshold:
extracting the four 2 × 2 sub-matrices of the 3 × 3 sub-matrix;
comparing the absolute values of the differences between the elements of each 2 × 2 sub-matrix and the central element, and recording the minimum value as N;
if N is greater than or equal to a third threshold, judging that the central element is a noise point, finding the position of the noise point in the two-dimensional point cloud matrix according to the position index, and discarding the element corresponding to the noise point;
if N is smaller than the third threshold, retaining the element corresponding to the position of the central element in the two-dimensional point cloud matrix;
and generating the second point cloud data from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix, i.e., the elements for which M is smaller than or equal to the second threshold and the elements for which N is smaller than the third threshold.
Further, the third threshold is not greater than half of the second threshold.
Further, the selecting of the reference normal vector specifically comprises:
selecting a reference normal vector according to the normal vector characteristics of the object to be extracted.
Further, in the K 3 × 3 sub-matrices, K is the number of interior elements enclosed by the first row, the last row, the first column, and the last column of the two-dimensional point cloud matrix.
Further, the establishing of a position index of the central element of the 3 × 3 sub-matrix in the two-dimensional point cloud matrix specifically comprises:
marking the row index and the column index of the central element of each 3 × 3 sub-matrix in the two-dimensional point cloud matrix, and matching the corresponding depth data in the two-dimensional point cloud matrix according to the row index and the column index.
Further, the specific steps of performing target point cloud extraction on the third point cloud data based on the random sample consensus RANSAC algorithm are as follows:
randomly selecting a subset from the third point cloud data as an inner group;
generating an estimation model with the subset;
traversing the third point cloud data with the estimation model;
dividing the third point cloud data meeting a constraint condition into the inner group;
and repeating the above steps in a loop, selecting the estimation model with the most inner group points as the optimal model, and extracting all third point cloud data that fit the optimal model.
Further, the termination condition of the loop is as follows: the loop exits when a new subset can no longer be randomly selected; alternatively, an inner group point threshold for the estimation model is set, and if the number of inner group points fitting the estimation model is greater than or equal to that threshold, the estimation model is determined to be the optimal model and the loop exits.
The method for extracting the target point cloud provided by the embodiments of the invention can accurately and effectively extract target three-dimensional point cloud data from a complex scene.
Drawings
Fig. 1 is a flowchart of a method for extracting a target point cloud according to an embodiment of the present invention.
Detailed Description
The technical solution of the present invention is further described in detail by the accompanying drawings and embodiments.
Fig. 1 is a flowchart of a method for extracting a target point cloud according to an embodiment of the present invention, as shown in fig. 1, the method includes:
step S110, first point cloud data is obtained, and drying processing is carried out on the first point cloud data to generate second point cloud data.
Specifically, the first point cloud data are three-dimensional point cloud data captured by a time-of-flight sensor. The coordinates of the three-dimensional point cloud data are stored in the sensor chip as a pixel array, and the coordinates of the first point cloud data are values generated in the sensor coordinate system; they include the position information of the measured scene on a projection plane and the depth information of the measured scene, the projection plane being perpendicular to the depth direction.
The denoising processing of the first point cloud data comprises the following steps:
extracting the depth data of the first point cloud data and establishing a two-dimensional point cloud matrix.
Specifically, since the first point cloud data are stored in array form and the pixel array corresponds to the sensor resolution, the numbers of rows and columns of the two-dimensional point cloud matrix extracted from the depth data of the first point cloud data match the resolution of the time-of-flight sensor. In a specific example, when the resolution of the time-of-flight sensor is 32 × 48, that is, 32 horizontal pixels by 48 vertical pixels, the two-dimensional point cloud matrix has 48 rows and 32 columns.
Specifically, the arrangement of elements in the two-dimensional point cloud matrix matches the storage layout of the time-of-flight sensor array, so elements that are adjacent in the matrix are also adjacent in the actual scene.
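As an illustration of this step, the following is a minimal sketch of building the two-dimensional point cloud matrix, assuming the sensor delivers the first point cloud data as a flat array of (x, y, depth) triples in row-major pixel order; the function name, array layout, and the 32 × 48 resolution reused from the example above are illustrative assumptions, not the patented implementation.

```python
import numpy as np

# Assumed sensor resolution, taken from the 32 x 48 example above.
H_PIXELS, V_PIXELS = 32, 48

def build_depth_matrix(first_point_cloud: np.ndarray) -> np.ndarray:
    """Extract the depth data and reshape them into the two-dimensional
    point cloud matrix.

    first_point_cloud: assumed shape (V_PIXELS * H_PIXELS, 3), columns
    (x, y, depth), stored in the sensor's row-major pixel order.
    """
    depth = first_point_cloud[:, 2]           # depth channel only
    # 48 rows by 32 columns, matching the sensor layout, so that elements
    # adjacent in the matrix are also adjacent in the measured scene.
    return depth.reshape(V_PIXELS, H_PIXELS)
```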
Next, K 3 × 3 sub-matrices of the two-dimensional point cloud matrix are extracted.
Specifically, the largest number K of non-repeating 3 × 3 sub-matrices that can be extracted from the two-dimensional point cloud matrix equals the number of interior elements enclosed by the first row, the last row, the first column, and the last column of the matrix. In a specific example, a 32 × 48 two-dimensional point cloud matrix has 1536 elements in total; removing the 156 elements of the first row, the last row, the first column, and the last column gives K = 1380, so 1380 3 × 3 sub-matrices can be extracted. Taking K at this maximum guarantees, to the greatest extent, that every interior point receives a noise judgment.
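A minimal sketch of extracting all K non-repeating 3 × 3 sub-matrices at once, assuming NumPy 1.20 or later for sliding_window_view; for the 32 × 48 example this yields 46 × 30 = 1380 windows, matching the K computed above.

```python
import numpy as np
from numpy.lib.stride_tricks import sliding_window_view

depth = np.zeros((48, 32))                    # placeholder 48-row, 32-column matrix
windows = sliding_window_view(depth, (3, 3))  # shape (46, 30, 3, 3)
K = windows.shape[0] * windows.shape[1]       # 46 * 30 = 1380 sub-matrices
```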
A position index of the central element of each 3 × 3 sub-matrix in the two-dimensional point cloud matrix is then established.
Specifically, the row index and the column index of the central element of each 3 × 3 sub-matrix in the two-dimensional point cloud matrix are recorded, and the corresponding depth data in the two-dimensional point cloud matrix are matched according to these indices. Because the noise judgment is performed only on the central element of each 3 × 3 sub-matrix, only the position of the central element needs to be marked, which greatly reduces the computational load of the system.
The absolute values of the differences between the central element of the 3 × 3 sub-matrix and each of the other elements are summed, and the sum is recorded as M. The other elements are the eight elements surrounding the central element, i.e., those in the first and third rows and the first and third columns of the 3 × 3 sub-matrix.
If the M is smaller than or equal to a second threshold value, retaining an element corresponding to the position of the central element in the two-dimensional point cloud matrix;
if the M is larger than a second threshold value, judging that the central element is a noise point, finding the position of the noise point in the two-dimensional point cloud matrix according to the position index, and discarding the element corresponding to the noise point;
specifically, the size of the second threshold is in inverse proportion to the number of noise points, when the second threshold is smaller, more noise points are determined to be noise points, but when the threshold is too small, supersaturation occurs, target point cloud data in the scene is discarded by mistake, when the second threshold is larger, less noise points are determined, and when the second threshold is too large, too many noise points are retained, and the drying effect is not obvious, so that a suitable second threshold can be found by selecting different thresholds for multiple times according to a standard measurement scene, and in a specific embodiment, when the two-dimensional point cloud matrix is 240 × 320, preferably, the second threshold is 0.2.
The second point cloud data are generated from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix.
In a preferred embodiment, when M is greater than the second threshold, a further determination is made:
The four 2 × 2 sub-matrices of the 3 × 3 sub-matrix are extracted. Each 2 × 2 sub-matrix necessarily includes the central element of the 3 × 3 sub-matrix, and the other elements of each 2 × 2 sub-matrix are adjacent to the central element.
The absolute values of the differences between the elements of each 2 × 2 sub-matrix and the central element are compared, and the minimum value is recorded as N.
If N is greater than or equal to a third threshold, the central element is judged to be a noise point; its position in the two-dimensional point cloud matrix is found according to the position index, and the element corresponding to the noise point is discarded.
If N is smaller than the third threshold, the element corresponding to the position of the central element is retained in the two-dimensional point cloud matrix.
Specifically, a suitable third threshold can be found by trying different thresholds against a standard measured scene; preferably, the third threshold is not greater than half of the second threshold. Making this further determination when M is greater than the second threshold effectively reduces the rate at which valid points are mistakenly deleted as noise.
The second point cloud data are generated from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix, i.e., the elements for which M is smaller than or equal to the second threshold and the elements for which N is smaller than the third threshold.
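Putting the denoising procedure together, the following is a minimal sketch of the two-stage noise judgment on the depth matrix from the earlier sketch. Reading N as the smallest of the four per-sub-matrix sums of absolute differences is one plausible interpretation of the description, and the default thresholds (0.2, with the third threshold at half of the second) are only the illustrative values given above.

```python
import numpy as np

def denoise(depth: np.ndarray, t2: float = 0.2, t3: float = 0.1) -> np.ndarray:
    """Return a boolean mask of the interior elements retained after denoising."""
    keep = np.zeros(depth.shape, dtype=bool)
    rows, cols = depth.shape
    for i in range(1, rows - 1):                  # the K interior central elements
        for j in range(1, cols - 1):
            window = depth[i - 1:i + 2, j - 1:j + 2]
            center = depth[i, j]
            m = np.abs(window - center).sum()     # centre term contributes 0
            if m <= t2:
                keep[i, j] = True                 # M <= second threshold: retain
                continue
            # Further determination: the four 2 x 2 sub-matrices that
            # contain the central element.
            subs = (window[0:2, 0:2], window[0:2, 1:3],
                    window[1:3, 0:2], window[1:3, 1:3])
            n = min(np.abs(s - center).sum() for s in subs)
            keep[i, j] = n < t3                   # N < third threshold: retain
    return keep
```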
Step S120, calculating a normal vector of the second point cloud data.
Specifically, the normal vector of each point in the second point cloud data is calculated by fitting a local total least squares plane to the point and its neighboring points through principal component analysis; the normal vector of that local plane approximates the normal vector at the point, and the calculated normal vectors are then classified by their features. Owing to the influence of factors such as measurement error, the normal vectors of points on the same plane are approximately parallel to one another and perpendicular to that plane. Because the first point cloud data were denoised in step S110, large deviations in the estimated normal vectors caused by outliers are avoided.
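A minimal sketch of this PCA-based normal estimation, assuming the second point cloud data are an (N, 3) NumPy array; the neighbourhood size k is an assumed parameter, since the description does not fix one.

```python
import numpy as np
from scipy.spatial import cKDTree

def estimate_normals(points: np.ndarray, k: int = 10) -> np.ndarray:
    """Estimate per-point normals as the smallest-eigenvalue eigenvector of
    the covariance of each point's k-nearest-neighbour patch."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)             # k nearest neighbours per point
    normals = np.empty_like(points)
    for i, neighbours in enumerate(idx):
        patch = points[neighbours]
        cov = np.cov(patch.T)                    # 3 x 3 covariance of the patch
        eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
        normals[i] = eigvecs[:, 0]               # normal of the local TLS plane
    return normals
```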
Step S130, selecting a reference normal vector. The reference normal vector is selected specifically as follows:
and selecting a reference normal vector according to the normal vector characteristics of the object to be extracted. In an actual example, a scene is composed of a background plate, a cylindrical object to be measured and a test bed, first point cloud data composed of the background plate, the cylindrical object to be measured and the test bed are acquired simultaneously through a flight time sensor, wherein the background plate and the test bed both have plane characteristics, the test bed is extracted firstly, then the background plate is extracted, and finally the purpose of retaining the required cylindrical object to be measured is achieved. In consideration of the influence of acquisition errors and the like, the normal vector of the point cloud data of the test bed is approximately perpendicular to the test bed, so the direction of the reference normal vector is selected to be perpendicular to the direction of the test bed.
Step S140, calculating an included angle between the normal vector of the second point cloud data and the reference normal vector.
Specifically, continuing the example of step S130, the included angle between each normal vector of the second point cloud data and the reference normal vector perpendicular to the test bed is calculated. Since the normal vectors of the collected test bed point cloud data are approximately perpendicular to the test bed, their included angles with the reference normal vector are very small.
Step S150, judging the second point cloud data with the included angle smaller than the first threshold value as external points, and judging the second point cloud data with the included angle greater than or equal to the first threshold value as interior points, thereby generating third point cloud data.
Specifically, in the example of step S130, the included angle between the normal vectors of the test bed point cloud data and the reference normal vector is small, so the first threshold is set within a small range. However, when the error of the point cloud data acquired by the time-of-flight sensor is large, setting the first threshold too small leaves the extraction incomplete; to obtain a reasonably good extraction effect, the first threshold is preferably set to 20 degrees. When the included angle is greater than or equal to the first threshold, that portion of the point cloud data does not lie on a plane parallel to the test bed, so it is judged to consist of interior points and is retained; when the included angle is smaller than the first threshold, that portion is judged to be collected test bed point cloud data, so it is judged to consist of external points and is deleted. The remaining second point cloud data consist of the three-dimensional point cloud data of the background plate and the cylindrical object to be measured; the background plate point cloud data are judged and deleted in the same way as the test bed data, which is not repeated here. After the test bed and the background plate have been extracted and deleted, the remaining point cloud data form the third point cloud data.
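A minimal sketch of the angle test of steps S140 and S150, assuming the normals from the previous sketch and a reference normal vector perpendicular to the test bed; the 20-degree first threshold is the preferred value from the description, and taking the absolute value of the cosine (to ignore the sign ambiguity of estimated normals) is an added assumption.

```python
import numpy as np

def filter_by_angle(points, normals, reference, first_threshold_deg=20.0):
    """Keep points whose normal deviates from the reference by at least the
    first threshold (interior points); drop the rest (external points)."""
    ref = reference / np.linalg.norm(reference)
    unit = normals / np.linalg.norm(normals, axis=1, keepdims=True)
    cos_angle = np.clip(np.abs(unit @ ref), 0.0, 1.0)
    angle = np.degrees(np.arccos(cos_angle))
    interior = angle >= first_threshold_deg
    return points[interior]                      # the third point cloud data
```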
Step S160, performing target point cloud extraction on the third point cloud data based on a random sample consensus (RANSAC) algorithm. The specific steps are as follows:
and randomly selecting a subset from the third point cloud data as an inner group.
Specifically, the number of the point clouds of the selected subset is not less than the number of unknown parameters of the estimation model.
An estimation model is generated using the subset.
Specifically, model parameters are calculated from the point cloud coordinates in the selected subset, thereby generating an estimation model.
Traversing the third point cloud data with the estimation model.
Specifically, the third point cloud data outside the subset are substituted into the estimation model.
The third point cloud data meeting the constraint condition are divided into the inner group.
Specifically, a minimum error value is set; third point cloud data whose error under the estimation model is smaller than this value satisfy the constraint condition and are classified as inner group points. The minimum error value used as the constraint condition can be determined through repeated experiments.
The above steps are repeated in a loop; the estimation model with the most inner group points is selected as the optimal model, all third point cloud data fitting the optimal model are extracted, and the final target point cloud is obtained.
Specifically, while the above steps are looped, two situations cause the loop to exit. The first is that the loop exits when a new, non-repeating subset can no longer be randomly selected. The second is that an inner group point threshold for the estimation model is set; if the number of inner group points merged into the subset is greater than or equal to this threshold, the estimation model is determined to be the optimal model and the loop exits. Exiting under the second situation is more efficient than under the first: when the point cloud data volume is large, the number of loop iterations is also large, and as soon as an estimation model meeting the threshold is found, the loop can be terminated without continuing, greatly saving system resources.
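A minimal sketch of step S160, assuming the target is fitted with a plane model (the description leaves the model generic). Both termination conditions appear: a cap on iterations stands in for exhausting the available subsets, and an inner group point threshold ends the loop early; eps, min_inliers, and max_iters are assumed tuning parameters.

```python
import numpy as np

def ransac_extract(points: np.ndarray, eps: float = 0.01,
                   min_inliers: int = 500, max_iters: int = 1000) -> np.ndarray:
    """Extract the largest planar cluster from the third point cloud data."""
    best_mask = np.zeros(len(points), dtype=bool)
    best_count = -1
    rng = np.random.default_rng()
    for _ in range(max_iters):                   # first termination condition
        subset = points[rng.choice(len(points), 3, replace=False)]
        # Plane through the three sampled points: normal via cross product.
        normal = np.cross(subset[1] - subset[0], subset[2] - subset[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-12:
            continue                             # degenerate (collinear) sample
        normal /= norm
        # Constraint condition: point-to-plane distance below the error bound.
        dist = np.abs((points - subset[0]) @ normal)
        mask = dist < eps
        count = int(mask.sum())
        if count > best_count:
            best_mask, best_count = mask, count
        if count >= min_inliers:                 # second termination condition
            break
    return points[best_mask]                     # the extracted target point cloud
```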
According to the method for extracting the target point cloud provided by the embodiments of the invention, redundant scene content is removed through denoising and normal vector filtering, and the RANSAC algorithm is then applied to achieve high-speed extraction of the target point cloud, effectively mitigating the boundary false-extraction phenomenon that occurs when the RANSAC algorithm is used alone. Even when there are multiple target objects, extracting the target point cloud by this method is highly stable and effective; over multiple tests, the average time for extracting the target point cloud is less than 1 second.
Those of skill would further appreciate that the various illustrative components and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, the various illustrative components and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied in hardware, in a software module executed by a processor, or in a combination of the two. A software module may reside in random access memory (RAM), flash memory, read-only memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
The above embodiments further explain the objects, technical solutions, and advantages of the present invention in detail. It should be understood that the above embodiments are merely exemplary embodiments of the present invention and are not intended to limit its scope; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present invention shall fall within the scope of the present invention.

Claims (8)

1. A method of target point cloud extraction, the method comprising:
acquiring first point cloud data, and performing denoising processing on the first point cloud data to generate second point cloud data;
calculating a normal vector of the second point cloud data;
selecting a reference normal vector;
calculating an included angle between a normal vector of the second point cloud data and the reference normal vector;
judging the second point cloud data with the included angle smaller than a first threshold value as an external point; judging the second point cloud data with the included angle larger than or equal to a first threshold value as an interior point, and generating third point cloud data;
performing target point cloud extraction on the third point cloud data based on a random sample consensus (RANSAC) algorithm;
the denoising processing of the first point cloud data to generate second point cloud data specifically includes:
extracting depth data of the first point cloud data, and establishing a two-dimensional point cloud matrix;
extracting K 3 × 3 sub-matrices of the two-dimensional point cloud matrix;
establishing a position index of a central element of the 3 x 3 sub-matrix in the two-dimensional point cloud matrix;
summing the absolute values of the differences between the central element of each 3 × 3 sub-matrix and the other elements, and recording the sum as M;
if M is greater than a second threshold, judging that the central element is a noise point, finding the position of the noise point in the two-dimensional point cloud matrix according to the position index, and discarding the element corresponding to the noise point;
if M is smaller than or equal to the second threshold, retaining the element corresponding to the position of the central element in the two-dimensional point cloud matrix;
generating the second point cloud data from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix;
if M is greater than the second threshold:
extracting the four 2 × 2 sub-matrices of the 3 × 3 sub-matrix;
comparing the absolute values of the differences between the elements of each 2 × 2 sub-matrix and the central element, and recording the minimum value as N;
if N is greater than or equal to a third threshold, judging that the central element is a noise point, finding the position of the noise point in the two-dimensional point cloud matrix according to the position index, and discarding the element corresponding to the noise point;
if N is smaller than the third threshold, retaining the element corresponding to the position of the central element in the two-dimensional point cloud matrix;
and generating the second point cloud data from the first point cloud data corresponding to the elements retained in the two-dimensional point cloud matrix, i.e., the elements for which M is smaller than or equal to the second threshold and the elements for which N is smaller than the third threshold.
2. The method of target point cloud extraction of claim 1, wherein the first point cloud data is three-dimensional coordinates captured by a time-of-flight sensor.
3. The method of target point cloud extraction of claim 1, wherein the third threshold is not greater than half of the second threshold.
4. The method for extracting a target point cloud according to claim 1, wherein the selecting a reference normal vector specifically comprises:
and selecting a reference normal vector according to the normal vector characteristics of the object to be extracted.
5. The method of target point cloud extraction of claim 1, wherein in the K 3 × 3 sub-matrices, K is the number of interior elements enclosed by the first row, the last row, the first column, and the last column of the two-dimensional point cloud matrix.
6. The method for extracting a target point cloud of claim 1, wherein the establishing of a position index of the central element of the 3 × 3 sub-matrix in the two-dimensional point cloud matrix specifically comprises:
marking the row index and the column index of the central element of each 3 × 3 sub-matrix in the two-dimensional point cloud matrix, and matching the corresponding depth data in the two-dimensional point cloud matrix according to the row index and the column index.
7. The method for extracting the target point cloud according to claim 1, wherein the performing target point cloud extraction on the third point cloud data based on the random sample consensus (RANSAC) algorithm specifically comprises the following steps:
randomly selecting a subset from the third point cloud data as an inner group;
generating an estimation model with the subset;
traversing the third point cloud data with the estimation model;
dividing the third point cloud data meeting constraint conditions into the inner groups;
and repeating the above steps in a loop, selecting the estimation model with the most inner group points as the optimal model, and extracting all third point cloud data that fit the optimal model.
8. The method of claim 7, wherein the termination condition of the loop is: the loop exits when a new subset can no longer be randomly selected; alternatively, an inner group point threshold for the estimation model is set, and if the number of inner group points fitting the estimation model is greater than or equal to that threshold, the estimation model is determined to be the optimal model and the loop exits.
CN201911071871.1A 2019-11-05 2019-11-05 Method for extracting target point cloud Active CN110827339B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911071871.1A CN110827339B (en) 2019-11-05 2019-11-05 Method for extracting target point cloud

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911071871.1A CN110827339B (en) 2019-11-05 2019-11-05 Method for extracting target point cloud

Publications (2)

Publication Number Publication Date
CN110827339A CN110827339A (en) 2020-02-21
CN110827339B (en) 2022-08-26

Family

ID=69552537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911071871.1A Active CN110827339B (en) 2019-11-05 2019-11-05 Method for extracting target point cloud

Country Status (1)

Country Link
CN (1) CN110827339B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507340B (en) * 2020-04-16 2023-09-01 北京深测科技有限公司 Target point cloud data extraction method based on three-dimensional point cloud data
CN111625959B (en) * 2020-05-26 2024-10-11 台州学院 Random value-taking method for two-dimensional probability distribution matrix
CN112132950B (en) * 2020-08-13 2024-01-26 中国地质大学(武汉) Three-dimensional point cloud scene updating method based on crowdsourcing image


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4290193B2 (en) * 2006-12-26 2009-07-01 三洋電機株式会社 Image processing device

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH01222575A (en) * 1988-03-01 1989-09-05 Nec Corp Binarizing circuit for picture signal
CN102915561A (en) * 2012-09-27 2013-02-06 清华大学 Method of three-dimensional reconstruction for pipeline structures
CN103499303A (en) * 2013-09-27 2014-01-08 中国人民解放军空军工程大学 Wool fineness automatic measuring method
CN104463856A (en) * 2014-11-25 2015-03-25 大连理工大学 Outdoor scene three-dimensional point cloud data ground extraction method based on normal vector ball
CN106097311A (en) * 2016-05-31 2016-11-09 中国科学院遥感与数字地球研究所 The building three-dimensional rebuilding method of airborne laser radar data
CN107609520A (en) * 2017-09-15 2018-01-19 四川大学 Obstacle recognition method, device and electronic equipment
CN107644452A (en) * 2017-09-15 2018-01-30 武汉大学 Airborne LiDAR point cloud roof dough sheet dividing method and system
CN109839624A (en) * 2017-11-27 2019-06-04 北京万集科技股份有限公司 A kind of multilasered optical radar position calibration method and device
CN108052624A (en) * 2017-12-15 2018-05-18 深圳市易成自动驾驶技术有限公司 Processing Method of Point-clouds, device and computer readable storage medium
CN110136070A (en) * 2018-02-02 2019-08-16 腾讯科技(深圳)有限公司 Image processing method, device, computer readable storage medium and electronic equipment
CN110363849A (en) * 2018-04-11 2019-10-22 株式会社日立制作所 A kind of interior three-dimensional modeling method and system

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
A comprehensive survey on impulse and Gaussian denoising filters for digital images; Mehdi Mafi et al.; Signal Processing; 2019-04-30; vol. 157; pp. 236-260 *
A new directional weighted median filter for removal of random-valued impulse noise; Yiqiu Dong et al.; IEEE Signal Processing Letters; 2007-02-20; vol. 14, no. 3; pp. 193-196 *
A real-time scene recognition algorithm for smart phones (一种智能手机上的场景实时识别算法); Gui Zhenwen et al.; Acta Automatica Sinica (自动化学报); 2014-01-31; vol. 40, no. 1; pp. 83-91, section 1.2.4 *
Research on point matching of remote sensing images based on optimal geometric constraints (基于最佳几何约束的遥感影像点匹配研究); Liao Jianchi; China Master's Theses Full-text Database, Engineering Science and Technology II (中国优秀硕士学位论文全文数据库 工程科技Ⅱ辑); 2019-05-15, no. 05; C028-93 *

Also Published As

Publication number Publication date
CN110827339A (en) 2020-02-21

Similar Documents

Publication Publication Date Title
CN110827339B (en) Method for extracting target point cloud
CN107918931B (en) Image processing method and system and computer readable storage medium
CN104182974B (en) A speeded up method of executing image matching based on feature points
JP6202147B2 (en) Curve detection method and curve detection apparatus
KR101618996B1 (en) Sampling method and image processing apparatus for estimating homography
CN110210067B (en) Method and device for determining threshold straight line based on measurement track
CN108229232B (en) Method and device for scanning two-dimensional codes in batch
CN110956078B (en) Power line detection method and device
CN108961316B (en) Image processing method and device and server
CN102214290B (en) License plate positioning method and license plate positioning template training method
CN109410183B (en) Plane extraction method, system and device based on point cloud data and storage medium
KR101421128B1 (en) Extraction method of building regions using segmented 3d raw datd based on laser radar
CN112686842B (en) Light spot detection method and device, electronic equipment and readable storage medium
CN110942473A (en) Moving target tracking detection method based on characteristic point gridding matching
KR101725041B1 (en) Method and apparatus for detecting forged image
CN113970734A (en) Method, device and equipment for removing snowing noise of roadside multiline laser radar
CN111507919A (en) Denoising processing method for three-dimensional point cloud data
CN117333795A (en) River surface flow velocity measurement method and system based on screening post-treatment
CN115115548A (en) Point cloud repairing method and device, electronic equipment and storage equipment
CN113012126B (en) Method, device, computer equipment and storage medium for reconstructing marking point
CN111507339B (en) Target point cloud acquisition method based on intensity image
Li et al. Ground target recognition based on imaging LADAR point cloud data
CN117392211B (en) BGA element rapid identification positioning method and system and storage medium
CN114037723B (en) Method and device for extracting mountain vertex based on DEM data and storage medium
CN111383231B (en) Image segmentation method, device and system based on 3D image

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230414

Address after: 100121 No. 20, 1st floor, building 6, Shuangqiao East Road, Chaoyang District, Beijing

Patentee after: Beijing Lianping Technology Co.,Ltd.

Address before: 100022 s1067, 1st floor, 1089 Huihe South Street, Banbidian village, Gaobeidian Township, Chaoyang District, Beijing

Patentee before: Beijing Shenzhen Survey Technology Co.,Ltd.

TR01 Transfer of patent right
CP01 Change in the name or title of a patent holder

Address after: 100121 No. 20, 1st floor, building 6, Shuangqiao East Road, Chaoyang District, Beijing

Patentee after: Beijing Lianping Technology Co.,Ltd.

Address before: 100121 No. 20, 1st floor, building 6, Shuangqiao East Road, Chaoyang District, Beijing

Patentee before: Beijing Lianping Technology Co.,Ltd.

CP01 Change in the name or title of a patent holder