
CN108665491B - Rapid point cloud registration method based on local reference points - Google Patents

Rapid point cloud registration method based on local reference points

Info

Publication number
CN108665491B
CN108665491B
Authority
CN
China
Prior art keywords
point
points
point cloud
matching
neighborhood
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810241936.1A
Other languages
Chinese (zh)
Other versions
CN108665491A (en)
Inventor
宋锐
杨星辉
李云松
贾媛
王养利
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xidian University
Original Assignee
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xidian University filed Critical Xidian University
Priority to CN201810241936.1A priority Critical patent/CN108665491B/en
Publication of CN108665491A publication Critical patent/CN108665491A/en
Application granted granted Critical
Publication of CN108665491B publication Critical patent/CN108665491B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/30Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10028Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The invention belongs to the technical field of three-dimensional reconstruction and discloses a rapid point cloud registration method based on local reference points. The input original point clouds are downsampled to obtain corresponding sparse point clouds, and each point in a sparse point cloud is used as a reference point to compute local included-angle features. A kd-tree is built from the local included-angle features of one view, and the nearest-neighbor feature of each local included-angle feature of the other view is searched in it; the k1 pairs with the smallest Euclidean distance error among these nearest neighbors are taken as seed matching pairs, and further matching pairs are searched in their neighborhoods with each seed as the reference. A rigid-body transformation is then solved to obtain the transformation matrix between the adjacent views, completing the coarse registration, and the coarse result is refined with a partially overlapping ICP registration algorithm. By establishing initial matches from the fast included-angle features and then propagating matches through their neighborhoods, the invention improves point cloud registration speed while maintaining registration accuracy.

Description

Rapid point cloud registration method based on local reference points
Technical Field
The invention belongs to the technical field of three-dimensional reconstruction, and particularly relates to a quick point cloud registration method based on local reference points.
Background
Currently, the state of the art commonly used in the industry is as follows. Vision is the most important way for humans to perceive the world, and video plays a critical role in vision-related applications. A three-dimensional reconstruction algorithm based on RGB-D images recovers the structure of a target object from the motion information of a depth camera, i.e., it recovers structure from motion; in this respect it resembles three-dimensional reconstruction based on image sequences, but the two approaches differ greatly in how the camera motion is estimated. In the image-sequence approach, image features are analyzed and feature points are extracted, the feature points of two adjacent frames are matched, the camera transformation matrix is estimated from the matching point pairs, and once the camera pose is known the matching-pair information is mapped into three-dimensional space to obtain the point cloud of the target. The last step of both reconstruction methods is surface reconstruction on the acquired point cloud to obtain the final reconstructed target. The point cloud registration algorithms used in RGB-D based three-dimensional reconstruction can be divided into three types according to the conditions used during registration: pure coordinate matching, matching of joint coordinates and texture information, and matching that analyzes the structural characteristics of the point clouds. Pure coordinate matching completes registration using only the coordinate information of the point cloud of each view, without other auxiliary information. The most classical algorithm of this kind is the iterative closest point (ICP) algorithm. It assumes that the motion between the depth cameras corresponding to the two point clouds to be registered is small, i.e., that the two point clouds have a large overlap ratio; it then builds a kd-tree for one point cloud from its coordinates, searches the kd-tree for the nearest point of every point in the other point cloud, and iteratively approximates the true camera transformation matrix from the obtained nearest-point relationship, thereby registering the two point clouds. Such algorithms require small camera motion between adjacent views, and the overlap ratio between the two point clouds to be registered must be high; otherwise the iteration fails to converge to the correct result and the final point cloud registration fails. The matching algorithm of joint coordinates and texture information merely adds the texture information of each point to the pure coordinate matching algorithm, expanding the kd-tree from three dimensions to four and thereby further constraining the registration of the two views. However, it still requires a high overlap ratio between the two point clouds and limited camera motion between the two views; otherwise the final registration also fails.
A matching algorithm that analyzes the structural characteristics of the point clouds establishes descriptive information from common characteristics shared by the two point clouds, computes the matching pairs between the two point clouds from this description, and estimates the camera transformation matrix from the matching pairs to obtain the final registration result. Such algorithms spend a great deal of time building a descriptor for every point in the point cloud.
In summary, the problems of the prior art are as follows: the existing three classes of point cloud registration algorithms require small camera motion and a high overlap ratio, and building a descriptor for every point in the point cloud is time-consuming.
The difficulty and significance of solving these technical problems are as follows: reducing the time complexity of registration while ensuring point cloud registration accuracy is a difficult point of current research. Reducing the time complexity of point cloud registration enables real-time computation in a three-dimensional reconstruction system.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a quick point cloud registration method based on local reference points.
The invention is realized as follows. The rapid point cloud registration method based on local reference points downsamples the input original point clouds to obtain corresponding sparse point clouds; each point in a sparse point cloud is taken as a reference point to establish local included-angle features and a multi-scale local normal-vector description, and after the local included-angle features are obtained a kd-tree is built from the local included-angle features of one view. The nearest-neighbor feature of each local included-angle feature of the other view is searched in the kd-tree, the k1 pairs with the smallest Euclidean distance error among the nearest neighbors are taken as seed matching pairs, and then, with each seed matching pair in turn as the reference match, further matching pairs are searched in its neighborhood; the search stops when the number of matched pairs reaches a specified threshold. A rigid-body transformation is then solved to obtain the transformation matrix between the adjacent views, completing the coarse registration of the point clouds, and the coarse registration result is optimized with a partially overlapping ICP registration algorithm.
Further, the fast point cloud registration method based on the local reference points comprises the following steps:
step one, performing down-sampling on an original input point cloud P, Q to obtain corresponding sparse point clouds Pd and Qd;
step two, taking each point in the sparse point cloud as a reference point, finding the k nearest and k farthest points within its r1 neighborhood in the corresponding original point cloud, and then taking the included angles formed by a nearest point, the reference point, and a farthest point as features;
step three, taking each point in the sparse point cloud as a reference point, and obtaining an approximate local normal-vector description over L neighborhoods of the corresponding original point cloud according to the data correlation between the reference point and the points in the neighborhoods;
step four, establishing a kd-tree by using the included angle characteristics of each point in Pd, searching the nearest neighbor of each included angle characteristic in Qd in the kd-tree, and finally selecting only the k1 pair with the minimum Euclidean error as a seed matching pair;
step five, obtaining included angles between other points in the Pd and normal vectors of the seed points, then finding other matching points in the neighborhood of the seed points by using included angle constraint, and finishing coarse registration according to the matching points;
and step six, optimizing the coarse registration result obtained in the step five by using a partial overlapping ICP algorithm.
Further, the down-sampling of the original input point cloud in step one specifically includes: dividing the point cloud space into many small cubes with a grid, so that the points of the point cloud fall into their corresponding cubic cells as point clusters and the whole point cloud is divided into a number of point-cluster sets; each cluster is then represented by the coordinate mean of all its points, and the point cloud obtained by this processing is the sparse point cloud produced by downsampling the original point cloud.
Further, the method for calculating the local included-angle feature in step two includes: each point in the sparse point cloud is used as a reference point to find the k nearest and k farthest points within its r1 neighborhood in the corresponding original point cloud; these 2k points are sorted by their distance to the reference point, from smallest to largest. Let the current reference point be x and its k nearest and k farthest points be denoted X = {x1, ..., x2k}; the vectors they form with the reference point are given by:
[Formula rendered as an image in the original: the vectors Pl formed between the reference point x and each of the 2k neighbors xl, l = 1, ..., 2k.]
where x1 is the closest of the k nearest points and x2k is the farthest of the k farthest points. The included-angle feature is then computed by:
[Formula rendered as an image in the original: the included-angle features computed from these vectors, i.e., the angles at the reference point between the nearest-point vectors and the farthest-point vectors.]
The superscript on a term in the angle formula indicates that term's position within the vector set.
Further, the method for calculating the multi-scale local normal vectors in step three includes: for each neighborhood radius, the differences between the points inside that radius and the current reference point of the sparse point cloud are computed as:
[Formula rendered as an image in the original: the correlation matrix Cl built from the differences xi − x over the points xi in the neighborhood Sl.]
where l = 1, ..., L and Sl = {xi | ||xi − x|| ≤ rl}. Applying singular value decomposition to the correlation matrix Cl at each scale yields three eigenvalues λl1 ≥ λl2 ≥ λl3 and the corresponding eigenvectors nl1, nl2, nl3; the vector nl3 corresponding to the smallest eigenvalue λl3 is selected as the local description vector at that scale. The joint expression over all scales is written in matrix form as:
N = (n1, ..., nL);
where n1 to nL are the local description vectors of the respective scales.
Further, the method for selecting the initial seed matching pairs in step four specifically includes: a kd-tree is established for the included-angle features of the points in Pd; the nearest-neighbor feature of each included-angle feature in Qd is searched in the tree, where the nearest-neighbor metric is the Euclidean distance error between two included-angle features; the k1 pairs with the smallest Euclidean distance error are selected as the initial seed matches, denoted S.
Further, the neighborhood propagation method for the initial seed matches in step five includes: let (xi, yj) be a match in the seed match set S, and initialize the match set for this seed to M = {(xi, yj)}. The k2 points closest to xi are found in Pd; the set of these k2 points is denoted P'd, and the largest of their distances to xi is denoted d1. With yj as the reference point, the points of Qd within its d1 neighborhood are found, and the set of these points is denoted Q'd. For any x ∈ P'd \ {xi}, with d = ||x − xi||, the angles between the vector from xi to the point x in its d neighborhood and the multi-scale description N are defined as θ = (θ1, ..., θL)^T, where θl is the angle, at scale rl, between xi's description vector and the vector to x, expressed as:
[Formula rendered as an image in the original: θl, the angle between the vector from xi to x and the scale-rl description vector of xi.]
In Q'd, the points corresponding to all points of xi's d neighborhood are searched within the d' neighborhood of yj, i.e.:
y ∈ Q'd ∩ {y | |d' − d| < ε1};
where d' = ||y − yj||. Analogously to θl, the angles between the vectors of y and yj are written θ' = (θ'1, ..., θ'L)^T, where θ'l is the angle, at scale rl, between yj's description vector and the vector to y, expressed as:
[Formula rendered as an image in the original: θ'l, the angle between the vector from yj to y and the scale-rl description vector of yj.]
The singularity between the angles θ of the points in the d neighborhood of xi and the angles θ' of the points in the d' neighborhood of yj is then defined as:
[Formula rendered as an image in the original: the included-angle singularity, a dissimilarity measure between the angle vectors θ and θ'.]
During neighborhood propagation of a seed match, the point pair with the smallest included-angle singularity in the seed's neighborhood is added to the initial match set as a new matching pair, i.e., M = M ∪ {(x, y)}; if the included-angle singularity of every point in the neighborhood is Inf, M is kept unchanged. When all points in the d neighborhood of the reference point xi have been traversed, the search stops if the number of matched pairs has reached the preset value Nlim; otherwise the most recently added match becomes the new reference point and the search continues. Since there are n matching pairs in the initial seed set, the match sets M1, ..., Mn are obtained; a transformation matrix is computed from each match set, point cloud registration is completed from the transformation matrices, and the transformation with the smallest registration error is selected to transform the original input point cloud.
Further, the method for optimizing the coarse registration result with the partially overlapping ICP algorithm in step six includes: the nearest point of each point of the coarsely registered point clouds is found with a nearest-neighbor search; the mean square error of the k3 point pairs with the smallest nearest-neighbor error is taken as the initial iteration error, and those k3 point pairs are taken as the initial matching pairs of the iteration. A transformation matrix between the two point clouds is solved from the matching pairs and one view is transformed; a nearest-neighbor search is performed again on the transformed point clouds, the mean square error of the k3 point pairs with the smallest nearest-neighbor error is computed, and the iteration stops if the error change between two adjacent iterations is below a threshold and the current mean square error is below an error threshold, or the maximum number of iterations is reached; otherwise the iteration is repeated until it stops.
Another object of the present invention is to provide a method for fast point cloud registration based on local reference points.
In summary, the advantages and positive effects of the invention are:
the method is suitable for registration when the point cloud overlapping rate is low, and the registration effect of a plurality of point cloud models is improved. Therefore, the invention is a registration algorithm with low computational complexity and high applicability.
Drawings
Fig. 1 is a flowchart of a local reference point-based fast point cloud registration method according to an embodiment of the present invention.
Fig. 2 is a test chart of the bun model according to the embodiment of the present invention.
Fig. 3 is a test chart for the Arm model provided by the embodiment of the present invention.
Fig. 4 is a test chart of the dragon model provided by the embodiment of the present invention.
Fig. 5 is a test chart for the happy model according to the embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is further described in detail with reference to the following embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a quick point cloud registration method based on local reference points, and aims to solve the problem of high time complexity when the existing algorithm meets the registration accuracy.
As shown in fig. 1, the fast point cloud registration method based on local reference points provided by the embodiment of the present invention includes the following steps:
s101: the original input point cloud P, Q is subjected to down-sampling to obtain corresponding sparse point clouds Pd and Qd;
s102: taking each point in the sparse point cloud as a reference point, finding the k nearest and k farthest points within its r1 neighborhood in the corresponding original point cloud, and then taking the included angles formed by a nearest point, the reference point, and a farthest point as features;
s103: taking each point in the sparse point cloud as a reference point, and obtaining an approximate local normal-vector description over L neighborhoods of the corresponding original point cloud according to the data correlation between the reference point and the points in the neighborhoods;
s104: establishing a kd-tree by using the included angle characteristics of each point in Pd, searching the nearest neighbor of each included angle characteristic in Qd in the kd-tree, and finally selecting only a k1 pair with the minimum Euclidean error as a seed matching pair;
s105: firstly, obtaining included angles between other points in Pd and normal vectors of seed points, then finding other matching points in the neighborhood of the seed points by using included angle constraint, and finishing coarse registration according to the matching points;
s106: and optimizing the coarse registration result obtained in the step S105 by using a partial overlapping ICP algorithm.
In the embodiment of the present invention, the down-sampling of the original input point cloud in step S101 specifically includes: the point cloud space is divided into many small cubes with a grid, so that the points of the whole point cloud fall into their corresponding cubic cells as point clusters and the whole point cloud is divided into a number of point-cluster sets; each cluster is then represented by the coordinate mean of all its points, and the point cloud obtained by this processing is the sparse point cloud produced by downsampling the original point cloud. The structure of the original point cloud is largely preserved by this method.
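As a rough illustration of this step (not part of the patent), the Python sketch below averages the points that fall into each cube of a regular grid; the function name voxel_downsample and the voxel_size parameter are illustrative assumptions:

```python
import numpy as np

def voxel_downsample(points, voxel_size):
    """Voxel-grid downsampling: replace every occupied cubic cell by the
    coordinate mean of the points that fall inside it.

    points: (N, 3) array of xyz coordinates; voxel_size: cell edge length.
    """
    # Integer cell index of every point.
    cells = np.floor(points / voxel_size).astype(np.int64)
    # Group points by cell and average each group.
    _, inverse = np.unique(cells, axis=0, return_inverse=True)
    sums = np.zeros((inverse.max() + 1, 3))
    counts = np.zeros(inverse.max() + 1)
    np.add.at(sums, inverse, points)
    np.add.at(counts, inverse, 1.0)
    return sums / counts[:, None]
```

The cell edge length controls how sparse the resulting clouds Pd and Qd become.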
In the embodiment of the invention, the local included-angle feature is calculated as follows. Each point in the sparse point cloud is taken as a reference point, and the k nearest and k farthest points within its r1 neighborhood are found in the corresponding original point cloud; these 2k points are sorted by their distance to the reference point, i.e., arranged from smallest to largest. If the current reference point is x and its k nearest and k farthest points are denoted X = {x1, ..., x2k}, then the vectors they form with the reference point can be expressed as:
[Formula rendered as an image in the original: the vectors Pl formed between the reference point x and each of the 2k neighbors xl, l = 1, ..., 2k.]
where x1 is the closest of the k nearest points and x2k is the farthest of the k farthest points. The included-angle feature is then computed by:
[Formula rendered as an image in the original: the included-angle features computed from these vectors, i.e., the angles at the reference point between the nearest-point vectors and the farthest-point vectors.]
The superscript on a term in the angle formula indicates that term's position within the vector set; for example, Pl^1 denotes the first term of Pl and Pl^2k denotes the 2k-th, i.e., last, term of Pl.
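A minimal sketch of one way to compute such an angle feature is given below. It is not the patent's exact formula, which is only available as an image; in particular, pairing the i-th nearest point with the i-th farthest point is an assumption:

```python
import numpy as np
from scipy.spatial import cKDTree

def angle_features(ref, cloud, r1, k):
    """Included angles at the reference point `ref` between the vectors to its
    k nearest and k farthest neighbours within the r1-ball of `cloud`.
    Assumed pairing: i-th nearest with i-th farthest neighbour."""
    idx = cKDTree(cloud).query_ball_point(ref, r1)
    nbrs = cloud[idx]
    dists = np.linalg.norm(nbrs - ref, axis=1)
    order = np.argsort(dists)
    order = order[dists[order] > 0]         # drop the reference point itself if present
    if len(order) < 2 * k:
        raise ValueError("fewer than 2k neighbours inside the r1 ball")
    near = nbrs[order[:k]] - ref            # vectors to the k nearest points
    far = nbrs[order[-k:]][::-1] - ref      # vectors to the k farthest points
    cosines = np.sum(near * far, axis=1) / (
        np.linalg.norm(near, axis=1) * np.linalg.norm(far, axis=1) + 1e-12)
    return np.arccos(np.clip(cosines, -1.0, 1.0))
```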
In the embodiment of the present invention, the multi-scale local normal vectors in step S103 are calculated as follows. For each neighborhood radius, the differences between the points inside that radius and the current reference point of the sparse point cloud are first computed, and the relationship can be expressed as:
[Formula rendered as an image in the original: the correlation matrix Cl built from the differences xi − x over the points xi in the neighborhood Sl.]
where l = 1, ..., L and Sl = {xi | ||xi − x|| ≤ rl}. Applying singular value decomposition to the correlation matrix Cl at each scale yields three eigenvalues λl1 ≥ λl2 ≥ λl3 and the corresponding eigenvectors nl1, nl2, nl3; only the vector nl3 corresponding to the smallest eigenvalue λl3 is kept as the local description vector at that scale. The joint expression over all scales is therefore written in matrix form as:
N=(n1,...,nL) (4)
where n1 to nL are the local description vectors of the respective scales.
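A minimal sketch of this multi-scale description follows, assuming the correlation matrix at each radius is the sum of outer products of the differences xi − x (the original formula is only an image):

```python
import numpy as np

def multiscale_normals(ref, cloud, radii):
    """For each radius r_l, take the direction associated with the smallest
    eigenvalue of the local correlation matrix as the description vector.
    Returns a (3, L) matrix N = (n_1, ..., n_L)."""
    columns = []
    for r in radii:
        diffs = cloud[np.linalg.norm(cloud - ref, axis=1) <= r] - ref
        C = diffs.T @ diffs          # assumed form of the correlation matrix C_l
        _, _, Vt = np.linalg.svd(C)  # singular values come out in descending order
        columns.append(Vt[-1])       # direction of the smallest eigenvalue
    return np.stack(columns, axis=1)
```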
In the embodiment of the present invention, the initial seed matching pairs in step S104 are selected as follows: a kd-tree is established for the included-angle features of the points in Pd, and the nearest-neighbor feature of each included-angle feature in Qd is then searched in the tree, where the nearest-neighbor metric is the Euclidean distance error between two included-angle features. Only the k1 pairs with the smallest Euclidean distance error are selected as the initial seed matches, denoted S. The parameter k1 can be set according to actual needs.
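The seed selection can be sketched as follows; feat_P and feat_Q stand for the per-point angle-feature arrays of Pd and Qd (hypothetical names, not from the patent):

```python
import numpy as np
from scipy.spatial import cKDTree

def seed_matches(feat_P, feat_Q, k1):
    """Match each angle feature of Qd to its nearest feature in Pd and keep
    the k1 pairs with the smallest Euclidean feature distance as seeds."""
    tree = cKDTree(feat_P)
    dists, idx_P = tree.query(feat_Q, k=1)   # nearest Pd feature for each Qd point
    best_Q = np.argsort(dists)[:k1]          # k1 smallest feature errors
    return [(int(idx_P[q]), int(q)) for q in best_Q]   # (index in Pd, index in Qd)
```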
In the embodiment of the present invention, the neighborhood propagation of an initial seed match in S105 proceeds as follows. Suppose (xi, yj) is a match in the seed match set S; the match set for this seed is initialized to M = {(xi, yj)}. The k2 points closest to xi are first found in Pd; the set of these k2 points is denoted P'd, and the largest of their distances to xi is denoted d1. Then, with yj as the reference point, the points of Qd within its d1 neighborhood are found, and the set of these points is denoted Q'd. For any x ∈ P'd \ {xi}, with d = ||x − xi||, the angles between the vector from xi to the point x in its d neighborhood and the multi-scale description of xi are defined as θ = (θ1, ..., θL)^T, where θl is the angle, at scale rl, between xi's description vector and the vector to x, expressed as:
[Formula rendered as an image in the original: θl, the angle between the vector from xi to x and the scale-rl description vector of xi.]
Then, in Q'd, the points corresponding to all points of xi's d neighborhood are searched within the d' neighborhood of yj; these points can be expressed as:
y ∈ Q'd ∩ {y | |d' − d| < ε1} (6)
where d' = ||y − yj||. Analogously to θl, the angles between the vectors of y and yj can be written θ' = (θ'1, ..., θ'L)^T, where θ'l is the angle, at scale rl, between yj's description vector and the vector to y, expressed as:
[Formula rendered as an image in the original: θ'l, the angle between the vector from yj to y and the scale-rl description vector of yj.]
The singularity between the angles θ of the points in the d neighborhood of xi and the angles θ' of the points in the d' neighborhood of yj is then defined as:
[Formula rendered as an image in the original: the included-angle singularity, a dissimilarity measure between the angle vectors θ and θ'.]
During neighborhood propagation of a seed match, the point pair with the smallest included-angle singularity in the seed's neighborhood is added to the initial match set as a new matching pair, at which time M = M ∪ {(x, y)}; if the included-angle singularity of every point in the neighborhood is Inf, M is kept unchanged. When all points in the d neighborhood of the reference point xi have been traversed, the search stops if the number of matched pairs has reached the preset value Nlim; otherwise the most recently added match becomes the new reference point and the next search is conducted. Since there are n matching pairs in the initial seed set, the match sets M1, ..., Mn are finally obtained; a transformation matrix is computed from each match set, point cloud registration is completed from the transformation matrices, and the transformation with the smallest registration error is selected to transform the original input point cloud.
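The propagation around a single seed can be sketched as below. Because the singularity formula is only an image in the original, the sketch assumes it is the Euclidean distance between the angle vectors θ and θ', and it simplifies the procedure by keeping xi and yj as the reference pair rather than switching to the last added match; N_P and N_Q stand for the per-point multi-scale description matrices of Pd and Qd (hypothetical names):

```python
import numpy as np

def angle_vector(ref_pt, other_pt, N_ref):
    """Angles between the vector from the reference point to `other_pt` and
    each column (scale) of the reference point's description matrix N_ref."""
    v = other_pt - ref_pt
    cos = np.abs(N_ref.T @ v) / (np.linalg.norm(v) + 1e-12)
    return np.arccos(np.clip(cos, 0.0, 1.0))

def propagate_from_seed(xi, yj, Pd, Qd, N_P, N_Q, k2, eps1, n_limit):
    """Grow a match set around one seed pair (xi, yj) of indices into Pd/Qd."""
    matches = [(xi, yj)]
    dist_to_xi = np.linalg.norm(Pd - Pd[xi], axis=1)
    cand_P = np.argsort(dist_to_xi)[1:k2 + 1]          # k2 points closest to xi
    d1 = dist_to_xi[cand_P].max()
    cand_Q = np.where(np.linalg.norm(Qd - Qd[yj], axis=1) <= d1)[0]
    for x in cand_P:
        d = dist_to_xi[x]
        theta = angle_vector(Pd[xi], Pd[x], N_P[xi])
        best_y, best_sing = None, np.inf
        for y in cand_Q:
            d_prime = np.linalg.norm(Qd[y] - Qd[yj])
            if abs(d_prime - d) >= eps1:               # distance constraint |d' - d| < eps1
                continue
            theta_p = angle_vector(Qd[yj], Qd[y], N_Q[yj])
            sing = np.linalg.norm(theta - theta_p)     # assumed singularity measure
            if sing < best_sing:
                best_y, best_sing = y, sing
        if best_y is not None:
            matches.append((int(x), int(best_y)))
        if len(matches) >= n_limit:
            break
    return matches
```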
In the embodiment of the invention, the partially overlapping ICP algorithm optimizes the coarse registration result of step five as follows. The nearest point of each point of the coarsely registered point clouds is first found with a nearest-neighbor search; the mean square error of the k3 point pairs with the smallest nearest-neighbor error is taken as the initial iteration error, and those k3 point pairs are taken as the initial matching pairs of the iteration. A transformation matrix between the two point clouds is then solved from the matching pairs and one view is transformed; a nearest-neighbor search is performed again on the transformed point clouds to find nearest-neighbor matches, the mean square error of the k3 point pairs with the smallest nearest-neighbor error is computed at the same time, and the iteration stops if the error change between two adjacent iterations is below a threshold and the current mean square error is below an error threshold, or the maximum number of iterations is reached. Otherwise the iteration is repeated until it stops. The iteration error threshold and the iteration count threshold can be set according to the specific requirements on accuracy and time complexity.
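A minimal sketch of this partially overlapping ICP refinement, assuming the rigid transform is fitted to the k3 best nearest-neighbour pairs with the standard SVD (Kabsch) solution; thresholds and names are illustrative, not the patent's:

```python
import numpy as np
from scipy.spatial import cKDTree

def rigid_fit(src, dst):
    """Least-squares rotation R and translation t mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def partial_overlap_icp(P, Q, k3, max_iter=50, tol=1e-6, err_thresh=1e-4):
    """Refine a coarsely aligned cloud P onto Q using only the k3 point pairs
    with the smallest nearest-neighbour error in every iteration."""
    P = P.copy()
    tree = cKDTree(Q)
    prev_err = np.inf
    for _ in range(max_iter):
        dists, idx = tree.query(P, k=1)
        keep = np.argsort(dists)[:k3]                 # k3 smallest-error pairs
        err = np.mean(dists[keep] ** 2)               # mean square error
        if abs(prev_err - err) < tol and err < err_thresh:
            break                                     # both stopping criteria met
        R, t = rigid_fit(P[keep], Q[idx[keep]])
        P = P @ R.T + t                               # transform the view
        prev_err = err
    return P
```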
The effect of the present invention will be described in detail with reference to the experiments.
To illustrate that the invention improves point cloud registration performance and is suitable for registering a variety of point cloud models, experiments were carried out on several models. The experimental data are given in Tables 1 to 4, and the point cloud registration performance of the models under different point cloud deficiency rates is shown in Figures 2 to 5.
TABLE 1 test results of the bun model
[Table data rendered as an image in the original; not reproduced.]
TABLE 2 Arm model test results
[Table data rendered as an image in the original; not reproduced.]
TABLE 3 dragon model test results
[Table data rendered as an image in the original; not reproduced.]
TABLE 4 happy model test results
[Table data rendered as an image in the original; not reproduced.]
Time, Rot and Tran in the four tables characterize the point cloud registration performance; the smaller these three values, the better the performance. The time complexity of the method is lower than that of the other three algorithms, while the loss in registration accuracy is negligible for the overall transformation. Figures 2 to 5 test the models under different deficiency rates, and the test results show that the algorithm has good registration performance.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention are intended to be included within the scope of the present invention.

Claims (7)

1. A fast point cloud registration method based on local reference points, characterized in that the fast point cloud registration method based on local reference points downsamples the input original point clouds to obtain corresponding sparse point clouds; each point in a sparse point cloud is taken as a reference point to establish local included-angle features and a multi-scale local normal-vector description, and after the local included-angle features are obtained a kd-tree is built from the local included-angle features of one view; the nearest-neighbor feature of each local included-angle feature of the other view is searched in the kd-tree, the k1 pairs with the smallest Euclidean distance error among the nearest neighbors are taken as seed matching pairs, and then, with each seed matching pair in turn as the reference match, further matching pairs are searched in its neighborhood; the search stops when the number of matched pairs reaches a specified threshold; a rigid-body transformation is solved to obtain the transformation matrix between the adjacent views, completing the coarse registration of the point clouds, and the coarse registration result is optimized with a partially overlapping ICP registration algorithm;
the quick point cloud registration method based on the local reference points comprises the following steps:
step one, performing down-sampling on an original input point cloud P, Q to obtain corresponding sparse point clouds Pd and Qd;
step two, each point in the sparse point cloud is used as a reference point, the k nearest and k farthest points are found within its r1 neighborhood in the corresponding original point cloud, and the included angles formed by a nearest point, the reference point, and a farthest point are used as features;
step three, each point in the sparse point cloud is used as a reference point, and an approximate local normal-vector description is obtained over L neighborhoods of the corresponding original point cloud according to the data correlation between the reference point and the points in the neighborhoods;
step four, establishing a kd-tree by using the included angle characteristics of each point in Pd, searching the nearest neighbor of each included angle characteristic in Qd in the kd-tree, and finally selecting only the k1 pair with the minimum Euclidean error as a seed matching pair;
step five, obtaining included angles between other points in the Pd and normal vectors of the seed points, then finding other matching points in the neighborhood of the seed points by using included angle constraint, and finishing coarse registration according to the matching points;
and step six, optimizing the coarse registration result obtained in the step five by using a partial overlapping ICP algorithm.
2. The local reference point-based fast point cloud registration method according to claim 1, wherein the down-sampling of the original input point cloud in the first step specifically comprises: the method comprises the steps of dividing a point cloud space into a plurality of small cubes by using a grid, enabling points in the point cloud to respectively fall into corresponding cubic grids in a point cluster mode, dividing the whole point cloud into a plurality of point cluster sets, representing the point cluster sets by coordinate mean values of all the points in each point cluster set, and obtaining the point cloud which is subjected to the processing as the original point cloud and subjected to down-sampling sparse point cloud.
3. The local reference point-based fast point cloud registration method according to claim 1, wherein the calculation method of the local included-angle feature in step two comprises: each point in the sparse point cloud is used as a reference point to find the k nearest and k farthest points within its r1 neighborhood in the corresponding original point cloud; these 2k points are sorted by their distance to the reference point, from smallest to largest; the current reference point is x, and its k nearest and k farthest points are denoted X = {x1, ..., x2k}; the vectors they form with the reference point are given by:
[Formula rendered as an image in the original: the vectors Pl formed between the reference point x and each of the 2k neighbors xl, l = 1, ..., 2k.]
where x1 is the closest of the k nearest points and x2k is the farthest of the k farthest points; the included-angle feature is then computed by:
[Formula rendered as an image in the original: the included-angle features computed from these vectors, i.e., the angles at the reference point between the nearest-point vectors and the farthest-point vectors.]
the superscript on a term in the angle formula indicates that term's position within the vector set.
4. The local reference point-based fast point cloud registration method according to claim 1, wherein the calculation method of the multi-scale local normal vectors in step three comprises: for each neighborhood radius, the differences between the points inside that radius and the current reference point of the sparse point cloud are computed as:
[Formula rendered as an image in the original: the correlation matrix Cl built from the differences xi − x over the points xi in the neighborhood Sl.]
where l = 1, ..., L and Sl = {xi | ||xi − x|| ≤ rl}; applying singular value decomposition to the correlation matrix Cl at each scale yields three eigenvalues λl1 ≥ λl2 ≥ λl3 and the corresponding eigenvectors nl1, nl2, nl3; the vector nl3 corresponding to the smallest eigenvalue λl3 is selected as the local description vector at each scale; the joint expression over all scales is written in matrix form as:
N = (n1, ..., nL);
where n1 to nL are the local description vectors of the respective scales.
5. The local reference point-based fast point cloud registration method according to claim 1, wherein the selection method of the initial seed matching pairs in step four specifically comprises: establishing a kd-tree for the included-angle features of the points in Pd; searching in the tree for the nearest-neighbor feature corresponding to each included-angle feature in Qd, where the nearest-neighbor metric is the Euclidean distance error between two included-angle features; and selecting the k1 pairs with the smallest Euclidean distance error as the initial seed matches, denoted S.
6. The local reference point-based fast point cloud registration method of claim 5, wherein the neighborhood propagation method for the initial seed matches comprises: (xi, yj) is a match in the seed match set S, and the match set for this seed is initialized to M = {(xi, yj)}; the k2 points closest to xi are found in Pd, the set of these k2 points is denoted P'd, and the largest of their distances to xi is denoted d1; with yj as the reference point, the points of Qd within its d1 neighborhood are found, and the set of these points is denoted Q'd; for any x ∈ P'd \ {xi}, with d = ||x − xi||, the angles between the vector from xi to the point x in its d neighborhood and the multi-scale description N are defined as θ = (θ1, ..., θL)^T, where θl is the angle, at scale rl, between xi's description vector and the vector to x, expressed as:
[Formula rendered as an image in the original: θl, the angle between the vector from xi to x and the scale-rl description vector of xi.]
in Q'd, the points corresponding to all points of xi's d neighborhood are searched within the d' neighborhood of yj, expressed as:
y ∈ Q'd ∩ {y | |d' − d| < ε1};
where d' = ||y − yj||; analogously to θl, the angles between the vectors of y and yj are written θ' = (θ'1, ..., θ'L)^T, where θ'l is the angle, at scale rl, between yj's description vector and the vector to y, expressed as:
[Formula rendered as an image in the original: θ'l, the angle between the vector from yj to y and the scale-rl description vector of yj.]
the singularity between the angles θ of the points in the d neighborhood of xi and the angles θ' of the points in the d' neighborhood of yj is then defined as:
[Formula rendered as an image in the original: the included-angle singularity, a dissimilarity measure between the angle vectors θ and θ'.]
during neighborhood propagation of a seed match, the point pair with the smallest included-angle singularity in the seed's neighborhood is added to the initial match set as a new matching pair, i.e., M = M ∪ {(x, y)}; if the included-angle singularity of every point in the neighborhood is Inf, M is kept unchanged; when all points in the d neighborhood of the reference point xi have been traversed, the search stops if the number of matched pairs has reached the preset value Nlim; otherwise the most recently added match becomes the new reference point and the next search is conducted; since there are n matching pairs in the initial seed set, the match sets M1, ..., Mn are obtained; a transformation matrix is computed from each match set, point cloud registration is completed from the transformation matrices, and the transformation with the smallest registration error is selected to transform the original input point cloud.
7. The local reference point-based rapid point cloud registration method according to claim 1, wherein the optimization method of the coarse registration result by the partially overlapping ICP algorithm in step six comprises: finding, with a nearest-neighbor search, the nearest point of each point of the coarsely registered point clouds; taking the mean square error of the k3 point pairs with the smallest nearest-neighbor error as the initial iteration error and those k3 point pairs as the initial matching pairs of the iteration; solving a transformation matrix between the two point clouds from the matching pairs and transforming one view; performing a nearest-neighbor search again on the transformed point clouds to find nearest-neighbor matches while computing the mean square error of the k3 point pairs with the smallest nearest-neighbor error; and stopping the iteration if the error change between two adjacent iterations is below a threshold and the current mean square error is below an error threshold, or the maximum number of iterations is reached; otherwise, the iteration is repeated until it stops.
CN201810241936.1A 2018-03-22 2018-03-22 Rapid point cloud registration method based on local reference points Active CN108665491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810241936.1A CN108665491B (en) 2018-03-22 2018-03-22 Rapid point cloud registration method based on local reference points

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810241936.1A CN108665491B (en) 2018-03-22 2018-03-22 Rapid point cloud registration method based on local reference points

Publications (2)

Publication Number Publication Date
CN108665491A CN108665491A (en) 2018-10-16
CN108665491B (en) 2022-04-12

Family

ID=63782241

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810241936.1A Active CN108665491B (en) 2018-03-22 2018-03-22 Rapid point cloud registration method based on local reference points

Country Status (1)

Country Link
CN (1) CN108665491B (en)

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109887012B (en) * 2019-01-09 2023-03-31 天津大学 Point cloud registration method combined with self-adaptive search point set
CN109887009B (en) * 2019-01-24 2022-12-09 西北大学 Point cloud local matching method
CN109859256B (en) * 2019-03-13 2023-03-31 大连理工大学 Three-dimensional point cloud registration method based on automatic corresponding point matching
CN109919984A (en) * 2019-04-15 2019-06-21 武汉惟景三维科技有限公司 A kind of point cloud autoegistration method based on local feature description's
CN111223132A (en) * 2019-12-25 2020-06-02 华东师范大学 Object registration method and system
CN111192300B (en) * 2019-12-30 2023-11-07 南京航空航天大学 Sparse marker point data registration method and device for local deformation measurement
CN112102376B (en) * 2020-08-04 2023-06-06 广东工业大学 Multi-view cloud registration method, device and storage medium of hybrid sparse ICP
CN112381862B (en) * 2020-10-27 2022-10-14 新拓三维技术(深圳)有限公司 Full-automatic registration method and device for CAD (computer-aided design) model and triangular mesh
CN112861674B (en) * 2021-01-28 2024-09-06 中振同辂(江苏)机器人有限公司 Point cloud optimization method based on ground characteristics and computer readable storage medium
CN113223062B (en) * 2021-06-04 2024-05-07 武汉工控仪器仪表有限公司 Point cloud registration method based on corner feature point selection and quick description
CN113239500B (en) * 2021-07-12 2021-09-21 四川大学 Reference point neighborhood feature matching method based on covariance matrix
CN113706587B (en) * 2021-07-14 2022-12-09 西安交通大学 Rapid point cloud registration method, device and equipment based on space grid division

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006065399A (en) * 2004-08-24 2006-03-09 Sony Corp Image processing device and method, storage medium and program
CN102799763A (en) * 2012-06-20 2012-11-28 北京航空航天大学 Point cloud posture standardization-based method for extracting linear characteristic of point cloud
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN104112289A (en) * 2014-01-29 2014-10-22 辽宁师范大学 Three-dimensional object point cloud registration method based on parallel cascaded EM-ICP
CN104134216A (en) * 2014-07-29 2014-11-05 武汉大学 Laser point cloud auto-registration method and system based on 16-dimension feature description
CN104299260A (en) * 2014-09-10 2015-01-21 西南交通大学 Contact network three-dimensional reconstruction method based on SIFT and LBP point cloud registration
CN104463826A (en) * 2013-09-18 2015-03-25 镇江福人网络科技有限公司 Novel point cloud parallel Softassign registering algorithm
CN104794678A (en) * 2015-05-04 2015-07-22 福建师范大学 Automatic registration method for high-spatial-resolution remote-sensing images based on SIFI feature points
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data
CN105118059A (en) * 2015-08-19 2015-12-02 哈尔滨工程大学 Multi-scale coordinate axis angle feature point cloud fast registration method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018004401A (en) * 2016-06-30 2018-01-11 株式会社トプコン Laser scanner and laser scanner system, and registration method for dot group data

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006065399A (en) * 2004-08-24 2006-03-09 Sony Corp Image processing device and method, storage medium and program
CN102799763A (en) * 2012-06-20 2012-11-28 北京航空航天大学 Point cloud posture standardization-based method for extracting linear characteristic of point cloud
CN103236064A (en) * 2013-05-06 2013-08-07 东南大学 Point cloud automatic registration method based on normal vector
CN104463826A (en) * 2013-09-18 2015-03-25 镇江福人网络科技有限公司 Novel point cloud parallel Softassign registering algorithm
CN104112289A (en) * 2014-01-29 2014-10-22 辽宁师范大学 Three-dimensional object point cloud registration method based on parallel cascaded EM-ICP
CN104134216A (en) * 2014-07-29 2014-11-05 武汉大学 Laser point cloud auto-registration method and system based on 16-dimension feature description
CN104299260A (en) * 2014-09-10 2015-01-21 西南交通大学 Contact network three-dimensional reconstruction method based on SIFT and LBP point cloud registration
CN104794678A (en) * 2015-05-04 2015-07-22 福建师范大学 Automatic registration method for high-spatial-resolution remote-sensing images based on SIFI feature points
CN104952107A (en) * 2015-05-18 2015-09-30 湖南桥康智能科技有限公司 Three-dimensional bridge reconstruction method based on vehicle-mounted LiDAR point cloud data
CN105118059A (en) * 2015-08-19 2015-12-02 哈尔滨工程大学 Multi-scale coordinate axis angle feature point cloud fast registration method

Non-Patent Citations (6)

* Cited by examiner, † Cited by third party
Title
An Iterative Closest Points Algorithm for Registration of 3D Laser Scanner Point Clouds with Geometric Features;Ying He 等;《Sensors 2017,》;20170811;1-16 *
Fast Descriptors and Correspondence Propagation for Robust Global Point Cloud Registration;Huan Lei 等;《IEEE TRANSACTIONS ON IMAGE PROCESSING》;20170503;第26卷(第8期);3614-3623 *
Robust Euclidean alignment of 3D point sets;Dmitry Chetverikov 等;《Image and Vision Computing》;20050119;第23卷(第3期);70-75 *
Point cloud registration technology based on FPFH features;Chen Xuewei et al.;Computer Knowledge and Technology;20170505;Vol. 13, No. 4;207-209 *
Automatic registration of 3D point clouds based on hierarchical block-wise global search;Sun Junhua et al.;Optics and Precision Engineering;20130328;Vol. 21, No. 1;174-180 *
Point cloud registration based on bidirectional randomized KD-trees;Lv Suojun et al.;Industrial Control Computer;20150812;Vol. 28, No. 7;99-101 *

Also Published As

Publication number Publication date
CN108665491A (en) 2018-10-16

Similar Documents

Publication Publication Date Title
CN108665491B (en) Rapid point cloud registration method based on local reference points
CN111968217B (en) SMPL parameter prediction and human body model generation method based on picture
CN108038906B (en) Three-dimensional quadrilateral mesh model reconstruction method based on image
CN109919984A (en) A kind of point cloud autoegistration method based on local feature description's
CN111625667A (en) Three-dimensional model cross-domain retrieval method and system based on complex background image
CN105551015A (en) Scattered-point cloud image registering method
CN111027140B (en) Airplane standard part model rapid reconstruction method based on multi-view point cloud data
CN112232134A (en) Human body posture estimation method based on hourglass network and attention mechanism
CN113989547B (en) Three-dimensional point cloud data classification system and method based on graph convolution depth neural network
CN108305278A (en) Images match correlation improved method in a kind of ORB-SLAM algorithms
CN104463953A (en) Three-dimensional reconstruction method based on inertial measurement unit and RGB-D sensor
CN117132630A (en) Point cloud registration method based on second-order spatial compatibility measurement
CN114913552B (en) Three-dimensional human body density corresponding estimation method based on single-view-point cloud sequence
CN109934859B (en) ICP (inductively coupled plasma) registration method based on feature-enhanced multi-dimensional weight descriptor
CN114119690A (en) Point cloud registration method based on neural network reconstruction Gaussian mixture model
CN114155406A (en) Pose estimation method based on region-level feature fusion
WO2018214179A1 (en) Low-dimensional bundle adjustment calculation method and system
CN115423854B (en) Multi-view point cloud registration and point cloud fusion method based on multi-scale feature extraction
CN114742869B (en) Brain neurosurgery registration method based on pattern recognition and electronic equipment
CN117576303A (en) Three-dimensional image generation method, device, equipment and storage medium
CN116758219A (en) Region-aware multi-view stereo matching three-dimensional reconstruction method based on neural network
CN115861563A (en) Three-dimensional reconstruction method for registration of topological rigid point cloud of graph
CN114140495A (en) Single target tracking method based on multi-scale Transformer
CN113256693A (en) Multi-view registration method based on K-means and normal distribution transformation
CN118134984B (en) Three-dimensional reconstruction algorithm and three-dimensional reconstruction system based on characteristic boundary line

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant