
CN112578396B - Method and device for coordinate transformation between radars and computer-readable storage medium - Google Patents


Info

Publication number
CN112578396B
CN112578396B (application CN201910941969.1A)
Authority
CN
China
Prior art keywords
point cloud
feature
projection
coordinate transformation
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910941969.1A
Other languages
Chinese (zh)
Other versions
CN112578396A
Inventor
沈际春
向少卿
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hesai Technology Co Ltd
Original Assignee
Hesai Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hesai Technology Co Ltd
Priority to CN201910941969.1A
Publication of CN112578396A
Application granted
Publication of CN112578396B
Legal status: Active

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87 Combinations of systems using electromagnetic waves other than radio waves
    • G01S17/02 Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 Systems determining position data of a target
    • G01S17/42 Simultaneous measurement of distance and other co-ordinates
    • G01S7/00 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48 Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Electromagnetism (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

An inter-radar coordinate transformation method and apparatus, and a computer-readable storage medium. The inter-radar coordinate transformation method comprises: selecting, from a first point cloud and a second point cloud respectively, N feature surfaces corresponding to the same object; projecting the N feature surfaces corresponding to the second point cloud into the first point cloud, and adjusting the position of the projection in the first point cloud according to a preset candidate position set, where the projection in the first point cloud is the projection of the N feature surfaces corresponding to the second point cloud; and determining a target position of the projection and acquiring a target coordinate transformation matrix corresponding to the target position, where the target position is the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum. The scheme can improve the accuracy and speed of coordinate transformation between radars.

Description

Method and device for coordinate transformation between radars and computer-readable storage medium
Technical Field
The invention relates to the technical field of laser radars, in particular to a method and a device for coordinate transformation between radars and a computer readable storage medium.
Background
With the development of vehicle intelligence and unmanned-driving technology, installing a laser radar on a vehicle and using it to measure distances or detect surrounding obstacles has become a development trend for intelligent vehicles.
Because a single lidar has a blind spot, at least two radars are usually arranged on the same vehicle. The point clouds collected by at least two laser radars on the vehicle are fused, so that more comprehensive surrounding environment information can be obtained.
The installation positions of different laser radars on the same vehicle are different, and therefore, when point clouds of different laser radars are fused, a coordinate transformation relation among the different laser radars needs to be obtained.
In the prior art, the coordinate transformation relation among different laser radars cannot be accurately acquired.
Disclosure of Invention
The embodiment of the invention solves the technical problem that the coordinate transformation relation among different laser radars cannot be accurately acquired.
To solve the above technical problem, an embodiment of the present invention provides an inter-radar coordinate transformation method, including: selecting, from a first point cloud and a second point cloud respectively, N feature surfaces corresponding to the same object, where N ≥ 4, the first point cloud corresponds to a first radar, and the second point cloud corresponds to a second radar; projecting the N feature surfaces corresponding to the second point cloud into the first point cloud, and adjusting the position of the projection in the first point cloud according to a preset candidate position set, where the projection in the first point cloud is the projection of the N feature surfaces corresponding to the second point cloud; and determining a target position of the projection and acquiring a target coordinate transformation matrix corresponding to the target position, where the target position is the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum.
Optionally, the determining the target position of the projection includes: calculating, at each candidate position, the sum of the distances between the fixed points on each feature surface in the projection and the corresponding feature surface in the first point cloud; and determining the candidate position with the minimum distance sum as the target position.
Optionally, the calculating, at each candidate position, the sum of the distances between the fixed points on each feature surface in the projection and the corresponding feature surface in the first point cloud includes: calculating, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud, as the distance sum value of the ith feature surface, where K_i ≥ 2, 1 ≤ j ≤ M, 1 ≤ i ≤ N, and M is the total number of the candidate positions; and calculating the sum of the distance sum values of all feature surfaces in the projection at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
Optionally, the determining the target position of the projection includes: calculating, at each candidate position, the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection; and determining the candidate position with the minimum distance sum as the target position.
Optionally, the calculating, at each candidate position, the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection includes: calculating, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection, as the distance sum value of the ith feature surface; and calculating the sum of the distance sum values of all feature surfaces in the first point cloud at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
Optionally, the obtaining of the target coordinate transformation matrix corresponding to the target position includes: acquiring a target rotation matrix and a target translation matrix corresponding to the target position; and generating the target coordinate transformation matrix according to the target rotation matrix and the target translation matrix.
Optionally, the adjusting the position of the projection in the first point cloud according to a preset candidate position set includes: respectively adjusting an initial rotation matrix and an initial translation matrix according to the preset candidate position set, so that the adjusted rotation matrix and the adjusted translation matrix correspond to candidate positions in the candidate position set; the initial rotation matrix and the initial translation matrix are determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud.
Optionally, after obtaining the target coordinate transformation matrix corresponding to the target position, the method further includes: and fusing the first point cloud and the second point cloud according to the target coordinate transformation matrix.
In order to solve the above technical problem, an embodiment of the present invention further provides an inter-radar coordinate transformation apparatus, including: a selecting unit, configured to select, from a first point cloud and a second point cloud respectively, N feature surfaces corresponding to the same object, where N ≥ 4, the first point cloud corresponds to a first radar, and the second point cloud corresponds to a second radar; a projection unit, configured to project the N feature surfaces corresponding to the second point cloud into the first point cloud; an adjusting unit, configured to adjust the position of the projection in the first point cloud according to a preset candidate position set, where the projection in the first point cloud is the projection of the N feature surfaces corresponding to the second point cloud; a target position determining unit, configured to determine a target position of the projection, the target position being the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum; and a matrix acquisition unit, configured to acquire a target coordinate transformation matrix corresponding to the target position.
Optionally, the target position determining unit is configured to calculate a sum of distances between a fixed point on each feature plane in the projection and a corresponding feature plane in the first point cloud at each candidate position; and determining a candidate position corresponding to the minimum distance sum as the target position.
Optionally, the target position determining unit is configured to calculate, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud, as the distance sum value of the ith feature surface, where K_i ≥ 2, 1 ≤ j ≤ M, 1 ≤ i ≤ N, and M is the total number of the candidate positions; and to calculate the sum of the distance sum values of all feature surfaces in the projection at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
Optionally, the target position determining unit is configured to calculate a sum of distances between fixed points on all feature surfaces in the first point cloud and corresponding feature surfaces in the projection at each candidate position; and determining a candidate position corresponding to the minimum distance sum as the target position.
Optionally, the target position determining unit is configured to calculate, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection, as the distance sum value of the ith feature surface; and to calculate the sum of the distance sum values of all feature surfaces in the first point cloud at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
Optionally, the matrix obtaining unit is configured to obtain a target rotation matrix and a target translation matrix corresponding to the target position; and generating the target coordinate transformation matrix according to the target rotation matrix and the target translation matrix.
Optionally, the adjusting unit is configured to adjust the initial rotation matrix and the initial translation matrix according to the preset candidate position set, so that the adjusted rotation matrix and translation matrix correspond to candidate positions in the candidate position set; the initial rotation matrix and the initial translation matrix are determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud.
Optionally, the inter-radar coordinate transformation apparatus further includes: and the fusion unit is used for fusing the first point cloud and the second point cloud according to the target coordinate transformation matrix after the matrix acquisition unit acquires the target coordinate transformation matrix corresponding to the target position.
The embodiment of the present invention further provides a computer-readable storage medium, where the computer-readable storage medium is a non-volatile storage medium or a non-transitory storage medium and has computer instructions stored thereon, and when the computer instructions are executed, the steps of any one of the above inter-radar coordinate transformation methods are performed.
The embodiment of the present invention further provides another inter-radar coordinate transformation apparatus, which includes a memory and a processor, where the memory stores computer instructions executable on the processor, and the processor, when executing the computer instructions, performs the steps of any one of the above inter-radar coordinate transformation methods.
Compared with the prior art, the technical scheme of the embodiment of the invention has the following beneficial effects:
when the N feature surfaces corresponding to the second point cloud are projected into the first point cloud, the position of the projection in the first point cloud, namely the position at which the N feature surfaces corresponding to the second point cloud are projected into the first point cloud, is adjusted according to the candidate position set; the target position of the projection is determined as the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum; and the target coordinate transformation matrix corresponding to the target position is then determined, thereby realizing the transformation of coordinates between the two radars. Because the target coordinate transformation matrix corresponds to the target position, and when the projection is at the target position the distance error between each feature surface and the corresponding feature surface in the first point cloud is minimum, the coordinate transformation between the two radars can be realized accurately. Selecting, from the first point cloud and the second point cloud respectively, N mutually non-parallel feature surfaces corresponding to the same object can also improve the operation speed of inter-radar coordinate transformation.
Drawings
Fig. 1 is a flowchart of an inter-radar coordinate transformation method in an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an inter-radar coordinate transformation apparatus in an embodiment of the present invention.
Detailed Description
As described in the above background art, in the prior art, the coordinate transformation relationship between different laser radars cannot be accurately obtained, and thus, the accurate transformation of coordinates between different laser radars cannot be realized.
In the embodiment of the invention, the target coordinate transformation matrix corresponds to the target position, and when the projection is at the target position, the distance error between each feature surface and the corresponding feature surface in the first point cloud is minimum, so that the coordinate transformation between two radars can be accurately realized. N feature surfaces which are not parallel to each other and correspond to the same object are selected from the first point cloud and the second point cloud respectively, and the operation speed of coordinate transformation between radars can be improved.
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, embodiments accompanied with figures are described in detail below.
The inter-radar coordinate transformation method provided by the embodiment of the present invention may be executed by a program, and the program may run on hardware with program processing capability, such as a processor.
An embodiment of the present invention provides a method for transforming coordinates between radars, which is described in detail below with reference to fig. 1 through specific steps.
Step S101, N feature surfaces corresponding to the same object are respectively selected from the first point cloud and the second point cloud.
In a specific implementation, at least two laser radars may be provided on a vehicle, since a single radar has a blind spot. For ease of distinction, in the embodiment of the present invention the different laser radars are referred to as the first radar and the second radar. The first radar and the second radar may be disposed at different locations on the vehicle; for example, the first radar on the roof and the second radar on the front face of the vehicle. In practical applications, the first radar and the second radar may be installed at other positions, set according to different application requirements or user preferences, which is not described in detail herein.
In the embodiment of the invention, the radar can be laser radar. The structure of the laser radar and its working principle can be referred to the laser radar provided in the prior art.
In implementations, the first point cloud may be a point cloud acquired by a first radar and the second point cloud may be a point cloud acquired by a second radar. In practical applications, one of the first radar and the second radar may be responsible for primary point cloud acquisition, and the other one may be responsible for secondary point cloud acquisition, where the primary and secondary are relative concepts, and the skilled person may define the concept according to practical applications. Therefore, one of the first radar and the second radar may be referred to as a primary radar and the other may be referred to as a secondary radar, depending on the roles of the first radar and the second radar.
In the embodiment of the invention, the first radar is a main radar, and the second radar is an auxiliary radar; or the first radar is a secondary radar and the second radar is a primary radar. When the first radar is a main radar and the second radar is an auxiliary radar, the inter-radar coordinate transformation method provided in the embodiment of the present invention transforms the point cloud acquired by the auxiliary radar to the point cloud acquired by the main radar, in other words, calibrates the point cloud acquired by the auxiliary radar to be consistent with the coordinate system of the main radar.
Therefore, the first radar can be selected as the primary radar and the second radar can be selected as the secondary radar, or the second radar can be selected as the primary radar and the first radar can be selected as the secondary radar according to specific application requirements.
For example, if the point cloud collected by the first radar needs to be transformed into the coordinate system of the second radar, the second radar is selected as the primary radar and the first radar as the secondary radar. If the point cloud collected by the second radar needs to be transformed into the coordinate system of the first radar, the first radar is selected as the primary radar and the second radar as the secondary radar.
In specific implementation, N feature surfaces corresponding to the same object can be selected from the first point cloud and the second point cloud respectively, where N is an integer and N is greater than or equal to 4. That is, at least 4 feature surfaces corresponding to the same object are selected from the first point cloud, and at least 4 feature surfaces corresponding to the same object are selected from the second point cloud, and the feature surfaces selected from the first point cloud and the feature surfaces selected from the second point cloud have a one-to-one correspondence relationship.
For example, for the same object, 4 feature surfaces are selected from the first point cloud as feature surface 1, feature surface 2, feature surface 3, and feature surface 4, and 4 feature surfaces are selected from the second point cloud as feature surface 1 ', feature surface 2', feature surface 3 ', and feature surface 4', where: feature plane 1 corresponds to feature plane 1 'and both correspond to the same part of the same object, feature plane 2 corresponds to feature plane 2' and both correspond to the same part of the same object, feature plane 3 corresponds to feature plane 3 'and both correspond to the same part of the same object, and feature plane 4 corresponds to feature plane 4' and both correspond to the same part of the same object.
In a specific implementation, N feature surfaces corresponding to the same object may be selected from the first point cloud and the second point cloud respectively according to a preset rule. In the embodiment of the present invention, the preset rule may be: the feature surfaces characterize the same object, and no two of them are parallel to each other. That is, any two of the N feature surfaces intersect each other.
In practical application, the preset rule may further include other rules as long as N feature surfaces corresponding to the same object can be selected from the first point cloud and the second point cloud respectively, and the N feature surfaces are not parallel to each other.
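As a rough illustration of this rule, the sketch below checks that no two of the selected feature surfaces are (near-)parallel by comparing their unit normals; the 5-degree tolerance and the function name are assumptions of this example, not part of the patent.

```python
import numpy as np

def surfaces_pairwise_nonparallel(normals, angle_tol_deg=5.0):
    """Return True if no two feature-surface normals are (anti-)parallel.

    normals: (N, 3) array with one normal vector per selected feature surface.
    Two surfaces are treated as parallel when the angle between their
    normals (or its supplement) is below angle_tol_deg.
    """
    n = np.asarray(normals, dtype=float)
    n = n / np.linalg.norm(n, axis=1, keepdims=True)
    cos_tol = np.cos(np.deg2rad(angle_tol_deg))
    for i in range(len(n)):
        for j in range(i + 1, len(n)):
            if abs(float(n[i] @ n[j])) > cos_tol:  # |cos| near 1 means parallel
                return False
    return True
```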
In a specific implementation, an operator may also select N feature surfaces corresponding to the same object from the first point cloud and the second point cloud respectively.
For example, the operator selects a feature plane 1, a feature plane 2, a feature plane 3, and a feature plane 4 in the first point cloud; in the second point cloud, feature plane 1 ', feature plane 2', feature plane 3 ', and feature plane 4' are selected. The feature plane 1 selected by the operator corresponds to the feature plane 1 ', the feature plane 2 corresponds to the feature plane 2', the feature plane 3 corresponds to the feature plane 3 ', and the feature plane 4 corresponds to the feature plane 4'.
Step S102, projecting N feature surfaces corresponding to the second point cloud into the first point cloud, and adjusting the projected position in the first point cloud according to a preset candidate position set.
In a specific implementation, after the N feature surfaces are selected from the second point cloud, they may be projected into the first point cloud. Projecting the selected N feature surfaces of the second point cloud into the first point cloud essentially means projecting the points on those N feature surfaces into the first point cloud.
After the N feature surfaces corresponding to the second point cloud are projected into the first point cloud, the projection in the first point cloud may be adjusted. In the embodiment of the present invention, the projection in the first point cloud is the projection of the N feature surfaces corresponding to the second point cloud. Adjusting the projection in the first point cloud therefore essentially means adjusting the positions at which the N feature surfaces corresponding to the second point cloud are projected in the first point cloud.
In a specific implementation, there may be M candidate positions in the projection of N feature surfaces corresponding to the second point cloud in the first point cloud, and the M candidate positions constitute a candidate position set. When the value of M is large, the accuracy of the finally obtained target transformation matrix is high, and correspondingly, the required calculation complexity is large; when the value of M is small, the accuracy of the finally obtained target transformation matrix is low, and accordingly the required computational complexity is small. Therefore, when the value of M is selected, the accuracy and the calculation complexity can be comprehensively considered, and the balance of the accuracy and the calculation complexity is realized.
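To make the trade-off concrete, here is a minimal sketch of one way a candidate position set could be enumerated: a uniform grid of roll/pitch/yaw and translation offsets around an initial estimate. The function name, step sizes, and grid half-range are assumptions for illustration; the patent only requires that the set contains M candidate positions.

```python
import itertools

def build_candidate_set(rpy0, t0, angle_step=0.2, trans_step=0.02, half_range=2):
    """Enumerate candidate poses on a grid centered on the initial estimate.

    rpy0: initial (roll, pitch, yaw) angles in degrees.
    t0:   initial translation (tx, ty, tz) in meters.
    Returns a list of (roll, pitch, yaw, tx, ty, tz) tuples; the list has
    M = (2 * half_range + 1) ** 6 entries, so finer steps or a wider range
    raise accuracy at the cost of computational complexity.
    """
    offsets = range(-half_range, half_range + 1)
    candidates = []
    for da, db, dc, dx, dy, dz in itertools.product(offsets, repeat=6):
        candidates.append((rpy0[0] + da * angle_step,
                           rpy0[1] + db * angle_step,
                           rpy0[2] + dc * angle_step,
                           t0[0] + dx * trans_step,
                           t0[1] + dy * trans_step,
                           t0[2] + dz * trans_step))
    return candidates
```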
In a specific implementation, when the position of the projection in the first point cloud is adjusted, the initial coordinate transformation matrix corresponding to the projection in the first point cloud may be adjusted according to the candidate position in the candidate position set. Each candidate position in the candidate position set corresponds to a coordinate transformation matrix, and the coordinate transformation matrices corresponding to different candidate positions are different.
In the embodiment of the present invention, adjusting the position of the projection in the first point cloud may be regarded as adjusting an initial coordinate transformation matrix corresponding to the projection.
When the position projected in the first point cloud is adjusted, for each candidate position in the candidate position set, a coordinate transformation matrix corresponding to the position may be obtained, and the initial coordinate transformation matrix is adjusted, so that the adjusted initial coordinate transformation matrix is the same as the coordinate transformation matrix corresponding to the position.
For example, the ith candidate position in the candidate position set corresponds to coordinate transformation matrix i. When the position of the projection in the first point cloud is adjusted to the ith candidate position, the initial coordinate transformation matrix may be transformed into coordinate transformation matrix i.
When the position of the projection in the first point cloud is adjusted, the coordinate transformation matrix after the previous adjustment can be adjusted. For example, when the position projected in the first point cloud is adjusted to the ith candidate position, the coordinate transformation matrix corresponding to the (i-1) th candidate position is adjusted to the coordinate transformation matrix corresponding to the ith candidate position.
It is to be understood that the adjustment of the initial coordinate transformation matrix alone or the adjustment of the coordinate transformation matrix after the last adjustment is essentially performed on the basis of the initial coordinate transformation matrix. The results obtained for both of the above matrix adjustments are substantially the same.
In a specific implementation, the coordinate transformation matrix may be represented by:
$$T = \begin{bmatrix} R & t \\ 0 & 1 \end{bmatrix}$$
where T is the coordinate transformation matrix, R is the rotation matrix, and t is the translation matrix.
If the coordinate transformation matrix is a 4 × 4 matrix, the rotation matrix is a 3 × 3 matrix and the translation matrix is a 3 × 1 column vector.
As known from the coordinate transformation matrix, the coordinate transformation matrix may be composed of a rotation matrix and a translation matrix. Accordingly, the initial coordinate transformation matrix includes an initial rotation matrix and an initial translation matrix. Therefore, the initial coordinate transformation matrix may be adjusted by an initial rotation matrix and an initial translation matrix.
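A minimal sketch of assembling the 4 × 4 coordinate transformation matrix from a 3 × 3 rotation matrix and a 3 × 1 translation vector, following the block form above:

```python
import numpy as np

def make_transform(R, t):
    """Assemble T = [[R, t], [0, 1]] from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R           # rotation block
    T[:3, 3] = np.ravel(t)  # translation as the last column
    return T
```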
Since the initial rotation matrix is usually a 3 × 3 matrix, directly operating on it requires a large amount of calculation. To reduce the calculation amount in the transformation process, the rotation matrix may be subjected to an angle transformation, after which it is represented as a 3 × 1 column vector. Operating on the angle-transformed representation therefore requires less calculation than operating on the initial rotation matrix itself.
In a specific implementation, the initial rotation matrix may be angle-converted by a roll, pitch, yaw angle conversion method (attitude angle conversion method).
It is understood that other angle transformation methods may also be adopted to perform angle transformation on the initial rotation matrix, thereby implementing the dimension reduction processing on the initial rotation matrix.
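The following sketch shows one such attitude-angle conversion, assuming the common ZYX (yaw, then pitch, then roll) convention; the patent does not fix a particular convention, so this choice is an assumption of the example:

```python
import numpy as np

def rotation_to_rpy(R):
    """Reduce a 3x3 rotation matrix to (roll, pitch, yaw), ZYX convention, radians."""
    pitch = -np.arcsin(np.clip(R[2, 0], -1.0, 1.0))
    roll = np.arctan2(R[2, 1], R[2, 2])
    yaw = np.arctan2(R[1, 0], R[0, 0])
    return np.array([roll, pitch, yaw])

def rpy_to_rotation(rpy):
    """Rebuild the rotation matrix from (roll, pitch, yaw), ZYX convention, radians."""
    r, p, y = rpy
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx  # R = Rz(yaw) @ Ry(pitch) @ Rx(roll)
```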
In a specific implementation, when the initial translation matrix corresponding to the projection is adjusted, the position represented by the initial translation matrix may be converted, and the initial translation matrix after the position conversion may be adjusted.
In a specific implementation, the initial rotation matrix and the initial translation matrix may be determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud. In the embodiment of the present invention, the initial coordinate transformation matrix may be calculated by the Iterative Closest Point (ICP) algorithm from the N feature surfaces selected from the first point cloud and the N feature surfaces selected from the second point cloud, thereby obtaining the initial rotation matrix and the initial translation matrix.
It is understood that, in practical applications, other calculation methods may also be used to determine the initial coordinate transformation matrix, and the specific method for obtaining the initial coordinate transformation matrix does not affect the protection scope of the present invention.
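For reference, the closed-form step at the core of each ICP iteration can be sketched as follows: given matched points sampled from corresponding feature surfaces, it recovers a rigid rotation and translation by SVD (the Kabsch solution). This is an illustrative sketch of the standard technique, not the patent's prescribed implementation, and it assumes the point correspondences are already known.

```python
import numpy as np

def rigid_align(src, dst):
    """Closed-form rigid alignment (the SVD step inside ICP).

    src, dst: (K, 3) matched points sampled from corresponding feature
    surfaces of the second and first point clouds. Returns (R, t) such
    that dst ~ R @ src + t, usable as the initial rotation/translation.
    """
    src = np.asarray(src, dtype=float)
    dst = np.asarray(dst, dtype=float)
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:  # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```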
In a specific implementation, when the position of the projection in the first point cloud is adjusted, since the projection is the projection of the N feature surfaces of the second point cloud in the first point cloud, the positions of all N projected feature surfaces in the first point cloud are adjusted.
For example, when the position of the projection in the first point cloud is adjusted, the positions of the 4 feature surfaces projected in the first point cloud are all adjusted.
In a specific implementation, when calculating the distance between a fixed point and a feature surface, the plane equation of that surface may be fitted first, and the point-to-plane distance then calculated. A Singular Value Decomposition (SVD) algorithm may be employed to fit the plane equation; it will be appreciated that other fitting methods may also be used.
For example, when calculating the distance sum value between the fixed points on feature surface 1″ in the projection and feature surface 1 in the first point cloud, the SVD algorithm is used to fit the plane equation corresponding to feature surface 1, and the distances from the fixed points on feature surface 1″ to feature surface 1 are then calculated. The specific algorithm for calculating the distance between a point and a plane is common knowledge and may refer to the prior art; it is not described in detail herein.
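A minimal sketch of both operations: fitting a plane to points by SVD and then evaluating the point-to-plane distance. Representing the plane as a unit normal n and offset d with n·x + d = 0 is a convention assumed by this example.

```python
import numpy as np

def fit_plane_svd(points):
    """Fit a plane to (K, 3) points; return (unit normal n, offset d), n·x + d = 0."""
    pts = np.asarray(points, dtype=float)
    centroid = pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the normal.
    _, _, Vt = np.linalg.svd(pts - centroid)
    n = Vt[-1]
    d = -float(n @ centroid)
    return n, d

def point_plane_distance(p, n, d):
    """Unsigned distance from point p to the plane n·x + d = 0 (n is unit-length)."""
    return abs(float(n @ p) + d)
```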
Step S103, determining the projected target position, and acquiring a target coordinate transformation matrix corresponding to the target position.
In a specific implementation, the target position may be the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum. In this embodiment of the present invention, the minimum distance error may mean: the sum of the distances between the fixed points on each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum.
In the embodiment of the present invention, the corresponding feature plane is a feature plane having a correspondence relationship. For example, feature plane 1 corresponds to feature plane 1 ', the projection of feature plane 1' in the first point cloud is feature plane 1 ", and then the feature plane in the first point cloud corresponding to feature plane 1" is feature plane 1.
In a specific implementation, when the target position is determined, the sum of the distances between the fixed points on each feature surface in the projection and the corresponding feature surface in the first point cloud may be calculated at each candidate position; after the distance sum corresponding to each candidate position is obtained, the minimum value is selected from the distance sums, and the candidate position corresponding to the minimum value is taken as the target position.
That is, when the projection is at the target position, the sum of the distances between the fixed point on each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum, that is, the distance error between the projection of each feature surface in the second point cloud in the first point cloud and the corresponding feature surface in the first point cloud is minimum.
In a specific implementation, when the projection is at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud may be calculated as the distance sum value of the ith feature surface in the projection; K_i ≥ 2; 1 ≤ j ≤ M, 1 ≤ i ≤ N, where M is the total number of the candidate positions.
By analogy, the distance sum value corresponding to each feature surface in the projection at the jth candidate position can be calculated; the sum of these distance sum values is the sum of the distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
In a specific implementation, at the jth candidate position, K_i fixed points are selected on the ith feature surface in the projection. The number of selected fixed points may be different, the same, or partially the same for different feature surfaces.
For example, at the jth candidate position, the number of feature surfaces in the projection is 4, that is, 4 feature surfaces of the second point cloud are projected in the first point cloud. For the 1st feature surface in the projection, the number of selected fixed points is 3; for the 2nd feature surface, 2; for the 3rd feature surface, 4; and for the 4th feature surface, 3.
For another example, at the jth candidate position, the number of feature planes in the projection is 4, and the number of fixed points on each feature plane in the projection is 3.
The following illustrates calculating the distance sum value between the fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud.
Suppose that at the 1st candidate position, the 1st feature surface in the projection is feature surface 1″, which corresponds to feature surface 1 in the first point cloud. The 3 fixed points on feature surface 1″ are point a, point b, and point c. The distances from point a, point b, and point c to feature surface 1 are calculated respectively, and the three distances are summed to obtain the distance sum value between the 1st feature surface in the projection and the corresponding feature surface in the first point cloud at the 1st candidate position.
By analogy, the distance sum value corresponding to each feature surface in the projection at the 1st candidate position can be calculated. Adding the distance sum values of all feature surfaces in the projection gives the distance sum corresponding to the 1st candidate position. The same calculation process yields the distance sums corresponding to the other candidate positions.
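Putting the above together, the target position can be found by evaluating the distance sum at every candidate position and keeping the minimum. The sketch below assumes the conventions of the earlier examples: planes as (normal, offset) pairs fitted to the first point cloud, and each candidate given as an (R, t) pair, for instance obtained from a (roll, pitch, yaw, tx, ty, tz) grid entry via rpy_to_rotation (with degrees converted to radians first).

```python
import numpy as np

def select_target_position(planes_first, fixed_points_second, candidates):
    """Pick the candidate pose whose projection has the smallest distance sum.

    planes_first:        list of N (normal, offset) pairs fitted to the feature
                         surfaces of the first point cloud.
    fixed_points_second: list of N (K_i, 3) arrays of fixed points on the
                         corresponding feature surfaces of the second point cloud.
    candidates:          list of (R, t) poses to try.
    Returns the (R, t) with the minimum total point-to-plane distance.
    """
    best, best_sum = None, np.inf
    for R, t in candidates:
        total = 0.0
        for (n, d), pts in zip(planes_first, fixed_points_second):
            # Project the fixed points into the first point cloud at this pose.
            projected = pts @ R.T + t
            # Distance sum value of this feature surface (unsigned distances).
            total += float(np.sum(np.abs(projected @ n + d)))
        if total < best_sum:
            best_sum, best = total, (R, t)
    return best
```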
In the embodiment of the present invention, the minimum distance error between each projected feature surface and the corresponding feature surface may alternatively be defined as: the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection is minimum.
In a specific implementation, when the projection is at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection may be calculated as the distance sum value of the ith feature surface; K_i ≥ 2; 1 ≤ j ≤ M, 1 ≤ i ≤ N, where M is the total number of the candidate positions.
By analogy, the distance sum value corresponding to each feature surface in the first point cloud at the jth candidate position can be calculated. Adding the distance sum values of all feature surfaces in the first point cloud gives the distance sum corresponding to the jth candidate position, namely the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
The following illustrates calculating the distance sum value between the fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection.
Suppose that at the 1st candidate position, the 1st feature surface in the first point cloud is feature surface 1, and the corresponding feature surface in the projection is feature surface 1″. Three fixed points, point d, point e, and point f, are selected on feature surface 1. The distances from point d, point e, and point f to feature surface 1″ are calculated respectively, and the three distances are summed to obtain the distance sum value between the 1st feature surface in the first point cloud and the corresponding feature surface in the projection at the 1st candidate position.
By analogy, the distance sum value corresponding to each feature surface in the first point cloud at the 1st candidate position can be calculated. Adding the distance sum values of all feature surfaces in the first point cloud gives the distance sum corresponding to the 1st candidate position. The same calculation process yields the distance sums corresponding to the other candidate positions.
In the embodiment of the invention, after the target position is determined, the target rotation matrix and the target translation matrix corresponding to the target position can be obtained. And generating a corresponding target coordinate transformation matrix according to the target rotation matrix and the target translation matrix.
In a specific implementation, let the target rotation matrix be $R_{target}$ and the target translation matrix be $t_{target}$; the target coordinate transformation matrix is then:
$$T_{target} = \begin{bmatrix} R_{target} & t_{target} \\ 0 & 1 \end{bmatrix}$$
therefore, in the embodiment of the invention, the target coordinate transformation matrix corresponds to the target position, and when the projection is at the target position, the distance error between each feature plane and the corresponding feature plane in the first point cloud is minimum, so that the coordinate transformation between two radars can be accurately realized.
In addition, when executing the inter-radar coordinate transformation method provided by the embodiment of the present invention, it is only necessary to select, from the first point cloud and the second point cloud respectively and according to the preset rule, N mutually non-parallel feature surfaces corresponding to the same object, so the operation speed of inter-radar coordinate transformation can be improved.
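For completeness, once the target coordinate transformation matrix T_target is available, fusing the two point clouds reduces to applying it to the second point cloud and concatenating, as in this sketch (the array layout, (K, 3) points per cloud, is an assumption):

```python
import numpy as np

def fuse_point_clouds(first_cloud, second_cloud, T_target):
    """Transform the second point cloud with T_target and concatenate it with the first."""
    pts = np.asarray(second_cloud, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coordinates
    transformed = (homog @ T_target.T)[:, :3]         # back to 3-D coordinates
    return np.vstack([np.asarray(first_cloud, dtype=float), transformed])
```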
Referring to fig. 2, there is provided an inter-radar coordinate transformation apparatus 20 according to an embodiment of the present invention, including: a selecting unit 201, a projecting unit 202, an adjusting unit 203, a target position determining unit 204, and a matrix obtaining unit 205, wherein:
a selecting unit 201, configured to select N feature surfaces corresponding to the same object from the first point cloud and the second point cloud respectively; n is more than or equal to 4; the first point cloud is corresponding to a first radar, and the second point cloud is corresponding to a second radar;
a projection unit 202, configured to project N feature surfaces corresponding to the second point cloud into the first point cloud;
an adjusting unit 203, configured to adjust a position of a projection in the first point cloud according to a preset candidate position set; the projection in the first point cloud is: the projection of N characteristic surfaces corresponding to the second point cloud in the first point cloud;
a target position determination unit 204, configured to determine a target position of the projection, the target position being the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum;
a matrix obtaining unit 205, configured to obtain a target coordinate transformation matrix corresponding to the target position.
In a specific implementation, the target position determining unit 204 may be configured to calculate a sum of distances between a fixed point on each feature plane in the projection and a corresponding feature plane in the first point cloud at each candidate position; and determining a candidate position corresponding to the minimum distance sum as the target position.
In a specific implementation, the target position determining unit 204 may be configured to calculate, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud, as the distance sum value of the ith feature surface, where K_i ≥ 2, 1 ≤ j ≤ M, 1 ≤ i ≤ N, and M is the total number of the candidate positions; and to calculate the sum of the distance sum values of all feature surfaces in the projection at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
In a specific implementation, the target position determining unit 204 may be configured to calculate a sum of distances between fixed points on all feature surfaces in the first point cloud and corresponding feature surfaces in the projection at each candidate position; and determining a candidate position corresponding to the minimum distance sum as the target position.
In a specific implementation, the target position determining unit 204 may be configured to calculate, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection, as the distance sum value of the ith feature surface; and to calculate the sum of the distance sum values of all feature surfaces in the first point cloud at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
In a specific implementation, the matrix obtaining unit 205 may be configured to obtain a target rotation matrix and a target translation matrix corresponding to the target position; and generating the target coordinate transformation matrix according to the target rotation matrix and the target translation matrix.
In a specific implementation, the adjusting unit 203 may be configured to adjust the initial rotation matrix and the initial translation matrix according to the preset candidate position set, so that the adjusted rotation matrix and the adjusted translation matrix correspond to candidate positions in the candidate position set; the initial rotation matrix and the initial translation matrix are determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud.
In a specific implementation, the inter-radar coordinate transformation apparatus 20 may further include: a fusion unit 206, configured to fuse the first point cloud and the second point cloud according to the target coordinate transformation matrix after the matrix obtaining unit 205 obtains the target coordinate transformation matrix corresponding to the target position.
An embodiment of the present invention provides a computer-readable storage medium, which is a non-volatile storage medium or a non-transitory storage medium and has computer instructions stored thereon, wherein the computer instructions, when executed, perform the steps of the inter-radar coordinate transformation method provided in the above embodiment of the present invention.
Another coordinate transformation device between radars according to an embodiment of the present invention includes a memory and a processor, where the memory stores computer instructions executable on the processor, and the processor executes the computer instructions to perform the steps of the coordinate transformation method between radars according to the above embodiment of the present invention.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by instructing the relevant hardware through a program, which may be stored in a computer-readable storage medium, and the storage medium may include: ROM, RAM, magnetic or optical disks, and the like.
Although the present invention is disclosed above, the present invention is not limited thereto. Various changes and modifications may be effected therein by one skilled in the art without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (16)

1. An inter-radar coordinate transformation method, comprising:
selecting, from a first point cloud and a second point cloud respectively, N feature surfaces corresponding to the same object; N ≥ 4; the first point cloud corresponds to a first radar, and the N feature surfaces in the first point cloud are not parallel to each other; the second point cloud corresponds to a second radar, and the N feature surfaces in the second point cloud are not parallel to each other;
projecting N feature surfaces corresponding to the second point cloud into the first point cloud, and adjusting the projected position in the first point cloud according to a preset candidate position set; the projection in the first point cloud is: the projection of N characteristic surfaces corresponding to the second point cloud in the first point cloud;
determining a target position of the projection, and acquiring a target rotation matrix and a target translation matrix corresponding to the target position; and generating a target coordinate transformation matrix corresponding to the target position according to the target rotation matrix and the target translation matrix; the target position being the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum.
2. The inter-radar coordinate transformation method of claim 1, wherein the determining the projected target location comprises:
calculating, at each candidate position, the sum of the distances between the fixed points on each feature surface in the projection and the corresponding feature surface in the first point cloud;
and determining a candidate position corresponding to the minimum distance sum as the target position.
3. The method of coordinate transformation between radars according to claim 2, wherein calculating the sum of distances of the fixed point on each feature plane in the projection to the corresponding feature plane in the first point cloud at each candidate position comprises:
calculating, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the projection and the corresponding feature surface in the first point cloud, as the distance sum value of the ith feature surface; K_i ≥ 2; 1 ≤ j ≤ M, 1 ≤ i ≤ N, and M is the total number of the candidate positions;
and calculating the sum of the distance sum values of all feature surfaces in the projection at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
4. The inter-radar coordinate transformation method of claim 1, wherein the determining the projected target location comprises:
calculating the distance sum of fixed points on all feature surfaces in the first point cloud and the corresponding feature surface in the projection at each candidate position;
and determining a candidate position corresponding to the minimum distance sum as the target position.
5. The method of coordinate transformation between radars according to claim 4, wherein said calculating the sum of distances of the fixed points on all feature surfaces in the first point cloud to the corresponding feature surface in the projection at each candidate location comprises:
calculating, at the jth candidate position, the sum of the distances between the K_i fixed points on the ith feature surface in the first point cloud and the corresponding feature surface in the projection, as the distance sum value of the ith feature surface;
and calculating the sum of the distance sum values of all feature surfaces in the first point cloud at the jth candidate position, as the sum of the distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
6. The method of coordinate transformation between radars according to claim 1, wherein the adjusting the position of the projection in the first point cloud according to a preset set of candidate positions comprises:
respectively adjusting an initial rotation matrix and an initial translation matrix according to the preset candidate position set, so that the adjusted rotation matrix and the adjusted translation matrix correspond to candidate positions in the candidate position set; the initial rotation matrix and the initial translation matrix are determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud.
7. The inter-radar coordinate transformation method according to any one of claims 1 to 6, further comprising, after obtaining a target coordinate transformation matrix corresponding to the target position:
and fusing the first point cloud and the second point cloud according to the target coordinate transformation matrix.
8. An inter-radar coordinate transformation apparatus, comprising:
a selecting unit, configured to select, from a first point cloud and a second point cloud respectively, N feature surfaces corresponding to the same object; N ≥ 4; the first point cloud corresponds to a first radar, and the second point cloud corresponds to a second radar;
the projection unit is used for projecting the N characteristic surfaces corresponding to the second point cloud into the first point cloud;
the adjusting unit is used for adjusting the position of the projection in the first point cloud according to a preset candidate position set; the projection in the first point cloud is: the projection of N characteristic surfaces corresponding to the second point cloud in the first point cloud;
a target position determination unit for determining a target position of the projection, the target position being the candidate position at which the distance error between each feature surface in the projection and the corresponding feature surface in the first point cloud is minimum;
a matrix obtaining unit, configured to obtain a target coordinate transformation matrix corresponding to the target position, including: acquiring a target rotation matrix and a target translation matrix corresponding to the target position; and generating the target coordinate transformation matrix according to the target rotation matrix and the target translation matrix.
9. The inter-radar coordinate transformation apparatus according to claim 8, wherein the target position determination unit is configured to: calculate, at each candidate position, the sum of distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud; and determine the candidate position corresponding to the minimum distance sum as the target position.
10. The inter-radar coordinate transformation apparatus of claim 9, wherein the target position determination unit is configured to: calculate, at the j-th candidate position, the sum of the distances between the Ki fixed points on the i-th feature surface in the projection and the corresponding feature surface in the first point cloud, as the distance sum value of the i-th feature surface, where Ki ≥ 2, 1 ≤ j ≤ M, 1 ≤ i ≤ N, and M is the total number of candidate positions; and calculate the sum of the distance sum values corresponding to all feature surfaces in the projection at the j-th candidate position, as the sum of distances between the fixed points on all feature surfaces in the projection and the corresponding feature surfaces in the first point cloud.
11. The inter-radar coordinate transformation apparatus of claim 8, wherein the target position determination unit is configured to: calculate, at each candidate position, the sum of distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection; and determine the candidate position corresponding to the minimum distance sum as the target position.
12. The inter-radar coordinate transformation apparatus of claim 11, wherein the target position determination unit is configured to: calculate, at the j-th candidate position, the sum of the distances between the Ki fixed points on the i-th feature surface in the first point cloud and the corresponding feature surface in the projection, as the distance sum value of the i-th feature surface; and calculate the sum of the distance sum values corresponding to all feature surfaces in the first point cloud at the j-th candidate position, as the sum of distances between the fixed points on all feature surfaces in the first point cloud and the corresponding feature surfaces in the projection.
13. The inter-radar coordinate transformation apparatus according to claim 8, wherein the adjusting unit is configured to respectively adjust an initial rotation matrix and an initial translation matrix according to the preset candidate position set, so that each adjusted rotation matrix and translation matrix pair corresponds to a candidate position in the candidate position set; the initial rotation matrix and the initial translation matrix are determined from the N feature surfaces in the first point cloud and the N feature surfaces in the second point cloud.
14. The inter-radar coordinate transformation apparatus according to any one of claims 8 to 13, further comprising: a fusion unit, configured to fuse the first point cloud and the second point cloud according to the target coordinate transformation matrix after the matrix obtaining unit obtains the target coordinate transformation matrix corresponding to the target position.
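To see how the claimed units interact, the self-contained toy run below builds N = 4 synthetic feature planes, expresses them in a second frame under a known transform, scores a handful of candidate positions with the plane distance sum, and recovers the target position as the minimizer. All data, names, and grid values are fabricated for illustration and stand in for the preset candidate position set.

```python
# End-to-end toy run of the select/project/adjust/score pipeline
# (claims 8-14); all data and candidate values are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(0)

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

# N = 4 feature planes (n·x + d = 0) in the first point cloud.
normals = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1], [1, 1, 0]], float)
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
offsets = np.array([-1.0, -2.0, 0.5, -3.0])

# Ki = 3 fixed points per plane, made by projecting random samples onto it.
fixed_points = []
for n, d in zip(normals, offsets):
    x = rng.normal(size=(3, 3))
    fixed_points.append(x - np.outer(x @ n + d, n))

# Ground-truth transform from the second radar's frame to the first's.
R_true, t_true = rot_z(0.02), np.array([0.10, -0.05, 0.02])

# The same planes expressed in the second radar's frame:
# n2 = R_trueᵀ·n, d2 = d + n·t_true.
n2 = normals @ R_true
d2 = offsets + normals @ t_true

def score(Rj, tj):
    """Distance sum between the fixed points and the projected planes."""
    total = 0.0
    for pts, n, d in zip(fixed_points, n2, d2):
        n1 = Rj @ n                        # plane normal back in frame 1
        total += np.abs(pts @ n1 + (d - n1 @ tj)).sum()
    return total

# A small candidate set around the truth (stand-in for the preset grid).
candidates = [(rot_z(0.02 + a), t_true + dt)
              for a in (-0.01, 0.0, 0.01)
              for dt in (np.zeros(3), np.array([0.05, 0.0, 0.0]))]

R_best, t_best = min(candidates, key=lambda c: score(*c))
print(np.allclose(R_best, R_true), np.allclose(t_best, t_true))  # True True
```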
15. A computer-readable storage medium, being a non-volatile or non-transitory storage medium having computer instructions stored thereon, wherein the computer instructions, when executed by a processor, perform the steps of the method of coordinate transformation between radars according to any one of claims 1 to 7.
16. An inter-radar coordinate transformation apparatus, comprising a memory and a processor, the memory having stored thereon computer instructions executable on the processor, wherein the processor, when executing the computer instructions, performs the steps of the method of coordinate transformation between radars according to any one of claims 1 to 7.
CN201910941969.1A 2019-09-30 2019-09-30 Method and device for coordinate transformation between radars and computer-readable storage medium Active CN112578396B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910941969.1A CN112578396B (en) 2019-09-30 2019-09-30 Method and device for coordinate transformation between radars and computer-readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910941969.1A CN112578396B (en) 2019-09-30 2019-09-30 Method and device for coordinate transformation between radars and computer-readable storage medium

Publications (2)

Publication Number Publication Date
CN112578396A CN112578396A (en) 2021-03-30
CN112578396B true CN112578396B (en) 2022-04-19

Family ID=75116260

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910941969.1A Active CN112578396B (en) 2019-09-30 2019-09-30 Method and device for coordinate transformation between radars and computer-readable storage medium

Country Status (1)

Country Link
CN (1) CN112578396B (en)

Families Citing this family (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11609336B1 (en) 2018-08-21 2023-03-21 Innovusion, Inc. Refraction compensation for use in LiDAR systems
WO2018182812A2 (en) 2016-12-30 2018-10-04 Innovusion Ireland Limited Multiwavelength lidar design
US10969475B2 (en) 2017-01-05 2021-04-06 Innovusion Ireland Limited Method and system for encoding and decoding LiDAR
US11009605B2 (en) 2017-01-05 2021-05-18 Innovusion Ireland Limited MEMS beam steering and fisheye receiving lens for LiDAR system
US11054508B2 (en) 2017-01-05 2021-07-06 Innovusion Ireland Limited High resolution LiDAR using high frequency pulse firing
US11675050B2 (en) 2018-01-09 2023-06-13 Innovusion, Inc. LiDAR detection systems and methods
WO2019139895A1 (en) 2018-01-09 2019-07-18 Innovusion Ireland Limited Lidar detection systems and methods that use multi-plane mirrors
US11927696B2 (en) 2018-02-21 2024-03-12 Innovusion, Inc. LiDAR systems with fiber optic coupling
WO2019165294A1 (en) 2018-02-23 2019-08-29 Innovusion Ireland Limited 2-dimensional steering system for lidar systems
US11422234B2 (en) 2018-02-23 2022-08-23 Innovusion, Inc. Distributed lidar systems
WO2019199775A1 (en) 2018-04-09 2019-10-17 Innovusion Ireland Limited Lidar systems and methods for exercising precise control of a fiber laser
US11579300B1 (en) 2018-08-21 2023-02-14 Innovusion, Inc. Dual lens receive path for LiDAR system
US11614526B1 (en) 2018-08-24 2023-03-28 Innovusion, Inc. Virtual windows for LIDAR safety systems and methods
US11579258B1 (en) 2018-08-30 2023-02-14 Innovusion, Inc. Solid state pulse steering in lidar systems
CN118915020A (en) 2018-11-14 2024-11-08 图达通智能美国有限公司 LIDAR system and method using a polygon mirror
US11675055B2 (en) 2019-01-10 2023-06-13 Innovusion, Inc. LiDAR systems and methods with beam steering and wide angle signal detection
US11486970B1 (en) 2019-02-11 2022-11-01 Innovusion, Inc. Multiple beam generation from a single source beam for use with a LiDAR system
US11977185B1 (en) 2019-04-04 2024-05-07 Seyond, Inc. Variable angle polygon for use with a LiDAR system
US12061289B2 (en) 2021-02-16 2024-08-13 Innovusion, Inc. Attaching a glass mirror to a rotating metal motor frame
US11422267B1 (en) 2021-02-18 2022-08-23 Innovusion, Inc. Dual shaft axial flux motor for optical scanners
US11555895B2 (en) 2021-04-20 2023-01-17 Innovusion, Inc. Dynamic compensation to polygon and motor tolerance using galvo control profile
US11614521B2 (en) 2021-04-21 2023-03-28 Innovusion, Inc. LiDAR scanner with pivot prism and mirror
WO2022225859A1 (en) 2021-04-22 2022-10-27 Innovusion, Inc. A compact lidar design with high resolution and ultra-wide field of view
US11662439B2 (en) 2021-04-22 2023-05-30 Innovusion, Inc. Compact LiDAR design with high resolution and ultra-wide field of view
CN113223137B (en) * 2021-05-13 2023-03-24 广州虎牙科技有限公司 Generation method and device of perspective projection human face point cloud image and electronic equipment
US11662440B2 (en) 2021-05-21 2023-05-30 Innovusion, Inc. Movement profiles for smart scanning using galvonometer mirror inside LiDAR scanner
US11768294B2 (en) 2021-07-09 2023-09-26 Innovusion, Inc. Compact lidar systems for vehicle contour fitting
CN216356147U (en) 2021-11-24 2022-04-19 图达通智能科技(苏州)有限公司 Vehicle-mounted laser radar motor, vehicle-mounted laser radar and vehicle
CN115166654B (en) * 2022-06-24 2024-10-01 海信集团控股股份有限公司 Multi-millimeter wave radar calibration method, device and storage medium
CN115236690B (en) * 2022-09-20 2023-02-10 图达通智能科技(武汉)有限公司 Data fusion method and device for laser radar system and readable storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102314674B (en) * 2011-08-29 2013-04-03 北京建筑工程学院 Registering method for data texture image of ground laser radar
US8913836B1 (en) * 2013-12-20 2014-12-16 I.R.I.S. Method and system for correcting projective distortions using eigenpoints
CN104007444B (en) * 2014-06-09 2017-02-08 北京建筑大学 Ground laser radar reflection intensity image generation method based on central projection
CN107564069B (en) * 2017-09-04 2020-09-29 北京京东尚科信息技术有限公司 Method and device for determining calibration parameters and computer readable storage medium
CN110031824B (en) * 2019-04-12 2020-10-30 杭州飞步科技有限公司 Laser radar combined calibration method and device
CN110221275B (en) * 2019-05-21 2023-06-23 菜鸟智能物流控股有限公司 Calibration method and device between laser radar and camera

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2018214605A1 (en) * 2017-05-24 2018-11-29 京东方科技集团股份有限公司 Positioning method and apparatus for intelligent terminal device, and associated intelligent terminal device

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
A new method for multi-view point cloud registration in 3D laser scanning; Cai Runbin et al.; Journal of Tongji University (Natural Science); 2006-07-28 (No. 07); full text *
Generalized discriminant analysis based on QR decomposition for radar target recognition; Liu Hualin et al.; Journal of Infrared and Millimeter Waves; 2007-06-15 (No. 03); full text *

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11782131B2 (en) 2016-12-31 2023-10-10 Innovusion, Inc. 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices
US11782132B2 (en) 2016-12-31 2023-10-10 Innovusion, Inc. 2D scanning high precision LiDAR using combination of rotating concave mirror and beam steering devices
US11899134B2 (en) 2016-12-31 2024-02-13 Innovusion, Inc. 2D scanning high precision lidar using combination of rotating concave mirror and beam steering devices
US11782138B2 (en) 2018-02-21 2023-10-10 Innovusion, Inc. LiDAR detection systems and methods with high repetition rate to observe far objects
US11808888B2 (en) 2018-02-23 2023-11-07 Innovusion, Inc. Multi-wavelength pulse steering in LiDAR systems
US11860313B2 (en) 2018-06-15 2024-01-02 Innovusion, Inc. LiDAR systems and methods for focusing on ranges of interest
US11796645B1 (en) 2018-08-24 2023-10-24 Innovusion, Inc. Systems and methods for tuning filters for use in lidar systems
US11789128B2 (en) 2021-03-01 2023-10-17 Innovusion, Inc. Fiber-based transmitter and receiver channels of light detection and ranging systems
US11871130B2 (en) 2022-03-25 2024-01-09 Innovusion, Inc. Compact perception device
US12146988B2 (en) 2022-11-28 2024-11-19 Innovusion, Inc. Dynamic compensation to polygon and motor tolerance using galvo control profile

Also Published As

Publication number Publication date
CN112578396A (en) 2021-03-30

Similar Documents

Publication Publication Date Title
CN112578396B (en) Method and device for coordinate transformation between radars and computer-readable storage medium
CN110567480B (en) Optimization method, device and equipment for vehicle positioning and storage medium
CN109781119B (en) Laser point cloud positioning method and system
CN110793544B (en) Method, device and equipment for calibrating parameters of roadside sensing sensor and storage medium
KR101054736B1 (en) Method for 3d object recognition and pose estimation
KR20190088866A (en) Method, apparatus and computer readable medium for adjusting point cloud data collection trajectory
CN107067437B (en) Unmanned aerial vehicle positioning system and method based on multi-view geometry and bundle adjustment
WO2022217988A1 (en) Sensor configuration scheme determination method and apparatus, computer device, storage medium, and program
US11263774B2 (en) Three-dimensional position estimation device and program
CN114677588A (en) Obstacle detection method, obstacle detection device, robot and storage medium
WO2021262837A1 (en) Systems and methods for fine adjustment of roof models
CN110068824B (en) Sensor pose determining method and device
CN109035345A (en) The TOF camera range correction method returned based on Gaussian process
CN112967344A (en) Method, apparatus, storage medium, and program product for camera external reference calibration
WO2024119705A1 (en) Method and apparatus for measuring point of fall of jet flow of fire monitor, and fire-fighting control method and apparatus for fire monitor
CN112946612A (en) External parameter calibration method and device, electronic equipment and storage medium
CN111337010B (en) Positioning method and positioning device of movable equipment and electronic equipment
KR20220100813A (en) Automatic driving vehicle registration method and device, electronic equipment and a vehicle
CN111612835B (en) System and method suitable for extended target tilt tracking
CN112669388B (en) Calibration method and device for laser radar and camera device and readable storage medium
CN112652018B (en) External parameter determining method, external parameter determining device and electronic equipment
CN107992677B (en) Infrared weak and small moving target tracking method based on inertial navigation information and brightness correction
JPH07146121A (en) Recognition method and device for three dimensional position and attitude based on vision
CN111457928B (en) Robot positioning method and device
CN110261820A (en) A kind of time difference positioning method and device of more measuring stations

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: No.2 building, no.468 xinlai Road, Jiading District, Shanghai, 201821

Applicant after: Shanghai Hesai Technology Co.,Ltd.

Address before: 201800 Building 2, no.468, xinlai Road, Jiading District, Shanghai

Applicant before: Shanghai Hesai Technology Co., Ltd

GR01 Patent grant