CN112233188B - Calibration method of data fusion system of laser radar and panoramic camera - Google Patents
- Publication number: CN112233188B (application CN202011153637.6A)
- Authority
- CN
- China
- Prior art keywords
- camera
- panoramic
- image
- fisheye
- laser radar
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06T7/80 — Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
- G01S17/86 — Combinations of lidar systems with systems other than lidar, radar or sonar, e.g. with direction finders
- G01S17/931 — Lidar systems specially adapted for anti-collision purposes of land vehicles
- G01S7/497 — Means for monitoring or calibrating
- G06T3/4038 — Image mosaicing, e.g. composing plane images from plane sub-images
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10004 — Still image; Photographic image
- G06T2207/10032 — Satellite or aerial image; Remote sensing
- G06T2207/10044 — Radar image
- G06T2207/20221 — Image fusion; Image merging
- G06T2207/30204 — Marker
- G06T2207/30208 — Marker matrix
- G06T2207/30248 — Vehicle exterior or interior
- G06T2207/30252 — Vehicle exterior; Vicinity of vehicle
Abstract
The invention relates to a laser-radar-based vehicle-roof panoramic camera and a calibration method therefor. The panoramic camera and a circular-scanning laser radar are erected on the vehicle roof, perpendicular to each other and without mutual occlusion. The calibration method comprises the following steps. S1: mounting the calibrated panoramic camera on top of the automobile. S2: manufacturing a rectangular calibration cloth printed with a regular checkerboard. S3: driving the automobile onto the calibration cloth, aligning the long and short sides of the automobile with those of the cloth, and aligning the origin of the vehicle coordinate system with a corner point of the checkerboard. S4: acquiring a calibration image from the panoramic camera and measuring the pixel coordinates of selected corner points in the calibration image together with their corresponding three-dimensional coordinates in the vehicle coordinate system. S5: acquiring the data-conversion extrinsic parameters from the panoramic camera to the vehicle coordinate system through a nonlinear optimization algorithm. The panoramic camera thus constructed acquires a panoramic image of the sensing region above the vehicle roof and can detect and recognize traffic signs, forming a panoramic environment-sensing system.
Description
Technical Field
The invention belongs to the field of panoramic camera calibration, and particularly relates to a calibration method of a data fusion system of a laser radar and a panoramic camera.
Background
Intelligent driving technology plays an important role in preventing and avoiding traffic accidents. To replace the human driver in environmental sensing and cognition, intelligent vehicles are typically equipped with perception sensors such as cameras, laser radars, and millimeter-wave radars. However, environmental perception based on a single sensor has inherent shortcomings: although an image provides rich color and semantic information, its pixels carry no depth information owing to the imaging principle; a laser radar provides three-dimensional point clouds, but the point cloud is usually sparse, so small objects are easily missed; millimeter-wave radar resists environmental interference well but has lower accuracy and frequent clutter. To remedy the deficiencies of single sensors, sensor fusion is receiving increasing attention.
Intelligent automobiles are often equipped with multiple sensors to form accurate and redundant sensing systems, and in recent years various fusion systems have been proposed for their specific problems: laser radar with camera, camera with camera, laser radar with laser radar, camera with IMU, laser radar with IMU, surround-view cameras with ultrasonic sensors, and so on. Fusion of laser radar and camera in particular has greatly improved the accuracy of environmental target detection. At present, however, most fusion of images with laser point clouds relies either on a planar camera paired with a laser radar, or on a panoramic camera assembled from more than two cameras. Moreover, the precondition of multi-sensor fusion is that the sensor data can be spatially aligned, and the accuracy of this alignment directly determines the fusion result. Existing multi-sensor fusion devices and calibration methods mainly have the following defects:
(1) A panoramic camera assembled from several planar cameras facing different directions obtains a panoramic image by image stitching, but because the optical centers of the cameras deviate greatly from one another, the stitched image is prone to severe texture misalignment. Even with a least-squares image-transformation algorithm, only part of the texture can be aligned, while large pixel displacements distort some objects or make the spatial positions of pixels inaccurate.
(2) Some existing fusion devices place the laser radar above the panoramic camera. Although this arrangement keeps the laser point cloud from being blocked by the bracket, the area above the panoramic camera is blocked by the bottom surface of the laser radar, so that a large region above the fusion device is covered by no sensor.
(3) Extrinsic parameters between sensors mainly describe the data-conversion relation between two sensor coordinate systems; if they are obtained by direct pairwise joint calibration among multiple sensors, accumulated errors arise. At present there is no suitable calibration method that simultaneously calibrates a panoramic camera composed of two fisheye cameras together with a laser radar.
Disclosure of Invention
The invention aims to overcome the defects of the prior art and provides a calibration method of a data fusion system of a laser radar and a panoramic camera.
The aim of the invention is achieved by the following technical scheme: a calibration method of a data fusion system of a laser radar and a panoramic camera, in which the panoramic camera and a circular-scanning laser radar are erected on the vehicle roof by a bracket, perpendicular to each other and without mutual occlusion. The panoramic camera may be arranged either directly above or directly below the circular-scanning laser radar, and a dedicated bracket is designed to facilitate mounting of both.
The calibration method comprises the following steps:
s1: installing the calibrated panoramic camera on the top of the automobile;
s2: manufacturing rectangular calibration cloth, printing a regular checkerboard on the calibration cloth, and paving the calibration cloth on a horizontal ground with a flat surface;
s3: the method comprises the steps that an automobile is driven onto a calibration cloth, the direction of the automobile head is defined as being parallel to the long side of the rectangular calibration cloth, the direction of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, the Y axis of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, and the origin of a whole automobile coordinate system is aligned to a certain angular point of a checkerboard;
s4: acquiring a calibration image of the panoramic camera, and measuring pixel coordinates of partial angular points on the calibration image and three-dimensional coordinates corresponding to a whole vehicle coordinate system;
s5: acquiring a panoramic camera to an automobile coordinate system through a nonlinear optimization algorithmData conversion external reference T s2v =(R s2v ,t s2v )。
Let the coordinates of any obstacle perceived by the panoramic camera, in the panoramic-camera coordinate system, be $O_s = (x_s, y_s, z_s)^T$; the coordinates of the obstacle in the automobile coordinate system are then:

$$\begin{pmatrix} x_v \\ y_v \\ z_v \end{pmatrix} = R_{s2v} \begin{pmatrix} x_s \\ y_s \\ z_s \end{pmatrix} + t_{s2v}$$
further, the panoramic camera is formed by combining two fisheye cameras which are mounted back to back in a fitting manner; one of the fish-eye cameras is opposite to the front direction of the automobile, and the other fish-eye camera is opposite to the rear of the automobile.
Further, the spatio-temporal alignment between the panoramic camera and the circular-scanning laser radar mainly consists of time synchronization and spatial synchronization;
spatial synchronization mainly solves for the conversion matrices T that align the data with one another, and comprises the following steps:
two fisheye images acquired by two fisheye cameras are used for acquiring pixel positions of black points in the fisheye images, and three-dimensional positions of the black points in an environment coordinate system are also known, then a nonlinear optimization method is used for solving the minimum target loss function of the reprojection error, namely solving the environment coordinate system and external parameter transformation matrixes T of the two fisheye cameras g2c1 And T g2c2 The objective loss function is as follows:
wherein R is g2c And t g2c Representing a rotation matrix and a translation vector, X representing the three-dimensional coordinates of a black dot,representing coordinates of black points after three-dimensional coordinates are projected onto an image plane, f representing a fish-eye camera projection function, p c And representing the pixel coordinates of the corresponding black points on the image, and K represents the internal reference of the fisheye camera.
After the calibrated extrinsic conversion matrices $T_{l2g}$, $T_{g2c1}$, and $T_{g2c2}$ between the environment coordinate system, the circular-scanning laser radar, and the two fisheye cameras are respectively obtained, the extrinsics $T_{l2c1}$ and $T_{c22c1}$ between the circular-scanning laser radar and the two fisheye cameras are obtained. A laser point $(x_l, y_l, z_l)$ is projected to the pixel coordinates $(u_c, v_c)$ of the fisheye image by:

$$\begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} = R_{l2c1} \begin{pmatrix} x_l \\ y_l \\ z_l \end{pmatrix} + t_{l2c1}, \qquad \begin{pmatrix} u_c \\ v_c \end{pmatrix} = f_K\big(x_c, y_c, z_c\big)$$
wherein the laser point is first converted into the fisheye-camera coordinates $(x_c, y_c, z_c)$, and $f_K$ is the projection of a three-dimensional point in the fisheye-camera coordinate system onto the image plane:

$$f_K:\quad \begin{aligned} r &= \sqrt{x_c^2 + y_c^2}, \qquad \theta = \arctan\big(r / z_c\big) \\ \theta_{dist} &= \theta\big(1 + p_1\theta^2 + p_2\theta^4 + p_3\theta^6 + p_4\theta^8\big) \\ u_c &= f_x \cdot \theta_{dist} \cdot x_c / r + u_0 \\ v_c &= f_y \cdot \theta_{dist} \cdot y_c / r + v_0 \end{aligned}$$

where $f_x, f_y, u_0, v_0, p_1, p_2, p_3, p_4$ are intrinsic parameters of the fisheye camera, obtained by an intrinsic calibration method.
Panoramic image stitching across the two fisheye cameras realizes the function of the panoramic camera;
first, according to the data-conversion matrix between the two fisheye cameras, the image of the rear-view fisheye camera is projected into the front-view fisheye camera to form a spherical panorama: the rear-view fisheye image is projected onto a sphere of radius 1, and the pixel points on the sphere are then rotated so as to coincide with the coordinate system of the front-view fisheye camera, the conversion matrix being

$$T = T_{c22c1}\, T_{g2c1}\, T^{-1}_{g2c2}, \qquad T^{-1}_{g2c2} = \begin{pmatrix} R^{-1}_{g2c2} & -R^{-1}_{g2c2}\, t_{g2c2} \\ 0 & 1 \end{pmatrix}$$

where $R^{-1}_{g2c2}$ is the inverse matrix of $R_{g2c2}$.
The converted pixel points are projected, according to the $f_K$ projection formula, onto the image of the front-view fisheye camera, thereby forming the panoramic image. In the region where the pixels of the two fisheye images overlap after panoramic unwrapping, the images are fused with an alpha-blending algorithm, so that the texture and brightness of the fused panorama transition more continuously and evenly; the panoramic camera thus constructed forms the panoramic image;
the conversion matrices for projecting the laser point cloud of the circular-scanning laser radar into the fisheye camera and for stitching the two fisheye images into the panoramic camera image are as follows:

laser point cloud to the front-view fisheye camera image: $T_{l2g} \cdot T_{g2c1}$;

panoramic stitching of the fisheye images: $T_{c22c1} \cdot T_{g2c1} \cdot T^{-1}_{g2c2}$, where the optical axis of the front-view fisheye camera directly faces the driving direction of the automobile, $T^{-1}_{g2c2}$ is the inverse matrix of $T_{g2c2}$, and $T_{c22c1}$ is a transformation matrix rotated 180 degrees about the y axis.
For the two fisheye cameras of the same model, the frame rates are identical, so the images of both cameras can be captured synchronously simply by setting the same capture time and launching multiple threads. For time synchronization between the heterogeneous sensors, i.e. the circular-scanning laser radar and the fisheye cameras, whose frame rates differ, the lower frame rate of the laser radar is taken as the data-update rate: whenever a laser point-cloud frame is output, the current fisheye image is output with it, thereby forming time synchronization.
Further, the heterogeneous data-fusion method between the panoramic camera and the circular-scanning laser radar focuses on fusing the panoramic image with the panoramic laser point cloud and is divided into pixel-level fusion, feature-level fusion, and target-level fusion, computed as follows:

pixel-level fusion: $y = \mathrm{Conv}_E(x_c,\ x_l)$

feature-level fusion: $y = \mathrm{Conv}_M\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

target-level fusion: $y = \mathrm{fuse}\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

where $x_c$ is the original camera image, $x_l$ the original laser point cloud, $\mathrm{Conv}_E$ a deep convolutional neural network for pixel-level fusion, $\mathrm{Conv}_M$ a deep convolutional neural network for feature-level fusion, and $\mathrm{Conv}_c$ and $\mathrm{Conv}_l$ the neural networks that process the panoramic camera image and the circular-scanning laser radar data, respectively.
Further, in the panoramic image stitching process, the regions where the pixels of the two fisheye images overlap after panoramic unwrapping are fused with an alpha-blending algorithm.
Compared with the prior art, the invention has the following advantages:
(1) The invention uses a rectangular calibration cloth to calibrate the extrinsic parameters of the panoramic camera, composed of two fisheye cameras, and of the circular-scanning laser radar; calibration is completed simply by driving the automobile onto the cloth, so the calibration method is simple and efficient.
(2) The panoramic camera and the circular-scanning laser radar are perpendicular to each other and do not occlude one another, and the stitching-transition regions of the panoramic image are directed toward the two sides of the automobile, so the key regions ahead of and behind the vehicle retain higher pixel quality. Meanwhile, the area above the panoramic camera is unobstructed, so traffic signs such as signal lights and guideboards can be recognized.
(3) When the extrinsic parameters of the panoramic camera are acquired, the transition regions of the unwrapped images are fused with an alpha-blending algorithm, making the image texture and brightness at the transitions more continuous and even; the panoramic camera thus constructed forms the panoramic image.
Drawings
FIG. 1 is a flow chart of a method for calibrating a laser radar-based vehicle roof panoramic camera;
FIG. 2 is a schematic view of a fused area of a lidar-based roof panoramic camera of the present invention;
FIG. 3 is a schematic diagram of a method for calibrating a laser radar-based vehicle roof panoramic camera according to the present invention;
fig. 4 is a data conversion diagram of the panoramic camera and the circular scanning laser radar of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the present invention without making any inventive effort, shall fall within the scope of the present invention.
As shown in FIGS. 1-3, in a calibration method of a data fusion system of a laser radar and a panoramic camera, the panoramic camera and a circular-scanning laser radar are erected on the vehicle roof by a bracket, perpendicular to each other and without mutual occlusion; the panoramic camera may be arranged either directly above or directly below the circular-scanning laser radar.
The calibration method of the car roof panoramic camera based on the laser radar comprises the following steps:
s1: installing the calibrated panoramic camera on the top of the automobile;
s2: manufacturing rectangular calibration cloth, printing a regular checkerboard on the calibration cloth, and paving the calibration cloth on a horizontal ground with a flat surface;
s3: the method comprises the steps that an automobile is driven onto a calibration cloth, the direction of the automobile head is defined as being parallel to the long side of the rectangular calibration cloth, the direction of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, the Y axis of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, and the origin of a whole automobile coordinate system is aligned to a certain angular point of a checkerboard;
s4: acquiring a calibration image of the panoramic camera, and measuring pixel coordinates of partial angular points on the calibration image and three-dimensional coordinates corresponding to a whole vehicle coordinate system;
s5: acquiring data conversion external parameters T from a panoramic camera to an automobile coordinate system through a nonlinear optimization algorithm s2v =(R s2v ,t s2v )。
Let the coordinates of any obstacle perceived by the panoramic camera, in the panoramic-camera coordinate system, be $O_s = (x_s, y_s, z_s)^T$; the coordinates of the obstacle in the automobile coordinate system are then:

$$\begin{pmatrix} x_v \\ y_v \\ z_v \end{pmatrix} = R_{s2v} \begin{pmatrix} x_s \\ y_s \\ z_s \end{pmatrix} + t_{s2v}$$
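By way of illustration, step S5 can be carried out with a general-purpose nonlinear least-squares solver. The following Python sketch is not the patented implementation: the corner correspondences, the equirectangular stand-in for the panoramic projection model, and all numeric values are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

# Hypothetical measurements: pixel coordinates of checkerboard corners in the
# panoramic image and their 3D coordinates in the vehicle coordinate system.
pix = np.array([[812.3, 644.1], [905.7, 651.8], [818.9, 702.4], [910.2, 707.6]])
xyz_vehicle = np.array([[1.2, 0.6, 0.0], [1.2, 0.3, 0.0],
                        [0.9, 0.6, 0.0], [0.9, 0.3, 0.0]])

def project_panoramic(pts_cam, w=1920, h=960):
    """Equirectangular stand-in for the real panoramic projection model."""
    lon = np.arctan2(pts_cam[:, 0], pts_cam[:, 2])
    lat = np.arcsin(pts_cam[:, 1] / np.linalg.norm(pts_cam, axis=1))
    u = (lon / np.pi + 1.0) * w / 2.0
    v = (lat / (np.pi / 2) + 1.0) * h / 2.0
    return np.stack([u, v], axis=1)

def residuals(params):
    # params: rotation vector (3) and translation (3) of T_v2s (vehicle -> camera)
    R = Rotation.from_rotvec(params[:3]).as_matrix()
    t = params[3:]
    return (project_panoramic(xyz_vehicle @ R.T + t) - pix).ravel()

sol = least_squares(residuals, x0=np.zeros(6), method="lm")
R_v2s = Rotation.from_rotvec(sol.x[:3]).as_matrix()
t_v2s = sol.x[3:]
# Invert the pose to obtain the camera-to-vehicle extrinsics T_s2v = (R_s2v, t_s2v)
R_s2v, t_s2v = R_v2s.T, -R_v2s.T @ t_v2s
```

The solver estimates the vehicle-to-camera pose and inverts it to obtain $T_{s2v}$; in practice far more than four corner points would be measured.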
preferably, the panoramic camera is formed by combining two fisheye cameras which are mounted in a back-to-back joint manner; one of the fish-eye cameras is opposite to the front direction of the automobile, and the other fish-eye camera is opposite to the rear of the automobile.
The specific mounting method is as follows: first, the panoramic camera 1, composed of two fisheye cameras, is mounted on the top of the bracket 3, while the circular-scanning laser radar 2 is fastened to the side of the bracket 3 so that the circular-scanning laser radar 2 lies directly below the panoramic camera 1. Alternatively, the positions of the panoramic camera 1 and the circular-scanning laser radar 2 are exchanged: the circular-scanning laser radar 2 is erected on the top of the bracket 3 with the panoramic camera 1 directly below it, in which case the bracket 3 is adapted, for example by shaping the part that fastens the panoramic camera 1 into a circular ring. Finally, the data cables are fixed along the side of the bracket 3 to prevent the wiring harness from interfering with the laser radar point cloud.
Preferably, the spatio-temporal alignment between the panoramic camera and the circular-scanning laser radar mainly comprises time synchronization and spatial synchronization;
spatial synchronization mainly solves for the conversion matrices T that align the data with one another, and comprises the following steps:
two fisheye images acquired by two fisheye cameras are used for acquiring pixel positions of black points in the fisheye images, the three-dimensional positions of the black points in an environment coordinate system are also known, and then the minimum target loss function of the reprojection error is obtained through a nonlinear optimization method, namely, the environment coordinate system and external parameter transformation matrixes T of the two fisheye cameras are obtained g2c1 And T g2c2 The objective loss function is as follows:
wherein R is g2c And t g2c Representing a rotation matrix and a translation vector, X representing the three-dimensional coordinates of a black dot,representing coordinates of black points after three-dimensional coordinates are projected onto an image plane, f representing a fish-eye camera projection function, p c And representing the pixel coordinates of the corresponding black points on the image, and K represents the internal reference of the fisheye camera.
After the calibrated extrinsic conversion matrices $T_{l2g}$, $T_{g2c1}$, and $T_{g2c2}$ between the environment coordinate system, the circular-scanning laser radar, and the two fisheye cameras are respectively obtained, the extrinsics $T_{l2c1}$ and $T_{c22c1}$ between the circular-scanning laser radar and the two fisheye cameras are obtained. As shown in FIG. 4, a laser point $(x_l, y_l, z_l)$ is projected to the pixel coordinates $(u_c, v_c)$ of the fisheye image by:

$$\begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} = R_{l2c1} \begin{pmatrix} x_l \\ y_l \\ z_l \end{pmatrix} + t_{l2c1}, \qquad \begin{pmatrix} u_c \\ v_c \end{pmatrix} = f_K\big(x_c, y_c, z_c\big)$$

wherein the laser point is first converted into the fisheye-camera coordinates $(x_c, y_c, z_c)$, and $f_K$ is the projection of a three-dimensional point in the fisheye-camera coordinate system onto the image plane:
$$f_K:\quad \begin{aligned} r &= \sqrt{x_c^2 + y_c^2}, \qquad \theta = \arctan\big(r / z_c\big) \\ \theta_{dist} &= \theta\big(1 + p_1\theta^2 + p_2\theta^4 + p_3\theta^6 + p_4\theta^8\big) \\ u_c &= f_x \cdot \theta_{dist} \cdot x_c / r + u_0 \\ v_c &= f_y \cdot \theta_{dist} \cdot y_c / r + v_0 \end{aligned}$$

where $f_x, f_y, u_0, v_0, p_1, p_2, p_3, p_4$ are intrinsic parameters of the fisheye camera, obtained by an intrinsic calibration method.
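As an illustrative sketch, the projection $f_K$ may be implemented as follows with numpy; the intrinsic values in the example call are placeholders, not calibrated parameters.

```python
import numpy as np

def f_K(pts, fx, fy, u0, v0, p1, p2, p3, p4):
    """Project 3D points in the fisheye-camera frame to pixel coordinates
    using the equidistant model with the polynomial distortion theta_dist."""
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    r = np.sqrt(x**2 + y**2)
    theta = np.arctan2(r, z)
    theta_d = theta * (1 + p1*theta**2 + p2*theta**4 + p3*theta**6 + p4*theta**8)
    # Guard the optical axis, where r -> 0
    scale = np.where(r > 1e-9, theta_d / np.maximum(r, 1e-9), 0.0)
    return np.stack([fx * scale * x + u0, fy * scale * y + v0], axis=1)

# Example: project laser points already transformed into the camera frame
pts_cam = np.array([[0.5, -0.2, 3.0], [1.0, 0.4, 2.0]])
uv = f_K(pts_cam, fx=350.0, fy=350.0, u0=640.0, v0=480.0,
         p1=-0.01, p2=0.002, p3=0.0, p4=0.0)
```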
Panoramic image stitching across the two fisheye cameras realizes the function of the panoramic camera;
first, according to the data-conversion matrix between the two fisheye cameras, the image of the rear-view fisheye camera is projected into the front-view fisheye camera to form a spherical panorama: the rear-view fisheye image is projected onto a sphere of radius 1, and the pixel points on the sphere are then rotated so as to coincide with the coordinate system of the front-view fisheye camera, the conversion matrix being

$$T = T_{c22c1}\, T_{g2c1}\, T^{-1}_{g2c2}, \qquad T^{-1}_{g2c2} = \begin{pmatrix} R^{-1}_{g2c2} & -R^{-1}_{g2c2}\, t_{g2c2} \\ 0 & 1 \end{pmatrix}$$

where $R^{-1}_{g2c2}$ is the inverse matrix of $R_{g2c2}$.
The converted pixel points are projected, according to the $f_K$ projection formula, onto the image of the front-view fisheye camera, thereby forming the panoramic image. In the region where the pixels of the two fisheye images overlap after panoramic unwrapping, the images are fused with an alpha-blending algorithm, so that the texture and brightness of the fused panorama transition more continuously and evenly; the panoramic camera thus constructed forms the panoramic image;
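A minimal numpy sketch of the sphere rotation described above, assuming the calibrated rotations are already available (the identity and 180-degree values below are placeholders):

```python
import numpy as np

def rot_y(deg):
    """3x3 rotation matrix about the y axis."""
    a = np.deg2rad(deg)
    return np.array([[ np.cos(a), 0.0, np.sin(a)],
                     [ 0.0,       1.0, 0.0      ],
                     [-np.sin(a), 0.0, np.cos(a)]])

# Placeholder calibrated rotations: environment -> front / rear fisheye camera
R_g2c1 = np.eye(3)
R_g2c2 = rot_y(180.0)          # rear camera mounted back to back
R_c22c1 = rot_y(180.0)         # the 180-degree rotation about the y axis

# Rotation taking unit-sphere points from the rear-camera frame to the
# front-camera frame: R = R_c22c1 * R_g2c1 * R_g2c2^-1
R = R_c22c1 @ R_g2c1 @ np.linalg.inv(R_g2c2)

# Rear-fisheye pixels back-projected onto the radius-1 sphere as unit rays
rays_rear = np.array([[0.0, 0.0, 1.0], [0.3, 0.1, 0.95]])
rays_rear /= np.linalg.norm(rays_rear, axis=1, keepdims=True)
rays_front = rays_rear @ R.T   # rotated onto the front-camera sphere
# rays_front can now be projected with f_K into the front fisheye image
```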
For the two fisheye cameras of the same model, the frame rates are identical, so the images of both cameras can be captured synchronously simply by setting the same capture time and launching multiple threads. For time synchronization between the heterogeneous sensors, i.e. the circular-scanning laser radar and the fisheye cameras, whose frame rates differ, the lower frame rate of the laser radar is taken as the data-update rate: whenever a laser point-cloud frame is output, the current fisheye image is output with it, thereby forming time synchronization.
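A possible software realization of this synchronization scheme is sketched below; the `grab()` and `next_sweep()` calls stand for whatever capture API the actual cameras and laser radar expose and are hypothetical.

```python
import queue
import threading

latest_image = {"frame": None}
lock = threading.Lock()
fused_queue = queue.Queue()

def fisheye_capture_loop(cam):
    # The two same-model fisheye cameras share one frame rate; each camera
    # thread simply keeps the most recent frame (cam.grab() is hypothetical).
    while True:
        frame = cam.grab()
        with lock:
            latest_image["frame"] = frame

def lidar_loop(lidar):
    # The lower laser-radar frame rate drives the data-update rate: whenever
    # a point-cloud sweep is output, it is paired with the current image.
    while True:
        cloud = lidar.next_sweep()   # hypothetical blocking call
        with lock:
            frame = latest_image["frame"]
        fused_queue.put((cloud, frame))

# threading.Thread(target=fisheye_capture_loop, args=(cam,), daemon=True).start()
# threading.Thread(target=lidar_loop, args=(lidar,), daemon=True).start()
```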
For target-level data fusion between the heterogeneous sensors, i.e. the fisheye cameras and the circular-scanning laser radar, the time-synchronization problem can be avoided. In the fusion process, the observation data $z(t)$ of each sensor and the prediction of the global system are combined in a sensor-to-global Kalman-filter manner: the predicted state is $\hat{x}(t \mid t-1) = F\,\hat{x}(t-1)$, and the fused data are

$$\hat{x}(t) = \hat{x}(t \mid t-1) + K\big(z(t) - H\,\hat{x}(t \mid t-1)\big)$$

where $K$ is the Kalman gain, $H$ the spatial conversion (observation) matrix, $F$ the state-transition matrix, and $\hat{x}(t-1)$ the system data at the previous time.
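A minimal sketch of one such sensor-to-global Kalman step follows; the process-noise and measurement-noise covariances `Q` and `R_noise` are additional assumptions not named in the text.

```python
import numpy as np

def kalman_fuse(x_prev, P_prev, z, F, H, Q, R_noise):
    """One sensor-to-global Kalman step: predict with the state-transition
    matrix F, then fuse the sensor observation z(t) via the gain K."""
    x_pred = F @ x_prev                       # prediction from the previous time
    P_pred = F @ P_prev @ F.T + Q             # predicted covariance
    S = H @ P_pred @ H.T + R_noise            # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)       # Kalman gain
    x_fused = x_pred + K @ (z - H @ x_pred)   # fused data
    P_fused = (np.eye(len(x_prev)) - K @ H) @ P_pred
    return x_fused, P_fused
```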
Preferably, the heterogeneous data-fusion method between the panoramic camera and the circular-scanning laser radar focuses on fusing the panoramic image with the panoramic laser point cloud and is divided into pixel-level fusion, feature-level fusion, and target-level fusion, computed as follows:

pixel-level fusion: $y = \mathrm{Conv}_E(x_c,\ x_l)$

feature-level fusion: $y = \mathrm{Conv}_M\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

target-level fusion: $y = \mathrm{fuse}\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

where $x_c$ is the original camera image, $x_l$ the original laser point cloud, $\mathrm{Conv}_E$ a deep convolutional neural network for pixel-level fusion, $\mathrm{Conv}_M$ a deep convolutional neural network for feature-level fusion, and $\mathrm{Conv}_c$ and $\mathrm{Conv}_l$ the neural networks that process the panoramic camera image and the circular-scanning laser radar data, respectively.
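For illustration, the pixel-level and feature-level variants could be sketched in PyTorch as below; the channel counts, network depths, and the rasterization of the point cloud onto the panoramic image plane are all assumptions.

```python
import torch
import torch.nn as nn

class PixelLevelFusion(nn.Module):
    """Conv_E: one deep network over the channel-concatenated raw inputs."""
    def __init__(self, img_ch=3, lidar_ch=1, out_ch=64):
        super().__init__()
        self.conv_e = nn.Sequential(
            nn.Conv2d(img_ch + lidar_ch, out_ch, 3, padding=1), nn.ReLU(),
            nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU())

    def forward(self, x_c, x_l):
        # x_l: the point cloud rasterized into the panoramic image plane
        return self.conv_e(torch.cat([x_c, x_l], dim=1))

class FeatureLevelFusion(nn.Module):
    """Conv_c / Conv_l extract per-sensor features; Conv_M fuses them."""
    def __init__(self, img_ch=3, lidar_ch=1, feat=32):
        super().__init__()
        self.conv_c = nn.Conv2d(img_ch, feat, 3, padding=1)
        self.conv_l = nn.Conv2d(lidar_ch, feat, 3, padding=1)
        self.conv_m = nn.Conv2d(2 * feat, feat, 3, padding=1)

    def forward(self, x_c, x_l):
        return self.conv_m(torch.cat([self.conv_c(x_c), self.conv_l(x_l)], dim=1))

x_c = torch.rand(1, 3, 64, 128)    # panoramic image tensor
x_l = torch.rand(1, 1, 64, 128)    # rasterized point-cloud tensor
y = FeatureLevelFusion()(x_c, x_l)
```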
Further, the conversion matrices for projecting the laser point cloud of the circular-scanning laser radar into the fisheye camera and for stitching the two fisheye images into the panoramic camera image are as follows:

laser point cloud to the front-view fisheye camera image: $T_{l2g} \cdot T_{g2c1}$;

panoramic stitching of the fisheye images: $T_{c22c1} \cdot T_{g2c1} \cdot T^{-1}_{g2c2}$, where the optical axis of the front-view fisheye camera directly faces the driving direction of the automobile, $T^{-1}_{g2c2}$ is the inverse matrix of $T_{g2c2}$, and $T_{c22c1}$ is a transformation matrix rotated 180 degrees about the y axis.
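A small numpy sketch of composing these conversion chains, with identity placeholders for the calibrated extrinsics; note that under the column-vector convention the listed chain composes with the later transform on the left.

```python
import numpy as np

def compose(*Ts):
    """Compose 4x4 homogeneous transforms; compose(A, B) applies B first."""
    out = np.eye(4)
    for T in Ts:
        out = out @ T
    return out

# Identity placeholders for the calibrated 4x4 extrinsics
T_l2g = np.eye(4)     # laser radar -> environment (calibration ground)
T_g2c1 = np.eye(4)    # environment -> front-view fisheye camera
T_g2c2 = np.eye(4)    # environment -> rear-view fisheye camera
T_c22c1 = np.eye(4)   # 180-degree rotation about the y axis

# Laser point cloud to the front-view fisheye camera (chain of T_l2g, T_g2c1)
T_l2c1 = compose(T_g2c1, T_l2g)

# Panoramic stitching of the fisheye images: T_c22c1 * T_g2c1 * T_g2c2^-1
T_stitch = compose(T_c22c1, T_g2c1, np.linalg.inv(T_g2c2))

# Apply to a homogeneous laser point (x_l, y_l, z_l, 1)
p_l = np.array([2.0, 0.5, -0.3, 1.0])
p_c1 = T_l2c1 @ p_l   # then project with f_K to obtain (u_c, v_c)
```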
Preferably, in the panoramic image stitching process, the regions where the pixels of the two fisheye images overlap after panoramic unwrapping are fused with an alpha-blending algorithm.
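One possible form of the alpha blending is sketched below, assuming for simplicity a single overlap band across which the blending weight ramps linearly; a real panorama has two transition bands, one toward each side of the automobile.

```python
import numpy as np

def alpha_blend(front, rear, overlap_mask):
    """Blend two unwrapped fisheye panoramas inside their overlap band.

    front, rear: HxWx3 float images already warped into the shared panoramic
    frame; overlap_mask: HxW bool marking the transition band."""
    h, w = overlap_mask.shape
    band = np.flatnonzero(overlap_mask.any(axis=0))   # columns of the band
    alpha = np.zeros(w)
    if band.size:
        # blending weight ramps linearly from front (0) to rear (1)
        alpha[band] = (band - band[0]) / max(band[-1] - band[0], 1)
    alpha = np.broadcast_to(alpha, (h, w))[..., None]
    blended = (1.0 - alpha) * front + alpha * rear
    keep_front = front.any(axis=2, keepdims=True)
    return np.where(overlap_mask[..., None], blended,
                    np.where(keep_front, front, rear))
```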
In summary, the panoramic camera constructed from the two fisheye cameras perceives a full 360-degree region, and it neither occludes nor is occluded by the circular-scanning laser radar. Once mounted on the top of an automobile, the important regions ahead of and behind the vehicle are covered simultaneously by the panoramic camera and the laser radar. In addition, the calibration method provided by the invention can calibrate the extrinsic parameters of the roof-mounted panoramic camera and circular-scanning laser radar at the same time, and is more stable and reliable.
While the invention has been described with reference to certain preferred embodiments, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted without departing from the scope of the invention. Therefore, the protection scope of the invention is defined by the claims.
Claims (3)
1. A calibration method of a data fusion system of a laser radar and a panoramic camera, characterized in that the panoramic camera and a circular-scanning laser radar are erected on a vehicle roof by means of a bracket, the panoramic camera and the circular-scanning laser radar being perpendicular to each other without mutual occlusion, the calibration method comprising the following steps:
s1: installing the calibrated panoramic camera on the top of the automobile;
s2: manufacturing rectangular calibration cloth, printing a regular checkerboard on the calibration cloth, and paving the calibration cloth on a horizontal ground with a flat surface;
s3: the method comprises the steps that an automobile is driven onto a calibration cloth, the direction of the automobile head is defined as being parallel to the long side of the rectangular calibration cloth, the direction of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, the Y axis of the automobile head is defined as being parallel to the short side of the rectangular calibration cloth, and the origin of a whole automobile coordinate system is aligned to a certain angular point of a checkerboard;
s4: acquiring a calibration image of the panoramic camera, and measuring pixel coordinates of partial angular points on the calibration image and three-dimensional coordinates corresponding to a whole vehicle coordinate system;
s5: acquiring data conversion external parameters T from a panoramic camera to an automobile coordinate system through a nonlinear optimization algorithm s2v =(R s2v ,t s2v );
letting the coordinates of any obstacle perceived by the panoramic camera, in the panoramic-camera coordinate system, be $O_s = (x_s, y_s, z_s)^T$, the coordinates of the obstacle in the automobile coordinate system are:

$$\begin{pmatrix} x_v \\ y_v \\ z_v \end{pmatrix} = R_{s2v} \begin{pmatrix} x_s \\ y_s \\ z_s \end{pmatrix} + t_{s2v};$$
the panoramic camera is formed by two fisheye cameras mounted back to back against each other; one fisheye camera directly faces the front of the automobile and the other directly faces the rear;
the spatio-temporal alignment between the panoramic camera and the circular-scanning laser radar includes time synchronization and spatial synchronization;
spatial synchronization mainly solves for the conversion matrices T that align the data with one another, and comprises the following steps:
two fisheye images acquired by two fisheye cameras are used for acquiring pixel positions of black points in the fisheye images, and three-dimensional positions of the black points in an environment coordinate system are also known, then a nonlinear optimization method is used for solving the minimum target loss function of the reprojection error, namely solving the environment coordinate system and external parameter transformation matrixes T of the two fisheye cameras g2c1 And T g2c2 The objective loss function is as follows:
wherein R is g2c And t g2c Representing a rotation matrix and a translation vector, X representing the three-dimensional coordinates of a black dot,representing coordinates of black points after three-dimensional coordinates are projected onto an image plane, f representing a fish-eye camera projection function, p c Representing pixel coordinates of the corresponding black points on the image, wherein K represents an internal reference of the fish-eye camera;
after the calibrated extrinsic conversion matrices $T_{l2g}$, $T_{g2c1}$, and $T_{g2c2}$ between the environment coordinate system, the circular-scanning laser radar, and the two fisheye cameras are respectively obtained, the extrinsics $T_{l2c1}$ and $T_{c22c1}$ between the circular-scanning laser radar and the two fisheye cameras are obtained; a laser point $(x_l, y_l, z_l)$ is projected to the pixel coordinates $(u_c, v_c)$ of the fisheye image by:

$$\begin{pmatrix} x_c \\ y_c \\ z_c \end{pmatrix} = R_{l2c1} \begin{pmatrix} x_l \\ y_l \\ z_l \end{pmatrix} + t_{l2c1}, \qquad \begin{pmatrix} u_c \\ v_c \end{pmatrix} = f_K\big(x_c, y_c, z_c\big)$$

wherein the laser point is first converted into the fisheye-camera coordinates $(x_c, y_c, z_c)$, and $f_K$ is the projection of a three-dimensional point in the fisheye-camera coordinate system onto the image plane:

$$f_K:\quad \begin{aligned} r &= \sqrt{x_c^2 + y_c^2}, \qquad \theta = \arctan\big(r / z_c\big) \\ \theta_{dist} &= \theta\big(1 + p_1\theta^2 + p_2\theta^4 + p_3\theta^6 + p_4\theta^8\big) \\ u_c &= f_x \cdot \theta_{dist} \cdot x_c / r + u_0 \\ v_c &= f_y \cdot \theta_{dist} \cdot y_c / r + v_0 \end{aligned}$$

where $f_x, f_y, u_0, v_0, p_1, p_2, p_3, p_4$ are intrinsic parameters of the fisheye camera, obtained by an intrinsic calibration method;
panoramic image stitching across the two fisheye cameras realizes the function of the panoramic camera;
first, according to the data-conversion matrix between the two fisheye cameras, the image of the rear-view fisheye camera is projected into the front-view fisheye camera to form a spherical panorama: the rear-view fisheye image is projected onto a sphere of radius 1, and the pixel points on the sphere are then rotated so as to coincide with the coordinate system of the front-view fisheye camera, the conversion matrix being

$$T = T_{c22c1}\, T_{g2c1}\, T^{-1}_{g2c2}, \qquad T^{-1}_{g2c2} = \begin{pmatrix} R^{-1}_{g2c2} & -R^{-1}_{g2c2}\, t_{g2c2} \\ 0 & 1 \end{pmatrix}$$

where $R^{-1}_{g2c2}$ is the inverse matrix of $R_{g2c2}$; $R_{g2c2}$ is the rotation matrix of the extrinsic conversion matrix $T_{g2c2}$; and $t_{g2c2}$ is the translation vector of the extrinsic conversion matrix $T_{g2c2}$;
the converted pixel points are projected, according to the $f_K$ projection formula, onto the image of the front-view fisheye camera, thereby forming the panoramic image;
the conversion matrices for projecting the laser point cloud of the circular-scanning laser radar into the fisheye camera and for stitching the two fisheye images into the panoramic camera image are as follows:

laser point cloud to the front-view fisheye camera image: $T_{l2g} \cdot T_{g2c1}$;

panoramic stitching of the fisheye camera images: $T_{c22c1} \cdot T_{g2c1} \cdot T^{-1}_{g2c2}$, where the optical axis of the front-view fisheye camera directly faces the driving direction of the automobile, $T^{-1}_{g2c2}$ is the inverse matrix of $T_{g2c2}$, and $T_{c22c1}$ is a transformation matrix rotated 180 degrees about the y axis;
for the two fisheye cameras of the same model, the frame rates are identical, so the images of both cameras can be captured synchronously simply by setting the same capture time and launching multiple threads; for time synchronization between the heterogeneous sensors, i.e. the circular-scanning laser radar and the fisheye cameras, whose frame rates differ, the lower frame rate of the laser radar is taken as the data-update rate: whenever a laser point-cloud frame is output, the current fisheye image is output with it, thereby forming time synchronization.
2. The calibration method of a data fusion system of a laser radar and a panoramic camera according to claim 1, characterized in that it further comprises a heterogeneous data-fusion method between the panoramic camera and the circular-scanning laser radar, which focuses on fusing the panoramic image with the panoramic laser point cloud and is divided into pixel-level fusion, feature-level fusion, and target-level fusion, computed as follows:

pixel-level fusion: $y = \mathrm{Conv}_E(x_c,\ x_l)$

feature-level fusion: $y = \mathrm{Conv}_M\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

target-level fusion: $y = \mathrm{fuse}\big(\mathrm{Conv}_c(x_c),\ \mathrm{Conv}_l(x_l)\big)$

where $x_c$ is the original camera image, $x_l$ the original laser point cloud, $\mathrm{Conv}_E$ a deep convolutional neural network for pixel-level fusion, $\mathrm{Conv}_M$ a deep convolutional neural network for feature-level fusion, and $\mathrm{Conv}_c$ and $\mathrm{Conv}_l$ the neural networks that process the panoramic camera image and the circular-scanning laser radar data, respectively.
3. The calibration method of a data fusion system of a laser radar and a panoramic camera according to claim 1, characterized in that, in the panoramic image stitching process, the regions where the pixels of the two fisheye images overlap after panoramic unwrapping are fused with an alpha-blending algorithm.
Priority Applications (1)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011153637.6A (granted as CN112233188B) | 2020-10-26 | 2020-10-26 | Calibration method of data fusion system of laser radar and panoramic camera |
Publications (2)
Publication Number | Publication Date |
---|---|
CN112233188A CN112233188A (en) | 2021-01-15 |
CN112233188B true CN112233188B (en) | 2024-03-12 |
Family
ID=74110032

Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011153637.6A (granted as CN112233188B, Active) | Calibration method of data fusion system of laser radar and panoramic camera | 2020-10-26 | 2020-10-26 |

Country Status (1)

| Country | Link |
|---|---|
| CN | CN112233188B (en) |
Families Citing this family (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113192182A (en) * | 2021-04-29 | 2021-07-30 | 山东产研信息与人工智能融合研究院有限公司 | Multi-sensor-based live-action reconstruction method and system |
CN113096187B (en) * | 2021-05-03 | 2022-05-17 | 湖北汽车工业学院 | Method for automatically acquiring relative position of vehicle and obstacle |
CN113205604A (en) * | 2021-05-17 | 2021-08-03 | 南昌智能新能源汽车研究院 | Feasible region detection method based on camera and laser radar |
CN113759385A (en) * | 2021-08-12 | 2021-12-07 | 江苏徐工工程机械研究院有限公司 | A lidar and camera fusion ranging method and system |
CN113838145B (en) * | 2021-09-24 | 2024-04-30 | 重庆长安汽车股份有限公司 | Automatic calibration method for external parameters of vehicle-mounted camera |
CN116203542B (en) * | 2022-12-31 | 2023-10-03 | 中山市博测达电子科技有限公司 | Laser radar distortion test calibration method |
CN118570409B (en) * | 2024-08-01 | 2024-12-06 | 苏州魔视智能科技有限公司 | Image generation method, model training method, device, equipment and storage medium |
Patent Citations (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102508258A (en) * | 2011-11-29 | 2012-06-20 | 中国电子科技集团公司第二十七研究所 | Three-dimensional imaging laser radar for obtaining surveying and mapping information |
CN102608613A (en) * | 2012-03-20 | 2012-07-25 | 西安理工大学 | Device and method for accurately calibrating point object detectivity of laser radar |
CN104156972A (en) * | 2014-08-25 | 2014-11-19 | 西北工业大学 | Perspective imaging method based on laser scanning distance measuring instrument and multiple cameras |
CN104573646A (en) * | 2014-12-29 | 2015-04-29 | 长安大学 | Detection method and system, based on laser radar and binocular camera, for pedestrian in front of vehicle |
CN104833372A (en) * | 2015-04-13 | 2015-08-12 | 武汉海达数云技术有限公司 | External parameter calibration method of high-definition panoramic camera of mobile measuring system |
CN107133988A (en) * | 2017-06-06 | 2017-09-05 | 科大讯飞股份有限公司 | The scaling method and calibration system of camera in vehicle-mounted panoramic viewing system |
CN108805801A (en) * | 2018-05-24 | 2018-11-13 | 北京华捷艾米科技有限公司 | A kind of panoramic picture bearing calibration and system |
CN109360245A (en) * | 2018-10-26 | 2019-02-19 | 魔视智能科技(上海)有限公司 | The external parameters calibration method of automatic driving vehicle multicamera system |
CN111382541A (en) * | 2018-12-29 | 2020-07-07 | 达索系统公司 | Set of neural networks |
CN111400960A (en) * | 2018-12-29 | 2020-07-10 | 美宅科技(北京)有限公司 | Furniture active customization method and active customization device |
CN110221275A (en) * | 2019-05-21 | 2019-09-10 | 菜鸟智能物流控股有限公司 | Calibration method and device between laser radar and camera |
CN110162919A (en) * | 2019-05-31 | 2019-08-23 | 中国汽车工程研究院股份有限公司 | It is a kind of that abnormal sound risk class scaling method is hit based on the automobile interior of limiting temperature off field |
US10726579B1 (en) * | 2019-11-13 | 2020-07-28 | Honda Motor Co., Ltd. | LiDAR-camera calibration |
CN111369630A (en) * | 2020-02-27 | 2020-07-03 | 河海大学常州校区 | A method of multi-line lidar and camera calibration |
Non-Patent Citations (3)

| Title |
|---|
| Feng, QingQuan. "Radar-Vision Fusion for Correcting the Position of Target Vehicles." 2018 10th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC). |
| Ouyang, Yi (欧阳毅). "Environment Perception and Autonomous Localization System Based on the Fusion of Laser Radar and Vision" (基于激光雷达与视觉融合的环境感知与自主定位系统). China Masters' Theses Full-text Database (中国优秀硕士论文全文数据库). |
| Cao, Mingwei (曹明玮). "Joint Calibration of a Panoramic Camera and Laser Radar Based on Supervised Learning" (基于监督式学习的全景相机与激光雷达的联合标定). Mechatronics (机电一体化), 2018. |
Also Published As
Publication number | Publication date |
---|---|
CN112233188A (en) | 2021-01-15 |
Legal Events

| Date | Code | Title |
|---|---|---|
| | PB01 | Publication |
| | SE01 | Entry into force of request for substantive examination |
| | GR01 | Patent grant |