
CN109360245A - External parameter calibration method for multi-camera system of unmanned vehicle - Google Patents

External parameter calibration method for multi-camera system of unmanned vehicle

Info

Publication number
CN109360245A
Authority
CN
China
Prior art keywords
camera
calibrated
relay
vehicle
common
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201811256308.7A
Other languages
Chinese (zh)
Other versions
CN109360245B (en)
Inventor
周易
李发成
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Magic Vision Intelligent Technology (Shanghai) Co Ltd
Original Assignee
Magic Vision Intelligent Technology (Shanghai) Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Magic Vision Intelligent Technology (Shanghai) Co Ltd filed Critical Magic Vision Intelligent Technology (Shanghai) Co Ltd
Priority to CN201811256308.7A priority Critical patent/CN109360245B/en
Publication of CN109360245A publication Critical patent/CN109360245A/en
Application granted granted Critical
Publication of CN109360245B publication Critical patent/CN109360245B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/80Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

An external parameter calibration method for a multi-camera system of an unmanned vehicle, comprising: arranging at least two relay camera groups between adjacent cameras to be calibrated of the vehicle; synchronizing the cameras to be calibrated and the relay cameras; moving a calibration board around the vehicle so that it passes all the cameras in sequence; starting the cameras to be calibrated and the relay cameras and photographing the moving calibration board; detecting the 2D pixel coordinates of each feature pattern in each camera; and estimating the external parameters. The invention introduces relay cameras and, based on them, establishes a redundant pose-graph optimization strategy. This overcomes the accumulation of external parameter error over the distance between cameras in conventional methods, yields external parameter estimates accurate enough to meet the requirements of a SLAM system, and requires no calibration structure proportional to the vehicle platform itself, saving site space as well as the design and manufacturing cost of markers.

Description

External parameter calibration method for multi-camera system of unmanned vehicle
Technical Field
The invention belongs to the technical field of multi-camera systems, and particularly relates to an external parameter calibration method of a multi-camera system of an unmanned vehicle.
Background
As one of the most promising technologies in the world today, unmanned driving means that an automobile senses its surroundings and completes navigation tasks through on-board sensors, without human operation. PwC predicts that the popularization of unmanned driving technology will reduce overall traffic accidents by ninety percent; the KPMG research center predicts that unmanned technology will drive improvements in productivity and energy efficiency and that new business models will emerge.
Unmanned vehicles are typically equipped with sensors such as cameras, inertial measurement units (IMUs), lidar and global positioning systems (GPS). Of these, the camera senses the richest external information, including the color, structure and texture of the scene as well as semantic information (such as roads, pedestrians and traffic signs). Whereas a human driver can observe the traffic situation in only one direction at a time, unmanned-driving technology aims at 360-degree, all-round sensing of the environment around the vehicle body without blind spots. Because a single camera has a limited field of view, a panoramic imaging system is typically composed of multiple cameras. The navigation task usually requires the information from the multiple cameras to be expressed in the same coordinate system, so the external parameters between the cameras need to be calibrated. For small vehicles, manufacturers or developers can obtain the extrinsic parameters between the multi-view cameras through global positioning by building static markers (calibration plates). For large vehicles (e.g., heavy trucks with trailers), however, a limited number of cameras are often mounted around the vehicle body, out of both blind-zone surround-view and cost considerations. Two problems then arise: (1) there is a large separation between cameras; (2) there is no (or only a small) overlap between the fields of view of some cameras. These practical conditions make external parameter calibration of the multi-view cameras very difficult: simply adopting the above calibration strategy used for small-vehicle surround-view multi-view cameras would place great demands on the calibration site. In addition, most existing external parameter calibration techniques for surround-view systems were developed mainly for generating a high-quality 360-degree bird's-eye view, and the quality of the final image-stitching result is usually judged visually. In fact, the accuracy a SLAM system requires of the external parameter calibration of a multi-view camera system is much higher than that required by a system aimed at image stitching.
The existing calibration scheme and the advantages and disadvantages thereof are as follows:
1. Invention patent titled "Calibration method of a panoramic vision assisted parking system", publication number CN 101425181B:
The invention discloses a calibration method for a panoramic-vision assisted parking system, in which a virtual bird's-eye view at a certain height above the top of the automobile is generated from the images of four wide-angle fisheye cameras arranged around the automobile. The positional relationship of each camera with respect to the virtual bird's-eye-view camera is determined by calculation based on a homography of the ground plane. Since the position of the virtual bird's-eye-view camera is estimated by low-precision measurement, the spatial relationships between the multi-view cameras referenced to that virtual camera satisfy the requirement of seamless image stitching, but cannot meet the design requirements of a SLAM system based on multi-view cameras.
2. Invention patent titled "Dynamic calibration system, and joint optimization method and device in a dynamic calibration system", publication number CN 105844624A:
The invention provides a dynamic calibration system and a joint optimization method and device within it. In the inter-camera external parameter calibration step, several groups of static calibration objects with known relative spatial positions must be constructed artificially. The calibration task requires a camera-equipped vehicle to pass the static markers along a designed trajectory, and the calibration data are acquired during this dynamic process. Compared with the method of patent CN101425181B, this method obtains the camera external parameters more accurately, but its requirements on the calibration scene are too strict, and it is not easy to extend to the calibration of a surround-view multi-view camera system on a large vehicle platform.
3. Invention patent titled "Method for detecting and tracking a driving obstacle of a heavy-duty truck based on binocular fisheye cameras", publication number CN 105678787A:
The invention discloses a method for detecting and tracking driving obstacles of a heavy-duty truck based on binocular fisheye cameras, belonging to the technical field of active safety of traffic vehicles. In order to detect obstacles behind the truck when it backs up, a pair of binocular fisheye cameras is arranged at the rear of the truck. Because scene depth must be measured, the two fisheye cameras need a sufficiently large overlapping field of view. This configuration simplifies the external parameter calibration of the binocular camera, which can be completed with existing open-source camera calibration toolboxes. For a surround-view multi-view camera system on a large truck, however, this approach cannot be used, because the distance between the cameras (the baseline) is large and there is no (or only a small) overlapping field of view.
Disclosure of Invention
Based on the above technical problem, an external parameter calibration method for a multi-camera system of an unmanned vehicle is provided.
In order to solve the technical problems, the invention adopts the following technical scheme:
a method for calibrating external parameters of a multi-camera system of an unmanned vehicle comprises the following steps:
110. arranging at least two relay camera groups between adjacent cameras to be calibrated of a vehicle, wherein the at least two relay camera groups are arranged left and right, each relay camera group comprises at least one relay camera in the vertical direction, a common-view area is arranged between the adjacent cameras, and the angle of the common-view area is 50-150 degrees;
120. synchronizing the camera to be calibrated and the relay camera;
130. moving the calibration board around the vehicle to enable the calibration board to pass through all the cameras in sequence, wherein the front surface of the calibration board faces the camera to be calibrated and the relay camera, and the front surface is provided with a plurality of characteristic patterns in matrix arrangement;
140. starting the camera to be calibrated and the relay camera, and shooting a moving calibration plate;
150. detecting 2D pixel coordinates of each characteristic pattern in each camera;
160. external parameter estimation:
161. carrying out common-view association between the 2D pixel coordinates and the 3D world coordinates of the characteristic patterns in the common-view area to generate a pose graph with the absolute poses of the cameras as nodes and the common-view relations as edges, wherein an absolute pose is represented by {R, t}, R being a rotation matrix and t a translation vector;
162. generating a minimum spanning tree by a breadth-first search algorithm according to the number of common-view characteristic patterns and the maximum reprojection error: the edge with the larger number of common-view characteristic patterns is selected to connect two nodes, and if the numbers of common-view characteristic patterns are the same, the edge with the smaller maximum reprojection error is selected to connect the two nodes;
163. taking the absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes on the paths of the minimum spanning tree, as the set of external parameters to be calibrated, the relative poses being obtained from the formula T^f_{Wn} = T^f_{Wm} · T^f_{mn}, where T^f_{Wn} is the absolute pose of camera n at time f, T^f_{Wm} is the absolute pose of camera m at time f, and T^f_{mn} is the relative pose of camera n with respect to camera m;
164. and obtaining an optimized estimation value of the external parameter to be calibrated by a nonlinear least square method taking the reprojection error as energy.
The relay camera is fixed to the vehicle, or mounted on a tripod in a manner that allows left-right rotational adjustment.
The camera to be calibrated and the relay camera are synchronized by sending a synchronous clock signal.
A manually held calibration plate is moved around the vehicle.
The invention introduces relay cameras and establishes a redundant pose-graph optimization strategy based on them. It overcomes the problem of external parameter error accumulation over the distance between cameras in traditional methods, obtains accurate external parameter estimates that meet the requirements of a SLAM system, requires no calibration structure proportional to the vehicle platform, saves site space, and saves the design and manufacturing cost of markers.
Drawings
The invention is described in detail below with reference to the following figures and detailed description:
FIG. 1 is a schematic diagram of the principles of the present invention;
fig. 2 is a schematic structural diagram of a relay camera group according to the present invention;
fig. 3 is a schematic top view of a relay camera group according to the present invention;
FIG. 4 is a schematic diagram of the pose graph and minimum spanning tree of the present invention;
FIG. 5 is a schematic diagram of a calibration plate feature pattern employed in the present invention.
Detailed Description
A method for calibrating external parameters of a multi-camera system of an unmanned vehicle comprises the following steps:
110. As shown in fig. 1, at least two relay camera groups 30 are provided between adjacent cameras 21 to be calibrated of the vehicle 20. The relay camera groups 30 are arranged side by side (left and right), and each relay camera group 30 includes at least one relay camera 31 in the up-down direction. A common-view area is provided between adjacent cameras; the common-view area is sector-shaped, and its angle is 50° to 150°.
In each relay camera group 30, when there is one relay camera 31, it may be fixed to the vehicle 20 by mechanical fastening or magnetic adsorption; as shown in figs. 2 and 3, when there are two or more relay cameras 31, they may be arranged vertically on a tripod 33 by means of a camera fixing frame 32, and each relay camera can be rotationally adjusted left and right via the fixing frame.
120. By transmitting the synchronization clock signal to each camera, the camera to be calibrated 21 and the relay camera 31 are synchronized, so that photographing can be performed at the same frame rate.
130. The calibration board 40 is moved around the vehicle 20 so that it passes all the cameras in sequence. The front side of the calibration board 40 faces the camera to be calibrated 21 and the relay camera 31 and carries a plurality of feature patterns arranged in a matrix; in this embodiment, AprilTag feature patterns are adopted, see fig. 5.
In the present embodiment, the calibration plate 40 is moved around the vehicle 20 by hand, and the moving path L is shown in fig. 1.
140. The camera to be calibrated 21 and the relay camera 31 are started to shoot the moving calibration board 40.
150. Detecting the 2D pixel coordinates x_k = (u_k, v_k)^T of each feature pattern in each camera.
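For illustration of this detection step only (a sketch, not part of the disclosure), the following Python snippet detects AprilTag corners in one frame using OpenCV's ArUco module, which bundles AprilTag dictionaries (OpenCV ≥ 4.7 is assumed); the tag family, file name and variable names are hypothetical:

```python
import cv2

# Hypothetical single-frame detection for one camera.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_APRILTAG_36h11)
detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())

frame = cv2.imread("camera_n_frame_f.png")        # hypothetical file name
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

corners, ids, _rejected = detector.detectMarkers(gray)

# Collect, per detected tag, its four corner pixel coordinates x_k = (u_k, v_k).
detections = {}
if ids is not None:
    for tag_id, tag_corners in zip(ids.flatten(), corners):
        detections[int(tag_id)] = tag_corners.reshape(-1, 2)   # 4 x 2 array of (u, v)
```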
160. External parameter estimation:
161. Common-view association is carried out between the 2D pixel coordinates and the 3D world coordinates of the feature patterns in the common-view areas to generate a pose graph with the absolute poses of the cameras as nodes and the common-view relations as edges.
A feature pattern located in the common-view area of two adjacent cameras has different 2D pixel coordinates in each of the two cameras; common-view association links the 3D world coordinates of that feature pattern with the two corresponding 2D pixel coordinates.
The common-view relationship means that two cameras share a common-view area; the two corresponding nodes are then connected by an edge.
162. A minimum spanning tree is generated by a breadth-first search algorithm according to the number of common-view feature patterns and the maximum reprojection error: the edge with the larger number of common-view feature patterns is selected to connect two nodes, and if the numbers of common-view feature patterns are the same, the edge with the smaller maximum reprojection error is selected.
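For illustration only, one possible reading of this edge-selection rule is sketched below (the data structures and numeric values are hypothetical, and the reprojection error used as the tie-breaker is explained in the next paragraph): a breadth-first traversal from a root camera keeps, for each newly reached node, the best available edge according to the stated priority.

```python
from collections import deque

def build_spanning_tree(nodes, edges, root):
    """Breadth-first spanning tree over the camera pose graph.

    edges[(m, n)] = (num_common_patterns, max_reproj_error).
    Edge preference: more co-viewed feature patterns first; on a tie,
    the smaller maximum reprojection error wins.
    """
    adj = {v: [] for v in nodes}
    for (m, n), (count, max_err) in edges.items():
        key = (-count, max_err)            # sorts the best edge first
        adj[m].append((key, n, (m, n)))
        adj[n].append((key, m, (m, n)))

    visited = {root}
    tree_edges = []
    queue = deque([root])
    while queue:
        v = queue.popleft()
        for _key, nbr, edge in sorted(adj[v]):
            if nbr not in visited:
                visited.add(nbr)
                tree_edges.append(edge)
                queue.append(nbr)
    return tree_edges                       # len(nodes) - 1 edges

# Hypothetical example with two cameras to be calibrated (1, 2) and two relays (3, 4):
edges = {(1, 3): (40, 0.8), (3, 4): (42, 0.6), (4, 2): (38, 0.7), (1, 4): (20, 0.9)}
print(build_spanning_tree([1, 2, 3, 4], edges, root=1))
```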
In computer vision, the reprojection error is commonly used: it is the error between the pixel coordinates at which a camera actually observes a 3D pattern and the pixel coordinates obtained by projecting that 3D pattern according to the camera's currently estimated pose, e.g., the difference between the 2D pixel coordinates into which the 3D coordinates of a certain feature pattern are transformed by the absolute pose of camera A and the 2D pixel coordinates of that pattern observed by camera A.
Wherein, the absolute pose is represented by { R, t }, R is a rotation matrix, and t is a translation vector.
The initial value of the absolute pose of each camera is obtained by the PnP (Perspective-n-Point) algorithm, with the associated 3D world coordinates and 2D pixel coordinates as its input parameters. The 3D world coordinate system is defined at one corner of the calibration board 40; the movement of the calibration board 40 relative to a camera is equivalent to the movement of the camera relative to the calibration board 40, and the 3D world coordinates of the feature patterns are known because the size of the feature patterns sprayed on the calibration board 40 is known.
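For illustration only (a sketch, not the patented implementation), this PnP initialization can be performed with OpenCV's solvePnP; the intrinsic matrix K and distortion coefficients are assumed to be known from a separate intrinsic calibration, and all variable names are hypothetical:

```python
import cv2
import numpy as np

def initial_absolute_pose(object_points, image_points, K, dist_coeffs):
    """Initial absolute pose {R, t} of one camera from associated 3D/2D points via PnP.

    object_points: (N, 3) feature-pattern corners in the calibration-board (world) frame.
    image_points:  (N, 2) detected pixel coordinates x_k = (u_k, v_k).
    K, dist_coeffs: camera intrinsics, assumed known beforehand.
    """
    ok, rvec, tvec = cv2.solvePnP(
        np.asarray(object_points, dtype=np.float64),
        np.asarray(image_points, dtype=np.float64),
        K, dist_coeffs)
    if not ok:
        raise RuntimeError("PnP initialization failed")
    R, _ = cv2.Rodrigues(rvec)            # board-to-camera rotation
    T_cam_board = np.eye(4)
    T_cam_board[:3, :3] = R
    T_cam_board[:3, 3] = tvec.ravel()
    # Invert to express the camera pose in the board (world) frame,
    # i.e. the absolute pose used as a node of the pose graph.
    return np.linalg.inv(T_cam_board)
```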
163. The absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes on the paths of the minimum spanning tree, is selected as the set of external parameters to be calibrated. The relative poses are obtained from the formula T^f_{Wn} = T^f_{Wm} · T^f_{mn}, where T^f_{Wn} is the absolute pose of camera n at time f, T^f_{Wm} is the absolute pose of camera m at time f, and T^f_{mn} is the relative pose of camera n with respect to camera m.
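For illustration only, the same relation written with 4×4 homogeneous transforms (a sketch assuming poses are stored as 4×4 matrices):

```python
import numpy as np

def relative_pose(T_Wm, T_Wn):
    """Relative pose T^f_mn of camera n with respect to camera m at time f.

    Rearranging T^f_Wn = T^f_Wm · T^f_mn gives T^f_mn = (T^f_Wm)^-1 · T^f_Wn,
    with both absolute poses taken at the same time f.
    """
    return np.linalg.inv(T_Wm) @ T_Wn
```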
164. An optimized estimate of the external parameters to be calibrated is obtained by a nonlinear least-squares method that takes the reprojection error as the energy.
When the relay camera group 30 includes a plurality of relay cameras 31 arranged one above another, a dense pose graph (i.e., an over-constrained least-squares problem) can be formed with the cameras 21 to be calibrated, which helps obtain a more accurate estimate.
As shown in fig. 4, taking two cameras to be calibrated as an example, two relay cameras are arranged between them; in the figure, 1 and 2 denote the two cameras to be calibrated, 3 and 4 denote the two relay cameras, and the light-colored edges form the path of the minimum spanning tree. The external parameters to be calibrated are optimized and estimated through an energy function of the form
E = Σ || x_k − x̂_k^{fn} ||², summed over all observations (time f, camera n, feature pattern k),
wherein x_k − x̂_k^{fn} represents the reprojection error, x_k is the 2D pixel coordinate of the feature pattern observed by the camera, x̂_k^{fn} is the 2D pixel coordinate into which the 3D world coordinate X_k of the feature pattern is transformed by the absolute pose T^f_{Wn} of camera n at time f, n is the camera number, and k is the number of the feature pattern.
The external parameters to be calibrated are the absolute poses of camera No. 1 at each moment, the relative pose of camera No. 3 with respect to camera No. 1, the relative pose of camera No. 4 with respect to camera No. 3, and the relative pose of camera No. 4 with respect to camera No. 2; the position coordinates of the feature points on the image are denoted by x_k.
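For illustration of step 164 only (a sketch, not the patented implementation), the energy can be minimized with scipy.optimize.least_squares under simplifying assumptions: an ideal pinhole projection without lens distortion, poses parameterized as a rotation vector plus a translation, and a hypothetical list of observations that maps each measurement to the absolute pose it constrains:

```python
import numpy as np
import cv2
from scipy.optimize import least_squares

def se3(p):
    """6-vector (rotation vector, translation) -> 4x4 homogeneous transform.

    Typically used inside 'pose_of' below to turn a block of the parameter
    vector into an absolute or relative pose.
    """
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(np.asarray(p[:3], dtype=np.float64))
    T[:3, 3] = p[3:]
    return T

def project(T_Wn, X, K):
    """Ideal pinhole projection of world point X into the camera with absolute pose T_Wn."""
    Xc = np.linalg.inv(T_Wn) @ np.append(X, 1.0)   # world frame -> camera frame
    u = K @ (Xc[:3] / Xc[2])
    return u[:2]

def residuals(params, observations):
    """Stacked reprojection errors x_k - x_hat_k over all observations.

    'observations' is a hypothetical list of tuples (pose_of, X_k, x_k, K), where
    pose_of(params) composes, from the current unknowns, the 4x4 absolute pose
    T^f_Wn of the camera that observed the 3D point X_k at pixel x_k.
    """
    res = []
    for pose_of, X_k, x_k, K in observations:
        res.extend(x_k - project(pose_of(params), X_k, K))
    return np.asarray(res)

# result = least_squares(residuals, x0, args=(observations,), method="lm")
# result.x then holds the refined external parameters (the optimized estimate of step 164).
```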
The invention introduces relay cameras and establishes a redundant pose-graph optimization strategy based on them. It overcomes the problem of external parameter error accumulation over the distance between cameras in traditional methods, obtains accurate external parameter estimates that meet the requirements of a SLAM system, requires no calibration structure proportional to the vehicle platform, saves site space and the design and manufacturing cost of markers, and is suitable for external parameter calibration of a surround-view multi-view camera system on a large vehicle platform.
Those skilled in the art should realize that the above embodiments are illustrative only and do not limit the present invention; changes and modifications to the above-described embodiments are intended to fall within the scope of the appended claims, provided they fall within the true spirit of the present invention.

Claims (4)

1. A method for calibrating external parameters of a multi-camera system of an unmanned vehicle is characterized by comprising the following steps:
110. arranging at least two relay camera groups between adjacent cameras to be calibrated of a vehicle, wherein the at least two relay camera groups are arranged left and right, each relay camera group comprises at least one relay camera in the vertical direction, a common-view area is arranged between the adjacent cameras, and the angle of the common-view area is 50-150 degrees;
120. synchronizing the camera to be calibrated and the relay camera;
130. moving the calibration board around the vehicle to enable the calibration board to pass through all the cameras in sequence, wherein the front surface of the calibration board faces the camera to be calibrated and the relay camera, and the front surface is provided with a plurality of characteristic patterns in matrix arrangement;
140. starting the camera to be calibrated and the relay camera, and shooting a moving calibration plate;
150. detecting 2D pixel coordinates of each characteristic pattern in each camera;
160. external parameter estimation:
161. carrying out common-view association between the 2D pixel coordinates and the 3D world coordinates of the characteristic patterns in the common-view area to generate a pose graph with the absolute poses of the cameras as nodes and the common-view relations as edges, wherein an absolute pose is represented by {R, t}, R being a rotation matrix and t a translation vector;
162. generating a minimum spanning tree by a breadth-first search algorithm according to the number of common-view characteristic patterns and the maximum reprojection error: the edge with the larger number of common-view characteristic patterns is selected to connect two nodes, and if the numbers of common-view characteristic patterns are the same, the edge with the smaller maximum reprojection error is selected to connect the two nodes;
163. taking the absolute pose of any one camera node to be calibrated in the minimum spanning tree, together with the relative poses between all nodes on the paths of the minimum spanning tree, as the set of external parameters to be calibrated, the relative poses being obtained from the formula T^f_{Wn} = T^f_{Wm} · T^f_{mn}, where T^f_{Wn} is the absolute pose of camera n at time f, T^f_{Wm} is the absolute pose of camera m at time f, and T^f_{mn} is the relative pose of camera n with respect to camera m;
164. and obtaining an optimized estimation value of the external parameter to be calibrated by a nonlinear least square method taking the reprojection error as energy.
2. The method for calibrating external parameters of a multi-camera system of an unmanned vehicle as claimed in claim 1, wherein said relay camera is fixed to said vehicle, or mounted on a tripod in a manner allowing left-right rotational adjustment.
3. The extrinsic parameter calibration method of a multi-camera system in an unmanned vehicle according to claim 1 or 2, characterized in that the camera to be calibrated and the relay camera are synchronized by sending a synchronous clock signal.
4. The method of claim 3, wherein a manually held calibration plate is moved around the vehicle.
CN201811256308.7A 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle Active CN109360245B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811256308.7A CN109360245B (en) 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811256308.7A CN109360245B (en) 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle

Publications (2)

Publication Number Publication Date
CN109360245A (en) 2019-02-19
CN109360245B (en) 2021-07-06

Family

ID=65346751

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811256308.7A Active CN109360245B (en) 2018-10-26 2018-10-26 External parameter calibration method for multi-camera system of unmanned vehicle

Country Status (1)

Country Link
CN (1) CN109360245B (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110244282A (en) * 2019-06-10 2019-09-17 于兴虎 A kind of multicamera system and laser radar association system and its combined calibrating method
CN110910453A (en) * 2019-11-28 2020-03-24 魔视智能科技(上海)有限公司 Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
CN111210478A (en) * 2019-12-31 2020-05-29 重庆邮电大学 Method, medium and system for calibrating external parameters of common-view-free multi-camera system
CN111260733A (en) * 2020-01-13 2020-06-09 魔视智能科技(上海)有限公司 External parameter estimation method and system of vehicle-mounted all-around multi-camera system
CN111256689A (en) * 2020-01-15 2020-06-09 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111768364A (en) * 2020-05-15 2020-10-13 成都飞机工业(集团)有限责任公司 Aircraft surface quality detection system calibration method
CN111815716A (en) * 2020-07-13 2020-10-23 北京爱笔科技有限公司 Parameter calibration method and related device
CN112233188A (en) * 2020-10-26 2021-01-15 南昌智能新能源汽车研究院 Laser radar-based roof panoramic camera and calibration method thereof
CN112489141A (en) * 2020-12-21 2021-03-12 像工场(深圳)科技有限公司 Production line calibration method and device for single board single-image relay lens of vehicle-mounted camera
CN112598749A (en) * 2020-12-21 2021-04-02 西北工业大学 Large-scene non-common-view multi-camera calibration method
GB2588489A (en) * 2019-07-12 2021-04-28 Sela Gal System and method for optical axis calibration
WO2021110497A1 (en) * 2019-12-04 2021-06-10 Valeo Schalter Und Sensoren Gmbh Estimating a three-dimensional position of an object
CN113112551A (en) * 2021-04-21 2021-07-13 阿波罗智联(北京)科技有限公司 Camera parameter determination method and device, road side equipment and cloud control platform
CN110163915B (en) * 2019-04-09 2021-07-13 深圳大学 Spatial three-dimensional scanning method and device for multiple RGB-D sensors
CN113256742A (en) * 2021-07-15 2021-08-13 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN113345031A (en) * 2021-06-23 2021-09-03 地平线征程(杭州)人工智能科技有限公司 Multi-camera external parameter calibration device and method, storage medium and electronic device
CN114092564A (en) * 2021-10-29 2022-02-25 上海科技大学 External parameter calibration method, system, terminal and medium of non-overlapping view field multi-camera system
CN114299120A (en) * 2021-12-31 2022-04-08 北京银河方圆科技有限公司 Compensation method, registration method and readable storage medium based on multiple camera modules
TWI814053B (en) * 2021-05-31 2023-09-01 新加坡商聯發科技(新加坡)私人有限公司 A calibration template, calibration system and calibration method thereof
CN117128985A (en) * 2023-04-27 2023-11-28 荣耀终端有限公司 Point cloud map updating method and equipment

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101073119A (en) * 2004-12-11 2007-11-14 三星电子株式会社 Information storage medium including meta data for multi-angle title, and apparatus and method for reproducing the same
CN101226638A (en) * 2007-01-18 2008-07-23 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN101419055A (en) * 2008-10-30 2009-04-29 北京航空航天大学 Space target position and pose measuring device and method based on vision
CN201373736Y (en) * 2008-11-28 2009-12-30 北京航空航天大学 Initiative vision non-contact servo mechanism parameter measuring device
CN201881988U (en) * 2010-09-17 2011-06-29 长安大学 Vehicle lane changing auxiliary device
CN102478759A (en) * 2010-11-29 2012-05-30 中国空间技术研究院 Integrated measurement method for wavefront distortion and optical axis jitter of space camera
WO2015170361A1 (en) * 2014-05-07 2015-11-12 野村ユニソン株式会社 Cable robot calibration method
CN106408650A (en) * 2016-08-26 2017-02-15 中国人民解放军国防科学技术大学 3D reconstruction and measurement method for spatial object via in-orbit hedgehopping imaging
CN206563649U (en) * 2017-03-24 2017-10-17 中国工程物理研究院应用电子学研究所 A kind of pupil on-line measurement device based on imaging conjugate
CN107346425A (en) * 2017-07-04 2017-11-14 四川大学 A kind of three-D grain photographic system, scaling method and imaging method
CN107401976A (en) * 2017-06-14 2017-11-28 昆明理工大学 A kind of large scale vision measurement system and its scaling method based on monocular camera

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101073119A (en) * 2004-12-11 2007-11-14 三星电子株式会社 Information storage medium including meta data for multi-angle title, and apparatus and method for reproducing the same
CN101226638A (en) * 2007-01-18 2008-07-23 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN101226638B (en) * 2007-01-18 2010-05-19 中国科学院自动化研究所 Method and apparatus for standardization of multiple camera system
CN101419055A (en) * 2008-10-30 2009-04-29 北京航空航天大学 Space target position and pose measuring device and method based on vision
CN201373736Y (en) * 2008-11-28 2009-12-30 北京航空航天大学 Initiative vision non-contact servo mechanism parameter measuring device
CN201881988U (en) * 2010-09-17 2011-06-29 长安大学 Vehicle lane changing auxiliary device
CN102478759A (en) * 2010-11-29 2012-05-30 中国空间技术研究院 Integrated measurement method for wavefront distortion and optical axis jitter of space camera
WO2015170361A1 (en) * 2014-05-07 2015-11-12 野村ユニソン株式会社 Cable robot calibration method
CN106408650A (en) * 2016-08-26 2017-02-15 中国人民解放军国防科学技术大学 3D reconstruction and measurement method for spatial object via in-orbit hedgehopping imaging
CN206563649U (en) * 2017-03-24 2017-10-17 中国工程物理研究院应用电子学研究所 A kind of pupil on-line measurement device based on imaging conjugate
CN107401976A (en) * 2017-06-14 2017-11-28 昆明理工大学 A kind of large scale vision measurement system and its scaling method based on monocular camera
CN107346425A (en) * 2017-07-04 2017-11-14 四川大学 A kind of three-D grain photographic system, scaling method and imaging method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YU QI-FENG: "Pose-relay videometric method and ship deformation measurement system with camera-series", 2010 International Symposium on Optomechatronic Technologies *
封倩倩 (FENG Qianqian): "Algorithm Research and Implementation of 3D Reconstruction Technology Based on a Monocular RGB Camera", China Excellent Master's Theses Full-text Database, Information Science and Technology series *

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110163915B (en) * 2019-04-09 2021-07-13 深圳大学 Spatial three-dimensional scanning method and device for multiple RGB-D sensors
CN110244282A (en) * 2019-06-10 2019-09-17 于兴虎 A kind of multicamera system and laser radar association system and its combined calibrating method
GB2588489B (en) * 2019-07-12 2023-02-15 Sela Gal System and method for optical axis calibration
GB2588489A (en) * 2019-07-12 2021-04-28 Sela Gal System and method for optical axis calibration
CN110910453A (en) * 2019-11-28 2020-03-24 魔视智能科技(上海)有限公司 Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
CN110910453B (en) * 2019-11-28 2023-03-24 魔视智能科技(上海)有限公司 Vehicle pose estimation method and system based on non-overlapping view field multi-camera system
WO2021110497A1 (en) * 2019-12-04 2021-06-10 Valeo Schalter Und Sensoren Gmbh Estimating a three-dimensional position of an object
CN111210478A (en) * 2019-12-31 2020-05-29 重庆邮电大学 Method, medium and system for calibrating external parameters of common-view-free multi-camera system
CN111260733B (en) * 2020-01-13 2023-03-24 魔视智能科技(上海)有限公司 External parameter estimation method and system of vehicle-mounted all-around multi-camera system
CN111260733A (en) * 2020-01-13 2020-06-09 魔视智能科技(上海)有限公司 External parameter estimation method and system of vehicle-mounted all-around multi-camera system
CN111256689A (en) * 2020-01-15 2020-06-09 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111256689B (en) * 2020-01-15 2022-01-21 北京智华机器人科技有限公司 Robot positioning method, robot and storage medium
CN111768364B (en) * 2020-05-15 2022-09-20 成都飞机工业(集团)有限责任公司 Aircraft surface quality detection system calibration method
CN111768364A (en) * 2020-05-15 2020-10-13 成都飞机工业(集团)有限责任公司 Aircraft surface quality detection system calibration method
CN111815716A (en) * 2020-07-13 2020-10-23 北京爱笔科技有限公司 Parameter calibration method and related device
CN112233188B (en) * 2020-10-26 2024-03-12 南昌智能新能源汽车研究院 Calibration method of data fusion system of laser radar and panoramic camera
CN112233188A (en) * 2020-10-26 2021-01-15 南昌智能新能源汽车研究院 Laser radar-based roof panoramic camera and calibration method thereof
CN112489141A (en) * 2020-12-21 2021-03-12 像工场(深圳)科技有限公司 Production line calibration method and device for single board single-image relay lens of vehicle-mounted camera
CN112598749A (en) * 2020-12-21 2021-04-02 西北工业大学 Large-scene non-common-view multi-camera calibration method
CN112598749B (en) * 2020-12-21 2024-02-27 西北工业大学 Calibration method for large-scene non-common-view multi-camera
CN112489141B (en) * 2020-12-21 2024-01-30 像工场(深圳)科技有限公司 Production line calibration method and device for single-board single-image strip relay lens of vehicle-mounted camera
CN113112551B (en) * 2021-04-21 2023-12-19 阿波罗智联(北京)科技有限公司 Camera parameter determining method and device, road side equipment and cloud control platform
CN113112551A (en) * 2021-04-21 2021-07-13 阿波罗智联(北京)科技有限公司 Camera parameter determination method and device, road side equipment and cloud control platform
TWI814053B (en) * 2021-05-31 2023-09-01 新加坡商聯發科技(新加坡)私人有限公司 A calibration template, calibration system and calibration method thereof
CN113345031A (en) * 2021-06-23 2021-09-03 地平线征程(杭州)人工智能科技有限公司 Multi-camera external parameter calibration device and method, storage medium and electronic device
CN113256742A (en) * 2021-07-15 2021-08-13 禾多科技(北京)有限公司 Interface display method and device, electronic equipment and computer readable medium
CN114092564A (en) * 2021-10-29 2022-02-25 上海科技大学 External parameter calibration method, system, terminal and medium of non-overlapping view field multi-camera system
CN114092564B (en) * 2021-10-29 2024-04-09 上海科技大学 External parameter calibration method, system, terminal and medium for non-overlapping vision multi-camera system
CN114299120A (en) * 2021-12-31 2022-04-08 北京银河方圆科技有限公司 Compensation method, registration method and readable storage medium based on multiple camera modules
CN114299120B (en) * 2021-12-31 2023-08-04 北京银河方圆科技有限公司 Compensation method, registration method, and readable storage medium
CN117128985A (en) * 2023-04-27 2023-11-28 荣耀终端有限公司 Point cloud map updating method and equipment
CN117128985B (en) * 2023-04-27 2024-05-31 荣耀终端有限公司 Point cloud map updating method and equipment

Also Published As

Publication number Publication date
CN109360245B (en) 2021-07-06

Similar Documents

Publication Publication Date Title
CN109360245B (en) External parameter calibration method for multi-camera system of unmanned vehicle
US10659677B2 (en) Camera parameter set calculation apparatus, camera parameter set calculation method, and recording medium
JP7073315B2 (en) Vehicles, vehicle positioning systems, and vehicle positioning methods
CN111986506B (en) Mechanical parking space parking method based on multi-vision system
KR102550678B1 (en) Non-Rigid Stereo Vision Camera System
US9858639B2 (en) Imaging surface modeling for camera modeling and virtual view synthesis
CN111046743B (en) Barrier information labeling method and device, electronic equipment and storage medium
CN104442567B (en) Object Highlighting And Sensing In Vehicle Image Display Systems
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
CN110345937A (en) Appearance localization method and system are determined in a kind of navigation based on two dimensional code
KR102295809B1 (en) Apparatus for acquisition distance for all directions of vehicle
WO2005088971A1 (en) Image generation device, image generation method, and image generation program
JP2004198212A (en) Apparatus for monitoring vicinity of mobile object
CN102163331A (en) Image-assisting system using calibration method
JP2004198211A (en) Apparatus for monitoring vicinity of mobile object
JP2011215063A (en) Camera attitude parameter estimation device
CN103802725A (en) New method for generating vehicle-mounted driving assisting image
CN101487895A (en) Reverse radar system capable of displaying aerial vehicle image
JP2018139084A (en) Device, moving object device and method
CN109883433B (en) Vehicle positioning method in structured environment based on 360-degree panoramic view
JP6910454B2 (en) Methods and systems for generating composite top-view images of roads
Kinzig et al. Real-time seamless image stitching in autonomous driving
TW202020734A (en) Vehicle, vehicle positioning system, and vehicle positioning method
WO2020118619A1 (en) Method for detecting and modeling of object on surface of road
CN108195359B (en) Method and system for acquiring spatial data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant