
CN104167001A - Large-visual-field camera calibration method based on orthogonal compensation - Google Patents

Large-visual-field camera calibration method based on orthogonal compensation

Info

Publication number
CN104167001A
CN104167001A (application CN201410432768.6A)
Authority
CN
China
Prior art keywords
point
orthogonal
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410432768.6A
Other languages
Chinese (zh)
Other versions
CN104167001B (en)
Inventor
贾振元
刘巍
王亮
刘阳
杨景豪
王灵丽
张弛
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Dalian University of Technology
Original Assignee
Dalian University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Dalian University of Technology
Priority to CN201410432768.6A
Publication of CN104167001A
Application granted
Publication of CN104167001B
Legal status: Active
Anticipated expiration

Landscapes

  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a large-visual-field camera calibration method based on orthogonal compensation. It belongs to the fields of computer vision inspection and image processing, and relates to a method for calibrating the intrinsic and extrinsic parameters of a camera in a large-forging dimension measurement system. In a typical active-vision calibration process, the two translation guide rails of the electrically controlled platform are not perfectly orthogonal, yet calibration is usually performed by simply selecting a high-precision platform and treating its rails as approximately orthogonal. In the proposed method, the intrinsic and extrinsic parameters of the camera are obtained by estimating the principal point, the scale factors, the translation vector, and the rotation matrix. Because the orthogonality of the guide rails is taken into account during calibration, i.e. orthogonal compensation is introduced, the calibration errors caused by rail non-orthogonality are reduced.

Description

Large-visual-field camera calibration method based on orthogonal compensation
Technical field
The invention belongs to the fields of computer vision detection and image processing, and relates to a method for calibrating the intrinsic and extrinsic parameters of a camera in a large-forging dimension measurement system; specifically, it concerns a large-visual-field camera calibration method based on orthogonal compensation.
Background technology
One of the basic tasks of heavy-forging dimension vision measurement is to recover three-dimensional geometric information from two-dimensional images of the forging during the forging process and to obtain real-time dimensional parameters. To determine the surface point of a space object corresponding to an image point, the imaging geometry of the camera must be known; the parameters of this geometric model are called camera parameters. The intrinsic parameters are related to the geometry and optical characteristics of the camera itself, while the extrinsic parameters describe the three-dimensional position and orientation of the camera with respect to a given world coordinate system.
The main factors affecting camera calibration accuracy include the accurate and robust extraction of calibration feature points. The process of determining the intrinsic and extrinsic parameters is called camera calibration, and the accuracy of the calibration method directly affects the accuracy of computer vision measurement. Research on fast, simple, and accurate camera calibration is therefore of clear significance.
According to the calibration approach, existing methods can be divided into traditional calibration, camera self-calibration, and calibration based on active vision. Active-vision calibration offers high stability and robustness, but in practice the two translation guide rails of the electrically controlled platform are not perfectly orthogonal; calibration is usually performed by simply selecting a high-precision platform and treating its rails as approximately orthogonal. To address this problem, the present method incorporates the rail orthogonality into the calibration process, i.e. introduces orthogonal compensation, so as to reduce the calibration errors caused by rail non-orthogonality.
Summary of the invention
In the large-visual-field camera calibration method based on orthogonal compensation of the present invention, the principal point coordinates are first obtained by the variable-focal-length principal-point technique; then four groups of orthogonal motions are used to acquire 12 images and, taking the orthogonality of the four-axis electrically controlled platform into account, the scale factors are solved from the 12 images, completing the intrinsic calibration; finally, the extrinsic parameters are solved using the calibrated intrinsic parameters.
The technical scheme adopted by the invention is as follows:
A large-visual-field camera calibration method based on orthogonal compensation. In a typical active-vision calibration process, the two translation guide rails of the electrically controlled platform are not perfectly orthogonal, yet calibration is usually performed by simply selecting a high-precision platform and treating its rails as approximately orthogonal. A camera calibration method based on orthogonal compensation is therefore proposed, which obtains the intrinsic and extrinsic parameters of the camera by successively solving for the principal point and scale factors, and for the translation vector and rotation matrix. The concrete steps are as follows:
Step 1: obtaining the principal point by the variable-focal-length method
A two-dimensional target is photographed with the camera at different focal lengths. As the focal length changes, the optical center moves along the optical axis, so the two-dimensional target is scaled proportionally within the camera's field of view and target images are obtained at different fields of view. One image point remains unchanged across these images; it is called the zoom center, and the zoom center in the images at different fields of view is the principal point. Therefore, for each feature point of the target, the intersection, in the least-squares sense, of the lines traced by that point under different focal lengths is the principal point.
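The least-squares intersection in Step 1 can be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation: each feature's track across focal lengths is fit with a line, and the common intersection of all fitted lines is solved in the least-squares sense. All function names and coordinates below are synthetic; NumPy is assumed.

```python
import numpy as np

def principal_point_from_tracks(tracks):
    """Least-squares intersection of the lines traced by target features
    as the focal length changes (Step 1).

    tracks: list of (N_i, 2) arrays; each holds one feature's image
    positions at N_i different focal lengths."""
    rows, rhs = [], []
    for pts in tracks:
        pts = np.asarray(pts, dtype=float)
        centroid = pts.mean(axis=0)
        # Total-least-squares line fit: the unit normal (a, b) is the
        # right singular vector of the smallest singular value.
        _, _, Vt = np.linalg.svd(pts - centroid)
        a, b = Vt[-1]
        rows.append([a, b])
        rhs.append(a * centroid[0] + b * centroid[1])   # line: a x + b y = c
    # Intersection of all fitted lines in the least-squares sense.
    pp, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return pp

# Synthetic check: features scale radially about a known principal point,
# so every track is a line through it.
true_pp = np.array([320.0, 240.0])
tracks = [np.array([true_pp + s * np.array(d) for s in (1.0, 1.3, 1.7)])
          for d in [(50.0, 10.0), (-30.0, 40.0), (20.0, -60.0)]]
print(principal_point_from_tracks(tracks))  # -> approx [320. 240.]
```

With noisy real images the overdetermined system simply averages out the per-feature fitting error, which is why many feature points are used.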
Step 2: acquiring target images by the active-vision calibration method with a four-axis electrically controlled platform
The four-axis electrically controlled platform is controlled to drive the camera through four groups of orthogonal motions in a plane. Each group of orthogonal motions contains two translations, and the camera stops at three positions in each group, yielding three images per group. Under the premise that the target remains within the field of view, the camera's pitch or pan angle changes by no less than 5 degrees between groups. The four groups of two pure translational motions each thus yield 12 target images in total.
Step 3: solving for the scale factors
Parallel lines intersect the plane at infinity at the same point at infinity; accordingly, the 12 distortion-corrected images are grouped and vanishing points are estimated. In each group of 3 images, connecting corresponding points yields 2 vanishing points, giving 8 vanishing points for the 4 groups. The 8 vanishing-point coordinates, the principal-point coordinates, and the angle between the guide rails are substituted into the following formula:

\cos\theta = \frac{v_{i1}^T \omega\, v_{i2}}{\sqrt{v_{i1}^T \omega\, v_{i1}}\ \sqrt{v_{i2}^T \omega\, v_{i2}}}, \quad i = 1, 2, 3, 4 \qquad (1)

where θ is the angle between the two translation guide rails of the electrically controlled platform, v_{i1} and v_{i2} are the pixel coordinates (in homogeneous form) of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal motion group, and ω = K^{-T} K^{-1} is the image of the absolute conic, whose concrete form is:

\omega = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)

From the two formulas above and the eight vanishing-point coordinates, a system of four nonlinear equations in two unknowns is obtained, and the scale factors f_x and f_y are solved iteratively by nonlinear least squares.
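A minimal sketch of the Step 3 iteration, under stated assumptions: the vanishing points, rail angles, and principal point below are synthetic, and SciPy's generic nonlinear least-squares solver stands in for whatever solver the authors used. Equations (1) and (2) are implemented directly.

```python
import numpy as np
from scipy.optimize import least_squares

def omega(fx, fy, u0, v0):
    # Image of the absolute conic, omega = K^{-T} K^{-1}, zero-skew K (Eq. 2).
    return np.array([
        [1.0/fx**2, 0.0,       -u0/fx**2],
        [0.0,       1.0/fy**2, -v0/fy**2],
        [-u0/fx**2, -v0/fy**2, u0**2/fx**2 + v0**2/fy**2 + 1.0]])

def solve_scale_factors(vp_pairs, thetas, u0, v0, f0=(1200.0, 1200.0)):
    """Solve f_x, f_y from Eq. (1): one residual per orthogonal-motion group.
    vp_pairs: list of (v1, v2) homogeneous vanishing points; thetas: known
    rail angle per group (radians); (u0, v0): principal point from Step 1."""
    def residuals(f):
        w = omega(f[0], f[1], u0, v0)
        r = []
        for (v1, v2), th in zip(vp_pairs, thetas):
            num = v1 @ w @ v2
            den = np.sqrt(v1 @ w @ v1) * np.sqrt(v2 @ w @ v2)
            r.append(np.cos(th) - num / den)
        return r
    return least_squares(residuals, f0).x

# Synthetic data: the vanishing point of a 3D direction d is K d (homogeneous),
# and then v^T omega v = d^T d, so Eq. (1) holds exactly at the true f_x, f_y.
fx_true, fy_true, u0, v0 = 1500.0, 1400.0, 320.0, 240.0
K = np.array([[fx_true, 0, u0], [0, fy_true, v0], [0, 0, 1.0]])
rng = np.random.default_rng(0)
vp_pairs, thetas = [], []
for _ in range(4):
    d1, d2 = rng.normal(size=3), rng.normal(size=3)
    d1 /= np.linalg.norm(d1); d2 /= np.linalg.norm(d2)
    thetas.append(np.arccos(np.clip(d1 @ d2, -1.0, 1.0)))
    vp_pairs.append((K @ d1, K @ d2))
fx_est, fy_est = solve_scale_factors(vp_pairs, thetas, u0, v0)
print(fx_est, fy_est)  # should approach (1500, 1400)
```

Four angle constraints for two unknowns leave the system overdetermined, which is what makes the orthogonal compensation (a known non-90° θ per group) usable in the same framework.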
Step 4: calibrating the extrinsic parameters
Each group of corresponding image points, in homogeneous coordinates, yields one linear equation:

u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)

where f_{ij} is the element in row i, column j of the fundamental matrix. In vector form:

[u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1]\, f = 0 \qquad (4)

where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector. Given n groups of corresponding points {m_i, m'_i}, the following homogeneous linear system is obtained:

A f = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix} f = 0 \qquad (5)

Given at least 8 groups of corresponding image points, the above homogeneous linear system can be solved linearly up to an unknown nonzero scale factor; imposing the constraint ‖f‖ = 1, and when the corresponding image points are exactly matched, only 8 groups of correspondences are needed to solve for the fundamental matrix F. From F and the intrinsic matrix, an essential matrix is then obtained up to a nonzero scale factor, where s is the undetermined scale factor. E^* = U\,\mathrm{diag}\big((\sigma_1+\sigma_2)/2,\ (\sigma_1+\sigma_2)/2,\ 0\big)\,V^T is used as its approximation, where \sigma_1 \ge \sigma_2 \ge \sigma_3 are its singular values.
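The linear system (5) and the singular-value averaging can be sketched as follows. The correspondences, intrinsics, and pose below are synthetic, and the relation E = K₂ᵀ F K₁ used to pass from the fundamental to the essential matrix is the standard two-view relation, assumed here; only the column ordering of (5) and the diag((σ₁+σ₂)/2, (σ₁+σ₂)/2, 0) approximation come from the text.

```python
import numpy as np

def fundamental_8point(x1, x2):
    """Linear solve of Eq. (5): rows [uu', vu', u', uv', vv', v', u, v, 1].
    x1, x2: (N, 2) pixel correspondences, N >= 8; ||f|| = 1 via SVD."""
    u, v = x1[:, 0], x1[:, 1]
    up, vp = x2[:, 0], x2[:, 1]
    A = np.column_stack([u*up, v*up, up, u*vp, v*vp, vp, u, v, np.ones_like(u)])
    _, _, Vt = np.linalg.svd(A)
    return Vt[-1].reshape(3, 3)   # null vector f, reshaped row-major to F

def essential_approx(F, K1, K2):
    """E = K2^T F K1, then the text's approximation E*: replace the two
    nonzero singular values with their mean (sigma_1 + sigma_2) / 2."""
    E = K2.T @ F @ K1
    U, s, Vt = np.linalg.svd(E)
    m = 0.5 * (s[0] + s[1])
    return U @ np.diag([m, m, 0.0]) @ Vt

# Synthetic two-view data (hypothetical intrinsics and pose).
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.array([[0.9961947, 0, 0.0871557],
              [0, 1, 0],
              [-0.0871557, 0, 0.9961947]])      # 5 deg rotation about y
t = np.array([1.0, 0.1, 0.0])
rng = np.random.default_rng(1)
X = rng.uniform([-1, -1, 4], [1, 1, 8], size=(20, 3))
h1 = (K @ X.T).T
h2 = (K @ (X @ R.T + t).T).T
x1 = h1[:, :2] / h1[:, 2:]
x2 = h2[:, :2] / h2[:, 2:]
F = fundamental_8point(x1, x2)
Estar = essential_approx(F, K, K)
```

With exact correspondences the epipolar constraint x₂ᵀ F x₁ ≈ 0 holds for every pair; with noisy data, coordinate normalization (not shown) is usually added before the SVD.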
From the essential matrix E^*, the extrinsic parameters between the cameras can be estimated; there are the following 4 possible combinations:

[U W V^T \mid u_3];\quad [U W V^T \mid -u_3];\quad [U W^T V^T \mid u_3];\quad [U W^T V^T \mid -u_3] \qquad (6)

where W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} and u_3 is the third column vector of U.
The correct parameter combination is determined by the cheirality constraint, namely that the yaw angle is acute and positive and that the X-direction component of the translation vector is negative. Using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor, two feature points whose separation is accurately known are reconstructed; the ratio of the reconstructed distance to the actual distance is the scale factor s.
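The four combinations in (6) and a disambiguation test can be sketched as follows. The patent selects the combination via yaw-angle and translation-sign conditions specific to its rig; the generic point-in-front (cheirality) test below is a standard stand-in for that step, shown with synthetic data, and all names are illustrative.

```python
import numpy as np

def decompose_essential(E):
    """The four candidate [R | t] combinations of Eq. (6) from E."""
    U, _, Vt = np.linalg.svd(E)
    # Force proper rotations; E is only defined up to sign anyway.
    if np.linalg.det(U) < 0:
        U = -U
    if np.linalg.det(Vt) < 0:
        Vt = -Vt
    W = np.array([[0.0, -1.0, 0.0], [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
    u3 = U[:, 2]                       # third column of U
    return [(U @ W @ Vt, u3), (U @ W @ Vt, -u3),
            (U @ W.T @ Vt, u3), (U @ W.T @ Vt, -u3)]

def select_by_cheirality(candidates, m1, m2):
    """m1, m2: normalized coordinates (x, y, 1) of one correspondence.
    Triangulate linearly and keep the candidate that places the point in
    front of both cameras; generically exactly one of the four does."""
    P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
    for R, t in candidates:
        P2 = np.hstack([R, t.reshape(3, 1)])
        A = np.vstack([m1[0]*P1[2] - P1[0], m1[1]*P1[2] - P1[1],
                       m2[0]*P2[2] - P2[0], m2[1]*P2[2] - P2[1]])
        _, _, Vt = np.linalg.svd(A)
        Xh = Vt[-1]
        X = Xh[:3] / Xh[3]             # dehomogenize (division fixes the sign)
        if X[2] > 0 and (R @ X + t)[2] > 0:
            return R, t
    return None
```

The recovered t has unit norm; its true magnitude is the scale factor s obtained from the known-distance feature pair, as described above.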
Brief description of the drawings
Fig. 1 is a schematic diagram of principal-point estimation by varying the focal length.
Fig. 2 is a schematic diagram of the device structure of the invention.
Fig. 3 is a diagram of the epipolar geometry in binocular stereo vision.
In the figures: 1 platform; 2 electrically controlled platform of the left camera; 3 electrically controlled platform of the right camera; 4 left camera;
5 right camera; 6 two-dimensional checkerboard calibration board.
Embodiment
The specific embodiment of the invention is described in further detail below in conjunction with the accompanying drawings and the technical scheme.
Camera calibration usually adopts the classical pinhole imaging model, whose expression is as follows:

\rho_c \begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = K\,[R \mid t] \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad K = \begin{bmatrix} \alpha_x & 0 & u_0 \\ 0 & \alpha_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}

where (X_w, Y_w, Z_w, 1)^T are the homogeneous coordinates of a spatial point in the world coordinate system; (u, v, 1)^T are the homogeneous pixel coordinates of the corresponding image point in the o_0uv coordinate system; \alpha_x = f/dx is the scale factor along the u axis and \alpha_y = f/dy the scale factor along the v axis of the o_0uv coordinate system; f is the lens focal length; dx and dy are the horizontal and vertical physical sizes of a pixel; (u_0, v_0) are the principal-point coordinates; \rho_c is a proportionality factor; K is the camera intrinsic parameter matrix; and [R \mid t] is the camera extrinsic parameter matrix, in which R is the rotation matrix and t is the translation vector.
The camera intrinsic parameters comprise the principal-point coordinates (u_0, v_0), the scale factors \alpha_x and \alpha_y, the radial distortion coefficients k_1 and k_2, and the tangential distortion coefficients p_1 and p_2. The camera extrinsic parameters describe the orientation of the camera coordinate system with respect to the world coordinate system and comprise the rotation matrix R and the translation vector t.
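The pinhole model above can be exercised with a few illustrative numbers (all values hypothetical, not from the patent):

```python
import numpy as np

def project(K, R, t, Xw):
    """Pinhole model: rho_c [u, v, 1]^T = K [R | t] [Xw, 1]^T."""
    x = K @ (R @ Xw + t)       # rho_c times the homogeneous pixel vector
    return x[:2] / x[2]        # divide out rho_c

# Hypothetical intrinsics: alpha_x = alpha_y = 1000, principal point (320, 240);
# identity extrinsics (camera frame coincides with the world frame).
K = np.array([[1000.0, 0.0, 320.0], [0.0, 1000.0, 240.0], [0.0, 0.0, 1.0]])
uv = project(K, np.eye(3), np.zeros(3), np.array([0.1, -0.05, 2.0]))
print(uv)  # -> [370. 215.]
```

Lens distortion (k_1, k_2, p_1, p_2) acts on the normalized coordinates before K is applied and is omitted here for brevity.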
Step 1: obtaining the principal point by the variable-focal-length method;
A two-dimensional target is photographed with the camera at different focal lengths, as shown in Fig. 1. As the focal length changes, the optical center moves along the optical axis, so the two-dimensional target is scaled proportionally within the camera's field of view and target images are obtained at different fields of view. One image point remains unchanged across these images; it is called the zoom center, and the zoom center in the images at different fields of view is the principal point. Therefore, for each feature point of the target, the intersection, in the least-squares sense, of the lines traced by that point under different focal lengths is the principal point.
Step 2: acquiring target images by the active-vision calibration method with a four-axis electrically controlled platform;
The experimental setup is built as shown in Fig. 2: two cameras are fixed on two four-axis electrically controlled platforms. Each platform is controlled to drive its camera through four groups of orthogonal motions in a plane. Each group of orthogonal motions contains two translations, and the camera stops at three positions in each group, yielding three images per group. Under the premise that the target remains within the field of view, the camera's pitch or pan angle changes by no less than 5 degrees between groups. The four groups of two pure translational motions each thus yield 12 target images in total.
Step 3: solving for the scale factors;
Parallel lines intersect the plane at infinity at the same point at infinity; accordingly, the 12 distortion-corrected images are grouped and vanishing points are estimated. In each group of 3 images, connecting corresponding points yields 2 vanishing points, giving 8 vanishing points for the 4 groups. The 8 vanishing-point coordinates, the principal-point coordinates, and the angle between the guide rails are substituted into the following formula:

\cos\theta = \frac{v_{i1}^T \omega\, v_{i2}}{\sqrt{v_{i1}^T \omega\, v_{i1}}\ \sqrt{v_{i2}^T \omega\, v_{i2}}}, \quad i = 1, 2, 3, 4 \qquad (1)

where θ is the angle between the two translation guide rails of the electrically controlled platform, v_{i1} and v_{i2} are the pixel coordinates (in homogeneous form) of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal motion group, and ω = K^{-T} K^{-1} is the image of the absolute conic, whose concrete form is:

\omega = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)

From the two formulas above and the eight vanishing-point coordinates, a system of four nonlinear equations in two unknowns is obtained, and the scale factors f_x and f_y are solved iteratively by nonlinear least squares.
Step 4: calibrating the extrinsic parameters;
Fig. 3 shows the epipolar geometry in binocular stereo vision.
Each group of corresponding image points, in homogeneous coordinates, yields one linear equation:

u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)

where f_{ij} is the element in row i, column j of the fundamental matrix. In vector form:

[u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1]\, f = 0 \qquad (4)

where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector. Given n groups of corresponding points {m_i, m'_i}, the following homogeneous linear system is obtained:

A f = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix} f = 0 \qquad (5)

Given at least 8 groups of corresponding image points, the above homogeneous linear system can be solved linearly up to an unknown nonzero scale factor; imposing the constraint ‖f‖ = 1, and when the corresponding image points are exactly matched, only 8 groups of correspondences are needed to solve for the fundamental matrix F. From F and the intrinsic matrix, an essential matrix is then obtained up to a nonzero scale factor, where s is the undetermined scale factor. E^* = U\,\mathrm{diag}\big((\sigma_1+\sigma_2)/2,\ (\sigma_1+\sigma_2)/2,\ 0\big)\,V^T is used as its approximation, where \sigma_1 \ge \sigma_2 \ge \sigma_3 are its singular values.
From the essential matrix E^*, the extrinsic parameters between the cameras can be estimated; there are the following 4 possible combinations:

[U W V^T \mid u_3];\quad [U W V^T \mid -u_3];\quad [U W^T V^T \mid u_3];\quad [U W^T V^T \mid -u_3] \qquad (6)

where W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} and u_3 is the third column vector of U.
The correct parameter combination is determined by the cheirality constraint, namely that the yaw angle is acute and positive and that the X-direction component of the translation vector is negative. Using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor, two feature points whose separation is accurately known are reconstructed; the ratio of the reconstructed distance to the actual distance is the scale factor s.

Claims (1)

1. A calibration method based on orthogonal compensation, characterized by comprising the following steps:
Step 1: obtaining the principal point by the variable-focal-length method
A two-dimensional target is photographed with the camera at different focal lengths. As the focal length changes, the optical center moves along the optical axis, so the two-dimensional target is scaled proportionally within the camera's field of view and target images are obtained at different fields of view. One image point remains unchanged across these images; it is called the zoom center, and the zoom center in the images at different fields of view is the principal point. Therefore, for each feature point of the target, the intersection, in the least-squares sense, of the lines traced by that point under different focal lengths is the principal point;
Step 2: acquiring target images by the active-vision calibration method with a four-axis electrically controlled platform
The four-axis electrically controlled platform is controlled to drive the camera through four groups of orthogonal motions in a plane. Each group of orthogonal motions contains two translations, and the camera stops at three positions in each group, yielding three images per group. Under the premise that the target remains within the field of view, the camera's pitch or pan angle changes by no less than 5 degrees between groups. The four groups of two pure translational motions each thus yield 12 target images in total;
Step 3: solving for the scale factors
Parallel lines intersect the plane at infinity at the same point at infinity; accordingly, the 12 distortion-corrected images are grouped and vanishing points are estimated. In each group of 3 images, connecting corresponding points yields 2 vanishing points, giving 8 vanishing points for the 4 groups. The 8 vanishing-point coordinates, the principal-point coordinates, and the angle between the guide rails are substituted into the following formula:

\cos\theta = \frac{v_{i1}^T \omega\, v_{i2}}{\sqrt{v_{i1}^T \omega\, v_{i1}}\ \sqrt{v_{i2}^T \omega\, v_{i2}}}, \quad i = 1, 2, 3, 4 \qquad (1)

where θ is the angle between the two translation guide rails of the electrically controlled platform, v_{i1} and v_{i2} are the pixel coordinates of the vanishing-point pair obtained from the two translational motions of the i-th orthogonal motion group, and ω = K^{-T} K^{-1} is the image of the absolute conic, whose concrete form is:

\omega = \begin{bmatrix} 1/f_x^2 & 0 & -u_0/f_x^2 \\ 0 & 1/f_y^2 & -v_0/f_y^2 \\ -u_0/f_x^2 & -v_0/f_y^2 & u_0^2/f_x^2 + v_0^2/f_y^2 + 1 \end{bmatrix} \qquad (2)

From the two formulas above and the eight vanishing-point coordinates, a system of four nonlinear equations in two unknowns is obtained, and the scale factors f_x and f_y are solved iteratively by nonlinear least squares;
Step 4: calibrating the extrinsic parameters
Each group of corresponding image points, in homogeneous coordinates, yields one linear equation:

u u' f_{11} + v u' f_{12} + u' f_{13} + u v' f_{21} + v v' f_{22} + v' f_{23} + u f_{31} + v f_{32} + f_{33} = 0 \qquad (3)

where f_{ij} is the element in row i, column j of the fundamental matrix. In vector form:

[u u',\ v u',\ u',\ u v',\ v v',\ v',\ u,\ v,\ 1]\, f = 0 \qquad (4)

where f = [f_{11}, f_{12}, f_{13}, f_{21}, f_{22}, f_{23}, f_{31}, f_{32}, f_{33}]^T is a 9-dimensional vector. Given n groups of corresponding points {m_i, m'_i}, the following homogeneous linear system is obtained:

A f = \begin{bmatrix} u_1 u_1' & v_1 u_1' & u_1' & u_1 v_1' & v_1 v_1' & v_1' & u_1 & v_1 & 1 \\ \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\ u_n u_n' & v_n u_n' & u_n' & u_n v_n' & v_n v_n' & v_n' & u_n & v_n & 1 \end{bmatrix} f = 0 \qquad (5)

Given at least 8 groups of corresponding image points, the above homogeneous linear system is solved linearly up to an unknown nonzero scale factor; imposing the constraint ‖f‖ = 1, and when the corresponding image points are exactly matched, only 8 groups of correspondences are used to solve for the fundamental matrix F. From F and the intrinsic matrix, an essential matrix is then obtained up to a nonzero scale factor, where s is the undetermined scale factor. E^* = U\,\mathrm{diag}\big((\sigma_1+\sigma_2)/2,\ (\sigma_1+\sigma_2)/2,\ 0\big)\,V^T is used as its approximation, where \sigma_1 \ge \sigma_2 \ge \sigma_3 are its singular values;
From the essential matrix E^*, the extrinsic parameters between the cameras are estimated; there are the following 4 possible combinations:

[U W V^T \mid u_3];\quad [U W V^T \mid -u_3];\quad [U W^T V^T \mid u_3];\quad [U W^T V^T \mid -u_3] \qquad (6)

where W = \begin{bmatrix} 0 & -1 & 0 \\ 1 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} and u_3 is the third column vector of U.
The correct parameter combination is determined by the cheirality constraint, namely that the yaw angle is acute and positive and that the X-direction component of the translation vector is negative. Using the obtained camera intrinsic parameters and the extrinsic parameters known up to a scale factor, two feature points whose separation is accurately known are reconstructed; the ratio of the reconstructed distance to the actual distance is the scale factor s.
CN201410432768.6A 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation Active CN104167001B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410432768.6A CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation


Publications (2)

Publication Number Publication Date
CN104167001A true CN104167001A (en) 2014-11-26
CN104167001B CN104167001B (en) 2017-02-15

Family

ID=51910792

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410432768.6A Active CN104167001B (en) 2014-08-27 2014-08-27 Large-visual-field camera calibration method based on orthogonal compensation

Country Status (1)

Country Link
CN (1) CN104167001B (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101285676A (en) * 2008-06-10 2008-10-15 北京航空航天大学 Multi-visual sense sensor calibration method based on one-dimensional target
CN101581569A (en) * 2009-06-17 2009-11-18 北京信息科技大学 Calibrating method of structural parameters of binocular visual sensing system
WO2013111229A1 (en) * 2012-01-23 2013-08-01 日本電気株式会社 Camera calibration device, camera calibration method, and camera calibration program
CN103530880A (en) * 2013-10-16 2014-01-22 大连理工大学 Camera calibration method based on projected Gaussian grid pattern


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104751452A (en) * 2015-03-08 2015-07-01 大连理工大学 Monocular camera calibration method based on any known movement
CN105716542A (en) * 2016-04-07 2016-06-29 大连理工大学 Method for three-dimensional data registration based on flexible feature points
CN106157322A (en) * 2016-07-28 2016-11-23 北京航空航天大学 A kind of camera installation site scaling method based on plane mirror
CN106157322B (en) * 2016-07-28 2019-04-26 北京航空航天大学 A kind of camera installation site scaling method based on plane mirror
CN109727291A (en) * 2018-12-28 2019-05-07 北京航空航天大学 A kind of high-precision online calibration method of zoom camera
CN109727291B (en) * 2018-12-28 2020-10-23 北京航空航天大学 High-precision online calibration method for zoom camera
CN114152190A (en) * 2021-11-15 2022-03-08 苏州铸正机器人有限公司 Industrial camera precision and working space test platform
CN114152190B (en) * 2021-11-15 2023-10-24 苏州铸正机器人有限公司 Industrial camera precision and working space test platform
WO2023178658A1 (en) * 2022-03-22 2023-09-28 张慧 Camera self-calibration method under general motion

Also Published As

Publication number Publication date
CN104167001B (en) 2017-02-15


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant