CN113379848A - Target positioning method based on binocular PTZ camera - Google Patents
- Publication number
- CN113379848A (application CN202110642192.6A)
- Authority
- CN
- China
- Prior art keywords
- camera
- rotation
- coordinate system
- matrix
- focal length
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06T 7/85 — Stereo camera calibration (G—Physics; G06—Computing; G06T—Image data processing or generation; G06T 7/00—Image analysis; G06T 7/80—Camera calibration)
- G06T 2207/10012 — Stereo images (G06T 2207/00—Indexing scheme for image analysis; G06T 2207/10—Image acquisition modality)
- G06T 2207/10028 — Range image; depth image; 3D point clouds (G06T 2207/00—Indexing scheme for image analysis; G06T 2207/10—Image acquisition modality)
Landscapes
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
The invention relates to a target positioning method based on a binocular PTZ camera. The method calibrates the intrinsic and extrinsic parameters of the binocular PTZ camera at any angle and focal length and locates the target in three dimensions by least squares, in the following steps. Step 1: solve the intrinsic matrices K_l, K_r of the left and right cameras at any focal length. Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length. Step 3: compute the homography matrices H_1, H_2 from the world coordinate system to the left and right camera pixel coordinate systems and, from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of the same point in the two pixel coordinate systems, solve the target's coordinates in the world coordinate system by least squares. Beneficial effects: by exploiting the stereo vision formed as the binocular PTZ camera zooms and rotates, the invention achieves three-dimensional target positioning over a larger range and at a longer distance, greatly improving the perception capability of the intelligent unmanned platform.
Description
Technical Field
The invention belongs to the field of visual perception and positioning, and in particular relates to a target positioning method based on a binocular PTZ camera.
Background
At present, the sensors of an intelligent platform's perception module mainly comprise lidar, cameras, binocular cameras, millimeter-wave radar, and the like, each with shortcomings in perception capability. Lidar obtains accurate depth information but is expensive, its point clouds become sparse at long range, it lacks texture information, and it is easily disturbed by dust, heavy rain, and similar conditions. Cameras are cheap and provide rich texture information, but a monocular camera, limited by the pinhole imaging principle, cannot directly acquire target depth and is easily affected by illumination and weather. A traditional binocular camera achieves stereo vision, but its short baseline causes large long-range ranging error, its field of view is fixed, and neither focal length nor angle can be adjusted. Millimeter-wave radar has a wide detection angle and strong anti-interference capability, but low resolution and accuracy; it measures obstacle velocity accurately, yet its data are noisy and its false-detection rate is high.
The perception module is one of the four key technologies of unmanned robots, and of unmanned vehicles in particular, providing information support for the localization and planning modules. Its main tasks include recognizing traffic lights and traffic signs in urban environments, and detecting, recognizing, tracking, and positioning obstacles in urban and field environments, as well as predicting obstacle trajectories. Existing unmanned-vehicle perception systems perceive distant targets poorly: at long range the lidar point cloud is sparse and carries little texture information; conventional cameras and binocular cameras can neither zoom nor change pose, so a distant target occupies few pixels in the image with poor texture, and binocular ranging accuracy is low. In field battlefield environments in particular, targets are far away and existing sensors struggle to position them accurately. Some autonomous vehicles address the monocular camera's narrow field of view and fixed focal length by mounting multiple cameras of different focal lengths in multiple directions on the vehicle body — for example, eight cameras on a vehicle with automatic driving functions — but this wastes resources and places a heavy computational load on the computing unit.
PTZ is an abbreviation of Pan/Tilt/Zoom: the camera can move in all directions (left/right and up/down) and its lens zoom is controllable. A binocular PTZ camera mimics the visual system of a chameleon. Compared with traditional cameras and static binocular cameras, its advantages are that each camera can move independently to acquire wide-scene information, the two can cooperate to form a stereo vision system and acquire depth information, local high-resolution information can be obtained, and targets can be tracked and positioned over a large range and at long distance. Binocular PTZ camera systems can be applied to video surveillance, unmanned driving, intelligent transportation systems, military reconnaissance, and other fields. In recent years the binocular PTZ vision system has been a research hotspot, but because the angle and focal length values of the two cameras change continuously during zooming and rotation, real-time calibration of a binocular PTZ camera is difficult. Domestic research on static binocular cameras and single PTZ cameras is quite mature, but calibration research on binocular PTZ cameras remains scarce.
Disclosure of Invention
The invention aims to overcome these shortcomings and provide a target positioning method based on a binocular PTZ camera, which uses the stereo vision formed as the binocular PTZ camera zooms and rotates to achieve three-dimensional target positioning over a larger range and at a longer distance.
In order to achieve this purpose, the invention adopts the following technical scheme. A target positioning method based on a binocular PTZ camera comprises calibrating the intrinsic and extrinsic parameters of the binocular PTZ camera at any angle and focal length, and locating the target in three dimensions by least squares, with the following specific steps:
Step 1: solve the intrinsic matrices K_l, K_r of the left and right cameras at any focal length;
Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length;
Step 3: compute the homography matrices H_1, H_2 that map a target point from the world coordinate system to the left and right camera pixel coordinate systems; from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of the same point in the two pixel coordinate systems, solve the target's coordinates in the world coordinate system by least squares, as follows:
Step 3-1: find the homography matrices H_1, H_2 that map the target point P(X_w, Y_w, Z_w) from the world coordinate system to the corresponding points p_l(u_l, v_l), p_r(u_r, v_r) in the left and right camera pixel coordinate systems;
Step 3-2: solve for the coordinates of P(X_w, Y_w, Z_w) by least squares.
Solving the intrinsic matrices K_l, K_r of the left and right cameras at any focal length means fitting a piecewise function of each camera's focal length with respect to the zoom value, as follows:
Step 1-1: calibrate the focal lengths f_x, f_y of the left and right cameras at several specific zoom values, and fit the focal length as a piecewise function of zoom by linear interpolation;
Step 1-2: substitute the camera's current zoom value into the piecewise function of step 1-1 to obtain the current focal length values of the left and right cameras; the intrinsic matrices of the two cameras are then respectively:
K = [f_x 0 u_0; 0 f_y v_0; 0 0 1] (1)
In formula (1), f_x and f_y are in pixels; u_0, v_0 are the pixel coordinates of the image center point, each equal to half the corresponding image resolution.
Solving R_l, R_r and T_l, T_r of the left and right cameras at any angle and any focal length in step 2 proceeds as follows: calibrate the extrinsic matrices of the left and right cameras at the initial position with Zhang's calibration method, take the camera coordinate system of the left camera's initial position as the world coordinate system O_w-x_w y_w z_w, and compute R_l, R_r, T_l, T_r in these steps:
Step 2-1: solve the direction vectors of the left and right cameras' vertical rotation axes for horizontal (pan) rotation and horizontal rotation axes for pitch (tilt) rotation;
Step 2-2: the vertical-axis and horizontal-axis direction vectors intersect the respective optical-center rotation planes γ at intersection points; compute the vectors from the origin O of the initial-position camera coordinate system to each intersection point;
Step 2-3: compute the rotation matrix and translation vector of each camera coordinate system relative to its original position after the left and right cameras rotate;
Step 2-4: compute the translation vectors of the left and right cameras' optical centers at different zoom values relative to the optical center at zoom = 1;
Step 2-5: with the left (right) camera panned by α, tilted by β, and at zoom values z_1, z_2, compute the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system O-x_w y_w z_w.
The vertical (pan) axis direction vectors in step 2-1 are calibrated as follows. First, use Zhang's calibration method at the initial position to calibrate the rotation-translation matrix R_1, T_1 between the calibration-board coordinate system B-x_B y_B z_B and the camera coordinate system O-xyz. Keeping the checkerboard calibration board fixed, rotate the camera horizontally by an angle α to position 2 and, in the same way, obtain the rotation-translation matrix R_2, T_2 between the board coordinate system B-x_B y_B z_B and the camera coordinate system at position 2. The rotation matrix from the initial position to position 2 is R_{v-α} = R_1 R_2^T.
According to the Rodrigues formula, the antisymmetric matrix v^ of the axis direction vector v is:
v^ = (R_{v-α} − R_{v-α}^T) / (2 sin α) (2)
where the rotation angle satisfies cos α = (tr(R_{v-α}) − 1) / 2 (3).
When the camera rotates horizontally by the angle α to position 2, the translation vector produced by the optical center is:
T_{v-α} = T_1 − R_{v-α} T_2 (4).
The vector OA in step 2-2 (for the left camera's horizontal rotation) is computed as follows: rotating OA about the rotation axis by π + α yields the vector AO′ from the intersection point A to the rotated optical center, so the translation vector produced by the rotation is T_{v-α} = OA + AO′; combining this with formula (4) solves for OA.
In step 2-3, the rotation matrices after the left and right cameras rotate, for a pan angle α and a tilt angle β, are obtained with the Rodrigues formula:
R_{v-α} = cos α I + (1 − cos α) v v^T + sin α v^, and similarly R_{h-β} from the tilt rotation vector h,
where v and h are the pan and tilt rotation vectors and I is the third-order identity matrix;
the rotation matrix of each camera coordinate system after rotation, relative to its original position, is the product of its tilt and pan rotation matrices.
Also in step 2-3, after the left and right cameras rotate, the translation of each camera coordinate system relative to its initial position is solved. For the pan angle α and the tilt angle β, the translations T_{v-α} and T_{h-β} follow from the axis-intersection vectors via the relation T = OA + AO′;
the combined translation produced by the pan and tilt rotations of the left and right cameras is then obtained by composing the two rigid motions, the total translation being the second rotation applied to the first translation plus the second translation.
The optical-center translation vector produced by zooming in step 2-4 is the translation T_z generated by zooming, solved from the current focal length value f:
T_z = d_zoom · (f − f_min) / (f_max − f_min), directed along the optical axis,
where f_min, f_max are the camera's focal length range and f is the current focal length value obtained from the piecewise function of step 1, in pixels; d_zoom, the zoom travel range in mm, is an intrinsic parameter of the PTZ camera.
The rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system at any angle and zoom value in step 2-5 are obtained by composing the initial extrinsics with the rotation matrices and translation vectors of step 2-3 and the zoom translation of step 2-4.
Beneficial effects: compared with the prior art, the invention exploits the stereo vision formed as the binocular PTZ camera zooms and rotates, achieving three-dimensional target positioning over a larger range and at a longer distance. The PTZ camera is inexpensive, can rotate and zoom, and obtains texture information of a target over a wider range and at a longer distance (rich texture information that lidar and millimeter-wave radar cannot provide), greatly improving the perception capability of the intelligent unmanned platform.
Drawings
FIG. 1 is a zoom schematic of a PTZ camera;
FIG. 2 is a schematic view of binocular PTZ camera coordinates;
FIG. 3a is a top calibration schematic view of a horizontally rotating vertical shaft;
FIG. 3b is a side view calibration schematic of a horizontally rotating vertical shaft;
FIG. 4 is a schematic view of the horizontal pivot axis of the pitch rotation of the binocular PTZ camera;
FIG. 5 is a schematic diagram of camera zoom external reference calibration.
Detailed Description
So that the manner in which the above recited objects, features and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, may be had by reference to the embodiments thereof which are illustrated in the appended drawings. In addition, the embodiments and features of the embodiments of the present application may be combined with each other without conflict. In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention, and the described embodiments are merely a subset of the embodiments of the present invention, rather than a complete embodiment. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, are within the scope of the present invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein in the description of the invention is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
In various embodiments of the present invention, for convenience in description and not in limitation, the term "coupled" as used in the specification and claims of the present application is not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships are changed accordingly.
Referring to the drawings in detail, the invention provides a target positioning method based on a binocular PTZ camera, which calibrates the intrinsic and extrinsic parameters of the binocular PTZ camera at any angle and any focal length and locates the target in three dimensions by least squares, with the following specific steps:
Step 1: solve the intrinsic matrices K_l, K_r of the left and right cameras at any focal length;
Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length;
Step 3: compute the homography matrices H_1, H_2 that map a target point from the world coordinate system to the left and right camera pixel coordinate systems; from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of the same point in the two pixel coordinate systems, solve the target's coordinates in the world coordinate system by least squares, as follows:
Step 3-1: find the homography matrices H_1, H_2 that map the target point P(X_w, Y_w, Z_w) from the world coordinate system to the corresponding points p_l(u_l, v_l), p_r(u_r, v_r) in the left and right camera pixel coordinate systems;
Step 3-2: solve for the coordinates of P(X_w, Y_w, Z_w) by least squares.
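The least-squares solve of step 3-2 can be sketched in code. The sketch below assumes H_1 and H_2 are the full 3×4 world-to-pixel projection matrices described above; stacking the two image observations gives four linear equations in (X_w, Y_w, Z_w), solved with `numpy.linalg.lstsq`. Function and variable names are illustrative, not from the patent.

```python
import numpy as np

def triangulate(H1, H2, pl, pr):
    """Least-squares triangulation from two views (sketch of step 3-2).

    H1, H2 : 3x4 matrices mapping world homogeneous coordinates to the
             left/right pixel coordinate systems.
    pl, pr : (u, v) pixel coordinates of the same point in each image.
    Returns the estimated world point (Xw, Yw, Zw).
    """
    rows, rhs = [], []
    for H, (u, v) in ((H1, pl), (H2, pr)):
        # From s*[u, v, 1]^T = H [X, Y, Z, 1]^T, eliminate the scale s:
        # (u*H[2,:3] - H[0,:3]) . P = H[0,3] - u*H[2,3], and likewise for v.
        rows.append(u * H[2, :3] - H[0, :3]); rhs.append(H[0, 3] - u * H[2, 3])
        rows.append(v * H[2, :3] - H[1, :3]); rhs.append(H[1, 3] - v * H[2, 3])
    A, b = np.array(rows), np.array(rhs)
    P, *_ = np.linalg.lstsq(A, b, rcond=None)
    return P
```

With more than two PTZ views the same stacking extends naturally, which is one reason the least-squares formulation is convenient here.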
Solving the intrinsic matrices K_l, K_r of the left and right cameras at any focal length means fitting a piecewise function of each camera's focal length with respect to the zoom value, as follows:
Step 1-1: calibrate the focal lengths f_x, f_y of the left and right cameras at several specific zoom values, and fit the focal length as a piecewise function of zoom by linear interpolation;
Step 1-2: substitute the camera's current zoom value into the piecewise function of step 1-1 to obtain the current focal length values of the left and right cameras; the intrinsic matrices of the two cameras are then respectively:
K = [f_x 0 u_0; 0 f_y v_0; 0 0 1] (1)
In formula (1), f_x and f_y are in pixels; u_0, v_0 are the pixel coordinates of the image center point, each equal to half the corresponding image resolution.
Solving R_l, R_r and T_l, T_r of the left and right cameras at any angle and any focal length in step 2 proceeds as follows: calibrate the extrinsic matrices of the left and right cameras at the initial position with Zhang's calibration method, take the camera coordinate system of the left camera's initial position as the world coordinate system O_w-x_w y_w z_w, and compute R_l, R_r, T_l, T_r in these steps:
Step 2-1: solve the direction vectors of the left and right cameras' vertical rotation axes for horizontal (pan) rotation and horizontal rotation axes for pitch (tilt) rotation;
Step 2-2: the vertical-axis and horizontal-axis direction vectors intersect the respective optical-center rotation planes γ at intersection points; compute the vectors from the origin O of the initial-position camera coordinate system to each intersection point;
Step 2-3: compute the rotation matrix and translation vector of each camera coordinate system relative to its original position after the left and right cameras rotate;
Step 2-4: compute the translation vectors of the left and right cameras' optical centers at different zoom values relative to the optical center at zoom = 1;
Step 2-5: with the left (right) camera panned by α, tilted by β, and at zoom values z_1, z_2, compute the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system O-x_w y_w z_w.
The vertical (pan) axis direction vectors in step 2-1 are calibrated as follows. First, use Zhang's calibration method at the initial position to calibrate the rotation-translation matrix R_1, T_1 between the calibration-board coordinate system B-x_B y_B z_B and the camera coordinate system O-xyz. Keeping the checkerboard calibration board fixed, rotate the camera horizontally by an angle α to position 2 and, in the same way, obtain the rotation-translation matrix R_2, T_2 between the board coordinate system B-x_B y_B z_B and the camera coordinate system at position 2. The rotation matrix from the initial position to position 2 is R_{v-α} = R_1 R_2^T.
According to the Rodrigues formula, the antisymmetric matrix v^ of the axis direction vector v is:
v^ = (R_{v-α} − R_{v-α}^T) / (2 sin α) (2)
where the rotation angle satisfies cos α = (tr(R_{v-α}) − 1) / 2 (3).
When the camera rotates horizontally by the angle α to position 2, the translation vector produced by the optical center is:
T_{v-α} = T_1 − R_{v-α} T_2 (4).
The vector OA in step 2-2 (for the left camera's horizontal rotation) is computed as follows: rotating OA about the rotation axis by π + α yields the vector AO′ from the intersection point A to the rotated optical center, so the translation vector produced by the rotation is T_{v-α} = OA + AO′; combining this with formula (4) solves for OA.
In step 2-3, the rotation matrices after the left and right cameras rotate, for a pan angle α and a tilt angle β, are obtained with the Rodrigues formula:
R_{v-α} = cos α I + (1 − cos α) v v^T + sin α v^, and similarly R_{h-β} from the tilt rotation vector h,
where v and h are the pan and tilt rotation vectors and I is the third-order identity matrix;
the rotation matrix of each camera coordinate system after rotation, relative to its original position, is the product of its tilt and pan rotation matrices.
Also in step 2-3, after the left and right cameras rotate, the translation of each camera coordinate system relative to its initial position is solved. For the pan angle α and the tilt angle β, the translations T_{v-α} and T_{h-β} follow from the axis-intersection vectors via the relation T = OA + AO′;
the combined translation produced by the pan and tilt rotations of the left and right cameras is then obtained by composing the two rigid motions, the total translation being the second rotation applied to the first translation plus the second translation.
the optical center translation vector generated by zooming in the step 2-4Solving the translation vector T generated by zooming according to the current focal length value fz,
In the formula fmax、fminIs the focal length range of the camera, and f is the current focal length value, obtainable from claim 2, in dpi,the zoom range, in mm, is an intrinsic parameter of the PTZ camera.
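As a small illustration of step 2-4, the zoom translation can be modeled as linear in the focal length, as the text states. The constants below (focal-length range and travel distance) are made-up placeholders rather than the patent's values, and the optical axis is assumed to be the camera z-axis.

```python
import numpy as np

# Illustrative placeholders, not calibration results from the patent:
# focal-length range in pixels, and d_zoom, the optical-centre travel in mm.
F_MIN, F_MAX, D_ZOOM = 800.0, 16000.0, 30.0

def zoom_translation(f):
    """Optical-centre translation produced by zooming (sketch of step 2-4).

    Assumes T_z is linear in the current focal length f and directed
    along the optical axis (taken here as the camera z-axis)."""
    t = D_ZOOM * (f - F_MIN) / (F_MAX - F_MIN)
    return np.array([0.0, 0.0, t])
```

At f = F_MIN the translation is zero (the zoom-1 reference position), and it grows linearly to D_ZOOM at f = F_MAX.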
The rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system at any angle and zoom value in step 2-5 are obtained by composing the initial extrinsics with the rotation matrices and translation vectors of step 2-3 and the zoom translation of step 2-4.
In these formulas, the superscripts l and r denote parameters of the left and right cameras respectively; the superscript l-r denotes parameters between the left and right cameras; and the subscript init denotes parameters at the initial position.
When the method is used for calibration, the intrinsic and extrinsic parameters of the cameras at the initial position are calibrated with Zhang's planar calibration method. For the intrinsics, focal length values at several specific zoom values are calibrated and fitted into a piecewise function of zoom, so that the camera intrinsics can be estimated in real time from the camera's zoom value.
Examples
The method splits the extrinsic-parameter update into two parts. First, the extrinsic change produced by camera rotation comprises the rotation matrix R_rot and translation vector T_rot of each rotated camera relative to its original position. Following Zhao Zhiting, Wang Jinjiang, Wang Chenguang. Calibration of a multi-degree-of-freedom binocular vision system based on rotation-axis parameters [J]. Optical Technique, 2018, 44(02): 140-146, the pan and tilt rotation vectors of the left and right PTZ cameras are calibrated, and the pan and tilt rotation matrices are then computed with the Rodrigues formula. For the optical-center offset caused by rotation, the intersection of the rotation axis with the optical center's rotation plane is denoted A; the vector OA from the optical center O to A is obtained by calibration, and the translation vector caused by rotation is then computed from it. Second, the extrinsic change produced by zooming of the left and right PTZ cameras: zooming in effect moves the camera's optical center, so it changes no rotation matrix but only the translation vector T_z; the translation of each monocular PTZ camera's optical center relative to its initial position after rotating and zooming is then computed using the rotation matrix. Finally, combining the extrinsic changes from rotation and zoom, with the left and right cameras panned by α_1, α_2, tilted by β_1, β_2, and at zoom values z_1, z_2, the extrinsic matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system are solved.
The specific method comprises the following steps:
1. PTZ camera intrinsic calibration
Referring to FIG. 1: by the pinhole imaging model, zooming is equivalent to moving the optical center back and forth along the optical axis; as the zoom value increases, the optical center effectively moves forward. By the camera's characteristics, the focal length is approximately linear in the zoom value. Denote the zoom value fed back in real time as m and, over the camera's zoom range, choose n different zoom values zoom_min (zoom = 1), zoom_2, …, zoom_{n-1}, zoom_max for calibration. Only the change of the focal length during zooming is considered; changes of the principal point and of distortion with focal length are ignored. Using Zhang's calibration method, calibrate f_x,i and f_y,i at each zoom value m_i, then fit the focal length f between the individual zoom values as a piecewise function of m by linear interpolation:
f(m) = f_{i-1} + (m − m_{i-1}) (f_i − f_{i-1}) / (m_i − m_{i-1}), for m_{i-1} < m < m_i,
from which the camera intrinsic matrix K is obtained as in formula (1).
A piecewise function of the right camera's focal length with respect to m is obtained in the same way. The left and right cameras can thus self-calibrate their intrinsics from the zoom values they feed back.
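The piecewise-linear fit of focal length against the fed-back zoom value, and the resulting intrinsic matrix of formula (1), can be sketched as follows. The sample values and the 1920×1080 principal point are illustrative placeholders, not calibration results from the patent.

```python
import numpy as np

# Calibrated (zoom, fx, fy) samples for one camera. These numbers are
# made up for illustration; real values come from Zhang's method at
# each chosen zoom stop.
ZOOM_SAMPLES = [1.0, 5.0, 10.0, 20.0]
FX_SAMPLES   = [800.0, 4000.0, 8200.0, 16000.0]
FY_SAMPLES   = [810.0, 4050.0, 8300.0, 16200.0]

def intrinsics(m, u0=960.0, v0=540.0):
    """Intrinsic matrix K at zoom value m (sketch of steps 1-1 / 1-2).

    Piecewise-linear interpolation of the calibrated focal lengths,
    with the principal point fixed at half the image resolution."""
    fx = np.interp(m, ZOOM_SAMPLES, FX_SAMPLES)
    fy = np.interp(m, ZOOM_SAMPLES, FY_SAMPLES)
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])
```

Calling `intrinsics(m)` with the zoom value fed back by the camera yields the real-time K described in the text.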
2. Binocular PTZ camera extrinsic calibration
1) Extrinsic calibration for camera rotation
Most prior work idealizes the optical center as coinciding with the rotation axis when calibrating a binocular PTZ camera. In a real PTZ camera the optical center deviates from the rotation axis when the camera rotates: it is not fixed but rotates around the axis. A schematic of the binocular PTZ camera is shown in FIG. 2, where O_l, O_r are the optical centers of the left and right cameras, O_l-xyz, O_r-xyz are their camera coordinate systems, and O-x_w y_w z_w is the designated world coordinate system, coinciding with the camera coordinate system of the left camera's initial position. Each camera has a rotation vector for horizontal (pan) rotation and one for pitch (tilt) rotation, and its optical center rotates about these rotation vectors.
2.1 Rotation matrix solution
Referring to FIGS. 3a and 3b: following Zhao Zhiting, Wang Jinjiang, Wang Chenguang. Calibration of a multi-degree-of-freedom binocular vision system based on rotation-axis parameters [J]. Optical Technique, 2018, 44(02): 140-146, the camera's optical center rotates around a rotation axis; the pose between the camera and the calibration template is computed via the homography principle, and the axis direction is then solved. With the left camera at zoom_min (zoom = 1) and rotating horizontally, the rotation matrix of the position-1 camera coordinate system relative to the calibration-board corner coordinate system B-x_B y_B z_B is R_1 and the translation matrix is T_1. Keeping the checkerboard template fixed, rotate the camera horizontally by an angle α to obtain the rotation matrix R_2 and translation matrix T_2 at position 2.
The rotation matrix R_{v-α} for the rotation of the camera coordinate system from position 1 to position 2 is R_{v-α} = R_1 R_2^T.
The rotation matrix, rotation vector, and rotation angle can be converted into one another by the Rodrigues formula, related as:
R_{v-α} = cos α I + (1 − cos α) v v^T + sin α v^
where tr(R_{v-α}) is the trace of the matrix R_{v-α}, so the rotation angle satisfies cos α = (tr(R_{v-α}) − 1)/2; the antisymmetric matrix of the rotation vector then follows from the formula above as v^ = (R_{v-α} − R_{v-α}^T)/(2 sin α), from which the vector is read off and normalized to give the rotation vector from position 1 to position 2.
The average of multiple measurements is taken to suppress error, so the left camera's vertical (pan) rotation vector is taken as the normalized mean of the vectors calibrated at several rotation angles.
The left camera's horizontal (tilt) rotation vector, and the right camera's vertical and horizontal rotation vectors, are obtained in the same way.
The camera's rotations follow a right-handed coordinate system. By the Rodrigues formula, the rotation matrix for the left camera rotating by α about its vertical axis vector is obtained; the tilt rotation matrix, and the right camera's pan and tilt rotation matrices, are solved in the same way.
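The forward direction, building the pan or tilt rotation matrix from a calibrated axis vector and angle, is the Rodrigues formula quoted above; a minimal sketch:

```python
import numpy as np

def rodrigues(v, a):
    """Rotation matrix from axis v and angle a (right-handed):
    R = cos(a) I + (1 - cos(a)) v v^T + sin(a) v^."""
    v = np.asarray(v, dtype=float)
    v = v / np.linalg.norm(v)
    V = np.array([[0.0, -v[2], v[1]],      # antisymmetric matrix v^
                  [v[2], 0.0, -v[0]],
                  [-v[1], v[0], 0.0]])
    return (np.cos(a) * np.eye(3)
            + (1.0 - np.cos(a)) * np.outer(v, v)
            + np.sin(a) * V)
```

With the calibrated pan axis and the angle α fed back by the PTZ head, this yields the R_{v-α} used throughout the derivation.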
2.2 Translation vector solution
Take the left camera's horizontal rotation as an example. In FIG. 3, o_init is the origin of the camera coordinate system at the initial position; the rotation plane perpendicular to the rotation axis passes through the optical center, and the axis meets this plane at A. In the initial camera coordinate system o_init-xyz, the translation vector from the optical center o_init to the point A is OA. Rotating OA about the axis by the angle π + α, the rotation direction following the right-handed system, yields the vector AO′, so the translation produced by the rotation is T_{v-α} = OA + AO′.
With particular reference to FIG. 3a and FIG. 3b: at position 1 (the initial position), the translation vector of the origin of the calibration-plate coordinate system B-x_B y_B z_B relative to the coordinate system O_1-xyz is T_1; at position 2, the translation vector of the calibration-plate origin in the O_2-xyz coordinate system is T_2. Both T_1 and T_2 are obtained by calibration, from which T_{v-α} is also obtained:
T_{v-α} = T_1 − R_{v-α} T_2
Calibration is repeated at different rotation angles and the results of the multiple groups averaged, giving the translation vector from the intersection of the left camera's vertical rotation axis with its rotation plane to the optical center; the corresponding vectors for the left camera's horizontal rotation axis and the right camera's vertical and horizontal rotation axes are obtained by the same method.
Then, according to the formulas above, the translation vectors generated by the horizontal and pitch rotations of the left and right cameras can be obtained.
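Assuming the board-to-camera convention x_cam = R x_board + T, the position-1 to position-2 motion follows as R = R_1 R_2^T (consistent with T_{v-α} = T_1 − R_{v-α} T_2 above); a sketch with illustrative naming:

```python
import numpy as np

def motion_between_positions(R1, T1, R2, T2):
    """Relative motion of the camera between two PTZ positions, given the
    board-to-camera extrinsics (R_i, T_i) calibrated at each position."""
    R12 = R1 @ R2.T              # rotation between the two camera poses
    T12 = T1 - R12 @ T2          # translation generated by the rotation
    return R12, T12
```

For a pure rotation about an axis through a point t_A (expressed in the initial camera frame), this translation also equals (I − R12) t_A, which is how the intersection-point vectors can be recovered from the calibrated motion.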
2.3 calculation of external parameters after rotation
Define the coordinate system of the left camera at its initial position as the world coordinate system O-x_w y_w z_w and keep it unchanged. When the left (right) camera rotates horizontally by α and in pitch by β, the left camera coordinate system O-x_l y_l z_l has, relative to O-x_w y_w z_w, the rotation-translation matrix:
and the right camera coordinate system O-x_r y_r z_r has, relative to O-x_w y_w z_w, the rotation-translation matrix:
2) External parameter calibration after camera zooming
Referring to fig. 5 in detail: the zooming of a PTZ camera moves the optical center back and forth along the optical axis. Besides changing the camera's internal parameters, zooming also changes the translation vector between the two PTZ cameras. The internal calibration shows that, as the camera goes from zoom_min to zoom_max, the focal length is approximately linear in the zoom value, and the optical-center translation T_z is in turn linear in the focal length. The translation vector T_z generated by zooming is therefore solved from the current focal length value f.
In the formula, f_max and f_min are the limits of the camera's focal length range and f is the current focal length value, obtained in pixels from the internal calibration; the lens travel range, in mm, is an intrinsic parameter of the PTZ camera.
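A sketch of the two zoom-dependent quantities described above; all numeric values below are hypothetical placeholders for the calibrated numbers (the 44 mm travel stands in for an 11-55 mm lens):

```python
import numpy as np

# hypothetical calibrated focal lengths (pixels) at the integer zoom stops
zoom_stops = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
f_stops = np.array([1200.0, 2400.0, 3600.0, 4800.0, 6000.0])

def focal_from_zoom(z):
    """Piecewise-linear focal length at an arbitrary zoom value."""
    return float(np.interp(z, zoom_stops, f_stops))

def zoom_translation(f, f_min=1200.0, f_max=6000.0, travel_mm=44.0):
    """Optical-centre displacement T_z along the optical axis, taken to be
    linear in the focal length f over the lens travel range."""
    return travel_mm * (f - f_min) / (f_max - f_min)
```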
According to the rotation formulas above, the translation vector of the optical center when the PTZ camera rotates horizontally by α, rotates in pitch by β, and has zoom value z is:
3) External parameter calculation for the rotated and zoomed camera
Combining the external-parameter changes caused by rotation and zooming: when the left and right cameras rotate horizontally by α, rotate in pitch by β, and have zoom values z_1 and z_2 respectively, the left camera coordinate system O-x_l y_l z_l has, relative to the world coordinate system O-x_w y_w z_w, the rotation-translation matrix:
and the right camera coordinate system O-x_r y_r z_r has, relative to the world coordinate system O-x_w y_w z_w, the rotation-translation matrix:
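The rotated-and-zoomed extrinsics are compositions of rigid transforms; a small homogeneous-transform helper makes such chaining explicit. The composition order shown in the comment is an assumption for illustration, not the patent's formula:

```python
import numpy as np

def rt_to_h(R, T):
    """Pack a rotation matrix and translation vector into a 4x4 transform."""
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = np.ravel(T)
    return H

def h_to_rt(H):
    """Unpack a 4x4 homogeneous transform back into (R, T)."""
    return H[:3, :3].copy(), H[:3, 3].copy()

# e.g. (assumed order): H_world_to_cam = rt_to_h(R_rot, T_rot) @ rt_to_h(R0, T0)
# then R_l, T_l = h_to_rt(H_world_to_cam)
```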
The homography matrices H_1, H_2 mapping the target point from the world coordinate system to the pixel coordinate systems satisfy the mapping relation:
the solving method for solving the coordinate of the P point is that the formula (1) is simultaneously expanded to obtain
Writing equation (2) as a matrix multiplication form
By least squares
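The least-squares solution above is standard linear triangulation; a sketch using SVD (which minimizes the same algebraic residual as the normal-equation least squares), with illustrative names:

```python
import numpy as np

def triangulate(P1, P2, pl, pr):
    """Recover the world point from pixel coordinates pl, pr and the two
    3x4 projection matrices P1 = K_l [R_l | T_l], P2 = K_r [R_r | T_r].
    Each view contributes two linear equations in the homogeneous point."""
    A = np.vstack([
        pl[0] * P1[2] - P1[0],   # u_l * (row 3) - (row 1)
        pl[1] * P1[2] - P1[1],   # v_l * (row 3) - (row 2)
        pr[0] * P2[2] - P2[0],
        pr[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                   # null vector = homogeneous solution
    return X[:3] / X[3]
```

A synthetic check (project a known point through two hypothetical cameras and recover it) confirms the construction.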
Third, experiment and analysis
3.1 Experimental platform
The experimental platform uses two Hikvision iDS-2PT7T40BX-D4/T3 cameras with a focal length of 11-55 mm. The cameras provide 5× optical zoom (zoom = 1-5), 360° horizontal rotation, and -40° to 30° pitch rotation; the rotation precision is 0.1° and the zoom precision is 0.1. The camera resolution is 704 × 596. The calibration plate is a 12 × 9 chessboard with 35 mm squares.
3.2 initial position calibration
The initial positions of the left and right cameras are pan = 0°, tilt = 0°, zoom = 1.
The internal and external parameters of the cameras at the initial position are calibrated with the MATLAB camera calibration toolbox; the results are shown in Table 1:
TABLE 1 Internal and external parameters at the initial position
3.3 PTZ camera internal parameter calibration
The focal length values of the left and right cameras were calibrated at zoom = 1, 2, 3, 4, 5; the results are shown in Table 2:
TABLE 2 Focal length values of the left and right cameras at specific zoom values
Four further focal length values were then taken for each of the left and right cameras, computed by the method of the invention, and compared with the calibrated values. The comparison results are shown in Table 3:
TABLE 3 calibration results of camera internal parameters
As the table shows, the error of estimating the PTZ camera's internal parameters by linear interpolation is generally below 1%, so automatic internal-parameter calibration at an arbitrary zoom value is achievable where the precision requirement is moderate.
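The linear-interpolation estimate evaluated here can be sketched as follows; the stop values and principal point are hypothetical stand-ins for the calibrated numbers in Table 2:

```python
import numpy as np

stops = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # zoom values calibrated in advance
fx_cal = np.array([1180.0, 2360.0, 3540.0, 4720.0, 5900.0])
fy_cal = np.array([1200.0, 2400.0, 3600.0, 4800.0, 6000.0])

def intrinsics_at_zoom(z, cx=352.0, cy=298.0):
    """Internal parameter matrix K at an arbitrary zoom value, interpolating
    the calibrated focal lengths; the principal point (cx, cy) is assumed
    fixed across zoom (an illustrative simplification)."""
    fx = np.interp(z, stops, fx_cal)
    fy = np.interp(z, stops, fy_cal)
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])
```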
Within the range of angles and focal lengths over which the left and right cameras have overlapping fields of view and can form a stereo vision system, arbitrary angles and focal lengths were set at random for both cameras; their external parameters were computed by the external-parameter calibration method of section 2.4 and compared against the results of Zhang's calibration method, taken as ground truth. The results are as follows:
The error between the calibrated and computed results is small, so the method achieves calibration of the external parameters.
3.4 results of localization test
The method was used to locate targets within a range of 100 m, with independently measured distances taken as ground truth; the measurement results are shown in Appendix 1. The measurement error within 100 m is below 2%. The method can position a target at any angle and any focal length, and texture information of a distant target can be acquired by zooming in.
The above detailed description of the binocular-PTZ-camera-based target positioning method with reference to the embodiments is illustrative, not restrictive. Several embodiments may be enumerated within the stated scope, and changes and modifications that do not depart from the general concept of the present invention fall within the protection scope of the invention.
Claims (10)
1. A target positioning method based on a binocular PTZ camera, characterized in that: the internal and external parameters of the binocular PTZ camera are calibrated at any angle and focal length, and three-dimensional positioning of a target is realized using the least squares method, with the following specific steps:
Step 1: solve the internal parameter matrices K_l, K_r of the left and right cameras at any focal length;
Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length;
Step 3: compute the homography matrices H_1, H_2 mapping the target point from the world coordinate system to the pixel coordinate systems of the left and right cameras and, from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of corresponding points in the left and right pixel coordinate systems, compute the coordinates of the target in the world coordinate system by least squares, comprising the following steps:
Step 3-1: find the homography matrices H_1, H_2 mapping the target point P(X_w, Y_w, Z_w) from the world coordinate system to the corresponding points p_l(u_l, v_l), p_r(u_r, v_r) in the left and right camera pixel coordinate systems;
Step 3-2: solve for the coordinates P(X_w, Y_w, Z_w) by the least squares method.
2. The binocular PTZ camera based target positioning method of claim 1, wherein: solving the internal parameter matrices K_l, K_r of the left and right cameras at any focal length comprises computing a piecewise function of the left and right camera focal lengths with respect to zoom, specifically:
Step 1-1: calibrate the focal lengths f_x, f_y of the left and right cameras at several specific zoom values, and fit the focal length values to a piecewise function of zoom using linear interpolation;
Step 1-2: substitute the current zoom value of the camera into the piecewise function obtained in step 1-1 to obtain the current focal length values of the left and right cameras, giving the internal parameter matrices of the left and right cameras respectively as:
3. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2 of solving R_l, R_r and T_l, T_r of the left and right cameras at any angle and any focal length calibrates the external parameter matrices of the left and right cameras at the initial position by Zhang's calibration method, takes the camera coordinate system of the left camera's initial position as the world coordinate system O_w-x_w y_w z_w, and computes R_l, R_r, T_l, T_r by the following specific steps:
Step 2-1: solve the direction vectors of the vertical rotation axes of the left and right cameras for horizontal rotation and of the horizontal rotation axes for pitch rotation;
Step 2-2: the vertical and horizontal rotation-axis direction vectors intersect the respective optical-center rotation planes γ at points A; compute the vectors from the origin O of the initial-position camera coordinate system to each intersection point;
Step 2-3: compute the rotation matrices and translation vectors of the camera coordinate systems of the rotated left and right cameras relative to their original positions;
Step 2-4: compute the translation vectors generated by the left and right cameras at different zoom values relative to the optical center at zoom = 1;
Step 2-5: when the left (right) camera rotates horizontally by α, rotates in pitch by β, and has zoom value z_1 (z_2), compute the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system O-x_w y_w z_w.
4. The binocular PTZ camera based target positioning method of claim 1, wherein: the solving of step 2-1 calibrates the vertical rotation-axis direction vector as follows. First, the rotation-translation matrix R_1, T_1 between the calibration-plate coordinate system B-x_B y_B z_B and the camera coordinate system O-xyz at the initial position is calibrated by Zhang's calibration method. Keeping the chessboard calibration plate fixed, the camera is rotated horizontally by an angle α to position 2, and the rotation-translation matrix R_2, T_2 between the calibration-plate coordinate system B-x_B y_B z_B and the camera coordinate system at position 2 is obtained by the same method; the rotation matrix from the initial position to position 2 is
R_{v-α} = R_1 R_2^T.
According to the Rodrigues formula, the skew-symmetric matrix [v]_× of the direction vector v is:
[v]_× = (R_{v-α} − R_{v-α}^T) / (2 sin α)
in the formula, α = arccos((tr(R_{v-α}) − 1)/2).
5. The binocular PTZ camera based target positioning method of claim 1, wherein: when the camera is rotated horizontally by an angle α to position 2, the translation vector generated at the optical center is:
T_{v-α} = T_1 − R_{v-α} T_2    (4).
6. The binocular PTZ camera based target positioning method of claim 1, wherein: for the vectors of step 2-2, the left camera's horizontal-rotation case is calculated as follows: the vector t_A from the optical center to the intersection point is rotated by π + α about the rotation axis to obtain a vector t_A′; the translation vector generated by the rotation is then T_{v-α} = t_A + t_A′, which, combined with formula (4), yields t_A.
7. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2-3 solves the rotation matrices of the rotated left and right cameras; the rotation matrices for the horizontal rotation angle α and pitch rotation angle β are obtained from the Rodrigues formula:
in the formula, the vectors are the rotation axes of horizontal and pitch rotation, and I is the third-order identity matrix;
the rotation matrix of each camera coordinate system after the left and right cameras rotate, relative to its original position, is
8. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2-3 also solves, after the left and right cameras rotate, the translation matrices of the camera coordinate systems relative to their respective initial positions; the translation matrices for horizontal rotation by α and pitch rotation by β are:
from which the translation matrices generated by the horizontal and pitch rotations of the left and right cameras are obtained as:
9. The binocular PTZ camera based target positioning method of claim 1, wherein: the optical-center translation vector generated by zooming in step 2-4 is solved as the zoom-induced translation vector T_z from the current focal length value f:
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110642192.6A CN113379848A (en) | 2021-06-09 | 2021-06-09 | Target positioning method based on binocular PTZ camera |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113379848A true CN113379848A (en) | 2021-09-10 |
Family
ID=77573112
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110642192.6A Pending CN113379848A (en) | 2021-06-09 | 2021-06-09 | Target positioning method based on binocular PTZ camera |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113379848A (en) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103854291A (en) * | 2014-03-28 | 2014-06-11 | 中国科学院自动化研究所 | Camera calibration method in four-degree of freedom binocular vision system |
CN110415278A (en) * | 2019-07-30 | 2019-11-05 | 中国人民解放军火箭军工程大学 | The ptz camera that moves linearly assists principal and subordinate's tracking of binocular PTZ vision system |
CN112053405A (en) * | 2020-08-21 | 2020-12-08 | 合肥工业大学 | Deviation calibration and external reference correction method for optical axis and rotating shaft of follow-up vision system |
CN112734863A (en) * | 2021-03-31 | 2021-04-30 | 武汉理工大学 | Crossed binocular camera calibration method based on automatic positioning |
Non-Patent Citations (4)
Title |
---|
KANG MAO ET AL.: "A General Calibration Method for Dual PTZ Cameras Based on Feedback Parameters", 《SSRN》 *
SUDIPTA N. SINHA ET AL.: "Pan–tilt–zoom camera calibration and high-resolution mosaic generation", 《COMPUTER VISION AND IMAGE UNDERSTANDING》 *
CUI ZHIGAO ET AL.: "Cooperative tracking method for large-field-of-view binocular active vision sensors", 《光电子激光》 (Journal of Optoelectronics·Laser) *
ZHAO XIANTING ET AL.: "Calibration of a multi-degree-of-freedom binocular vision system based on rotation-axis parameters", 《光学技术》 (Optical Technique) *
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114862959A (en) * | 2022-03-24 | 2022-08-05 | 阿里云计算有限公司 | Method and apparatus for controlling camera |
CN114754743A (en) * | 2022-04-18 | 2022-07-15 | 中国人民解放军陆军军事交通学院军事交通运输研究所 | Target positioning method for carrying multiple PTZ cameras on intelligent ground unmanned platform |
CN115272491A (en) * | 2022-08-12 | 2022-11-01 | 哈尔滨工业大学 | Binocular PTZ camera dynamic self-calibration method |
CN115797459A (en) * | 2022-08-29 | 2023-03-14 | 南京航空航天大学 | Binocular vision system distance measurement method with arbitrary focal length combination |
CN115797459B (en) * | 2022-08-29 | 2024-02-13 | 南京航空航天大学 | Binocular vision system ranging method with arbitrary focal length combination |
CN115713565A (en) * | 2022-12-16 | 2023-02-24 | 盐城睿算电子科技有限公司 | Target positioning method for binocular servo camera |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN113379848A (en) | Target positioning method based on binocular PTZ camera | |
CN111951305B (en) | Target detection and motion state estimation method based on vision and laser radar | |
CN107389026B (en) | A kind of monocular vision distance measuring method based on fixed point projective transformation | |
CN110842940A (en) | Building surveying robot multi-sensor fusion three-dimensional modeling method and system | |
CN105716542B (en) | A kind of three-dimensional data joining method based on flexible characteristic point | |
CN102034238B (en) | Multi-camera system calibrating method based on optical imaging probe and visual graph structure | |
CN106878687A (en) | A kind of vehicle environment identifying system and omni-directional visual module based on multisensor | |
CN107944390B (en) | Motor-driven vehicle going objects in front video ranging and direction localization method | |
US20230351625A1 (en) | A method for measuring the topography of an environment | |
CN106772431A (en) | A kind of Depth Information Acquistion devices and methods therefor of combination TOF technologies and binocular vision | |
CN206611521U (en) | A kind of vehicle environment identifying system and omni-directional visual module based on multisensor | |
CN112132874B (en) | Calibration-plate-free heterogeneous image registration method and device, electronic equipment and storage medium | |
CN105059190B (en) | The automobile door opening collision warning device and method of view-based access control model | |
CN108489398B (en) | Method for measuring three-dimensional coordinates by laser and monocular vision under wide-angle scene | |
CN106127115B (en) | hybrid visual target positioning method based on panoramic vision and conventional vision | |
CN112669354B (en) | Multi-camera motion state estimation method based on incomplete constraint of vehicle | |
CN103971375A (en) | Panoramic gaze camera space calibration method based on image splicing | |
CN108469254A (en) | A kind of more visual measuring system overall calibration methods of big visual field being suitable for looking up and overlooking pose | |
CN109883433A (en) | Vehicle positioning method in structured environment based on 360 degree of panoramic views | |
Kim et al. | External vehicle positioning system using multiple fish-eye surveillance cameras for indoor parking lots | |
Ye et al. | Extrinsic calibration of a monocular camera and a single line scanning Lidar | |
Schönbein et al. | Environmental Perception for Intelligent Vehicles Using Catadioptric Stereo Vision Systems. | |
Chenchen et al. | A camera calibration method for obstacle distance measurement based on monocular vision | |
Gehrig et al. | 6D vision goes fisheye for intersection assistance | |
CN112364793A (en) | Target detection and fusion method based on long-focus and short-focus multi-camera vehicle environment |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication ||
SE01 | Entry into force of request for substantive examination ||
WD01 | Invention patent application deemed withdrawn after publication ||
Application publication date: 20210910 |