
CN113379848A - Target positioning method based on binocular PTZ camera - Google Patents


Info

Publication number: CN113379848A
Application number: CN202110642192.6A
Authority: CN (China)
Prior art keywords: camera, rotation, coordinate system, matrix, focal length
Legal status: Pending (the legal status is an assumption, not a legal conclusion)
Original language: Chinese (zh)
Inventors: 徐友春, 冒康, 娄静涛, 朱愿, 李永乐
Applicant and assignee: Military Transportation Research Institute Of Chinese People's Liberation Army Army Military Transportation Academy

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G06T7/85 Stereo camera calibration
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10012 Stereo images
    • G06T2207/10028 Range image; Depth image; 3D point clouds

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention relates to a target positioning method based on a binocular PTZ camera. The method calibrates the internal and external parameters of the binocular PTZ camera at any angle and focal length and achieves three-dimensional positioning of a target by least squares, in the following steps. Step 1: solve the intrinsic matrices K_l, K_r of the left and right cameras at any focal length. Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length. Step 3: compute the homography matrices H_1, H_2 from the world coordinate system to the left and right camera pixel coordinate systems, and, from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of the same point in the two pixel coordinate systems, compute the target's coordinates in the world coordinate system by least squares. Beneficial effects: by exploiting the stereo vision formed by zooming and rotating the binocular PTZ camera, the invention can localize targets in three dimensions over a larger range and at longer distances, greatly improving the perception capability of intelligent unmanned platforms.

Description

Target positioning method based on binocular PTZ camera
Technical Field
The invention belongs to the field of visual perception and positioning, and particularly relates to a target positioning method based on a binocular PTZ camera.
Background
At present, the perception module of an intelligent platform mainly uses sensors such as lidar, cameras, binocular cameras, and millimeter-wave radar, each with shortcomings in perception capability. Lidar provides accurate depth information but is expensive; its point clouds are sparse at long range, lack texture information, and are easily affected by dust particles, heavy rain, and the like. Cameras are inexpensive and provide rich texture information, but a monocular camera, limited by the pinhole imaging principle, cannot directly obtain target depth and is easily affected by illumination and weather. A traditional binocular camera achieves stereo vision, but its short baseline causes large ranging error at long distances, and its field of view, focal length, and angle are fixed. Millimeter-wave radar has a wide detection angle and strong anti-interference capability, and measures obstacle speed accurately, but it has lower resolution and precision, noisy data, and a high false-detection rate.
As one of the four key technologies of unmanned robots, and of unmanned vehicles in particular, the perception module provides information support for the localization and planning modules. Its main tasks include recognizing traffic lights and traffic signs in urban environments; detecting, recognizing, tracking, and localizing obstacles in urban and field environments; and predicting obstacle trajectories. Existing unmanned-vehicle perception systems are weak at perceiving distant targets: at long range the lidar point cloud is sparse and carries little texture, while traditional monocular and binocular cameras can neither zoom nor change pose, so the target occupies few pixels in the image, texture information is poor, and binocular ranging accuracy is low. In a field battlefield environment in particular, targets are far away and existing sensors struggle to localize them accurately. Some autonomous vehicles address the monocular camera's narrow field of view and fixed focal length by mounting multiple cameras of different focal lengths in multiple directions on the vehicle body; a vehicle with automatic driving functions may carry eight cameras. This approach wastes resources and places a heavy computing burden on the computing unit.
PTZ is an abbreviation of Pan/Tilt/Zoom: the camera can rotate in all directions (left/right and up/down) and its lens supports zoom control. A binocular PTZ camera mimics the visual system of a chameleon. Compared with traditional cameras and static binocular cameras, its two cameras can move independently to capture large-scene information, cooperate to form a stereo vision system and obtain depth information, acquire local high-resolution detail, and track and localize targets over a wide range and at long distances. Binocular PTZ camera systems can be used in video surveillance, unmanned driving, intelligent transportation systems, military reconnaissance, and similar fields. In recent years binocular PTZ vision systems have been a research hotspot, but because the angle and focal-length values of the left and right cameras change constantly during zooming and rotation, real-time calibration of a binocular PTZ camera is difficult; domestic research on static binocular cameras and single PTZ cameras is quite mature, while calibration research on binocular PTZ cameras remains scarce.
Disclosure of Invention
The invention aims to overcome the defects of the technology and provide a target positioning method based on a binocular PTZ camera, which utilizes the stereoscopic vision formed by zooming and rotating the binocular PTZ camera to realize the three-dimensional positioning of a target in a larger range and a longer distance.
In order to achieve the purpose, the invention adopts the following technical scheme: a target positioning method based on a binocular PTZ camera is characterized by comprising the following steps: calibrating internal and external parameters of the binocular PTZ camera under any angle and focal length, and realizing three-dimensional positioning of a target by using a least square method, wherein the method comprises the following specific steps:
step 1: solving the internal reference matrix K of the left camera and the right camera under any focal lengthl,Kr
Step 2: solving a rotation matrix R of the left camera and the right camera relative to a world coordinate system under any angle and any focal lengthl、RrAnd translation vector Tl、Tr
And step 3: calculating a homography matrix H of the target point mapped to the pixel coordinate systems of the left camera and the right camera from the world coordinate system1、H2According to the coordinate p of the homonymous point in the left and right pixel coordinate systemsl(ul,vl),pr(ur,vr) The method comprises the following steps of calculating coordinate values of a target under a world coordinate system by using a least square method:
step 3-1, finding the target Point P (X)w,Yw,Zw) Mapping to corresponding point p under left and right camera pixel coordinate systems from world coordinate systeml(ul,vl),pr(ur,vr) Homography matrix H1,H2
Step 3-2, solving P (X) by using least square methodw,Yw,Zw) And (4) coordinates.
Solving the intrinsic matrices K_l, K_r at any focal length in step 1 means fitting a piecewise function of the left and right cameras' focal lengths with respect to zoom, as follows:
Step 1-1: calibrate the left and right cameras' focal lengths f_x, f_y at several specific zoom values, and fit the focal-length values to a piecewise function of zoom by linear interpolation.
Step 1-2: substitute the camera's current zoom value into the piecewise function from step 1-1 to obtain the current focal lengths f_x^l, f_y^l and f_x^r, f_y^r of the left and right cameras. The intrinsic matrices of the left and right cameras are then

K = | f_x   0    u_0 |
    |  0   f_y   v_0 |
    |  0    0     1  |    (1)

where in formula (1) f_x and f_y are in dpi, and u_0, v_0 are the pixel coordinates of the image center, equal to half the image resolution.
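The piecewise fit of steps 1-1 and 1-2 amounts to linear interpolation between calibrated zoom stops. A minimal sketch follows; the zoom and focal-length sample values, the function name, and the 1920×1080 principal point are illustrative assumptions, not values from the patent.

```python
import numpy as np

# Calibrated (zoom, fx, fy) samples for one camera -- illustrative values;
# in practice each sample comes from Zhang's method at that zoom stop.
ZOOM_SAMPLES = [1, 5, 10, 20, 30]
FX_SAMPLES = [1200.0, 2400.0, 4100.0, 7800.0, 11500.0]
FY_SAMPLES = [1195.0, 2390.0, 4080.0, 7750.0, 11420.0]

def intrinsics_at_zoom(m, u0=960.0, v0=540.0):
    """Intrinsic matrix K at zoom value m, as in steps 1-1 / 1-2.

    np.interp implements exactly the piecewise-linear function: between
    adjacent calibrated zoom stops the focal length varies linearly.
    (u0, v0) is the principal point, taken as half the image resolution.
    """
    fx = np.interp(m, ZOOM_SAMPLES, FX_SAMPLES)
    fy = np.interp(m, ZOOM_SAMPLES, FY_SAMPLES)
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])
```

The same table-and-interpolation scheme is repeated for the right camera, so each camera can self-calibrate its intrinsics from the zoom value it feeds back.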
Solving R_l, R_r and T_l, T_r at any angle and any focal length in step 2: the extrinsic matrices R_init, T_init of the left and right cameras at the initial position are calibrated with Zhang Zhengyou's calibration method, and the camera coordinate system of the left camera's initial position is taken as the world coordinate system O_w-x_w y_w z_w. R_l, R_r, T_l, T_r are then computed as follows:
Step 2-1: solve for the direction vectors of the vertical axes about which the left and right cameras rotate horizontally (pan), and the direction vectors of the horizontal axes about which they rotate in pitch (tilt).
Step 2-2: the vertical-axis and horizontal-axis direction vectors intersect the respective optical-center rotation planes γ at points A; compute the vectors OA from the origin O of each initial-position camera coordinate system to its intersection points.
Step 2-3: compute, for each of the left and right cameras after rotation, the rotation matrix R_rot and translation vector T_rot of the camera coordinate system relative to its original position.
Step 2-4: compute the translation vectors generated by the left and right cameras at different zoom values relative to the optical center at zoom = 1.
Step 2-5: with the left (right) camera panned by α, tilted by β, and at zoom value z_1 (z_2), compute the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system O-x_w y_w z_w.
Solving for the vertical-axis direction vectors in step 2-1 (the horizontal tilt axes are calibrated the same way): first, use Zhang Zhengyou's method to calibrate the rotation-translation R_1, T_1 between the calibration-board coordinate system B-x_B y_B z_B and the camera coordinate system O-xyz at the initial position. Keeping the checkerboard calibration board fixed, rotate the camera horizontally by an angle α to position 2, and obtain in the same way the rotation-translation R_2, T_2 between the board coordinate system and the camera coordinate system at position 2. The rotation from the initial position to position 2 is

R_{v-α} = R_1 R_2^T    (2)

By the Rodrigues formula, the direction vector n_v is obtained from the antisymmetric part of R_{v-α}:

R_{v-α} - R_{v-α}^T = 2 sin α [n_v]_×    (3)

which yields the rotation-axis direction vector n_v; the tilt-axis direction vector n_h is obtained by the same method.
In the formulas:
R_{v-α} is the rotation of the camera panning horizontally (about the vertical axis n_v) by the angle α.
When the camera rotates horizontally by α to position 2, the translation vector generated by the optical center is:

T_{v-α} = T_1 - R_{v-α} T_2    (4).
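Formulas (2)-(3) can be sketched as follows. This is a minimal numpy sketch with an assumed function name: the camera's own rotation between the two poses is R_1 R_2^T, and the axis direction is read off its antisymmetric part, as the Rodrigues decomposition in formula (3) implies.

```python
import numpy as np

def rotation_axis(R1, R2):
    """Direction vector of the physical rotation axis, in the style of step 2-1.

    R1, R2: rotations of the camera relative to the fixed calibration board
    at the initial position and at position 2.  The relative camera rotation
    is Rv = R1 @ R2.T (formula (2)); for a rotation by angle a about unit
    axis n, Rv - Rv.T = 2*sin(a)*[n]x (formula (3)), so the off-diagonal
    differences give a vector proportional to n.
    """
    Rv = R1 @ R2.T
    n = np.array([Rv[2, 1] - Rv[1, 2],
                  Rv[0, 2] - Rv[2, 0],
                  Rv[1, 0] - Rv[0, 1]])
    return n / np.linalg.norm(n)
```

The translation of the optical center between the two poses follows formula (4) as T1 - Rv @ T2. Note the antisymmetric extraction degenerates when sin α ≈ 0, so the calibration rotation α should not be near 0 or π.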
The vectors OA in step 2-2 are computed as follows, taking the left camera's pan axis as the example: after the optical center rotates by π + α about the rotation axis, the vector OA becomes a new vector OA'; the translation generated by this rotation relates OA and OA'. Combining with formula (4), OA can be solved for; the remaining vectors for the left and right cameras' pan and tilt axes are obtained by the same method.
In step 2-3, the rotation matrices of the left and right cameras after a horizontal rotation by α and a pitch rotation by β are solved with the Rodrigues formula:

R = I + sin θ [n]_× + (1 − cos θ) [n]_×²

where n is the rotation vector of the horizontal or pitch rotation (θ = α or β) and I is the third-order identity matrix. The rotation matrix of each camera's coordinate system after rotation, relative to its original position, is the product of its horizontal and pitch rotation matrices.
Step 2-3 also solves, after the left and right cameras rotate, the translation of each camera coordinate system relative to its initial position: the translations generated by the horizontal rotation α and the pitch rotation β follow from the optical-center geometry of formula (4) applied to each axis, and combining them gives the translation generated by each camera's horizontal and pitch rotations.
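The Rodrigues formula used in step 2-3 maps an axis-angle pair to a rotation matrix. A minimal numpy sketch (the function name is an assumption):

```python
import numpy as np

def rodrigues(axis, angle):
    """Rotation matrix from a unit axis and an angle via Rodrigues' formula:
    R = I + sin(a)*[n]x + (1 - cos(a))*[n]x^2, with [n]x the cross-product
    (antisymmetric) matrix of the axis n."""
    n = np.asarray(axis, dtype=float)
    n = n / np.linalg.norm(n)
    nx = np.array([[0.0, -n[2], n[1]],
                   [n[2], 0.0, -n[0]],
                   [-n[1], n[0], 0.0]])
    return np.eye(3) + np.sin(angle) * nx + (1.0 - np.cos(angle)) * (nx @ nx)
```

Composing `rodrigues(n_v, alpha) @ rodrigues(n_h, beta)` then gives one camera's pan-plus-tilt rotation relative to its original position (the composition order here is an assumption; it depends on the gimbal's mechanical arrangement).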
the optical center translation vector generated by zooming in the step 2-4
Figure BDA00031074781800000416
Solving the translation vector T generated by zooming according to the current focal length value fz
Figure BDA00031074781800000417
In the formula fmax、fminIs the focal length range of the camera, and f is the current focal length value, obtainable from claim 2, in dpi,
Figure BDA00031074781800000418
the zoom range, in mm, is an intrinsic parameter of the PTZ camera.
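The exact expression for T_z is not legible in this copy. One plausible reading of step 2-4, consistent with the surrounding text (zooming moves the optical center along the optical axis between the ends of the focal-length range), is a linear model. All names and the linearity itself are assumptions in this sketch:

```python
def zoom_translation(f, f_min, f_max, travel_mm):
    """Assumed optical-center shift along the optical axis due to zooming.

    f, f_min, f_max: current focal length and the focal-length range (dpi).
    travel_mm: the optical center's physical travel over the zoom range, an
    intrinsic PTZ parameter in mm.  The linear interpolation below is a
    plausible reading of the patent's formula, not a verified reproduction.
    """
    tz = (f - f_min) / (f_max - f_min) * travel_mm
    return (0.0, 0.0, tz)  # translation expressed in the camera frame
```

At f = f_min the optical center is at its zoom = 1 reference position, so the translation is zero.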
The rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system at any angle and zoom value, per step 2-5, are obtained by composing the initial extrinsic calibration with the rotation matrices, rotation-induced translations, and zoom-induced translations above.
has the advantages that: compared with the prior art, the invention utilizes the stereoscopic vision formed by zooming and rotating the binocular PTZ camera, and can realize the three-dimensional positioning of the target in a larger range and a longer distance. The PTZ camera is low in price, can rotate and zoom, obtains texture information of a target in a wider range and a longer distance (abundant texture information cannot be obtained by a laser radar and a millimeter wave radar), and can greatly improve the perception capability of the intelligent unmanned platform.
Drawings
FIG. 1 is a zoom schematic of a PTZ camera;
FIG. 2 is a schematic view of binocular PTZ camera coordinates;
FIG. 3a is a top calibration schematic view of a horizontally rotating vertical shaft;
FIG. 3b is a side view calibration schematic of a horizontally rotating vertical shaft;
FIG. 4 is a schematic view of the horizontal pivot axis of the pitch rotation of the binocular PTZ camera;
FIG. 5 is a schematic diagram of camera zoom external reference calibration.
Detailed Description
So that the above-recited objects, features, and advantages of the present invention can be understood in detail, a more particular description of the invention, briefly summarized above, is given with reference to the embodiments illustrated in the appended drawings. The embodiments and features of the embodiments of the present application may be combined with each other where there is no conflict. In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; the described embodiments are merely a subset of the embodiments of the invention, not all of them. All other embodiments obtained by a person skilled in the art without inventive effort, based on the embodiments of the present invention, fall within the scope of the present invention. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used herein is for describing particular embodiments only and is not intended to limit the invention.
In various embodiments of the present invention, for convenience in description and not in limitation, the term "coupled" as used in the specification and claims of the present application is not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", and the like are used merely to indicate relative positional relationships, and when the absolute position of the object being described is changed, the relative positional relationships are changed accordingly.
the corner mark l and the corner mark r in the upper right corner of the symbol in the formula respectively represent the parameters of the left camera (left) and the right camera (right); the l-r corner marks represent parameters between the left and right cameras, and the lower right corner mark init represents parameters of the initial position.
When the method is used for calibration, the internal and external parameters of the cameras at the initial position are calibrated with Zhang Zhengyou's planar calibration method. For the intrinsics, focal-length values at several specific zoom values are calibrated and fitted to a piecewise function of zoom, so that the camera intrinsics can be estimated in real time from the camera's zoom value.
Examples
The method divides the extrinsic-parameter update into two parts. First, the extrinsic change caused by camera rotation comprises the rotation matrix R_rot and translation vector T_rot of each rotated camera relative to its original position. Following Zhao Zhiting, Wang Jinjiang, Wang Chenguang, "Multi-degree-of-freedom binocular vision system calibration based on rotation-axis parameters" [J], Optical Technology, 2018, 44(02): 140-146, the pan and tilt rotation vectors of the left and right PTZ cameras are calibrated separately, and the rotation matrices of the cameras' pan and tilt rotations are then computed with the Rodrigues formula. For the optical-center offset caused by rotation, the intersection of the rotation axis with the optical-center rotation plane is denoted A; the vector OA from the optical center O to point A is obtained by calibration, and the translation vector due to rotation is then computed from OA. Second, the extrinsic change caused by zooming of the left and right PTZ cameras: zooming is in effect a movement of the camera's optical center, which does not change the rotation matrix but only the translation vector T_z. From the rotation matrix, the translation of each monocular PTZ camera's optical center relative to its initial position after rotating and zooming is then computed. Finally, combining the extrinsic changes from rotation and zooming, with the left and right cameras panned by α_1, α_2, tilted by β_1, β_2, and at zoom values z_1, z_2 respectively, the extrinsic matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system are solved.
The specific method comprises the following steps:
1. PTZ camera intrinsic calibration
Referring to fig. 1: under the pinhole imaging model, the zooming process of the camera is equivalent to the optical center moving back and forth along the optical axis; as the zoom value increases, the optical center effectively moves forward. By the camera's characteristics, the focal length is approximately linear in the zoom value. Denote the zoom value fed back in real time as m. Over the camera's zoom range, take n different zoom values zoom_min (zoom = 1), zoom_2, …, zoom_{n-1}, zoom_max for calibration. During zooming only the change in the focal length value is considered; changes of the camera principal point and distortion with focal length are ignored. Calibrate fx_i and fy_i at each zoom value m_i according to Zhang Zhengyou's calibration method, then fit the focal length f between the calibrated zoom values as a piecewise function of m using linear interpolation:

$$f(m) = f_{i-1} + \frac{m - m_{i-1}}{m_i - m_{i-1}}\,\bigl(f_i - f_{i-1}\bigr)$$

where m_{i-1} < m < m_i. The intrinsic matrix of the camera is then obtained:

$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix}$$
In the same way, a piecewise function of the right camera's focal length with respect to m can be obtained. The left and right cameras can then self-calibrate their intrinsic parameters from the zoom values fed back by the cameras.
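The piecewise-linear focal-length model of this section can be sketched as follows. This is a minimal sketch: the zoom points and focal-length values below are hypothetical illustration data, not the paper's calibrated values, and NumPy's `np.interp` stands in for the piecewise interpolation formula.

```python
import numpy as np

def interp_focal(m, zooms, focals):
    """Piecewise-linear interpolation of a focal length over calibrated zoom values."""
    return float(np.interp(m, zooms, focals))

def intrinsic_matrix(m, zooms, fx_cal, fy_cal, u0, v0):
    """Build the intrinsic matrix K at zoom value m from calibrated fx/fy samples."""
    fx = interp_focal(m, zooms, fx_cal)
    fy = interp_focal(m, zooms, fy_cal)
    return np.array([[fx, 0.0, u0],
                     [0.0, fy, v0],
                     [0.0, 0.0, 1.0]])

# Hypothetical calibration data: fx, fy calibrated at zoom = 1..5
zooms = [1, 2, 3, 4, 5]
fx_cal = [1500.0, 3000.0, 4500.0, 6000.0, 7500.0]
fy_cal = [1510.0, 3020.0, 4530.0, 6040.0, 7550.0]
# Intrinsics at an intermediate zoom fed back by the camera (here m = 2.5)
K = intrinsic_matrix(2.5, zooms, fx_cal, fy_cal, 352.0, 298.0)
```

With m between two calibrated points, the focal length is simply the linear blend of the neighbouring samples, which is what the table-based self-calibration relies on.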
Second, binocular PTZ camera external parameter calibration
1) Camera rotation external reference calibration
Most of the literature idealizes the optical center as coinciding with the rotation axis when calibrating a binocular PTZ camera. In an actual PTZ camera the optical center deviates from the rotation axis: it is not fixed but rotates around the axis. A schematic of the binocular PTZ camera is shown in FIG. 2, where O_l, O_r are the optical centers of the left and right cameras, O_l-xyz, O_r-xyz are the camera coordinate systems of the left and right cameras, and O-x_w y_w z_w is the specified world coordinate system, which coincides with the camera coordinate system of the left camera at its initial position. Denote by $\vec{v}_l$, $\vec{h}_l$ the rotation vectors of the horizontal (pan) and vertical (pitch) rotations of the left camera, and by $\vec{v}_r$, $\vec{h}_r$ those of the right camera; the optical center of each camera rotates around these rotation vectors.
2.1 rotation matrix solution
Referring to FIG. 3a and FIG. 3b, the present invention follows Zhao Zhiting, Wang Jinjiang, Wang Chenguang, "Calibration of a multi-degree-of-freedom binocular vision system based on rotation-axis parameters" [J], Optical Technique, 2018, 44(02): 140-146. The optical center of the camera rotates around a rotation axis. Using the homography principle, the pose relationship between the camera and the calibration template is computed and the direction of the rotation axis is determined. With the left camera at zoom_min (zoom = 1) and rotating horizontally, the rotation matrix of the camera coordinate system at position 1 relative to the calibration-board corner coordinate system B-x_B y_B z_B is R_1 and the translation matrix is T_1. Keeping the chessboard calibration template fixed, rotate the camera by an angle α in the horizontal direction; at position 2 the rotation matrix R_2 and translation matrix T_2 are obtained. The rotation matrix R_{v-α} for the rotation of the camera coordinate system from position 1 to position 2 is

$$R_{v\text{-}\alpha} = R_1 R_2^{-1}$$
The rotation matrix, the rotation vector and the rotation angle can be converted into one another according to the Rodrigues formula; the relationships are:

$$R_{v\text{-}\alpha} = \cos\alpha\, I + (1-\cos\alpha)\, v v^{T} + \sin\alpha\, v^{\wedge}$$

$$\alpha = \arccos\!\left(\frac{\operatorname{tr}(R_{v\text{-}\alpha}) - 1}{2}\right)$$

$$v^{\wedge} = \begin{bmatrix} 0 & -v_3 & v_2 \\ v_3 & 0 & -v_1 \\ -v_2 & v_1 & 0 \end{bmatrix}$$

where tr(R_{v-α}) is the trace of the matrix R_{v-α}. From the above formulas, the antisymmetric matrix of the rotation vector can be obtained:

$$v^{\wedge} = \frac{R_{v\text{-}\alpha} - R_{v\text{-}\alpha}^{T}}{2\sin\alpha}$$

thereby obtaining the (unnormalized) vector $v = (v_1, v_2, v_3)^{T}$. This vector is then normalized to determine the rotation vector from position 1 to position 2:

$$\vec{v}_{12} = \frac{v}{\lVert v \rVert}$$
We take the average of multiple measurements to eliminate error, giving the vertical (pan-axis) rotation vector $\vec{v}_l$ of the left camera. The horizontal (pitch-axis) rotation vector $\vec{h}_l$ can be obtained by the same method, as can the vertical and horizontal rotation vectors $\vec{v}_r$, $\vec{h}_r$ of the right camera.
The directional rotations of the camera conform to a right-hand coordinate system. According to the Rodrigues formula, the rotation matrix of the left camera rotating by α around the vertical-axis vector $\vec{v}_l$ is

$$R_{v_l\text{-}\alpha} = \cos\alpha\, I + (1-\cos\alpha)\, \vec{v}_l \vec{v}_l^{T} + \sin\alpha\, \vec{v}_l^{\wedge}$$

The pitch rotation matrix $R_{h_l\text{-}\beta}$ can be solved by the same method, as can the horizontal and pitch rotation matrices $R_{v_r\text{-}\alpha}$, $R_{h_r\text{-}\beta}$ of the right camera.
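The axis-recovery procedure of section 2.1 (relative rotation from two calibration-board poses, angle from the trace, axis from the antisymmetric part, rotation matrix back via Rodrigues) can be sketched and self-checked on synthetic data. This is a minimal sketch assuming the relative rotation is R₁R₂⁻¹ for board-to-camera rotations R₁, R₂; the numeric poses below are arbitrary test values.

```python
import numpy as np

def rotation_axis_angle(R1, R2):
    """Recover rotation axis and angle between two calibrated poses.

    R1, R2: rotations of the calibration board relative to the camera at
    positions 1 and 2. The relative rotation is R = R1 @ R2.T; the angle
    comes from its trace, the axis from its antisymmetric part.
    """
    R = R1 @ R2.T
    alpha = np.arccos((np.trace(R) - 1.0) / 2.0)
    S = (R - R.T) / (2.0 * np.sin(alpha))       # antisymmetric matrix v^
    v = np.array([S[2, 1], S[0, 2], S[1, 0]])   # extract (v1, v2, v3)
    return v / np.linalg.norm(v), alpha

def rodrigues(v, theta):
    """Rotation matrix for angle theta about axis v (Rodrigues formula)."""
    v = v / np.linalg.norm(v)
    Vx = np.array([[0, -v[2], v[1]],
                   [v[2], 0, -v[0]],
                   [-v[1], v[0], 0]])
    return (np.cos(theta) * np.eye(3)
            + (1 - np.cos(theta)) * np.outer(v, v)
            + np.sin(theta) * Vx)

# Synthetic check: a 20-degree rotation about a known (near-vertical) axis
axis = np.array([0.1, 0.98, 0.05])
R_rel = rodrigues(axis, np.deg2rad(20))
R2 = rodrigues(np.array([0.3, 0.2, 0.9]), 0.7)  # arbitrary board pose at position 2
R1 = R_rel @ R2                                  # pose at position 1 consistent with R1 R2^T = R_rel
v, a = rotation_axis_angle(R1, R2)
```

On this synthetic pair the recovered axis and angle match the ones used to build the rotation, which is the consistency the averaging over multiple measurements then refines.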
2.2 translation vector solving
Take the horizontal rotation of the left camera as an example. In FIG. 3, $o_{init}$ is the initial position of the camera coordinate system. The rotation plane of the vertical rotation axis $\vec{v}_l$ passes through the optical center, and the intersection of the rotation axis with this rotation plane is A. In the initial camera coordinate system $O_{init}$-xyz, the translation vector from the optical center $o_{init}$ to point A is denoted $\vec{t}_{v_l}$. After $\vec{t}_{v_l}$ is rotated around $\vec{v}_l$ by an angle π + α, the vector

$$\vec{t}\,'_{v_l} = R_{v_l\text{-}(\pi+\alpha)}\, \vec{t}_{v_l}$$

is obtained, the rotation direction conforming to the right-hand coordinate system. The translation vector generated by the rotation is then

$$T_{v\text{-}\alpha} = \vec{t}_{v_l} + \vec{t}\,'_{v_l}$$
With particular reference to FIG. 3a and FIG. 3b: the origin of the calibration-board coordinate system B-x_B y_B z_B at position 1 (the initial position) has translation vector T_1 relative to the coordinate system O_1-xyz, and the calibration-board origin at position 2 has translation vector T_2 in the O_2-xyz coordinate system. Both T_1 and T_2 can be obtained by calibration, so T_{v-α} can also be obtained:

$$T_{v\text{-}\alpha} = T_1 - R_{v\text{-}\alpha} T_2$$
Combining this with the rotation relation $T_{v\text{-}\alpha} = \vec{t}_{v_l} + R_{v_l\text{-}(\pi+\alpha)}\vec{t}_{v_l}$ yields

$$\vec{t}_{v_l} = \left(I + R_{v_l\text{-}(\pi+\alpha)}\right)^{-1} T_{v\text{-}\alpha}$$
Different rotation angles are selected for calibration, and the average of multiple groups of results is taken to obtain $\vec{t}_{v_l}$. The translation vectors $\vec{t}_{h_l}$, $\vec{t}_{v_r}$, $\vec{t}_{h_r}$ from the intersections of the left camera's horizontal rotation axis and the right camera's vertical and horizontal rotation axes with their rotation planes to the respective optical centers can be obtained by the same method.
Then, according to $T = \vec{t} + R_{(\pi+\theta)}\vec{t}$, the translation vectors $T_{v_l\text{-}\alpha}$, $T_{h_l\text{-}\beta}$, $T_{v_r\text{-}\alpha}$, $T_{h_r\text{-}\beta}$ generated by the horizontal and pitch rotations of the left and right cameras can be obtained.
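The rotation-induced translation of section 2.2 can be checked geometrically: with the optical-center-to-axis offset t perpendicular to the axis, the displacement t + R_{π+α}t equals the direct computation A + R_α(o − A) − o of the new optical-center position. A minimal sketch with hypothetical axis and offset values:

```python
import numpy as np

def rodrigues(v, theta):
    """Rotation matrix for angle theta about axis v (Rodrigues formula)."""
    v = v / np.linalg.norm(v)
    Vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.cos(theta) * np.eye(3) + (1 - np.cos(theta)) * np.outer(v, v) + np.sin(theta) * Vx

def rotation_translation(v, t, alpha):
    """Translation of the optical center when the camera rotates by alpha
    about an axis with direction v passing through A = optical center + t,
    with t perpendicular to v (A lies in the rotation plane through the
    optical center). Equals t + R_{pi+alpha} t."""
    return t + rodrigues(v, np.pi + alpha) @ t

# Direct geometric check: new center o' = A + R_alpha (o - A), with o at the origin
v = np.array([0.0, 1.0, 0.0])           # vertical (pan) axis direction (hypothetical)
t = np.array([0.03, 0.0, 0.05])         # optical-center-to-axis offset, perpendicular to v
alpha = np.deg2rad(30)
A = t                                   # axis intersection point, o_init at origin
o_new = A + rodrigues(v, alpha) @ (-t)  # rotate the vector A -> o about the axis
T = rotation_translation(v, t, alpha)
```

The two computations agree because rotating t by π about the axis flips its in-plane component, so R_{π+α}t = −R_αt for t perpendicular to the axis.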
2.3 calculation of external parameters after rotation
Define the coordinate system of the left camera at its initial position as the world coordinate system O-x_w y_w z_w, which is kept unchanged. When the left camera rotates horizontally by α and in pitch by β, the rotation-translation matrix of the left camera coordinate system O-x_l y_l z_l relative to O-x_w y_w z_w is:

$$R_l' = R_{h_l\text{-}\beta}\, R_{v_l\text{-}\alpha}, \qquad T_l' = R_{h_l\text{-}\beta}\, T_{v_l\text{-}\alpha} + T_{h_l\text{-}\beta}$$

The rotation-translation matrix of the right camera coordinate system O-x_r y_r z_r relative to O-x_w y_w z_w is:

$$R_r' = R_{h_r\text{-}\beta}\, R_{v_r\text{-}\alpha}\, R_{r0}, \qquad T_r' = R_{h_r\text{-}\beta}\, R_{v_r\text{-}\alpha}\, T_{r0} + R_{h_r\text{-}\beta}\, T_{v_r\text{-}\alpha} + T_{h_r\text{-}\beta}$$

where R_{r0}, T_{r0} are the extrinsics of the right camera relative to the world coordinate system at the initial position.
2) camera zoom rear external reference calibration
Referring to fig. 5: the zooming process of the PTZ camera moves the optical center back and forth along the optical axis. Besides changing the camera intrinsics, zooming also changes the translation vector between the binocular PTZ cameras. From the intrinsic calibration it is known that, as the camera changes from zoom_min to zoom, the focal length value is approximately linear in the zoom value; the translation vector of the optical center is T_z, and by the intrinsic calibration T_z varies linearly with the focal length value. The translation vector T_z generated by zooming is solved from the current focal length value f:
$$T_z = \begin{bmatrix} 0 \\ 0 \\ \dfrac{f - f_{min}}{f_{max} - f_{min}}\,\bigl(L_{max} - L_{min}\bigr) \end{bmatrix}$$
In the above formula, f_max and f_min are the focal-length range of the camera and f is the current focal length value, which can be obtained from the intrinsic calibration, in dpi; L_max − L_min, the zoom (physical focal-length) range in mm, is an intrinsic parameter of the PTZ camera.
According to the rotation formulas, the translation vector of the optical center when the PTZ camera rotates horizontally by α, rotates in pitch by β, and has zoom value z is:

$$T_{\alpha\beta z} = R_{h\text{-}\beta}\, R_{v\text{-}\alpha}\, T_z + R_{h\text{-}\beta}\, T_{v\text{-}\alpha} + T_{h\text{-}\beta}$$
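The zoom-induced optical-center shift can be sketched as below. The linear mapping from the pixel-unit focal length to a displacement over the lens's physical focal range is a reconstruction of the T_z formula, and all numeric values (the pixel focal range and the 11-55 mm physical range) are illustrative assumptions, not calibrated quantities.

```python
import numpy as np

def zoom_translation(f, f_min, f_max, L_min_mm, L_max_mm):
    """Optical-center displacement along the optical axis (z) produced by
    zooming, assumed linear in the calibrated focal length f. Result in mm."""
    dz = (f - f_min) / (f_max - f_min) * (L_max_mm - L_min_mm)
    return np.array([0.0, 0.0, dz])

# Hypothetical values: pixel focal range from the intrinsic calibration,
# physical focal range 11-55 mm from the camera datasheet
Tz = zoom_translation(f=4500.0, f_min=1500.0, f_max=7500.0,
                      L_min_mm=11.0, L_max_mm=55.0)
```

At the midpoint of the pixel focal range the sketch places the optical center halfway along the physical zoom travel, which is the linearity assumption made by the calibration.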
3) external reference calculation for rotary zoom rear camera
Combining the extrinsic changes caused by rotation and zooming: when the left and right cameras rotate horizontally by α_1, α_2, rotate in pitch by β_1, β_2, and have zoom values z_1, z_2, the rotation-translation matrix of the left camera coordinate system O-x_l y_l z_l relative to the world coordinate system O-x_w y_w z_w is:

$$R_l = R_{h_l\text{-}\beta_1}\, R_{v_l\text{-}\alpha_1}, \qquad T_l = R_l\, T_{z_1} + R_{h_l\text{-}\beta_1}\, T_{v_l\text{-}\alpha_1} + T_{h_l\text{-}\beta_1}$$

and the rotation-translation matrix of the right camera coordinate system O-x_r y_r z_r relative to the world coordinate system O-x_w y_w z_w is:

$$R_r = R_{h_r\text{-}\beta_2}\, R_{v_r\text{-}\alpha_2}\, R_{r0}, \qquad T_r = R_{h_r\text{-}\beta_2}\, R_{v_r\text{-}\alpha_2}\,\bigl(T_{r0} + T_{z_2}\bigr) + R_{h_r\text{-}\beta_2}\, T_{v_r\text{-}\alpha_2} + T_{h_r\text{-}\beta_2}$$

where R_{r0}, T_{r0} are the extrinsics of the right camera relative to the world coordinate system at the initial position.
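The composition of the pan, tilt and zoom contributions into a single pose can be sketched as follows. This is a reconstruction under an assumed order (pan, then tilt, then the zoom shift of the optical center along the optical axis); the axis directions, offsets and angles below are hypothetical values, not the paper's verified formulas or data.

```python
import numpy as np

def rodrigues(v, theta):
    """Rotation matrix for angle theta about axis v (Rodrigues formula)."""
    v = v / np.linalg.norm(v)
    Vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    return np.cos(theta) * np.eye(3) + (1 - np.cos(theta)) * np.outer(v, v) + np.sin(theta) * Vx

def extrinsics_after_pan_tilt_zoom(v_axis, h_axis, t_v, t_h, alpha, beta, Tz):
    """Pose of the camera frame relative to its initial frame after panning
    by alpha, tilting by beta and zooming. t_v, t_h are the optical-center-
    to-axis offsets; Tz is the zoom shift of the optical center."""
    Rv = rodrigues(v_axis, alpha)
    Rh = rodrigues(h_axis, beta)
    Tv = t_v + rodrigues(v_axis, np.pi + alpha) @ t_v   # translation from pan
    Th = t_h + rodrigues(h_axis, np.pi + beta) @ t_h    # translation from tilt
    R = Rh @ Rv                                         # assumed composition order
    T = R @ Tz + Rh @ Tv + Th
    return R, T

# Hypothetical axes, offsets, angles and zoom shift
R, T = extrinsics_after_pan_tilt_zoom(
    v_axis=np.array([0.0, 1.0, 0.0]), h_axis=np.array([1.0, 0.0, 0.0]),
    t_v=np.array([0.02, 0.0, 0.04]), t_h=np.array([0.0, 0.03, 0.05]),
    alpha=np.deg2rad(15), beta=np.deg2rad(-10), Tz=np.array([0.0, 0.0, 0.005]))
```

A quick sanity check is that the composed R stays a proper rotation (orthonormal, determinant one) regardless of the offsets, since only T absorbs the axis eccentricity and zoom shift.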
The target point is mapped from the world coordinate system to the pixel coordinate systems by the 3 × 4 projection ("homography") matrices H_1 = K_l [R_l | T_l], H_2 = K_r [R_r | T_r]; the mapping relation is:

$$s_l \begin{bmatrix} u_l \\ v_l \\ 1 \end{bmatrix} = H_1 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix}, \qquad s_r \begin{bmatrix} u_r \\ v_r \\ 1 \end{bmatrix} = H_2 \begin{bmatrix} X_w \\ Y_w \\ Z_w \\ 1 \end{bmatrix} \tag{1}$$

To solve for the coordinates of point P, expanding formula (1) for both cameras simultaneously gives

$$\begin{cases} (u_l h^{1}_{3} - h^{1}_{1})\,\tilde{P} = 0 \\ (v_l h^{1}_{3} - h^{1}_{2})\,\tilde{P} = 0 \\ (u_r h^{2}_{3} - h^{2}_{1})\,\tilde{P} = 0 \\ (v_r h^{2}_{3} - h^{2}_{2})\,\tilde{P} = 0 \end{cases} \tag{2}$$

where $h^{k}_{i}$ denotes the i-th row of $H_k$ and $\tilde{P} = (X_w, Y_w, Z_w, 1)^{T}$. Writing equation (2) in matrix-multiplication form:

$$A \begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = b \tag{3}$$

where A is the 4 × 3 coefficient matrix and b collects the constant (fourth-column) terms of (2). Formula (3) is solved by least squares:

$$\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = (A^{T} A)^{-1} A^{T} b$$
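The least-squares triangulation of step 3 can be sketched end-to-end on a synthetic stereo rig. The intrinsics, extrinsics and target point below are hypothetical test values; the solver assembles the four linear constraints and solves the normal equations (AᵀA)⁻¹Aᵀb as in the text.

```python
import numpy as np

def projection(K, R, T):
    """3x4 matrix mapping homogeneous world points to homogeneous pixels."""
    return K @ np.hstack([R, T.reshape(3, 1)])

def triangulate(H1, H2, pl, pr):
    """Least-squares world point from pixel correspondences in two views."""
    rows, rhs = [], []
    for H, (u, v) in ((H1, pl), (H2, pr)):
        # u * (row3 . P) = row1 . P  ->  (u*row3 - row1)[:3] . Pw = row1[3] - u*row3[3]
        rows.append(u * H[2, :3] - H[0, :3]); rhs.append(H[0, 3] - u * H[2, 3])
        rows.append(v * H[2, :3] - H[1, :3]); rhs.append(H[1, 3] - v * H[2, 3])
    A, b = np.array(rows), np.array(rhs)
    return np.linalg.solve(A.T @ A, A.T @ b)     # (A^T A)^-1 A^T b

# Synthetic stereo rig (hypothetical intrinsics and extrinsics)
K = np.array([[1500.0, 0.0, 352.0], [0.0, 1500.0, 298.0], [0.0, 0.0, 1.0]])
Rl, Tl = np.eye(3), np.zeros(3)
Rr, Tr = np.eye(3), np.array([-0.5, 0.0, 0.0])   # right camera 0.5 m to the side
H1, H2 = projection(K, Rl, Tl), projection(K, Rr, Tr)

def project(H, P):
    p = H @ np.append(P, 1.0)
    return p[:2] / p[2]

P = np.array([1.0, 0.8, 20.0])                   # ground-truth target point
Pw = triangulate(H1, H2, project(H1, P), project(H2, P))
```

With noise-free correspondences the recovered point matches the ground truth; with real detections the same normal-equation solve gives the least-squares compromise between the four constraints.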
Third, experiment and analysis
3.1 Experimental platform
The experimental platform uses two Hikvision iDS-2PT7T40BX-D4/T3 cameras with a focal length of 11-55 mm. The cameras provide 5x optical zoom (zoom = 1-5), 360° horizontal rotation and -40° to +30° pitch rotation; the rotation precision is 0.1° and the zoom-value precision is 0.1. The camera resolution is 704 × 596. The calibration board is a 12 × 9 chessboard with 35 mm squares.
3.2 initial position calibration
The initial positions of the left and right cameras are pan = 0°, tilt = 0°, zoom = 1 respectively. The internal and external parameters of the cameras at the initial position are calibrated using the MATLAB camera calibration toolbox; the results are shown in Table 1:
TABLE 1 initial position internal and external parameters
(Table 1 is reproduced as an image in the original publication.)
3.3PTZ Camera reference calibration
The left and right camera focal length values at zoom = 1, 2, 3, 4, 5 were calibrated; the results are shown in Table 2:
TABLE 2 Focus distance values for left and right cameras at specific zoom values
(Table 2 is reproduced as an image in the original publication.)
For the left and right cameras, 4 focal length values each were computed according to the method of the invention and compared with the calibrated values. The comparison results are shown in Table 3:
TABLE 3 calibration results of camera internal parameters
(Table 3 is reproduced as an image in the original publication.)
As can be seen from the table, the intrinsic-parameter error of estimating the PTZ camera intrinsics by linear interpolation is generally below 1%, so automatic calibration of the intrinsics at any zoom value can be achieved where the precision requirement is modest.
Within the range of angles and focal lengths where the left and right cameras have overlapping fields of view and can form a stereoscopic vision system, arbitrary angles and focal lengths of the two cameras were set randomly. The extrinsic parameters of the left and right cameras were calculated according to the extrinsic calibration method proposed in 2.4 and compared against the calibration results of Zhang Zhengyou's method taken as ground truth. The results are as follows:
(The comparison table is reproduced as an image in the original publication.)
The error between the calibration results and the calculated results is small, so the method can realize extrinsic calibration.
3.4 results of localization test
The method was used to perform positioning research on targets within a range of 100 m, with the measured distance taken as ground truth; the measurement results are shown in Appendix 1. The measurement error within 100 meters is below 2%. The method can position a target at any angle and any focal length, and texture information of distant targets can be acquired by zooming in.
The above detailed description of the target positioning method based on a binocular PTZ camera with reference to the embodiments is illustrative and not restrictive. Since only a limited number of embodiments can be enumerated, changes and modifications that do not depart from the general concept of the present invention are intended to fall within its scope of protection.

Claims (10)

1. A target positioning method based on a binocular PTZ camera, characterized by: calibrating the intrinsic and extrinsic parameters of the binocular PTZ camera at any angle and focal length, and realizing three-dimensional positioning of a target using the least squares method, with the following specific steps:
Step 1: solve the intrinsic matrices K_l, K_r of the left and right cameras at any focal length;
Step 2: solve the rotation matrices R_l, R_r and translation vectors T_l, T_r of the left and right cameras relative to the world coordinate system at any angle and any focal length;
Step 3: calculate the homography matrices H_1, H_2 mapping the target point from the world coordinate system to the pixel coordinate systems of the left and right cameras; from the coordinates p_l(u_l, v_l), p_r(u_r, v_r) of the same-name point in the left and right pixel coordinate systems, calculate the coordinate values of the target in the world coordinate system using the least squares method, comprising:
Step 3-1: find the homography matrices H_1, H_2 mapping the target point P(X_w, Y_w, Z_w) from the world coordinate system to the corresponding points p_l(u_l, v_l), p_r(u_r, v_r) in the left and right camera pixel coordinate systems;
Step 3-2: solve the coordinates of P(X_w, Y_w, Z_w) using the least squares method.
2. The binocular PTZ camera based target positioning method of claim 1, wherein: solving the intrinsic matrices K_l, K_r of the left and right cameras at any focal length comprises calculating piecewise functions of the left and right camera focal lengths with respect to zoom, specifically:
Step 1-1: calibrate the focal lengths f_x, f_y of the left and right cameras at several specific zoom values respectively, and fit the focal length values to piecewise functions of zoom using linear interpolation;
Step 1-2: substitute the zoom value of the current camera into the piecewise functions obtained in step 1-1 to obtain the focal length values $f_x^l, f_y^l$ and $f_x^r, f_y^r$ of the left and right cameras at this moment; the intrinsic matrices of the left and right cameras are then respectively:

$$K = \begin{bmatrix} f_x & 0 & u_0 \\ 0 & f_y & v_0 \\ 0 & 0 & 1 \end{bmatrix} \tag{1}$$

where f_x and f_y in formula (1) are in dpi, and u_0, v_0 are the pixel coordinates of the image center point, whose values are 1/2 of the image resolution.
3. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2 solves R_l, R_r and T_l, T_r of the left and right cameras at any angle and any focal length; the extrinsic matrices of the left and right cameras at the initial position are calibrated by Zhang Zhengyou's calibration method, the camera coordinate system of the left camera's initial position is taken as the world coordinate system O_w-x_w y_w z_w, and R_l, R_r, T_l, T_r are calculated by the following specific steps:
Step 2-1: solve the vertical rotation-axis direction vectors $\vec{v}_l$, $\vec{v}_r$ of the left and right cameras during horizontal rotation, and the horizontal rotation-axis direction vectors $\vec{h}_l$, $\vec{h}_r$ during pitch rotation;
Step 2-2: the vertical and horizontal rotation-axis direction vectors intersect the respective optical-center rotation planes Γ; calculate the vectors $\vec{t}_{v_l}$, $\vec{t}_{h_l}$ and $\vec{t}_{v_r}$, $\vec{t}_{h_r}$ from the origin O of the camera coordinate system at the initial position to each intersection point;
Step 2-3: calculate the rotation matrices and translation vectors of the camera coordinate systems relative to their original positions after the left and right cameras rotate, respectively;
Step 2-4: calculate the translation vectors T_z generated by the left and right cameras at different zoom values relative to the optical center at zoom = 1, respectively;
Step 2-5: when the left (right) camera rotates horizontally by α, rotates in pitch by β, and has zoom value z_1 (z_2), calculate the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system O-x_w y_w z_w.
4. The binocular PTZ camera based target positioning method of claim 1, wherein: the method for solving $\vec{v}_l$ and $\vec{v}_r$ in step 2-1 calibrates the vertical-axis direction vectors as follows: first, calibrate the rotation-translation matrix R_1, T_1 between the calibration-board coordinate system B-x_B y_B z_B and the camera coordinate system O-xyz at the initial position using Zhang Zhengyou's calibration method; keeping the chessboard calibration board fixed, rotate the camera by an angle α in the horizontal direction to position 2, and obtain the rotation-translation matrix R_2, T_2 between the calibration-board coordinate system B-x_B y_B z_B and the camera coordinate system at position 2 by the same method; the rotation matrix from the initial position to position 2 is

$$R_{v\text{-}\alpha} = R_1 R_2^{-1}$$

According to the Rodrigues formula, the antisymmetric matrix of the direction vector is:

$$v^{\wedge} = \frac{R_{v\text{-}\alpha} - R_{v\text{-}\alpha}^{T}}{2\sin\alpha}$$

thereby obtaining the rotation-axis direction vector $\vec{v}_l$; $\vec{v}_r$ can be obtained by the same method; where R_{v-α} is the rotation matrix for the camera rotating horizontally (around the $\vec{v}$ axis) by an angle α.
5. The binocular PTZ camera based target positioning method of claim 1, wherein: when the camera is rotated horizontally by an angle α to position 2, the translation vector generated by the optical center is:

$$T_{v\text{-}\alpha} = T_1 - R_{v\text{-}\alpha} T_2 \tag{4}$$
6. The binocular PTZ camera based target positioning method of claim 1, wherein: the vector $\vec{t}_{v_l}$ of step 2-2 for the left camera's horizontal rotation is calculated as follows: after $\vec{t}_{v_l}$ rotates by π + α around the rotation axis, the vector $\vec{t}\,'_{v_l} = R_{v_l\text{-}(\pi+\alpha)}\vec{t}_{v_l}$ is obtained; the translation vector generated by the rotation is then $T_{v\text{-}\alpha} = \vec{t}_{v_l} + \vec{t}\,'_{v_l}$; in conjunction with formula (4),

$$\vec{t}_{v_l} = \left(I + R_{v_l\text{-}(\pi+\alpha)}\right)^{-1} T_{v\text{-}\alpha}$$

$\vec{t}_{h_l}$, $\vec{t}_{v_r}$ and $\vec{t}_{h_r}$ are obtained by the same method.
7. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2-3 solves the rotation matrices after the left and right cameras rotate; for horizontal rotation angle α and pitch rotation angle β, using the Rodrigues formula:

$$R_{v\text{-}\alpha} = \cos\alpha\, I + (1-\cos\alpha)\, \vec{v}\vec{v}^{T} + \sin\alpha\, \vec{v}^{\wedge}, \qquad R_{h\text{-}\beta} = \cos\beta\, I + (1-\cos\beta)\, \vec{h}\vec{h}^{T} + \sin\beta\, \vec{h}^{\wedge}$$

where $\vec{v}$, $\vec{h}$ are the rotation vectors of horizontal and pitch rotation and I is the third-order identity matrix; the rotation matrix of the camera coordinate system relative to its original position after the left and right cameras rotate is

$$R = R_{h\text{-}\beta}\, R_{v\text{-}\alpha}$$
8. The binocular PTZ camera based target positioning method of claim 1, wherein: step 2-3 solves the translation matrices of the camera coordinate system relative to the respective initial positions after the left and right cameras rotate; the translation matrices for horizontal rotation by angle α and pitch rotation by angle β are:

$$T_{v\text{-}\alpha} = \vec{t}_{v} + R_{v\text{-}(\pi+\alpha)}\vec{t}_{v}, \qquad T_{h\text{-}\beta} = \vec{t}_{h} + R_{h\text{-}(\pi+\beta)}\vec{t}_{h}$$

so that the translation matrix generated by the horizontal and pitch rotation of the left and right cameras is:

$$T = R_{h\text{-}\beta}\, T_{v\text{-}\alpha} + T_{h\text{-}\beta}$$
9. The binocular PTZ camera based target positioning method of claim 1, wherein: in step 2-4, the optical-center translation vector T_z generated by zooming is solved from the current focal length value f:

$$T_z = \begin{bmatrix} 0 \\ 0 \\ \dfrac{f - f_{min}}{f_{max} - f_{min}}\,\bigl(L_{max} - L_{min}\bigr) \end{bmatrix}$$

where f_max and f_min are the focal-length range of the camera and f is the current focal length value, obtainable as in claim 2, in dpi; L_max − L_min, the zoom range in mm, is an intrinsic parameter of the PTZ camera.
10. The binocular PTZ camera based target positioning method of claim 1, wherein: in step 2-5, the rotation-translation matrices R_l, T_l, R_r, T_r of the left and right cameras relative to the world coordinate system at any angle and zoom value are:

$$R_l = R_{h_l\text{-}\beta_1}\, R_{v_l\text{-}\alpha_1}, \qquad T_l = R_l\, T_{z_1} + R_{h_l\text{-}\beta_1}\, T_{v_l\text{-}\alpha_1} + T_{h_l\text{-}\beta_1}$$

$$R_r = R_{h_r\text{-}\beta_2}\, R_{v_r\text{-}\alpha_2}\, R_{r0}, \qquad T_r = R_{h_r\text{-}\beta_2}\, R_{v_r\text{-}\alpha_2}\,\bigl(T_{r0} + T_{z_2}\bigr) + R_{h_r\text{-}\beta_2}\, T_{v_r\text{-}\alpha_2} + T_{h_r\text{-}\beta_2}$$

where R_{r0}, T_{r0} are the extrinsics of the right camera relative to the world coordinate system at the initial position.
CN202110642192.6A 2021-06-09 2021-06-09 Target positioning method based on binocular PTZ camera Pending CN113379848A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110642192.6A CN113379848A (en) 2021-06-09 2021-06-09 Target positioning method based on binocular PTZ camera

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110642192.6A CN113379848A (en) 2021-06-09 2021-06-09 Target positioning method based on binocular PTZ camera

Publications (1)

Publication Number Publication Date
CN113379848A true CN113379848A (en) 2021-09-10

Family

ID=77573112

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110642192.6A Pending CN113379848A (en) 2021-06-09 2021-06-09 Target positioning method based on binocular PTZ camera

Country Status (1)

Country Link
CN (1) CN113379848A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114754743A (en) * 2022-04-18 2022-07-15 中国人民解放军陆军军事交通学院军事交通运输研究所 Target positioning method for carrying multiple PTZ cameras on intelligent ground unmanned platform
CN114862959A (en) * 2022-03-24 2022-08-05 阿里云计算有限公司 Method and apparatus for controlling camera
CN115272491A (en) * 2022-08-12 2022-11-01 哈尔滨工业大学 Binocular PTZ camera dynamic self-calibration method
CN115713565A (en) * 2022-12-16 2023-02-24 盐城睿算电子科技有限公司 Target positioning method for binocular servo camera
CN115797459A (en) * 2022-08-29 2023-03-14 南京航空航天大学 Binocular vision system distance measurement method with arbitrary focal length combination

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103854291A (en) * 2014-03-28 2014-06-11 中国科学院自动化研究所 Camera calibration method in four-degree of freedom binocular vision system
CN110415278A (en) * 2019-07-30 2019-11-05 中国人民解放军火箭军工程大学 The ptz camera that moves linearly assists principal and subordinate's tracking of binocular PTZ vision system
CN112053405A (en) * 2020-08-21 2020-12-08 合肥工业大学 Deviation calibration and external reference correction method for optical axis and rotating shaft of follow-up vision system
CN112734863A (en) * 2021-03-31 2021-04-30 武汉理工大学 Crossed binocular camera calibration method based on automatic positioning

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103854291A (en) * 2014-03-28 2014-06-11 中国科学院自动化研究所 Camera calibration method in four-degree of freedom binocular vision system
CN110415278A (en) * 2019-07-30 2019-11-05 中国人民解放军火箭军工程大学 The ptz camera that moves linearly assists principal and subordinate's tracking of binocular PTZ vision system
CN112053405A (en) * 2020-08-21 2020-12-08 合肥工业大学 Deviation calibration and external reference correction method for optical axis and rotating shaft of follow-up vision system
CN112734863A (en) * 2021-03-31 2021-04-30 武汉理工大学 Crossed binocular camera calibration method based on automatic positioning

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
KANG MAO ET.AL: "A General Calibration Method for Dual PTZ Cameras Based on Feedback Parameters", 《SSRN》 *
SUDIPTA N. SINHA ET.AL: "Pan–tilt–zoom camera calibration and high-resolution mosaic generation", 《COMPUTER VISION AND IMAGE UNDERSTANDING》 *
CUI ZHIGAO ET AL.: "Cooperative tracking method of binocular active vision sensors with a large field of view", 《JOURNAL OF OPTOELECTRONICS·LASER》 *
ZHAO XIANTING ET AL.: "Calibration of a multi-degree-of-freedom binocular vision system based on rotation-axis parameters", 《OPTICAL TECHNIQUE》 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114862959A (en) * 2022-03-24 2022-08-05 阿里云计算有限公司 Method and apparatus for controlling camera
CN114754743A (en) * 2022-04-18 2022-07-15 中国人民解放军陆军军事交通学院军事交通运输研究所 Target positioning method for carrying multiple PTZ cameras on intelligent ground unmanned platform
CN115272491A (en) * 2022-08-12 2022-11-01 哈尔滨工业大学 Binocular PTZ camera dynamic self-calibration method
CN115797459A (en) * 2022-08-29 2023-03-14 南京航空航天大学 Binocular vision system distance measurement method with arbitrary focal length combination
CN115797459B (en) * 2022-08-29 2024-02-13 南京航空航天大学 Binocular vision system ranging method with arbitrary focal length combination
CN115713565A (en) * 2022-12-16 2023-02-24 盐城睿算电子科技有限公司 Target positioning method for binocular servo camera

Similar Documents

Publication Publication Date Title
CN113379848A (en) Target positioning method based on binocular PTZ camera
CN111951305B (en) Target detection and motion state estimation method based on vision and laser radar
CN107389026B (en) A kind of monocular vision distance measuring method based on fixed point projective transformation
CN110842940A (en) Building surveying robot multi-sensor fusion three-dimensional modeling method and system
CN105716542B (en) A kind of three-dimensional data joining method based on flexible characteristic point
CN102034238B (en) Multi-camera system calibrating method based on optical imaging probe and visual graph structure
CN106878687A (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
CN107944390B (en) Motor-driven vehicle going objects in front video ranging and direction localization method
US20230351625A1 (en) A method for measuring the topography of an environment
CN106772431A (en) A kind of Depth Information Acquistion devices and methods therefor of combination TOF technologies and binocular vision
CN206611521U (en) A kind of vehicle environment identifying system and omni-directional visual module based on multisensor
CN112132874B (en) Calibration-plate-free heterogeneous image registration method and device, electronic equipment and storage medium
CN105059190B (en) The automobile door opening collision warning device and method of view-based access control model
CN108489398B (en) Method for measuring three-dimensional coordinates by laser and monocular vision under wide-angle scene
CN106127115B (en) hybrid visual target positioning method based on panoramic vision and conventional vision
CN112669354B (en) Multi-camera motion state estimation method based on incomplete constraint of vehicle
CN103971375A (en) Panoramic gaze camera space calibration method based on image splicing
CN108469254A (en) A kind of more visual measuring system overall calibration methods of big visual field being suitable for looking up and overlooking pose
CN109883433A (en) Vehicle positioning method in structured environment based on 360 degree of panoramic views
Kim et al. External vehicle positioning system using multiple fish-eye surveillance cameras for indoor parking lots
Ye et al. Extrinsic calibration of a monocular camera and a single line scanning Lidar
Schönbein et al. Environmental Perception for Intelligent Vehicles Using Catadioptric Stereo Vision Systems.
Chenchen et al. A camera calibration method for obstacle distance measurement based on monocular vision
Gehrig et al. 6D vision goes fisheye for intersection assistance
CN112364793A (en) Target detection and fusion method based on long-focus and short-focus multi-camera vehicle environment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20210910