Method and device for realizing multi-angle camera monitoring effect
Technical Field
The invention relates to the technical field of fisheye lenses, and in particular to a method and a device for realizing a multi-angle camera monitoring effect.
Background
In current video monitoring systems, the viewing angle of a single lens is generally less than 60 degrees, so large-range panoramic monitoring is usually achieved with multiple cameras at multiple angles. However, a multi-camera, multi-angle monitoring scheme raises the construction and maintenance costs of the monitoring system. Alternatively, a single camera can be rotated to shoot several images that are then stitched into a panorama; this approach is not only cumbersome to operate but also cannot provide real-time, all-around monitoring.
Disclosure of Invention
The invention provides a method for realizing a multi-angle camera monitoring effect, which can reduce the cost of panoramic monitoring.
The invention also provides a device for realizing a multi-angle camera monitoring effect, which can likewise reduce the cost of panoramic monitoring.
The technical scheme of the invention is realized as follows:
A method for realizing a multi-angle camera monitoring effect comprises the following steps:
shooting a panoramic image with a panoramic camera;
setting a plurality of roaming angles for the panoramic image; and
obtaining a plurality of local correction maps according to each roaming angle.
A device for realizing a multi-angle camera monitoring effect comprises:
a presetting module, configured to set a plurality of roaming angles for a panoramic image shot by a panoramic camera; and
a correction module, configured to obtain a plurality of local correction maps according to each roaming angle.
Therefore, with the method and device for realizing a multi-angle camera monitoring effect provided by the invention, a plurality of local correction maps can be obtained from a panoramic image shot by a panoramic camera, one for each of a plurality of different roaming angles, and these local correction maps achieve the monitoring effect of a plurality of cameras at multiple angles.
Drawings
FIG. 1 is a flow chart of a method for realizing multi-angle camera monitoring effect according to the present invention;
FIG. 2 is a schematic diagram of the conversion from a fisheye image to a fisheye image correction map;
FIG. 3 is a schematic diagram illustrating an implementation effect of the first embodiment;
FIG. 4 is a schematic structural diagram of an apparatus for implementing multi-angle camera monitoring effect according to the present invention.
Detailed Description
The invention provides a method and a device for realizing a multi-angle camera monitoring effect based on a panoramic camera. The panoramic camera may adopt a fisheye camera or a reflecting mirror surface, where the reflecting mirror surface may be a parabolic mirror surface or a hyperbolic mirror surface.
Taking a fisheye camera as an example of the panoramic camera, a fisheye camera can by itself realize wide-range monitoring without blind spots. There are generally two mounting modes: ceiling-mounted and wall-mounted. When ceiling-mounted, the visible range of the fisheye camera is 360 degrees below the lens; when wall-mounted, the visible range is 180 degrees in front of the lens. The invention corrects the panoramic image (here, a fisheye image) shot by the fisheye camera for different roaming angles; each roaming angle corresponds to one fisheye image correction map, and the effect is equivalent to several ordinary cameras at multiple angles shooting images of different angles at the same time. For a ceiling-mounted camera, the roaming angle is horizontal and can be understood as lying in the plane formed by the X-axis and the Y-axis; for a wall-mounted camera, the roaming angle is vertical and can be understood as lying in the plane formed by the Y-axis and the Z-axis.
Fig. 1 is a flowchart of a method for realizing a multi-angle camera monitoring effect according to the present invention, which includes:
step 101: shooting a panoramic image through a panoramic camera;
step 102: setting a plurality of roaming angles for the panoramic image;
step 103: obtaining a plurality of local correction maps according to each roaming angle.
Step 103 may specifically comprise: determining a new optical center on the panoramic image according to each roaming angle, and locally correcting the panoramic image according to the new optical center to obtain a panoramic image correction map for that roaming angle.
The new optical center on the panoramic image may be determined from the roaming angle as follows:
the coordinate values of the new optical center are determined according to the roaming angle, a preset angle value, the radius of the projection sphere used for imaging by the panoramic camera, and a preset parameter value; the specific determination is described in detail below.
Through this process, a plurality of panoramic image correction maps can be obtained for each frame of the panoramic image, each correction map corresponding to one roaming angle; the visual effect of these correction maps is equivalent to images shot by several ordinary cameras whose shooting angles are the respective roaming angles.
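As a minimal sketch of steps 101-103 in code form (illustrative only: the panoramic image is treated abstractly, and the two per-angle operations are supplied as callables because their concrete forms are detailed further below; none of these names come from the patent):

```python
from typing import Callable, List, Sequence, Tuple

def multi_angle_views(
    panorama,
    roaming_angles: Sequence[float],
    pick_optical_center: Callable[[object, float], Tuple[float, float]],
    local_correction: Callable[[object, Tuple[float, float]], object],
) -> List[object]:
    """Return one locally corrected view per roaming angle (cf. steps 101-103)."""
    views = []
    for theta in roaming_angles:
        center = pick_optical_center(panorama, theta)      # new optical center for this angle
        views.append(local_correction(panorama, center))   # local correction around that center
    return views
```

Each returned view then plays the role of the image from one ordinary camera aimed at the corresponding roaming angle.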
For ease of understanding, the basic principle of panoramic image correction is described first. In the following, the panoramic image is taken to be a fisheye image.
A spatial three-dimensional rectangular coordinate system is adopted, and a spherical projection model of the fisheye image is as follows:
from the origin O (0,0,0), rays pointing to any point (x, y, z) in space, intersect the projection sphere x2+y2+z2=R2At one point, this point is projected onto a fixed plane perpendicular to the z-axis to obtain a plane point (u, v). The plane points (u, v) and (x, y, z) have the following relationships:
the projection (u, v) of each point (x, y, z) in the space on a given fixed plane is the imaging plane of the fisheye lens. By utilizing the inverse transformation of the model, the fisheye image can be transformed into an image on a perspective projection model, namely the fisheye image can be corrected.
The above introduces the principle of fisheye image correction. Because a fisheye image covers a large viewing angle and shows obvious distortion at the image edges, the invention improves the correction algorithm so that all new optical center points on the fisheye image lie on a circle within the fisheye image area, which makes the viewing angle of the fisheye image correction map natural.
Fig. 2 is a schematic diagram of the conversion from a fisheye image to a fisheye image correction map. The conversion process comprises the following steps:
Step 1: determine a new optical center on the fisheye image according to the roaming angle, specifically using the following formula:
wherein,
P0(u1, v1) is the new optical center on the fisheye image;
a preset angle value, held fixed, also enters the formula;
θ is the roaming angle;
viewratio is a preset parameter value.
Thus, with the preset angle value held fixed, taking different values of θ realizes the effect of monitoring from a plurality of angles, and the viewing angle of the corrected image remains natural. The parameter viewratio controls the position of the new optical center point P0(u1, v1) on the original fisheye image and ensures that every P0(u1, v1) lies on a circle of radius R × viewratio; viewratio takes a value in (0, 1).
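As a sketch of how Step 1 could be realized: the exact formula is not reproduced above, so the function below only enforces the stated constraint that every new optical center P0(u1, v1) lies on a circle of radius R × viewratio, parameterized by the roaming angle θ; the preset angle value is not modeled, and the function name and the polar parameterization are assumptions, not the patent's formula.

```python
import math

def new_optical_center(theta_deg: float, R: float, viewratio: float,
                       cx: float = 0.0, cy: float = 0.0) -> tuple:
    """Place the new optical center on a circle of radius R * viewratio.

    Only the stated constraint is implemented: with 0 < viewratio < 1, every
    P0(u1, v1) lies on that circle inside the fisheye image; (cx, cy) is the
    center of the fisheye circle. The patent's exact formula may differ.
    """
    r = R * viewratio
    u1 = cx + r * math.cos(math.radians(theta_deg))
    v1 = cy + r * math.sin(math.radians(theta_deg))
    return u1, v1
```

For example, new_optical_center(90, 512, 0.5) returns a point approximately equal to (0, 256) relative to the image center.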
Step 2: determine the point P on the projection sphere used for imaging by the panoramic camera that corresponds to P0(u1, v1), and obtain its coordinates.
Step 3: let the distance from the correction plane to the perspective center O (i.e. the center of the projection sphere) be kR, where k > 0 and R is the radius of the projection sphere; determine the point P' on the correction plane that corresponds to P0(u1, v1), namely P' = kP, and take P' as the optical center of the correction plane.
Step 4: determine two directions of the plane coordinate axes on the correction plane, represented by the vectors α and β and corresponding respectively to the positive directions of the u and v coordinate axes of the fisheye image. β is chosen as the upward tangent direction, at the point P, of the meridian passing through P on the projection sphere.
The constraints of the vectors α and β are:
alpha is perpendicular to beta;
α, β and OP constitute a left-hand coordinate system.
Wherein,
α=BP×AP,
β=OP×α,
wherein BP is the vector from starting point B to end point P, AP is the vector from starting point A to end point P, OP is the vector from starting point O to end point P, and "×" denotes the cross product of two vectors; the coordinates of A and B are as follows:
step 5, unitizing α and β, and determining the width, height and position of P' in the fisheye image correction map (i)0,j0) Generally, the center position of the image is defaulted.
Step 6: determine, in the following manner and in sequence, the color value of the point Q in space corresponding to each pixel (i, j) of the fisheye image correction map, so as to obtain the fisheye image correction map for the roaming angle:
determine the point Q in space corresponding to the pixel (i, j) as Q = (j - j0)α + (i - i0)β + kP; determine the intersection point of the ray OQ with the projection sphere and from it obtain the corresponding point (u, v) of Q in the fisheye image; then interpolate the color value at (u, v) to obtain the color value for Q.
After every point of the fisheye image correction map has been traversed in this way, one image correction is completed.
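The traversal of Steps 2-6 can be sketched as the following per-pixel remapping. It is an illustrative implementation under stated assumptions only: the orthographic sphere-to-image relation assumed earlier, R serving as both the sphere radius and the fisheye circle radius in pixels, and β built directly as the upward meridian tangent at P (the patent derives α and β from the auxiliary points A and B, whose coordinates are not reproduced here, so the sign convention, and hence the orientation of the output, may differ).

```python
import numpy as np

def fisheye_local_correction(fisheye: np.ndarray, u1: float, v1: float, R: float,
                             k: float = 1.0, out_h: int = 480, out_w: int = 640) -> np.ndarray:
    """Locally correct a fisheye image around the new optical center (u1, v1).

    Sketch of Steps 2-6 under the assumptions stated in the text above; it is
    not the patent's exact construction.
    """
    h, w = fisheye.shape[:2]
    cx, cy = (w - 1) / 2.0, (h - 1) / 2.0            # center of the fisheye circle

    # Step 2: point P on the projection sphere corresponding to P0(u1, v1)
    # (orthographic assumption: keep u1, v1 and lift onto the upper hemisphere).
    Pz = np.sqrt(max(R * R - u1 * u1 - v1 * v1, 0.0))
    P = np.array([u1, v1, Pz])

    # Step 4: beta = upward tangent of the meridian through P; alpha completes
    # an orthogonal frame with OP (sign convention assumed).
    beta = np.array([-Pz * u1, -Pz * v1, u1 * u1 + v1 * v1])
    if np.linalg.norm(beta) < 1e-9:                  # P at the pole: any tangent will do
        beta = np.array([0.0, 1.0, 0.0])
    beta = beta / np.linalg.norm(beta)               # Step 5: unitize
    alpha = np.cross(beta, P / np.linalg.norm(P))
    alpha = alpha / np.linalg.norm(alpha)

    # Step 5: the optical center of the correction map defaults to its center.
    i0, j0 = out_h / 2.0, out_w / 2.0

    # Step 6: per-pixel remapping Q = (j - j0)*alpha + (i - i0)*beta + k*P.
    out = np.zeros((out_h, out_w) + fisheye.shape[2:], dtype=np.float64)
    for i in range(out_h):
        for j in range(out_w):
            Q = (j - j0) * alpha + (i - i0) * beta + k * P
            S = R * Q / np.linalg.norm(Q)            # ray OQ meets the projection sphere at S
            x, y = cx + S[0], cy + S[1]              # orthographic drop back onto the fisheye image
            if 0 <= x < w - 1 and 0 <= y < h - 1:    # bilinear interpolation of the color value
                x0, y0 = int(x), int(y)
                dx, dy = x - x0, y - y0
                out[i, j] = ((1 - dx) * (1 - dy) * fisheye[y0, x0]
                             + dx * (1 - dy) * fisheye[y0, x0 + 1]
                             + (1 - dx) * dy * fisheye[y0 + 1, x0]
                             + dx * dy * fisheye[y0 + 1, x0 + 1])
    return out.astype(fisheye.dtype)
```

Combined with the optical-center sketch above, one corrected view per roaming angle is obtained by calling this function once per θ.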
A specific embodiment is described in detail below.
Embodiment 1:
in the present embodiment, a camera using a 180 ° fisheye lens is mounted in a ceiling type, and the visible range of the fisheye camera is 360 ° below the lens. In this embodiment, 8 roaming angles θ are set, and the values of the roaming angles θ are: 0 °, 45 °, 90 °, 135 °, 180 °, 225 °, 270 °, and 315 °.
For the fisheye images shot by the fisheye camera, the same frame of fisheye image is corrected separately for each of the different roaming angles, giving 8 corresponding fisheye image correction maps and thereby realizing the effect of multi-angle camera monitoring. The specific way of determining the new optical center and performing the correction for each roaming angle has been described above and is not repeated here.
Fig. 3 is a schematic diagram of the implementation effect of this embodiment. In Fig. 3, the picture at the center is a fisheye image captured by the fisheye camera; the 8 surrounding pictures, counted counterclockwise starting from the picture to the right of the fisheye image, are the fisheye image correction maps for θ equal to 0°, 45°, 90°, 135°, 180°, 225°, 270° and 315°, respectively. It can be seen that the correction maps are displayed at an eye-level angle and look natural.
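To make the embodiment concrete, the snippet below lists the eight roaming angles and, under the same assumed circle parameterization as in the earlier sketch, the new optical centers they would select (R and viewratio are example values only, not taken from the patent):

```python
import math

R, viewratio = 512, 0.5                               # example values, not from the patent
angles = [0, 45, 90, 135, 180, 225, 270, 315]         # the 8 roaming angles of Embodiment 1
for theta in angles:
    u1 = R * viewratio * math.cos(math.radians(theta))
    v1 = R * viewratio * math.sin(math.radians(theta))
    print(f"theta={theta:3d}  assumed new optical center = ({u1:7.1f}, {v1:7.1f})")
```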
The above embodiment covers the case in which the visible range of the fisheye camera is 360° below the lens. When the camera is wall-mounted, the visible range of a camera with a 180° fisheye lens is 180° in front of the lens. For this case, the way of realizing the multi-angle camera effect provided by the invention is slightly different and specifically comprises:
collecting fisheye images in real time through a fisheye camera;
taking the center of the fisheye image as the origin of the xy coordinate axes, the horizontal rightward direction as the positive x-axis, the vertical upward direction as the positive y-axis, and R as the radius of the central image circle, and selecting a new optical center (u, v) as the center of the new correction map, where u and v each take values in (-R, R); and
locally correcting the fisheye image according to the new optical center to obtain a fisheye image correction map.
On the fisheye image, n points are selected as new optical centers as required, and the same frame of the circular fisheye image is corrected separately for each of them to obtain n fisheye image correction maps, which is equivalent to the effect of shooting with n cameras.
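A sketch of how the n new optical centers might be chosen in the wall-mounted case: the patent only requires u, v in (-R, R) and leaves the placement free ("as required"), so the even spacing along the vertical axis below, matching the vertical roaming described earlier for wall-mounted cameras, is purely an illustrative choice.

```python
from typing import List, Tuple

def wall_mount_centers(n: int, R: float, margin: float = 0.8) -> List[Tuple[float, float]]:
    """Return n new optical centers (u, v), each inside the fisheye circle.

    Illustrative placement only: the centers are spread evenly along the
    vertical axis; any other choice with u, v in (-R, R) is equally valid.
    """
    if n == 1:
        return [(0.0, 0.0)]
    step = 2 * margin * R / (n - 1)
    return [(0.0, -margin * R + i * step) for i in range(n)]
```

Each returned (u, v) would then serve as the center of one new correction map, giving the effect of n cameras.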
The above describes the method for realizing a multi-angle camera monitoring effect. The invention further provides a device for realizing a multi-angle camera monitoring effect. Fig. 4 is a schematic structural diagram of the device, which includes:
a preset module 401, configured to set a plurality of roaming angles for a panoramic image captured by a panoramic camera;
a correction module 402, configured to obtain a plurality of local correction maps according to each roaming angle.
In the device, the correction module 402 determines a new optical center on the panoramic image according to each roaming angle, and locally corrects the panoramic image according to the new optical center to obtain a panoramic image correction map for that roaming angle.
The way for the correction module 402 to determine the new optical center on the panoramic image according to the roaming angle is:
determining the coordinate values of the new optical center according to the roaming angle, the preset angle value, the radius of the projection sphere used for imaging by the panoramic camera, and the preset parameter value.
Specifically, the following formula can be used for determination:
wherein,
P0(u1, v1) is the new optical center on the panoramic image;
a preset angle value, held fixed, also enters the formula;
θ is the roaming angle;
viewratio is a preset parameter value.
The correction module 402 may perform the local correction of the panoramic image according to the new optical center, to obtain the panoramic image correction map for the roaming angle, in the following way:
determining the point P on the projection sphere used for imaging by the panoramic camera that corresponds to P0(u1, v1), and obtaining its coordinates;
letting the distance from the correction plane to the center O of the projection sphere be kR, where k > 0 and R is the radius of the projection sphere; determining the point P' on the correction plane that corresponds to P0(u1, v1), namely P' = kP, and taking P' as the optical center of the correction plane;
two directions of the plane coordinate axes are determined on the correction plane, denoted by vectors alpha, beta, wherein,
α=BP×AP,
β=OP×α,
wherein BP is the vector from starting point B to end point P, AP is the vector from starting point A to end point P, OP is the vector from starting point O to end point P, and "×" denotes the cross product of two vectors; the coordinates of A and B are as follows:
unitizing α and β, and determining the width and height of the panoramic image correction map and the position (i0, j0) of P' in it;
determining, in the following manner and in sequence, the color value of the point Q in space corresponding to each pixel (i, j) of the panoramic image correction map, so as to obtain the panoramic image correction map for the roaming angle;
determining the point Q in space corresponding to the pixel (i, j) as Q = (j - j0)α + (i - i0)β + kP; determining the intersection point of the ray OQ with the projection sphere and from it obtaining the corresponding point (u, v) of Q in the panoramic image; and interpolating the color value at (u, v) to obtain the color value for Q.
In the device, the panoramic camera may adopt a fisheye lens or a reflecting mirror surface, where the reflecting mirror surface may be a parabolic mirror surface or a hyperbolic mirror surface.
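One possible software decomposition mirroring the preset module 401 and the correction module 402 is sketched below; the class names and the callable-based wiring are illustrative assumptions rather than part of the patent.

```python
from typing import Callable, List, Sequence

class PresetModule:
    """Counterpart of preset module 401: holds the roaming angles set for a panorama."""
    def __init__(self, roaming_angles: Sequence[float]):
        self.roaming_angles = list(roaming_angles)

class CorrectionModule:
    """Counterpart of correction module 402: one local correction map per roaming angle."""
    def __init__(self, correct_at_angle: Callable[[object, float], object]):
        self._correct_at_angle = correct_at_angle      # e.g. built on the per-pixel sketch above

    def correct(self, panorama, preset: PresetModule) -> List[object]:
        return [self._correct_at_angle(panorama, theta)
                for theta in preset.roaming_angles]
```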
In summary, the method and device for realizing a multi-angle camera monitoring effect provided by the invention determine, for each roaming angle, a new optical center on the panoramic image shot by the panoramic camera and correct the panoramic image according to that optical center, thereby obtaining a plurality of panoramic image correction maps respectively corresponding to the plurality of roaming angles; these correction maps achieve the monitoring effect of a plurality of cameras at multiple angles. Moreover, because the new optical centers determined by the plurality of roaming angles all lie on a circle within the panoramic image area, the viewing angle of each panoramic image correction map is natural.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.