
CN111784622A - Image splicing method based on monocular inclination of unmanned aerial vehicle and related device - Google Patents

Image splicing method based on monocular inclination of unmanned aerial vehicle and related device

Info

Publication number
CN111784622A
CN111784622A (application CN202010925730.8A)
Authority
CN
China
Prior art keywords
image
coordinate system
rotation matrix
photogrammetric
images
Legal status
Granted
Application number
CN202010925730.8A
Other languages
Chinese (zh)
Other versions
CN111784622B (en)
Inventor
袁睿
刘夯
Current Assignee
Chengdu Jouav Automation Technology Co ltd
Original Assignee
Chengdu Jouav Automation Technology Co ltd
Application filed by Chengdu Jouav Automation Technology Co ltd filed Critical Chengdu Jouav Automation Technology Co ltd
Priority to CN202010925730.8A
Publication of CN111784622A
Application granted
Publication of CN111784622B
Legal status: Active (Current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/40 Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T3/4038 Image mosaicing, e.g. composing plane images from plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20212 Image combination
    • G06T2207/20221 Image fusion; Image merging

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The application provides an image stitching method and a related device based on monocular tilting of an unmanned aerial vehicle, and relates to the field of unmanned aerial vehicle image stitching. The method comprises the following steps: acquiring a plurality of sequence images sent by a camera of an unmanned aerial vehicle; acquiring an image space-photographic rotation matrix of a first image, the first image being the first frame image in the plurality of sequence images; acquiring an image to be stitched for each sequence image in a photogrammetric coordinate system according to the image space-photographic rotation matrix; and acquiring a target image according to all the images to be stitched, the target image representing the image information acquired by the camera in the photogrammetric coordinate system. After the first frame of the sequence is processed with its corresponding image space-photographic rotation matrix, the subsequent sequence images determine their images to be stitched in the photogrammetric coordinate system, and all the images to be stitched use a consistent photogrammetric coordinate system, so that misalignment of the image stitching plane is avoided.

Description

Image splicing method based on monocular inclination of unmanned aerial vehicle and related device
Technical Field
The application relates to the field of image splicing of unmanned aerial vehicles, in particular to an image splicing method based on monocular inclination of an unmanned aerial vehicle and a related device.
Background
With the development of technologies such as automatic control and wireless transmission, unmanned aerial vehicles have advanced rapidly in both military and civil use. An unmanned aerial vehicle carrying a camera can quickly acquire ground images, which makes it convenient to monitor the condition of a ground area in real time. However, since the coverage of a single frame is limited, it is difficult to capture the entire photographed region in one image. In the field of photogrammetry, high-precision orthographic images, topographic maps and three-dimensional models can be obtained by post-processing the sequence images with dedicated software, but this is time-consuming and the results cannot be obtained in real time.
Video image stitching technology uses the sequence images transmitted by an unmanned aerial vehicle system in real time and incrementally builds a panoramic image of the flight area through a ground-side algorithm. Some flight tasks, such as highway inspection and river course inspection, do not allow the unmanned aerial vehicle to fly directly overhead, so the camera cannot stay in the nadir (vertically downward) state commonly used in surveying and mapping and must instead work in a tilted state. Single-camera oblique photography brings many problems: the ground plane cannot be fitted and the stitched images are seriously misaligned, and the homographic transformation relationship between the images is wrong, so the homography-transformed images are severely distorted and deformed.
Disclosure of Invention
In view of this, an object of the present application is to provide an image stitching method and a related apparatus based on monocular tilting of an unmanned aerial vehicle.
In order to achieve the purpose, the technical scheme adopted by the application is as follows:
in a first aspect, the present application provides an image stitching method based on monocular tilting of an unmanned aerial vehicle, the method including: acquiring a plurality of sequence images sent by a camera of an unmanned aerial vehicle; acquiring an image space-photographic rotation matrix of a first image; the first image is a first frame image in the multiple sequence images, and the image space-photography rotation matrix represents a rotation vector of an image space coordinate system of the camera relative to a photogrammetric coordinate system when the camera shoots the first image; acquiring images to be spliced of each sequence image under the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix; acquiring a target image according to all the images to be spliced; the target image characterizes image information acquired by the camera in the photogrammetric coordinate system.
In an optional embodiment, acquiring images to be stitched of each of the sequence images in the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix includes: acquiring displacement information of the first image and the second image; the second image is any one image except the first image in the plurality of sequence images; setting an image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction according to the displacement information; the x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive direction of the z axis of the reference coordinate system is the same as the negative direction of the z axis of the photogrammetric coordinate system; and acquiring the images to be spliced corresponding to each sequence image according to the reference coordinate system.
In an optional embodiment, obtaining the images to be stitched corresponding to each of the sequence images according to the reference coordinate system includes: acquiring inclination angle information of the second image; the inclination angle information represents a deviation angle between a Z axis of an image space coordinate system and a Z axis of the photogrammetric coordinate system when the camera shoots the second image; acquiring a coordinate rotation matrix according to the inclination angle information and the image space-photography rotation matrix; and carrying out coordinate transformation on the second image according to the coordinate rotation matrix to obtain an image to be spliced.
In an alternative embodiment, the tilt angle information is obtained by:
$$\theta = \arccos\!\left(\frac{\boldsymbol{n}_\alpha \cdot \boldsymbol{n}_p}{\|\boldsymbol{n}_\alpha\|\,\|\boldsymbol{n}_p\|}\right),\qquad \boldsymbol{n}_\alpha = \boldsymbol{R}\,\boldsymbol{n}_i$$

wherein $\theta$ is the tilt angle information of the Z axis of the image space coordinate system relative to the Z axis of the photogrammetric coordinate system; $\boldsymbol{R}$ is the rotation of the image space coordinate system relative to the photogrammetric coordinate system (the image space-photographic rotation matrix); $\boldsymbol{n}_i$ is the vector corresponding to the Z axis of the image space coordinate system in the image space coordinate system; $\boldsymbol{n}_\alpha$ is the vector corresponding to the Z axis of the image space coordinate system in the photogrammetric coordinate system; and $\boldsymbol{n}_p$ is the vector corresponding to the Z axis of the photogrammetric coordinate system in the photogrammetric coordinate system.
In an alternative embodiment, the coordinate rotation matrix is obtained by:
$$\boldsymbol{R}_{\mathrm{trans}} = \boldsymbol{I} + \sin\theta\,[\hat{\boldsymbol{n}}_\alpha]_\times + (1-\cos\theta)\,[\hat{\boldsymbol{n}}_\alpha]_\times^{2}$$

wherein $\boldsymbol{R}_{\mathrm{trans}}$ is the coordinate rotation matrix and $[\hat{\boldsymbol{n}}_\alpha]_\times$ is the antisymmetric matrix formed from the normalized $\boldsymbol{n}_\alpha$.
In an alternative embodiment, an image spatio-photographic rotation matrix of the first image is acquired, comprising: acquiring a first rotation matrix from the image space coordinate system to a camera coordinate system; the camera coordinate system is a space coordinate system with the center of the camera as an origin when the first image is shot; acquiring a second rotation matrix from the camera coordinate system to the unmanned aerial vehicle coordinate system; the unmanned aerial vehicle coordinate system is a space coordinate system with the center of the unmanned aerial vehicle as an origin when the first image is shot; acquiring a third rotation matrix from the unmanned aerial vehicle coordinate system to a geographic coordinate system; acquiring a fourth rotation matrix from the geographic coordinate system to the photogrammetric coordinate system; and multiplying the fourth rotation matrix, the third rotation matrix, the second rotation matrix and the first rotation matrix in sequence to obtain the image space-photography rotation matrix.
In a second aspect, the present application further provides an image stitching device based on unmanned aerial vehicle monocular inclination, the device includes: the communication module is used for acquiring a plurality of sequence images sent by a camera of the unmanned aerial vehicle; an acquisition module for acquiring an image space-photographic rotation matrix of a first image; the first image is a first frame image in the multiple sequence images, and the image space-photography rotation matrix represents a rotation vector of an image space coordinate system of the camera relative to a photogrammetric coordinate system when the camera shoots the first image; the processing module is used for acquiring images to be spliced of each sequence image under the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix; the processing module is also used for acquiring a target image according to all the images to be spliced; the target image characterizes image information acquired by the camera in the photogrammetric coordinate system.
In an optional embodiment, the processing module is further configured to obtain displacement information of the first image and the second image; the second image is any one image except the first image in the plurality of sequence images; the processing module is further configured to set an image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction according to the displacement information; the x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive direction of the z axis of the reference coordinate system is the same as the negative direction of the z axis of the photogrammetric coordinate system; and the processing module is also used for acquiring the images to be spliced corresponding to each sequence image according to the reference coordinate system.
In a third aspect, the present application provides an electronic device comprising a processor and a memory, the memory storing machine executable instructions executable by the processor to implement the method of any one of the preceding embodiments.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method of any one of the preceding embodiments.
Compared with the prior art, the application provides an image stitching method and a related device based on monocular tilting of an unmanned aerial vehicle, and relates to the field of unmanned aerial vehicle image stitching. The method comprises the following steps: acquiring a plurality of sequence images sent by a camera of an unmanned aerial vehicle; acquiring an image space-photographic rotation matrix of a first image, the first image being the first frame image in the plurality of sequence images, and the image space-photographic rotation matrix representing the rotation of the image space coordinate system of the camera relative to the photogrammetric coordinate system when the camera shoots the first image; acquiring an image to be stitched for each sequence image in the photogrammetric coordinate system according to the image space-photographic rotation matrix; and acquiring a target image according to all the images to be stitched, the target image representing the image information acquired by the camera in the photogrammetric coordinate system. After the first frame of the sequence is processed with its corresponding image space-photographic rotation matrix, the subsequent sequence images determine their images to be stitched in the photogrammetric coordinate system, and all the images to be stitched use a consistent photogrammetric coordinate system, so that misalignment of the image stitching plane is avoided.
In order to make the aforementioned objects, features and advantages of the present application more comprehensible, preferred embodiments accompanied with figures are described in detail below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required in the embodiments are briefly described below. It should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope; for those skilled in the art, other related drawings can be obtained from these drawings without inventive effort.
Fig. 1 is a schematic flowchart of an image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 2 is a schematic flowchart of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 3 is a schematic flowchart of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 5 is a schematic block diagram of an image stitching system based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 6 is a schematic block diagram of an image stitching device based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application;
fig. 7 is a block diagram of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations.
Thus, the following detailed description of the embodiments of the present application, presented in the accompanying drawings, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It is noted that relational terms such as "first" and "second," and the like, may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
In the current technical scheme, when the camera is in a tilted state the coordinate transformation relationship between the local map constructed by the algorithm and the stitching plane (namely, the ground) cannot be determined effectively, which causes errors in the image homography calculation and severe stitching distortion and misalignment.
In order to solve at least the above problems and the disadvantages of the background art, an embodiment of the present application provides an image stitching method based on monocular tilting of an unmanned aerial vehicle, please refer to fig. 1, where fig. 1 is a schematic flow diagram of the image stitching method based on monocular tilting of an unmanned aerial vehicle according to the embodiment of the present application, and the image stitching method may include the following steps:
and S31, acquiring a plurality of sequence images sent by the camera of the unmanned aerial vehicle.
It should be understood that the sequence images may carry a sequence identification so that position information can be obtained for each sequence image. For example, the position information and attitude data corresponding to each sequence image are acquired by a direct geo-positioning system: the position and attitude data provided by the satellite-inertial combined positioning system carried by the carrier or the task load are used to convert the photo information between different coordinate systems and to obtain the exterior orientation elements of the photo and the geographic position of the photographic target.
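For illustration only, the per-frame record that such a direct geo-positioning setup is assumed to deliver alongside each sequence image might look as follows; the structure and field names are assumptions made for this sketch and are not defined by the application.

```python
# A minimal sketch (assumed, not from the application) of the per-frame record
# delivered with each sequence image by a satellite-inertial combined
# positioning system; field names are illustrative only.
from dataclasses import dataclass

import numpy as np


@dataclass
class SequenceFrame:
    index: int            # sequence identification of the frame
    image: np.ndarray     # H x W x 3 image data
    lat: float            # carrier position: latitude (deg)
    lon: float            # carrier position: longitude (deg)
    alt: float            # carrier altitude (m)
    roll: float           # carrier attitude (rad)
    pitch: float
    yaw: float
    gimbal_roll: float    # camera attitude relative to the carrier (rad)
    gimbal_pitch: float
    gimbal_yaw: float
```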
S32, an image space-photographic rotation matrix of the first image is acquired.
The first image is the first frame image in the plurality of sequence images, and the image space-photographic rotation matrix represents the rotation of the image space coordinate system of the camera relative to the photogrammetric coordinate system when the camera shoots the first image. It should be understood that the image processing of the unmanned aerial vehicle involves various coordinate systems: an image space coordinate system, a camera coordinate system, a carrier coordinate system, a local navigation coordinate system, an image space auxiliary coordinate system, a photogrammetric coordinate system and the like. For example, a direct geo-location system on board the drone may be used to acquire an image space-photographic rotation matrix for each sequence image. In one possible case, the first image may also be another frame image of the plurality of sequence images.
And S33, acquiring images to be spliced of each sequence image in a photogrammetric coordinate system according to the image space-photogrammetric rotation matrix.
It can be understood that the image space-photographic rotation matrix is used to process each sequence image, so that the camera, even in a tilted state, is transformed from the image space coordinate system to the photogrammetric coordinate system; the x-y planes of the images before and after processing coincide, and image stitching errors are avoided.
And S34, acquiring a target image according to all the images to be spliced.
The target image represents image information acquired by the camera in the photogrammetric coordinate system. For example, according to the image stitching method provided by the embodiment of the present application, visual synchronous positioning and mapping (SLAM) may be used to define the reconstructed camera pose and three-dimensional structure information in an image space coordinate system, and the image space coordinate system is consistent with an image space coordinate system of a first frame sequential image in the plurality of sequential images at the time of initialization. And each sequence image uses the rotation matrix corresponding to the first frame image, and when the subsequent sequence images are subjected to SLAM reconstruction, the x-y plane of an image space coordinate system and the x-y plane of a photogrammetric coordinate system are kept coincident, so that the accuracy of an image splicing plane is ensured.
By using the image stitching method provided by the embodiment of the application, after the first frame sequence image is processed by using the corresponding image space-photographic rotation matrix, the subsequent sequence images determine the images to be stitched under the photogrammetric coordinate system, and all the images to be stitched use the consistent photogrammetric coordinate system, so that the image stitching plane is prevented from being dislocated.
In an alternative embodiment, in order to obtain an image to be stitched, a possible implementation manner is given on the basis of fig. 1, please refer to fig. 2, and fig. 2 is a schematic flowchart of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application, where the above S33 may include:
S331, obtaining displacement information of the first image and the second image.
The second image is any one of the plurality of sequence images other than the first image. For example, a sufficiently large displacement between the first image and the second image is required for initialization.
And S332, setting an image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction according to the displacement information.
The x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive direction of the z axis of the reference coordinate system is the same as the negative direction of the z axis of the photogrammetric coordinate system.
And S333, acquiring the images to be spliced corresponding to each sequence image according to the reference coordinate system.
For example, monocular initialization is first performed for the camera of the unmanned aerial vehicle: the sequence images are used as input to a visual SLAM, and the three-dimensional structure and the camera pose are continuously calculated by a structure-from-motion algorithm. The following scheme may be employed for the initialization of a single camera: triangulation is carried out to recover an initial three-dimensional structure from two sequence frames with sufficient displacement. Let the camera matrices corresponding to the first and second sequence frames be $P_1$ and $P_2$ respectively, and let the two frames share $n$ common-view three-dimensional points $X_i$ and $n$ common-view feature points $x_i$, $i = 1,\dots,n$. The displacement $T$ between the two frames can be expressed as:

$$T = \begin{bmatrix} R & t \end{bmatrix}$$

wherein $R$ is the relative rotation matrix between the two sequence frames (the first and the second), a $3 \times 3$ matrix, and $t$ is the relative displacement between the two frames, a $3 \times 1$ matrix; both are defined in the image space coordinate system. $P_1$ and $P_2$ are expressed by the following formulas, and the expression of $P_1$ shows that the initialization algorithm sets the image space coordinate system of the first frame as the coordinate system of the whole reconstruction process:

$$P_1 = K\begin{bmatrix} I & 0 \end{bmatrix},\qquad P_2 = K\begin{bmatrix} R & t \end{bmatrix}$$

where $K$ is the camera intrinsic matrix. The triangulation is expressed with the projection relation, i.e. the collinearity equations of photogrammetry:

$$\lambda_i\,x_i = P\,X_i$$

wherein $\lambda_i$ is the normalization factor, $i = 1,\dots,n$, so that the initial three-dimensional structure $X_i$ can be solved linearly.
It should be understood that using this $P_1$ and setting the image space coordinate system of the first frame as the coordinate system of the whole reconstruction process allows the subsequent sequence images to determine their images to be stitched in the photogrammetric coordinate system, and all the images to be stitched use a consistent photogrammetric coordinate system, thereby avoiding misalignment of the image stitching plane.
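As an illustration of the linear solve mentioned above, a minimal direct-linear-transform (DLT) triangulation sketch is given below; the intrinsic matrix and the relative pose are placeholder values, and the code is an assumed implementation rather than the application's own algorithm.

```python
# A minimal sketch of the linear (DLT) triangulation step: given camera
# matrices P1 = K[I | 0] and P2 = K[R | t] and one matched pixel pair
# (x1, x2), recover the common-view 3D point X_i. Values are illustrative.
import numpy as np


def triangulate_point(P1, P2, x1, x2):
    """Solve lambda * x = P * X linearly for one common-view point."""
    # Each view contributes two rows of the homogeneous system A X = 0,
    # obtained by eliminating the normalization factor lambda.
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X_h = Vt[-1]                 # null-space vector = homogeneous solution
    return X_h[:3] / X_h[3]      # de-homogenize to a 3D point


# Illustrative initialization pair: identity-pose first frame, unit baseline.
K = np.array([[1000.0, 0.0, 640.0],
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
R, t = np.eye(3), np.array([[1.0], [0.0], [0.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])    # P1 = K [I | 0]
P2 = K @ np.hstack([R, t])                           # P2 = K [R | t]
```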
In an optional embodiment, in order to obtain an image to be stitched, a possible implementation manner is provided on the basis of fig. 2, please refer to fig. 3, fig. 3 is a schematic flow diagram of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application, and the above S333 may include:
S333a, tilt angle information of the second image is acquired.
The tilt angle information represents a shift angle of a Z-axis of an image space coordinate system and a Z-axis of a photogrammetric coordinate system when the camera takes the second image. The tilt angle information may be acquired by:
$$\theta = \arccos\!\left(\frac{\boldsymbol{n}_\alpha \cdot \boldsymbol{n}_p}{\|\boldsymbol{n}_\alpha\|\,\|\boldsymbol{n}_p\|}\right),\qquad \boldsymbol{n}_\alpha = \boldsymbol{R}\,\boldsymbol{n}_i$$

wherein $\theta$ is the tilt angle information of the Z axis of the image space coordinate system relative to the Z axis of the photogrammetric coordinate system; $\boldsymbol{R}$ is the rotation of the image space coordinate system relative to the photogrammetric coordinate system (the image space-photographic rotation matrix); $\boldsymbol{n}_i$ is the vector corresponding to the Z axis of the image space coordinate system in the image space coordinate system; $\boldsymbol{n}_\alpha$ is the vector corresponding to the Z axis of the image space coordinate system in the photogrammetric coordinate system; and $\boldsymbol{n}_p$ is the vector corresponding to the Z axis of the photogrammetric coordinate system in the photogrammetric coordinate system. For example, the Z axis of the image space coordinate system is $\boldsymbol{n}_i = (0,0,1)^{T}$ in the image space coordinate system, the Z axis of the photogrammetric coordinate system is $\boldsymbol{n}_p = (0,0,1)^{T}$ in the photogrammetric coordinate system, and the image-space Z axis expressed in the photogrammetric coordinate system is $\boldsymbol{n}_\alpha = \boldsymbol{R}\,\boldsymbol{n}_i$, where $\boldsymbol{R}$ is the image space-photographic rotation matrix.
For example, when the tilt state of the single camera is described by the transformation from the image space coordinate system to the photogrammetric coordinate system, it is characterized by the size of the relative rotation angle $\theta$ between the Z axis of the image space coordinate system and the Z-axis direction of the photogrammetric coordinate system. When the camera is in the nadir (forward-shooting) state, its optical axis points vertically downward, which is parallel to but opposite to the Z axis of the photogrammetric coordinate system, so that $\theta = \pi$. When the camera is in a tilted state, and considering that the sky is not photographed in unmanned aerial vehicle aerial applications, the tilt angle $\theta$ satisfies:

$$\frac{\pi}{2} < \theta < \pi$$
it should be noted that, as can be seen from direct geographic positioning, the tilt state not only changes the pose of the drone due to the pose change of the camera, but also changes the tilt state, i.e., changes in the tilt angle information.
S333b, a coordinate rotation matrix is obtained from the tilt angle information and the image space-photography rotation matrix.
The coordinate rotation matrix $\boldsymbol{R}_{\mathrm{trans}}$ can be obtained by the following method:

$$\boldsymbol{R}_{\mathrm{trans}} = \boldsymbol{I} + \sin\theta\,[\hat{\boldsymbol{n}}_\alpha]_\times + (1-\cos\theta)\,[\hat{\boldsymbol{n}}_\alpha]_\times^{2}$$

wherein $\boldsymbol{R}_{\mathrm{trans}}$ is the coordinate rotation matrix and $[\hat{\boldsymbol{n}}_\alpha]_\times$ is the antisymmetric matrix formed from the normalized $\boldsymbol{n}_\alpha$.
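For illustration, a sketch of building such a correcting rotation is given below. The formula above is a Rodrigues-type construction from the tilt angle and the antisymmetric matrix of the normalized $\boldsymbol{n}_\alpha$; the variant below uses the standard construction that rotates $\hat{\boldsymbol{n}}_\alpha$ onto the target direction $(0,0,-1)^{T}$ (the nadir convention of the reference coordinate system) and should be read as an assumed implementation, not the application's exact formula.

```python
# A minimal sketch (assumed, not the application's exact formula) of building a
# rotation that takes the normalized n_alpha onto the target direction
# (0, 0, -1), so that the x-y plane of the reconstruction frame coincides with
# the photogrammetric x-y plane and its z axis is opposite to the photogrammetric z axis.
import numpy as np


def skew(v: np.ndarray) -> np.ndarray:
    """Antisymmetric (cross-product) matrix [v]_x of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])


def coordinate_rotation_matrix(n_alpha: np.ndarray) -> np.ndarray:
    """Rotation R_trans with R_trans @ (n_alpha / |n_alpha|) = (0, 0, -1)."""
    a = n_alpha / np.linalg.norm(n_alpha)      # normalized n_alpha
    b = np.array([0.0, 0.0, -1.0])             # target: opposite of the photogrammetric Z axis
    v = np.cross(a, b)                         # rotation axis (unnormalized)
    s = np.linalg.norm(v)                      # sin(theta)
    c = float(a @ b)                           # cos(theta)
    if s < 1e-12:                              # already aligned, or exactly opposite
        return np.eye(3) if c > 0 else np.diag([1.0, -1.0, -1.0])
    V = skew(v)
    # Rodrigues-type form: R = I + [v]_x + [v]_x^2 * (1 - cos) / sin^2
    return np.eye(3) + V + V @ V * ((1.0 - c) / (s * s))


# Sanity check: a tilted viewing direction is mapped onto (0, 0, -1).
n_alpha = np.array([0.3, -0.2, -0.9])
print(coordinate_rotation_matrix(n_alpha) @ (n_alpha / np.linalg.norm(n_alpha)))
```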
And S333c, performing coordinate transformation on the second image according to the coordinate rotation matrix to obtain the image to be spliced.
The monocular initialization described above is used here as well: the visual SLAM recovers the initial three-dimensional structure $X_i$ from two sufficiently displaced sequence frames with camera matrices $P_1$ and $P_2$, and the expression of $P_1$ sets the image space coordinate system of the first frame as the coordinate system of the whole reconstruction process.
At initialization, the relative rotation angle of the image space coordinate system is $\theta = \pi$ when the camera is in the nadir (forward-shooting) state: the x-y plane of the image space coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the z-axis directions are opposite. It should be noted that, in the current technical solution, if the camera is in the nadir state during SLAM initialization, a slight inclination in the subsequent sequence images does not change the initialized image space coordinate system, so the previously used stitching method can still work: in this case the x-y plane of the reconstructed image space coordinate system remains parallel to the x-y plane of the photogrammetric coordinate system, and the z axes remain parallel and opposite.
When $\theta$ satisfies $\frac{\pi}{2} < \theta < \pi$ and monocular initialization is used, the image space coordinate system in which the SLAM reconstruction lies, once converted into the photogrammetric coordinate system, has an x-y plane that is no longer parallel to the x-y plane of the photogrammetric coordinate system, and its z axis is no longer parallel either. If the prior-art method for determining the stitching plane is still used at this point, the x-y plane of the image space coordinate system and the x-y plane of the photogrammetric coordinate system no longer coincide, so the homography estimation of the images is wrong and the stitched images are distorted and misaligned.
The stitching misalignment and distortion are caused by the fact that the x-y plane of the image space coordinate system reconstructed by the SLAM algorithm no longer coincides with the x-y plane of the photogrammetric coordinate system, which leads to an erroneous estimate of the stitching plane. The main purpose of the algorithm is to make these two planes coincide: during mosaicking, the x-y plane of the coordinate system in which the SLAM reconstruction lies is kept coincident with the x-y plane of the photogrammetric coordinate system, and the z axes of the two coordinate systems are kept parallel.
The image space-photographic rotation matrix $\boldsymbol{R}$ is obtained by direct geolocation, and from it the included angle $\theta$ between the z axis of the image space coordinate system and the z axis of the photogrammetric coordinate system can be calculated as described above. Based on the vectors before and after rotation, the rotation matrix $\boldsymbol{R}_{\mathrm{trans}}$ can be solved, transforming the x-y plane of the image space coordinate system so that it coincides with the x-y plane of the photogrammetric coordinate system.
$\boldsymbol{n}_\alpha$ is normalized and denoted by $\hat{\boldsymbol{n}}_\alpha$. With the rotation from $\hat{\boldsymbol{n}}_\alpha$ to the Z-axis direction and the known angle $\theta$, the coordinate rotation matrix is obtained as:

$$\boldsymbol{R}_{\mathrm{trans}} = \boldsymbol{I} + \sin\theta\,[\hat{\boldsymbol{n}}_\alpha]_\times + (1-\cos\theta)\,[\hat{\boldsymbol{n}}_\alpha]_\times^{2}$$

wherein $[\hat{\boldsymbol{n}}_\alpha]_\times$ represents the antisymmetric matrix of $\hat{\boldsymbol{n}}_\alpha$. After $\boldsymbol{R}_{\mathrm{trans}}$ is obtained, the image stitching method provided by the embodiment of the application takes the $P_1$, $P_2$ and three-dimensional points $X_i$ obtained from the SLAM initialization and reconstruction and makes the image space coordinate system in which the SLAM reconstruction lies consistent with the photogrammetric coordinate system:

$$X_i^{\mathrm{new1}} = \boldsymbol{R}_{\mathrm{trans}}\,X_i$$

where $X_i^{\mathrm{new1}}$ are the three-dimensional points of the sequence images in the photogrammetric coordinate system, obtained by multiplying the three-dimensional points $X_i$ of the initialized sequence images by the coordinate rotation matrix $\boldsymbol{R}_{\mathrm{trans}}$. It should be understood that this makes the image space coordinate system in which the SLAM reconstruction space lies consistent with the photogrammetric coordinate system, and solves the problem that the computed plane is inconsistent with the actual stitching plane when the homographic transformation onto the stitching plane is computed during real-time stitching.
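A minimal sketch of applying this alignment, and of a homography-based warp onto the common stitching plane, is given below; it assumes OpenCV for the warp, uses the rotation-induced homography $H = K\,R_{\mathrm{trans}}\,K^{-1}$, and all function names and the exact choice of correcting rotation are illustrative assumptions rather than the application's defined interface.

```python
# A minimal sketch (assumed names) of aligning the SLAM reconstruction with the
# photogrammetric frame and warping one frame into the common stitching plane
# via the rotation-induced homography H = K * R_trans * K^{-1}.
import cv2
import numpy as np


def align_reconstruction(points: np.ndarray, R_trans: np.ndarray) -> np.ndarray:
    """Rotate all reconstructed 3D points: X_i_new = R_trans @ X_i (points: N x 3)."""
    return points @ R_trans.T


def warp_to_stitching_plane(image: np.ndarray, K: np.ndarray,
                            R_trans: np.ndarray) -> np.ndarray:
    """Warp one frame with the homography induced by the pure rotation R_trans."""
    H = K @ R_trans @ np.linalg.inv(K)
    h, w = image.shape[:2]
    return cv2.warpPerspective(image, H, (w, h))
```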
In an alternative embodiment, the coordinate rotation matrix described above may be replaced by the image space-photographic rotation matrix $\boldsymbol{R}$ obtained from direct geolocation, which likewise makes the image space coordinate system in which the SLAM reconstruction space lies consistent with the photogrammetric coordinate system:

$$X_i^{\mathrm{new2}} = \boldsymbol{R}\,X_i$$

where $X_i^{\mathrm{new2}}$ are the three-dimensional points of the sequence images in the local navigation coordinate system (namely, the geographic coordinate system), obtained by multiplying the three-dimensional points $X_i$ of the initialized sequence images by the image space-photographic rotation matrix; the finally obtained target image then represents the image information acquired by the unmanned aerial vehicle in the local navigation coordinate system.
In an alternative embodiment, in order to obtain the image space-photographic rotation matrix of each sequence image, a possible implementation is given on the basis of fig. 1; please refer to fig. 4, which is a schematic flowchart of another image stitching method based on monocular tilting of an unmanned aerial vehicle according to an embodiment of the present application. The above S32 may include:
S321, a first rotation matrix from the image space coordinate system to the camera coordinate system is obtained.
The camera coordinate system is a space coordinate system with the center of the camera as the origin at the time the first image is shot, and the first image is any one of the plurality of sequence images. For example, the transformation matrix (the first rotation matrix) from the image space coordinate system to the camera coordinate system may be denoted by $\boldsymbol{R}_i^c$.
S322, acquiring a second rotation matrix from the camera coordinate system to the unmanned aerial vehicle coordinate system.
The unmanned aerial vehicle coordinate system is a space coordinate system with the center of the unmanned aerial vehicle as the origin at the time the first image is shot. For example, the rotation matrix (the second rotation matrix) from the camera coordinate system to the unmanned aerial vehicle coordinate system may be denoted by $\boldsymbol{R}_c^b$.
S323, a third rotation matrix from the unmanned aerial vehicle coordinate system to the geographic coordinate system is obtained.
For example, the rotation matrix (the third rotation matrix) from the unmanned aerial vehicle coordinate system to the local navigation coordinate system (i.e. the geographic coordinate system) may be denoted by $\boldsymbol{R}_b^n$.
S324, a fourth rotation matrix from the geographic coordinate system to the photogrammetric coordinate system is obtained.
For example, the rotation matrix (the fourth rotation matrix) from the local navigation coordinate system (i.e. the geographic coordinate system) to the photogrammetric coordinate system may be denoted by $\boldsymbol{R}_n^m$.
And S325, multiplying the fourth rotation matrix, the third rotation matrix, the second rotation matrix and the first rotation matrix in sequence to obtain an image space-shooting rotation matrix.
For example, the image space-photographic rotation matrix $\boldsymbol{R}_i^m$ can be obtained as follows:

$$\boldsymbol{R}_i^m = \boldsymbol{R}_n^m\,\boldsymbol{R}_b^n\,\boldsymbol{R}_c^b\,\boldsymbol{R}_i^c$$

wherein $\boldsymbol{R}_i^m$ refers to the rotation matrix from the image space coordinate system to the image space auxiliary coordinate system, which is also the rotation matrix from the image space coordinate system to the photogrammetric coordinate system: the image space auxiliary coordinate system and the photogrammetric coordinate system have different coordinate origins but parallel axes, so the rotation matrices from the image space coordinate system to the two coordinate systems are equal; $\boldsymbol{R}_n^m$ denotes the rotation matrix from the local navigation coordinate system (i.e. the geographic coordinate system) to the photogrammetric coordinate system; $\boldsymbol{R}_b^n$ refers to the rotation matrix from the carrier coordinate system (i.e. the coordinate system of the unmanned aerial vehicle) to the local navigation coordinate system (i.e. the geographic coordinate system); $\boldsymbol{R}_c^b$ refers to the rotation matrix from the camera coordinate system to the carrier coordinate system (i.e. the coordinate system of the unmanned aerial vehicle); and $\boldsymbol{R}_i^c$ refers to the transformation matrix from the image space coordinate system to the camera coordinate system.
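A minimal sketch of this composition is given below; the helper that builds the carrier-to-navigation factor from roll/pitch/yaw uses one common but convention-dependent parameterization and is an assumption, not something specified by the application.

```python
# A minimal sketch of composing the image space-photographic rotation matrix
# R_i^m = R_n^m @ R_b^n @ R_c^b @ R_i^c (image space -> camera -> carrier ->
# local navigation -> photogrammetric). Angle conventions are assumed.
import numpy as np


def Rx(a):
    return np.array([[1, 0, 0], [0, np.cos(a), -np.sin(a)], [0, np.sin(a), np.cos(a)]])


def Ry(a):
    return np.array([[np.cos(a), 0, np.sin(a)], [0, 1, 0], [-np.sin(a), 0, np.cos(a)]])


def Rz(a):
    return np.array([[np.cos(a), -np.sin(a), 0], [np.sin(a), np.cos(a), 0], [0, 0, 1]])


def carrier_to_navigation(roll, pitch, yaw):
    """One common (convention-dependent) choice for R_b^n from carrier attitude."""
    return Rz(yaw) @ Ry(pitch) @ Rx(roll)


def image_to_photogrammetric(R_i_c, R_c_b, R_b_n, R_n_m):
    """Multiply the fourth, third, second and first rotation matrices in order."""
    return R_n_m @ R_b_n @ R_c_b @ R_i_c
```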
For the above image stitching method, an embodiment of the present application provides an image stitching system based on monocular tilting of an unmanned aerial vehicle, please refer to fig. 5, and fig. 5 is a block schematic diagram of the image stitching system based on monocular tilting of the unmanned aerial vehicle provided by the embodiment of the present application, where the image stitching system includes an unmanned aerial vehicle sensor system, a camera system, a direct geographic positioning system, and an image stitching module. The system comprises an unmanned aerial vehicle carrier or a task load, a satellite-inertial navigation combined positioning system, a camera system, a direct geographic positioning system and an image splicing module, wherein the satellite-inertial navigation combined positioning system can provide position and attitude data, the camera system collects ground images (sequence images), the direct geographic positioning system can calculate the transformation relation from a current photo (an image space coordinate system corresponding to the sequence images) to a geodetic measurement coordinate system (so as to obtain an image space-photography rotation matrix), and the image splicing module utilizes the data to splice and obtain a flight area panoramic image (a target image) in real time; the problems of perspective transformation, image distortion, splicing dislocation and the like of the oblique image are solved.
It should be understood that the image stitching method based on monocular tilting of the unmanned aerial vehicle provided by the embodiment of the present application solves the problems of serious image stitching misalignment and poor precision when the camera is tilted at a large angle (0° to 90°) relative to the ground; and the image stitching system shown in fig. 5 can realize the stitching of flight-area images from an oblique camera. In one possible embodiment, the SLAM described above can be replaced with a structure-from-motion (SFM) three-dimensional reconstruction system.
According to the image splicing method based on the monocular inclination of the unmanned aerial vehicle, the camera of the unmanned aerial vehicle can shoot the target object (such as a pipeline, a river channel, a road and the like) in an inclined state, and the unmanned aerial vehicle can fly at the side of the target object without flying above the pipeline, the river channel, the road and the like when executing a flight inspection task, so that the potential flight safety problem is avoided. It should be noted that the image stitching method provided by the embodiment of the present application may introduce a direct geographic positioning system, so as to process the sequence image according to the image space-photography rotation matrix, and solve the problem that the current image stitching scheme cannot process oblique images.
In order to implement the image stitching method provided in the foregoing embodiment, an embodiment of the present application provides an image stitching device based on monocular tilting of an unmanned aerial vehicle, please refer to fig. 6, where fig. 6 is a schematic block diagram of the image stitching device based on monocular tilting of an unmanned aerial vehicle provided in the embodiment of the present application, and the image stitching device 40 includes: a communication module 41, an acquisition module 42 and a processing module 43.
The communication module 41 is configured to acquire a plurality of sequence images sent by a camera of the unmanned aerial vehicle; the acquisition module 42 is configured to acquire an image space-photographic rotation matrix of the first image; the first image is a first frame image in the multiple sequence images, and the image space-photography rotation matrix represents a rotation vector of an image space coordinate system of the camera relative to a photogrammetric coordinate system when the camera shoots the first image; the processing module 43 is configured to obtain images to be stitched of each of the sequence images in the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix; the processing module 43 is further configured to obtain a target image according to all the images to be stitched; the target image characterizes image information acquired by the camera in the photogrammetric coordinate system.
In an alternative embodiment, the processing module 43 is further configured to obtain displacement information of the first image and the second image; the second image is any one image except the first image in the plurality of sequence images; the processing module 43 is further configured to set an image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction according to the displacement information; the x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive direction of the z axis of the reference coordinate system is the same as the negative direction of the z axis of the photogrammetric coordinate system; and the processing module is also used for acquiring the images to be spliced corresponding to each sequence image according to the reference coordinate system.
It should be understood that the communication module 41, the obtaining module 42 and the processing module 43 may cooperatively implement the corresponding S31-S34 and possible sub-steps thereof of the image stitching method described above.
An electronic device is provided in an embodiment of the present application, and as shown in fig. 7, fig. 7 is a block schematic diagram of an electronic device provided in an embodiment of the present application. The electronic device 60 comprises a memory 61, a processor 62 and a communication interface 63. The memory 61, processor 62 and communication interface 63 are electrically connected to each other, directly or indirectly, to enable transmission or interaction of data. For example, the components may be electrically connected to each other via one or more communication buses or signal lines. The memory 61 may be configured to store software programs and modules, such as program instructions/modules corresponding to the image stitching method based on monocular tilting of the unmanned aerial vehicle provided in the embodiment of the present application, and the processor 62 executes various functional applications and data processing by executing the software programs and modules stored in the memory 61. The communication interface 63 may be used for communicating signaling or data with other node devices. The electronic device 60 may have a plurality of communication interfaces 63 in this application.
The Memory 61 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Read-Only Memory (EPROM), an electrically Erasable Read-Only Memory (EEPROM), and the like.
The processor 62 may be an integrated circuit chip having signal processing capabilities. The processor may be a general-purpose processor including a Central Processing Unit (CPU), a Network Processor (NP), and the like; but may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic, discrete hardware components, etc.
Electronic device 60 may implement any of the image stitching methods based on unmanned aerial vehicle monocular tilting provided herein. The electronic device 60 may be, but is not limited to, a cell phone, a tablet computer, a notebook computer, a server, or other electronic device with processing capabilities.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method can be implemented in other ways. The apparatus embodiments described above are merely illustrative, and for example, the flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of apparatus, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
In addition, functional modules in the embodiments of the present application may be integrated together to form an independent part, or each module may exist separately, or two or more modules may be integrated to form an independent part.
The functions, if implemented in the form of software functional modules and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
In conclusion, the application provides an image stitching method and a related device based on monocular tilting of an unmanned aerial vehicle, and relates to the field of unmanned aerial vehicle image stitching. The method comprises the following steps: acquiring a plurality of sequence images sent by a camera of an unmanned aerial vehicle; acquiring an image space-photographic rotation matrix of a first image, the first image being the first frame image in the plurality of sequence images, and the image space-photographic rotation matrix representing the rotation of the image space coordinate system of the camera relative to the photogrammetric coordinate system when the camera shoots the first image; acquiring an image to be stitched for each sequence image in the photogrammetric coordinate system according to the image space-photographic rotation matrix; and acquiring a target image according to all the images to be stitched, the target image representing the image information acquired by the camera in the photogrammetric coordinate system. After the first frame of the sequence is processed with its corresponding image space-photographic rotation matrix, the subsequent sequence images determine their images to be stitched in the photogrammetric coordinate system, and all the images to be stitched use a consistent photogrammetric coordinate system, so that misalignment of the image stitching plane is avoided.
The above description is only a preferred embodiment of the present application and is not intended to limit the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (10)

1. An image stitching method based on unmanned aerial vehicle monocular inclination is characterized by comprising the following steps:
acquiring a plurality of sequence images sent by a camera of an unmanned aerial vehicle;
acquiring an image space-photographic rotation matrix of a first image; the first image is a first frame image in the multiple sequence images, and the image space-photography rotation matrix represents a rotation vector of an image space coordinate system of the camera relative to a photogrammetric coordinate system when the camera shoots the first image;
acquiring images to be spliced of each sequence image under the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix;
acquiring a target image according to all the images to be spliced; the target image characterizes image information acquired by the camera in the photogrammetric coordinate system.
2. The method according to claim 1, wherein acquiring the images to be stitched of each of the sequence images in the photogrammetric coordinate system according to the image space-photogrammetric rotation matrix comprises:
acquiring displacement information of the first image and the second image; the second image is any one image except the first image in the plurality of sequence images;
setting an image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction according to the displacement information; the x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive direction of the z axis of the reference coordinate system is the same as the negative direction of the z axis of the photogrammetric coordinate system;
and acquiring the images to be spliced corresponding to each sequence image according to the reference coordinate system.
3. The method according to claim 2, wherein obtaining the images to be stitched corresponding to each of the sequence images according to the reference coordinate system comprises:
acquiring inclination angle information of the second image; the inclination angle information represents a deviation angle between a Z axis of an image space coordinate system and a Z axis of the photogrammetric coordinate system when the camera shoots the second image;
acquiring a coordinate rotation matrix according to the inclination angle information and the image space-photography rotation matrix;
and carrying out coordinate transformation on the second image according to the coordinate rotation matrix to obtain an image to be spliced.
4. The method of claim 3, wherein the tilt angle information is obtained by:
$$\theta = \arccos\!\left(\frac{\boldsymbol{n}_\alpha \cdot \boldsymbol{n}_p}{\|\boldsymbol{n}_\alpha\|\,\|\boldsymbol{n}_p\|}\right),\qquad \boldsymbol{n}_\alpha = \boldsymbol{R}\,\boldsymbol{n}_i$$

wherein $\theta$ is the tilt angle information of the Z axis of the image space coordinate system relative to the Z axis of the photogrammetric coordinate system; $\boldsymbol{R}$ is the rotation of the image space coordinate system relative to the photogrammetric coordinate system (the image space-photographic rotation matrix); $\boldsymbol{n}_i$ is the vector corresponding to the Z axis of the image space coordinate system in the image space coordinate system; $\boldsymbol{n}_\alpha$ is the vector corresponding to the Z axis of the image space coordinate system in the photogrammetric coordinate system; and $\boldsymbol{n}_p$ is the vector corresponding to the Z axis of the photogrammetric coordinate system in the photogrammetric coordinate system.
5. The method of claim 4, wherein the coordinate rotation matrix is obtained by:
R_c = I + sin(θ) · K + (1 − cos(θ)) · K²,
wherein R_c is the coordinate rotation matrix; θ is the inclination angle information; I is the identity matrix; and K is the antisymmetric matrix constructed from n_p after normalization.
6. The method of claim 1, wherein acquiring the image space-photographic rotation matrix of the first image comprises:
acquiring a first rotation matrix from the image space coordinate system to a camera coordinate system; the camera coordinate system is a space coordinate system with the center of the camera as the origin when the first image is shot;
acquiring a second rotation matrix from the camera coordinate system to an unmanned aerial vehicle coordinate system; the unmanned aerial vehicle coordinate system is a space coordinate system with the center of the unmanned aerial vehicle as the origin when the first image is shot;
acquiring a third rotation matrix from the unmanned aerial vehicle coordinate system to a geographic coordinate system;
acquiring a fourth rotation matrix from the geographic coordinate system to the photogrammetric coordinate system;
and multiplying the fourth rotation matrix, the third rotation matrix, the second rotation matrix and the first rotation matrix in sequence to obtain the image space-photographic rotation matrix.
7. An image stitching apparatus based on monocular inclination of an unmanned aerial vehicle, characterized in that the apparatus comprises:
a communication module configured to acquire a plurality of sequence images sent by a camera of the unmanned aerial vehicle;
an acquisition module configured to acquire an image space-photographic rotation matrix of a first image; the first image is a first frame image in the plurality of sequence images, and the image space-photographic rotation matrix represents a rotation vector of an image space coordinate system of the camera relative to a photogrammetric coordinate system when the camera shoots the first image;
and a processing module configured to acquire images to be stitched of each of the sequence images in the photogrammetric coordinate system according to the image space-photographic rotation matrix;
the processing module is further configured to acquire a target image according to all the images to be stitched; the target image represents image information acquired by the camera in the photogrammetric coordinate system.
8. The apparatus of claim 7, wherein the processing module is further configured to acquire displacement information between the first image and a second image; the second image is any one of the plurality of sequence images other than the first image;
the processing module is further configured to set, according to the displacement information, the image space coordinate system corresponding to the first image as a reference coordinate system for image reconstruction; the x-y plane of the reference coordinate system is parallel to the x-y plane of the photogrammetric coordinate system, and the positive z-axis direction of the reference coordinate system coincides with the negative z-axis direction of the photogrammetric coordinate system;
and the processing module is further configured to acquire the images to be stitched corresponding to each of the sequence images according to the reference coordinate system.
9. An electronic device, comprising a processor and a memory, the memory storing machine-executable instructions that are executable by the processor to implement the method of any one of claims 1 to 6.
10. A computer-readable storage medium, on which a computer program is stored, wherein the computer program, when executed by a processor, implements the method of any one of claims 1 to 5.
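
Illustrative example (Python, numpy). The sketch below illustrates one possible reading of the computations described in claims 4 to 6: multiplying the four rotations in sequence to obtain the image space-photographic rotation matrix, measuring the inclination angle between the two Z axes, and building a coordinate rotation matrix with Rodrigues' formula (the formula named in the non-patent citations). It is a minimal sketch, not the patented implementation: the ZYX Euler convention, the example mounting and attitude angles, and the choice of n_p × n_z as the Rodrigues rotation axis are assumptions made here and are not specified in the claims.

# Illustrative sketch only; NOT the patented implementation.
import numpy as np

def euler_zyx(yaw, pitch, roll):
    """Rotation matrix from ZYX Euler angles in radians (assumed convention)."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0.0], [sz, cz, 0.0], [0.0, 0.0, 1.0]])
    Ry = np.array([[cy, 0.0, sy], [0.0, 1.0, 0.0], [-sy, 0.0, cy]])
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cx, -sx], [0.0, sx, cx]])
    return Rz @ Ry @ Rx

def skew(v):
    """Antisymmetric (skew-symmetric) matrix of a 3-vector."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def image_space_to_photogrammetric(r1, r2, r3, r4):
    """Claim 6: chain image space -> camera -> UAV -> geographic -> photogrammetric."""
    return r4 @ r3 @ r2 @ r1

def inclination_angle(r):
    """Claim 4: angle between the image-space Z axis (mapped into the
    photogrammetric frame by r) and the photogrammetric Z axis."""
    n_i = np.array([0.0, 0.0, 1.0])   # Z axis of the image space frame, in that frame
    n_z = np.array([0.0, 0.0, 1.0])   # Z axis of the photogrammetric frame, in that frame
    n_p = r @ n_i                     # image-space Z axis in the photogrammetric frame
    cos_t = n_p @ n_z / (np.linalg.norm(n_p) * np.linalg.norm(n_z))
    return np.arccos(np.clip(cos_t, -1.0, 1.0)), n_p, n_z

def coordinate_rotation(theta, axis):
    """Rodrigues' formula R = I + sin(theta)*K + (1 - cos(theta))*K^2 about a
    unit axis; the axis passed in below is an assumption made here."""
    axis = axis / np.linalg.norm(axis)
    k = skew(axis)
    return np.eye(3) + np.sin(theta) * k + (1.0 - np.cos(theta)) * (k @ k)

if __name__ == "__main__":
    # Hypothetical mounting and attitude angles, purely for demonstration.
    r1 = euler_zyx(0.0, np.pi, 0.0)              # image space -> camera (assumed)
    r2 = euler_zyx(0.0, np.radians(-30.0), 0.0)  # camera -> UAV: oblique gimbal pitch
    r3 = euler_zyx(np.radians(45.0), 0.0, 0.0)   # UAV -> geographic: heading only
    r4 = np.eye(3)                               # geographic -> photogrammetric (assumed aligned)
    r = image_space_to_photogrammetric(r1, r2, r3, r4)
    theta, n_p, n_z = inclination_angle(r)
    r_c = coordinate_rotation(theta, np.cross(n_p, n_z))
    print("inclination angle (deg):", np.degrees(theta))
    print("rectified Z axis:", np.round(r_c @ n_p, 6))   # approximately [0, 0, 1]

Running the example prints the inclination angle and shows that the resulting coordinate rotation matrix maps the oblique image-space Z axis onto the photogrammetric Z axis, so that, under these assumptions, the transformed images to be stitched share the common photogrammetric coordinate system referred to in claims 2 and 3.
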
CN202010925730.8A 2020-09-07 2020-09-07 Image splicing method based on monocular inclination of unmanned aerial vehicle and related device Active CN111784622B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010925730.8A CN111784622B (en) 2020-09-07 2020-09-07 Image splicing method based on monocular inclination of unmanned aerial vehicle and related device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010925730.8A CN111784622B (en) 2020-09-07 2020-09-07 Image splicing method based on monocular inclination of unmanned aerial vehicle and related device

Publications (2)

Publication Number Publication Date
CN111784622A true CN111784622A (en) 2020-10-16
CN111784622B CN111784622B (en) 2021-01-26

Family

ID=72762326

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010925730.8A Active CN111784622B (en) 2020-09-07 2020-09-07 Image splicing method based on monocular inclination of unmanned aerial vehicle and related device

Country Status (1)

Country Link
CN (1) CN111784622B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1693851A (en) * 2005-06-08 2005-11-09 中国科学院上海技术物理研究所 Aviation linear array CCD image geometric rough correct algorithm
US8249390B2 (en) * 2006-09-04 2012-08-21 Samsung Electronics Co., Ltd. Method for taking panorama mosaic photograph with a portable terminal
CN101354250A (en) * 2008-09-08 2009-01-28 中国测绘科学研究院 Combined wide angle aviation digital camera system with self-checking self-stabilization function
CN102829763A (en) * 2012-07-30 2012-12-19 中国人民解放军国防科学技术大学 Pavement image collecting method and system based on monocular vision location
CN105627991A (en) * 2015-12-21 2016-06-01 武汉大学 Real-time panoramic stitching method and system for unmanned aerial vehicle images
CN109269430A (en) * 2018-08-12 2019-01-25 浙江农林大学 The more plants of standing tree diameter of a cross-section of a tree trunk 1.3 meters above the ground passive measurement methods based on depth extraction model
CN109725340A (en) * 2018-12-31 2019-05-07 成都纵横大鹏无人机科技有限公司 Direct geographic positioning and device
CN109724625A (en) * 2019-01-22 2019-05-07 中国人民解放军61540部队 A kind of aberration correcting method of the compound large area array mapping camera of optics
CN110648283A (en) * 2019-11-27 2020-01-03 成都纵横大鹏无人机科技有限公司 Image splicing method and device, electronic equipment and computer readable storage medium
CN110717861A (en) * 2019-12-12 2020-01-21 成都纵横大鹏无人机科技有限公司 Image splicing method and device, electronic equipment and computer readable storage medium

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
VEHICLEMAN: "Fourteen Lectures on Visual SLAM (Part 1): Rotation Matrices and Rotation Vectors [Rodrigues' Formula]", HTTPS://WWW.CNBLOGS.COM/LIUHUACAI/P/12093770.HTML *
LIU YAOWEN: "Research and Implementation of Fast Stitching Technology for Micro-UAV Remote Sensing Images", China Master's Theses Full-text Database, Basic Sciences Series *
GAO XIANG: "Research on Fast Stitching Methods for UAV Images", China Master's Theses Full-text Database, Information Science and Technology Series *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115597516A (en) * 2022-10-26 2023-01-13 华能澜沧江水电股份有限公司(Cn) High-precision geological crack monitoring method and system based on unmanned aerial vehicle image

Also Published As

Publication number Publication date
CN111784622B (en) 2021-01-26

Similar Documents

Publication Publication Date Title
CN110717861B (en) Image splicing method and device, electronic equipment and computer readable storage medium
CN111784585B (en) Image splicing method and device, electronic equipment and computer readable storage medium
US10789673B2 (en) Post capture imagery processing and deployment systems
CN110799921A (en) Shooting method and device and unmanned aerial vehicle
CN110675450A (en) Method and system for generating orthoimage in real time based on SLAM technology
CN111829532B (en) Aircraft repositioning system and method
CN110703805B (en) Method, device and equipment for planning three-dimensional object surveying and mapping route, unmanned aerial vehicle and medium
CN114565863B (en) Real-time generation method, device, medium and equipment for orthophoto of unmanned aerial vehicle image
WO2018214778A1 (en) Method and device for presenting virtual object
CN113361365A (en) Positioning method and device, equipment and storage medium
US8509522B2 (en) Camera translation using rotation from device
WO2020198963A1 (en) Data processing method and apparatus related to photographing device, and image processing device
CN108801225B (en) Unmanned aerial vehicle oblique image positioning method, system, medium and equipment
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
CN113034347A (en) Oblique photographic image processing method, device, processing equipment and storage medium
CN111784622B (en) Image splicing method based on monocular inclination of unmanned aerial vehicle and related device
CN116170689A (en) Video generation method, device, computer equipment and storage medium
CN111598930B (en) Color point cloud generation method and device and terminal equipment
CN113920192A (en) Visual positioning method, device, server and computer readable storage medium
CN113436267A (en) Visual inertial navigation calibration method and device, computer equipment and storage medium
CN113129422A (en) Three-dimensional model construction method and device, storage medium and computer equipment
CN110148205B (en) Three-dimensional reconstruction method and device based on crowdsourcing image
WO2023062994A1 (en) Learning device, learning method, learning program, camera parameter calculating device, camera parameter calculating method, and camera parameter calculating program
KR102225321B1 (en) System and method for building road space information through linkage between image information and position information acquired from a plurality of image sensors
CN114187344A (en) Map construction method, device and equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP03 Change of name, title or address

Address after: 7 / F, area a, building 6, No. 200, Tianfu 5th Street, high tech Zone, Chengdu, Sichuan 610000

Patentee after: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region after: China

Address before: Room 801-805, 8th floor, Building A, No. 200, Tianfu Wujie, Chengdu High-tech Zone, Sichuan Province, 610000

Patentee before: CHENGDU JOUAV AUTOMATION TECHNOLOGY Co.,Ltd.

Country or region before: China