
CN113487676B - Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity - Google Patents

Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity

Info

Publication number
CN113487676B
CN113487676B (application CN202110926887.7A)
Authority
CN
China
Prior art keywords
camera
angle
images
image
pair
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110926887.7A
Other languages
Chinese (zh)
Other versions
CN113487676A (en)
Inventor
周珣
谢远帆
王亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Baidu Netcom Science and Technology Co Ltd
Original Assignee
Beijing Baidu Netcom Science and Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202110926887.7A
Publication of CN113487676A
Application granted
Publication of CN113487676B
Legal status: Active
Anticipated expiration


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/30 - Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 - Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)
  • Image Analysis (AREA)

Abstract

According to exemplary implementations of the present disclosure, a method and apparatus for determining a relative pose angle between cameras mounted to an acquisition entity are provided. Specifically, a method for determining a relative pose angle between cameras is provided. The method comprises the following steps: obtaining a first set of images from a first camera and a second set of images from a second camera, respectively; acquiring a first pair of poles associated with the first camera based on two images in the first set of images; acquiring a second pair of poles associated with the second camera based on two images in the second set of images; and determining a relative pose angle between the first camera and the second camera based on the first pair of poles and the second pair of poles. According to exemplary implementations of the present disclosure, an apparatus, a device, and a computer storage medium for determining a relative pose angle between cameras are also provided.

Description

Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity
The present application is a divisional application of the invention patent application with application number 201810225217.0, filed in March 2018 and entitled "Method and apparatus for determining relative attitude angle between cameras mounted on acquisition entities".
Technical Field
Implementations of the present disclosure relate generally to positioning of cameras and, more particularly, relate to methods, apparatuses, devices, and computer storage media for determining a relative pose angle between cameras.
Background
With the development of imaging technology, cameras are widely used for image acquisition in various fields. The acquired images may then be applied in many fields such as mobile robotics, automotive electronics, and autonomous driving, and used as a basis for subsequent processing. To acquire image data more comprehensively, multiple cameras may be installed on an acquisition entity (e.g., an acquisition vehicle). The images from the plurality of cameras may be used for visual processing and perception. In subsequent image processing, the relative pose (including position (x, y, z) and pose angle (pitch angle, yaw angle, roll angle)) between the plurality of cameras first needs to be determined in order to further process the images from the respective cameras.
In general, conventional solutions for acquiring the relative pose between two cameras rely to a large extent on placing a calibration object in the natural environment, or can only determine the relative pose between two cameras that satisfy specific positional requirements. It is therefore desirable to provide a solution for determining the relative attitude angle between cameras in a more convenient and efficient manner.
Disclosure of Invention
According to an example implementation of the present disclosure, a scheme for determining a relative pose angle between cameras is provided.
In a first aspect of the present disclosure, a method for determining a relative pose angle between cameras mounted to an acquisition entity is provided. The method comprises the following steps: obtaining a first set of images from a first camera and a second set of images from a second camera, respectively; acquiring a first pair of poles associated with a first camera based on two images in a first set of images; acquiring a second pair of poles associated with a second camera based on two images of the second set of images; and determining a relative pose angle between the first camera and the second camera based on the first pair of poles and the second pair of poles.
In a second aspect of the present disclosure, an apparatus for determining a relative pose angle between cameras mounted to an acquisition entity is provided. The device comprises: an obtaining module configured to obtain a first set of images from the first camera and a second set of images from the second camera, respectively; a first acquisition module configured to acquire a first pair of poles associated with a first camera based on two images of a first set of images; a second acquisition module configured to acquire a second pair of poles associated with a second camera based on two images of a second set of images; and a determination module configured to determine a relative pose angle between the first camera and the second camera based on the first pair of poles and the second pair of poles.
In a third aspect of the present disclosure, an apparatus is provided. The apparatus includes one or more processors; and storage means for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement a method according to the first aspect of the present disclosure.
In a fourth aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements a method according to the first aspect of the present disclosure.
In a fifth aspect of the present disclosure, an acquisition entity is provided. The acquisition entity comprises an apparatus according to the third aspect of the present disclosure.
It should be understood that the content described in this Summary is not intended to identify key or critical features of the implementations of the present disclosure, nor is it intended to limit the scope of the present disclosure. Other features of the present disclosure will become apparent from the following description.
Drawings
The above and other features, advantages, and aspects of various implementations of the present disclosure will become more apparent with reference to the following detailed description when taken in conjunction with the accompanying drawings. In the drawings, like or similar reference numerals denote like or similar elements:
FIG. 1 schematically illustrates a diagram of two cameras in which a solution according to an exemplary implementation of the present disclosure may be employed to determine relative pose angles;
FIG. 2 schematically illustrates a diagram of the definition of attitude angles according to an exemplary implementation of the present disclosure;
FIG. 3 schematically illustrates a block diagram of a technique for determining a relative pose angle between two cameras according to an exemplary implementation of the present disclosure;
FIG. 4 schematically illustrates a flow chart of a method for determining a relative pose angle between two cameras according to an exemplary implementation of the present disclosure;
FIG. 5 schematically illustrates a block diagram of pairing of a set of feature points in a first image and a second image depicting the same object in accordance with an exemplary implementation of the present disclosure;
FIG. 6 schematically illustrates a block diagram of terms associated with a camera according to an example implementation of the present disclosure;
FIG. 7 schematically illustrates a block diagram for acquiring a first pair of poles associated with a first camera, according to an example implementation of the present disclosure;
FIG. 8 schematically illustrates a block diagram of determining a relative attitude angle based on a first angle and a second angle, according to an example implementation of the present disclosure;
FIG. 9 schematically illustrates a block diagram of an apparatus for determining a relative pose angle between two cameras according to an exemplary implementation of the present disclosure;
FIG. 10 schematically illustrates a block diagram of the first acquisition module shown in FIG. 9, in accordance with an exemplary implementation of the present disclosure;
FIG. 11 schematically illustrates a block diagram of the determination module shown in FIG. 9, according to an exemplary implementation of the present disclosure; and
FIG. 12 illustrates a block diagram of a computing device capable of implementing various implementations of the disclosure.
Detailed Description
Implementations of the present disclosure will be described in more detail below with reference to the accompanying drawings. While certain implementations of the present disclosure are shown in the accompanying drawings, it is to be understood that the present disclosure may be embodied in various forms and should not be construed as limited to the implementations set forth herein; rather, these implementations are provided so that the present disclosure will be more thorough and complete. It should be understood that the drawings and implementations of the present disclosure are for illustrative purposes only and are not intended to limit the scope of the present disclosure.
In describing implementations of the present disclosure, the term "include" and its similar terms should be interpreted as open-ended inclusion, i.e., including, but not limited to. The term "based on" should be understood as "based at least in part on". The term "one implementation" or "the implementation" should be understood as "at least one implementation". The terms "first," "second," and the like, may refer to different or the same object. Other explicit and implicit definitions are also possible below.
Hereinafter, the meanings of terms used in the present disclosure are first introduced. It is noted that "camera" is to be understood here in a broad sense: not only a device that photographs visible light, but also a device that images electromagnetic waves of any other wavelength band, as well as a device that images by exploiting reflection, scattering, diffraction, and similar characteristics of other types of waves, such as ultrasonic imaging and gamma-ray imaging. The "acquisition entity" herein may be a driving system, such as an autonomous driving system or a non-autonomous driving system. The driving system may be a general personal vehicle, a dedicated acquisition vehicle, or any other suitable vehicle.
Hereinafter, implementations of the present disclosure will be discussed by way of example with respect to a vehicle; however, it should be understood that the aspects of the present disclosure may be similarly applied to other types of acquisition entities, such as aircraft, surface or underwater craft, and even any suitable entity such as living beings or robots (e.g., sweeping robots). It should be noted that "acquisition" here merely indicates that the entity is provided with a camera, and should not be construed as limiting the entity.
For convenience of description, an example application environment of various implementations of the present disclosure is first introduced with reference to fig. 1. In particular, fig. 1 schematically shows a diagram 100 of two cameras for which a technical solution according to an exemplary implementation of the present disclosure may be employed to determine a relative pose angle. In fig. 1, a first camera 110 and a second camera 120 may be mounted on an acquisition entity 130 (e.g., an acquisition vehicle). The poses of the two cameras 110 and 120 may differ, and the difference can be described using the following six degrees of freedom: position (x, y, z) and attitude angle (pitch angle, yaw angle, roll angle). In general, the position coordinates of the first camera 110 and the second camera 120 may be determined using various measurement tools in the related art. However, since it is difficult for existing measurement tools to measure the mounting angles of the two cameras, it is generally necessary to determine the relative attitude angle between the two cameras by analyzing the images acquired by the two cameras, respectively.
It will be appreciated that "mounted" herein may mean temporarily mounted to the acquisition entity 130, or permanently mounted to the acquisition entity 130. For example, the first camera 110 and the second camera 120 may be mounted on a gimbal or other detachable support, thereby enabling temporary mounting to the acquisition entity 130. As another example, the first camera 110 and the second camera 120 may also be permanently fixed to the acquisition entity 130.
Various technical solutions have been proposed for determining the relative attitude angle between two cameras. In one solution, the relative pose angle between two cameras may be determined by utilizing a specific calibration object in a natural scene. However, this solution needs to be performed in a specific environment and has a limited application range. In another solution, images with overlapping areas acquired by the two cameras, respectively, may be analyzed to determine the relative pose angle. However, this solution requires that the fields of view of the two cameras have overlapping areas, and it is not suitable for cameras whose fields of view do not overlap.
In view of the above-mentioned deficiencies in the prior art, it is desirable to provide a technical solution that can determine the relative attitude angle between two cameras in a more convenient and rapid manner. Further, it is desirable that this technical solution can be combined with existing technical solutions and implemented with as little change as possible to their hardware architecture.
According to an exemplary implementation of the present disclosure, a technical solution for determining a relative pose angle between two cameras is provided. The technical solution can be carried out without placing a calibration object and without requiring any overlap between the fields of view of the two cameras. It should be noted that, although the fields of view of the first camera 110 and the second camera 120 schematically illustrated in fig. 1 do have an overlapping region, in other implementations the fields of view of the first camera 110 and the second camera 120 may have no overlapping region at all. For example, the orientations of the first camera 110 and the second camera 120 may be diametrically opposed.
For convenience of description, definition of attitude angle according to an exemplary implementation of the present disclosure is first introduced. Specifically, fig. 2 schematically illustrates a diagram 200 of the definition of attitude angles according to an exemplary implementation of the present disclosure. The definition of the attitude angle associated therewith will be described hereinafter taking only the first camera 110 as an example. A person skilled in the art may determine a definition of the pose angle associated with the second camera 120 based on the description for the first camera 110.
As shown in fig. 2, the pose of the first camera 110 may be defined by three angles in the world coordinate system XYZ: pitch angle (pitch) 210, yaw angle (yaw) 220, and roll angle (roll) 230. Specifically, the pitch angle 210 is the angle between the first camera 110 and the positive direction of the X-axis, the yaw angle 220 is the angle between the first camera 110 and the positive direction of the Y-axis, and the roll angle 230 is the angle between the first camera 110 and the positive direction of the Z-axis. At this time, assuming that the attitude angle of the first camera 110 is (pitch1, yaw1, roll1) and the attitude angle of the second camera 120 is (pitch2, yaw2, roll2), the relative attitude angle between the two cameras 110 and 120 may be expressed as (pitch1-pitch2, yaw1-yaw2, roll1-roll2).
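As a simple worked illustration of this definition (not part of the original disclosure), the relative attitude angle can be computed as the element-wise difference of the two cameras' attitude-angle triples; the numeric values below are hypothetical.

```python
def relative_attitude_angle(pose1, pose2):
    """Element-wise difference of two (pitch, yaw, roll) triples, in degrees."""
    return tuple(a - b for a, b in zip(pose1, pose2))

# Hypothetical values: camera 1 at (1.0, 90.0, 0.0), camera 2 at (0.5, 88.5, 0.2).
print(relative_attitude_angle((1.0, 90.0, 0.0), (0.5, 88.5, 0.2)))  # (0.5, 1.5, -0.2)
```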
Hereinafter, an exemplary implementation according to the present disclosure will be described in detail with reference to fig. 3. Fig. 3 schematically illustrates a block diagram of a technical solution 300 for determining a relative pose angle between two cameras according to an exemplary implementation of the present disclosure. According to an exemplary implementation of the present disclosure, a method for determining a relative pose angle between a first camera 110 and a second camera 120 is provided.
As shown in fig. 3, a first set of images 310 from the first camera 110 and a second set of images 320 from the second camera 120 may be obtained, respectively. Next, a first pair of poles 312 associated with the first camera 110 may be acquired based on the two images in the first set of images 310. Further, a second pair of poles 322 associated with the second camera 120 may be acquired based on two images in the second set of images 320. Finally, a relative pose angle 330 between the first camera 110 and the second camera 120 is determined based on the first pair of poles 312 and the second pair of poles 322.
With the above-described exemplary implementation, the two sets of images from the two cameras may be unrelated in content, and there is no requirement for an overlapping region between the two sets of images. In this way, the relative attitude angle between the two cameras can be determined even if the two cameras face opposite directions and their fields of view do not overlap at all. Further, with the above-described exemplary implementation, the set of images from each camera may be processed separately in an independent manner. For example, the sets of images from the cameras may be processed in parallel, thereby improving data processing efficiency.
Hereinafter, the specific steps of a method according to an exemplary implementation of the present disclosure will be described in detail with reference to fig. 4. Fig. 4 schematically illustrates a flow chart of a method 400 for determining a relative pose angle between two cameras according to an exemplary implementation of the present disclosure. At block 410, a first set of images 310 from the first camera 110 and a second set of images 320 from the second camera 120 may be obtained, respectively. According to an exemplary implementation of the present disclosure, there may be no overlapping region between any image in the first set of images 310 and any image in the second set of images 320, and the first set of images 310 and the second set of images 320 may be processed in a completely independent manner. For example, the first set of images 310 and the second set of images 320 may be processed using the same or different devices, and they may be processed in parallel or in series.
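Because the two image sets are independent, they can also be processed concurrently, as noted above. The sketch below illustrates this with Python's concurrent.futures; compute_epipole is a hypothetical placeholder for the per-camera processing described in the remainder of this section, and the file names are likewise hypothetical.

```python
from concurrent.futures import ProcessPoolExecutor

def compute_epipole(image_paths):
    """Placeholder for the per-camera processing described below
    (feature matching followed by epipolar geometry)."""
    ...

if __name__ == "__main__":
    with ProcessPoolExecutor(max_workers=2) as pool:
        # The two cameras' image sets share no content and are processed independently.
        future1 = pool.submit(compute_epipole, ["cam1_t0.png", "cam1_t1.png"])
        future2 = pool.submit(compute_epipole, ["cam2_t0.png", "cam2_t1.png"])
        epipole1, epipole2 = future1.result(), future2.result()
```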
At block 420, a first pair of poles 312 associated with the first camera 110 may be acquired based on two images in the first set of images 310. At block 430, a second pair of poles 322 associated with the second camera 120 may be acquired based on two images in the second set of images 320. At block 440, a relative pose angle 330 between the first camera 110 and the second camera 120 is determined based on the first pair of poles 312 and the second pair of poles 322. The relative pose angle 330 may then be determined based on the positional relationship among the epipole, the camera position, and the camera principal point.
For simplicity, the specific processing steps of how the first pair of poles 312 is obtained based on the first set of images 310 will be described in detail below using only the first set of images 310 as an example. It will be appreciated that the processing for the first set of images 310 and the second set of images 320 to obtain the first pair of poles 312 and the second pair of poles 322, respectively, is similar and that one skilled in the art may process for the second set of images 320 to obtain the second pair of poles 322.
According to an exemplary implementation of the present disclosure, a first image and a second image acquired by the first camera 110 at a first time and a second time, respectively, may be selected from the first set of images 310. With the above-described exemplary implementation, the first image and the second image are images taken by the first camera 110 at different times during motion (e.g., during movement of the acquisition entity). Over a short time interval, the position of the first camera 110 does not shift much, so there will be a larger overlapping area between the first image and the second image. In this way, more feature points can be conveniently found in the first image and the second image for subsequent computation. It will be appreciated that the first times described in the respective implementations for the two cameras refer to the same moment. Strictly speaking, the difference between the first times described for the two cameras should be smaller than a certain threshold, for example 0.01 seconds.
Fig. 5 schematically illustrates a block diagram 500 of pairing of a set of feature points in a first image 510 and a second image 520 depicting the same object according to an exemplary implementation of the present disclosure. Hereinafter, how pairing of a set of feature points depicting the same object is detected in the first image 510 and the second image 520 based on the image matching technique will be described with reference to fig. 5. According to an exemplary implementation of the present disclosure, feature points are corresponding points that depict the same object in different images. As shown in fig. 5, the first image 510 and the second image 520 are two images captured by the first camera 110, and the same object in the real scene (e.g., the same tree in the real scene) is included in both images. As shown in fig. 5, the feature point 512 in the first image 510 and the feature point 522 in the second image 520 are a pair of corresponding feature points, and the feature point 514 in the first image 510 and the feature point 524 in the second image 520 are a pair of corresponding feature points. In this case, the feature points represent pairs of feature points that depict the same object.
According to an exemplary implementation of the present disclosure, a pairing of a set of feature points depicting the same object may be detected in the first image 510 and the second image 520 based on an image matching technique. It will be appreciated that, although only two feature point pairings are shown in fig. 5 by way of example, more feature point pairings may be detected in the first image 510 and the second image 520. According to an exemplary implementation of the present disclosure, when determining the first pair of poles 312 of the first camera 110 based on the pairing of the set of feature points, a greater number (e.g., more than 5) of feature point pairings may be detected in order to ensure higher accuracy in subsequent computations. According to an exemplary implementation of the present disclosure, the pairing of feature points may be obtained based on a variety of image processing techniques known in the art or to be developed in the future.
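As one possible realization of this matching step (the disclosure does not mandate a particular technique), ORB features with cross-checked brute-force matching from OpenCV could be used; the image file names below are hypothetical.

```python
import cv2
import numpy as np

# Two frames captured by the same camera at two different moments (hypothetical files).
img1 = cv2.imread("cam1_t0.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("cam1_t1.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Cross-checked brute-force matching keeps only mutually best descriptor pairs.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Pixel coordinates of the paired feature points (well over 5 pairs are kept in practice).
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
```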
In turn, a first pair of poles 312 may be determined based on the pairing of the detected set of feature points using epipolar geometry techniques. It should be appreciated that epipolar geometry describes a particular geometric relationship that exists between two images acquired at two camera locations. Based on epipolar geometry techniques, a basic model of the relative pose between two cameras can be solved. In the context of the present invention, since the first image 510 and the second image 520 are acquired by the first camera 110 at different moments in time, the relative pose of the first camera 110 at different moments in time may be obtained based on epipolar geometry techniques.
According to an exemplary implementation of the present disclosure, based on the first image 510 and the second image 520, the relative pose between the locations at which the first camera 110 captured the first image 510 and the second image 520 may be determined using epipolar geometry, thereby obtaining the first pair of poles 312 associated with the first camera 110.
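A sketch of how the epipole could be recovered from such feature point pairings, using OpenCV's fundamental-matrix estimation and a null-space computation; pts1 and pts2 are matched pixel coordinates as in the matching sketch above, and this is only one possible realization of the epipolar-geometry step, not a step quoted from the disclosure.

```python
import cv2
import numpy as np

def epipole_in_first_image(pts1, pts2):
    """Estimate the fundamental matrix F from matched points and return the epipole
    in the first image, i.e. the right null vector e of F (F @ e = 0), in pixels."""
    F, inlier_mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    _, _, vt = np.linalg.svd(F)
    e = vt[-1]
    return e[:2] / e[2]  # homogeneous coordinates -> pixel coordinates

# Usage (with the matched points from the previous sketch):
# epipole1 = epipole_in_first_image(pts1, pts2)
```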
The process of determining the first pair of poles 312 based on epipolar geometry involves a number of terms related to the first camera 110. Hereinafter, a block diagram 600 of terms associated with a camera according to an exemplary implementation of the present disclosure will be described with reference to fig. 6. Fig. 6 shows the positional relationship among the camera position, the focal length, the principal point, and the imaging plane. In fig. 6, reference numeral 620 denotes the camera position, i.e., the position where the camera is located when acquiring an image. The distance between the camera position 620 and the principal point 630 is the focal length 610 of the camera, and the imaging plane 640 represents the plane in which the camera images when acquiring the image at the camera position 620. The principal point 630 represents the center of the imaging plane 640.
Hereinafter, how to acquire the first pair of poles 312 of the first camera 110 based on the first set of images 310 will be described with reference to fig. 7. Fig. 7 schematically illustrates a block diagram 700 of acquiring a first pair of poles 312 associated with a first camera 110, according to an example implementation of the present disclosure. Specifically, fig. 7 shows a graphical representation of the positions of the first camera 110 and the second camera 120 at a first time 730 and a second time 740, respectively. At a first time 730, the first camera 110 is located at a first location 710; at a second time 740, the first camera 110 is located at the second position 720. As shown in fig. 7, the distance between the first location 710 and the first principal point 716 is the first focal length 714 of the first camera 110. The right side of fig. 7 shows the relevant information of the second camera 120 at the first time 730 and the second time 740. The specific information about the second camera 120 can be determined by a person skilled in the art based on the description of the first camera 110, and will not be described here.
According to an exemplary implementation of the present disclosure, epipolar geometry techniques may be utilized to determine a first motion pose of the first camera 110 between the first time 730 and the second time 740 based on the pairing of the set of feature points. Here, the first motion pose may indicate the relative pose of the first camera 110 between the two moments, i.e., whether the attitude of the camera changes between the two moments.
It will be appreciated that a precondition for applying the method according to an exemplary implementation of the present disclosure is that the orientation of the first camera 110 at the first time 730 and the second time 740 should be the same. In other words, the geometric relationship used in the present disclosure to determine the relative pose angle 330 based on the first pair of poles 312 and the second pair of poles 322 holds only when the first camera 110 moves in a straight line. Thus, it is necessary to first determine whether the first camera 110 moves in a straight line. If the first camera 110 moves in a straight line, the orientation of the first camera 110 at the two moments is the same; if the first camera 110 does not move in a straight line, the orientation of the first camera 110 at the two moments is different.
With the above-described exemplary implementations, whether the orientation of the first camera 110 changes during movement (i.e., whether it moves in a straight line or a non-straight line) can be determined conveniently and accurately, and thus whether the first image 510 and/or the second image 520 need to be reselected. If it is determined that the first camera 110 moves in a straight line between the first time 730 and the second time 740, the first pair of poles 312 may be determined based on the pairing of the set of feature points.
According to an exemplary implementation of the present disclosure, if the motion indicates that the first camera 110 moves in a non-straight line between the first time 730 and the second time 740, then one or more images need to be reselected from the first set of images 310. In particular, only one of the first image 510 and the second image 520 may be reselected, or alternatively both images may be reselected, until the motion determined based on the first image 510 and the second image 520 indicates that the first camera 110 moves in a straight line between the first time 730 and the second time 740. In this way, the accuracy of the subsequent calculation steps can be ensured.
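One way this straight-line (pure-translation) check could be implemented, assuming the camera intrinsic matrix K is available, is to recover the relative rotation between the two frames and test whether it is close to the identity; the tolerance value below is a hypothetical choice, not one given in the disclosure.

```python
import cv2
import numpy as np

def moves_in_straight_line(pts1, pts2, K, angle_tol_deg=1.0):
    """True if the camera orientation is (nearly) unchanged between the two frames,
    i.e. the relative rotation recovered from the essential matrix is near identity."""
    E, _ = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K)
    rotation_vector, _ = cv2.Rodrigues(R)
    rotation_angle_deg = np.degrees(np.linalg.norm(rotation_vector))
    return rotation_angle_deg < angle_tol_deg
```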
In accordance with an exemplary implementation of the present disclosure, to determine the relative pose angle 330 between the first camera 110 and the second camera 120, a first angle 718 associated with the first camera 110 may be determined based on the position of a first principal point 716 of the first camera 110 in the first set of images 310, the position of the first pair of poles 312, and a first focal length 714 of the first camera. It should be noted that epipoles generally occur in pairs, i.e., there is one epipole in each of the first image 510 and the second image 520. Only one epipole position is schematically shown here, and the subsequent calculation is based on the illustrated epipole position. The location of the other epipole associated with the first camera 110 may also be determined by one skilled in the art based on similar methods, and will not be described in detail here.
Similarly, for the second camera 120, a second angle 728 associated with the second camera 120 may be determined based on the location of a second principal point 726 in the second set of images 320, the location of the second pair of poles 322, and the second focal length of the second camera 120. Finally, the relative attitude angle 330 may be determined based on the first angle 718 and the second angle 728.
In this implementation, where the first camera 110 is confirmed to move in a straight line, the position of the first pair of poles 312 can be obtained based on the first image 510 and the second image 520 from the first camera 110 using the principles of epipolar geometry. As shown in FIG. 7, let points O1, C1, and B1 denote the first location 710, the first principal point 716, and the first pair of poles 312, respectively; the triangle O1C1B1 is then a right triangle. In this right triangle, the tangent of the first angle 718 (i.e., angle α1) may be determined based on the position coordinates of these points, and may be calculated as follows: tan α1 = |B1C1| / f1. Since the positions of point C1 and point B1 and the first focal length f1 are all known, tan α1 can be obtained, and the angle α1 can thereby be determined.
As shown in fig. 7, for the second camera 120, let points O2, C2, and B2 denote the second location 720, the second principal point 726, and the second pair of poles 322, respectively; the triangle O2C2B2 is then a right triangle. In this right triangle, the tangent of the second angle 728 (i.e., angle α2) may be determined based on the position coordinates of these points, and may be calculated as follows: tan α2 = |B2C2| / f2. Similarly, since the positions of point C2 and point B2 and the second focal length f2 are known, tan α2 can be obtained, and the angle α2 can thereby be determined.
With the above-described exemplary implementation, the complex process of measuring the relative attitude angle in three-dimensional space can be converted into a process of determining the positions of the principal point and the epipole of each of the two cameras. Then, with the determined positions and the known focal lengths of the two cameras, the relative pose angle 330 between the two cameras can be calculated. In this way, the process of determining the relative pose angle is simplified on the one hand; on the other hand, the two cameras can be completely independent and can have different focal lengths or other intrinsic parameters. Compared with prior art techniques that require the two cameras to have the same focal length or even identical intrinsic parameters, exemplary implementations of the present disclosure greatly reduce the requirements on the cameras and are thus suitable for a wider range of application scenarios.
Hereinafter, a block diagram 800 of determining the relative pose angle 330 based on the first angle 718 and the second angle 728 according to an exemplary implementation of the present disclosure will be described with reference to fig. 8. A precondition for applying the method according to an exemplary implementation of the present disclosure is first described with reference to fig. 8. As shown in fig. 8, in the case where the first camera 110 moves in a straight line, since the relative positions of the two cameras are unchanged, the orientations of the first camera 110 and the second camera 120 do not change between the first time 730 and the second time 740. The second angle 728 may be translated to the left so that its vertex coincides with the vertex of the first angle 718 (i.e., the second position 720 is translated to the first position 710), at which point the auxiliary point D 826 corresponds to the second principal point C2 726. As shown in fig. 8, based on geometric principles, the relative attitude angle 330 (denoted by θ) may then be calculated based on the difference between the first angle 718 and the second angle 828. Specifically, the relative attitude angle θ = α1 - α2.
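Putting the two right-triangle relations together, a minimal numeric sketch of tan α = |BC| / f and θ = α1 - α2 might look as follows; every numeric value is hypothetical and chosen only for illustration.

```python
import math

def epipole_angle(epipole_px, principal_point_px, focal_px):
    """Angle alpha such that tan(alpha) = |BC| / f, where B is the epipole,
    C is the principal point, and f is the focal length in pixels."""
    offset = math.dist(epipole_px, principal_point_px)
    return math.degrees(math.atan2(offset, focal_px))

# Hypothetical epipoles, principal points, and focal lengths for the two cameras.
alpha1 = epipole_angle((980.0, 540.0), (960.0, 540.0), focal_px=1400.0)
alpha2 = epipole_angle((905.0, 540.0), (960.0, 540.0), focal_px=1100.0)
relative_angle = alpha1 - alpha2  # theta = alpha1 - alpha2
print(alpha1, alpha2, relative_angle)
```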
According to an exemplary implementation of the present disclosure, since the relative attitude angle in three-dimensional space includes three components, namely a pitch angle, a yaw angle, and a roll angle, the first angle 718 and the second angle 728 determined using the method described above may be projected in different directions when determining the relative attitude angle, so as to obtain the respective components of the relative attitude angle.
According to an exemplary implementation of the present disclosure, the first angle 718 and the second angle 728 may be projected along a top view direction of the acquisition entity 130 to obtain a first projection angle and a second projection angle. Next, a yaw angle in the relative attitude angles may be determined based on a difference between the first projection angle and the second projection angle. Fig. 7 and 8 described above are examples of projection of camera related parameters along the direction of the top view of the acquisition entity 130. At this time, the relative attitude angle 330 as determined in fig. 8 is the yaw angle between the first camera 110 and the second camera 120. Returning to fig. 2, the yaw angle 220 is the angle of the first camera 110 from the positive direction of the Y-axis, and thus the relative attitude angle 330 determined in the manner described above is the yaw angle between the first camera 110 and the second camera 120.
According to an exemplary implementation of the present disclosure, the first angle and the second angle may also be projected along a side view direction of the acquisition entity 130 to obtain a first projection angle and a second projection angle. Next, the pitch angle in the relative attitude angle may be determined based on the difference between the first projection angle and the second projection angle. Those skilled in the art may refer to the examples described above with reference to fig. 7 and 8 to determine the pitch angle between the first camera 110 and the second camera 120 in a similar manner. Returning to fig. 2, the pitch angle 210 is the angle between the first camera 110 and the positive direction of the X-axis, and thus the relative attitude angle determined according to the method described above is the pitch angle between the first camera 110 and the second camera 120.
With the above exemplary implementation, by projecting the first angle and the second angle along two directions, solving for the relative attitude angle in three-dimensional space can be converted into solving in a two-dimensional projection space, and the yaw angle and the pitch angle of the relative attitude angle can be determined in a more convenient and rapid manner. Having described in detail how the yaw angle and the pitch angle are determined, one skilled in the art can determine the roll angle based on similar principles.
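The two projections described above amount to taking different components of the epipole's offset from the principal point: under the assumption that the image x axis corresponds to the top-view (yaw) projection and the image y axis to the side-view (pitch) projection, a sketch could look as follows; all coordinates are hypothetical.

```python
import math

def projected_angles(epipole_px, principal_point_px, focal_px):
    """Signed angles derived from the horizontal (yaw-related) and vertical
    (pitch-related) components of the epipole offset from the principal point."""
    dx = epipole_px[0] - principal_point_px[0]  # top-view projection component
    dy = epipole_px[1] - principal_point_px[1]  # side-view projection component
    yaw_angle = math.degrees(math.atan2(dx, focal_px))
    pitch_angle = math.degrees(math.atan2(dy, focal_px))
    return yaw_angle, pitch_angle

# Relative yaw and pitch as differences of the per-camera projected angles.
yaw1, pitch1 = projected_angles((980.0, 531.0), (960.0, 540.0), 1400.0)
yaw2, pitch2 = projected_angles((905.0, 548.0), (960.0, 540.0), 1100.0)
relative_yaw, relative_pitch = yaw1 - yaw2, pitch1 - pitch2
```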
According to an exemplary implementation of the present disclosure, the first image 510 and the second image 520 may be selected based on at least any one of a movement speed of the first camera 110 and a sampling frequency of the first camera 110. With the above-described exemplary implementation, the first image 510 and the second image 520 may be selected from the first set of images 310 based on a variety of factors.
For example, when the moving speed of the camera is high, images acquired at a small time interval may be selected as the first image 510 and the second image 520, respectively, in order to avoid having no overlapping area between them. As another example, when the moving speed of the camera is slow, images acquired at a larger time interval may be selected as the first image 510 and the second image 520, respectively. In this case, although the difference between the acquisition times of the first image 510 and the second image 520 is large, it can still be ensured that there is an overlapping region between the two images.
As another example, the first image 510 and the second image 520 may also be selected based on the sampling frequency of the camera. If the sampling frequency of the camera is high (e.g., 24 samples per second), two non-adjacent images may be selected from the image sequence. If the sampling frequency of the camera is low (e.g., 1 sample per second), two consecutive images may be selected. As yet another example, the movement speed and the sampling frequency of the first camera may both be taken into consideration, for instance as sketched below. In this way, the probability that the epipole can be successfully determined based on the selected images is increased.
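For illustration only, the gap between the two selected frames could be derived from the movement speed and the sampling frequency so that the camera travels a suitable distance between them; all thresholds below are hypothetical assumptions rather than values from the disclosure.

```python
def choose_frame_gap(speed_m_per_s, sampling_hz, min_baseline_m=0.3, max_baseline_m=2.0):
    """Number of frames to skip so that the camera moves roughly between the
    minimum and maximum baseline (thresholds are illustrative assumptions)."""
    if speed_m_per_s <= 0 or sampling_hz <= 0:
        raise ValueError("speed and sampling frequency must be positive")
    meters_per_frame = speed_m_per_s / sampling_hz
    gap = max(1, round(min_baseline_m / meters_per_frame))
    return min(gap, max(1, int(max_baseline_m / meters_per_frame)))

print(choose_frame_gap(speed_m_per_s=15.0, sampling_hz=24.0))  # fast motion -> small gap
print(choose_frame_gap(speed_m_per_s=1.0, sampling_hz=24.0))   # slow motion -> larger gap
```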
According to an exemplary implementation of the present disclosure, the attitude angle of the second camera 120 may also be determined when the first attitude angle of the first camera 110 is known. For example, a technique known in the art or to be developed in the future may be employed to acquire the first attitude angle of the first camera 110. Then, a second attitude angle of the second camera 120 may be determined based on the acquired first attitude angle and the relative attitude angle 330. When a plurality of cameras are included in the acquisition system, the relative attitude angles between a first camera and each of the other cameras may be determined, respectively, in the manner described above. With the attitude angle of the first camera known, the attitude angles of the respective cameras can then be obtained. Alternatively, the relative attitude angles between successive pairs of cameras (for example, the relative attitude angle between camera 1 and camera 2, the relative attitude angle between camera 2 and camera 3, and so on) may also be determined in the manner described above, thereby obtaining the attitude angles of the respective cameras, as sketched below.
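If the first camera's attitude angle is known, the angles of the remaining cameras follow by accumulating the pairwise relative angles. The sketch below uses hypothetical numbers, and the sign convention (relative angle = angle of camera i minus angle of camera i+1) is an assumption made for illustration.

```python
def chain_attitude_angles(first_camera_angle_deg, pairwise_relative_angles_deg):
    """Given camera 1's attitude angle and the relative angles between successive
    cameras (camera i minus camera i+1), return the angle of every camera."""
    angles = [first_camera_angle_deg]
    for relative in pairwise_relative_angles_deg:
        angles.append(angles[-1] - relative)  # angle(i+1) = angle(i) - relative(i, i+1)
    return angles

# Hypothetical: camera 1 at 90.0 degrees; relative yaws (cam1-cam2, cam2-cam3) of 1.5 and -0.7.
print(chain_attitude_angles(90.0, [1.5, -0.7]))  # [90.0, 88.5, 89.2]
```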
It will be appreciated that the specific processing steps of how the first pair of poles 312 is obtained based on the first set of images 310 have been described in detail hereinabove. Based on the specific implementation described above, one skilled in the art can process in a similar manner for the second set of images 320 to obtain the second pair of poles 322. In summary, two images may be selected from the second set of images 320, respectively, a set of pairs of feature points depicting the same object may be detected from the selected two images, and the second pair of poles 322 may be determined based on the detected set of pairs of feature points. In this process, the process of selecting two images from the second set of images 320, detecting a set of feature point pairs, and determining the second pair of poles 322 is similar to the corresponding steps described above for the first set of images 310 and will not be repeated.
Fig. 9 schematically illustrates a block diagram of an apparatus 900 for determining a relative pose angle between two cameras mounted to a collection entity according to an exemplary implementation of the present disclosure. In particular, the apparatus 900 may include an obtaining module 910, the obtaining module 910 configured to obtain a first set of images from a first camera and a second set of images from a second camera, respectively; a first acquisition module 920, the first acquisition module 920 configured to acquire a first pair of poles associated with a first camera based on two images of a first set of images; a second acquisition module 930, the second acquisition module 930 configured to acquire a second pair of poles associated with a second camera based on two images of the second set of images; and a determination module 940 configured to determine a relative pose angle between the first camera and the second camera based on the first pair of poles and the second pair of poles.
Fig. 10 schematically illustrates a block diagram 1000 of the first acquisition module 920 shown in fig. 9, according to an example implementation of the disclosure. According to an exemplary implementation of the present disclosure, the first acquisition module 920 includes: an image acquisition module 1010, a detection module 1020, and an epipolar determination module 1030. Specifically, the image acquisition module 1010 is configured to select, from the first set of images, a first image and a second image acquired by the first camera at a first time and a second time, respectively. The detection module 1020 is configured to detect, in the first image and the second image, a pairing of a set of feature points depicting the same object. The epipolar determination module 1030 is configured to determine the first pair of poles based on the pairing of the set of feature points.
According to an exemplary implementation of the present disclosure, the epipolar determination module 1030 includes: a pose determination module 1032, a motion determination module 1034, and an epipolar identification module 1036. In particular, the pose determination module 1032 is configured to determine a first motion pose of the first camera between the first time and the second time based on the pairing of the set of feature points. The motion determination module 1034 is configured to determine a motion of the first camera based on the first motion pose. The epipolar identification module 1036 is configured to determine the first pair of poles based on the pairing of the set of feature points in response to the motion indicating that the first camera moves along a straight line between the first time and the second time.
According to an exemplary implementation of the present disclosure, the first acquisition module 920 further includes a selection module 1040. In particular, the selection module 1040 is configured to select, in response to the motion indicating that the first camera is moving in a non-straight line between the first time and the second time, other images from the first set of images as the first image and the second image, respectively, until the motion determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
According to an exemplary implementation of the present disclosure, the first acquisition module 920 further includes a selection module 1040. In particular, the selection module 1040 is configured for selecting another image from the first set of images as the second image in response to the movement indicating that the first camera is moving in a non-straight line between the first time and the second time until the movement determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
Fig. 11 schematically illustrates a block diagram 1100 of the determination module 940 shown in fig. 9, according to an exemplary implementation of the present disclosure. According to an exemplary implementation of the present disclosure, the determining module 940 includes: a first angle determination module 1110, a second angle determination module 1120, and a pose angle determination module 1130. Specifically, the first angle determination module 1110 is configured to determine a first angle associated with the first camera based on a location of a first principal point of the first camera in the first set of images, a location of the first pair of poles, and a focal length of the first camera. The second angle determination module 1120 is configured to determine a second angle associated with the second camera based on a location of the second principal point in the second set of images, a location of the second pair of poles, and a focal length of the second camera. The attitude angle determination module 1130 is configured to determine a relative attitude angle based on the first angle and the second angle.
According to an exemplary implementation of the present disclosure, the attitude angle determination module 1130 includes a projected angle determination module 1132 and a yaw angle determination module 1134. In particular, the projection angle determination module 1132 is configured to project the first angle and the second angle along a top view direction of the acquisition entity to obtain a first projection angle and a second projection angle. The yaw angle determination module 1134 is configured to determine a yaw angle of the relative attitude angles based on a difference between the first and second projected angles.
According to an exemplary implementation of the present disclosure, the pose angle determination module 1130 includes: a projection angle determination module 1132 and a pitch angle determination module 1136. The projection angle determination module 1132 is configured to project the first angle and the second angle along a side view direction of the acquisition entity to obtain a first projection angle and a second projection angle. The pitch angle determination module 1136 is configured to determine a pitch angle in the relative attitude angle based on a difference between the first projection angle and the second projection angle.
According to an exemplary implementation of the present disclosure, the image acquisition module 1010 includes an image selection module. Specifically, the image selection module is configured to select the first image and the second image based on at least any one of a movement speed of the first camera and a sampling frequency of the first camera.
According to an exemplary implementation of the present disclosure, the apparatus 900 further comprises: the angle acquisition module and the angle determination module. Specifically, the angle acquisition module is configured to acquire a first attitude angle of a first camera. The angle determination module is configured to determine a second pose angle of the second camera based on the first pose angle and the relative pose angle.
According to an exemplary implementation of the present disclosure, there is provided an apparatus comprising: one or more processors; and a storage device for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method described in the present disclosure.
Fig. 12 illustrates a block diagram of a computing device 1200 capable of implementing various implementations of the disclosure. Device 1200 may be used to implement computing device 122 of fig. 1. As shown, the device 1200 includes a Central Processing Unit (CPU) 1201 that can perform various suitable actions and processes in accordance with computer program instructions stored in a Read Only Memory (ROM) 1202 or loaded from a storage unit 1208 into a Random Access Memory (RAM) 1203. In the RAM 1203, various programs and data required for the operation of the device 1200 may also be stored. The CPU 1201, ROM 1202, and RAM 1203 are connected to each other through a bus 1204. An input/output (I/O) interface 1205 is also connected to the bus 1204.
Various components in device 1200 are connected to I/O interface 1205, including: an input unit 1206 such as a keyboard, mouse, etc.; an output unit 1207 such as various types of displays, speakers, and the like; a storage unit 1208 such as a magnetic disk, an optical disk, or the like; and a communication unit 1209, such as a network card, modem, wireless communication transceiver, etc. The communication unit 1209 allows the device 1200 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunications networks.
The processing unit 1201 performs the various methods and processes described above, such as process 400. For example, in some implementations, the process 400 may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as the storage unit 1208. In some implementations, part or all of the computer program may be loaded and/or installed onto device 1200 via ROM 1202 and/or communication unit 1209. When the computer program is loaded into RAM 1203 and executed by CPU 1201, one or more steps of process 400 described above may be performed. Alternatively, in other implementations, CPU 1201 may be configured to perform process 400 by any other suitable means (e.g., by means of firmware).
According to an exemplary implementation of the present disclosure, a computer-readable storage medium having a computer program stored thereon is provided. The program when executed by a processor implements the methods described in the present disclosure.
According to an exemplary implementation of the present disclosure, a collection entity is provided. The acquisition entity may comprise a device described in accordance with the present disclosure.
According to an exemplary implementation of the present disclosure, the collection entity is a vehicle.
The functions described herein above may be performed, at least in part, by one or more hardware logic components. For example, without limitation, exemplary types of hardware logic components that may be used include: a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), an Application Specific Standard Product (ASSP), a System on a Chip (SOC), a Complex Programmable Logic Device (CPLD), and the like.
Program code for carrying out methods of the present disclosure may be written in any combination of one or more programming languages. These program code may be provided to a processor or controller of a general purpose computer, special purpose computer, or other programmable data processing apparatus such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowchart and/or block diagram to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
Moreover, although operations are depicted in a particular order, this should not be understood as requiring that such operations be performed in the particular order shown or in sequential order, or that all illustrated operations be performed, to achieve desirable results. In certain circumstances, multitasking and parallel processing may be advantageous. Likewise, while several specific implementation details are included in the above discussion, these should not be construed as limiting the scope of the present disclosure. Certain features that are described in the context of separate implementations can also be implemented in combination in a single implementation. Conversely, various features that are described in the context of a single implementation can also be implemented in multiple implementations separately or in any suitable subcombination.
Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the specific features or acts described above. Rather, the specific features and acts described above are example forms of implementing the claims.

Claims (22)

1. A method for determining a relative attitude angle between cameras mounted to an acquisition entity, comprising:
obtaining a first set of images from a first camera and a second set of images from a second camera, respectively;
acquiring a first pair of poles associated with the first camera based on two images of the first set of images;
acquiring a second pair of poles associated with the second camera based on two images of the second set of images; and
determining a relative attitude angle between the first camera and the second camera based on the first pair of poles and the second pair of poles, comprising:
determining a first angle associated with the first camera based on a location of a first principal point of the first camera in the first set of images, a location of the first pair of poles, and a focal length of the first camera;
determining a second angle associated with the second camera based on a location of a second principal point of the second camera in the second set of images, a location of the second pair of poles, and a focal length of the second camera; and
the relative attitude angle is determined based on the first angle and the second angle.
2. The method of claim 1, wherein acquiring a first pair of poles associated with the first camera comprises:
selecting, from the first set of images, a first image and a second image acquired by the first camera at a first time and a second time, respectively, wherein the orientation of the first camera at the first time and the second time is the same;
detecting a pairing of a set of feature points depicting the same object in the first image and the second image; and
the first pair of poles is determined based on the pairing of the set of feature points.
3. The method of claim 2, wherein determining the first pair of poles based on the pairing of the set of feature points comprises:
determining a first motion pose of the first camera between the first time and the second time based on the pairing of the set of feature points;
determining a motion of the first camera based on the first motion pose; and
the first pair of poles is determined based on the pairing of the set of feature points in response to the motion indicating that the first camera is moving in a straight line between the first time and the second time.
4. The method of claim 3, wherein acquiring a first pair of poles associated with the first camera further comprises: in response to the motion indicating that the first camera is moving in a non-straight line between the first time and the second time,
other images are selected from the first set of images as the first image and the second image, respectively, until the motion determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
5. The method of claim 3, wherein acquiring a first pair of poles associated with the first camera further comprises: in response to the motion indicating that the first camera is moving in a non-straight line between the first time and the second time,
another image is selected from the first set of images as the second image until the motion determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
6. The method of claim 1, wherein determining the relative attitude angle based on the first angle and the second angle comprises:
projecting the first angle and the second angle along a top view direction of the acquisition entity to obtain a first projection angle and a second projection angle; and
a yaw angle of the relative attitude angle is determined based on a difference between the first projection angle and the second projection angle.
7. The method of claim 1, wherein determining the relative attitude angle based on the first angle and the second angle comprises:
projecting the first angle and the second angle along a side view direction of the acquisition entity to obtain a first projection angle and a second projection angle; and
a pitch angle of the relative attitude angle is determined based on a difference between the first projection angle and the second projection angle.
8. The method of claim 2, wherein selecting a first image and a second image from the first set of images, respectively, comprises:
the first image and the second image are selected based on at least one of a movement speed of the first camera and a sampling frequency of the first camera.
9. The method of claim 1, further comprising:
acquiring a first attitude angle of the first camera; and
a second attitude angle of the second camera is determined based on the first attitude angle and the relative attitude angle.
10. An apparatus for determining a relative attitude angle between cameras mounted to an acquisition entity, comprising:
an obtaining module configured to obtain a first set of images from a first camera and a second set of images from a second camera, respectively;
a first acquisition module configured to acquire a first pair of poles associated with the first camera based on two images of the first set of images;
a second acquisition module configured to acquire a second pair of poles associated with the second camera based on two images of the second set of images; and
a determination module configured to determine a relative attitude angle between the first camera and the second camera based on the first pair of poles and the second pair of poles, comprising:
a first angle determination module configured to determine a first angle associated with the first camera based on a location of a first principal point of the first camera in the first set of images, a location of the first pair of poles, and a focal length of the first camera;
a second angle determination module configured to determine a second angle associated with the second camera based on a location of a second principal point of the second camera in the second set of images, a location of the second pair of poles, and a focal length of the second camera; and
an attitude angle determination module configured to determine the relative attitude angle based on the first angle and the second angle.
11. The apparatus of claim 10, wherein the first acquisition module comprises:
an image acquisition module configured to select, from the first set of images, a first image and a second image acquired by the first camera at a first time and a second time, respectively, the orientation of the first camera at the first time and the second time being the same;
a detection module configured to detect a pairing of a set of feature points depicting the same object in the first image and the second image; and
a pair pole determination module configured to determine the first pair of poles based on the pairing of the set of feature points.
12. The apparatus of claim 11, wherein the pair pole determination module comprises:
a pose determination module configured to determine a first motion pose of the first camera between the first time and the second time based on a pairing of the set of feature points;
a motion determination module configured to determine a motion of the first camera based on the first motion pose;
a pair pole identification module configured to determine the first pair of poles based on a pairing of the set of feature points in response to the motion indicating that the first camera is moving in a straight line between the first time and the second time.
13. The apparatus of claim 12, wherein the first acquisition module further comprises:
a selection module configured to select, in response to the motion indicating that the first camera is moving in a non-straight line between the first time and the second time, other images from the first set of images as the first image and the second image, respectively, until the motion determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
14. The apparatus of claim 12, wherein the first acquisition module further comprises:
a selection module configured to select, in response to the motion indicating that the first camera is moving in a non-straight line between the first time and the second time, another image from the first set of images as the second image until the motion determined based on the first image and the second image indicates that the first camera is moving in a straight line between the first time and the second time.
15. The apparatus of claim 10, wherein the attitude angle determination module comprises:
a projection angle determination module configured to project the first angle and the second angle along a top view direction of the acquisition entity to obtain a first projection angle and a second projection angle; and
a yaw angle determination module configured to determine a yaw angle of the relative attitude angle based on a difference between the first projection angle and the second projection angle.
16. The apparatus of claim 10, wherein the attitude angle determination module comprises:
a projection angle determination module configured to project the first angle and the second angle along a side view direction of the acquisition entity to obtain a first projection angle and a second projection angle; and
a pitch angle determination module configured to determine a pitch angle of the relative attitude angle based on a difference between the first projection angle and the second projection angle.
17. The apparatus of claim 11, wherein the image acquisition module comprises:
an image selection module configured to select the first image and the second image based on at least one of a movement speed of the first camera and a sampling frequency of the first camera.
18. The apparatus of claim 10, further comprising:
an angle acquisition module configured to acquire a first attitude angle of the first camera; and
an angle determination module configured to determine a second attitude angle of the second camera based on the first attitude angle and the relative attitude angle.
19. An apparatus for determining a relative attitude angle between cameras mounted to an acquisition entity, the apparatus comprising:
one or more processors; and
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-9.
20. A computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the method according to any one of claims 1-9.
21. An acquisition entity comprising the apparatus of claim 19.
22. The acquisition entity of claim 21, wherein the acquisition entity is a vehicle.
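To illustrate claims 2 through 5 and claim 8 above, the following sketch checks whether the first camera kept the same orientation (i.e. moved in a straight line) between two candidate frames and, if not, advances through the image sequence until such a pair is found. It is a non-authoritative reading of the claims that assumes OpenCV, NumPy, and a pinhole camera model; the helper names, the frame-gap heuristic, and the rotation threshold are all assumptions.

```python
# Illustrative sketch only (claims 2-5 and 8): select a pair of frames between
# which the first camera moved in a straight line. OpenCV/NumPy and all helper
# names are assumptions, not part of the claims.
import math
import cv2
import numpy as np

def matched_points(img_a, img_b):
    """Pair feature points depicting the same objects in both images."""
    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, des_a = orb.detectAndCompute(img_a, None)
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des_a, des_b)
    pts_a = np.float32([kp_a[m.queryIdx].pt for m in matches])
    pts_b = np.float32([kp_b[m.trainIdx].pt for m in matches])
    return pts_a, pts_b

def straight_line_motion(img_a, img_b, K, max_rotation_deg=1.0):
    """Recover the motion pose between the two shots; a near-identity rotation is
    taken to mean the camera orientation did not change (straight-line motion)."""
    pts_a, pts_b = matched_points(img_a, img_b)
    E, _ = cv2.findEssentialMat(pts_a, pts_b, K, method=cv2.RANSAC)
    _, R, _, _ = cv2.recoverPose(E, pts_a, pts_b, K)
    cos_theta = np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0)  # rotation angle from trace
    return math.degrees(math.acos(cos_theta)) <= max_rotation_deg

def select_frame_pair(frames, K, frame_gap=5):
    """The gap between candidate frames may be chosen from the camera's movement
    speed and sampling frequency (claim 8); other frames are tried until the
    motion between them is a straight line (claims 4 and 5)."""
    for i in range(len(frames) - frame_gap):
        if straight_line_motion(frames[i], frames[i + frame_gap], K):
            return frames[i], frames[i + frame_gap]
    return None
```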
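The angle computation recited in claims 1, 6, 7, and 9 can likewise be sketched in a few lines. Given the epipole, the principal point, and the focal length of each camera, the horizontal and vertical offsets of the epipole from the principal point give the top-view and side-view projections of the angle between the optical axis and the shared straight-line motion direction; the differences between the two cameras' projected angles give the relative yaw and pitch. This is a hedged reading of the claims with hypothetical function names, not the patent's authoritative implementation.

```python
# Illustrative sketch only (claims 1, 6, 7 and 9): per-camera angles from the
# epipole, the principal point and the focal length, then the relative yaw and
# pitch as differences of the projected angles. Names are hypothetical.
import math

def camera_angles(epipole, principal_point, focal_length):
    """Angles between the optical axis and the motion direction, projected along
    the top-view direction (horizontal offset) and the side-view direction
    (vertical offset) of the acquisition entity."""
    ex, ey = epipole
    cx, cy = principal_point
    top_view_angle = math.atan2(ex - cx, focal_length)   # feeds the yaw
    side_view_angle = math.atan2(ey - cy, focal_length)  # feeds the pitch
    return top_view_angle, side_view_angle

def relative_attitude_angle(epipole_1, pp_1, f_1, epipole_2, pp_2, f_2):
    """Both cameras observe the same straight-line motion of the acquisition
    entity, so the differences of their projected angles are the relative angles."""
    yaw_1, pitch_1 = camera_angles(epipole_1, pp_1, f_1)
    yaw_2, pitch_2 = camera_angles(epipole_2, pp_2, f_2)
    return yaw_1 - yaw_2, pitch_1 - pitch_2

def second_camera_attitude(first_camera_attitude, relative_angle):
    """Claim 9: with the first camera's attitude angle known, the second camera's
    attitude angle follows from adding the relative angle."""
    return first_camera_attitude + relative_angle
```

As a small worked example under these assumptions: a first camera whose epipole sits 100 px to the right of its principal point at a focal length of 1000 px yields a top-view angle of about 5.7 degrees; if the second camera's epipole coincides with its principal point, the relative yaw is likewise about 5.7 degrees.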
CN202110926887.7A 2018-03-19 2018-03-19 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity Active CN113487676B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110926887.7A CN113487676B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202110926887.7A CN113487676B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity
CN201810225217.0A CN108564626B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative pose angle between cameras mounted to an acquisition entity

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201810225217.0A Division CN108564626B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative pose angle between cameras mounted to an acquisition entity

Publications (2)

Publication Number Publication Date
CN113487676A CN113487676A (en) 2021-10-08
CN113487676B true CN113487676B (en) 2023-06-20

Family

ID=63532749

Family Applications (2)

Application Number Title Priority Date Filing Date
CN202110926887.7A Active CN113487676B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity
CN201810225217.0A Active CN108564626B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative pose angle between cameras mounted to an acquisition entity

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201810225217.0A Active CN108564626B (en) 2018-03-19 2018-03-19 Method and apparatus for determining relative pose angle between cameras mounted to an acquisition entity

Country Status (1)

Country Link
CN (2) CN113487676B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113487676B (en) * 2018-03-19 2023-06-20 百度在线网络技术(北京)有限公司 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity
US11022972B2 (en) * 2019-07-31 2021-06-01 Bell Textron Inc. Navigation system with camera assist

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104729481A (en) * 2015-03-12 2015-06-24 北京空间飞行器总体设计部 Cooperative target pose precision measurement method based on PNP perspective model
CN107646126A (en) * 2015-07-16 2018-01-30 谷歌有限责任公司 Camera Attitude estimation for mobile device

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101710932B (en) * 2009-12-21 2011-06-22 华为终端有限公司 Image stitching method and device
CN103673995B (en) * 2013-11-29 2016-09-21 航天恒星科技有限公司 A kind of linear array push-broom type camera optical distortion parameter calibration method in-orbit
US11051000B2 (en) * 2014-07-14 2021-06-29 Mitsubishi Electric Research Laboratories, Inc. Method for calibrating cameras with non-overlapping views
WO2016123448A2 (en) * 2015-01-30 2016-08-04 Catanzariti Scott Paul Systems and method for mapping the ocular surface usually obstructed by the eyelids
CN105389819B (en) * 2015-11-13 2019-02-01 武汉工程大学 A kind of lower visible image method for correcting polar line of half calibration and system of robust
EP3182373B1 (en) * 2015-12-17 2019-06-19 STMicroelectronics S.A. Improvements in determination of an ego-motion of a video apparatus in a slam type algorithm
CN107133987B (en) * 2017-05-16 2019-07-19 西北工业大学 The camera array of non-overlapping visual field optimizes scaling method
CN107392951A (en) * 2017-06-06 2017-11-24 上海卫星工程研究所 Remote sensing images high accuracy rapid registering method
CN113487676B (en) * 2018-03-19 2023-06-20 百度在线网络技术(北京)有限公司 Method and apparatus for determining relative attitude angle between cameras mounted to acquisition entity

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104729481A (en) * 2015-03-12 2015-06-24 北京空间飞行器总体设计部 Cooperative target pose precision measurement method based on PNP perspective model
CN107646126A (en) * 2015-07-16 2018-01-30 谷歌有限责任公司 Camera Attitude estimation for mobile device

Also Published As

Publication number Publication date
CN113487676A (en) 2021-10-08
CN108564626B (en) 2021-08-31
CN108564626A (en) 2018-09-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant