
CN110750094A - Method, device and system for determining pose change information of movable equipment - Google Patents


Info

Publication number
CN110750094A
CN110750094A
Authority
CN
China
Prior art keywords
relative
position information
dimensional image
dimensional
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810813571.5A
Other languages
Chinese (zh)
Inventor
程潇
李裕超
宋江新
毛慧
浦世亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201810813571.5A
Publication of CN110750094A
Legal status: Pending

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 — Control of position or course in two dimensions
    • G05D1/021 — Control of position or course in two dimensions specially adapted to land vehicles

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The disclosure relates to a method, an apparatus, and a system for determining pose change information of a movable device, and belongs to the technical field of machine vision. The method comprises: determining, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of the real object point corresponding to each feature point with respect to the movable device; determining, during the movement of the movable device, three-dimensional position information of the feature points contained in the image currently captured by each three-dimensional image capturing component; and determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components. With the method, the apparatus, and the system, pose change information can be determined reliably under harsh environmental conditions.

Description

Method, device and system for determining pose change information of movable equipment
Technical Field
The present disclosure relates to the field of machine vision, and in particular to a method, an apparatus, and a system for determining pose change information of a movable device.
Background
In the related art, a movable device can navigate in an unfamiliar environment with a visual navigation technique, using images captured by a three-dimensional image capturing component installed in the movable device. The three-dimensional image capturing component may include two cameras, so each capture yields two two-dimensional images, a two-dimensional image A and a two-dimensional image B. Depth information for the pixel points in two-dimensional image A can be determined based on image A, image B, and the baseline distance between the two cameras. The three-dimensional position information of a pixel point then consists of its two-dimensional position in image A and its depth.
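The depth recovery described above follows the standard pinhole stereo relation Z = f·B/d. The sketch below illustrates it; the focal length and disparity values are illustrative assumptions, not values from the disclosure.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth (meters) of a scene point under the pinhole stereo model: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A pixel matched at column 420 in image A and column 400 in image B has a
# disparity of 20 px; with a 700 px focal length and a 0.12 m baseline:
z = depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=20.0)
# z = 700 * 0.12 / 20 = 4.2 meters
```

The three-dimensional position information of the pixel is then its (u, v) position in image A together with this depth.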
When the movable device is powered on, it may be considered to be at an initial position. The movable device can identify whether the image captured by the three-dimensional image capturing component contains an obstacle. If so, the relative position information of the obstacle with respect to the movable device must be tracked continuously while the device moves, so that the device can be controlled to move around the obstacle. The relative position information of the obstacle with respect to the moved device can be determined from the relative position information of the obstacle with respect to the device at the initial position and the pose change information of the device with respect to the initial position, where the pose change information records how much the position and attitude of the movable device have changed relative to the initial position. Here, relative position information accounts for both the position and the attitude of the movable device: for example, it may be the coordinates of the obstacle in a coordinate system established on the movable device, or a statement that the obstacle lies at a certain distance and a certain angle to the left (or right) of the device's forward direction.
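The update of the obstacle's relative position from the pose change can be sketched in the plane (a simplification of the general case; the function name and the (dx, dy, dtheta) parametrization are illustrative assumptions):

```python
import math

def obstacle_relative_after_move(obstacle_xy, dx, dy, dtheta):
    """Given the obstacle's coordinates in the device frame at the initial
    position, and the device's planar pose change (dx, dy forward/left
    translation, dtheta counter-clockwise rotation) relative to that position,
    return the obstacle's coordinates in the moved device frame."""
    ox, oy = obstacle_xy
    # Undo the translation, then undo the rotation (inverse of the device motion).
    rx, ry = ox - dx, oy - dy
    c, s = math.cos(-dtheta), math.sin(-dtheta)
    return (c * rx - s * ry, s * rx + c * ry)
```

For example, if the device advances 1 m straight toward an obstacle 3 m ahead, the obstacle is now 2 m ahead; if the device instead turns 90 degrees left in place, the obstacle that was straight ahead is now directly to its right.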
The manner of determining pose change information is as follows: the pose change information can be determined from the relative position information of the real object points corresponding to the feature points with respect to the movable device at the initial position, together with the three-dimensional position information of the feature points in the images continuously captured by the three-dimensional image capturing component while the device moves. Feature points are pixel points with distinctive image characteristics and can be extracted from the image by a preset feature extraction algorithm, for example as FAST (Features from Accelerated Segment Test) corners; the four corners of a display, say, may be identified as FAST corners.
In carrying out the present disclosure, the inventors found that at least the following problems exist:
if only one three-dimensional image capturing component is installed in the movable device and that component faces strong light, the captured image is overexposed, the feature points in the image are difficult to identify accurately, and the determined pose change information is consequently inaccurate.
Disclosure of Invention
In order to overcome the problems in the related art, the present disclosure provides the following technical solutions:
according to a first aspect of embodiments of the present disclosure, there is provided a method of determining pose change information of a movable device including at least two three-dimensional image capturing sections, the method including:
determining the relative position information of a real object point corresponding to each feature point relative to the movable equipment according to the three-dimensional position information of at least one feature point in the image shot by each three-dimensional image shooting component when the movable equipment is at the initial position;
determining three-dimensional position information of a feature point contained in each image in the image currently captured by each three-dimensional image capturing element during the movement of the movable device;
and determining the pose change information of the movable equipment relative to the initial position according to the relative position information corresponding to the determined real object point, the three-dimensional position information of the currently determined characteristic point and the pre-stored relative pose information among different three-dimensional image shooting components.
Optionally, the determining, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, relative position information of the real object point corresponding to each feature point with respect to the movable device includes:
determining, according to the three-dimensional position information of the at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, relative position information of the real object point corresponding to each feature point with respect to the corresponding three-dimensional image capturing component; and
determining, according to that relative position information and pre-stored relative pose information between a reference three-dimensional image capturing component and the other three-dimensional image capturing components among the at least two, relative position information of the real object point corresponding to each feature point with respect to the reference three-dimensional image capturing component, which serves as the relative position information of the real object point corresponding to each feature point with respect to the movable device.
Optionally, the determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and the pre-stored relative pose information between different three-dimensional image capturing components includes:
determining three-dimensional position information of each feature point with respect to the image currently captured by the reference three-dimensional image capturing component, according to the currently determined three-dimensional position information of the feature points and the pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components; and
determining pose change information of the movable device with respect to the initial position according to the relative position information of the real object point corresponding to each feature point with respect to the reference three-dimensional image capturing component and the three-dimensional position information of each feature point with respect to the image currently captured by the reference three-dimensional image capturing component.
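Once the real object points' coordinates in the reference component's frame at the initial position and their coordinates in the current frame are both known, the pose change amounts to the rigid transform that best aligns the two point sets. The planar least-squares closed form below is one standard way to solve this (a 2D Kabsch-style solution, given here as an illustration, not necessarily the computation used in the disclosure; all names are assumptions). It returns the transform mapping initial-frame coordinates to current-frame coordinates; the device's own pose change is its inverse.

```python
import math

def estimate_planar_pose_change(initial_pts, current_pts):
    """Least-squares planar rigid transform (theta, t) such that, for matched
    point pairs, current ~= R(theta) * initial + t."""
    n = len(initial_pts)
    cpx = sum(p[0] for p in initial_pts) / n
    cpy = sum(p[1] for p in initial_pts) / n
    cqx = sum(q[0] for q in current_pts) / n
    cqy = sum(q[1] for q in current_pts) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(initial_pts, current_pts):
        # Center both point sets, then accumulate cross and dot terms.
        px, py, qx, qy = px - cpx, py - cpy, qx - cqx, qy - cqy
        num += px * qy - py * qx
        den += px * qx + py * qy
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    # Translation aligns the rotated initial centroid with the current centroid.
    tx = cqx - (c * cpx - s * cpy)
    ty = cqy - (s * cpx + c * cpy)
    return theta, (tx, ty)
```

For instance, points (0,0), (1,0), (0,1) observed later at (1,2), (1,3), (0,2) are explained by a 90-degree rotation plus a translation of (1, 2).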
Optionally, the method further comprises:
determining, among the feature points contained in the image currently captured by each three-dimensional image capturing component, target feature points whose image features match any of the at least one feature point, and determining the three-dimensional position information of each target feature point;
the determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and the pre-stored relative pose information between different three-dimensional image capturing components includes:
determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components includes:
if the number of target feature points is greater than or equal to a preset number threshold, determining pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the method further comprises:
if the number of target feature points is smaller than the preset number threshold, setting the current position as the initial position, and determining, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the reset initial position, the relative position information of the real object point corresponding to each feature point with respect to the movable device.
According to a second aspect of the embodiments of the present disclosure, there is provided an apparatus for determining pose change information of a movable device, the movable device including at least two three-dimensional image capturing components, the apparatus including:
a determining module, configured to determine, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of the real object point corresponding to each feature point with respect to the movable device;
the determining module being further configured to determine, during the movement of the movable device, three-dimensional position information of the feature points contained in the image currently captured by each three-dimensional image capturing component;
the determining module being further configured to determine pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the determining module is configured to:
determine, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, relative position information of the real object point corresponding to each feature point with respect to the corresponding three-dimensional image capturing component; and
determine, according to that relative position information and pre-stored relative pose information between a reference three-dimensional image capturing component and the other three-dimensional image capturing components among the at least two, relative position information of the real object point corresponding to each feature point with respect to the reference three-dimensional image capturing component, which serves as the relative position information of the real object point corresponding to each feature point with respect to the movable device.
Optionally, the determining module is configured to:
determine three-dimensional position information of each feature point with respect to the image currently captured by the reference three-dimensional image capturing component, according to the currently determined three-dimensional position information of the feature points and the pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components; and
determine pose change information of the movable device with respect to the initial position according to the relative position information of the real object point corresponding to each feature point with respect to the reference three-dimensional image capturing component and the three-dimensional position information of each feature point with respect to the image currently captured by the reference three-dimensional image capturing component.
Optionally, the determining module is further configured to:
determine, among the feature points contained in the image currently captured by each three-dimensional image capturing component, target feature points whose image features match any of the at least one feature point, and determine the three-dimensional position information of each target feature point; and
determine pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the determining module is configured to:
when the number of target feature points is greater than or equal to a preset number threshold, determine pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the determining module is further configured to:
when the number of target feature points is smaller than the preset number threshold, set the current position as the initial position, and determine, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the reset initial position, the relative position information of the real object point corresponding to each feature point with respect to the movable device.
According to a third aspect of the embodiments of the present disclosure, there is provided a system for determining pose change information of a movable device, the system including a server and the movable device, the movable device including at least two three-dimensional image capturing components, wherein:
the movable device is configured to capture an image through each three-dimensional image capturing component every time a preset period elapses, and send the images captured by each three-dimensional image capturing component to the server; and
the server is configured to determine, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of the real object point corresponding to each feature point with respect to the movable device; determine, during the movement of the movable device, three-dimensional position information of the feature points contained in the image currently captured by each three-dimensional image capturing component; and determine pose change information of the movable device with respect to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
According to a fourth aspect of embodiments of the present disclosure, there is provided a server comprising a processor, a communication interface, a memory, and a communication bus, wherein:
the processor, the communication interface, and the memory communicate with one another via the communication bus;
the memory is configured to store a computer program; and
the processor is configured to execute the program stored in the memory so as to implement the above method of determining pose change information of a movable device.
According to a fifth aspect of the embodiments of the present disclosure, there is provided a computer-readable storage medium having stored therein a computer program which, when executed by a processor, implements the above-described method of determining pose change information of a movable device.
The technical solutions provided by the embodiments of the present disclosure can have the following beneficial effects:
with the method provided by the embodiments of the present disclosure, the pose change information of the movable device with respect to the initial position can be determined from the images captured by the at least two three-dimensional image capturing components. Even if one of the three-dimensional image capturing components is affected by strong light or other factors, the pose change information of the movable device with respect to the initial position can still be determined from the images captured by the remaining components. The pose change information can therefore be determined reliably even under harsh environmental conditions.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. In the drawings:
FIG. 1 is a schematic flow diagram illustrating a method of determining pose change information of a movable device in accordance with an exemplary embodiment;
FIG. 2 is a schematic structural diagram of the base of a movable device, in accordance with an exemplary embodiment;
FIG. 3 is a schematic diagram illustrating a method of determining pose change information of a movable device in accordance with an exemplary embodiment;
FIG. 4 is a schematic diagram of a transformation relationship illustrating a method of determining pose change information of a movable device in accordance with an exemplary embodiment;
FIG. 5 is a schematic diagram illustrating an arrangement of an apparatus for determining pose change information of a movable device in accordance with an exemplary embodiment;
fig. 6 is a schematic diagram illustrating a configuration of a server according to an example embodiment.
Illustration of the drawings:
1-6: cameras; 7: hardware trigger board;
8: metal base plate; 9-11: wiring holes;
12-17: rectangular grooves.
With the foregoing drawings in mind, certain embodiments of the disclosure have been shown and described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
The embodiment of the disclosure provides a method for determining pose change information of a movable device, which can be implemented jointly by a movable device and a server. The movable device may be a sweeping robot, a robot, an unmanned vehicle, or the like.
The server may include a processor, memory, and the like. The processor, which may be a CPU (Central Processing Unit), may be configured to determine, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, the relative position information of the real object point corresponding to each feature point with respect to the movable device, and so on. The memory may be RAM (Random Access Memory), Flash memory, or the like, and may be configured to store received data, data required by the processing, and data generated during the processing, such as the relative position information corresponding to the real object points.
The server may also include a transceiver for data transmission with the movable device; the transceiver may include a Bluetooth component, a WiFi (Wireless Fidelity) component, an antenna, a matching circuit, a modem, and the like.
An exemplary embodiment of the present disclosure provides a method for determining pose change information of a movable device, where the movable device includes at least two three-dimensional image capturing components, as shown in fig. 1, and a processing flow of the method may include the following steps:
step S110, determining the relative position information of the object point corresponding to each feature point relative to the movable equipment according to the three-dimensional position information of at least one feature point in the image shot by each three-dimensional image shooting component when the movable equipment is at the initial position.
In the embodiment of the present disclosure, at least two three-dimensional image capturing components, and preferably three (six cameras in total), may be installed in the movable device. Fig. 2 is a schematic structural diagram of the base of the movable device. On the base, the parts numbered 1-6 correspond to six different cameras. The part numbered 7 is a hardware trigger board, which controls synchronous exposure of the six cameras. The part numbered 8 is a metal base plate, above which the cameras and the hardware trigger board are mounted. The metal base plate may be provided with three wiring holes, numbered 9-11, through which the cables of the cameras and the hardware trigger board are routed to the underside of the base. The metal base plate may further be provided with six rectangular grooves, numbered 12-17, for fixing the cameras to the plate.
The six cameras can be independent industrial cameras, and their types may include ordinary pinhole cameras, fisheye cameras, and the like. Two adjacent cameras mounted in parallel form a pair of binocular cameras (one three-dimensional image capturing component). The fields of view of different binocular pairs may or may not overlap.
The metal base plate can be made triangular, with one pair of binocular cameras on each of its three sides; the two cameras of a pair point in the same direction. The baseline distance of each pair may be set to 12 centimeters. The movable device may thus include three three-dimensional image capturing components, with the angle between the lines connecting any two of them to a preset reference center of the movable device equal to 120 degrees; that is, the angle between adjacent cameras of different binocular pairs may be set to 120 degrees.
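The triangular layout above can be sketched numerically as follows. The 12 cm baseline and the 120-degree spacing come from the description; the distance from the reference center to each pair and all names are illustrative assumptions.

```python
import math

BASELINE = 0.12  # meters, from the description

def pair_camera_positions(pair_index, center_to_pair=0.10):
    """Positions of the two cameras of stereo pair 0, 1, or 2 in the device
    frame, assuming the pairs face outward at 0, 120, and 240 degrees and sit
    center_to_pair meters from the reference center (an assumed dimension)."""
    yaw = math.radians(120 * pair_index)
    fx, fy = math.cos(yaw), math.sin(yaw)   # facing direction of the pair
    lx, ly = -fy, fx                        # lateral direction along the baseline
    cx, cy = center_to_pair * fx, center_to_pair * fy
    h = BASELINE / 2
    return ((cx - h * lx, cy - h * ly), (cx + h * lx, cy + h * ly))
```

With this arrangement, the two cameras of every pair are exactly one baseline apart, and the three pairs together cover the full 360 degrees around the device.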
After the hardware trigger board triggers a synchronous exposure of the six cameras, the captured images can be sent to the server, which performs the subsequent processing and returns the result to the movable device. The method provided by this embodiment is mainly executed by the server, in cooperation with the movable device.
In the disclosed embodiment, one three-dimensional image capturing component may include two cameras. Because the pixel points in a two-dimensional image A captured by a single camera lack depth information, the depth information of those pixel points is determined with the help of a two-dimensional image B captured by the other camera. A three-dimensional image can then be established from image A and the depth information of its pixel points, and every pixel point in the three-dimensional image has corresponding three-dimensional position information.
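Turning a pixel of image A plus its recovered depth into a 3D point in the camera frame is the standard pinhole back-projection; a minimal sketch follows, where the intrinsic parameters (fx, fy, cx, cy) are placeholders, not values from the disclosure.

```python
def pixel_to_3d(u, v, depth, fx, fy, cx, cy):
    """Back-project pixel (u, v) of two-dimensional image A, with its depth,
    into camera coordinates under the pinhole model:
    X = (u - cx) * Z / fx,  Y = (v - cy) * Z / fy,  Z = depth."""
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    return (x, y, depth)

# A pixel at the principal point maps onto the optical axis:
p = pixel_to_3d(320, 240, 2.0, 700.0, 700.0, 320.0, 240.0)  # (0.0, 0.0, 2.0)
```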
Next, feature points can be extracted from the images captured by all the cameras, for example with the ORB (Oriented FAST and Rotated BRIEF) algorithm, the DAISY descriptor, the SIFT (Scale-Invariant Feature Transform) algorithm, or similar algorithms. The embodiment of the present disclosure is mainly described using FAST (Features from Accelerated Segment Test) corners extracted by the ORB algorithm as an example; the other algorithms are similar and are not repeated here.
The ORB algorithm takes any pixel point in a captured image as a circle center and examines the gray values of the 16 pixel points on a circle around it. If there exist N (generally N = 9) consecutive pixel points among the 16 whose gray values are all greater than the gray value of the circle center plus a preset gray value threshold, or all smaller than the gray value of the circle center minus the preset gray value threshold, the circle center is a FAST corner.
Taking a point P in a captured image as the circle center, each of the 16 surrounding pixel points is in one of three states relative to P: its gray value is brighter than P, similar to P, or darker than P. The state can be determined by formula 1:
S_{p→x} = d (darker),  if I_{p→x} ≤ I_p − T
S_{p→x} = s (similar), if I_p − T < I_{p→x} < I_p + T
S_{p→x} = b (brighter), if I_p + T ≤ I_{p→x}        (formula 1)
wherein S_{p→x} is the state obtained by comparing the gray value of one of the 16 pixel points x with that of point P, taking one of three values: d (darker), s (similar) or b (brighter). I denotes the gray value, and T is the preset gray value threshold, which can be adjusted according to the environment.
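The segment test of formula 1 can be sketched in a few lines (a minimal pure-Python illustration, not the patent's implementation; the 16 ring gray values are assumed to be given):

```python
# Sketch of the three-state comparison of formula 1 and the N-consecutive
# test described above. T is the preset gray value threshold.
def classify(center_gray, neighbor_gray, T=20):
    """Return 'd' (darker), 's' (similar) or 'b' (brighter) for one of the
    16 ring pixels compared against the candidate center pixel."""
    if neighbor_gray <= center_gray - T:
        return 'd'
    if neighbor_gray >= center_gray + T:
        return 'b'
    return 's'

def is_fast_corner(center_gray, ring, T=20, N=9):
    """A center pixel is a FAST corner when N consecutive ring pixels are
    all brighter than center+T or all darker than center-T. The ring is
    circular, so the run of consecutive pixels may wrap around."""
    states = [classify(center_gray, g, T) for g in ring]
    doubled = states + states            # handle wrap-around runs
    for want in ('b', 'd'):
        run = 0
        for s in doubled:
            run = run + 1 if s == want else 0
            if run >= N:
                return True
    return False

ring = [200] * 10 + [128] * 6            # 10 consecutive brighter pixels
print(is_fast_corner(128, ring))         # True with N = 9
```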
The attributes of a FAST corner can be described by a BRIEF descriptor. A circular area with a diameter of 31 pixels may be selected around the FAST corner, N pairs of pixel points in the circular area are selected according to a predetermined rule and their gray values are compared, and a 256-bit binary string is generated from the comparison results; this string serves as the BRIEF descriptor of the FAST corner.
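The descriptor construction can be illustrated with a toy BRIEF-style sketch (real BRIEF/ORB uses a fixed sampling pattern inside the 31×31 patch rather than freshly drawn random pairs; the fixed seed below stands in for that pattern, and the patch values are synthetic):

```python
import random

# Toy BRIEF-style descriptor: 256 pairwise gray-value comparisons -> 256 bits.
def brief_descriptor(patch, num_bits=256, seed=0):
    """patch: 2-D list of gray values centered on a FAST corner.
    Returns a num_bits-long tuple of 0/1 bits."""
    rng = random.Random(seed)            # fixed seed -> reproducible pattern
    h, w = len(patch), len(patch[0])
    bits = []
    for _ in range(num_bits):
        y1, x1 = rng.randrange(h), rng.randrange(w)
        y2, x2 = rng.randrange(h), rng.randrange(w)
        bits.append(1 if patch[y1][x1] < patch[y2][x2] else 0)
    return tuple(bits)

def hamming(d1, d2):
    """Descriptor distance used for matching: number of differing bits."""
    return sum(a != b for a, b in zip(d1, d2))

# Synthetic 31x31 gray patch for illustration.
patch = [[(x * 7 + y * 13) % 256 for x in range(31)] for y in range(31)]
d = brief_descriptor(patch)
print(len(d), hamming(d, d))             # 256 0
```

Two FAST corners are later considered a match when the Hamming distance between their descriptors is below a threshold.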
The captured images may be corrected before the relative position information of the physical point corresponding to each feature point with respect to the movable device is determined. A captured image has a certain amount of distortion, so it can be corrected to obtain the image that would be produced under ideal conditions. Ideally, as shown in fig. 3, the camera coordinate systems of a pair of binocular cameras differ only by a translation of the origin from point O2 to point O1, with no rotation between the coordinate systems.
After the ideal-case images are obtained, for each pair of images captured by a pair of binocular cameras, the FAST corners in the image captured by the right camera that match the FAST corners in the image captured by the left camera can be determined, and the depth information of each FAST corner can be determined from the baseline distance of the cameras.
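Recovering depth from the baseline can be illustrated with the standard rectified-stereo relation Z = f·b/d, where d is the column disparity between matched FAST corners in the left and right images (a sketch under the pinhole model; the 700-pixel focal length is an assumed value, while the 12 cm baseline follows the layout described above):

```python
# Depth of a matched FAST corner from a rectified binocular pair.
def stereo_depth(x_left, x_right, focal_px, baseline_m):
    """x_left/x_right: corner column in the left/right image (pixels).
    Returns depth in meters, or None for a non-positive disparity
    (match at infinity or an invalid match)."""
    disparity = x_left - x_right
    if disparity <= 0:
        return None
    return focal_px * baseline_m / disparity

# Assumed 700 px focal length, 12 cm baseline: 20 px disparity -> ~4.2 m.
print(stereo_depth(420.0, 400.0, 700.0, 0.12))
```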
In the embodiment of the present disclosure, it may be considered that the right camera assists the left camera to determine depth information of a FAST corner in an image captured by the left camera, and the determination processing of pose change information is performed mainly based on the image captured by the left camera.
The camera coordinate systems can be established respectively by taking the left camera of each pair of binocular cameras as the reference. A camera coordinate system is established with the optical center of the camera as the origin, the optical axis as the Z-axis, the line of the long edge of the captured image as the X-axis, and the line of the wide edge as the Y-axis. Thus, for the apparatus in fig. 2, three camera coordinate systems are obtained. Among the three, the camera coordinate system established for the camera corresponding to reference numeral 1 may serve as the body coordinate system.
Before step S110 is executed, the relative pose information between the body coordinate system and each of the other two camera coordinate systems may be calibrated in advance. This relative pose information may also be referred to as the external parameters of the camera coordinate systems other than the body coordinate system with respect to the body coordinate system, denoted T_bl (for the camera corresponding to reference numeral 4) and T_br (for the camera corresponding to reference numeral 5).
Then, the relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component can be determined according to the three-dimensional position information, in the image, of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position. Finally, the relative position information of the real object point corresponding to each feature point relative to the reference three-dimensional image capturing component can be determined according to the relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component and the pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components among the at least two three-dimensional image capturing components; this relative position information relative to the reference three-dimensional image capturing component can be used as the relative position information of the real object point corresponding to each feature point.
The relative position information is the relative position information of the obstacle with respect to the movable device under the current position and posture of the movable device. If the movable device is currently at the initial position, it is the relative position information of the obstacle with respect to the movable device at the initial position. The relative position information of a physical point with respect to the movable device may be the coordinate information of the physical point in a coordinate system established with a preset point of the movable device as the origin; for example, a coordinate system established with the optical center of the reference three-dimensional image capturing component of the movable device as the origin, the line of the optical axis as the Z-axis, the plumb line as the Y-axis, and the line perpendicular to both the Z-axis and the Y-axis as the X-axis.
In one application, the relative position information of a physical point with respect to the movable device at initial position A may indicate that the physical point is 20 meters away at 30 degrees to the front-left of the movable device. After the movable device moves to position B, the relative position information of the same physical point with respect to the movable device at position B may indicate that the physical point is 15 meters away at 60 degrees to the front-right of the movable device.
The at least two three-dimensional image capturing means may include a reference three-dimensional image capturing means (a camera corresponding to reference numeral 1) and a non-reference three-dimensional image capturing means (a camera corresponding to reference numeral 4 and reference numeral 5, respectively) other than the reference three-dimensional image capturing means.
First, the relative position information of the real object point corresponding to at least one feature point in the image captured by each non-reference three-dimensional image capturing component with respect to that non-reference three-dimensional image capturing component can be determined according to the three-dimensional position information, in the image, of the at least one feature point when the movable device is at the initial position, and the pre-stored conversion relationship (also referred to as the camera intrinsic parameters) corresponding to each non-reference three-dimensional image capturing component between the three-dimensional position information of a feature point and the relative position information of the real object point with respect to the non-reference three-dimensional image capturing component.
Next, the relative position information (also referred to as map points) of the real object point corresponding to the at least one feature point in the image captured by each non-reference three-dimensional image capturing means with respect to the reference three-dimensional image capturing means may be determined based on the relative position information of the real object point corresponding to the at least one feature point in the image captured by each non-reference three-dimensional image capturing means with respect to the non-reference three-dimensional image capturing means and the relative pose information of each non-reference three-dimensional image capturing means with respect to the reference three-dimensional image capturing means (also referred to as external parameters of the camera coordinate system other than the body coordinate system with respect to the body coordinate system in the three camera coordinate systems). Wherein, the initial position of the reference three-dimensional image shooting component is the initial position of the movable equipment.
Finally, the relative position information (also referred to as map point) of the real object point corresponding to the at least one feature point in the image captured by the reference three-dimensional image capturing means with respect to the reference three-dimensional image capturing means may be determined based on the three-dimensional position information of the at least one feature point in the image captured by the reference three-dimensional image capturing means in the image at the initial position of the movable device, and the conversion relationship between the three-dimensional position information of the feature point corresponding to the reference three-dimensional image capturing means and the relative position information of the real object point with respect to the reference three-dimensional image capturing means, which are stored in advance.
Based on the following formula, the relative position information (also referred to as a map point) of the physical point corresponding to each feature point with respect to the movable device can be determined:

P = T · K^{-1} · Z · m    (formula 2)

wherein P denotes the coordinates of the map point in the world coordinate system; K is the intrinsic parameter matrix of the camera in the corresponding three-dimensional image capturing component; m is the homogeneous pixel coordinate of the FAST corner; Z is the depth information of the FAST corner; and T is the pose of the body coordinate system in the world coordinate system (the identity at the initial position).
The world coordinate system is a body coordinate system of the movable equipment when the movable equipment is started, the world coordinate system is fixed, and the body coordinate system can translate and rotate along with the movement of the movable equipment.
Through the camera intrinsic parameters, the position information of pixel points in a captured image can be converted into the position information of physical points in the camera coordinate system. The camera intrinsic parameters include intrinsic attributes of the camera, such as its shooting focal length.
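This conversion through the camera intrinsic (internal) parameters is the standard pinhole back-projection of a pixel with known depth into the camera coordinate system; a sketch follows (fx, fy, cx, cy are illustrative values, not a real calibration):

```python
# Back-project pixel (u, v) with depth Z into the camera coordinate system:
# the formula-2 step P_cam = Z * K^{-1} * [u, v, 1]^T written out per axis.
def pixel_to_camera(u, v, Z, fx, fy, cx, cy):
    """fx, fy: focal lengths in pixels; cx, cy: principal point."""
    x = (u - cx) * Z / fx
    y = (v - cy) * Z / fy
    return (x, y, Z)

# Illustrative intrinsics: fx = fy = 700 px, principal point (320, 240).
# A corner at pixel (420, 260) with 4 m depth lands at roughly
# (0.571, 0.114, 4.0) in the camera frame.
print(pixel_to_camera(420.0, 260.0, 4.0, 700.0, 700.0, 320.0, 240.0))
```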
After a plurality of map points are obtained, map building is finished. The map records the position information of the different map points in the world coordinate system and the corresponding BRIEF descriptors. It should be noted that the number of map points depends on the distance the movable device has moved. During the movement of the movable device, new images are continuously captured, and map points are continuously matched against the feature points in the newly captured images. When the number of map points matching the feature points in a newly captured image is smaller than a preset number threshold, the map is rebuilt and new map points are generated. The map points that have already been matched no longer participate in the process of generating new map points.
Since the generation of map points is related to the frame number of the captured images, the frame number of the image may be recorded together with the map points. In practical applications, for example, when the frame number of the currently captured image is N, the matching process uses the map points generated from the image with frame number N − T and projects those map points into the currently captured image, where T is a fixed value that can be adjusted according to the specific situation.
Step S120: during the movement of the movable device, determine the three-dimensional position information, in the image, of the feature points contained in the image currently captured by each three-dimensional image capturing component.
During the movement of the movable device, the three-dimensional position information of the feature points contained in the image currently captured by each three-dimensional image capturing component may be determined at a preset period.
Through the camera intrinsic parameters, the three-dimensional position information of pixel points in an image can be converted into the position information of physical points in the camera coordinate system. For the apparatus in fig. 2, the position information of the physical points corresponding to the pixel points in the images captured by the cameras corresponding to reference numerals 4 and 5 can be converted, through their respective camera intrinsic parameters, into position information in the camera coordinate systems of those cameras. As shown in fig. 4, the relative position information between the different three-dimensional image capturing components may be the external parameters of the camera coordinate systems other than the body coordinate system with respect to the body coordinate system, denoted T_bl (for the camera corresponding to reference numeral 4) and T_br (for the camera corresponding to reference numeral 5). Through the external parameters T_bl and T_br, the position information of the physical points in the camera coordinate systems of the cameras corresponding to reference numerals 4 and 5 can be converted into position information in the body coordinate system. In this way, the position information of the physical points corresponding to the images captured by the three different three-dimensional image capturing components can all be expressed in the body coordinate system.
Further, the position information of the object point in the body coordinate system can be converted into the position information of the object point in the world coordinate system by the pose change information.
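The chain of conversions — camera coordinate system to body coordinate system via the extrinsic T_bl, then body coordinate system to world coordinate system via the pose — can be sketched with homogeneous transforms (translation-only matrices are chosen for clarity; calibrated extrinsics also include a rotation, and all numeric values here are hypothetical):

```python
def make_transform(t):
    """4x4 homogeneous transform with identity rotation and translation t."""
    return [[1.0, 0.0, 0.0, t[0]],
            [0.0, 1.0, 0.0, t[1]],
            [0.0, 0.0, 1.0, t[2]],
            [0.0, 0.0, 0.0, 1.0]]

def apply_transform(T, p):
    """Apply a 4x4 homogeneous transform T to a 3-D point p."""
    v = (p[0], p[1], p[2], 1.0)
    return tuple(sum(T[r][c] * v[c] for c in range(4)) for r in range(3))

T_bl = make_transform((0.10, 0.0, 0.0))  # extrinsic: left camera offset in body frame (assumed)
T_wb = make_transform((2.0, 0.0, 1.0))   # pose: body frame in the world frame (assumed)
p_cam = (1.0, 2.0, 5.0)                  # physical point in the left-camera frame
p_body = apply_transform(T_bl, p_cam)    # camera frame -> body frame
p_world = apply_transform(T_wb, p_body)  # body frame -> world frame
print(p_body, p_world)
```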
The method provided by the embodiment of the disclosure further comprises the following steps: the method includes the steps of determining target feature points matched with image features of any feature point in at least one feature point from feature points included in an image currently captured by each three-dimensional image capturing component, and determining three-dimensional position information of each target feature point.
Since the current pose change information has not yet been determined, a map point (the position information of a physical point in the world coordinate system) can be projected into the currently captured image by means of the most recently determined pose change information, obtaining a projection point in the currently captured image. Among the BRIEF descriptors of the FAST corners contained in the neighborhood of the projection point, a target feature point m whose descriptor is within a preset feature-vector distance threshold of the BRIEF descriptor of the map point is selected; this target feature point is the feature point matching the map point. The three-dimensional position information of each target feature point is then determined.
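The neighborhood search just described can be sketched as follows (a hypothetical helper; the search radius and descriptor-distance threshold are illustrative values, and descriptors are bit tuples compared by Hamming distance):

```python
# Match a projected map point to a FAST corner near its projection.
def match_map_point(proj_uv, map_desc, corners, radius=15, max_dist=50):
    """proj_uv: (u, v) projection of the map point in the current image.
    corners: list of ((u, v), descriptor) for the image's FAST corners.
    Returns the pixel position of the best corner within `radius` pixels
    whose Hamming distance to map_desc is below max_dist, else None."""
    def hamming(d1, d2):
        return sum(a != b for a, b in zip(d1, d2))
    pu, pv = proj_uv
    best, best_d = None, max_dist
    for (u, v), desc in corners:
        if abs(u - pu) <= radius and abs(v - pv) <= radius:
            d = hamming(desc, map_desc)
            if d < best_d:
                best, best_d = (u, v), d
    return best

desc = (0, 1) * 128                      # 256-bit toy descriptor
corners = [((100, 100), desc), ((400, 400), (1, 0) * 128)]
print(match_map_point((105, 98), desc, corners))  # (100, 100)
```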
As shown in fig. 4, if there are map points P_1, P_2, …, P_n with matched projection points m_1, m_2, …, m_n, then the three-dimensional position information of m_1, m_2, m_3 can be determined from the map points P_1, P_2, P_3 and the other relevant parameters according to the following formula:

m_1 = K_1 T^{-1} P_1;  m_2 = K_2 T_bl^{-1} T^{-1} P_2;  m_3 = K_3 T_br^{-1} T^{-1} P_3    (formula 3)
wherein K is the camera intrinsic parameter matrix of the corresponding camera, T_bl and T_br are the external parameters, and T is the most recently determined pose change information. The most recently determined pose change information is only an estimate and cannot be used directly as the current pose change information.
Step S130: determine the pose change information of the movable device relative to the initial position according to the relative position information corresponding to the determined real object points, the currently determined three-dimensional position information of the feature points, and the pre-stored relative pose information between the different three-dimensional image capturing components.
First, three-dimensional position information of each feature point with respect to an image currently captured by the reference three-dimensional image capturing section may be determined based on the three-dimensional position information of the currently determined feature point and prestored relative pose information between the reference three-dimensional image capturing section and other three-dimensional image capturing sections. Then, the pose change information of the movable device relative to the initial position can be determined according to the relative position information of the object point corresponding to each feature point relative to the reference three-dimensional image capturing part and the three-dimensional position information of each feature point relative to the image currently captured by the reference three-dimensional image capturing part.
The at least two three-dimensional image capturing means include a reference three-dimensional image capturing means and a non-reference three-dimensional image capturing means other than the reference three-dimensional image capturing means.
First, it is possible to determine three-dimensional position information of a feature point included in an image currently captured by each non-reference three-dimensional image capturing section in the image and three-dimensional position information of a feature point included in an image currently captured by the reference three-dimensional image capturing section in the image during movement of the movable apparatus.
Then, three-dimensional position information of the feature point included in the image currently captured by each non-reference three-dimensional image capturing means with respect to the image currently captured by the reference three-dimensional image capturing means may be determined based on three-dimensional position information of the feature point included in the image currently captured by each non-reference three-dimensional image capturing means and relative pose information of each non-reference three-dimensional image capturing means with respect to the reference three-dimensional image capturing means, respectively.
Finally, the pose change information of the movable apparatus relative to the initial position can be determined based on the relative position information corresponding to the real-object point, the three-dimensional position information of the feature point included in the image currently captured by each non-reference three-dimensional image capturing means relative to the image currently captured by the reference three-dimensional image capturing means, and the three-dimensional position information of the feature point included in the image currently captured by the reference three-dimensional image capturing means in the image.
Alternatively, the three-dimensional position information of each target feature point may be determined by determining a target feature point that matches an image feature of any feature point of the at least one feature point, among feature points included in the image currently captured by each three-dimensional image capturing means. And then determining the pose change information of the movable equipment relative to the initial position according to the determined relative position information corresponding to the real object point, the three-dimensional position information of each target characteristic point and the pre-stored relative pose information among different three-dimensional image shooting components.
Optionally, if the number of the target feature points is greater than or equal to a preset number threshold, determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object point, the three-dimensional position information of each target feature point and pre-stored relative pose information between different three-dimensional image shooting components. And if the number of the target characteristic points is smaller than a preset number threshold, setting the current position as an initial position, and determining the relative position information of the real object point corresponding to each characteristic point relative to the movable equipment according to the three-dimensional position information of at least one characteristic point in the image shot by each three-dimensional image shooting component in the image when the movable equipment is at the reset initial position.
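The tracking/re-initialization decision described above amounts to a simple threshold test (a sketch; the threshold value is illustrative, not from the patent):

```python
# Decide between continuing pose estimation against the current map and
# resetting the initial position when too few map points still match.
def update_tracking(matched_count, threshold=30):
    """Return 'track' to keep estimating the pose against the current map,
    or 'reinit' to treat the current position as a new initial position
    and rebuild map points from the images captured there."""
    return 'track' if matched_count >= threshold else 'reinit'

print(update_tracking(45), update_tracking(10))  # track reinit
```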
And continuously shooting a new image during the moving process of the movable equipment, and continuously matching map points with the feature points in the shot new image. And when the number of the map points matched with the feature points in the shot new image is less than a preset number threshold, reestablishing the map, and generating new map points. If there are currently map points that have already been matched, they are no longer involved in the process of generating new map points.
The current pose change information can be determined according to formula 4.
min_T [ ||m_1 − K_1 T^{-1} P_1||^2 + ||m_2 − K_2 T_bl^{-1} T^{-1} P_2||^2 + ||m_3 − K_3 T_br^{-1} T^{-1} P_3||^2 ]    (formula 4)
In the above formula, T, the current pose change information, is the only unknown; the remaining parameters are known. The value of T can be optimized iteratively so that the result of formula 4 is minimized for some value of T. That is, the T that minimizes the error between the three-dimensional position information of the target feature points matched with the map points and the three-dimensional position information of the projection points theoretically computed through formula 3 is determined, and this T is the current pose change information.
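The iterative optimization of T can be illustrated on a deliberately simplified problem in which the pose is reduced to a 2-D translation and the projection to identity; the real system minimizes the full formula-4 reprojection error over a 6-DoF pose in the same iterative spirit:

```python
# Toy version of formula 4: find the translation t that minimizes
# sum ||m_i - (P_i - t)||^2 by gradient descent, standing in for the
# iterative optimization of the full pose T.
def optimize_translation(map_pts, obs_pts, iters=200, lr=0.1):
    tx, ty = 0.0, 0.0
    for _ in range(iters):
        gx = gy = 0.0
        for (px, py), (mx, my) in zip(map_pts, obs_pts):
            rx, ry = mx - (px - tx), my - (py - ty)  # residual of one term
            gx += 2 * rx                  # d/dtx of rx**2, since rx = mx - px + tx
            gy += 2 * ry
        tx -= lr * gx
        ty -= lr * gy
    return tx, ty

map_pts = [(1.0, 2.0), (3.0, 4.0), (5.0, 6.0)]
obs_pts = [(0.5, 1.5), (2.5, 3.5), (4.5, 5.5)]  # every observation shifted by (-0.5, -0.5)
tx, ty = optimize_translation(map_pts, obs_pts)
print(round(tx, 3), round(ty, 3))               # 0.5 0.5
```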
With the method provided by the embodiments of the present disclosure, the pose change information of the movable device relative to the initial position can be determined from the images captured by the at least two three-dimensional image capturing components. Even if one of the three-dimensional image capturing components is affected by strong light or other factors, the pose change information of the movable device relative to the initial position can still be determined by means of the images captured by the remaining three-dimensional image capturing components. Thus, even under severe environmental conditions, the pose change information can be reliably determined.
Yet another exemplary embodiment of the present disclosure provides an apparatus for determining pose change information of a movable device including at least two three-dimensional image capturing sections, as shown in fig. 5, the apparatus including:
a determining module 510, configured to determine, according to three-dimensional position information of at least one feature point in an image captured by each three-dimensional image capturing component in the initial position of the mobile device in the image, relative position information of a real object point corresponding to each feature point with respect to the mobile device;
the determining module 510 is further configured to determine three-dimensional position information of a feature point included in each image in the image currently captured by each three-dimensional image capturing component during the movement of the mobile device;
the determining module 510 is further configured to determine pose change information of the mobile device relative to the initial position according to the determined relative position information corresponding to the real object point, the three-dimensional position information of the currently determined feature point, and pre-stored relative pose information between different three-dimensional image capturing components.
Optionally, the determining module 510 is configured to:
determining the relative position information of a real object point corresponding to each characteristic point relative to the corresponding three-dimensional image shooting component according to the three-dimensional position information of at least one characteristic point in the image shot by each three-dimensional image shooting component when the movable equipment is at the initial position;
and determining the relative position information of the real object point corresponding to each characteristic point relative to the reference three-dimensional image shooting component according to the relative position information of the real object point corresponding to each characteristic point relative to the corresponding three-dimensional image shooting component and the pre-stored relative pose information between the reference three-dimensional image shooting component and other three-dimensional image shooting components in the at least two three-dimensional image shooting components, wherein the relative position information of the real object point corresponding to each characteristic point relative to the reference three-dimensional image shooting component is used as the relative position information of the real object point corresponding to each characteristic point relative to the movable equipment.
Optionally, the determining module 510 is configured to:
determining three-dimensional position information of each feature point relative to the image currently shot by the reference three-dimensional image shooting component according to the currently determined three-dimensional position information of the feature point and pre-stored relative pose information between the reference three-dimensional image shooting component and other three-dimensional image shooting components;
and determining the pose change information of the movable equipment relative to the initial position according to the relative position information of the object point corresponding to each characteristic point relative to the reference three-dimensional image shooting component and the three-dimensional position information of each characteristic point relative to the image currently shot by the reference three-dimensional image shooting component.
Optionally, the determining module 510 is further configured to:
determining target feature points matched with the image features of any feature point in the at least one feature point from the feature points contained in the image shot by each three-dimensional image shooting component at present, and determining three-dimensional position information of each target feature point;
and determining the pose change information of the movable equipment relative to the initial position according to the determined relative position information corresponding to the real object point, the three-dimensional position information of each target characteristic point and the pre-stored relative pose information among different three-dimensional image shooting components.
Optionally, the determining module 510 is configured to:
and when the number of the target characteristic points is greater than or equal to a preset number threshold, determining pose change information of the movable equipment relative to the initial position according to the determined relative position information corresponding to the real object point, the three-dimensional position information of each target characteristic point and pre-stored relative pose information among different three-dimensional image shooting components.
Optionally, the determining module 510 is further configured to:
and when the number of the target characteristic points is smaller than a preset number threshold, setting the current position as an initial position, and determining the relative position information of the real object point corresponding to each characteristic point relative to the movable equipment according to the three-dimensional position information of at least one characteristic point in the image shot by each three-dimensional image shooting component in the image when the movable equipment is at the reset initial position.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
With the apparatus provided above, the pose change information of the movable device relative to the initial position can be determined from the images captured by the at least two three-dimensional image capturing components. Even if one of the three-dimensional image capturing components is affected by strong light or other factors, the pose change information of the movable device relative to the initial position can still be determined by means of the images captured by the remaining three-dimensional image capturing components. Thus, even under severe environmental conditions, the pose change information can be reliably determined.
It should be noted that: in the apparatus for determining pose change information of a mobile device according to the foregoing embodiment, when determining pose change information of a mobile device, only the above-mentioned division of the function modules is illustrated, and in practical applications, the above-mentioned function distribution may be completed by different function modules according to needs, that is, the internal structure of the server is divided into different function modules, so as to complete all or part of the above-mentioned functions. In addition, the apparatus for determining pose change information of a mobile device and the method embodiment for determining pose change information of a mobile device provided in the above embodiments belong to the same concept, and specific implementation processes thereof are described in the method embodiment and are not described herein again.
Yet another exemplary embodiment of the present disclosure provides a system to determine pose change information of a movable device, the system including a server and the movable device, the movable device including at least two three-dimensional image capturing sections, wherein:
the mobile device is used for shooting images through each three-dimensional image shooting component every time a preset period is reached, and sending the images shot by each three-dimensional image shooting component to the server;
the server is configured to: determine, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of a real object point corresponding to each feature point relative to the movable device; determine, during movement of the movable device, three-dimensional position information of a feature point contained in each image currently captured by each three-dimensional image capturing component; and determine pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
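As a rough illustrative sketch (not the patent's actual implementation), the step of expressing feature points observed by different capturing components in one common frame amounts to a rigid transform using the pre-stored relative poses. The function name and data layout below are hypothetical, assuming each relative pose is stored as a rotation matrix R and translation vector t that map a component's frame into the reference component's frame:

```python
import numpy as np

def to_reference_frame(points_per_camera, relative_poses):
    """Merge 3D feature points from several capturing components into the
    reference component's frame.

    points_per_camera: {cam_id: (N_i, 3) array of points in cam_id's frame}
    relative_poses:    {cam_id: (R, t)} with p_ref = R @ p_cam + t
    Returns one (N, 3) array of all points expressed in the reference frame.
    """
    merged = []
    for cam_id, pts in points_per_camera.items():
        R, t = relative_poses[cam_id]
        # Row-vector form of p_ref = R @ p + t for every point at once.
        merged.append(pts @ R.T + t)
    return np.vstack(merged)
```

Because the relative poses between components are calibrated in advance and stored, this transform needs no per-frame estimation; it simply re-expresses each component's observations in the reference component's coordinate system.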
With regard to the system in the above embodiment, the specific manner in which the movable device and the server perform operations has been described in detail in the embodiment of the method and will not be elaborated here.
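For illustration only, one common way to realize the final step of the method, recovering the pose change from matched initial and current 3D points, is the closed-form Kabsch/Umeyama rigid alignment. The patent does not specify this particular algorithm; the sketch below assumes noiseless one-to-one correspondences already expressed in a common frame:

```python
import numpy as np

def estimate_pose_change(initial_pts, current_pts):
    """Estimate the rigid transform (R, t) with current ≈ R @ initial + t.

    initial_pts, current_pts: matched (N, 3) arrays, N >= 3 non-collinear points.
    """
    mu_a = initial_pts.mean(axis=0)
    mu_b = current_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (initial_pts - mu_a).T @ (current_pts - mu_b)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in degenerate configurations.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = mu_b - R @ mu_a
    return R, t
```

In practice a robust wrapper (e.g. RANSAC over the matched target feature points) would typically surround such a solver to reject outlier matches; that refinement is omitted here for brevity.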
Fig. 6 shows a schematic structural diagram of a server 1900 provided in an exemplary embodiment of the present disclosure. The server 1900 may vary greatly in configuration and performance, and may include one or more processors (CPUs) 1910 and one or more memories 1920. The memory 1920 stores at least one instruction, which is loaded and executed by the processor 1910 to implement the method for determining pose change information of a movable device described in the above embodiments.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (13)

1. A method for determining pose change information of a movable device, the movable device comprising at least two three-dimensional image capturing components, the method comprising:
determining, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of a real object point corresponding to each feature point relative to the movable device;
determining, during movement of the movable device, three-dimensional position information of a feature point contained in each image currently captured by each three-dimensional image capturing component; and
determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
2. The method according to claim 1, wherein the determining relative position information of a real object point corresponding to each feature point relative to the movable device according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position comprises:
determining, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component; and
determining relative position information of the real object point corresponding to each feature point relative to a reference three-dimensional image capturing component according to the relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component and pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components among the at least two three-dimensional image capturing components, wherein the relative position information of the real object point corresponding to each feature point relative to the reference three-dimensional image capturing component serves as the relative position information of the real object point corresponding to each feature point relative to the movable device.
3. The method according to claim 2, wherein the determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components comprises:
determining three-dimensional position information of each feature point relative to the image currently captured by the reference three-dimensional image capturing component according to the currently determined three-dimensional position information of the feature points and the pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components; and
determining pose change information of the movable device relative to the initial position according to the relative position information of the real object point corresponding to each feature point relative to the reference three-dimensional image capturing component and the three-dimensional position information of each feature point relative to the image currently captured by the reference three-dimensional image capturing component.
4. The method according to claim 1, further comprising:
determining, from the feature points contained in the image currently captured by each three-dimensional image capturing component, target feature points whose image features match those of any feature point of the at least one feature point, and determining three-dimensional position information of each target feature point;
wherein the determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components comprises:
determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
5. The method according to claim 4, wherein the determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components comprises:
if the number of target feature points is greater than or equal to a preset number threshold, determining pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
6. The method according to claim 5, further comprising:
if the number of target feature points is smaller than the preset number threshold, setting the current position as the initial position, and determining the relative position information of the real object point corresponding to each feature point relative to the movable device according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the reset initial position.
7. An apparatus for determining pose change information of a movable device, the movable device comprising at least two three-dimensional image capturing components, the apparatus comprising:
a determining module, configured to determine, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of a real object point corresponding to each feature point relative to the movable device;
wherein the determining module is further configured to determine, during movement of the movable device, three-dimensional position information of a feature point contained in each image currently captured by each three-dimensional image capturing component; and
the determining module is further configured to determine pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
8. The apparatus according to claim 7, wherein the determining module is configured to:
determine, according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the initial position, relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component; and
determine relative position information of the real object point corresponding to each feature point relative to a reference three-dimensional image capturing component according to the relative position information of the real object point corresponding to each feature point relative to the corresponding three-dimensional image capturing component and pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components among the at least two three-dimensional image capturing components, wherein the relative position information of the real object point corresponding to each feature point relative to the reference three-dimensional image capturing component serves as the relative position information of the real object point corresponding to each feature point relative to the movable device.
9. The apparatus according to claim 8, wherein the determining module is configured to:
determine three-dimensional position information of each feature point relative to the image currently captured by the reference three-dimensional image capturing component according to the currently determined three-dimensional position information of the feature points and the pre-stored relative pose information between the reference three-dimensional image capturing component and the other three-dimensional image capturing components; and
determine pose change information of the movable device relative to the initial position according to the relative position information of the real object point corresponding to each feature point relative to the reference three-dimensional image capturing component and the three-dimensional position information of each feature point relative to the image currently captured by the reference three-dimensional image capturing component.
10. The apparatus according to claim 7, wherein the determining module is further configured to:
determine, from the feature points contained in the image currently captured by each three-dimensional image capturing component, target feature points whose image features match those of any feature point of the at least one feature point, and determine three-dimensional position information of each target feature point; and
determine pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
11. The apparatus according to claim 10, wherein the determining module is configured to:
when the number of target feature points is greater than or equal to a preset number threshold, determine pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the three-dimensional position information of each target feature point, and the pre-stored relative pose information between different three-dimensional image capturing components.
12. The apparatus according to claim 11, wherein the determining module is further configured to:
when the number of target feature points is smaller than the preset number threshold, set the current position as the initial position, and determine the relative position information of the real object point corresponding to each feature point relative to the movable device according to the three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at the reset initial position.
13. A system for determining pose change information of a movable device, the system comprising a server and the movable device, the movable device comprising at least two three-dimensional image capturing components, wherein:
the movable device is configured to capture an image with each three-dimensional image capturing component each time a preset period elapses, and to send the images captured by each three-dimensional image capturing component to the server; and
the server is configured to: determine, according to three-dimensional position information of at least one feature point in the image captured by each three-dimensional image capturing component when the movable device is at an initial position, relative position information of a real object point corresponding to each feature point relative to the movable device; determine, during movement of the movable device, three-dimensional position information of a feature point contained in each image currently captured by each three-dimensional image capturing component; and determine pose change information of the movable device relative to the initial position according to the determined relative position information corresponding to the real object points, the currently determined three-dimensional position information of the feature points, and pre-stored relative pose information between different three-dimensional image capturing components.
CN201810813571.5A 2018-07-23 2018-07-23 Method, device and system for determining pose change information of movable equipment Pending CN110750094A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810813571.5A CN110750094A (en) 2018-07-23 2018-07-23 Method, device and system for determining pose change information of movable equipment


Publications (1)

Publication Number Publication Date
CN110750094A true CN110750094A (en) 2020-02-04

Family

ID=69275177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810813571.5A Pending CN110750094A (en) 2018-07-23 2018-07-23 Method, device and system for determining pose change information of movable equipment

Country Status (1)

Country Link
CN (1) CN110750094A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103988226A (en) * 2011-08-31 2014-08-13 Metaio有限公司 Method for estimating camera motion and for determining three-dimensional model of real environment
CN105825518A (en) * 2016-03-31 2016-08-03 西安电子科技大学 Sequence image rapid three-dimensional reconstruction method based on mobile platform shooting
WO2017007254A1 (en) * 2015-07-08 2017-01-12 고려대학교 산학협력단 Device and method for generating and displaying 3d map
CN106713773A (en) * 2017-03-31 2017-05-24 联想(北京)有限公司 Shooting control method and electronic device
CN106813672A (en) * 2017-01-22 2017-06-09 深圳悉罗机器人有限公司 The air navigation aid and mobile robot of mobile robot
JP2017134617A (en) * 2016-01-27 2017-08-03 株式会社リコー Position estimation device, program and position estimation method
CN107330917A (en) * 2017-06-23 2017-11-07 歌尔股份有限公司 The track up method and tracking equipment of mobile target
CN208752459U (en) * 2018-07-23 2019-04-16 杭州海康威视数字技术股份有限公司 Movable equipment


Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111780764A (en) * 2020-06-30 2020-10-16 杭州海康机器人技术有限公司 Visual positioning method and device based on visual map
CN111780764B (en) * 2020-06-30 2022-09-02 杭州海康机器人技术有限公司 Visual positioning method and device based on visual map
CN112013850B (en) * 2020-10-16 2021-11-19 北京猎户星空科技有限公司 Positioning method, positioning device, self-moving equipment and storage medium
CN113298879A (en) * 2021-05-26 2021-08-24 北京京东乾石科技有限公司 Visual positioning method and device, storage medium and electronic equipment
CN113298879B (en) * 2021-05-26 2024-04-16 北京京东乾石科技有限公司 Visual positioning method and device, storage medium and electronic equipment

Similar Documents

Publication Publication Date Title
CN108765498B (en) Monocular vision tracking, device and storage medium
US10085011B2 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
Carrera et al. SLAM-based automatic extrinsic calibration of a multi-camera rig
EP3886053A1 (en) Slam mapping method and system for vehicle
CN112444242B (en) Pose optimization method and device
US20170127045A1 (en) Image calibrating, stitching and depth rebuilding method of a panoramic fish-eye camera and a system thereof
JP2019536170A (en) Virtually extended visual simultaneous localization and mapping system and method
CN109816730A (en) Workpiece grabbing method, apparatus, computer equipment and storage medium
JP2017108387A (en) Image calibrating, stitching and depth rebuilding method of panoramic fish-eye camera and system thereof
CN103839227B (en) Fisheye image correcting method and device
KR20160116075A (en) Image processing apparatus having a function for automatically correcting image acquired from the camera and method therefor
CN207766424U (en) A kind of filming apparatus and imaging device
CN111737518A (en) Image display method and device based on three-dimensional scene model and electronic equipment
CN109635639B (en) Method, device, equipment and storage medium for detecting position of traffic sign
CN110750094A (en) Method, device and system for determining pose change information of movable equipment
CN110490943B (en) Rapid and accurate calibration method and system of 4D holographic capture system and storage medium
CN109902675B (en) Object pose acquisition method and scene reconstruction method and device
WO2021195939A1 (en) Calibrating method for external parameters of binocular photographing device, movable platform and system
CN113329179B (en) Shooting alignment method, device, equipment and storage medium
CN114943773A (en) Camera calibration method, device, equipment and storage medium
CN110544278B (en) Rigid body motion capture method and device and AGV pose capture system
CN110602376B (en) Snapshot method and device and camera
CN110119189A (en) The initialization of SLAM system, AR control method, device and system
CN114004935A (en) Method and device for three-dimensional modeling through three-dimensional modeling system
CN111353945B (en) Fisheye image correction method, device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination