
CN116182807B - Gesture information determining method, device, electronic equipment, system and medium - Google Patents

Gesture information determining method, device, electronic equipment, system and medium

Info

Publication number
CN116182807B
Authority
CN
China
Prior art keywords
optical
target
determining
air bearing
plane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310444030.0A
Other languages
Chinese (zh)
Other versions
CN116182807A (en)
Inventor
朱有菊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Huilang Times Technology Co Ltd
Original Assignee
Beijing Huilang Times Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Huilang Times Technology Co Ltd filed Critical Beijing Huilang Times Technology Co Ltd
Priority to CN202310444030.0A priority Critical patent/CN116182807B/en
Publication of CN116182807A publication Critical patent/CN116182807A/en
Application granted granted Critical
Publication of CN116182807B publication Critical patent/CN116182807B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/757Matching configurations of points or features
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C1/00Measuring angles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Software Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Biomedical Technology (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Molecular Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Biophysics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • User Interface Of Digital Computer (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a method, a device, electronic equipment, a system and a medium for determining attitude information. The attitude information determining method comprises: acquiring an optical image, collected by an optical camera on a triaxial air bearing table, that includes an optical target, wherein the optical target is arranged outside the triaxial air bearing table within the field angle range of the optical camera; determining at least a first number of target optical target areas within the optical image; determining at least a second number of matching point pairs corresponding to each of the target optical target areas, the first number being greater than or equal to the second number; and determining the attitude information of the triaxial air bearing table based on the coordinate information of the at least second number of matching point pairs. This solves the problem that existing triaxial air bearing table angle measurement methods are complex, and reduces the complexity of triaxial air bearing table angle measurement.

Description

Gesture information determining method, device, electronic equipment, system and medium
Technical Field
The present invention relates to the field of computer technologies, and in particular, to a method, an apparatus, an electronic device, a system, and a medium for determining pose information.
Background
The air bearing table is an indispensable simulation device for full physical simulation of space vehicles. The core problem of full physical simulation of a satellite control system is to simulate, on the ground, the attitude motion of a satellite under on-orbit weightless conditions. During satellite attitude control, the attitude information of the satellite relative to a given coordinate system must be dynamically measured by an attitude sensor and fed back to an attitude control computer, which analyzes and computes control instructions and drives an actuating mechanism to realize satellite attitude control. Therefore, in order to simulate the attitude control process of the satellite, accurate measurement of the rotation angle of the air bearing table is required.
At present, the three-degree-of-freedom air bearing table becomes main ground simulation equipment, and the existing three-axis air bearing table angle measurement method needs to be added with a complex mechanical system and a sensor system, and is complex in structure. Therefore, how to reduce the complexity of the triaxial air bearing table angle measurement is a technical problem to be solved.
Disclosure of Invention
The invention provides a method, a device, electronic equipment, a system and a medium for determining attitude information, which are used for reducing the complexity of angle measurement of a triaxial air bearing table.
According to an aspect of the present invention, there is provided a posture information determining method including:
Acquiring an optical image which is acquired by an optical camera on a triaxial air bearing table and comprises an optical target, wherein the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera;
determining at least a first number of target optical target areas within the optical image;
determining at least a second number of matching point pairs corresponding to each target optical target region in the target optical target regions, the first number being greater than or equal to the second number;
determining attitude information of the triaxial air bearing table based on coordinate information of at least a second number of matching point pairs;
the matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
According to another aspect of the present invention, there is provided an attitude information determination apparatus including: the acquisition module is used for acquiring an optical image which is acquired by an optical camera on the triaxial air bearing table and comprises an optical target, wherein the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera;
A first determining module for determining at least a first number of target optical target areas within the optical image;
a second determining module, configured to determine at least a second number of matching point pairs corresponding to each target optical target area in the target optical target areas, where the first number is greater than or equal to the second number;
the third determining module is used for determining the posture information of the triaxial air bearing table based on the coordinate information of at least a second number of matching point pairs;
the matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
According to another aspect of the present invention, there is provided an electronic apparatus including:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the pose information determination method according to any of the embodiments of the present invention.
According to another aspect of the present invention, there is provided a posture information determination system including:
the device comprises an optical target, an optical camera, a triaxial air bearing table and electronic equipment;
the optical camera is arranged on a suspension table top of the triaxial air bearing table, and the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera;
the optical camera is used for acquiring an optical image comprising the optical target;
the marker points included in the optical targets are located in at least two planes;
the electronic equipment is used for executing the gesture information determining method according to any embodiment of the invention.
According to another aspect of the present invention, there is provided a computer readable storage medium storing computer instructions for causing a processor to implement the pose information determination method according to any embodiment of the present invention when executed.
According to the technical scheme, an optical image which is acquired through an optical camera on a triaxial air bearing table and comprises an optical target is acquired, and the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera; determining at least a first number of target optical target areas within the optical image; determining at least a second number of matching point pairs corresponding to each target optical target region in the target optical target regions, the first number being greater than or equal to the second number; based on the coordinate information of at least the second number of matching point pairs, the posture information of the triaxial air bearing table is determined only through the optical camera and the optical target, the problem that an angle measuring method of the triaxial air bearing table is complex is solved, and the angle measuring complexity of the triaxial air bearing table is reduced.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the invention or to delineate the scope of the invention. Other features of the present invention will become apparent from the description that follows.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present invention, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of a method for determining pose information according to a first embodiment of the present invention;
fig. 2 is a flowchart of a gesture information determining method according to a second embodiment of the present invention;
fig. 3 is a schematic view of an arrangement scenario of an optical target according to a second embodiment of the present invention;
fig. 4 is a schematic structural view of an attitude information determining apparatus according to a third embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device implementing a method for determining pose information according to an embodiment of the present invention.
Detailed Description
In order that those skilled in the art will better understand the present invention, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings. It is apparent that the described embodiments are only some embodiments of the present invention, not all of them. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort shall fall within the scope of the present invention.
It should be noted that the terms "first," "second," "target," and "candidate" in the description and claims of the present invention and the above figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the invention described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
The existing three-axis air bearing table angle measuring method mainly comprises an angle measuring method based on a Charge-coupled Device (CCD) detector and an induction synchronizer, a three-axis air bearing table single-frame servo angle measuring method, a non-contact three-axis air bearing table attitude measuring scheme and the like, wherein the methods all need to be added with a complex mechanical system and a sensor system, have a complex structure, and simultaneously reduce the available load of the air bearing table.
The existing three-axis air bearing table indoor attitude determination methods have the following defects: (1) the calculated attitude information is limited, and three-degree-of-freedom attitude calculation cannot be realized; (2) attitude measurement requires adding complex mechanical devices, resulting in a complex structure and low calculation accuracy; (3) a complex experimental environment, i.e. the measuring environment, must be constructed, with a long construction period and high cost.
Therefore, in order to realize the determination of the posture of the triaxial air bearing table, the invention provides the following technical scheme.
Example 1
Fig. 1 is a flowchart of a method for determining pose information of a three-axis air bearing table according to a first embodiment of the present invention, where the method may be performed by a pose information determining device, which may be implemented in hardware and/or software, and the pose information determining device may be configured in an electronic device. The electronic device may be a mobile phone, a computer, a server or other devices capable of performing data processing. As shown in fig. 1, the method includes:
S110, acquiring an optical image which is acquired by an optical camera on the triaxial air bearing table and comprises an optical target.
The three-axis air bearing table is also called as a three-degree-of-freedom air bearing table, and the air bearing table comprises a table top which is suspended by an air bearing device. The table top is a three-degree-of-freedom table top and has a pitch angle theta, a roll angle gamma and a yaw angle phi.
An optical camera is a remote sensing device whose imaging is determined by its converging lens group and the spectral response of its photosensitive film. A band-pass filter in front of the lens group selects the wavelength band that can pass through the lens group and expose the film. The optical camera is arranged on the triaxial air bearing table, so that the table's own attitude information can be determined through the optical camera mounted on it.
The optical target is arranged outside the triaxial air bearing table within the field angle range of the optical camera, so that the optical camera can acquire an optical image that includes the optical target. The optical target is a calibration device for detecting the tracking performance and measurement precision of optical measurement equipment indoors, and the attitude of the three-axis air bearing table can be determined by means of the optical target and the optical camera.
The number of optical targets may be at least a first number. The first number is not limited, and may be 5 or 9. The optical targets may be distributed in at least two planes. The number of planes in which the optical targets are distributed is not limited herein.
The distribution of the optical targets is not limited as long as matching of the feature points in the image and the optical targets in the measurement environment is facilitated. The feature points in the image may be considered as what is presented by the optical target in the image, and the feature points may be used to reflect the optical target in the measurement environment.
An optical image may be considered an image captured by an optical camera that includes an optical target.
The method can acquire the optical image acquired by the optical camera, and can realize the determination of the three-axis air bearing table posture information by analyzing the optical image.
S120, determining at least a first number of target optical target areas within the optical image.
The target optical target region may be considered as the region of the optical target in the image. Each optical target in the measuring environment of the triaxial air bearing table can be provided with a target optical target area in the optical image. At least a first number of optical targets may correspond to at least a first number of optical target regions of interest.
The step can perform feature analysis on the optical image to extract target optical target areas included in the optical image, and if the optical image includes 9 optical targets, the 9 target optical target areas can be extracted from the optical image.
This step does not restrict how the optical image is analyzed. For example, the optical image may be identified directly by a machine learning technique to obtain the target optical target areas; alternatively, candidate optical target regions may first be extracted from the optical image using a maximally stable extremal region algorithm, and the target optical target regions then filtered out of the candidates by a convolutional neural network to remove noise.
Wherein the candidate optical target region may be considered a candidate target optical target region of interest. The candidate optical target regions may include target optical target regions and may also include misrecognized regions. And removing the misrecognized region in the candidate optical target region to obtain the target optical target region.
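The two-stage pipeline just described (extract candidate regions, then filter out misrecognized ones) can be sketched in miniature. The sketch below is an illustrative stand-in, not the patent's implementation: thresholding plus connected-component grouping stands in for the maximally stable extremal region step, and a simple area rule stands in for the convolutional-neural-network filter; the threshold, minimum area, and tiny test image are all assumptions.

```python
# Simplified stand-in for the candidate-extraction + noise-filtering pipeline.
# Thresholding + connected components approximates the MSER step; an area
# rule approximates the CNN filter. All parameters here are illustrative.

def candidate_regions(image, threshold=128):
    """Return connected components (4-neighborhood) of pixels >= threshold."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    regions = []
    for y in range(h):
        for x in range(w):
            if image[y][x] >= threshold and not seen[y][x]:
                stack, region = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    region.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if 0 <= ny < h and 0 <= nx < w and not seen[ny][nx] \
                                and image[ny][nx] >= threshold:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                regions.append(region)
    return regions

def filter_targets(regions, min_area=3):
    """Keep only candidates large enough to be real target regions."""
    return [r for r in regions if len(r) >= min_area]

# 6x6 test image: one 2x2 bright blob (a target) and one noise speck.
img = [[0] * 6 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (2, 2)]:
    img[y][x] = 255
img[4][4] = 255  # single-pixel noise, removed by the filter
cands = candidate_regions(img)
targets = filter_targets(cands)
```

Here the misrecognized single-pixel region survives candidate extraction but is removed by the filtering stage, mirroring the role the patent assigns to the convolutional neural network.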
S130, determining matching point pairs corresponding to each target optical target area in at least a second number of target optical target areas.
The first number is greater than or equal to the second number. The second number may be 5. The second number is not limited herein as long as the pose information can be determined based on the second number of target optical target areas. The attitude information can be used for reflecting information of the three-axis air bearing table, such as the pitch angle theta, the roll angle gamma and the yaw angle psi of the three-axis air bearing table.
The matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
The matching point pair may be regarded as a point pair formed by a feature point in the image corresponding to each target optical target region and a feature point in the measurement environment. The first feature point may be considered as a feature point of the optical image corresponding to the optical target region of interest, i.e. the optical target at the optical target region of interest in the optical image. The second feature point may be considered as a feature point of the measurement environment corresponding to the target optical target region, i.e. the optical target corresponding to the target optical target region in the measurement environment.
The measurement environment can be regarded as an environment in which three-axis air bearing table posture information measurement is performed.
The three-axis air bearing table can be placed in the measuring environment, and the optical camera can be arranged on the three-axis air bearing table. An optical target may be disposed within the field angle of the optical camera. Such as an optical camera, is placed on the upper surface of a suspended table surface of a triaxial air bearing table. The optical target is disposed on the indoor rooftop.
In this embodiment, after determining at least a first number of target optical target areas, at least a second number of them may be selected and their corresponding matching point pairs determined; alternatively, the matching point pairs corresponding to each of the at least first number of target optical target areas may be determined first, and at least a second number of matching point pairs then selected from those determined.
And S140, determining the posture information of the triaxial air bearing table based on the coordinate information of at least the second number of matching point pairs.
After determining at least the second number of matching point pairs, the coordinate information of each matching point pair may be determined, where the coordinate information includes first coordinate information corresponding to a first feature point in the matching point pair and second coordinate information corresponding to a second feature point. The first coordinate information may be regarded as coordinates of the first feature point in the image coordinate system to which the optical image corresponds. The second coordinate information may be regarded as coordinates of a second feature point in a world coordinate system of the measurement environment in which the three-axis air bearing table is located.
After determining the coordinate information of at least the second number of matching point pairs, the step may determine pose information of the three-axis air bearing table based on the determined coordinate information.
The specific means of determination is not limited here. For example, the attitude matrix may be determined based on the coordinate information, and the attitude information then determined based on the attitude matrix. The attitude matrix may be considered a matrix for characterizing the attitude, and may be determined based on the camera parameters of the optical camera and the determined coordinate information.
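The patent's full formulation recovers a 3-D attitude matrix from image/world coordinate pairs and camera parameters. A reduced 2-D analogue shows the core idea of step S140 (attitude from the coordinate information of matching point pairs) without the camera model; the point set and the least-squares angle formula are illustrative assumptions, not the patent's method.

```python
# Reduced 2-D analogue of attitude determination from matching point pairs:
# recover the rotation angle that maps each world point onto its image point.
# theta = atan2(sum of cross products, sum of dot products) is the
# least-squares solution for a pure 2-D rotation about the origin.
import math

def rotation_angle_2d(pairs):
    """pairs: list of ((xi, yi), (xw, yw)) = (first point, second point)."""
    s_cross = sum(xw * yi - yw * xi for (xi, yi), (xw, yw) in pairs)
    s_dot = sum(xw * xi + yw * yi for (xi, yi), (xw, yw) in pairs)
    return math.atan2(s_cross, s_dot)

# Build 5 matching point pairs (second number = 5, as in the patent's example)
# from a known rotation, then recover the angle.
theta_true = 0.3
world = [(1.0, 0.0), (0.0, 1.0), (-1.0, 0.5), (2.0, -1.0), (0.5, 0.5)]
c, s = math.cos(theta_true), math.sin(theta_true)
pairs = [((x * c - y * s, x * s + y * c), (x, y)) for x, y in world]
theta_est = rotation_angle_2d(pairs)
```

In the patent's setting the unknown is a full rotation matrix with three degrees of freedom, which is why at least five matching point pairs and the camera parameters enter the first and second matrices.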
According to the technical scheme provided by the embodiment, an optical image which is acquired by an optical camera on a triaxial air bearing table and comprises an optical target is acquired, wherein the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera; determining at least a first number of target optical target areas within the optical image; determining at least a second number of matching point pairs corresponding to each target optical target region in the target optical target regions, the first number being greater than or equal to the second number; based on the coordinate information of at least the second number of matching point pairs, the posture information of the triaxial air bearing table is determined only through the optical camera and the optical target, the problem that an angle measuring method of the triaxial air bearing table is complex is solved, and the angle measuring complexity of the triaxial air bearing table is reduced.
In one embodiment, determining pose information for the tri-axis air bearing table based on coordinate information for at least a second number of matching point pairs comprises:
determining first coordinate information under an image coordinate system corresponding to the first feature point and second coordinate information under a world coordinate system corresponding to the second feature point;
and determining the attitude information of the triaxial air bearing table based on the first coordinate information, the second coordinate information and the camera parameters of the optical camera.
Each first feature point corresponds to a target optical target area; the coordinates of the target optical target area can be used directly as the first coordinate information, or used to determine more accurate first coordinate information.
The origin and coordinate axes of the world coordinate system may be defined based on the three-axis air bearing table, and after the world coordinate system is defined, second coordinate information of the second feature point in the world coordinate system may be determined.
After the first coordinate information and the second coordinate information are determined, the gesture matrix can be determined by combining the camera parameters, so that the gesture information is determined.
Camera parameters may be used to reflect parameters of the optical camera such as focal length, pixel size and coordinates of the principal point of the image in the image coordinate system.
In determining the attitude matrix, the first matrix and the second matrix may be determined based on the first coordinate information and the second coordinate information. The first matrix may be regarded as a matrix characterized by the camera parameters, the first coordinate information, and the second coordinate information. The second matrix may be regarded as a matrix characterized by the camera parameters, the first coordinate information, and the relative distance between the planes in which the optical targets lie.
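The camera parameters named here (focal length, pixel size, principal point) enter through the standard pinhole projection that relates a world point to its pixel coordinates. The sketch below shows that projection under the usual pinhole model; this is a common assumption about how such intrinsics are used, not the patent's exact matrices, and the numeric values are illustrative.

```python
# Pinhole projection of a world point to pixel coordinates, using the same
# intrinsic quantities the patent names: focal length f, pixel sizes dX/dY,
# and principal point (u0, v0). R is the 3x3 attitude (rotation) matrix,
# t the translation of the world origin in camera coordinates.

def project(point_w, R, t, f, dX, dY, u0, v0):
    # world -> camera coordinates
    xc = sum(R[0][k] * point_w[k] for k in range(3)) + t[0]
    yc = sum(R[1][k] * point_w[k] for k in range(3)) + t[1]
    zc = sum(R[2][k] * point_w[k] for k in range(3)) + t[2]
    # perspective division, then metric -> pixel coordinates
    u = f / dX * (xc / zc) + u0
    v = f / dY * (yc / zc) + v0
    return u, v

# Example: identity attitude, camera 10 m from the target plane,
# 50 mm focal length, 10 um pixels, principal point at (320, 320).
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
u, v = project((1.0, 2.0, 0.0), identity, (0.0, 0.0, 10.0),
               f=0.05, dX=1e-5, dY=1e-5, u0=320.0, v0=320.0)
```

Inverting this relation over at least the second number of matching point pairs is what the first and second matrices encode: known (u, v) and (xw, yw, zw) with known intrinsics leave the attitude as the unknown.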
In one embodiment, the determining pose information of the tri-axis air bearing table based on the first coordinate information, the second coordinate information, and camera parameters of the optical video camera includes:
determining a first matrix and a second matrix based on the first coordinate information, the second coordinate information, and camera parameters of the optical video camera;
determining a pose matrix based on the first matrix and the second matrix;
determining the posture information of the triaxial air bearing table based on the posture matrix;
wherein the first matrix is given by an expression [not preserved in this text] in which dX and dY are the pixel dimensions, f is the focal length, (u0, v0) are the coordinates of the principal point of the image in the image coordinate system, (ui, vi) is the first coordinate information, (xwi, ywi, zwi) is the second coordinate information, and the second number is 5;
the second matrix is given by an expression [not preserved in this text] in which H is the distance between a first plane and a second plane, the optical targets being distributed across the first plane and the second plane.
E.g., a first number of 9, wherein 8 optical targets lie in a first plane. The 1 optical target is located in the second plane.
When determining the gesture matrix, the first matrix and the second matrix can be subjected to mathematical operation to obtain the gesture matrix. The calculation means are not limited here.
After determining the pose matrix, the pose information may be determined by the following formula:
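The formula referred to above is an image in the original and does not survive in this text. A common convention for recovering the pitch, roll, and yaw angles from a 3×3 attitude matrix C is shown below; the Z-Y-X (yaw-pitch-roll) Euler convention is an assumption for illustration, not necessarily the patent's exact formula.

```python
# Recover yaw (psi), pitch (theta), roll (gamma) from a 3x3 attitude matrix
# under the Z-Y-X convention C = Rz(psi) @ Ry(theta) @ Rx(gamma).
# This convention is an illustrative assumption.
import math

def euler_from_attitude(C):
    theta = math.asin(-C[2][0])           # pitch: C[2][0] = -sin(theta)
    gamma = math.atan2(C[2][1], C[2][2])  # roll
    psi = math.atan2(C[1][0], C[0][0])    # yaw
    return psi, theta, gamma

def rot_zyx(psi, theta, gamma):
    """Build the Z-Y-X attitude matrix from known angles (for checking)."""
    cp, sp = math.cos(psi), math.sin(psi)
    ct, st = math.cos(theta), math.sin(theta)
    cg, sg = math.cos(gamma), math.sin(gamma)
    return [
        [cp * ct, cp * st * sg - sp * cg, cp * st * cg + sp * sg],
        [sp * ct, sp * st * sg + cp * cg, sp * st * cg - cp * sg],
        [-st,     ct * sg,                ct * cg],
    ]

# Round-trip: angles -> attitude matrix -> angles.
C = rot_zyx(0.1, 0.2, 0.3)
psi, theta, gamma = euler_from_attitude(C)
```

The round trip recovers the pitch angle theta, roll angle gamma, and yaw angle psi that the patent associates with the suspended table top, provided the pitch stays away from ±90° where this convention degenerates.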
in one embodiment, the number of the optical targets is at least a first number, at least one optical target is located in a second plane, the optical targets except the optical targets located in the second plane are located in the first plane, the optical targets in the first plane are arranged in a circular shape, the optical targets in the first plane are not completely symmetrical, and the optical targets in the second plane are located on the circular central vertical line in the second plane.
The optical targets in the first plane are not perfectly symmetrical, so that the determination of matching point pairs is more convenient.
Illustratively, one optical target is located in the second plane and 8 optical targets are located in the first plane. The number of optical targets was 9.
The optical targets in this embodiment may be distributed in a first plane and a second plane.
In one embodiment, the determining the pose matrix based on the first matrix and the second matrix includes:
and carrying out the following operation on the first matrix and the second matrix to obtain a posture matrix:
c is the gesture matrix.
After the first matrix and the second matrix are determined, they may be operated on to calculate the attitude matrix.
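The operation that combines the first and second matrices is given in the patent as an image and is not reproduced here; a common choice for such an overdetermined system built from matched point pairs is the linear least-squares solution C = (MᵀM)⁻¹MᵀB. The sketch below assumes that form, and the shapes of M and B are illustrative rather than taken from the patent.

```python
import numpy as np

def solve_attitude_matrix(M, B):
    """Solve the overdetermined linear system M @ C = B for the
    attitude matrix C in the least-squares sense, i.e. the standard
    C = (M^T M)^{-1} M^T B. The patent's exact formula is an image and
    is not reproduced here, so this form is an assumption."""
    C, *_ = np.linalg.lstsq(M, B, rcond=None)
    return C

# Hypothetical example: recover a known 3x3 matrix from noiseless data.
rng = np.random.default_rng(0)
C_true = rng.standard_normal((3, 3))
M = rng.standard_normal((10, 3))   # assumed shape: one row per constraint
B = M @ C_true
C_est = solve_attitude_matrix(M, B)
```

With noiseless, full-rank data the least-squares solution recovers the generating matrix exactly, which is why this form is the usual choice when more point pairs are available than strictly needed.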
In one embodiment, the determining at least a first number of target optical target areas within the optical image comprises:
determining candidate optical target regions within the optical image;
filtering from within the candidate optical target region to obtain a target optical target region.
In recognizing the optical image, the regions obtained from the first recognition pass may be determined as candidate optical target regions, which are then filtered to obtain the target optical target regions.
The means for obtaining the candidate optical target regions is not limited; they can be determined by the maximally stable extremal regions (MSER) algorithm. The means for obtaining the target optical target regions is likewise not specified; they can be determined based on a convolutional neural network. For example, if the candidate optical target regions are input into a model trained on a convolutional neural network, the model outputs the target optical target regions.
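As a rough illustration of the candidate-extraction-then-filtering pipeline, the sketch below replaces both the MSER detector and the convolutional neural network with a simple brightness threshold plus an area filter. This stand-in only shows the two-stage structure (find candidates, then filter them), not the patent's actual algorithms.

```python
import numpy as np

def candidate_regions(img, thresh=128, min_area=2, max_area=500):
    """Simplified stand-in for the candidate-extraction step: threshold
    the image, group bright pixels into 4-connected components, and
    keep only components whose pixel count falls in [min_area,
    max_area] (the filtering step). Returns (row, col) centroids of
    the surviving regions."""
    mask = img >= thresh
    labels = np.zeros(img.shape, dtype=int)
    next_label = 0
    regions = []
    for r0 in range(img.shape[0]):
        for c0 in range(img.shape[1]):
            if mask[r0, c0] and labels[r0, c0] == 0:
                next_label += 1
                labels[r0, c0] = next_label
                stack, pixels = [(r0, c0)], []
                while stack:                     # flood fill the component
                    r, c = stack.pop()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < img.shape[0] and 0 <= cc < img.shape[1]
                                and mask[rr, cc] and labels[rr, cc] == 0):
                            labels[rr, cc] = next_label
                            stack.append((rr, cc))
                if min_area <= len(pixels) <= max_area:
                    ys, xs = zip(*pixels)
                    regions.append((sum(ys) / len(ys), sum(xs) / len(xs)))
    return regions

# Synthetic image with two bright 3x3 blobs and one single-pixel speck.
img = np.zeros((20, 20))
img[2:5, 2:5] = 255
img[10:13, 14:17] = 255
img[18, 1] = 255          # too small; removed by the area filter
cands = candidate_regions(img)
```

The single-pixel speck is rejected by the area filter, mirroring how the CNN stage discards candidate regions that are not true optical targets.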
Example two
Fig. 2 is a flowchart of a gesture information determining method according to a second embodiment of the present invention, and the present embodiment provides an exemplary description of the gesture information determining method.
The posture information determining method provided by the embodiment of the invention can be regarded as a computer vision method for processing data inside the triaxial air bearing table chamber. The method solves the problem of determining the attitude inside the triaxial air bearing table chamber by means of computer vision, and provides basic support for ground simulation verification of a spacecraft. The basic technical scheme of the invention is shown in fig. 2, and the method comprises the following steps:
S210, 9 LED lamps are installed on the indoor roof.
At least 9 feature points, such as LED lamps or non-luminous black dots, are arranged on the indoor roof above the triaxial air bearing table to serve as optical targets.
Fig. 3 is a schematic view of an arrangement scenario of an optical target according to a second embodiment of the present invention, in which 9 LED light targets (P1–P9) are arranged; the specific configuration is shown in fig. 3:
8 targets (P1–P8) are evenly distributed in a circular arrangement but are not perfectly symmetrical, so that the optical images match better; all 8 targets are placed on the ceiling 1, in the same plane.
The 9th optical target P9 is placed directly below the center of the circle, in a different plane from the first 8 points, at a distance h from the ceiling 1.
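The target layout of S210 can be sketched as follows; the radius, the distance h, and the angular perturbation that breaks the symmetry are illustrative values, not taken from the patent.

```python
import math

def target_layout(radius=1.0, h=0.3, perturb_deg=7.0):
    """Generate a hypothetical layout for the 9 optical targets:
    P1..P8 on a circle of the given radius in the ceiling plane (z=0),
    with one target nudged by perturb_deg degrees so the ring is not
    perfectly symmetrical (this removes the rotational ambiguity when
    matching), and P9 at the circle centre, a distance h below the
    ceiling. All numeric values are illustrative assumptions."""
    pts = []
    for k in range(8):
        ang = 2 * math.pi * k / 8
        if k == 0:                       # break the symmetry
            ang += math.radians(perturb_deg)
        pts.append((radius * math.cos(ang), radius * math.sin(ang), 0.0))
    pts.append((0.0, 0.0, -h))           # P9 below the circle centre
    return pts

pts = target_layout()
```

A perfectly regular octagon would match its own rotated copies, so the small perturbation is what makes the exhaustive matching step unambiguous.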
S220, placing an optical camera on the surface of the triaxial air bearing table.
The optical camera parameters are as follows: the focal length is f, the dimensions of a pixel in the x direction and the y direction are dX and dY respectively, and the coordinates of the principal point of the image in the image coordinate system are (u0, v0).
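These parameters are the standard pinhole-camera intrinsics. Assuming the usual convention, the intermediate parameters ax and ay used later would be the focal length expressed in pixels, ax = f/dX and ay = f/dY; the patent's own formulas are images, so this convention is an assumption. The sketch below builds the corresponding intrinsic matrix with illustrative numeric values.

```python
import numpy as np

def intrinsic_matrix(f, dX, dY, u0, v0):
    """Build the standard pinhole intrinsic matrix from the camera
    parameters above. ax = f/dX and ay = f/dY are the usual
    definitions of the equivalent focal lengths in pixels; the
    patent's formula is assumed (not confirmed) to match this."""
    ax, ay = f / dX, f / dY
    return np.array([[ax, 0.0, u0],
                     [0.0, ay, v0],
                     [0.0, 0.0, 1.0]])

# Illustrative values: 8 mm lens, 5 um square pixels, VGA principal point.
K = intrinsic_matrix(f=8e-3, dX=5e-6, dY=5e-6, u0=320.0, v0=240.0)
```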
S230, imaging the indoor roof.
And the optical camera images the indoor roof to obtain an optical image.
S240, extracting candidate optical target areas.
The optical image is analyzed by the maximally stable extremal regions (MSER) algorithm to obtain candidate optical target regions; the specific analysis means is not limited.
S250, extracting an image optical target area.
Classifying the extracted candidate optical target areas by using a convolutional neural network to obtain 9 image optical target areas; machine learning can also be used to classify candidate optical target regions, resulting in 9 image optical target regions, i.e., target optical target regions.
S260, obtaining the optical image matching point pair.
Calculating the coordinates of the 9 acquired image optical target areas under an optical camera image coordinate system; matching the 9 obtained image optical target areas with 9 LED lamps by using an exhaustive traversal method to obtain 9 optical image matching point pairs (namely matching point pairs); the matching point pair comprises a first characteristic point and a second characteristic point, wherein the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the measuring environment.
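The exhaustive traversal can be sketched as a brute-force search over all assignments of detected image points to known targets, keeping the assignment whose points align best under an optimal rigid 2D fit. The patent does not specify its scoring criterion, so the Procrustes-style alignment residual below is an assumption, and only 5 points are used to keep the 120-permutation search small.

```python
import itertools
import math

def best_match(image_pts, target_pts):
    """Exhaustively try every assignment of detected image points to
    the known optical targets; keep the one minimizing the residual
    after an optimal 2D rotation + translation (closed-form 2D
    Procrustes). Returns the permutation p with image_pts[i]
    corresponding to target_pts[p[i]]. The scoring criterion is an
    assumption, not taken from the patent."""
    def residual(P, Q):
        n = len(P)
        pcx = sum(x for x, _ in P) / n; pcy = sum(y for _, y in P) / n
        qcx = sum(x for x, _ in Q) / n; qcy = sum(y for _, y in Q) / n
        P = [(x - pcx, y - pcy) for x, y in P]       # centre both sets
        Q = [(x - qcx, y - qcy) for x, y in Q]
        dot = sum(px * qx + py * qy for (px, py), (qx, qy) in zip(P, Q))
        cross = sum(px * qy - py * qx for (px, py), (qx, qy) in zip(P, Q))
        th = math.atan2(cross, dot)                  # optimal rotation
        c, s = math.cos(th), math.sin(th)
        return sum((c * px - s * py - qx) ** 2 + (s * px + c * py - qy) ** 2
                   for (px, py), (qx, qy) in zip(P, Q))

    return min(itertools.permutations(range(len(target_pts))),
               key=lambda p: residual(image_pts, [target_pts[i] for i in p]))

# Asymmetric template of 5 targets; the "image" is the same layout
# rotated by 40 degrees and listed in scrambled order.
targets = [(0, 0), (2, 0), (2, 1), (0, 3), (-1, 1)]
th = math.radians(40)
rot = [(x * math.cos(th) - y * math.sin(th),
        x * math.sin(th) + y * math.cos(th)) for x, y in targets]
order = [3, 0, 4, 1, 2]
image_pts = [rot[i] for i in order]
perm = best_match(image_pts, targets)
```

Because the target arrangement is not symmetrical, exactly one assignment aligns with near-zero residual, so the brute-force search recovers the scrambled order.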
S270, calculating to obtain three-axis air bearing table posture information.
5 optical image matching point pairs are randomly extracted from the 9 acquired optical image matching point pairs; their coordinates in the world coordinate system are recorded as (xwi, ywi, zwi) (i=1,2,3,4,5), and their coordinates in the image coordinate system as (ui, vi) (i=1,2,3,4,5);
And calculating by using the 5 optical image matching point pairs obtained in the step to obtain the posture information of the triaxial air bearing table.
The world coordinate system is defined as follows: the origin is at the center of the marble table top of the air bearing table workstation; the z-axis is perpendicular to the table top and points upward; the x-axis is parallel to the long edge of the working surface; the y-axis is parallel to the short edge of the working surface; and the three coordinate axes satisfy the right-hand rule.
The posture information of the triaxial air bearing table is calculated as follows:
Intermediate parameters ax and ay are calculated based on the optical camera parameters:
Intermediate parameters bxi and byi (i=1,2,3,4,5) are calculated for the 5 obtained optical image matching point pairs:
The key matrices of the triaxial air bearing table posture calculation, namely the first matrix M and the second matrix B, are calculated:
The posture matrix C of the triaxial air bearing table is calculated, with C(i, j) denoting the element in the i-th row and j-th column of the matrix:
The pitch angle θ, roll angle γ, and yaw angle ψ of the triaxial air bearing table are calculated:
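The patent's angle-extraction formulas are given as images and are not reproduced here. A common convention for recovering yaw, pitch, and roll from a rotation matrix is the Z-Y-X decomposition sketched below; the convention is an assumption, so the patent's formulas may differ.

```python
import numpy as np

def euler_zyx(C):
    """Extract yaw (psi), pitch (theta) and roll (gamma) from a 3x3
    rotation matrix, assuming the common Z-Y-X convention
    C = Rz(psi) @ Ry(theta) @ Rx(gamma). The convention is an
    assumption (the patent's formulas are not reproduced in the
    source); angles are returned in radians, |theta| < pi/2."""
    theta = -np.arcsin(C[2, 0])
    gamma = np.arctan2(C[2, 1], C[2, 2])
    psi = np.arctan2(C[1, 0], C[0, 0])
    return psi, theta, gamma

def rot_zyx(psi, theta, gamma):
    """Build the Z-Y-X rotation matrix, used here to round-trip."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(gamma), np.sin(gamma)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

angles = (0.3, -0.2, 0.1)          # psi, theta, gamma in radians
recovered = euler_zyx(rot_zyx(*angles))
```

Round-tripping a matrix built from known angles recovers them exactly as long as the pitch stays away from ±90°, where this decomposition degenerates (gimbal lock).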
through the scheme, the three-axis air bearing table indoor positioning problem is converted into the non-contact computer vision positioning problem, so that the complexity of the air bearing table indoor positioning is effectively reduced, and the air bearing table indoor positioning economy is improved. According to the invention, the candidate optical target region is extracted by utilizing the extremely stable extremum region algorithm, and the optical target region is confirmed by utilizing the convolutional neural network, so that the accuracy of the positioning in the air bearing table chamber and the richness of the positioning information are effectively improved.
Example III
Fig. 4 is a schematic structural diagram of an attitude information determining apparatus according to a third embodiment of the present invention. As shown in fig. 4, the apparatus includes:
an acquisition module 410, configured to acquire an optical image acquired by an optical camera on a triaxial air bearing table, where the optical target is disposed outside the triaxial air bearing table within a field angle range of the optical camera;
a first determination module 420 for determining at least a first number of target optical target areas within the optical image;
a second determining module 430, configured to determine at least a second number of matching point pairs corresponding to each of the target optical target areas, where the first number is greater than or equal to the second number;
a third determining module 440, configured to determine pose information of the tri-axial air bearing table based on coordinate information of at least a second number of matching point pairs;
the matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
Optionally, the third determining module 440 includes:
a first determining unit, configured to determine first coordinate information under an image coordinate system corresponding to the first feature point and second coordinate information under a world coordinate system corresponding to the second feature point;
and the second determining unit is used for determining the posture information of the triaxial air bearing table based on the first coordinate information, the second coordinate information and the camera parameters of the optical video camera.
Optionally, the second determining unit is specifically configured to:
determining a first matrix and a second matrix based on the first coordinate information, the second coordinate information, and camera parameters of the optical video camera;
determining a pose matrix based on the first matrix and the second matrix;
determining the posture information of the triaxial air bearing table based on the posture matrix;
wherein the first matrix is:
dX and dY are the dimensions of the picture elements, f is the focal length, (u0, v0) is the coordinates of the principal point of the image in the image coordinate system, (ui, vi) is the first coordinate information, (xwi, ywi, zwi) is the second coordinate information, and the second number is 5;
the second matrix is:
h is the distance between the first plane and the second plane, and the optical targets are distributed in the first plane and the second plane.
Optionally, the number of the optical targets is at least a first number, at least one optical target is located in a second plane, the optical targets except the optical targets located in the second plane are located in the first plane, the optical targets in the first plane are circularly arranged, the optical targets in the first plane are not completely symmetrical, and the optical targets in the second plane are located on the circular central vertical line in the second plane.
Optionally, the second determining unit determines a pose matrix based on the first matrix and the second matrix, including:
and carrying out the following operation on the first matrix and the second matrix to obtain a posture matrix:
C is the posture matrix.
Optionally, the first determining module 420 is specifically configured to:
determining candidate optical target regions within the optical image;
filtering from within the candidate optical target region to obtain a target optical target region.
The gesture information determining device provided by the embodiment of the invention can execute the gesture information determining method provided by any embodiment of the invention, and has the corresponding functional modules and beneficial effects of the executing method.
Example IV
Fig. 5 is a schematic structural diagram of an electronic device implementing a method for determining pose information according to an embodiment of the present invention. The electronic device 10 is intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device 10 may also represent various forms of mobile equipment, such as personal digital assistants, cellular telephones, smartphones, wearable devices (e.g., helmets, eyeglasses, watches, etc.), and other similar computing equipment. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the inventions described and/or claimed herein.
As shown in fig. 5, the electronic device 10 includes at least one processor 11, and a memory, such as a Read Only Memory (ROM) 12, a Random Access Memory (RAM) 13, etc., communicatively connected to the at least one processor 11, in which the memory stores a computer program executable by the at least one processor, and the processor 11 may perform various appropriate actions and processes according to the computer program stored in the Read Only Memory (ROM) 12 or the computer program loaded from the storage unit 18 into the Random Access Memory (RAM) 13. In the RAM 13, various programs and data required for the operation of the electronic device 10 may also be stored. The processor 11, the ROM 12 and the RAM 13 are connected to each other via a bus 14. An input/output (I/O) interface 15 is also connected to bus 14.
Various components in the electronic device 10 are connected to the I/O interface 15, including: an input unit 16 such as a keyboard, a mouse, etc.; an output unit 17 such as various types of displays, speakers, and the like; a storage unit 18 such as a magnetic disk, an optical disk, or the like; and a communication unit 19 such as a network card, modem, wireless communication transceiver, etc. The communication unit 19 allows the electronic device 10 to exchange information/data with other devices via a computer network, such as the internet, and/or various telecommunication networks.
The processor 11 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of processor 11 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various processors running machine learning model algorithms, digital Signal Processors (DSPs), and any suitable processor, controller, microcontroller, etc. The processor 11 performs the respective methods and processes described above, such as the posture information determination method.
In some embodiments, the pose information determination method may be implemented as a computer program tangibly embodied on a computer-readable storage medium, such as storage unit 18. In some embodiments, part or all of the computer program may be loaded and/or installed onto the electronic device 10 via the ROM 12 and/or the communication unit 19. When a computer program is loaded into RAM 13 and executed by processor 11, one or more steps of the pose information determination method described above may be performed. Alternatively, in other embodiments, the processor 11 may be configured to perform the pose information determination method by any other suitable means (e.g., by means of firmware).
Various implementations of the systems and techniques described above may be implemented in digital electronic circuitry, integrated circuit systems, field programmable gate arrays (FPGAs), application specific integrated circuits (ASICs), application specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various embodiments may include: implementation in one or more computer programs that may be executed and/or interpreted on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
A computer program for carrying out methods of the present invention may be written in any combination of one or more programming languages. These computer programs may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus, such that the computer programs, when executed by the processor, cause the functions/acts specified in the flowchart and/or block diagram block or blocks to be implemented. The computer program may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine or entirely on the remote machine or server.
In the context of the present invention, a computer-readable storage medium stores computer instructions for causing a processor to implement the pose information determination method provided by the present invention when executed. A computer readable storage medium may be a tangible medium that can contain, or store a computer program for use by or in connection with an instruction execution system, apparatus, or device. The computer readable storage medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. Alternatively, the computer readable storage medium may be a machine readable signal medium. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on an electronic device 10, the electronic device 10 having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the electronic device 10. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a background component (e.g., as a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such background, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include: local Area Networks (LANs), wide Area Networks (WANs), blockchain networks, and the internet.
The computing system may include clients and servers. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server can be a cloud server, also called a cloud computing server or a cloud host, and is a host product in a cloud computing service system, so that the defects of high management difficulty and weak service expansibility in the traditional physical hosts and VPS service are overcome.
It should be appreciated that various forms of the flows shown above may be used to reorder, add, or delete steps. For example, the steps described in the present invention may be performed in parallel, sequentially, or in a different order, so long as the desired results of the technical solution of the present invention are achieved, and the present invention is not limited herein.
The above embodiments do not limit the scope of the present invention. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (9)

1. A posture information determination method, characterized by comprising:
acquiring an optical image which is acquired by an optical camera on a triaxial air bearing table and comprises optical targets, wherein the optical targets are arranged outside the triaxial air bearing table within the range of the view angle of the optical camera, the number of the optical targets is at least a first number, at least one optical target is positioned in a second plane, the optical targets except the optical targets positioned in the second plane are positioned in a first plane, the optical targets in the first plane are circularly arranged, the optical targets in the first plane are not completely symmetrical, and the optical targets in the second plane are positioned on the circular central vertical line in the second plane;
determining at least a first number of target optical target areas within the optical image;
determining at least a second number of matching point pairs corresponding to each target optical target region in the target optical target regions, the first number being greater than or equal to the second number;
determining attitude information of the triaxial air bearing table based on coordinate information of at least a second number of matching point pairs;
the matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
2. The method of claim 1, wherein determining pose information for the tri-axis air bearing table based on coordinate information for at least a second number of matched point pairs comprises:
determining first coordinate information under an image coordinate system corresponding to the first feature point and second coordinate information under a world coordinate system corresponding to the second feature point;
and determining the posture information of the triaxial air bearing table based on the first coordinate information, the second coordinate information and the camera parameters of the optical video camera.
3. The method of claim 2, wherein the determining pose information of the tri-axis air bearing table based on the first coordinate information, the second coordinate information, and camera parameters of the optical video camera comprises:
determining a first matrix and a second matrix based on the first coordinate information, the second coordinate information, and camera parameters of the optical video camera;
determining a pose matrix based on the first matrix and the second matrix;
determining the posture information of the triaxial air bearing table based on the posture matrix;
wherein the first matrix is:
where dX and dY are the dimensions of the picture elements, f is the focal length, (u0, v0) is the coordinates of the principal point of the image in the image coordinate system, (ui, vi) is the first coordinate information, (xwi, ywi, zwi) is the second coordinate information, and the second number is 5;
the second matrix is:
h is the distance between the first plane and the second plane, and the optical targets are distributed in the first plane and the second plane.
4. A method according to claim 3, wherein said determining a pose matrix based on said first matrix and said second matrix comprises:
and carrying out the following operation on the first matrix and the second matrix to obtain a posture matrix:
C is the posture matrix.
5. The method of claim 1, wherein the determining at least a first number of target optical target areas within the optical image comprises:
determining candidate optical target regions within the optical image;
filtering from within the candidate optical target region to obtain a target optical target region.
6. An attitude information determination apparatus, comprising:
the acquisition module is used for acquiring an optical image which is acquired by an optical camera on the triaxial air bearing table and comprises optical targets, wherein the optical targets are arranged outside the triaxial air bearing table within the range of the field angle of the optical camera, the number of the optical targets is at least a first number, at least one optical target is positioned in a second plane, the optical targets except the optical targets positioned in the second plane are positioned in a first plane, the optical targets in the first plane are circularly arranged, the optical targets in the first plane are not completely symmetrical, and the optical targets in the second plane are positioned on the circular central vertical line in the second plane;
A first determining module for determining at least a first number of target optical target areas within the optical image;
a second determining module, configured to determine at least a second number of matching point pairs corresponding to each target optical target area in the target optical target areas, where the first number is greater than or equal to the second number;
the third determining module is used for determining the posture information of the triaxial air bearing table based on the coordinate information of at least a second number of matching point pairs;
the matching point pair comprises a first characteristic point and a second characteristic point, the first characteristic point is a characteristic point in the optical image indicated by the matching point pair corresponding to the target optical target area, and the second characteristic point is a characteristic point corresponding to the matching point pair corresponding to the target optical target area in the optical target in the measuring environment.
7. An electronic device, the electronic device comprising:
at least one processor; and
a memory communicatively coupled to the at least one processor; wherein,
the memory stores a computer program executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1-5.
8. A posture information determination system, characterized by comprising:
the device comprises an optical target, an optical camera, a triaxial air bearing table and electronic equipment;
the optical camera is arranged on a suspension table top of the triaxial air bearing table, and the optical target is arranged outside the triaxial air bearing table within the range of the angle of view of the optical camera;
the optical camera is used for acquiring an optical image comprising the optical target;
the marker points included in the optical targets are located in at least two planes;
the electronic device is configured to perform the pose information determination method according to any of claims 1-5.
9. A computer readable storage medium storing computer instructions for causing a processor to implement the pose information determination method according to any of claims 1-5 when executed.
CN202310444030.0A 2023-04-24 2023-04-24 Gesture information determining method, device, electronic equipment, system and medium Active CN116182807B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310444030.0A CN116182807B (en) 2023-04-24 2023-04-24 Gesture information determining method, device, electronic equipment, system and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310444030.0A CN116182807B (en) 2023-04-24 2023-04-24 Gesture information determining method, device, electronic equipment, system and medium

Publications (2)

Publication Number Publication Date
CN116182807A CN116182807A (en) 2023-05-30
CN116182807B true CN116182807B (en) 2023-07-28

Family

ID=86452394

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310444030.0A Active CN116182807B (en) 2023-04-24 2023-04-24 Gesture information determining method, device, electronic equipment, system and medium

Country Status (1)

Country Link
CN (1) CN116182807B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1877247A (en) * 2006-07-07 2006-12-13 哈尔滨工业大学 Apparatus and method for measuring attitude angle of three-axis air-bearing table
CN101660966B (en) * 2009-09-18 2011-04-20 中国科学院长春光学精密机械与物理研究所 Device for simulating dynamic imaging of TDI CCD camera
CN102426007B (en) * 2011-08-29 2013-12-25 哈尔滨工业大学 High-precision method for measuring attitude angle of triaxial air bearing table and measurement device thereof
CN102865883B (en) * 2012-06-26 2015-05-20 北京航空航天大学 Test system for impact analysis of imaging quality of TDICCD (Time Delayed Integration Charge Coupled Device) by multi-source interference
CN104006787B (en) * 2014-05-01 2016-07-06 哈尔滨工业大学 Spacecraft Attitude motion simulation platform high-precision attitude defining method
CN104867160B (en) * 2015-06-17 2017-11-07 合肥工业大学 A kind of directionality demarcation target demarcated for camera interior and exterior parameter

Also Published As

Publication number Publication date
CN116182807A (en) 2023-05-30

Similar Documents

Publication Publication Date Title
CN111080693A (en) Robot autonomous classification grabbing method based on YOLOv3
WO2021004416A1 (en) Method and apparatus for establishing beacon map on basis of visual beacons
CN108764048A (en) Face critical point detection method and device
CN107705293A (en) A kind of hardware dimension measurement method based on CCD area array cameras vision-based detections
CN110634137A (en) Bridge deformation monitoring method, device and equipment based on visual perception
CN115797359B (en) Detection method, equipment and storage medium based on solder paste on circuit board
CN116661477A (en) Substation unmanned aerial vehicle inspection method, device, equipment and storage medium
CN116844124A (en) Three-dimensional object detection frame labeling method, three-dimensional object detection frame labeling device, electronic equipment and storage medium
CN116342585A (en) Product defect detection method, device, equipment and storage medium
CN116053549A (en) Battery cell positioning method, device and system
CN113705564B (en) Pointer type instrument identification reading method
CN112197708B (en) Measuring method and device, electronic device and storage medium
CN116182807B (en) Gesture information determining method, device, electronic equipment, system and medium
CN113642425A (en) Multi-mode-based image detection method and device, electronic equipment and storage medium
CN117911729A (en) Image matching positioning method and device
CN115311624B (en) Slope displacement monitoring method and device, electronic equipment and storage medium
CN114734444B (en) Target positioning method and device, electronic equipment and storage medium
CN115049810A (en) Coloring method, device and equipment for solid-state laser radar point cloud and storage medium
CN116129069A (en) Method and device for calculating area of planar area, electronic equipment and storage medium
CN115100296A (en) Photovoltaic module fault positioning method, device, equipment and storage medium
CN115376026A (en) Key area positioning method, device, equipment and storage medium
CN115077494B (en) Three-dimensional model correction method, device, equipment, medium and system
JP2005346348A (en) Image processing method, device, and program
CN118882485A (en) Dimension measurement method and device based on machine vision, electronic equipment and medium
CN115170914A (en) Pose estimation method and device, electronic equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant