CN113295171B - Monocular vision-based attitude estimation method for rotating rigid body spacecraft - Google Patents
- Publication number: CN113295171B (application CN202110545278.7A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- spacecraft
- image
- ellipse
- plane
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G01C21/24 — Navigational instruments specially adapted for cosmonautical navigation
- G01C21/20 — Instruments for performing navigational calculations
- G06T5/70 — Image enhancement or restoration; denoising, smoothing
- G06T7/0002 — Image analysis; inspection of images, e.g. flaw detection
- G06T7/11 — Region-based segmentation
- G06T7/136 — Segmentation or edge detection involving thresholding
- G06T7/194 — Segmentation involving foreground-background separation
- G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
- G06T7/73 — Determining position or orientation of objects or cameras using feature-based methods
- G06T2207/10004 — Still image; photographic image
- G06T2207/10012 — Stereo images
- G06T2207/20032 — Median filtering
Abstract
The invention discloses a monocular vision-based attitude estimation method for a rotating rigid-body spacecraft. The method establishes a 3D model of a known target rotating rigid-body spacecraft together with the coordinate systems that describe its attitude; applies median filtering and background separation to the image obtained by a monocular camera; detects and labels all ellipses with a rapid ellipse detection method and determines their general conic functions; matches the radii of the labeled ellipses against the known target model; solves, by a geometric method, the normal direction and center coordinates of each ellipse's supporting plane in the camera coordinate system; fits all the ellipse centers to a straight line by least squares and computes the line's unit direction vector; and finally obtains the complete relative attitude of the target spacecraft with respect to the tracking spacecraft from the uniquely determined normal and center coordinates of the reference ellipse's plane. The method requires little computation, runs quickly, and is accurate; it can effectively estimate the relative attitude of a non-cooperative target spacecraft with respect to the tracking spacecraft.
Description
Technical Field
The invention belongs to the technical field of vision-based measurement of non-cooperative target spacecraft in space missions, and relates to a monocular vision-based attitude estimation method for a rotating rigid-body spacecraft.
Background
Pose (relative position and attitude) estimation of a target spacecraft is an important problem across a wide range of space mission scenarios, such as formation flight (FF), on-orbit servicing (OOS), and active debris removal (ADR). Relative pose estimation and navigation are essential for safe close-range operations (from tens of meters down to a few meters) and for close-range capture such as rendezvous and docking. For the most common case, a non-cooperative target spacecraft (i.e., one carrying no artificial markers and offering no communication link), pose determination is the prerequisite for executing subsequent complex space tasks, so research on the pose estimation problem for non-cooperative targets has important theoretical value and engineering significance.
For the non-cooperative target pose estimation problem, patent CN108562274A uses binary square markers: several markers are projected onto the target spacecraft during the rendezvous-and-docking approach phase, and the relative pose between the target coordinate system and the camera coordinate system is solved by identifying the satellite-rocket docking ring and the markers on the surface of the non-cooperative target. However, this method requires throwing multiple markers, which wastes resources; it is only applicable to the very close approach phase; and marker identification increases the computational load and introduces estimation error. In patent CN111536981A, a binocular camera extracts feature information such as frame corner points and the docking ring of the non-cooperative target body, matches features between the left and right images, recovers the three-dimensional coordinates of the matched features, and finally computes the position and attitude of the non-cooperative target relative to the binocular camera, giving the relative pose between the target coordinate system and the binocular camera coordinate system. This method is constrained by the binocular baseline, so its observable distance is limited; its power consumption and mass are large, raising the payload requirements.
Disclosure of Invention
The invention solves these problems: it provides a monocular vision-based attitude estimation method for a rotating rigid-body spacecraft, used to obtain the position and attitude of a non-cooperative target spacecraft relative to the tracking spacecraft and to lay the foundation for subsequent complex space tasks. The invention realizes a monocular vision system with low power and low mass requirements; without relying on additional measurement information, it fits by least squares the straight line on which the centers of several circles of the rotating rigid-body spacecraft lie, thereby determining the normal direction and center position of the spacecraft's supporting plane. This achieves accurate measurement of the relative pose of the non-cooperative target spacecraft with a small computational load and high accuracy, offering high value for practical engineering applications.
The invention provides a monocular vision-based attitude estimation method for a rotating rigid body spacecraft, which comprises the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the background-separated image, detecting all ellipses in it with a rapid ellipse detection method, labeling the ellipses in order 1, 2, …, n, and determining the general conic function of each ellipse;
s5: matching the radii of all labeled ellipses against the known target rotating rigid-body spacecraft 3D model, and solving the general conic functions of all ellipses by a geometric method to obtain the normal vector and center coordinates of each ellipse's plane in the camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and computing the unit direction vector of that line, where i denotes the i-th labeled ellipse;
s7: using the fact that the centers of all ellipses lie on one straight line and that this line is the normal of the planes of the ellipses, uniquely determining the normal vector and center coordinates of the reference ellipse's plane; taking the plane of the first ellipse as reference, computing the rotation matrix of that plane relative to the tracking spacecraft, and using this rotation matrix as the attitude estimate of the rotating rigid-body spacecraft.
In step S1, a known target rotating rigid body spacecraft 3D model is established, and a coordinate system describing the posture of the spacecraft is established, where the coordinate system includes a pixel coordinate system, an image coordinate system, a camera coordinate system, and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix, which specifically comprises the following steps:
Pixel coordinate system O_uv-uv: the origin O_uv is the top-left vertex of the image obtained by the monocular camera on the tracking spacecraft; the abscissa u axis runs along the image rows and the ordinate v axis along the image columns.
Image coordinate system O_xy-xy: the origin O_xy is the image center; the x axis is parallel to the u axis and the y axis is parallel to the v axis of the pixel coordinate system.
Camera coordinate system O_c-X_cY_cZ_c: the origin is the camera optical center O_c; the Z_c axis points along the optical axis, the X_c axis is parallel to the x axis of the image coordinate system, and the Y_c axis is parallel to the y axis of the image coordinate system.
Target spacecraft body coordinate system O_w-X_wY_wZ_w: the origin is the center of mass O_w of the target spacecraft body; the outward normal of the plane of the ellipses is taken as the Z_w axis, a direction perpendicular to that normal as the X_w axis, and the Y_w axis is perpendicular to both X_w and Z_w so as to form a right-handed system.
A coordinate system O_D-X_DY_DZ_D is established on the plane of each ellipse, with the plane center O_D as origin, the X_D axis parallel to the X_w axis, the Y_D axis parallel to the Y_w axis, and the Z_D axis parallel to the Z_w axis of the target spacecraft body coordinate system.
In the step S2, calculating a pixel coordinate of an image obtained by tracking a monocular camera on a spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image, specifically including:
s21: converting an image obtained by a monocular camera on a tracked spacecraft into a gray image;
s22: selecting a sliding template (typically 3×3), aligning it with the first three rows and first three columns of pixels of the gray image, sorting all pixels inside the template by pixel value to produce a monotonically increasing (or decreasing) sequence, and replacing the pixel value of the template's center pixel with the median of that sequence;
s23: the whole sliding template translates a row of pixel points along the direction of the u axis of the pixel coordinate system, and the step S22 is repeated until the scanning of all the pixels of the row is completed;
s24: and moving the pixels of the whole sliding template downwards by one line, repeating the step S22 and the step S23, scanning the next line, and finally obtaining the image after median filtering.
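Steps S21–S24 can be sketched in NumPy as follows; the function name and the edge-replication border handling are illustrative choices (the patent does not specify how image borders are treated):

```python
import numpy as np

def median_filter(gray, k=3):
    """Median-filter a grayscale image with a k x k sliding template.

    The template slides one pixel at a time along each row (u axis),
    then moves down one row (v axis), as in steps S22-S24; the center
    pixel is replaced by the median of the window.
    """
    pad = k // 2
    padded = np.pad(gray, pad, mode="edge")   # replicate borders
    out = np.empty_like(gray)
    h, w = gray.shape
    for v in range(h):                        # row scan
        for u in range(w):                    # column scan
            window = padded[v:v + k, u:u + k]
            out[v, u] = np.median(window)     # median replaces center
    return out
```

A single impulse-noise pixel in an otherwise uniform image is removed by one pass, which is the behavior the filtering step relies on before ellipse detection.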
In step S3, the image after the median filtering is processed, and the spacecraft is separated from the background by using the weak gradient elimination method to obtain an image after the background separation, which specifically includes:
s31: calculating the image gradient with the Prewitt operator: the gradients of the image I in the horizontal and vertical directions are computed with two 3×3 convolution kernels,

G_x = P_x * I, G_y = P_y * I, with P_x = [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], P_y = [[-1, -1, -1], [0, 0, 0], [1, 1, 1]] (1)

where G_x and G_y denote the horizontal and vertical gradients of the image, respectively, and * denotes the two-dimensional convolution operator;
s32: taking the square root of the sum of squares of the horizontal and vertical gradients as the overall gradient of the image,

G(u, v) = sqrt(G_x(u, v)^2 + G_y(u, v)^2) (2)

where G(u, v) denotes the image gradient value at image coordinate (u, v);
s33: arranging the gradient values G(u, v) of all pixels in ascending order, dividing them uniformly into groups (100 groups are used in the invention), counting the frequency of each group to form a histogram, and approximating the histogram by the exponential probability density function

f(x) = λ e^(−λx) (3)

where x denotes the image gradient value and λ is obtained by fitting the histogram data with formula (3);
s34: substituting the fraction of weak-gradient pixels in the total pixel count, together with the λ obtained from formula (3), into the exponential distribution function

F(x) = 1 − e^(−λx) (4)

and solving for the corresponding gradient segmentation threshold; all pixels whose gradient value is below this threshold are set to 0.
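A minimal sketch of steps S31–S34. Two details are assumptions: λ is fitted by the exponential maximum-likelihood estimate (the reciprocal of the mean gradient) instead of the patent's 100-bin histogram fit, and the weak-gradient percentage is exposed as a hypothetical `keep_fraction` parameter:

```python
import numpy as np

def weak_gradient_eliminate(gray, keep_fraction=0.35):
    """Background separation by weak-gradient elimination.

    keep_fraction is the assumed fraction p of pixels treated as
    weak-gradient background; the threshold T is found by inverting
    the exponential CDF F(T) = 1 - exp(-lam*T) = p.
    """
    g = gray.astype(float)
    # Prewitt kernels for horizontal / vertical gradients
    px = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], float)
    py = px.T
    pad = np.pad(g, 1, mode="edge")
    gx = np.zeros_like(g)
    gy = np.zeros_like(g)
    h, w = g.shape
    for v in range(h):
        for u in range(w):
            win = pad[v:v + 3, u:u + 3]
            gx[v, u] = np.sum(win * px)       # horizontal gradient
            gy[v, u] = np.sum(win * py)       # vertical gradient
    grad = np.sqrt(gx**2 + gy**2)             # overall gradient, eq. (2)
    lam = 1.0 / max(grad.mean(), 1e-12)       # ML fit of lambda (assumption)
    T = -np.log(1.0 - keep_fraction) / lam    # invert exponential CDF
    out = grad.copy()
    out[grad < T] = 0                         # suppress weak gradients
    return out
```

On a flat background with a bright object, the flat regions fall below the threshold and are zeroed while the object's edges survive.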
The step S4, processing the background-separated image, detecting all ellipses in it with a rapid ellipse detection method, labeling the ellipses in order 1, 2, …, n, and determining the general conic function of each ellipse, specifically includes:
Processing the filtered, background-separated image with the rapid ellipse detection method yields five parameters (x_0i, y_0i, a_i, b_i, θ_i), i ∈ {1, 2, …, n}, for each of a series of ellipses, where (x_0i, y_0i) is the center position of the ellipse, a_i its semi-major axis, b_i its semi-minor axis, and θ_i the angle by which the major axis is rotated from the x axis of the image coordinate system. From these five parameters, the general conic function of the ellipse is determined as

Au^2 + Bv^2 + Cuv + Du + Ev + F = 0 (5)

where A, B, C, D, E and F are the coefficients of the general conic function.
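The conversion from the five ellipse parameters to the conic coefficients of formula (5) follows from expanding the rotated, translated canonical ellipse equation; the function name below is illustrative:

```python
import numpy as np

def ellipse_to_conic(x0, y0, a, b, theta):
    """Convert ellipse parameters (center, semi-axes, rotation) to the
    general-conic coefficients of A*u^2 + B*v^2 + C*u*v + D*u + E*v + F = 0.

    Derived from ((X*c + Y*s)/a)^2 + ((-X*s + Y*c)/b)^2 = 1 with
    X = u - x0, Y = v - y0, c = cos(theta), s = sin(theta).
    """
    c, s = np.cos(theta), np.sin(theta)
    A = (c / a) ** 2 + (s / b) ** 2          # u^2 coefficient
    B = (s / a) ** 2 + (c / b) ** 2          # v^2 coefficient
    C = 2 * c * s * (1 / a**2 - 1 / b**2)    # uv coefficient
    D = -2 * A * x0 - C * y0                 # u coefficient
    E = -2 * B * y0 - C * x0                 # v coefficient
    F = A * x0**2 + B * y0**2 + C * x0 * y0 - 1
    return A, B, C, D, E, F
```

Any point generated parametrically on the ellipse satisfies the resulting conic to machine precision, which is a quick self-check for the expansion.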
Step S5, matching the radii of all labeled ellipses against the known target rotating rigid-body spacecraft 3D model and solving the general conic functions of all ellipses by a geometric method to obtain the normal vector and center coordinates of each ellipse's plane in the camera coordinate system, specifically includes:
s51: matching the labeled ellipses against the known target spacecraft 3D model in the positive Z_w direction gives the radius r_i of each matched circle. Formula (5) is rewritten in the algebraic form

[u v 1] Q [u v 1]^T = 0 (7)

where Q is the symmetric conic matrix

Q = [[A, C/2, D/2], [C/2, B, E/2], [D/2, E/2, F]]

A point expressed as P_c = (X_c, Y_c, Z_c)^T in the camera coordinate system and as (u, v)^T in the pixel coordinate system, with corresponding homogeneous coordinates (u, v, 1)^T, satisfies the pinhole relationship Z_c (u, v, 1)^T = M_ins (X_c, Y_c, Z_c)^T, where X_c, Y_c and Z_c are the point's coordinates along the X_c, Y_c and Z_c axes of the camera coordinate system and M_ins is the intrinsic parameter matrix of the monocular camera. Transferring formula (7) to the camera coordinate system yields the equation of the oblique elliptic cone Γ in the camera coordinate system:

[X_c Y_c Z_c] C_Q [X_c Y_c Z_c]^T = 0 (10)

where

C_Q = M_ins^T Q M_ins (11)
A coordinate system O_c-X'Y'Z' is established at the origin of the camera coordinate system, with the Z' axis parallel to the normal of the ellipse's plane, so that the oblique elliptic cone Γ becomes a right elliptic cone. The coordinate systems O_c-X'Y'Z' and O_c-X_cY_cZ_c differ only by a rotation; through the rotation matrix P, C_Q is converted to a diagonal matrix:

P^T C_Q P = diag(λ_1, λ_2, λ_3) (12)

where λ_1, λ_2 and λ_3 are the eigenvalues of C_Q, with λ_1 ≥ λ_2 > 0 > λ_3. From these eigenvalues and the matched radius, the center coordinates O'_i and the normal vector n'_i of the plane of the i-th ellipse in the coordinate system O_c-X'Y'Z' are computed in closed form, where i ∈ {1, 2, …, n} and r_i is the radius of each circle after matching with the known target spacecraft 3D model.
The center coordinates O'_i and normal vector n'_i of the ellipse's plane are then transferred to the camera coordinate system through the rotation P, giving O_i^c = P O'_i and n_i^c = P n'_i, where O_i^c denotes the center coordinates of the plane of the i-th ellipse in the camera coordinate system and n_i^c denotes its normal vector.
s52: matching the labeled ellipses against the known target spacecraft 3D model in the negative Z_w direction and repeating step S51 with the resulting matched radii r_i gives the center coordinates and normal vectors, in the camera coordinate system, of the planes of the series of ellipses matched in the negative direction.
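The diagonalization in equation (12) amounts to an eigendecomposition of C_Q with the eigenvalues reordered so that λ_1 ≥ λ_2 > 0 > λ_3. A sketch follows; the handedness fix-up for P is an assumption, since the patent text does not fix the sign convention of the eigenvectors:

```python
import numpy as np

def diagonalize_cone(CQ):
    """Rotate the oblique elliptic cone matrix C_Q into a right
    elliptic cone: returns (P, lam) with P^T @ CQ @ P = diag(lam)
    and lam ordered as lam[0] >= lam[1] > 0 > lam[2].
    """
    w, V = np.linalg.eigh(CQ)          # symmetric eigendecomposition
    order = np.argsort(w)[::-1]        # descending eigenvalue order
    lam = w[order]
    P = V[:, order]
    if np.linalg.det(P) < 0:           # keep P a proper rotation
        P[:, 2] = -P[:, 2]
    return P, lam
```

Because C_Q is symmetric, `eigh` gives an orthogonal eigenvector matrix, so reordering the columns and fixing the determinant leaves P a rotation satisfying equation (12).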
The step S6, fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and computing the unit direction vector of that line, where i denotes the i-th labeled ellipse, specifically includes:
The simplified form of the line l is

x = a z + b, y = c z + d (15)

where a, b, c and d are the parameters of the line l. Writing (15) in matrix form over all center coordinates and solving by least squares yields a, b, c and d. The direction vector of the line l is then N = (a, c, 1)^T, which is normalized to the unit direction vector n_l = N / ||N||.
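The least-squares fit of step S6, under the parametrization x = a z + b, y = c z + d, can be sketched as follows (the function name is illustrative; this parametrization assumes the line is not perpendicular to the z axis):

```python
import numpy as np

def fit_line_z(centers):
    """Least-squares fit of x = a*z + b, y = c*z + d to the ellipse
    centers; returns the unit direction vector N/||N|| with
    N = (a, c, 1).
    """
    centers = np.asarray(centers, float)
    x, y, z = centers[:, 0], centers[:, 1], centers[:, 2]
    M = np.column_stack([z, np.ones_like(z)])       # [z_i 1] design matrix
    (a, b), *_ = np.linalg.lstsq(M, x, rcond=None)  # fit x against z
    (c, d), *_ = np.linalg.lstsq(M, y, rcond=None)  # fit y against z
    N = np.array([a, c, 1.0])
    return N / np.linalg.norm(N)
```

For exactly collinear centers the fit recovers the line's direction exactly; with noisy centers it returns the least-squares direction used as the candidate plane normal in step S7.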
Step S7, using the fact that the centers of all ellipses lie on one straight line and that this line is the normal of the planes of the ellipses, uniquely determining the normal vector and center coordinates of the reference ellipse's plane, taking the plane of the first ellipse as reference, and computing the rotation matrix of that plane relative to the tracking spacecraft as the attitude estimate of the rotating rigid-body spacecraft, specifically includes:
s71: computing the angle between the unit direction vector n_l of the fitted line l and each of the two candidate normal vectors of the reference ellipse's plane obtained in S5;
s72: uniquely determining the normal vector of the reference ellipse's plane, and its corresponding center coordinates, by selecting the candidate whose angle to n_l satisfies the threshold condition, where ε < 5° is the angle threshold;
s73: computing the displacement vector t of the target spacecraft relative to the tracking spacecraft from the center coordinates of the uniquely determined reference ellipse plane:

t = (O_x, O_y, O_z)^T (21)

where (O_x, O_y, O_z) are the center coordinates of the plane in which the uniquely determined reference ellipse lies.
the normal vector of the plane of the uniquely determined reference ellipseAs Z D Axial direction vectorCalculating a vector perpendicular thereto as X D Axial direction vectorThereby calculating a rotation matrix R of the target spacecraft relative to the tracking spacecraft:
wherein, representing the normal vector of the plane in which the uniquely determined reference ellipse lies.
To facilitate analysis of the error of the determined relative attitude, the rotation matrix R is written in Euler-angle form. Let the target spacecraft rotate about the X_w axis by the pitch angle φ, about the Y_w axis by the yaw angle ψ, and about the Z_w axis by the roll angle γ; the Euler-angle form of R is expressed according to the rotation order X_w, Y_w, Z_w (23). From equation (23), the Euler angles can be computed (24), where R_ij (i, j ∈ {1, 2, 3}) denotes the element in row i, column j of the rotation matrix R.
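Since equation (23) is not reproduced above, the Euler-angle extraction of equation (24) is sketched here under an assumed convention: R = R_z(γ) R_y(ψ) R_x(φ), i.e. rotation about X_w first, then Y_w, then Z_w, composed in the fixed frame. Whether this matches the patent's exact convention is an assumption:

```python
import numpy as np

def euler_from_R(R):
    """Recover (pitch phi, yaw psi, roll gamma) from a rotation matrix,
    assuming R = Rz(gamma) @ Ry(psi) @ Rx(phi). Gimbal lock
    (|psi| = 90 degrees) is not handled.
    """
    phi = np.arctan2(R[2, 1], R[2, 2])    # pitch about X_w
    psi = np.arcsin(-R[2, 0])             # yaw about Y_w
    gamma = np.arctan2(R[1, 0], R[0, 0])  # roll about Z_w
    return phi, psi, gamma
```

Round-tripping a rotation matrix built from known angles recovers them exactly away from the gimbal-lock singularity.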
Compared with the prior art, the invention has the advantages that:
(1) Compared with existing binocular-vision spacecraft pose estimation methods, the monocular-vision method designed by the invention is not subject to baseline constraints; it has lower hardware complexity and cost as well as lower power consumption and mass, reducing payload requirements. It can determine the pose quickly under low power and mass budgets and better meets practical engineering needs.
(2) Compared with existing monocular-vision pose estimation methods based on projected markers, the method designed by the invention needs no additional projection and identification of markers and wastes no resources; it is applicable to more distant, marker-free spacecraft; its computational load is small, its runtime short, and its accuracy high. It can effectively estimate the relative attitude of a non-cooperative target spacecraft with respect to the tracking spacecraft and has good engineering value.
Drawings
FIG. 1 is a flow chart of a method for estimating attitude of a rotating rigid body spacecraft based on monocular vision according to the present invention;
fig. 2 is a schematic diagram of a coordinate system for describing the attitude of a spacecraft, which is established in embodiment 1 of the invention;
FIG. 3 is a normal vector diagram estimated from the centers of all detected ellipses in the target spacecraft in embodiment 1 of the present invention;
FIG. 4 is a diagram of the absolute error of the estimated ellipse normal vector for the target spacecraft in embodiment 1 of the present invention;
fig. 5 is a diagram of absolute error of a displacement vector of a target spacecraft with respect to a tracking spacecraft in embodiment 1 of the present invention;
FIG. 6 is a diagram of relative errors of a target spacecraft with respect to a displacement vector of a tracking spacecraft in embodiment 1 of the present invention;
fig. 7 is a diagram of an estimation error of the euler angle of the target spacecraft with respect to the tracking spacecraft in embodiment 1 of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only illustrative and are not intended to limit the present invention.
As shown in fig. 1, the method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision of the present invention comprises the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the background-separated image, detecting all ellipses in it with a rapid ellipse detection method, labeling the ellipses in order 1, 2, …, n, and determining the general conic function of each ellipse;
s5: matching the radii of all labeled ellipses against the known target rotating rigid-body spacecraft 3D model, and solving the general conic functions of all ellipses by a geometric method to obtain the normal vector and center coordinates of each ellipse's plane in the camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, to a straight line and computing the unit direction vector of that line, where i denotes the i-th labeled ellipse;
s7: using the fact that the centers of all ellipses lie on one straight line and that this line is the normal of the planes of the ellipses, uniquely determining the normal vector and center coordinates of the reference ellipse's plane; taking the plane of the first ellipse as reference, computing the rotation matrix of that plane relative to the tracking spacecraft, and using this rotation matrix as the attitude estimate of the rotating rigid-body spacecraft.
The following describes in detail a specific implementation of the aforementioned monocular vision-based attitude estimation method for a rotating rigid-body spacecraft, with a specific embodiment.
Example 1:
firstly, establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix. The method is realized by the following specific steps:
as shown in FIG. 2, the coordinate systems are established as follows. Pixel coordinate system O_uv-uv: the vertex O_uv of the image obtained by the monocular camera on the tracking spacecraft is the origin; the abscissa u axis runs along the rows of the image and the ordinate v axis along the columns of the image. Image coordinate system O_xy-xy: the image center O_xy is the origin; the x axis is parallel to the u axis and the y axis is parallel to the v axis of the pixel coordinate system. Camera coordinate system O_c-X_cY_cZ_c: the camera optical center O_c is the origin; the Z_c axis points along the optical axis, the X_c axis is parallel to the x axis of the image coordinate system, and the Y_c axis is parallel to the y axis of the image coordinate system. Target spacecraft body coordinate system O_w-X_wY_wZ_w: established with the centroid O_w of the target spacecraft body as the origin; the outward normal of the plane of the ellipses is taken as the Z_w axis, a direction perpendicular to that normal as the X_w axis, and the Y_w axis is perpendicular to both the X_w and Z_w axes, forming a right-handed system. A coordinate system O_D-X_DY_DZ_D is established on the plane of the ellipse, with the plane center O_D as the origin, the X_D axis parallel to the X_w axis of the body coordinate system, the Y_D axis parallel to the Y_w axis, and the Z_D axis parallel to the Z_w axis of the target spacecraft body coordinate system.
And secondly, calculating the pixel coordinates of an image obtained by a monocular camera on the tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image. The method is realized by the following specific steps:
(1) converting the image obtained by the monocular camera on the tracking spacecraft into a gray-scale image;
(2) selecting a 3×3 sliding template and aligning it with the first three rows and first three columns of pixels of the gray-scale image; sorting all pixels within the sliding template by pixel value into a monotonically ascending or descending sequence, and replacing the pixel value of the template's center pixel with the median of that sequence;
(3) translating the whole sliding template by one pixel column along the u axis of the pixel coordinate system and repeating step (2) until all pixels of the row have been scanned;
(4) moving the whole sliding template down by one row of pixels, repeating steps (2) and (3) to scan the next row, and finally obtaining the median-filtered image.
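The median-filtering steps above amount to a standard 3×3 median filter; a minimal NumPy sketch (the function name and border handling — borders left unchanged — are our own choices, not specified by the patent):

```python
import numpy as np

def median_filter_3x3(gray):
    """3x3 sliding-template median filter, as in steps (2)-(4): sort the
    9 pixels under the template, replace the center pixel with the median,
    then slide along u, then move down one row."""
    h, w = gray.shape
    out = gray.copy()
    for v in range(1, h - 1):            # border pixels are left unchanged
        for u in range(1, w - 1):
            window = gray[v - 1:v + 2, u - 1:u + 2]
            out[v, u] = np.sort(window, axis=None)[4]   # median of 9 values
    return out
```

On the interior this is equivalent to a library median filter with a 3×3 window; the explicit loops mirror the scan order described in the steps.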
And thirdly, processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain the image after background separation. The method is realized by the following specific steps:
(1) calculating the image gradient with the Prewitt operator: the gradients of the image I in the horizontal and vertical directions are computed with two convolution kernels of window size 3×3,

G_x = I * K_x, G_y = I * K_y, with Prewitt kernels K_x = [−1 0 1; −1 0 1; −1 0 1] and K_y = K_x^T (1)

in the formula, G_x and G_y represent the image horizontal and vertical gradients respectively, and * represents the two-dimensional convolution operator.
(2) Taking the root mean square of the gradient values in the horizontal and vertical directions as the overall gradient of the image,

G(u, v) = √((G_x(u, v)² + G_y(u, v)²)/2) (2)

in the formula, G(u, v) represents the image gradient at image coordinates (u, v).
(3) Arranging the gradient values G(u, v) of all pixels in the image in ascending order, uniformly dividing them into 100 groups, counting the frequency of each group as a histogram, and approximating the histogram by the exponential probability density function

f(x) = λe^(−λx) (3)

wherein x represents an image gradient value, and λ is obtained by fitting the histogram data using formula (3);
(4) the percentage of the weak-gradient pixel amount in the total pixel amount and the λ obtained in (3) are substituted into the exponential distribution function

F(x) = 1 − e^(−λx) (4)

so that the corresponding gradient segmentation threshold can be calculated; all pixel values smaller than the gradient segmentation threshold are set to 0.
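The weak-gradient elimination of this step can be sketched as follows. The weak-gradient fraction (here 0.9) is an assumption, since the patent does not state the percentage; λ is fitted by the exponential maximum-likelihood estimate (λ = 1/mean) rather than the 100-bin histogram fit, and pixels are zeroed where their *gradient* falls below the threshold, which is our reading of "pixel values smaller than the threshold":

```python
import numpy as np

def weak_gradient_threshold(img, weak_fraction=0.9):
    """Background removal by weak-gradient elimination (steps (1)-(4))."""
    # Prewitt kernels for horizontal and vertical gradients (eq. (1)).
    kx = np.array([[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]], dtype=float)
    ky = kx.T
    p = np.pad(img.astype(float), 1, mode='edge')
    h, w = img.shape
    gx = np.zeros((h, w)); gy = np.zeros((h, w))
    for dv in range(3):                  # explicit 3x3 correlation
        for du in range(3):
            patch = p[dv:dv + h, du:du + w]
            gx += kx[dv, du] * patch
            gy += ky[dv, du] * patch
    # Root mean square of the two directional gradients (eq. (2)).
    g = np.sqrt((gx ** 2 + gy ** 2) / 2.0)
    # Exponential pdf f(x) = lam*exp(-lam*x): ML fit gives lam = 1/mean.
    lam = 1.0 / max(g.mean(), 1e-12)
    # Invert the CDF F(x) = 1 - exp(-lam*x) (eq. (4)) at weak_fraction.
    thresh = -np.log(1.0 - weak_fraction) / lam
    out = img.astype(float).copy()
    out[g <= thresh] = 0                 # suppress weak-gradient background
    return out
```

Strong edges (spacecraft contours) survive the threshold while flat background regions are zeroed.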
And fourthly, processing the image after background separation, detecting all ellipses in the background-separated image by using a rapid ellipse detection method, labeling them 1, 2, …, n in sequence, and determining the general functions of all the ellipses. This is realized by the following specific steps:
processing the background-separated image by the rapid ellipse detection method to obtain the five parameters (x_0i, y_0i, a_i, b_i, θ_i), i ∈ {1, 2, …, n}, of a series of ellipses, wherein (x_0i, y_0i) represents the center position of the ellipse, a_i the semi-major axis of the ellipse, b_i the semi-minor axis of the ellipse, and θ_i the angle through which the major axis is rotated from the x axis of the image coordinate system; from the five parameters of the ellipse, the general function of the ellipse is determined as follows:

Au² + Bv² + Cuv + Du + Ev + F = 0 (5)

wherein A, B, C, D, E and F are the parameters of the elliptic general function.
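For illustration, the five detected parameters can be converted to the coefficients of the general function (5); the function name and the homogeneous-matrix construction are ours, with θ taken as the rotation of the major axis from the image x axis as stated in the text:

```python
import numpy as np

def ellipse_to_conic(x0, y0, a, b, theta):
    """Convert the five ellipse parameters (x0, y0, a, b, theta) into the
    general-conic coefficients (A, B, C, D, E, F) of eq. (5):
    A u^2 + B v^2 + C uv + D u + E v + F = 0."""
    c, s = np.cos(theta), np.sin(theta)
    # Map image coordinates into the ellipse's canonical (axis-aligned) frame.
    T = np.array([[ c, s, -(c * x0 + s * y0)],
                  [-s, c,  (s * x0 - c * y0)],
                  [ 0, 0,  1.0]])
    # Canonical conic: (x'/a)^2 + (y'/b)^2 - 1 = 0.
    D0 = np.diag([1.0 / a**2, 1.0 / b**2, -1.0])
    Q = T.T @ D0 @ T                     # conic matrix in image coordinates
    return Q[0, 0], Q[1, 1], 2 * Q[0, 1], 2 * Q[0, 2], 2 * Q[1, 2], Q[2, 2]
```

The symmetric matrix Q built here is exactly the matrix used in the algebraic form [u v 1] Q [u v 1]^T = 0 of the next step.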
And fifthly, matching the marked ellipse with the known target spacecraft 3D model from two directions respectively, and solving general functions of all the ellipses by using a geometric method to obtain normal vectors and central coordinates of the plane where each ellipse is located under the camera coordinate system.
(1) Matching the marked ellipses with the known target spacecraft 3D model in the positive direction of the Z_w axis to obtain the radius r_i of each matched circle. Equation (5) is rewritten in algebraic form:

[u v 1] Q [u v 1]^T = 0 (7)

wherein

Q = [A C/2 D/2; C/2 B E/2; D/2 E/2 F] (8)
Since a point expressed as (X_c, Y_c, Z_c)^T in the camera coordinate system and as homogeneous pixel coordinates (u, v, 1)^T satisfies

Z_c [u v 1]^T = M_ins [X_c Y_c Z_c]^T (9)

wherein M_ins is the internal parameter matrix of the monocular camera, formula (7) is converted into the camera coordinate system to obtain the oblique elliptic cone Γ equation in the camera coordinate system as follows:
[X_c Y_c Z_c] C_Q [X_c Y_c Z_c]^T = 0 (10)

wherein

C_Q = M_ins^T Q M_ins (11)
A coordinate system O_c-X'Y'Z' is established at the origin of the camera coordinate system, with the Z' axis parallel to the normal of the plane of the ellipse, so that the oblique elliptic cone Γ becomes a right elliptic cone; between the coordinate system O_c-X'Y'Z' and the camera coordinate system O_c-X_cY_cZ_c there is only a rotational transformation. Through the rotation transformation matrix P, C_Q is converted to a diagonal matrix:

P^T C_Q P = diag(λ_1, λ_2, λ_3) (12)
wherein λ_1, λ_2 and λ_3 are the eigenvalues of C_Q, with λ_1 ≥ λ_2 > 0 > λ_3. The center coordinates O'_i and normal vector n'_i of the plane of the i-th ellipse in the coordinate system O_c-X'Y'Z' are then given by equation (13), wherein i ∈ {1, 2, …, n} and r_i represents the radius of each circle after matching with the known target spacecraft 3D model;
the center coordinates O'_i and normal vector n'_i of the plane of the ellipse are converted to the camera coordinate system by equation (14), yielding the center coordinates and the normal vector of the plane of the i-th ellipse in the camera coordinate system.
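Steps (10)-(14) can be sketched end-to-end. The closed-form center and normal below follow the standard cone-circle geometry for eigenvalues λ_1 ≥ λ_2 > 0 > λ_3; since the patent's equations (13)-(14) are not reproduced in this text, the exact sign conventions here are our assumption, and the two returned candidates reflect the two-fold ambiguity that step seven later resolves:

```python
import numpy as np

def circle_pose_from_conic(Q, M_ins, r):
    """Recover the two candidate (center, normal) pairs, in the camera
    frame, of a circle of known radius r from its image conic Q."""
    C_Q = M_ins.T @ Q @ M_ins            # eq. (11): cone in the camera frame
    w, V = np.linalg.eigh(C_Q)
    if (w > 0).sum() == 1:               # a conic matrix is defined up to sign
        w = -w
    order = np.argsort(w)[::-1]          # lam1 >= lam2 > 0 > lam3 (eq. (12))
    l1, l2, l3 = w[order]
    P = V[:, order]                      # rotation P of eq. (12)
    results = []
    for s in (1.0, -1.0):                # two-fold orientation ambiguity
        sa = s * np.sqrt((l1 - l2) / (l1 - l3))   # sin(alpha): plane tilt
        ca = np.sqrt((l2 - l3) / (l1 - l3))       # cos(alpha)
        n = P @ np.array([sa, 0.0, ca])           # normal, cone frame -> camera
        c = P @ (r * np.array([-sa * np.sqrt(-l3 / l1), 0.0,
                               ca * np.sqrt(l1 / -l3)]))
        if c[2] < 0:                              # keep the circle in front
            n, c = -n, -c
        results.append((c, n))
    return results
```

When λ_1 = λ_2 the circle is viewed frontally, sin(α) = 0, and both candidates coincide.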
(2) Matching the marked ellipses with the known target spacecraft 3D model in the opposite direction of the Z_w axis, and repeating step (1) to obtain the radius r_i of each matched circle, thereby calculating the center coordinates and normal vectors, in the camera coordinate system, of the planes of the series of ellipses matched in the opposite direction of the Z_w axis.
Sixthly, fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, into a straight line and calculating the unit direction vector of the line, wherein i represents the i-th marked ellipse. This is realized by the following steps:
the simplified form of the line l is

x = az + b, y = cz + d (15)

wherein a, b, c and d are the parameters of the line l. Writing (15) in matrix form over all fitted center coordinates gives an overdetermined linear system, which is solved in the least-squares sense for a, b, c and d; this gives (a, c, 1) as a direction vector N of the line l, and N is normalized into the unit direction vector of the line.
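The sixth step reduces to two scalar least-squares fits; a sketch (the parametrisation x = az + b, y = cz + d matches the stated direction vector (a, c, 1); the function name is ours):

```python
import numpy as np

def fit_line_direction(centers):
    """Least-squares fit of the ellipse centers (x_i, y_i, z_i) to the line
    x = a z + b, y = c z + d (eq. (15)); returns the unit direction vector
    along (a, c, 1), as in the sixth step."""
    centers = np.asarray(centers, dtype=float)
    x, y, z = centers[:, 0], centers[:, 1], centers[:, 2]
    Az = np.column_stack([z, np.ones_like(z)])   # design matrix [z 1]
    (a, b), _, _, _ = np.linalg.lstsq(Az, x, rcond=None)
    (c, d), _, _, _ = np.linalg.lstsq(Az, y, rcond=None)
    n = np.array([a, c, 1.0])
    return n / np.linalg.norm(n)
```

This parametrisation assumes the line is not perpendicular to the Z_c axis, which holds here because the line is the view-facing normal of the marker planes.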
And seventhly, according to the characteristic that the centers of all ellipses lie on one straight line, which is the normal of the plane in which the ellipses lie, uniquely determining the normal vector and center coordinates of the plane of the reference ellipse; taking the plane of the first ellipse as the reference, calculating the rotation matrix of that plane relative to the tracking spacecraft, and using this rotation matrix as the estimate of the attitude of the rotating rigid body spacecraft. This is realized by the following specific steps:
(1) calculating the angles between the unit direction vector n of the line l and the two candidate normal vectors of the plane of the reference ellipse obtained in the fifth step;
(2) uniquely determining the normal vector of the plane of the reference ellipse, and its corresponding center coordinates, by retaining the candidate whose angle to the line direction falls below the threshold, wherein ε < 5° represents the angle threshold;
(3) calculating the displacement vector t of the target spacecraft relative to the tracking spacecraft from the center coordinates of the plane of the uniquely determined reference ellipse:

t = (O_x, O_y, O_z)^T (21)

wherein (O_x, O_y, O_z) represents the center coordinates of the plane of the uniquely determined reference ellipse;
(4) taking the uniquely determined normal vector of the plane of the reference ellipse as the direction vector of the Z_D axis, calculating a vector perpendicular to it as the direction vector of the X_D axis, and thereby calculating the rotation matrix R of the target spacecraft relative to the tracking spacecraft by equation (22), in which the column corresponding to the Z_D axis is the uniquely determined normal vector of the plane of the reference ellipse.
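Equation (22) builds R from the determined normal; a sketch in which the choice of the perpendicular X_D vector is arbitrary (the patent does not fix it), so any rotation about Z_D is unobservable from the normal alone:

```python
import numpy as np

def rotation_from_normal(n):
    """Build a rotation matrix R as in eq. (22): the determined plane normal
    is the Z_D axis, an arbitrary perpendicular unit vector is X_D, and Y_D
    completes the right-handed frame; R's columns are [X_D, Y_D, Z_D]."""
    z = np.asarray(n, dtype=float)
    z = z / np.linalg.norm(z)
    # Any vector not parallel to z gives a perpendicular X_D after projection.
    helper = np.array([1.0, 0.0, 0.0])
    if abs(z[0]) > 0.9:
        helper = np.array([0.0, 1.0, 0.0])
    x = helper - np.dot(helper, z) * z
    x = x / np.linalg.norm(x)
    y = np.cross(z, x)                   # right-handed: x cross y = z
    return np.column_stack([x, y, z])
```

For a rotating rigid body whose markers are circles about the spin axis, this undetermined rotation about Z_D is exactly the spin phase, which the method does not need to fix.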
In order to facilitate analysis of the error of the determined relative attitude, the rotation matrix R is written in Euler-angle form: the rotation of the target spacecraft about the X_w axis is the pitch angle φ, the rotation about the Y_w axis is the yaw angle, and the rotation about the Z_w axis is the roll angle γ. Following the X_w, Y_w, Z_w rotation sequence, R is expressed in Euler-angle form as equation (23); from equation (23), the Euler angles are calculated as in equation (24), wherein R_ij (i, j ∈ {1, 2, 3}) represents the element in row i, column j of the rotation matrix R.
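The Euler-angle extraction of equations (23)-(24) can be sketched as below. Since equation (23) is not reproduced in this text, the convention R = R_x(φ)·R_y(ψ)·R_z(γ) for the X_w, Y_w, Z_w sequence, and the symbol ψ for the yaw angle, are our assumptions:

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def euler_xyz(R):
    """Recover (pitch phi, yaw psi, roll gamma) from R, assuming the
    X-Y-Z sequence R = rot_x(phi) @ rot_y(psi) @ rot_z(gamma)."""
    psi = np.arcsin(np.clip(R[0, 2], -1.0, 1.0))   # yaw about Y_w
    phi = np.arctan2(-R[1, 2], R[2, 2])            # pitch about X_w
    gamma = np.arctan2(-R[0, 1], R[0, 0])          # roll about Z_w
    return phi, psi, gamma
```

The arctan2 forms avoid quadrant ambiguity; the formulas degenerate only at ψ = ±90°, which does not occur for the near-frontal viewing geometry of the embodiment.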
The precision of the monocular-vision-based rotating rigid body spacecraft attitude estimation method provided by embodiment 1 of the present invention is assessed as follows. FIG. 1 is a flow chart of the steps of the present invention. FIG. 2 depicts the target rotating rigid body spacecraft model, including the pixel coordinate system, the image coordinate system, the camera coordinate system, the coordinate system of the plane of the reference ellipse, and the target spacecraft body coordinate system. FIG. 3 depicts, for a target spacecraft 4 meters from the tracking spacecraft, the direction vector of the least-squares line through the centers of all detected ellipses, together with the estimated and true normal vectors; FIG. 4 shows the absolute-error curve between the estimated and true normal vectors, with all errors within 0.4°. FIG. 5 and FIG. 6 respectively show the absolute-error and relative-error curves of the position of the target spacecraft relative to the tracking spacecraft: the absolute error is within 10 cm and the relative error is less than 1.2%, so the relative position of the target spacecraft with respect to the tracking spacecraft is recovered. FIG. 7 shows the Euler-angle errors of the target spacecraft relative to the tracking spacecraft, with the upper plot representing the pitch-angle φ error, the middle plot the roll-angle γ error, and the lower plot the yaw-angle error; all Euler-angle errors in the figure are very small. All angle and position errors in FIGS. 3-7 decrease as the monocular camera approaches the target spacecraft, because as the distance between the monocular camera and the target spacecraft decreases, the resolution of objects in the image increases and the feature-extraction error decreases.
The simulation results fully demonstrate that the monocular-vision-based rotating rigid body spacecraft attitude estimation method provided by the embodiment of the present invention is feasible and effective, with low computational cost and high accuracy, and can achieve effective attitude estimation for a non-cooperative target spacecraft.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.
Claims (8)
1. A method for estimating the attitude of a rotating rigid body spacecraft based on monocular vision is characterized by comprising the following steps:
s1: establishing a known target rotating rigid body spacecraft 3D model, and simultaneously establishing a coordinate system for describing the posture of a spacecraft, wherein the coordinate system comprises a pixel coordinate system, an image coordinate system, a camera coordinate system and a target rotating rigid body spacecraft body coordinate system; calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix;
s2: calculating pixel coordinates of an image obtained by a monocular camera on a tracking spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image;
s3: processing the image after median filtering, and separating the spacecraft from the background by using a weak gradient elimination method to obtain an image after background separation;
s4: processing the image after background separation, detecting all ellipses in the background-separated image by using a rapid ellipse detection method, labeling them 1, 2, …, n in sequence, and determining general functions of all the ellipses;
s5: matching the radii of all marked ellipses with the known 3D model of the target rotating rigid body spacecraft, and solving the general functions of all ellipses by a geometric method to obtain the normal vector and center coordinates of the plane of each ellipse in the camera coordinate system;
s6: fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, into a straight line and calculating the unit direction vector of the line, wherein i represents the i-th marked ellipse;
s7: according to the characteristic that the centers of all ellipses lie on one straight line, and that this line is the normal of the plane in which the ellipses lie, uniquely determining the normal vector and center coordinates of the plane of the reference ellipse; taking the plane of the first ellipse as the reference, calculating the rotation matrix of that plane relative to the tracking spacecraft, and using this rotation matrix as the estimate of the attitude of the rotating rigid body spacecraft.
2. A monocular vision based rotating rigid body spacecraft attitude estimation method according to claim 1, wherein in step S1, a known target rotating rigid body spacecraft 3D model is established, along with a coordinate system describing spacecraft attitude, the coordinate system comprising a pixel coordinate system, an image coordinate system, a camera coordinate system, and a target rotating rigid body spacecraft body coordinate system; and calculating a conversion relation among the pixel coordinate system, the image coordinate system, the camera coordinate system and the target rotating rigid body spacecraft body coordinate system to obtain a coordinate conversion matrix, which specifically comprises the following steps:
pixel coordinate system O_uv-uv: the vertex O_uv of the image obtained by the monocular camera on the tracking spacecraft is the origin; the abscissa u axis runs along the rows of the image and the ordinate v axis along the columns of the image;
image coordinate system O_xy-xy: the image center O_xy is the origin; the x axis is parallel to the u axis and the y axis is parallel to the v axis of the pixel coordinate system;
camera coordinate system O_c-X_cY_cZ_c: the camera optical center O_c is the origin; the Z_c axis points along the optical axis, the X_c axis is parallel to the x axis of the image coordinate system, and the Y_c axis is parallel to the y axis of the image coordinate system;
target spacecraft body coordinate system O_w-X_wY_wZ_w: established with the centroid O_w of the target spacecraft body as the origin; the outward normal of the plane of the ellipses is taken as the Z_w axis, a direction perpendicular to that normal as the X_w axis, and the Y_w axis is perpendicular to both the X_w and Z_w axes, forming a right-handed system;
a coordinate system O_D-X_DY_DZ_D established on the plane of the ellipse, with the plane center O_D as the origin, the X_D axis parallel to the X_w axis of the body coordinate system, the Y_D axis parallel to the Y_w axis, and the Z_D axis parallel to the Z_w axis of the target spacecraft body coordinate system.
3. The method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 1, wherein in step S2, calculating pixel coordinates of an image obtained by tracking a monocular camera on the spacecraft in a pixel coordinate system, and performing median filtering to obtain a filtered image, specifically comprising:
s21: converting the image obtained by the monocular camera on the tracking spacecraft into a gray-scale image;
s22: selecting a 3×3 sliding template and aligning it with the first three rows and first three columns of pixels of the gray-scale image; sorting all pixels within the sliding template by pixel value into a monotonically ascending or descending sequence, and replacing the pixel value of the template's center pixel with the median of that sequence;
s23: translating the whole sliding template by one pixel column along the u axis of the pixel coordinate system and repeating step S22 until all pixels of the row have been scanned;
s24: moving the whole sliding template down by one row of pixels, repeating step S22 and step S23 to scan the next row, and finally obtaining the median-filtered image.
4. The method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 1, wherein in step S3, the processing is performed on the image after median filtering, and the spacecraft is separated from the background by weak gradient elimination, so as to obtain the image after background separation, specifically comprising:
s31: calculating the image gradient with the Prewitt operator: the gradients of the image I in the horizontal and vertical directions are computed with two convolution kernels of window size 3×3,

G_x = I * K_x, G_y = I * K_y, with Prewitt kernels K_x = [−1 0 1; −1 0 1; −1 0 1] and K_y = K_x^T (1)

in the formula, G_x and G_y respectively represent the horizontal and vertical gradients of the image, and * represents the two-dimensional convolution operator;
s32: taking the root mean square of the gradient values in the horizontal and vertical directions as the overall gradient of the image,

G(u, v) = √((G_x(u, v)² + G_y(u, v)²)/2) (2)

wherein G(u, v) represents the image gradient value at image coordinates (u, v);
s33: arranging the gradient values G(u, v) of all pixels in the image in ascending order, uniformly dividing them into a plurality of groups, counting the frequency of each group to form a histogram, and approximating the histogram by the exponential probability density function

f(x) = λe^(−λx) (3)

wherein x represents an image gradient value, and λ is obtained by fitting the histogram data using formula (3);
s34: substituting the percentage of the weak-gradient pixel amount in the total pixel amount and the λ obtained from formula (3) into the exponential distribution function

F(x) = 1 − e^(−λx) (4)

calculating the corresponding gradient segmentation threshold, and setting the pixel values which are less than or equal to the gradient segmentation threshold to 0;
5. the method for estimating pose of rotating rigid body spacecraft based on monocular vision as claimed in claim 1, wherein step S4, the image after background separation is processed, all ellipses in the image after background separation are detected by using fast ellipse detection method, labeled by 1,2, … n in sequence, and general functions of all ellipses are determined, specifically comprising:
processing the background-separated image by the rapid ellipse detection method to obtain the five parameters (x_0i, y_0i, a_i, b_i, θ_i), i ∈ {1, 2, …, n}, of a series of ellipses, wherein (x_0i, y_0i) represents the center position of the ellipse, a_i the semi-major axis of the ellipse, b_i the semi-minor axis of the ellipse, and θ_i the angle through which the major axis is rotated from the x axis of the image coordinate system; from the five parameters of the ellipse, the general function of the ellipse is determined as follows:

Au² + Bv² + Cuv + Du + Ev + F = 0 (5)

wherein A, B, C, D, E and F are the parameters of the elliptic general function;
6. the method for estimating pose of rotating rigid body spacecraft based on monocular vision according to claim 5, wherein the step S5 is to match radii of all marked ellipses with known 3D model of target rotating rigid body spacecraft, and solve general functions of all ellipses by using a geometric method to obtain normal vector and center coordinate of a plane where each ellipse is located under a camera coordinate system, and specifically comprises:
s51: matching the marked ellipses with the known target spacecraft 3D model in the positive direction of the Z_w axis to obtain the radius r_i of each matched circle; formula (5) is rewritten in algebraic form:

[u v 1] Q [u v 1]^T = 0 (7)

wherein

Q = [A C/2 D/2; C/2 B E/2; D/2 E/2 F] (8)
a point is represented as P_c = (X_c, Y_c, Z_c)^T in the camera coordinate system and as (u, v)^T in the pixel coordinate system, with corresponding homogeneous coordinates (u, v, 1)^T, satisfying the relation

Z_c [u v 1]^T = M_ins [X_c Y_c Z_c]^T (9)

wherein X_c, Y_c and Z_c respectively represent the distances of the point along the X_c axis, Y_c axis and Z_c axis of the camera coordinate system, and M_ins is the internal parameter matrix of the monocular camera; converting formula (7) into the camera coordinate system yields the oblique elliptic cone Γ equation in the camera coordinate system as follows:
[X_c Y_c Z_c] C_Q [X_c Y_c Z_c]^T = 0 (10)

wherein

C_Q = M_ins^T Q M_ins (11)
a coordinate system O_c-X'Y'Z' is established at the origin of the camera coordinate system, with the Z' axis parallel to the normal of the plane of the ellipse, so that the oblique elliptic cone Γ becomes a right elliptic cone; between the coordinate system O_c-X'Y'Z' and the camera coordinate system O_c-X_cY_cZ_c there is only a rotational transformation; through the rotation transformation matrix P, C_Q is converted to a diagonal matrix:

P^T C_Q P = diag(λ_1, λ_2, λ_3) (12)
wherein λ_1, λ_2 and λ_3 are the eigenvalues of C_Q, with λ_1 ≥ λ_2 > 0 > λ_3; the center coordinates O'_i and normal vector n'_i of the plane of the i-th ellipse in the coordinate system O_c-X'Y'Z' are given by equation (13), wherein i ∈ {1, 2, …, n} and r_i represents the radius of each circle after matching with the known target spacecraft 3D model;
the center coordinates O'_i and normal vector n'_i of the plane of the ellipse are converted to the camera coordinate system by equation (14), yielding the center coordinates and the normal vector of the plane of the i-th ellipse in the camera coordinate system;
s52: matching the marked ellipses with the known target spacecraft 3D model in the opposite direction of the Z_w axis, and repeating step S51 to obtain the radius r_i of each matched circle, thereby calculating the center coordinates and normal vectors, in the camera coordinate system, of the planes of the series of ellipses matched in the opposite direction of the Z_w axis.
7. A method for estimating pose of rotary rigid body spacecraft based on monocular vision as recited in claim 1, wherein said step S6, fitting all the center coordinates (x_i, y_i, z_i), i ∈ {1, 2, …, n}, into a straight line, and calculating the unit direction vector of the line, specifically comprises:
the simplified form of the line l is

x = az + b, y = cz + d (15)

wherein a, b, c and d are the parameters of the line l; writing (15) in matrix form over all fitted center coordinates gives an overdetermined linear system, which is solved in the least-squares sense for a, b, c and d; this gives (a, c, 1) as a direction vector of the line l, which is normalized into the unit direction vector of the line.
8. The method for estimating pose of a rotating rigid body spacecraft based on monocular vision as claimed in claim 1, wherein said step S7, according to the characteristic that all ellipses have their centers in a straight line, and the straight line is the normal of the plane where the ellipses are located, uniquely determines the normal vector and the center coordinates of the plane where the ellipses are located, and calculates the rotation matrix of the plane where the reference ellipse is located with respect to the tracked spacecraft by using the plane where the first ellipse is located as the reference, and the rotation matrix is used as the estimate of pose of the rotating rigid body spacecraft, specifically comprising:
s71: calculating the angles between the unit direction vector n of the line l and the two candidate normal vectors of the plane of the reference ellipse obtained by S5;
s72: uniquely determining the normal vector of the plane of the reference ellipse, and its corresponding center coordinates, by retaining the candidate whose angle to the line direction falls below the threshold, wherein ε < 5° represents the angle threshold;
s73: calculating the displacement vector t of the target spacecraft relative to the tracking spacecraft from the center coordinates of the plane of the uniquely determined reference ellipse:

t = (O_x, O_y, O_z)^T (21)

wherein (O_x, O_y, O_z) represents the center coordinates of the plane of the uniquely determined reference ellipse;
taking the uniquely determined normal vector of the plane of the reference ellipse as the direction vector of the Z_D axis, calculating a vector perpendicular to it as the direction vector of the X_D axis, and thereby calculating the rotation matrix R of the target spacecraft relative to the tracking spacecraft by equation (22), in which the column corresponding to the Z_D axis is the uniquely determined normal vector of the plane of the reference ellipse;
in order to facilitate analysis of the error of the determined relative attitude, the rotation matrix R is written in Euler-angle form: the rotation of the target spacecraft about the X_w axis is the pitch angle φ, the rotation about the Y_w axis is the yaw angle, and the rotation about the Z_w axis is the roll angle γ; following the X_w, Y_w, Z_w rotation sequence, R is expressed in Euler-angle form as equation (23); from equation (23), the Euler angles are calculated as in equation (24), wherein R_ij (i, j ∈ {1, 2, 3}) represents the element in row i, column j of the rotation matrix R.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110545278.7A CN113295171B (en) | 2021-05-19 | 2021-05-19 | Monocular vision-based attitude estimation method for rotating rigid body spacecraft |
Publications (2)
Publication Number | Publication Date |
---|---|
CN113295171A CN113295171A (en) | 2021-08-24 |
CN113295171B true CN113295171B (en) | 2022-08-16 |
Family
ID=77322796
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114693786B (en) * | 2022-03-16 | 2024-09-17 | 北京理工大学 | Target one-dimensional position parameter measurement method based on DSP |
CN114963981B (en) * | 2022-05-16 | 2023-08-15 | 南京航空航天大学 | Cylindrical part butt joint non-contact measurement method based on monocular vision |
CN116310126B (en) * | 2023-03-23 | 2023-11-03 | 南京航空航天大学 | Aircraft air inlet three-dimensional reconstruction method and system based on cooperative targets |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108492333A (en) * | 2018-03-30 | 2018-09-04 | 哈尔滨工业大学 | Spacecraft attitude method of estimation based on satellite-rocket docking ring image information |
CN109405835A (en) * | 2017-08-31 | 2019-03-01 | 北京航空航天大学 | Relative pose measurement method based on noncooperative target straight line and circle monocular image |
CN110186465A (en) * | 2019-07-03 | 2019-08-30 | 西北工业大学 | A kind of space non-cooperative target relative status estimation method based on monocular vision |
Also Published As
Publication number | Publication date |
---|---|
CN113295171A (en) | 2021-08-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108562274B (en) | Marker-based non-cooperative target pose measurement method | |
CN113295171B (en) | Monocular vision-based attitude estimation method for rotating rigid body spacecraft | |
Zhang et al. | Vision-based pose estimation for textureless space objects by contour points matching | |
CN101598556B (en) | Unmanned aerial vehicle vision/inertia integrated navigation method in unknown environment | |
CN103697855B (en) | Hull horizontal attitude measurement method based on sea horizon detection | |
CN107862719A (en) | Calibration method and device, computer equipment, and storage medium for camera extrinsic parameters | |
CN107677274B (en) | Real-time computation method for UAV autonomous landing navigation information based on binocular vision | |
CN105856230A (en) | ORB keyframe loop-closure detection SLAM method for improving robot pose consistency | |
CN109631911B (en) | Satellite attitude rotation information determination method based on deep learning target recognition algorithm | |
CN115187798A (en) | Multi-unmanned aerial vehicle high-precision matching positioning method | |
CN106529587A (en) | Visual heading recognition method based on target point recognition | |
CN111273312A (en) | Intelligent vehicle positioning and loop-closure detection method | |
CN109214254B (en) | Method and device for determining displacement of robot | |
CN114549629A (en) | Method for estimating three-dimensional pose of target by underwater monocular vision | |
CN116563377A (en) | Mars rock measurement method based on hemispherical projection model | |
Kaufmann et al. | Shadow-based matching for precise and robust absolute self-localization during lunar landings | |
CN109871024A (en) | UAV pose estimation method based on lightweight visual odometry | |
CN110211148B (en) | Underwater image pre-segmentation method based on target state estimation | |
CN111812978A (en) | Cooperative SLAM method and system for multiple unmanned aerial vehicles | |
CN108921896B (en) | Downward-looking visual compass fusing point and line features | |
CN104484647B (en) | High-resolution remote sensing image cloud height detection method | |
CN112906573A (en) | Planetary surface navigation landmark matching method based on contour point sets | |
CN115131433B (en) | Non-cooperative target pose processing method and device and electronic equipment | |
CN115760984A (en) | Non-cooperative target pose measurement method for CubeSats based on monocular vision | |
CN112734843B (en) | Monocular 6D pose estimation method based on regular dodecahedron |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||