CN101762277B - Six-degree of freedom position and attitude determination method based on landmark navigation
- Publication number: CN101762277B (application CN2010101035226A)
- Authority: CN (China)
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Landscapes
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
The invention relates to a six-degree-of-freedom position and attitude determination method based on landmark navigation, belonging to the field of information processing in automatic control systems. Addressing the coupling of position elements and attitude elements in the pixel observation equation, the invention exploits the invariance of angles under Euclidean transformation and uses the included angles formed between the lines of sight to the navigation landmarks as the observed quantities, so that the position and attitude states in the pixel observation equation are decoupled and solved separately. This reduces the complexity of the algorithm, yields the result quickly, and increases the solution accuracy. The applicability of the method is also extended from three navigation landmarks to multiple landmarks.
Description
Technical Field
The invention relates to a six-degree-of-freedom position and attitude determination method based on landmark navigation, and belongs to the field of information processing in automatic control systems.
Background
Autonomous positioning and navigation is one of the most basic and important functions of an autonomous mobile carrier. Optical sensors have attracted wide attention because they can perceive rich environmental information, and many existing visual positioning methods adopt a navigation mode based on landmark optical information, with some success in fields such as mobile robotics, aerospace, and vehicle navigation. A navigation method based on landmark optical information mainly relies on an optical camera to capture an image of the scene in which the mobile carrier is located and on extracting the pixel information of the relevant landmarks in the image; since the positions of the landmarks in the scene are known in advance, the spatial six-degree-of-freedom state of the mobile carrier relative to the scene, i.e. its position and attitude, can be calculated by processing the landmark pixel information together with the known position information. This calculation is a key link in autonomous landmark navigation: the six-degree-of-freedom state estimation is a strongly nonlinear and ambiguous problem, and the correctness and precision of the computed result directly affect the autonomous positioning capability of the mobile carrier and determine the success rate of its related tasks. Therefore, obtaining the autonomous state in landmark navigation is one of the key problems of current concern to scientific and technical personnel.
Among the methods developed for solving the six-degree-of-freedom state from landmark pixel information, prior art [1] (Sharp C S, Shakernia O, Sastry S S. A vision system for landing an unmanned aerial vehicle [C]// IEEE International Conference on Robotics and Automation, 2001, 2: 1720-1727) directly linearizes the pixel observation equation and solves for the six-degree-of-freedom state. However, because of the strong nonlinearity and ambiguity of the pixel observation equation, the truncation error introduced by linearization is large, so the solution accuracy of methods that directly linearize the pixel observation equation is low, and an improper choice of initial value may even cause the solution algorithm to diverge.
Prior art [2] (see Chen Ying, Digital Photogrammetry of Remote Sensing Images, Shanghai: Tongji University Press, 2003) solves the position state and the attitude state of the mobile carrier separately based on the pyramid principle. The method effectively overcomes the correlation between the position and attitude states, and the algorithm is simple and convenient to solve. However, it is only suitable for spatial state solution based on the information of 3 navigation landmarks; when more than 3 landmark pixels are captured, the method can still use only 3 landmarks for the solution, which obviously cannot make full use of the information of the other landmarks and cannot improve the navigation accuracy.
Disclosure of Invention
The invention aims to solve the problems of low accuracy and poor applicability in existing methods that solve the six-degree-of-freedom state from landmark pixel information, and provides a six-degree-of-freedom position and attitude determination method.
The design idea of the invention is as follows: addressing the coupling of the position elements and the attitude elements in the pixel observation equation, the invariance of angles under Euclidean transformation is exploited, and the included angles formed between the observation lines of sight to the navigation landmarks are used as the observed quantities, so that the position and attitude states in the pixel observation equation are decoupled and solved separately. This reduces the complexity of the algorithm, yields the result quickly, and improves the solution accuracy. The applicability of the algorithm can also be extended from 3 navigation landmarks to multiple landmarks.
A six-degree-of-freedom position and attitude determination method comprises the following specific steps:
Step 1, reading the pixel coordinates of the navigation landmarks in the image captured by the optical camera
In the scene, the moving carrier images the navigation landmarks with the optical camera it carries, and the pointing direction of each navigation landmark in the moving carrier camera coordinate system can be obtained by extracting the pixel and image line coordinates of the navigation landmarks in the image. The position vector of the ith navigation landmark is given in the scene coordinate system, and the position vector and the transformation matrix of the moving carrier camera coordinate system relative to the scene coordinate system are known, the latter denoted C_ba; then, in the moving carrier camera coordinate system, the position vector of the ith navigation landmark is given by formula (1).
Wherein the scene coordinate system is a three-dimensional coordinate system, and the transformation matrix C_ba is a three-row, three-column matrix.
The extracted pixel p_i and image line l_i coordinate values of the ith navigation landmark and the position and attitude of the moving carrier then satisfy the relation of formula (2).
In formula (2), x, y and z are the three-axis position coordinates of the moving carrier in the scene coordinate system, x_i, y_i and z_i are the three-axis position coordinates of the ith navigation landmark in the scene coordinate system, c_ab (a = 1, 2, 3; b = 1, 2, 3) are the elements of the transformation matrix C_ba, and f is the focal length of the optical camera.
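A plausible reconstruction of formulas (1) and (2) from the surrounding definitions, given as a sketch rather than the patent's exact notation: the vector symbols ρ_i^a (landmark position in the scene frame), ρ_i^b (the same landmark expressed in the camera frame) and r (carrier position) are assumed names, and formula (2) is written in the standard collinearity (perspective projection) form that is consistent with the element-wise description of C_ba and the focal length f.

```latex
% Formula (1), assumed reconstruction: landmark position in the camera frame
\vec{\rho}_i^{\,b} = C_{ba}\left(\vec{\rho}_i^{\,a} - \vec{r}\right)

% Formula (2), assumed reconstruction: pixel / image-line observation (collinearity form)
p_i = f\,\frac{c_{11}(x_i - x) + c_{12}(y_i - y) + c_{13}(z_i - z)}
              {c_{31}(x_i - x) + c_{32}(y_i - y) + c_{33}(z_i - z)},
\qquad
l_i = f\,\frac{c_{21}(x_i - x) + c_{22}(y_i - y) + c_{23}(z_i - z)}
              {c_{31}(x_i - x) + c_{32}(y_i - y) + c_{33}(z_i - z)}
```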
Step 2, constructing the matrix of observation angles formed by the optical camera to the navigation landmarks, using the pixel and image line coordinates of the navigation landmarks obtained in step 1 and the focal length f of the optical camera
Let the observation angle formed between the lines of sight from the optical camera of the moving carrier to the ith and jth navigation landmarks be A_ij; it is given by formula (3).
In formula (3), the position vectors of the ith and jth navigation landmarks relative to the moving carrier are expressed in the scene coordinate system, and r_i, r_j are the distances between the ith and jth navigation landmarks and the moving carrier.
Using the pixel coordinates p_i, p_j and image line coordinates l_i, l_j of any combination of the ith and jth landmarks among the n navigation landmarks, together with the focal length f of the optical camera, a measured observation angle can be constructed.
For the n navigation landmarks, any two landmarks are combined pairwise to construct n(n-1)/2 observation angles, and these n(n-1)/2 observation angles form the measured observation angle matrix.
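A minimal Python sketch of step 2, assuming the camera-frame line of sight to landmark i is proportional to (p_i, l_i, f) (the axis and sign convention is an assumption); it builds the n(n-1)/2 measured observation angles from the extracted pixel and image line coordinates and the focal length:

```python
import numpy as np
from itertools import combinations

def measured_observation_angles(pixels, lines, f):
    """Return the n(n-1)/2 measured observation angles A_ij in radians.

    pixels, lines : length-n sequences of pixel (p_i) and image line (l_i) coordinates
    f             : focal length of the optical camera, in the same units as p_i, l_i
    """
    # Camera-frame line-of-sight directions, one row per landmark (convention assumed).
    dirs = np.column_stack([pixels, lines, np.full(len(pixels), f)]).astype(float)
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    # Pairwise angles between lines of sight: A_ij = arccos(u_i . u_j).
    return {(i, j): float(np.arccos(np.clip(dirs[i] @ dirs[j], -1.0, 1.0)))
            for i, j in combinations(range(len(pixels)), 2)}
```

The returned dictionary, keyed by landmark pairs (i, j), is the measured observation angle matrix in flattened form.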
Step 3, using the prior estimate of the current position of the moving carrier to determine the predicted value A_ij* of the observation angle and the observation matrix H
The prior estimate of the current position of the moving carrier is given by the carrier's own position prediction; from it, the predicted value of the observation angle matrix and the observation matrix H can be determined, wherein
A_ij*, the predicted observation angle corresponding to any combination of the ith and jth landmarks, is obtained by evaluating formula (3) at the prior position estimate.
The observation matrix H is constructed by stacking the row vectors h_ij corresponding to the landmark pairs, wherein the predicted position vectors of the ith and jth landmarks relative to the moving carrier in the scene coordinate system, and the predicted distances r_i*, r_j* between the ith and jth navigation landmarks and the moving carrier, are all evaluated at the prior position estimate.
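The explicit expression for the row vector h_ij is not spelled out above; one consistent reconstruction, obtained by differentiating the angle relation of formula (3) with respect to the carrier position and evaluating it at the prior estimate (the auxiliary symbols u and v_k, as well as the arrow notation, are assumptions), is:

```latex
% Predicted observation angle for the (i, j) landmark pair, evaluated at the prior estimate:
A_{ij}^{*} = \arccos\!\left(
   \frac{(\vec{\rho}_i^{\,a}-\vec{r}^{\,*})\cdot(\vec{\rho}_j^{\,a}-\vec{r}^{\,*})}
        {r_i^{*}\, r_j^{*}}\right),
\qquad r_k^{*} = \lVert \vec{\rho}_k^{\,a}-\vec{r}^{\,*} \rVert .

% Assumed reconstruction of the linearization row vector, with
% u = \cos A_{ij}^{*} and \vec{v}_k = \vec{\rho}_k^{\,a}-\vec{r}^{\,*}:
\vec{h}_{ij} \;=\; \left.\frac{\partial A_{ij}}{\partial\vec{r}}\right|_{\vec{r}=\vec{r}^{\,*}}
 \;=\; \frac{-1}{\sqrt{1-u^{2}}}\left[
     -\frac{\vec{v}_i+\vec{v}_j}{r_i^{*} r_j^{*}}
     + u\left(\frac{\vec{v}_i}{r_i^{*\,2}}+\frac{\vec{v}_j}{r_j^{*\,2}}\right)\right]
```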
Step 4, determining the position of the moving carrier relative to the scene using the observation angles and the observation matrix obtained in steps 2 and 3
The observation angle deviation matrix is constructed by taking the difference between the measured observation angle matrix and the predicted observation angle matrix.
The position deviation is then computed iteratively from the observation angle deviations and the observation matrix, and the prior position estimate is corrected with this deviation to solve for the current position of the moving carrier.
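A Python sketch of the position solution of steps 3 and 4, under the assumption that the stacked linear relations δA_ij = h_ij · δr are solved for the position correction δr in a least-squares sense at each iteration (the text above does not spell out the exact update formula, so the pseudo-inverse update below is an assumption); the gradient rows follow the expression sketched after step 3:

```python
import numpy as np
from itertools import combinations

def solve_position(landmarks, measured_angles, r_prior, iters=10, tol=1e-9):
    """Iteratively refine the carrier position from the observation angles.

    landmarks       : (n, 3) array of landmark positions in the scene frame
    measured_angles : dict {(i, j): A_ij} of measured observation angles in radians
    r_prior         : prior estimate of the carrier position, shape (3,)
    """
    landmarks = np.asarray(landmarks, dtype=float)
    r = np.asarray(r_prior, dtype=float).copy()
    pairs = list(combinations(range(len(landmarks)), 2))
    for _ in range(iters):
        H, dA = [], []
        for i, j in pairs:
            vi, vj = landmarks[i] - r, landmarks[j] - r
            ri, rj = np.linalg.norm(vi), np.linalg.norm(vj)
            u = np.clip(vi @ vj / (ri * rj), -1.0, 1.0)
            A_pred = np.arccos(u)                         # predicted angle A_ij*
            # Row vector h_ij = dA_ij / dr evaluated at the current estimate.
            du_dr = -(vi + vj) / (ri * rj) + u * (vi / ri**2 + vj / rj**2)
            H.append(-du_dr / np.sqrt(max(1.0 - u * u, 1e-12)))
            dA.append(measured_angles[(i, j)] - A_pred)   # deviation delta A_ij
        # Least-squares position correction (assumed form of the iterative update).
        dr, *_ = np.linalg.lstsq(np.asarray(H), np.asarray(dA), rcond=None)
        r += dr
        if np.linalg.norm(dr) < tol:
            break
    return r
```

The measured_angles argument has the same (i, j)-keyed form as the output of measured_observation_angles sketched in step 2.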
Step 5, determining the attitude of the moving carrier relative to the scene using the result for the position of the moving carrier relative to the scene obtained in step 4
Using the pixel p_i and image line l_i coordinates of the n navigation landmarks and the focal length f of the optical camera, the unit pointing vector of each navigation landmark in the moving carrier coordinate system is constructed, together with the matrix composed of these vectors.
Based on the moving carrier position calculated in step 4, the unit pointing vector of each navigation landmark in the scene coordinate system is determined, together with the matrix composed of these vectors.
Using N_a and N_b, the optimal solution C_ba of the attitude transfer matrix of the moving carrier relative to the scene coordinate system is solved.
The attitude of the moving carrier relative to the scene coordinate system is then obtained from this attitude transfer matrix.
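A Python sketch of step 5. The closed-form optimal solution used in the patent is not reproduced above, so this sketch solves the same multi-vector attitude determination problem with the standard SVD (orthogonal Procrustes / Wahba-type) solution; the (p_i, l_i, f) pointing convention, and the association of N_b with the camera frame and N_a with the scene frame, are assumptions:

```python
import numpy as np

def solve_attitude(landmarks, r, pixels, lines, f):
    """Estimate the attitude transfer matrix C_ba (scene frame -> camera frame).

    landmarks : (n, 3) landmark positions in the scene frame
    r         : carrier position from step 4, shape (3,)
    pixels, lines, f : pixel / image-line coordinates and focal length
    """
    # Unit pointing vectors in the camera (body) frame, one column per landmark: N_b.
    Nb = np.column_stack([pixels, lines, np.full(len(pixels), f)]).astype(float).T
    Nb /= np.linalg.norm(Nb, axis=0, keepdims=True)
    # Unit pointing vectors in the scene frame, using the position solved in step 4: N_a.
    Na = (np.asarray(landmarks, dtype=float) - r).T
    Na /= np.linalg.norm(Na, axis=0, keepdims=True)
    # Proper orthogonal C_ba minimizing ||N_b - C_ba N_a|| (SVD solution, assumed method).
    U, _, Vt = np.linalg.svd(Nb @ Na.T)
    D = np.diag([1.0, 1.0, np.linalg.det(U) * np.linalg.det(Vt)])
    return U @ D @ Vt
```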
Advantageous effects
The method provided by the invention exploits the invariance of angles under Euclidean transformation and uses the included angles formed between the observation lines of sight to the navigation landmarks as the observed quantities, thereby decoupling the position and attitude states in the pixel observation equation. This weakens the nonlinearity and the complexity of the observation system, so that the decoupled linear equations behave well in the solution: the iteration is efficient and the computation is simple. Weakening the nonlinearity reduces the truncation error and thus improves the solution accuracy; improving the iteration efficiency shortens the solution time and thus improves the real-time performance of the system.
The method is applicable when 3 or more navigation landmarks are available in an autonomous positioning and navigation system, can solve the six-degree-of-freedom position and attitude of the moving carrier in real time, and has great application value in engineering fields such as space exploration, satellite positioning, and unmanned aerial vehicles.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
FIG. 2 is a schematic diagram of the navigation landmark imaging relationship in the present invention.
Fig. 3 is a schematic view of an angle measurement position surface according to an embodiment of the present invention.
Detailed Description
For a better understanding of the objects and advantages of the present invention, reference should be made to the following detailed description taken in conjunction with the accompanying drawings.
The specific steps of this example are as follows:
Step 1, reading the pixel coordinates of the navigation landmarks in the image captured by the optical camera
In the scene, the moving carrier images the navigation landmarks with the optical camera it carries, and the pointing direction of each navigation landmark in the moving carrier camera coordinate system can be obtained by extracting the pixel and image line coordinates of the navigation landmarks in the image; the imaging relationship of the navigation landmarks is shown in Fig. 2.
The position vector of the ith navigation landmark is given in the scene coordinate system, and the prior estimate of the position vector and the transformation matrix of the moving carrier camera coordinate system relative to the scene coordinate system are known, the latter denoted C_ba; then, in the moving carrier camera coordinate system, the position vector of the ith navigation landmark is given by formula (1).
Wherein the scene coordinate system is a three-dimensional coordinate system, and the transformation matrix C_ba is a three-row, three-column matrix.
The extracted pixel p_i and image line l_i coordinate values of the ith navigation landmark and the position and attitude of the moving carrier then satisfy the relation of formula (2).
In formula (2), x, y and z are the three-axis position coordinates of the moving carrier in the scene coordinate system, x_i, y_i and z_i are the three-axis position coordinates of the ith navigation landmark in the scene coordinate system, c_ab (a = 1, 2, 3; b = 1, 2, 3) are the elements of the transformation matrix C_ba, and f is the focal length of the optical camera. If the number of tracked and observed navigation landmarks is n, the corresponding observation equations are obtained by writing formula (2) for each landmark i = 1, 2, ..., n.
Step 2, constructing the matrix of observation angles measured by the optical camera to the navigation landmarks, using the pixel and image line coordinates of the navigation landmarks obtained in step 1 and the focal length f of the optical camera
As can be seen from expressions (1) and (2) in step 1, the position variables x, y, z and the attitude variables c_ab in the observation equation are severely coupled, and the observation equation has strongly nonlinear characteristics; directly using formula (2) to estimate the six-degree-of-freedom state of the moving carrier makes the algorithm complex and cumbersome, and may even cause divergence. The complexity of solving formula (2) is mainly caused by the coupling of position and attitude. Noting the invariance of angles under Euclidean transformation, namely that angles in geometric space are unaffected by an orthogonal attitude transformation, this property is exploited by introducing the included angle between the observation lines of sight from the optical camera to the navigation landmarks as the observed quantity, so that the position and attitude of the moving carrier can be solved in a decoupled manner.
Let the observation angle formed between the lines of sight from the optical camera of the moving carrier to the ith and jth navigation landmarks be A_ij; it is given by formula (3).
In formula (3), the position vectors of the ith and jth navigation landmarks relative to the moving carrier are expressed in the scene coordinate system, and r_i, r_j are the distances between the ith and jth navigation landmarks and the moving carrier.
The observation angle can also be expressed in terms of the pixel and image line coordinates in the optical image and the focal length f.
The geometric relations show that the position surface corresponding to a measured value of the included angle is a torus-like surface of revolution obtained by rotating an arc that passes through the two navigation landmarks about the line connecting them; that is, every point on this surface observes the two landmarks at the same angle A_ij, as shown in Fig. 3. The center O of the arc lies on the perpendicular bisector of the chord P_iP_j, the arc radius R is determined by the distance between the two landmarks and A_ij, and the distance L from the center of the circle to P_iP_j is
L = R cos A_ij
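The radius relation itself is not shown above; from the inscribed-angle theorem it can be reconstructed as follows, with d_ij denoting the distance between the two landmarks P_i and P_j (notation assumed):

```latex
% Assumed reconstruction of the arc radius from the inscribed-angle theorem:
R = \frac{d_{ij}}{2\sin A_{ij}},
\qquad d_{ij} = \lVert P_i - P_j \rVert
```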
The same geometric description can also be expressed by a vector formula, using the inner product of the position vectors from the moving carrier to the two landmarks.
As can be seen, the resulting expression involves only the position of the moving carrier and the measured included angle A_ij, independently of the attitude state of the carrier, so the position state of the carrier can be solved on its own from this equation.
For the n navigation landmarks, any two landmarks are combined pairwise to construct n(n-1)/2 observation angles, and these n(n-1)/2 observation angles form the measured observation angle matrix.
Step 3, using the prior estimate of the current position of the moving carrier to determine the predicted value A_ij* of the measured angle and the observation matrix H
Considering that equation (3) is nonlinear and difficult to solve directly, a small-deviation linearization is performed about the prior estimate of the position of the moving carrier, and a linearized measurement equation is derived; this gives an approximately linear relationship δA_ij = h_ij · δr between the position deviation δr and the observation angle deviation δA_ij, wherein h_ij is the corresponding linearization row vector, the predicted position vectors of the ith and jth landmarks relative to the moving carrier in the scene coordinate system are evaluated at the prior position estimate, and
r_i*, r_j* are the predicted distances between the ith and jth navigation landmarks and the moving carrier. In actual operation, the prior estimate of the current position of the moving carrier is given by position prediction; from it, the predicted value A_ij* of the measured angle and the linear vector h_ij can be determined, so that each measured deviation δA_ij can be used, through the relation δA_ij = h_ij · δr, to refine the current position of the moving carrier. If n navigation landmarks are extracted from the image, there are n(n-1)/2 measurement angle combinations in total, i.e. n(n-1)/2 equations of the form δA_ij = h_ij · δr, which together form a system of equations from which the position of the moving carrier can be solved.
Given the prior estimate of the current position of the moving carrier from the carrier's own position prediction, the predicted value of the observation angle matrix and the observation matrix H can be determined, wherein A_ij*, the predicted observation angle corresponding to any combination of the ith and jth landmarks, is obtained by evaluating formula (3) at the prior position estimate, and the observation matrix H is constructed by stacking the corresponding row vectors h_ij.
Step 4, determining the position of the moving carrier relative to the scene using the observation angles and the observation matrix obtained in steps 2 and 3
The observation angle deviation matrix is constructed by taking the difference between the measured observation angle matrix and the predicted observation angle matrix.
The position deviation is then computed iteratively from the observation angle deviations and the observation matrix, and the prior position estimate is corrected with this deviation to solve for the current position of the moving carrier.
Step 5, determining the attitude of the moving carrier relative to the scene using the result for the position of the moving carrier relative to the scene obtained in step 4
Since the position of each navigation landmark in the moving carrier camera coordinate system can be expressed as formula (4), unit normalization of formula (4) gives the unit pointing vector of each navigation landmark in the moving carrier camera coordinate system, which can be expressed in terms of the pixel and image line coordinates of the landmark and the focal length f.
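One consistent reconstruction of this unit pointing vector, obtained by unit-normalizing the camera-frame direction implied by the pixel and image line coordinates and the focal length (the symbol b_i and the sign/axis convention are assumptions), is:

```latex
\vec{b}_i = \frac{1}{\sqrt{p_i^{2}+l_i^{2}+f^{2}}}
            \begin{pmatrix} p_i \\ l_i \\ f \end{pmatrix},
\qquad
N_b = \left[\; \vec{b}_1 \;\; \vec{b}_2 \;\; \cdots \;\; \vec{b}_n \;\right]
```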
The optimal solution of the attitude transfer matrix of the moving carrier relative to the scene coordinate system is then determined by using the position of the moving carrier obtained in step 4 and the multi-vector attitude determination principle.
Thus, with the position solution of step 4 and the attitude matrix of step 5, the six-degree-of-freedom spatial state of the moving carrier relative to the scene, namely its position and attitude, is determined.
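As a usage illustration tying the sketches above together, a hypothetical end-to-end check on synthetic data (it reuses the functions defined in the earlier sketches, and the projection convention and all names are the same assumptions):

```python
import numpy as np

# Synthetic scene: n landmarks, a true carrier pose, and the assumed pinhole convention.
rng = np.random.default_rng(0)
landmarks = rng.uniform(-50.0, 50.0, size=(6, 3)) + np.array([0.0, 0.0, 200.0])
r_true = np.array([5.0, -3.0, 10.0])
yaw = np.deg2rad(10.0)
C_true = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                   [np.sin(yaw),  np.cos(yaw), 0.0],
                   [0.0,          0.0,         1.0]])
f = 1000.0

# Project the landmarks to pixel / image-line coordinates (collinearity form, assumed).
cam = (C_true @ (landmarks - r_true).T).T
pixels = f * cam[:, 0] / cam[:, 2]
lines = f * cam[:, 1] / cam[:, 2]

# Recover position and attitude with the sketches from steps 2-5.
angles = measured_observation_angles(pixels, lines, f)
r_est = solve_position(landmarks, angles, r_prior=r_true + 5.0)
C_est = solve_attitude(landmarks, r_est, pixels, lines, f)
print(np.round(r_est - r_true, 6))
print(np.round(C_est - C_true, 6))
```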
Claims (1)
1. A six-degree-of-freedom position and attitude determination method based on landmark navigation, characterized by comprising the following steps:
step 1, reading the pixel coordinates of the navigation landmarks in the image captured by the optical camera
In the scene, the moving carrier images the navigation landmarks with the optical camera it carries, and the pointing direction of each navigation landmark in the moving carrier camera coordinate system can be obtained by extracting the pixel and image line coordinates of the navigation landmarks in the image; the position vector of the ith navigation landmark, i = 1, 2, ..., n, is given in the scene coordinate system, and the position vector and the transformation matrix of the moving carrier camera coordinate system relative to the scene coordinate system are known, the latter denoted C_ba; then, in the moving carrier camera coordinate system, the position vector of the ith navigation landmark is obtained from these quantities;
wherein the scene coordinate system is a three-dimensional coordinate system, and the transformation matrix C_ba is a three-row, three-column matrix;
then the extracted pixel p_i and image line l_i coordinate values of the ith navigation landmark and the position and attitude of the moving carrier satisfy the pixel observation equation;
wherein x, y and z are the three-axis position coordinates of the moving carrier in the scene coordinate system, x_i, y_i and z_i are the three-axis position coordinates of the ith navigation landmark in the scene coordinate system, c_ab, a = 1, 2, 3, b = 1, 2, 3, are the elements of the transformation matrix C_ba, and f is the focal length of the optical camera;
step 2, constructing the measured observation angle matrix using the pixel and image line coordinates of the navigation landmarks obtained in step 1 and the focal length f of the optical camera
Let the observation angle formed between the lines of sight from the optical camera of the moving carrier to the ith and jth navigation landmarks be A_ij:
For the n navigation landmarks, any two landmarks are combined pairwise to construct n(n-1)/2 observation angles, and these n(n-1)/2 observation angles form the measured observation angle matrix;
step 3, using the prior estimate of the current position of the moving carrier to determine the predicted value of the observation angle and the observation matrix H
The predicted observation angle corresponding to any combination of the ith and jth landmarks is obtained by evaluating the observation angle expression of step 2 at the prior position estimate;
The observation matrix is constructed by the following formula
wherein the predicted position vectors of the ith and jth landmarks relative to the moving carrier in the scene coordinate system are evaluated at the prior position estimate;
step 4, determining the position of the moving carrier relative to the scene by using the observation angles and the observation matrix obtained in steps 2 and 3
The observation angle deviation matrix is constructed by taking the difference between the measured observation angle matrix and the predicted observation angle matrix;
The position deviation is then computed iteratively from the observation angle deviations and the observation matrix, and the prior position estimate is corrected with this deviation to solve for the current position of the moving carrier;
step 5, determining the attitude of the moving carrier relative to the scene using the position result of the moving carrier relative to the scene obtained in step 4
constructing the unit pointing vector of each navigation landmark in the moving carrier coordinate system, and the matrix composed of these vectors;
based on the moving carrier position calculated in step 4, determining the unit pointing vector of each navigation landmark in the scene coordinate system, and the matrix composed of these vectors;
using N_a and N_b, solving for the optimal solution C_ba of the attitude transfer matrix of the moving carrier relative to the scene coordinate system.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010101035226A CN101762277B (en) | 2010-02-01 | 2010-02-01 | Six-degree of freedom position and attitude determination method based on landmark navigation |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2010101035226A CN101762277B (en) | 2010-02-01 | 2010-02-01 | Six-degree of freedom position and attitude determination method based on landmark navigation |
Publications (2)
Publication Number | Publication Date |
---|---|
CN101762277A CN101762277A (en) | 2010-06-30 |
CN101762277B true CN101762277B (en) | 2012-02-15 |
Family
ID=42493579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN2010101035226A Expired - Fee Related CN101762277B (en) | 2010-02-01 | 2010-02-01 | Six-degree of freedom position and attitude determination method based on landmark navigation |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN101762277B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9183638B2 (en) | 2011-08-09 | 2015-11-10 | The Boeing Company | Image based position determination |
US9214021B2 (en) | 2012-10-09 | 2015-12-15 | The Boeing Company | Distributed position identification |
CN102980555B (en) * | 2012-12-06 | 2015-04-08 | 紫光股份有限公司 | Method and device for detecting direction of optical imaging type wheeled mobile robot |
US10375289B2 (en) * | 2017-03-31 | 2019-08-06 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for providing autonomous photography and videography |
CN110065062A (en) * | 2018-01-24 | 2019-07-30 | 南京机器人研究院有限公司 | A kind of motion control method of articulated robot |
CN108896053B (en) * | 2018-07-12 | 2021-11-23 | 北京理工大学 | Planet landing optical navigation optimal road sign selection method |
CN113791377B (en) * | 2021-09-09 | 2024-04-12 | 中国科学院微小卫星创新研究院 | Positioning method based on angle measurement |
CN116091546B (en) * | 2023-01-12 | 2024-04-19 | 北京航天飞行控制中心 | Observation construction method under push-broom mode of optical camera |
- 2010-02-01: CN CN2010101035226A patent/CN101762277B/en not_active Expired - Fee Related
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6489922B1 (en) * | 2000-04-22 | 2002-12-03 | American Gnc Corporation | Passive/ranging/tracking processing method for collision avoidance guidance and control |
KR20060011517A (en) * | 2004-07-30 | 2006-02-03 | 에스케이 텔레콤주식회사 | Robust gps data format in gps error, and gps poison determination method by using the gps data format |
CN1851406A (en) * | 2006-05-26 | 2006-10-25 | 南京航空航天大学 | Gasture estimation and interfusion method based on strapdown inertial nevigation system |
CN101419070A (en) * | 2008-12-03 | 2009-04-29 | 南京航空航天大学 | Relative position and pose determining method based on laser ranging formatter |
Non-Patent Citations (1)
Title |
---|
Tang Guojian, Zhao Hanyuan. Six-degree-of-freedom re-entry trajectory simulation for the return capsule of a manned spacecraft. Flight Dynamics, 2000, Vol. 18, No. 3, 80-84. *
Also Published As
Publication number | Publication date |
---|---|
CN101762277A (en) | 2010-06-30 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | ||
PB01 | Publication | ||
C10 | Entry into substantive examination | ||
SE01 | Entry into force of request for substantive examination | ||
C14 | Grant of patent or utility model | ||
GR01 | Patent grant | ||
C17 | Cessation of patent right | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20120215 Termination date: 20130201 |