CN108444478B - Moving target visual pose estimation method for underwater vehicle - Google Patents
- Publication number: CN108444478B (application CN201810206444.9A)
- Authority: CN (China)
- Legal status: Active
Images
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
Abstract
The invention provides a moving target visual pose estimation method for an underwater vehicle. According to a mathematical model of the underwater vehicle, the linear velocity of the vehicle in its body frame and its angular velocity and attitude angles are measured by onboard sensors (such as a Doppler velocimeter and an azimuth/attitude measurement system) to obtain the state information. Several known feature points on the moving target are taken, and, on the basis of the vehicle kinematic model, their known global-frame positions are mapped into the image frame through coordinate transformations to obtain the measurement information. An unscented Kalman filter then estimates the relative position difference and the motion attitude between the underwater vehicle and the center of the target object. Compared with the geometric method, this approach removes the limitation that the arrangement of the feature points must satisfy specific conditions, and it accurately estimates the relative position difference and motion attitude between the underwater vehicle and the target center.
Description
Technical Field
The invention relates to the technical field of underwater vehicle vision, and in particular to a moving target visual pose estimation method for an underwater vehicle: when the vehicle tracks a feature target, the pose parameters of the moving target below the vehicle are estimated using visual measurement and a pose estimation method based on nonlinear Kalman filtering.
Background
The ocean is an enormous store of wealth for humankind, holding vast resources, yet its exploration remains a great challenge, especially in the deep and open sea. Advances in underwater sensor technology have in turn greatly driven the development of underwater vehicle technology.
For a long time, research has been devoted to underwater acoustic positioning technology, and good results have been obtained in long-range target positioning and navigation for underwater vehicles. However, because the data update rate of an underwater acoustic positioning system is low, its stability and precision at short range still need improvement. To meet the needs of underwater vehicle operations, close-range target estimation must be achieved, and vision sensors are well suited to close-range, high-precision target detection and tracking.
Current underwater vehicle visual positioning mainly adopts geometric methods. A geometric method obtains the image-frame position coordinates of n feature points on a visual target through a camera mounted on the underwater vehicle and solves the position and attitude of the vehicle relative to the target with a PnP (Perspective-n-Point) algorithm. However, the solution process suffers from multiple solutions and poor robustness; to obtain a unique solution, the arrangement of the feature points must satisfy specific conditions, which limits the applicable range of the algorithm. Moreover, information such as the target's velocity cannot be obtained, so in practice the method cannot be widely used for underwater vehicle control.
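For context, the forward pinhole projection that a PnP solver must invert can be sketched as follows; the camera pose, focal length, and point layout here are illustrative assumptions, not values from the invention.

```python
import numpy as np

# Forward pinhole model that a PnP algorithm inverts: given n 3-D feature
# points and their pixel projections, PnP recovers the camera pose.
pts_world = np.array([[0.25, 0.25, 0.0], [0.25, -0.25, 0.0],
                      [-0.25, -0.25, 0.0], [-0.25, 0.25, 0.0]])
t = np.array([0.0, 0.0, 2.0])               # camera 2 m above the target plane
f = 40.0                                    # focal length in pixels (assumed)
pts_cam = pts_world + t                     # identity rotation for simplicity
uv = f * pts_cam[:, :2] / pts_cam[:, 2:3]   # perspective division
# With coplanar points such as these, PnP can admit multiple pose solutions,
# which is the ambiguity noted in the paragraph above.
```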
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides a moving target visual pose estimation method for an underwater vehicle, in which a six-degree-of-freedom underwater vehicle tracks a moving feature target from above; the method accurately estimates the relative position difference and motion attitude between the underwater vehicle and the center of the target object.
The main principle and idea of the invention are as follows: according to the mathematical model of the underwater vehicle, the linear velocity of the vehicle in its body frame and its angular velocity and attitude angles are measured by sensors (such as a Doppler velocimeter and an azimuth/attitude measurement system) to obtain the state information; several known feature points on the moving target are taken, and, on the basis of the vehicle kinematic model, their known global-frame positions are mapped into the image frame through coordinate transformations to obtain the measurement information. An unscented Kalman filter then estimates the relative position difference and motion attitude between the underwater vehicle and the center of the target object.
The technical scheme of the invention is as follows:
The moving target visual pose estimation method for the underwater vehicle comprises the following steps:
Step 1: acquire the initial state information of the underwater vehicle using the azimuth/attitude measurement system and velocimeter installed on the six-degree-of-freedom vehicle, estimate the initial state information of the target, and set the initial value x(0) of the system state vector at k = 0 according to the system state vector:
x(k) = [Δx, Δy, Δz, φ_B, θ_B, ψ_B, u_B, v_B, w_B, p_B, q_B, r_B, φ_A, θ_A, ψ_A, u_A, v_A, w_A, p_A, q_A, r_A]_k^T
where Δx = x_B - x_A, Δy = y_B - y_A, Δz = z_B - z_A; (x_B, y_B, z_B) are the position coordinates of the center of the underwater vehicle in the global coordinate system and (x_A, y_A, z_A) those of the target center, so Δx, Δy and Δz are the position differences of the vehicle relative to the tracked target in the global frame; φ_B, θ_B, ψ_B are the attitude angles of the vehicle in the global frame; u_B, v_B, w_B are the linear velocities of the vehicle in its body frame; p_B, q_B, r_B are the angular velocities of the three attitude angles of the vehicle in the global frame; φ_A, θ_A, ψ_A are the attitude angles of the target in the global frame; u_A, v_A, w_A are the linear velocities of the target along the three axes of the target body frame; and p_A, q_A, r_A are the angular velocities of the three attitude angles of the target in the global frame. Also compute the covariance matrix of the vehicle state at time k = 0:
P(0) = E[(x(0) - x̂(0))(x(0) - x̂(0))^T], where x̂(0) is the expected value of x(0), E(·) denotes expectation, and (·)^T denotes transposition;
Step 2: from the system state estimate x̂(k-1) and the state covariance matrix P(k-1) at time k-1, combined with the vehicle body-frame linear velocity and the global-frame angular velocity and attitude angles measured at time k, estimate the system state x̂(k) and the state covariance matrix P(k) of the vehicle at time k:
step 2.1: predicting the state one step ahead, and the steps are as follows:
Step 2.1.1: using the estimated state x̂(k-1|k-1) and the estimated covariance P(k-1|k-1) at time k-1, generate the Sigma points
χ_i(k-1|k-1) = x̂(k-1|k-1) ± (√(n·P(k-1|k-1)))_i, i = 1, …, 2n,
where χ_i denotes the i-th Sigma point, (√(n·P(k-1|k-1)))_i denotes the n-dimensional vector obtained by taking the i-th row of the matrix square root of n·P(k-1|k-1), and n is the dimension of the system state vector;
where x̂_i(k|k-1) = f[χ_i(k-1|k-1)] is the state prediction computed from the i-th Sigma point and f[·] is the system state equation; over one step of length T the system state equation is
where T is the control period, C_B^G denotes the coordinate transformation matrix from the underwater vehicle body frame to the global coordinate system, and C_A^G denotes the coordinate transformation matrix from the target body frame to the global coordinate system;
Step 2.1.4: taking process noise into account, obtain the covariance matrix of the one-step prediction:
wherein Q (k-1) is the covariance matrix of the system noise;
Step 2.2: update the one-step prediction using the measurement, as follows:
Wherein the measurement equation is:
h[x(k)] = [μ_j(k), υ_j(k), φ_B(k), θ_B(k), ψ_B(k), u_B, v_B, w_B, p_B, q_B, r_B]^T
where (μ_j(k), υ_j(k)) are the coordinates of the j-th feature point on the target in the image coordinate system, acquired by the down-looking camera of the underwater vehicle; the image-frame coordinates of the feature points on the target are obtained from the image data;
Step 2.2.5: taking measurement noise into account, the covariance of the measurement prediction is:
where R(k) is the covariance matrix of the measurement noise;
Step 2.3: compute the gain matrix K(k) = P_XZ·P_Z^(-1); the updated state estimate and covariance are then
x̂(k|k) = x̂(k|k-1) + K(k)[z(k) - ẑ(k|k-1)]
P(k|k) = P(k|k-1) - K(k)·P_Z·K^T(k)
Wherein z (k) is a measurement at time k;
Step 2.4: repeat Steps 2.1 to 2.3 until navigation ends.
Advantageous effects
Compared with the geometric method, the invention removes the limitation that the arrangement of the feature points must satisfy specific conditions, and it accurately estimates the relative position difference and motion attitude between the underwater vehicle and the center of the target object.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1: the invention discloses a visual system model schematic diagram of an underwater vehicle;
FIG. 2: in the first example, a schematic diagram of the motion tracks of an underwater vehicle and characteristic points is shown;
FIG. 3: in the first example, the schematic diagram of the relative position of the underwater vehicle and the center of the characteristic target is shown;
FIG. 4: the attitude angle estimation value of the characteristic target in the first example is shown schematically.
Detailed Description
The following detailed description of embodiments of the invention is intended to be illustrative, and not to be construed as limiting the invention.
This embodiment is based on the nonlinear Unscented Kalman Filter (UKF). The hardware basis of the method is an industrial camera installed at the middle of the underside of the six-degree-of-freedom underwater vehicle, looking straight down; the camera is sealed against water in a pressure housing whose lower end is made of acrylic (organic glass), and the camera frame rate is at least 20 frames per second. An industrial computer is installed in the main pressure housing of the vehicle; its processor must be at least equivalent to an i5, and its hard disk capacity must be at least 32 GB. The vehicle carries an azimuth/attitude measurement system, which provides real-time angular velocity and attitude angle information in the global coordinate frame. A Doppler velocimeter is installed at the head of the underside of the vehicle, providing real-time velocity information in the vehicle body frame.
The principle of the method is as follows: according to the mathematical model of the underwater vehicle, a Doppler velocimeter and an azimuth/attitude measurement system measure the linear velocity of the vehicle in its body frame and its angular velocity and attitude angles, giving the state information; four known feature points on the moving target are extracted with an OpenCV image extraction algorithm on the industrial computer, and, on the basis of the vehicle kinematic model, their known global-frame positions are mapped into the image frame through coordinate transformations, giving the measurement information. An unscented Kalman filter then estimates the relative position difference and motion attitude between the underwater vehicle and the center of the target object.
The method comprises the following specific steps:
Step 1: acquire the initial state information of the underwater vehicle using the azimuth/attitude measurement system and velocimeter installed on the six-degree-of-freedom vehicle, estimate the initial state information of the target, and set the initial value x(0) of the system state vector at k = 0 according to the system state vector:
x(k) = [Δx, Δy, Δz, φ_B, θ_B, ψ_B, u_B, v_B, w_B, p_B, q_B, r_B, φ_A, θ_A, ψ_A, u_A, v_A, w_A, p_A, q_A, r_A]_k^T
where Δx = x_B - x_A, Δy = y_B - y_A, Δz = z_B - z_A; (x_B, y_B, z_B) are the position coordinates of the center of the underwater vehicle in the global coordinate system and (x_A, y_A, z_A) those of the target center, so Δx, Δy and Δz are the position differences of the vehicle relative to the tracked target in the global frame; φ_B, θ_B, ψ_B are the attitude angles of the vehicle in the global frame; u_B, v_B, w_B are the linear velocities of the vehicle in its body frame; p_B, q_B, r_B are the angular velocities of the three attitude angles of the vehicle in the global frame; φ_A, θ_A, ψ_A are the attitude angles of the target in the global frame; u_A, v_A, w_A are the linear velocities of the target along the three axes of the target body frame; and p_A, q_A, r_A are the angular velocities of the three attitude angles of the target in the global frame. Also compute the covariance matrix of the vehicle state at time k = 0:
P(0) = E[(x(0) - x̂(0))(x(0) - x̂(0))^T], where x̂(0) is the expected value of x(0), E(·) denotes expectation, and (·)^T denotes transposition.
In this embodiment, the initial value x(0) of the system state vector is obtained as described above, and the initial state covariance is taken as
P(0) = diag{10, 10, 10, π/180, π/180, π/180, 1, 1, 1, π/180, π/180, π/180, π/180, π/180, π/180, 1, 1, 1, π/180, π/180, π/180}.
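As a sketch, the initialization of Step 1 can be written as below; the zero initial state is a placeholder assumption, while P(0) follows the diagonal values listed above.

```python
import numpy as np

deg = np.pi / 180.0
n = 21                     # dimension of the system state vector
x0 = np.zeros(n)           # placeholder initial state estimate (assumption)
# Diagonal initial covariance P(0) as listed above: 10 on the relative
# position terms, pi/180 on angle/rate terms, 1 on linear velocities.
P0 = np.diag([10.0, 10.0, 10.0, deg, deg, deg, 1.0, 1.0, 1.0,
              deg, deg, deg, deg, deg, deg, 1.0, 1.0, 1.0,
              deg, deg, deg])
```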
Step 2: obtaining the estimated value of the system state vector according to the k-1 momentAnd a state estimation covariance matrix P (k-1) is combined with the linear velocity of the underwater vehicle in the vehicle system of the underwater vehicle and the angular velocity and angle of the underwater vehicle in a global coordinate system measured at the moment k, and the estimated value of the system state vector of the underwater vehicle at the moment k is estimatedAnd a state estimation covariance matrix p (k):
step 2.1: predicting the state one step ahead, and the steps are as follows:
Step 2.1.1: using the estimated state x̂(k-1|k-1) and the estimated covariance P(k-1|k-1) at time k-1, generate the Sigma points
χ_i(k-1|k-1) = x̂(k-1|k-1) ± (√(n·P(k-1|k-1)))_i, i = 1, …, 2n,
where χ_i denotes the i-th Sigma point, (√(n·P(k-1|k-1)))_i denotes the n-dimensional vector obtained by taking the i-th row of the matrix square root of n·P(k-1|k-1), and n is the dimension of the system state vector;
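The Sigma-point generation of Step 2.1.1 can be sketched as follows, using the columns of a Cholesky factor as the required matrix square root; function and variable names are ours, not the patent's.

```python
import numpy as np

def sigma_points(x_hat, P):
    """Generate the 2n symmetric Sigma points x_hat +/- (sqrt(n*P))_i."""
    n = x_hat.size
    S = np.linalg.cholesky(n * P)      # matrix square root: S @ S.T == n*P
    pts = np.empty((2 * n, n))
    for i in range(n):
        pts[i] = x_hat + S[:, i]       # + i-th column of the square root
        pts[n + i] = x_hat - S[:, i]   # - i-th column
    return pts
```

With equal weights 1/(2n), the sample mean of these points recovers x̂ and their sample covariance recovers P, which is what the prediction step relies on.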
where x̂_i(k|k-1) = f[χ_i(k-1|k-1)] is the state prediction computed from the i-th Sigma point and f[·] is the system state equation; over one step of length T the system state equation is
where T is the control period, taken here as T = 0.5 s; C_B^G denotes the coordinate transformation matrix from the underwater vehicle body frame to the global coordinate system, and C_A^G denotes the coordinate transformation matrix from the target body frame to the global coordinate system;
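A minimal sketch of such a body-to-global transformation matrix, assuming the common Z-Y-X (yaw-pitch-roll) Euler convention; the convention actually used in the invention may differ.

```python
import numpy as np

def body_to_global(phi, theta, psi):
    """Rotation taking body-frame vectors to the global frame.
    Z-Y-X Euler angles: roll phi, pitch theta, yaw psi (assumed convention)."""
    cf, sf = np.cos(phi), np.sin(phi)
    ct, st = np.cos(theta), np.sin(theta)
    cp, sp = np.cos(psi), np.sin(psi)
    return np.array([
        [cp * ct, cp * st * sf - sp * cf, cp * st * cf + sp * sf],
        [sp * ct, sp * st * sf + cp * cf, sp * st * cf - cp * sf],
        [-st,     ct * sf,                ct * cf],
    ])
```

The global-to-body matrix used later in the measurement model is simply the transpose of this orthonormal matrix.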
Step 2.1.4: taking process noise into account, obtain the covariance matrix of the one-step prediction:
where Q(k-1) is the covariance matrix of the system noise, obtained from estimation and historical experience, taken in this embodiment as:
Q = diag{0.1, 0.1, 0.1, 0.1π/180, 0.1π/180, 0.1π/180, 1, 1, 1, 0.1π/180, 0.1π/180, 0.1π/180, 0.1π/180, 0.1π/180, 0.1π/180, 1, 1, 1, 0.1π/180, 0.1π/180, 0.1π/180}²;
Step 2.2: update the one-step prediction using the measurement, as follows:
Wherein the measurement equation is:
h[x(k)] = [μ_j(k), υ_j(k), φ_B(k), θ_B(k), ψ_B(k), u_B, v_B, w_B, p_B, q_B, r_B]^T
where (μ_j(k), υ_j(k)) are the coordinates of the j-th feature point on the target in the image coordinate system, acquired by the down-looking camera of the underwater vehicle; the industrial computer obtains the image-frame coordinates of the feature points from the image data using an OpenCV image extraction algorithm; in this embodiment there are four feature points, j = 1, 2, 3, 4;
The feature points are then converted into camera-frame coordinates:
where C_G^B is the transformation matrix from the global coordinate system to the vehicle body frame, (x, y, z) are the position coordinates of the vehicle in the global frame, and the j-th feature point has a known global-frame position and corresponding camera-frame coordinates at time t, j = 1, 2, 3, 4, as shown in FIG. 1. Combining the pinhole camera model with the vehicle kinematic model, the image-frame coordinates of the four feature points whose global-frame positions are known are
where R_ij is the entry in row i, column j of the rotation matrix, (u_j, ν_j) are the two-dimensional image coordinates of the j-th point, and k_x, k_y are parameters obtained from camera calibration. The body frame of the underwater vehicle and the frame of the down-looking camera are taken to coincide.
In this embodiment, the initial global-frame coordinates of the four known feature points on the target are taken as O1(0.25, 0.25, 0), O2(0.25, -0.25, 0), O3(-0.25, -0.25, 0), O4(-0.25, 0.25, 0), and the camera focal lengths are f_x = f_y = 40 pixels.
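Under the stated assumption that the camera frame coincides with the vehicle body frame, the image coordinates of one feature point can be sketched as below; the function name, pose values, and axis convention (z-axis pointing from camera toward the target) are ours.

```python
import numpy as np

def project_feature(R_gb, p_vehicle, p_feature, fx=40.0, fy=40.0):
    """Pinhole projection of a global-frame feature point into the image.
    R_gb: global-to-body (= camera) rotation matrix; fx, fy in pixels."""
    pc = R_gb @ (np.asarray(p_feature) - np.asarray(p_vehicle))
    u = fx * pc[0] / pc[2]            # perspective division
    v = fy * pc[1] / pc[2]
    return u, v
```

For example, with identity attitude and the vehicle 2 m above the target plane, feature point O1(0.25, 0.25, 0) maps to camera coordinates (0.25, 0.25, 2) and hence to image coordinates (5, 5) pixels with f_x = f_y = 40.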
Step 2.2.5: taking measurement noise into account, the covariance of the measurement prediction is:
where R(k) is the covariance matrix of the measurement noise, obtained from recorded data and the characteristics of the instruments; in this embodiment R(k) = diag{4, 4, 4, 4, 4, 4, 4, π/180, 0.01, 0.01, 0.01, π/180}².
Step 2.3: compute the gain matrix K(k) = P_XZ·P_Z^(-1); the updated state estimate and covariance are then
x̂(k|k) = x̂(k|k-1) + K(k)[z(k) - ẑ(k|k-1)]
P(k|k) = P(k|k-1) - K(k)·P_Z·K^T(k)
Wherein z (k) is a measurement at time k;
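Step 2.3 in code form, assuming the cross-covariance P_xz, the measurement-prediction covariance P_z, and the predicted measurement have already been formed from the Sigma points; all names are ours.

```python
import numpy as np

def ukf_update(x_pred, P_pred, z, z_pred, P_z, P_xz):
    """K = P_xz P_z^-1; x = x_pred + K (z - z_pred); P = P_pred - K P_z K^T."""
    K = P_xz @ np.linalg.inv(P_z)          # gain matrix
    x_upd = x_pred + K @ (z - z_pred)      # state update with the innovation
    P_upd = P_pred - K @ P_z @ K.T         # covariance update
    return x_upd, P_upd
```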
Step 2.4: repeat Steps 2.1 to 2.3 until navigation ends.
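The repetition in Step 2.4 amounts to a filter loop of the following shape; the predict and update callables stand for Steps 2.1 and 2.2-2.3 above, with their internal details elided (a structural sketch only, all names ours).

```python
import numpy as np

def ukf_loop(x, P, measurements, predict, update):
    """Repeat Steps 2.1-2.3 for each incoming measurement z(k)."""
    history = []
    for z in measurements:
        x, P = predict(x, P)        # Step 2.1: one-step prediction
        x, P = update(x, P, z)      # Steps 2.2-2.3: measurement update
        history.append(x.copy())
    return x, P, history
```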
This completes the estimation of the position of the underwater vehicle relative to the feature target and of the target attitude. The Matlab simulation result for the estimated relative position between the vehicle and the target center is shown in FIG. 3, and that for the estimated target attitude angles in FIG. 4. In this embodiment, Matlab was used to simulate the motion states of the feature target and the estimation of the vehicle-to-target distance difference and the target attitude angles; the results show that the method accurately estimates the moving-target parameters.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present invention, and that variations, modifications, substitutions and alterations can be made in the above embodiments by those of ordinary skill in the art without departing from the principle and spirit of the present invention.
Claims (1)
1. A moving target visual pose estimation method for an underwater vehicle, characterized by comprising the following steps:
Step 1: acquire the initial state information of the underwater vehicle using the azimuth/attitude measurement system and velocimeter installed on the six-degree-of-freedom vehicle, estimate the initial state information of the target, and set the initial value x(0) of the system state vector at k = 0 according to the system state vector:
x(k) = [Δx, Δy, Δz, φ_B, θ_B, ψ_B, u_B, v_B, w_B, p_B, q_B, r_B, φ_A, θ_A, ψ_A, u_A, v_A, w_A, p_A, q_A, r_A]_k^T
where Δx = x_B - x_A, Δy = y_B - y_A, Δz = z_B - z_A; (x_B, y_B, z_B) are the position coordinates of the center of the underwater vehicle in the global coordinate system and (x_A, y_A, z_A) those of the target center, so Δx, Δy and Δz are the position differences of the vehicle relative to the tracked target in the global frame; φ_B, θ_B, ψ_B are the attitude angles of the vehicle in the global frame; u_B, v_B, w_B are the linear velocities of the vehicle in its body frame; p_B, q_B, r_B are the angular velocities of the three attitude angles of the vehicle in the global frame; φ_A, θ_A, ψ_A are the attitude angles of the target in the global frame; u_A, v_A, w_A are the linear velocities of the target along the three axes of the target body frame; and p_A, q_A, r_A are the angular velocities of the three attitude angles of the target in the global frame. Also compute the covariance matrix of the vehicle state at time k = 0:
P(0) = E[(x(0) - x̂(0))(x(0) - x̂(0))^T], where x̂(0) is the expected value of x(0), E(·) denotes expectation, and (·)^T denotes transposition;
Step 2: from the system state estimate x̂(k-1) and the state covariance matrix P(k-1) at time k-1, combined with the vehicle body-frame linear velocity and the global-frame angular velocity and attitude angles measured at time k, estimate the system state x̂(k) and the state covariance matrix P(k) of the vehicle at time k:
step 2.1: predicting the state one step ahead, and the steps are as follows:
Step 2.1.1: using the estimated state x̂(k-1|k-1) and the estimated covariance P(k-1|k-1) at time k-1, generate the Sigma points
χ_i(k-1|k-1) = x̂(k-1|k-1) ± (√(n·P(k-1|k-1)))_i, i = 1, …, 2n,
where χ_i denotes the i-th Sigma point, (√(n·P(k-1|k-1)))_i denotes the n-dimensional vector obtained by taking the i-th row of the matrix square root of n·P(k-1|k-1), and n is the dimension of the system state vector;
where x̂_i(k|k-1) = f[χ_i(k-1|k-1)] is the state prediction computed from the i-th Sigma point and f[·] is the system state equation; over one step of length T the system state equation is
where T is the control period, C_B^G denotes the coordinate transformation matrix from the underwater vehicle body frame to the global coordinate system, and C_A^G denotes the coordinate transformation matrix from the target body frame to the global coordinate system;
Step 2.1.4: taking process noise into account, obtain the covariance matrix of the one-step prediction:
wherein Q (k-1) is the covariance matrix of the system noise;
Step 2.2: update the one-step prediction using the measurement, as follows:
Wherein the measurement equation is:
h[x(k)] = [μ_j(k), υ_j(k), φ_B(k), θ_B(k), ψ_B(k), u_B, v_B, w_B, p_B, q_B, r_B]^T
where (μ_j(k), υ_j(k)) are the coordinates of the j-th feature point on the target in the image coordinate system, acquired by the down-looking camera of the underwater vehicle; the image-frame coordinates of the feature points on the target are obtained from the image data;
Step 2.2.5: taking measurement noise into account, the covariance of the measurement prediction is:
where R(k) is the covariance matrix of the measurement noise;
Step 2.3: compute the gain matrix K(k) = P_XZ·P_Z^(-1); the updated state estimate and covariance are then
x̂(k|k) = x̂(k|k-1) + K(k)[z(k) - ẑ(k|k-1)]
P(k|k) = P(k|k-1) - K(k)·P_Z·K^T(k)
Wherein z (k) is a measurement at time k;
Step 2.4: repeat Steps 2.1 to 2.3 until navigation ends.
Priority Applications (1)
- CN201810206444.9A (CN108444478B), priority date 2018-03-13, filed 2018-03-13: Moving target visual pose estimation method for underwater vehicle
Publications (2)
- CN108444478A, published 2018-08-24
- CN108444478B, granted 2021-08-10
Family
ID=63194113
Family Applications (1)
- CN201810206444.9A, filed 2018-03-13: CN108444478B (Active)

Country Status (1)
- CN: CN108444478B
Families Citing this family (14)
CN109697734B (en) * | 2018-12-25 | 2021-03-09 | 浙江商汤科技开发有限公司 | Pose estimation method and device, electronic equipment and storage medium |
CN110209180B (en) * | 2019-05-20 | 2022-03-01 | 武汉理工大学 | Unmanned underwater vehicle target tracking method based on HuberM-Cubasic Kalman filtering |
CN110160524B (en) * | 2019-05-23 | 2020-12-01 | 深圳市道通智能航空技术有限公司 | Sensor data acquisition method and device of inertial navigation system |
CN111649743B (en) * | 2020-05-08 | 2022-03-22 | 武汉高德红外股份有限公司 | Target angular velocity resolving method and device based on photoelectric turret |
CN114200966B (en) * | 2020-09-17 | 2023-10-13 | 中国科学院沈阳自动化研究所 | Unmanned aircraft target orientation equidistant tracking method based on perception information |
CN112184765B (en) * | 2020-09-18 | 2022-08-23 | 西北工业大学 | Autonomous tracking method for underwater vehicle |
CN112417948B (en) * | 2020-09-21 | 2024-01-12 | 西北工业大学 | Method for accurately guiding lead-in ring of underwater vehicle based on monocular vision |
CN112924708B (en) * | 2021-01-29 | 2022-06-03 | 中国航天空气动力技术研究院 | Speed estimation method suitable for underwater near-bottom operation vehicle |
CN112836889B (en) * | 2021-02-19 | 2024-07-19 | 鹏城实验室 | Path optimization method, underwater vehicle, and computer-readable storage medium |
CN113074725B (en) * | 2021-05-11 | 2022-07-22 | 哈尔滨工程大学 | Small underwater multi-robot cooperative positioning method and system based on multi-source information fusion |
CN113945892B (en) * | 2021-10-11 | 2022-05-03 | 哈尔滨工程大学 | Method for measuring three-dimensional motion trail of body target |
CN114323552B (en) * | 2021-11-18 | 2022-10-21 | 厦门大学 | Method for judging stability of water entering and exiting from cross-medium navigation body |
CN115060238B (en) * | 2022-05-18 | 2023-11-10 | 深圳荔石创新科技有限公司 | Method and device for measuring relative pose of underwater component |
CN115479507B (en) * | 2022-09-14 | 2023-08-15 | 中国科学院声学研究所 | Guidance control method and system for underwater vehicle |
Citations (6)
CN101598556A (en) * | 2009-07-15 | 2009-12-09 | 北京航空航天大学 | Unmanned plane vision/inertia integrated navigation method under a kind of circumstances not known |
CN103149939A (en) * | 2013-02-26 | 2013-06-12 | 北京航空航天大学 | Dynamic target tracking and positioning method of unmanned plane based on vision |
CN103645487A (en) * | 2013-12-06 | 2014-03-19 | 江苏科技大学 | Underwater multi-target tracking method |
CN105676181A (en) * | 2016-01-15 | 2016-06-15 | 浙江大学 | Underwater moving target extended Kalman filtering tracking method based on distributed sensor energy ratios |
CN105890589A (en) * | 2016-04-05 | 2016-08-24 | 西北工业大学 | Underwater robot monocular vision positioning method |
CN106780560A (en) * | 2016-12-29 | 2017-05-31 | 北京理工大学 | A kind of feature based merges the bionic machine fish visual tracking method of particle filter |
Family Cites Families (2)
CA2814833C (en) * | 2010-10-25 | 2018-11-06 | Sekhar C. Tangirala | Estimating position and orientation of an underwater vehicle based on correlated sensor data |
CN106950974B (en) * | 2017-04-19 | 2020-07-28 | 哈尔滨工程大学 | Three-dimensional path understanding and tracking control method for under-actuated autonomous underwater vehicle |
- 2018-03-13 CN CN201810206444.9A patent/CN108444478B/en active Active
Non-Patent Citations (2)
Title |
---|
Research on AUV Simultaneous Localization and Tracking (in Chinese); Lu Jian et al.; Computer Engineering and Applications; Dec. 31, 2011; Vol. 47, No. 16; pp. 4-8 * |
The AUV Location and Location Error Analysis Based on Binocular Stereo Vision; Jun-Chai Gao et al.; Sensors & Transducers; Sep. 30, 2013; Vol. 156, No. 9; pp. 291-297 * |
Also Published As
Publication number | Publication date |
---|---|
CN108444478A (en) | 2018-08-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN108444478B (en) | Moving target visual pose estimation method for underwater vehicle | |
CN112347840B (en) | Vision sensor and laser radar integrated unmanned aerial vehicle positioning and mapping device and method | |
CN112639502A (en) | Robot pose estimation | |
CN109901205B (en) | Underwater robot multi-sensor fusion and motion trajectory prediction method | |
CN111596333B (en) | Underwater positioning navigation method and system | |
CN110726406A (en) | Improved nonlinear optimization monocular inertial navigation SLAM method | |
CN108120438B (en) | Indoor target rapid tracking method based on IMU and RFID information fusion | |
CN106679662B (en) | Single-beacon integrated navigation method for an underwater robot based on TMA technology | |
CN113739795B (en) | Underwater synchronous positioning and mapping method based on polarized light/inertia/vision integrated navigation | |
CN108387236B (en) | Polarized light SLAM method based on extended Kalman filtering | |
CN114001733B (en) | Map-based consistent efficient visual inertial positioning algorithm | |
Yap et al. | A particle filter for monocular vision-aided odometry | |
Zhou et al. | A LiDAR odometry for outdoor mobile robots using NDT-based scan matching in GPS-denied environments |
Demim et al. | Simultaneous localisation and mapping for autonomous underwater vehicle using a combined smooth variable structure filter and extended Kalman filter |
CN112710304B (en) | Underwater autonomous vehicle navigation method based on adaptive filtering | |
Zhang et al. | An integrated navigation method for small-sized AUV in shallow-sea applications | |
CN113108774A (en) | Underwater robot and navigation positioning method thereof | |
CN112581610B (en) | Robust optimization method and system for building map from multi-beam sonar data | |
CN112611376B (en) | RGI-Lidar/SINS tightly-coupled AUV underwater navigation positioning method and system | |
CN112802195B (en) | Underwater robot continuous occupying and mapping method based on sonar | |
CN110849349B (en) | Fusion positioning method based on magnetic sensor and wheel type odometer | |
Ma et al. | A Vision-Integrated Navigation Method in AUV Terminal Mobile Docking Based on Factor Graph Optimization | |
CN113029173A (en) | Vehicle navigation method and device | |
Wang et al. | Initial positioning of terrain relative navigation under pseudo-peaks interference | |
CN117451054A (en) | Unmanned aerial vehicle high-precision indoor positioning method based on monocular camera, IMU and UWB multi-sensor fusion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||