Kim et al., 2009 - Google Patents
Targeted driving using visual tracking on Mars: From research to flight
- Document ID
- 10797423600090763601
- Author
- Kim W
- Nesnas I
- Bajracharya M
- Madison R
- Ansar A
- Steele R
- Biesiadecki J
- Ali K
- Publication year
- 2009
- Publication venue
- Journal of Field Robotics
Snippet
This paper presents the development, validation, and deployment of the visual target tracking capability onto the Mars Exploration Rover (MER) mission. Visual target tracking enables targeted driving, in which the rover approaches a designated target in a closed …
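The snippet describes targeted driving as a closed-loop behavior: the rover repeatedly re-observes a designated target and adjusts its drive toward it. The sketch below is only a toy illustration of that general closed-loop idea, not the algorithm or flight software described in the paper; all names and numbers are invented, and the simulated noisy range/bearing measurements merely stand in for the onboard visual tracking.

```python
# Purely illustrative sketch of closed-loop "targeted driving": each cycle the
# rover re-observes the designated target (simulated here with noisy range and
# bearing measurements in place of visual tracking), turns toward it, and
# drives a short step before observing again. Not taken from the MER software.
import math
import random

def targeted_drive(target_xy=(8.0, 3.0), stop_distance=0.5, step=0.4,
                   bearing_noise=0.02, range_noise=0.05, max_cycles=100):
    x, y, heading = 0.0, 0.0, 0.0            # rover pose in a flat world frame
    for cycle in range(max_cycles):
        # Stand-in for visual tracking: noisy bearing and range to the target.
        dx, dy = target_xy[0] - x, target_xy[1] - y
        rng = math.hypot(dx, dy) + random.gauss(0.0, range_noise)
        brg = math.atan2(dy, dx) + random.gauss(0.0, bearing_noise)
        if rng <= stop_distance:
            return cycle, (x, y)              # close enough: stop at the target
        heading = brg                         # turn in place toward the target
        advance = min(step, rng)              # short drive step, then re-observe
        x += advance * math.cos(heading)
        y += advance * math.sin(heading)
    return max_cycles, (x, y)

if __name__ == "__main__":
    cycles, pose = targeted_drive()
    print(f"stopped after {cycles} cycles at ({pose[0]:.2f}, {pose[1]:.2f}) m")
```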
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically, i.e. tracking systems
- G01S3/7864—T.V. type tracking systems
Similar Documents
Publication | Title
---|---
US10515458B1 (en) | Image-matching navigation method and apparatus for aerial vehicles
Forster et al. | SVO: Fast semi-direct monocular visual odometry
US8744665B2 (en) | Control method for localization and navigation of mobile robot and mobile robot using the same
Amidi et al. | A visual odometer for autonomous helicopter flight
US8134479B2 (en) | Monocular motion stereo-based free parking space detection apparatus and method
US8098893B2 (en) | Moving object image tracking apparatus and method
US5170352A (en) | Multi-purpose autonomous vehicle with path plotting
Chai et al. | Three-dimensional motion and structure estimation using inertial sensors and computer vision for augmented reality
JP4798450B2 (en) | Navigation device and control method thereof
Zhang et al. | Robust appearance based visual route following for navigation in large-scale outdoor environments
CN104204721A (en) | Single-camera distance estimation
Dall’Osto et al. | Fast and robust bio-inspired teach and repeat navigation
Caballero et al. | Unmanned aerial vehicle localization based on monocular vision and online mosaicking: a new mapping framework
Schauwecker et al. | On-board dual-stereo-vision for autonomous quadrotor navigation
Caballero et al. | Improving vision-based planar motion estimation for unmanned aerial vehicles through online mosaicing
CN109782810A (en) | Video satellite motion target tracking imaging method and its device based on image guidance
Marchand et al. | RemoveDebris vision-based navigation preliminary results
Bazin et al. | UAV attitude estimation by vanishing points in catadioptric images
Kim et al. | Targeted driving using visual tracking on Mars: From research to flight
Kim et al. | Rover-based visual target tracking validation and mission infusion
Fang et al. | A motion tracking method by combining the IMU and camera in mobile devices
Tsai et al. | Autonomous vision-based tethered-assisted rover docking
Eudes et al. | Visuo-inertial fusion for homography-based filtering and estimation
Nesnas et al. | Visual target tracking for rover-based planetary exploration
Huntsberger et al. | Closed loop control for autonomous approach and placement of science instruments by planetary rovers