Meyer et al., 1992 - Google Patents
Estimation of time-to-collision maps from first order motion models and normal flows
Meyer et al., 1992
- Document ID
- 14598841176850126738
- Author
- Meyer F
- Bouthemy P
- Publication year
- 1992
- Publication venue
- 1992 11th IAPR International Conference on Pattern Recognition
Snippet
In this paper, we propose a centralized system which detects potholes on roads and assists the driver to avoid them. The system consists of 3 components namely sensors for detection, an inter-vehicle communication protocol to warn other vehicles about the new potholes …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00624—Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
      - G06T7/00—Image analysis
        - G06T7/20—Analysis of motion
Similar Documents
Publication | Title
---|---
Meyer et al. | Estimation of time-to-collision maps from first order motion models and normal flows
Geyer et al. | A2D2: Audi autonomous driving dataset
Smith et al. | Visual tracking for intelligent vehicle-highway systems
Liang et al. | Video stabilization for a camcorder mounted on a moving vehicle
Dooley et al. | A blind-zone detection method using a rear-mounted fisheye camera with combination of vehicle detection methods
Franke et al. | 6D-Vision: Fusion of stereo and motion for robust environment perception
CN114474061B (en) | Cloud service-based multi-sensor fusion positioning navigation system and method for robot
Shim et al. | An autonomous driving system for unknown environments using a unified map
Pfeiffer et al. | Modeling dynamic 3D environments by means of the stixel world
Broggi et al. | TerraMax vision at the Urban Challenge 2007
CN109300143B (en) | Method, device and equipment for determining motion vector field, storage medium and vehicle
CN111771207A (en) | Enhanced vehicle tracking
CN113012197B (en) | Binocular vision odometer positioning method suitable for dynamic traffic scene
Parra et al. | Robust visual odometry for vehicle localization in urban environments
Baig et al. | A robust motion detection technique for dynamic environment monitoring: A framework for grid-based monitoring of the dynamic environment
Laflamme et al. | Driving datasets literature review
Xiang et al. | Hybrid bird's-eye edge based semantic visual SLAM for automated valet parking
Tistarelli et al. | Direct estimation of time-to-impact from optical flow
Rabie et al. | Mobile active-vision traffic surveillance system for urban networks
Mankowitz et al. | CFORB: Circular FREAK-ORB Visual Odometry
Zhou et al. | Research on vehicle adaptive real-time positioning based on binocular vision
Rabie et al. | Active-vision-based traffic surveillance and control
Agand et al. | DMODE: Differential Monocular Object Distance Estimation Module without Class Specific Information
Gandhi et al. | Dynamic panoramic surround map: motivation and omni video based approach
Nie et al. | Model-based optical flow for large displacements and homogeneous regions