Evaluation criteria for inside-out indoor positioning systems based on machine learning

Löffler et al., 2018

Document ID: 13111952383855670914
Authors: Löffler C, Riechel S, Fischer J, Mutschler C
Publication year: 2018
Publication venue: 2018 International Conference on Indoor Positioning and Indoor Navigation (IPIN)

External Links

Full text (PDF): www.researchgate.net

Snippet

Real-time tracking makes it possible to trace goods and enables the optimization of logistics processes in many application areas. Camera-based inside-out tracking that uses an infrastructure of fixed and known markers is costly, as the markers need to be installed and maintained in the …
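The snippet's baseline, camera-based inside-out tracking against fixed and known markers, amounts to recovering the camera pose from 2D image detections of marker corners whose world coordinates are known. The following is a minimal sketch of that pose-recovery step using OpenCV's PnP solver; the marker size, intrinsics, and pose values are hypothetical placeholders and the 2D detections are synthesised, so this illustrates the general technique rather than the paper's implementation.

    # Minimal sketch: inside-out pose estimation from one fixed, known marker.
    # All numeric values are hypothetical placeholders, not taken from the paper.
    import numpy as np
    import cv2

    # Known 3D corners of a 20 cm square marker, expressed in the world frame (metres).
    marker_world = np.array([[0.0, 0.0, 0.0],
                             [0.2, 0.0, 0.0],
                             [0.2, 0.2, 0.0],
                             [0.0, 0.2, 0.0]], dtype=np.float64)

    # Assumed pinhole intrinsics (fx = fy = 800 px, principal point at 320, 240), no distortion.
    K = np.array([[800.0, 0.0, 320.0],
                  [0.0, 800.0, 240.0],
                  [0.0, 0.0, 1.0]])
    dist = np.zeros(5)

    # Synthesise the 2D detections by projecting the corners from a known ground-truth pose;
    # a real system would obtain them from a marker detector running on the camera image.
    rvec_true = np.array([0.1, -0.2, 0.05])
    tvec_true = np.array([0.0, 0.0, 1.5])
    img_pts, _ = cv2.projectPoints(marker_world, rvec_true, tvec_true, K, dist)

    # Inside-out step: recover the world-to-camera transform from the 2D-3D correspondences.
    ok, rvec, tvec = cv2.solvePnP(marker_world, img_pts, K, dist)

    # Convert to the camera position in the world frame (the quantity a tracking system reports).
    R, _ = cv2.Rodrigues(rvec)
    cam_pos_world = (-R.T @ tvec).ravel()
    print("camera position in world frame:", cam_pos_world)

A full system would run a marker detector (e.g. ArUco) to obtain the 2D corners and track the pose over time; the cost of installing and maintaining that marker infrastructure is what the machine-learning-based inside-out systems evaluated in the paper aim to avoid.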

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/36 Image preprocessing, i.e. processing the image information without deciding about the identity of the image
    • G06K9/46 Extraction of features or characteristics of the image
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S3/00 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
    • G01S3/78 Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
    • G01S3/782 Systems for determining direction or deviation from predetermined direction
    • G01S3/785 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
    • G01S3/786 Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically, i.e. tracking systems
    • G01S3/7864 T.V. type tracking systems
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06K RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K9/00 Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
    • G06K9/00624 Recognising scenes, i.e. recognition of a whole field of perception; recognising scene-specific objects
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in preceding groups
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRICAL DIGITAL DATA PROCESSING

Similar Documents

Acharya et al. BIM-PoseNet: Indoor camera localisation using a 3D indoor model and deep learning from synthetic images
Cheng et al. Improving monocular visual SLAM in dynamic environments: an optical-flow-based approach
Löffler et al. Evaluation criteria for inside-out indoor positioning systems based on machine learning
Chen et al. Separated sonar localization system for indoor robot navigation
Skrzypczyński Mobile robot localization: Where we are and what are the challenges?
Teng et al. CloudNavi: Toward ubiquitous indoor navigation service with 3D point clouds
Bai et al. A survey of image-based indoor localization using deep learning
Saleem et al. Neural network-based recent research developments in SLAM for autonomous ground vehicles: A review
Safin et al. Evaluation of visual SLAM methods in USAR applications using ROS/Gazebo simulation
Ishihara et al. Deep radio-visual localization
Jiao et al. Smart fusion of multi-sensor ubiquitous signals of mobile device for localization in GNSS-denied scenarios
Tseng et al. A new architecture for simultaneous localization and mapping: an application of a planetary rover
Jiang et al. Automatic elevator button localization using a combined detecting and tracking framework for multi-story navigation
Shu et al. 3D point cloud-based indoor mobile robot in 6-DoF pose localization using a Wi-Fi-aided localization system
Yang et al. Enhanced visual SLAM for construction robots by efficient integration of dynamic object segmentation and scene semantics
Shewail et al. Survey of indoor tracking systems using augmented reality
Dai et al. RGB‐D SLAM with moving object tracking in dynamic environments
Acharya et al. Modelling uncertainty of single image indoor localisation using a 3D model and deep learning
Bejuri et al. Ubiquitous WLAN/camera positioning using inverse intensity chromaticity space-based Feature detection and matching: a preliminary result
Karpov et al. Multi-robot exploration and mapping based on the subdefinite models
Virgolino Soares et al. Visual localization and mapping in dynamic and changing environments
Jung et al. U-VIO: tightly coupled UWB visual inertial odometry for robust localization
Alliez et al. Indoor localization and mapping: Towards tracking resilience through a multi-slam approach
Zou et al. Static map reconstruction and dynamic object tracking for a camera and laser scanner system
Kuang et al. An improved Robot’s localization and mapping method based on ORB-SLAM