
Smart Sensor Systems for Positioning and Navigation

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Navigation and Positioning".

Deadline for manuscript submissions: 30 June 2025 | Viewed by 3476

Special Issue Editors


Guest Editor
Department of Computer Science and Centre for Reliable Machine Learning, Royal Holloway, University of London, London, UK
Interests: machine learning; data analysis; networked systems; indoor positioning

Guest Editor
Department of Computer Science, Royal Holloway University of London, Surrey TW20 0EX, UK
Interests: indoor positioning; contact tracing; railway navigation

Special Issue Information

Dear Colleagues,

The last ten years have seen enormous technical progress in the field of indoor positioning and indoor navigation. The potential applications of indoor localization are all-encompassing, from home to wide public areas, from IoT and personal devices to surveillance and crowd behavior applications, and from casual use to mission-critical systems.

This Special Issue encourages authors, from academia and industry, to submit new research results about innovations for indoor positioning and navigation, especially the application of smart sensors. The Special Issue topics include but are not limited to the following:

  • Location-based services and applications;
  • Benchmarking, assessment, evaluation and standards;
  • User requirements;
  • UI, indoor maps, and 3D building models;
  • Human motion monitoring and modeling;
  • Robotics and UAVs;
  • Indoor navigation and tracking methods;
  • Self-contained sensors;
  • Wearable and multisensor systems.

Prof. Dr. Zhiyuan Luo
Dr. Khuong An Nguyen
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • indoor positioning
  • indoor mapping
  • indoor navigation
  • smart sensors

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found on the MDPI website.

Published Papers (4 papers)


Research


20 pages, 4186 KiB  
Article
Deep Learning-Emerged Grid Cells-Based Bio-Inspired Navigation in Robotics
by Arturs Simkuns, Rodions Saltanovs, Maksims Ivanovs and Roberts Kadikis
Sensors 2025, 25(5), 1576; https://doi.org/10.3390/s25051576 - 4 Mar 2025
Viewed by 232
Abstract
Grid cells in the brain’s entorhinal cortex are essential for spatial navigation and have inspired advancements in robotic navigation systems. This paper first provides an overview of recent research on grid cell-based navigation in robotics, focusing on deep learning models and algorithms capable of handling uncertainty and dynamic environments. We then present experimental results where a grid cell network was trained using trajectories from a mobile unmanned ground vehicle (UGV) robot. After training, the network’s units exhibited spatially periodic and hexagonal activation patterns characteristic of biological grid cells, as well as responses resembling border cells and head-direction cells. These findings demonstrate that grid cell networks can effectively learn spatial representations from robot trajectories, providing a foundation for developing advanced navigation algorithms for mobile robots. We conclude by discussing current challenges and future research directions in this field.
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
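The pipeline the authors describe (velocity inputs fed through an LSTM recurrent layer, with linear decoders for place and head-direction cells) can be sketched as a single forward pass. The following is a minimal NumPy illustration only; the layer sizes, random weights, and input encoding are assumptions for demonstration, not the paper's trained configuration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; W, U, b hold the four gates stacked along the first axis."""
    H = h.size
    z = W @ x + U @ h + b
    i = sigmoid(z[0*H:1*H])   # input gate
    f = sigmoid(z[1*H:2*H])   # forget gate
    o = sigmoid(z[2*H:3*H])   # output gate
    g = np.tanh(z[3*H:4*H])   # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def softmax(v):
    e = np.exp(v - v.max())
    return e / e.sum()

# Hypothetical sizes: 3 motion inputs (speed, sin/cos of angular velocity),
# 128 hidden units, 256 place cells, 12 head-direction cells.
IN, H, N_PLACE, N_HD = 3, 128, 256, 12
W = rng.normal(0.0, 0.1, (4 * H, IN))
U = rng.normal(0.0, 0.1, (4 * H, H))
b = np.zeros(4 * H)
W_place = rng.normal(0.0, 0.1, (N_PLACE, H))
W_hd = rng.normal(0.0, 0.1, (N_HD, H))

h, c = np.zeros(H), np.zeros(H)
for _ in range(100):              # unroll over a toy 100-step trajectory
    x = rng.normal(size=IN)       # stands in for one velocity sample
    h, c = lstm_step(x, h, c, W, U, b)

place = softmax(W_place @ h)      # place-cell decoder output
hd = softmax(W_hd @ h)            # head-direction decoder output
```

In the paper's setting, the grid-like units emerge in a linear layer trained with supervised targets from the place and head-direction decoders; this sketch only shows the shape of the forward computation.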
Figures

Figure 1: An overview of the high-level system architecture for grid cell-based robotic navigation, detailing the flow from input data generation through trajectory simulation in Gazebo, data preprocessing and storage, to model training with Long Short-Term Memory (LSTM) and linear layers.
Figure 2: The relationships between the brain regions and cell types involved in spatial navigation.
Figure 3: The flow of data from trajectory inputs of a UGV robot in a simulation environment through the Supervised Learning Grid Cell Module (SLGCM), using an RNN with an LSTM layer to predict and simulate grid cell activity that supports vector-based navigation.
Figure 4: Architecture of a grid cell network for robotic navigation, detailing the flow from the input layer (taking velocity and angular components) through the LSTM recurrent layer, a linear layer for feature transformation, and separate linear decoders for place and head-direction cells, which output activations for spatial and directional mapping.
Figure 5: TensorFlow 2.9.1 implementation architecture of the supervised learning grid cell module.
Figure 6: Husky UGV robot in the Gazebo environment.
Figure 7: The flow of dataset generation and environment interaction for the Husky UGV in a Gazebo simulation, showing how commands are processed through the EIM module, robot movements are controlled, sensor data are collected and transformed, and the data are finally stored in TFRecord format for training.
Figure 8: The Husky robot's traveled trajectories during dataset generation, with each colored line representing a different path within a 6.4-meter-square area (±3.2 m on both the X and Y axes).
Figure 9: Dataset trajectory generation, from the Clearpath Husky in Gazebo to a dataset in TFRecord format.
Figure 10: Spatial activity plots from the linear layer after 124 training epochs, showing grid cell-like activations. These activations exhibit periodic and varied firing patterns, reflecting learned spatial encoding across different units.
Figure 11: Spatial autocorrelograms of ratemaps after 124 training epochs, displaying distinct circular patterns that reflect spatial regularity.
22 pages, 1654 KiB  
Article
A New Scene Sensing Model Based on Multi-Source Data from Smartphones
by Zhenke Ding, Zhongliang Deng, Enwen Hu, Bingxun Liu, Zhichao Zhang and Mingyang Ma
Sensors 2024, 24(20), 6669; https://doi.org/10.3390/s24206669 - 16 Oct 2024
Viewed by 788
Abstract
Smartphones with integrated sensors play an important role in people’s lives, and in advanced multi-sensor fusion navigation systems, the use of individual sensor information is crucial. Because environments differ, the weights assigned to the sensors will differ, which also affects the method and results of multi-source fusion positioning. Based on the multi-source data from smartphone sensors, this study explores five types of information—Global Navigation Satellite System (GNSS), Inertial Measurement Units (IMUs), cellular networks, optical sensors, and Wi-Fi sensors—characterizing the temporal, spatial, and mathematical statistical features of the data, and it constructs a multi-scale, multi-window, and context-connected scene sensing model to accurately detect the environmental scene in indoor, semi-indoor, outdoor, and semi-outdoor spaces, thus providing a good basis for multi-sensor positioning in a multi-sensor navigation system. The model consists of four main parts: multi-sensor-based data mining, a multi-scale convolutional neural network (CNN), a bidirectional long short-term memory (BiLSTM) network combined with contextual information, and a meta-heuristic optimization algorithm.
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
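A core ingredient of the model is the multi-scale convolutional front end, which extracts features from the same sensor time series at more than one window size so that both sharp transitions and slow trends survive. The sketch below illustrates the idea on a one-dimensional trace in plain NumPy; the two kernel widths and the moving-average kernels are illustrative assumptions, not the paper's learned filters.

```python
import numpy as np

def multi_scale_features(signal, kernel_sizes=(3, 9)):
    """Smooth the same sensor trace with kernels of two widths and stack
    the results, mimicking a two-scale convolutional feature extractor."""
    feats = []
    for k in kernel_sizes:
        kernel = np.ones(k) / k           # simple moving-average kernel
        feats.append(np.convolve(signal, kernel, mode="same"))
    return np.stack(feats)                # shape: (n_scales, len(signal))

# Example: a synthetic carrier-to-noise-ratio trace with a sharp
# outdoor-to-indoor transition; the small kernel preserves the edge,
# while the large kernel smooths it and captures the broader context.
trace = np.concatenate([np.full(50, 40.0), np.full(50, 20.0)])
features = multi_scale_features(trace)
```

In the paper, learned convolutional filters at multiple scales play this role, and the stacked features then feed the BiLSTM stage.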
Figures

Figure 1: Four scene classifications: (a) outdoor, (b) semi-outdoor, (c) semi-indoor, and (d) indoor.
Figure 2: Satellite zenith view: (a) west indoor neighboring window, (b) south indoor neighboring window, (c) indoor, and (d) open outdoor neighboring window.
Figure 3: DOP change graph: (a) outdoor, (b) indoor.
Figure 4: Visible satellite map: (a) variation in the number of visible satellites; (b) variation in the rate of change of visible satellites in different windows.
Figure 5: Satellite signal quality map: (a) CNR variation, (b) DCNR variation.
Figure 6: State of motion versus acceleration.
Figure 7: Wi-Fi channel spectrum scan: (a) indoor, (b) outdoor.
Figure 8: Visible AP distribution of Wi-Fi: (a) number distribution, (b) signal strength distribution.
Figure 9: Variation of light sensors and cellular network sensors: (a) variation of indoor and outdoor light intensity over 24 h, (b) variation of the number of base stations receiving signals.
Figure 10: An algorithmic model for the classification of complex indoor and outdoor scenes based on spatio-temporal features.
Figure 11: Pearson correlation feature map.
Figure 12: Schematic of a two-scale convolutional neural network.
Figure 13: BiLSTM network structure diagram.
Figure 14: Structure of the ablation experiment.
Figure 15: Confusion matrix: (a) before WOA optimization, (b) after WOA optimization.
Figure 16: Comparison of the accuracy of different models.
Figure 17: Comparison of accuracy in different scenarios.
13 pages, 4239 KiB  
Communication
Deep Learning-Based Transmitter Localization in Sparse Wireless Sensor Networks
by Runjie Liu, Qionggui Zhang, Yuankang Zhang, Rui Zhang and Tao Meng
Sensors 2024, 24(16), 5335; https://doi.org/10.3390/s24165335 - 18 Aug 2024
Viewed by 1271
Abstract
In the field of wireless communication, transmitter localization technology is crucial for achieving accurate source tracking. However, the extant methodologies for localization face numerous challenges in wireless sensor networks (WSNs), particularly due to the constraints posed by the sparse distribution of sensors across large areas. We present DSLoc, a deep learning-based approach for transmitter localization in sparse WSNs. Our method is based on an improved high-resolution network model in neural networks. To address localization in sparse wireless sensor networks, we design efficient feature enhancement modules, and propose to locate transmitter locations in the heatmap using an image centroid-based method. Experiments conducted on WSNs with a 0.01% deployment density demonstrate that, compared to existing deep learning models, our method significantly reduces the transmitter miss rate and improves the localization accuracy by more than double. The results indicate that the proposed method offers more accurate and robust performance in sparse WSN environments.
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
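The centroid-based readout mentioned in the abstract replaces the usual argmax over the predicted heatmap with an intensity-weighted mean, which yields sub-pixel position estimates and degrades more gracefully when the peak is flat or noisy. A minimal NumPy sketch, with a synthetic Gaussian heatmap standing in for the network output:

```python
import numpy as np

def centroid_localize(heatmap):
    """Estimate the transmitter position as the intensity-weighted centroid
    of the predicted heatmap, giving sub-pixel (x, y) coordinates."""
    hm = np.clip(heatmap, 0.0, None)          # ignore negative activations
    total = hm.sum()
    if total == 0.0:
        return None                           # empty heatmap: no detection
    ys, xs = np.indices(hm.shape)
    return (xs * hm).sum() / total, (ys * hm).sum() / total

def argmax_localize(heatmap):
    """Baseline readout: position of the single largest pixel."""
    y, x = np.unravel_index(np.argmax(heatmap), heatmap.shape)
    return float(x), float(y)

# Synthetic Gaussian blob centered at (x=20, y=10) on a 32x32 grid,
# standing in for the network's predicted heatmap.
ys, xs = np.indices((32, 32))
heatmap = np.exp(-((xs - 20.0) ** 2 + (ys - 10.0) ** 2) / (2.0 * 2.0 ** 2))
```

Unlike argmax, the centroid averages over the whole blob, so small per-pixel noise shifts the estimate only slightly instead of jumping it to a spurious peak.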
Figures

Figure 1: Transmitter localization flowchart (number of sensors and Gaussian size used for demonstration purposes only).
Figure 2: Pixel attention module for bias correction.
Figure 3: Gaussian convolution process (number of sensors used for demonstration purposes only).
Figure 4: HRNet architecture (squares of different colors represent channel maps of different sizes; the system consists of four stages, each performing a feature extraction transformation).
Figure 5: Multi-scale output fusion module (the components are defined in Figure 4; feature maps of different sizes are fused by first upsampling them separately and then concatenating and fusing the features).
Figure 6: Centroid vs. argmax approach.
Figure 7: Performance comparison of different algorithms.
Figure 8: Visualization of localization cases.
Figure 9: Performance comparison in the ablation experiment.
Figure 10: Performance comparison of centroid and argmax.

Review


21 pages, 1744 KiB  
Review
A Review of Miniature Radio Transmitters for Wildlife Tracking
by Sivan Toledo
Sensors 2025, 25(2), 517; https://doi.org/10.3390/s25020517 - 17 Jan 2025
Viewed by 756
Abstract
This article surveys the literature on miniature radio transmitters designed to track free-ranging wild animals using emitter-localization techniques. The article covers power sources used in such transmitters, including miniature batteries and energy harvesting; techniques for generating the transmitted radio-frequency carrier; techniques for creating short radio pulses and more general on–off schedules; modulation in modern wildlife-tracking transmitters; construction, manufacturing, and tuning techniques; and recent trends in this area. The article also describes the recreation of the first successful wildlife-tracking transmitter, a nontrivial invention that had a profound impact on wildlife ecology, and explores its behavior.
(This article belongs to the Special Issue Smart Sensor Systems for Positioning and Navigation)
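The review's themes of power sources and on-off schedules come down to duty-cycle arithmetic: the average current is the pulse current weighted by the fraction of time the transmitter is on, plus the sleep current for the remainder. A back-of-envelope sketch; all numeric values below are illustrative, not taken from the review.

```python
def tag_lifetime_days(capacity_mah, pulse_ma, pulse_ms, period_ms, sleep_ua):
    """Rough battery-life estimate for a pulsed tracking tag.

    capacity_mah: battery capacity in mAh
    pulse_ma:     current drawn while transmitting a ping, in mA
    pulse_ms:     ping duration, in ms
    period_ms:    time between pings, in ms
    sleep_ua:     current drawn between pings, in microamps
    """
    duty = pulse_ms / period_ms
    avg_ma = duty * pulse_ma + (1.0 - duty) * sleep_ua / 1000.0
    return capacity_mah / avg_ma / 24.0

# Illustrative numbers: a 100 mAh cell, 10 mA pings of 20 ms at 1 Hz,
# and 5 uA sleep current -> roughly three weeks of operation.
lifetime = tag_lifetime_days(100.0, 10.0, 20.0, 1000.0, 5.0)
```

The calculation makes plain why short pulses and long inter-ping intervals dominate tag design: halving the pulse width nearly doubles the lifetime once the sleep current is small.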
Figures

Figure 1: The components of a radio-transmitting tracking tag. All components except the RF generator, the matching circuit, and the antenna are optional, but almost all tags also include a low-frequency (LF) oscillator, a controller, and a modulator. The function of the different components and typical designs for them are described throughout the article.
Figure 2: Single-stage power-oscillator tags. The circuit on the left is the Cochran–Lord CW tag [13,47]; the one on the right is the Tester–Warner–Cochran pinging tag [48]. The original circuits used PNP transistors, not NPN ones.
Figure 3: A two-stage tag, from [20].
Figure 4: Left: a prototype single-stage transmitter. Right: emitter voltage when the tank is tuned and the supply voltage is 3 V.
Figure 5: Left: collector voltage in the prototype tag, showing a ping rate of a little less than 10 Hz. Right: collector voltage during one ping.