
New Advances in Robotically Enabled Sensing

A special issue of Sensors (ISSN 1424-8220). This special issue belongs to the section "Sensors and Robotics".

Deadline for manuscript submissions: 31 May 2025 | Viewed by 4736

Special Issue Editors


Guest Editor: Dr. Carmelo Mineo
Researcher, Institute for High-Performance Computing and Networking (ICAR), National Research Council (CNR), Via Ugo La Malfa, 153, 90146 Palermo, Italy
Interests: advanced adaptive robotic systems; programming and real-time control; collision avoidance; instrument interfacing and data collection

Guest Editor: Dr. Yashar Javadi
Centre for Ultrasonic Engineering, University of Strathclyde, Glasgow G1 1XW, UK
Interests: welding technology; NDT; residual stress; additive manufacturing; robotics

Special Issue Information

Dear Colleagues,

Humans have an immediate perception of the geometry of parts and their surroundings through their senses and cognitive capabilities. This innate ability enables the manual inspection of objects in everyday life and in manufacturing environments, where trained inspectors combine their senses and handling skills with bespoke instrumentation. However, manual inspection can be slow for large and complex geometries and is prone to human error (e.g., due to tiredness, boredom, or distraction). Robotic sensing has emerged in many sectors to improve the inspection of parts and materials, enhancing data acquisition speed, part coverage, and inspection reliability. Several automated or semi-automated solutions have been proposed to deploy specific types of sensors automatically. Moreover, robots can reach inspection positions not easily accessible to human operators, removing humans from potentially dangerous environments.

However, the perceived complexity and high costs of robotic sensing have limited the adoption of automation. As a result, the full potential of seamlessly integrating robotic platforms with sensors, actuators, and software has not been fully explored; further research could revolutionise how automated sensing is performed and conceived. Recent advances in electronics, robotics, sensor technology, and software pave the way for new developments in automated and data-driven robotic inspection across several sectors, and can help address the current societal challenges in this field. Robotic sensing must develop in parallel with newly emerging tools, e.g., autonomous robotics, artificial intelligence, the Internet of Things, cloud computing, cybersecurity, virtual-twin simulations, augmented reality, and big data.

We invite the research community to submit contributions to this Special Issue. Manuscripts introducing novel developments in one or more of the following aspects are welcome:

  • Robotic sensing;
  • Robotic non-destructive testing;
  • Novel integrations of robotic systems for hybrid manufacturing and inspection tasks;
  • Transition from automated to autonomous robotics;
  • Modelling of robotic approaches, remote inspections, and data interpretation;
  • Real-time data monitoring and robot control;
  • Processing, management, compression and archiving of robotically collected data;
  • Machine learning, artificial intelligence, image recognition and data mining;
  • Novel data visualization and analysis approaches;
  • Human–robot interaction/communication in the operation of robotic inspection systems.

The goal of this Special Issue is to identify how robotic sensing is evolving to address the issues raised by challenging new frontiers in civil and medical fields and by Industry 4.0, which is the ongoing automation of traditional manufacturing and industrial practices using modern smart technology.

Dr. Carmelo Mineo
Dr. Yashar Javadi
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Sensors is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • robotic non-destructive testing
  • robotic sensing
  • remote inspection
  • adaptive inspection
  • data interpretation
  • real-time monitoring
  • data-driven autonomous inspection
  • data management, processing, compression, and storage
  • machine learning, artificial intelligence, image recognition
  • human–robot interaction

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

21 pages, 11583 KiB  
Article
Real-Time Kinematically Synchronous Planning for Cooperative Manipulation of Multi-Arms Robot Using the Self-Organizing Competitive Neural Network
by Hui Zhang, Hongzhe Jin, Mingda Ge and Jie Zhao
Sensors 2023, 23(11), 5120; https://doi.org/10.3390/s23115120 - 27 May 2023
Cited by 1 | Viewed by 1510
Abstract
This paper presents a real-time kinematically synchronous planning method for the collaborative manipulation of a multi-arm robot with physical coupling, based on a self-organizing competitive neural network. The method defines sub-bases for the multi-arm configuration to obtain the Jacobian matrix of the common degrees of freedom, so that the sub-base motion converges along the direction of the total pose error of the end-effectors (EEs). This ensures the uniformity of the EE motion before the error converges completely and contributes to the collaborative manipulation of the arms. An unsupervised competitive neural network model is introduced to adaptively increase the convergence ratio of the arms via online learning of the inner-star rules. Combined with the defined sub-bases, the synchronous planning method achieves rapid synchronous movement of the multi-arm robot for collaborative manipulation. Theoretical analysis proves the stability of the multi-arm system via Lyapunov theory. Simulations and experiments demonstrate that the proposed kinematically synchronous planning method is feasible and applicable to different symmetric and asymmetric cooperative manipulation tasks for a multi-arm system.
(This article belongs to the Special Issue New Advances in Robotically Enabled Sensing)
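The abstract above centres on Jacobian-based convergence of multiple end-effector pose errors, with a competitive neural network deciding how strongly each arm moves. The Python sketch below is only a minimal, hypothetical illustration of that general idea (damped least-squares velocity steps scaled by error-driven competitive weights); the function names, gains, and weight update are assumptions and do not reproduce the authors' sub-base formulation or inner-star learning rule.

```python
import numpy as np

def competitive_weights(pose_errors, weights=None, lr=0.1):
    """Error-driven competitive weighting (hypothetical rule): arms with a
    larger residual pose error win a larger share of the commanded motion."""
    e = np.array([np.linalg.norm(err) for err in pose_errors])
    if weights is None:
        weights = np.ones_like(e) / len(e)          # start uniform
    target = e / (e.sum() + 1e-9)                   # normalised error shares
    weights = (1.0 - lr) * weights + lr * target    # move weights toward shares
    return weights / weights.sum()

def synchronous_step(jacobians, pose_errors, weights, damping=1e-2):
    """One damped least-squares joint-velocity step per arm, scaled by the
    competitive weights so all end-effector errors shrink together."""
    dq_per_arm = []
    for J, e, w in zip(jacobians, pose_errors, weights):
        # damped pseudo-inverse: J^T (J J^T + lambda^2 I)^-1
        J_pinv = J.T @ np.linalg.inv(J @ J.T + damping**2 * np.eye(J.shape[0]))
        dq_per_arm.append(w * (J_pinv @ e))
    return dq_per_arm
```

In use, each control cycle would pass the current 6×n Jacobian and six-element pose-error vector of every arm, carry the weights over between cycles, and integrate the returned joint velocities.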
Show Figures

Figure 1. A type of cooperative manipulation. (a) Carrying. (b) Operating rudder. (c) Operating a wrench. (d) Using pliers. (e) Multi-station operation.
Figure 2. The diagram for the common features in the cooperative manipulation of multi-arms.
Figure 3. Simple configuration of the multi-arm robot.
Figure 4. Kinematically synchronous planning for the multi-arm robot. U_in = t = (t_1, t_2, …, t_N)^T. U_out = s = (s_1, s_2, …, s_N)^T.
Figure 5. Motion planning and EE motion for the EE with the minimum pose error, e_min(T). v̂_i(T) = μ e_min(T).
Figure 6. The configuration of the three-arm robot with 15 DoFs.
Figure 7. Inverse kinematics based on the traditional method in real time. (a) Motion of multi-arms. (b) Joint angles. (c) EE position velocity. (d) EE attitude velocity. (e) EE position error. (f) EE attitude error.
Figure 8. Inverse kinematics based on the sub-base method in real time. (a) Motion of multi-arms. (b) Joint angles. (c) EE position velocity. (d) EE attitude velocity. (e) EE position error. (f) EE attitude error.
Figure 9. The configuration of the two-arm robot with 13 DoFs.
Figure 10. The principle of the two-arm robot with 13 DoFs.
Figure 11. Carrying task. (a) Initial configuration. (b) Manipulating process.
Figure 12. Trajectories for dual arms in the carrying task. (a) EE movement. (b) Joint trajectory. (c) EE position. (d) EE attitude. (e) Position velocity. (f) Attitude velocity. (g) Pose velocity error. (h) EE pose error.
Figure 13. Manipulating pliers. (a) Initial configuration. (b) Manipulating process.
Figure 14. Trajectories for dual arms in manipulating pliers. (a) EE movement. (b) Joint trajectory. (c) EE position. (d) EE attitude. (e) Position velocity. (f) Attitude velocity. (g) Pose velocity error. (h) EE pose error.
Figure 15. Manipulating rudder. (a) Initial configuration. (b) Manipulating process.
Figure 16. Trajectories for dual arms in manipulating the rudder. (a) EE movement. (b) Joint trajectory. (c) EE position. (d) EE attitude. (e) Position velocity. (f) Attitude velocity. (g) Pose velocity error. (h) EE pose error.
15 pages, 3171 KiB  
Article
Transforming Industrial Manipulators via Kinesthetic Guidance for Automated Inspection of Complex Geometries
by Charalampos Loukas, Momchil Vasilev, Rastislav Zimmerman, Randika K. W. Vithanage, Ehsan Mohseni, Charles N. MacLeod, David Lines, Stephen Gareth Pierce, Stewart Williams, Jialuo Ding, Kenneth Burnham, Jim Sibson, Tom O’Hare and Michael R. Grosser
Sensors 2023, 23(7), 3757; https://doi.org/10.3390/s23073757 - 5 Apr 2023
Viewed by 2495
Abstract
The increased demand for cost-efficient manufacturing and metrology inspection solutions for complex-shaped components in High-Value Manufacturing (HVM) sectors requires increased production throughput and precision, which drives the integration of automated robotic solutions. However, current manipulators using traditional programming approaches demand specialized robotic programming knowledge, make it challenging to generate complex paths, and do not adapt easily to unique specifications per component, resulting in an inflexible and cumbersome teaching process. Therefore, this body of work proposes a novel software system to realize kinesthetic guidance for path planning in real-time intervals at 250 Hz, utilizing an external off-the-shelf force–torque (FT) sensor. The proposed work is demonstrated on a 500 mm² near-net-shaped Wire–Arc Additive Manufacturing (WAAM) complex component with embedded defects by teaching the inspection path for defect detection with a standard industrial robotic manipulator in a collaborative fashion and adaptively generating the kinematics, resulting in uniform coupling for the ultrasound inspection. The method proves superior in performance and speed, accelerating programming relative to online and offline approaches by an estimated 88% to 98%. The proposed work is a unique development, retrofitting current industrial manipulators into collaborative entities, securing human job resources, and achieving flexible production.
(This article belongs to the Special Issue New Advances in Robotically Enabled Sensing)
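The abstract above describes a 250 Hz loop that converts force–torque residuals into small robot positional corrections so an operator can hand-guide the manipulator. The sketch below is a minimal, hypothetical admittance-style version of such a loop in Python; the dead-band, gain, and clamp values are invented for illustration, and the actual system's KUKA RSI/LabVIEW interface is not reproduced.

```python
import numpy as np

# Hypothetical tuning values for illustration only.
CYCLE_HZ = 250            # control rate quoted in the abstract (4 ms cycle)
FORCE_DEADBAND_N = 2.0    # ignore force residuals below this (assumed)
GAIN_MM_PER_N = 0.05      # compliance gain, mm of motion per newton (assumed)
MAX_STEP_MM = 0.5         # clamp the per-cycle correction for safety (assumed)

def kinesthetic_correction(ft_measured, ft_setpoint):
    """One control cycle: map the force residual (measured minus setpoint)
    into a small Cartesian position correction streamed to the robot."""
    residual = np.asarray(ft_measured[:3], float) - np.asarray(ft_setpoint[:3], float)
    # dead-band so sensor noise does not drag the arm around
    residual = np.where(np.abs(residual) < FORCE_DEADBAND_N, 0.0, residual)
    step = np.clip(GAIN_MM_PER_N * residual, -MAX_STEP_MM, MAX_STEP_MM)
    return step  # x, y, z offsets in mm commanded this cycle
```

When the operator pushes on the end effector, the residual grows and the commanded offsets follow the push; with a non-zero setpoint force, the same loop can hold a probe against the part, which is broadly analogous to the adaptive FT control described in the abstract.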
Show Figures

Figure 1. Comparison of collaborative robotic arms and industrial manipulators in terms of reachability and load capacity [14,15,16]. These industrial arms can support the kinesthetics concept for path planning, transforming these robots into collaborative entities.
Figure 2. Communication interface for kinesthetic guidance and real-time kinematics generation based on the RSI protocol between the external PC, KRC, and FT sensor. The external target cyclically updates the control process algorithm in the KRC4 controller.
Figure 3. Real-time control process algorithm for kinesthetic guidance path planning, describing the cyclical flow of process information between the FT sensor measurements, the setpoint forces and torques, the LabVIEW external control program, and the generated robot positional corrections.
Figure 4. Expansion of the control process algorithm for kinesthetic path planning (Figure 3) to support direct feedback of the FT corrections to the LabVIEW environment for adaptive motion control.
Figure 5. LabVIEW external real-time control GUI, which handles the kinesthetic teaching and the generation of kinematics for the taught path.
Figure 6. Experimental setup: (a) robotic setup with a 6-DoF FT sensor and a WAAM roller probe for NDE inspection mounted as the end effector; (b) WAAM component consisting of three sections with three embedded defects.
Figure 7. Kinesthetic teaching: (a) four points recorded 100 mm above the workpiece, which are the starting, intermediate, and end points for the three areas of inspection; thirteen points were recorded by manipulating the end effector 5 mm above the specimen, and at these points the adaptive FT control was enabled to perform the UT inspection for defects; (b) top view of the complex-shaped WAAM component showing the taught positions generated from the kinesthetic path planning.
Figure 8. UT NDE inspection following the kinesthetic guidance. The kinematics generation, based on the feedback of the FT PI controller to the end target position, adapted the motion to the overbuild features of the WAAM component.