AI-Integrated Advanced Robotics Towards Industry 5.0

A special issue of Machines (ISSN 2075-1702). This special issue belongs to the section "Advanced Manufacturing".

Deadline for manuscript submissions: 31 May 2025 | Viewed by 2692

Special Issue Editors


Dr. Bugra Alkan
Guest Editor
School of Engineering, London South Bank University, 103 Borough Rd., London SE1 0AA, UK
Interests: smart manufacturing; IoT; robotics; manufacturing systems; Industry 4.0; operations research; optimization; assembly systems

Dr. Malarvizhi Kaniappan Chinnathai
Guest Editor Assistant
School of Computer Science and Engineering, University of Westminster, 115 New Cavendish St, London W1W 6UW, UK
Interests: industrial automation and robotics; zero-defect manufacturing; applied artificial intelligence; simulation-based optimisation; Industry 4.0; digital twin

Special Issue Information

Dear Colleagues,

Robotic systems are now prevalent in industries such as construction, automotive, aerospace, and semiconductor manufacturing. Advances in artificial intelligence, including machine learning, deep learning, and optimisation techniques, have enhanced the operational capabilities of robot-based manufacturing and assembly systems. The integration of novel concepts, including big data analytics, process mining, collaborative robotics, cyber-physical systems, swarm intelligence, reinforcement learning, and digital transformation, holds the potential to enhance and optimise current production systems. Moreover, these concepts can play a crucial role in proactive policy formulation, the evaluation of health and safety, the mitigation of risks, and the examination of the ethical ramifications associated with smart manufacturing systems. Hence, the primary objective of this Special Issue is to provide a comprehensive compilation of cutting-edge strategies and emerging trends and technologies within the domain of robot-integrated smart manufacturing systems. It further aims to examine the challenges and opportunities that arise from integrating artificial intelligence into industrial robotic systems towards realising the goals of Industry 5.0, where robots and smart machines work alongside people to add resilience and sustainability to manufacturing systems.

Topics of interest include, but are not limited to, the following:

  • Collaborative robots in smart manufacturing and assembly;
  • Simulation and modeling of robotic systems;
  • Energy monitoring and optimisation of robotic systems;
  • Data-driven cyber-physical robotic manufacturing and assembly;
  • Decision-support systems for robotic applications;
  • Intelligent robot-enabled material handling solutions;
  • Path planning for automated guided vehicles and mobile robots for shop-floor logistics;
  • Intelligent and adaptive gripper systems.

Dr. Bugra Alkan
Guest Editor

Dr. Malarvizhi Kaniappan Chinnathai
Guest Editor Assistant

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Machines is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (2 papers)


Research

15 pages, 3274 KiB  
Article
Gesture-Controlled Robotic Arm for Small Assembly Lines
by Georgios Angelidis and Loukas Bampis
Machines 2025, 13(3), 182; https://doi.org/10.3390/machines13030182 - 25 Feb 2025
Viewed by 223
Abstract
In this study, we present a gesture-controlled robotic arm system for small assembly lines. Robotic arms are extensively used in industrial applications; however, they typically require special treatment and qualified personnel to set up and operate them. Towards this end, hand gestures can provide a natural way for human–robot interaction, offering a straightforward means of control without the need for significant training of the operators. Our goal is to develop a safe, low-cost, and user-friendly system for environments that often involve non-repetitive and custom automation processes, such as in small factory setups. Our system estimates the 3D position of the user’s joints in real time with the help of AI and real-world data provided by an RGB-D camera. Then, joint coordinates are translated into the robotic arm’s desired poses in a simulated environment (ROS), thus achieving gesture control. Through the experiments we conducted, we show that the system provides the performance required to control a robotic arm effectively and efficiently.
(This article belongs to the Special Issue AI-Integrated Advanced Robotics Towards Industry 5.0)
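The abstract describes a pipeline in which hand joints tracked by an RGB-D camera are mapped, via AI-based pose estimation, into target poses for the arm (through ROS) and into a gripper opening w. As a minimal, illustrative sketch of that mapping step only (not the authors' implementation), the Python snippet below transforms a tracked wrist point from the camera frame into the robot base frame using an assumed calibration transform and converts the thumb–index distance into a clamped gripper opening; the transform, landmark choices, and limits are placeholder assumptions.

```python
import numpy as np

# Hypothetical calibration: homogeneous transform from the camera frame (c)
# to the robot base frame (b). In practice this would come from a hand-eye
# or work-cell calibration; the values below are placeholders.
T_BASE_CAM = np.array([
    [ 0.0,  0.0, 1.0, 0.60],
    [-1.0,  0.0, 0.0, 0.00],
    [ 0.0, -1.0, 0.0, 0.40],
    [ 0.0,  0.0, 0.0, 1.00],
])

# Illustrative gripper limits (metres): pinched hand -> closed, open hand -> w_max.
W_MIN, W_MAX = 0.0, 0.08
PINCH_MIN, PINCH_MAX = 0.02, 0.12  # assumed thumb-index distances

def camera_to_base(p_cam: np.ndarray) -> np.ndarray:
    """Map a 3D point from the camera frame to the robot base frame."""
    p_h = np.append(p_cam, 1.0)          # homogeneous coordinates
    return (T_BASE_CAM @ p_h)[:3]

def gripper_opening(thumb_tip: np.ndarray, index_tip: np.ndarray) -> float:
    """Scale the thumb-index distance linearly into the range [W_MIN, W_MAX]."""
    d = np.linalg.norm(thumb_tip - index_tip)
    alpha = np.clip((d - PINCH_MIN) / (PINCH_MAX - PINCH_MIN), 0.0, 1.0)
    return W_MIN + alpha * (W_MAX - W_MIN)

if __name__ == "__main__":
    wrist_cam = np.array([0.10, -0.05, 0.70])      # e.g. from an RGB-D hand tracker
    print("end-effector target (base frame):", camera_to_base(wrist_cam))
    print("gripper opening w:",
          gripper_opening(np.array([0.0, 0.0, 0.7]), np.array([0.05, 0.0, 0.7])))
```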
Figures

Figure 1. Schematic representation of our proposed gesture-controlled robotic arm system.
Figure 2. Proposed setup for controlling a robotic arm through human gestures. The frames considered for reference are depicted in red for the x-axis, green for the y-axis, and blue for the z-axis.
Figure 3. The 21 estimated hand joints from the work presented in [42,50].
Figure 4. Direct vectors computed between the index PIP joint and the wrist (blue), pinky MCP, and index MCP joints (red). These vectors are used as references for computing the orientation of the operator’s hand (frame h_r).
Figure 5. Rotations of the user’s chest frame of reference (c) in order to align its axes with those of the camera’s coordinate frame (s).
Figure 6. The two categories of studied hand poses for controlling the robotic arm. (a) Pose 1, where the palm’s surface is perpendicular to the camera; (b) Pose 2, where the palm appears parallel to the camera.
Figure 7. Snapshots of the developed system’s operation.
Figure 8. The two boundary conditions for the gripper’s opening w values.
Figure 9. The trajectory of the user’s hand, with respect to the frame of reference c, and of the end effector, with respect to b. 33 points are depicted, in blue for the human hand and orange for the Panda arm. The left graph is a side-view comparison of the movements, while the right one illustrates the same sequence from a top view.
23 pages, 7079 KiB  
Article
An Integrated Architecture for Robotic Assembly and Inspection of a Composite Fuselage Panel with an Industry 5.0 Perspective
by Gaetano Lettera and Ciro Natale
Machines 2024, 12(2), 103; https://doi.org/10.3390/machines12020103 - 1 Feb 2024
Cited by 1 | Viewed by 1620
Abstract
Aeronautical robotic applications use quite large, heavy robots with huge end effectors that are frequently multifunctional. An assembly jig to hold a fuselage panel and two medium-sized six-axis robots fixed on linear axes, referred to as the internal and the external robot with respect to the curvature of the panel, make up the Lean robotized AssemBly and cOntrol of composite aeRostructures (LABOR) work cell. A distributed software architecture is proposed in which individual modules are developed to execute specific subprocesses, each implementing innovative algorithms that solve the main drawbacks of state-of-the-art solutions. Real-time referencing adopts a point-cloud-based strategy to reconstruct and process the part before drilling, avoiding hole positioning errors. Accurate concentric countersink diameters are made possible through the automatic adjustment of the drilling tool with respect to the skin panel, which guarantees its orthogonality, as well as the implementation of process parameter optimization algorithms based on historical results that compensate for the wear of the drilling bits. Automatic sealing and fastening strategies that involve the measurement of the main fastener quality parameters allow for the complete verification of the entire assembly process of each part. Additionally, an advanced multimodal perception system continuously monitors the collaborative workspace to ensure safe human–robot collaboration (HRC) tasks. Through this integrated architecture, LABOR substantially reduces expenses and facilitates maintenance and programming.
(This article belongs to the Special Issue AI-Integrated Advanced Robotics Towards Industry 5.0)
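One of the steps summarised above is the automatic adjustment of the drilling tool so that its axis stays orthogonal to the skin panel. As a minimal sketch of how such a normality check could be posed (not the LABOR referencing code), the snippet below estimates the local surface normal of a scanned point-cloud patch via PCA and reports the angular deviation of the tool z-axis from that normal; the synthetic patch and the chosen frames are illustrative assumptions.

```python
import numpy as np

def surface_normal(patch: np.ndarray) -> np.ndarray:
    """Estimate the local surface normal of a point-cloud patch (N x 3) via PCA:
    the eigenvector of the covariance matrix with the smallest eigenvalue."""
    centred = patch - patch.mean(axis=0)
    cov = centred.T @ centred / len(patch)
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    n = eigvecs[:, 0]
    return n / np.linalg.norm(n)

def normality_error_deg(tool_z: np.ndarray, normal: np.ndarray) -> float:
    """Angle between the drilling-tool z-axis and the panel normal, in degrees."""
    cosang = abs(np.dot(tool_z, normal)) / (np.linalg.norm(tool_z) * np.linalg.norm(normal))
    return float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))

if __name__ == "__main__":
    # Synthetic, slightly tilted planar patch standing in for scanned skin-panel points.
    rng = np.random.default_rng(0)
    xy = rng.uniform(-0.05, 0.05, size=(200, 2))
    z = 0.02 * xy[:, 0] + 0.001 * rng.standard_normal(200)
    patch = np.column_stack([xy, z])
    n = surface_normal(patch)
    print("estimated normal:", n)
    print("tool misalignment [deg]:", normality_error_deg(np.array([0.0, 0.0, 1.0]), n))
```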
Figures

Figure 1. Distributed software architecture.
Figure 2. Final fuselage details: AFT and FWD panels and side, keel, and crown sections (left); side forward panel structural arrangement (right).
Figure 3. Identification of drilling and fastener measurement requirements.
Figure 4. Example of an HRC assembly operation: the operator removes the blue element from the panel for the execution of deburring operations, while the robot executes inspection of other parts.
Figure 5. Work cell calibration of the system (left) and referencing of the part with respect to the internal vision system (right).
Figure 6. Referencing algorithms: edge detection for shear-ties (left) and pop-rivet identification for stringers and frames (right).
Figure 7. Drilling alignment requirement: align the z-axis of the drilling tool frame, Σ_ext_TCP, with the unit vector normal to the panel skin, N (left), of the drilling nose (right).
Figure 8. Clamping force strategy: once the external robot end effector has been aligned with the panel surface (a), it executes a pure translation towards the surface (b) until contact occurs (c); then, it waits for the counter-thrust application from the internal robot (d) before starting the second clamping step until the target force has been reached (e).
Figure 9. Hole and countersink diameter measurement: the two arcs of the circumference have been identified from five acquisitions, the yellow one representing the external countersink edge and the green one representing the internal hole edge. The final reconstruction that merges the segments identified in each picture is shown on the right.
Figure 10. Hole and countersink diameter measurement.
Figure 11. Fastener quality check: from the internal side (left), the software computes the sleeve diameter and sleeve height by identifying three representative points, while from the external side (center and right), it computes the stem protrusion and flushness by identifying representative ROIs (colored areas).
Figure 12. The HRC test bench: the laboratory collaborative workspace setup (A); the robust human detection of the CNN through the novel sensor fusion approach (B); several human workers identified in the scene at the same time (C); two identified closest points for separation distance computation (D).
Figure 13. The HRC algorithm integrated into the LABOR work cell: a frame of the collaborative task, on the left; the corresponding digital twin, which shows the point cloud of the human operator identified by the multimodal CNN, on the top right; the robot speed modulation approach, on the bottom right.
Figure 14. Hole positioning analysis: distance measurements between holes and between the hole edges and the part edges.
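Figures 12 and 13 refer to closest-point separation monitoring and robot speed modulation for safe human–robot collaboration. The sketch below illustrates the general idea under assumed thresholds (it is not the paper's multimodal perception system): the minimum distance between robot and human point sets is computed and mapped linearly onto a speed scale between a stop distance and a full-speed distance.

```python
import numpy as np

# Illustrative distance thresholds (metres); real values would follow a risk
# assessment in the spirit of speed-and-separation monitoring.
D_STOP = 0.5     # below this separation the robot holds
D_FULL = 1.5     # above this separation full programmed speed is allowed

def min_separation(robot_pts: np.ndarray, human_pts: np.ndarray) -> float:
    """Minimum pairwise distance between robot points (M x 3) and human
    point-cloud points (N x 3), as in the closest-point pair of Figure 12."""
    d = np.linalg.norm(robot_pts[:, None, :] - human_pts[None, :, :], axis=-1)
    return float(d.min())

def speed_scale(separation_m: float) -> float:
    """Scale factor for the robot's programmed speed as a function of the
    minimum human-robot separation distance (linear between the thresholds)."""
    if separation_m <= D_STOP:
        return 0.0
    if separation_m >= D_FULL:
        return 1.0
    return (separation_m - D_STOP) / (D_FULL - D_STOP)

if __name__ == "__main__":
    robot = np.array([[0.8, 0.0, 1.0], [0.6, 0.2, 0.9]])
    human = np.array([[1.6, 0.1, 1.1], [1.9, -0.2, 1.0]])
    s = min_separation(robot, human)
    print(f"separation {s:.2f} m -> speed scale {speed_scale(s):.2f}")
```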