Article

A Framework for Real-Time Autonomous Robotic Sorting and Segregation of Nuclear Waste: Modelling, Identification and Control of DexterTM Robot

by
Mithun Poozhiyil
1,2,*,
Omer F. Argin
1,3,
Mini Rai
1,*,
Amir G. Esfahani
1,4,
Marc Hanheide
1,
Ryan King
5,
Phil Saunderson
5,
Mike Moulin-Ramsden
5,
Wen Yang
6,
Laura Palacio García
6,
Iain Mackay
6,
Abhishek Mishra
6,
Sho Okamoto
6 and
Kelvin Yeung
6
1
Lincoln Centre for Autonomous Systems, University of Lincoln, Lincoln LN6 7TS, UK
2
Manufacturing Technology Centre, Coventry CV7 9JU, UK
3
Department of Electrical and Electronic Engineering, University of Manchester, Manchester M13 9PL, UK
4
Computer Science Research Centre, University of Surrey, Guildford GU2 7XH, UK
5
Veolia Nuclear Solutions, Abingdon OX14 4SR, UK
6
Faculty, Level 5, 160 Old Street, London EC1V 9BW, UK
*
Authors to whom correspondence should be addressed.
Machines 2025, 13(3), 214; https://doi.org/10.3390/machines13030214
Submission received: 20 January 2025 / Revised: 23 February 2025 / Accepted: 27 February 2025 / Published: 6 March 2025
(This article belongs to the Special Issue New Trends in Industrial Robots)
Figure 1. A schematic of DexterTM teleoperation system architecture comprising local and remote manipulators.
Figure 2. The experimental setup comprising a mock-up of nuclear waste sorting test-bed. (a) DexterTM local and remote arms. (b) Remote arm with associated sensors and sorting table.
Figure 3. Top-level system process flow for DexterTM system-based nuclear sort and segregation application.
Figure 4. Nuclear sort and segregation system architecture.
Figure 5. Frame definition of the DexterTM manipulator.
Figure 6. Dexter dynamic model parameter identification process.
Figure 7. Joint-space feed-forward nonlinear control scheme.
Figure 8. Octomap of the environment and ROS-Rviz simulation model.
Figure 9. Curve fitting for mass estimation.
Figure 10. Fourier series-based excitation trajectories generated for dynamical parameter identification of the DexterTM manipulator. Joint trajectories for (a) training and (b) testing.
Figure 11. Predicted and measured torques for the test trajectory using estimated dynamical parameters of the DexterTM manipulator.
Figure 12. Result of object detection and classification from two RGB-D images of a scene. Images from left to right show the test objects in the environment from two cameras, their depth images, Multiview Stereo (MVS) reconstruction and filtered point cloud, and SoftGroup model-based classification and object detection outputs.
Figure 13. Single-object point cloud reconstruction from three different object poses performed by DexterTM after grasping, and the category classification result.
Figure 14. Radiological surveying objects, radiation scan trajectories and radiation levels.
Figure 15. Grasp pose generation results from two object piles.
Figure 16. Geometry characterizations of wellington boot (Row 1) and plastic hose (Row 2). (Column 1): 3D point cloud of the environment. (Column 2): Watertight mesh generated from detected object point cloud. (Column 3): Geometrical characterization of detected object.
Figure 17. Experimental result of the bin packing.
Figure 18. Full system demonstrator.
Figure 19. Execution of the integrated system from picking to dropping for four example objects.

Abstract

Robots are essential for carrying out tasks where direct human involvement is limited, for example in the nuclear industry. However, present-day nuclear robots are not versatile due to limited autonomy and higher costs. This research presents the transformation of the purely teleoperated DexterTM nuclear robot into an autonomous manipulator for nuclear sort and segregation tasks. The DexterTM system comprises a client (remote) manipulator designed to operate in extreme radiation environments and a similar single/dual-arm local manipulator. In this paper, a kinematic model and a convex-optimization-based dynamic model identification of the single-arm DexterTM manipulator are first presented. This model is used for autonomous DexterTM control through the Robot Operating System (ROS). A new integration framework incorporating vision, AI-based grasp generation and an intelligent radiological surveying method for enhancing the performance of the autonomous DexterTM is then presented. The efficacy of the framework is demonstrated on a mock-up nuclear waste test-bed using waste materials similar to those found in the nuclear industry. The experiments performed show the potency, generality and applicability of the proposed framework in overcoming the entry barriers for autonomous systems in regulated domains like the nuclear industry.

1. Introduction

Nuclear reactors are often preferred over fossil fuels for efficient and sustainable large-scale power generation without refueling for long periods. Developing nations worldwide consider nuclear energy one of the most important energy sources for meeting growing energy demands [1]. Even though nuclear energy production contributes towards clean energy, disposing of nuclear waste efficiently and safely is challenging [2]. In addition, efficient methods for decommissioning and cleaning up aging nuclear reactors, for example, Sellafield (UK) [3], and technology developments for handling nuclear disasters, for example, Fukushima (Japan) [4], are needed. The Nuclear Decommissioning Authority (NDA) oversees many legacy facilities in the UK that are in the process of being decommissioned, and with rising nuclear waste the scale of the problem is enormous [5]. Safe, effective and optimized waste management is a significant part of these long-term nuclear maintenance and decommissioning tasks. The radioactive wastes in nuclear power plants are commonly classified as High-Level Waste (HLW), Intermediate-Level Waste (ILW), Low-Level Waste (LLW) and Very Low-Level Waste (VLLW), according to the nature of the material, the level of radioactivity and the heat produced [6]. To facilitate a waste management strategy, these wastes need to be sorted and segregated for optimal packing. Currently, these materials are handled by trained operators wearing personal protective equipment (PPE) [7] or by remotely controlled robots [8]. Human-led nuclear waste handling tasks are challenging due to proximity to radioactive waste, limited efficiency and high time consumption. To address these challenges, methodologies using robotic technologies have emerged for the safe and efficient handling of nuclear materials.
Several attempts have been made earlier using robots [9,10,11,12], but robotic sorting and segregation of nuclear waste is seldom reported in the literature. The Joint European Torus (JET) machine employs a human-in-the-loop approach for teleoperating the robotic boom and a dual-arm robotic system inside the JET Torus [13]. A robotic arm mounted on a mobile robot, Hydro-Lek, for nuclear decommissioning tasks is presented in [14,15]. Hydro-Lek is a seven degrees-of-freedom (DoF) hydraulic manipulator with a continuous jaw rotation mechanism designed to grasp objects and cut ropes. The Oak Ridge National Laboratory’s Advanced Servomanipulator System and Dual-Arm Remote Sawing Manipulator System are task-specific robotic systems designed for specific nuclear decommissioning applications [16]. A modular robotic system for Reactor Core Detector Removal was presented in [17] for positioning, extracting, transporting, cutting and coiling tasks. As robotics- and artificial-intelligence-based computer vision technologies [18,19,20] have advanced through industrial automation, it is now clear that autonomous robotics can also play a key role in nuclear waste sort and segregation applications. Refs. [21,22] presented industrial non-nuclear waste sorting robotic methodologies using KUKA arms and a fast 4-DoF parallel robot. Conventional robotic mechanisms and motion planning approaches work well for structured or semi-structured environments [23]. However, in harsh environments, the precarious nature of the physical tasks involved and regulatory constraints impose several other challenges for deploying commercial robots in nuclear clean-up. A recent survey on challenges for future robotic sorters highlighted the need for highly dexterous dual-arm robotic systems for efficient sorting of unstructured heaps of objects [24].
Even though there have been advances in image-based detection of waste objects and environmental monitoring using computer vision and AI, developing a new autonomous dexterous robotic system specifically for the sorting and segregation of nuclear waste is challenging.
Robotic remote operation technologies under human supervision are needed to perform hazardous decommissioning and sorting activities. For faster development and to overcome the entry barriers in the highly regulated nuclear domain, the authors address the problem by building a novel solution that further automates and adds intelligence to the existing commercial DexterTM teleoperation system by Veolia Nuclear Solutions (VNS), Oxford, UK. A new methodology for switching from teleoperation to autonomous operation is presented, based on rigorous system identification and dynamic modelling. This paradigm shift allows a quicker and more efficient introduction of shared autonomy into previously manual tasks. Industrial manipulators are generally designed for structured and predictable factory environments. In contrast, DexterTM is specifically developed for extreme and hazardous environments, including high-radiation nuclear decommissioning sites, where human intervention is either unsafe or impractical. While industrial robots primarily operate in fully autonomous or pre-programmed modes, DexterTM is a teleoperated system with a human-in-the-loop architecture. It employs a kinematically identical local and remote arm setup, allowing an operator to manipulate the remote arm in real time. This feature is critical for responding to unexpected conditions in dynamic and unstructured nuclear environments. Dexter incorporates cable-driven joints and gear-driven mechanisms. This is a significant departure from conventional industrial robots, which predominantly rely on direct-drive, gear-drive or harmonic-drive actuators. The cable-driven system enhances flexibility but introduces complexities in dynamic modeling, requiring precise system identification and compensation strategies.
The DexterTM manipulator comprises cable-driven joints, a complex gear assembly and parallel linkages. Identifying the dynamics of such a system is not straightforward and requires precise modelling and system identification. Dynamical model parameter identification of the DexterTM manipulator through excitation trajectories is one of the major contributions of this paper. The complex friction model and the joint elasticity of the cable-driven joints are also considered to obtain a comprehensive dynamic model through constrained optimization. This enables DexterTM to perform sort and segregation tasks autonomously. The new framework realises object identification, classification, tracking, robotic grasping, sorting and packaging of nuclear wastes according to their radiation levels. For more complex grasps and tangled wastes, the integrated system can also call on the expertise of a skilled operator to take over the task for manual operation; the remote DexterTM robot then replicates the operator’s movements made on the local DexterTM. By sharing the tasks between human and computer intelligence, this approach offers a step change in safety and efficiency, requiring operator intervention only for complex tasks. This methodology will enable accelerated adoption of automation and autonomy in extreme environments without rigorous changes in highly regulated sectors.
This paper first introduces the DexterTM robotic teleoperation system and the sort and segregation experimental setup in Section 2. In Section 3, an outline of the proposed sort and segregation framework is presented. Section 4 elaborates on the modeling, system identification and control of the DexterTM for autonomous operation. The sensor integration and autonomous procedures for the radiation scanning, sorting, segregating and packing of the wastes are described in Section 5. Section 6 covers experimental verification and validation, and finally, the concluding inferences are stated in Section 7.

2. Preliminaries and System Overview

This section presents relevant preliminary details that define the baseline architecture and building blocks of the new framework covered in Section 3. Initially, the DexterTM, a commercial haptic-enabled and radiation-hardened teleoperated robotic platform, is introduced. Later, an overview of the experimental setup is provided, which accomplishes autonomous nuclear sort and segregation activities.

2.1. DexterTM Teleoperation System

DexterTM, shown in Figure 1, is an exemplar of a collection of telemanipulation capabilities purposed for extremely hazardous environments, notably extreme radiation [13]. The DexterTM system is primarily developed for human-in-the-loop operations using kinematically identical local and remote arms, the former in a safe operations centre and the latter in a hazardous operational environment. The Mascot (the predecessor to DexterTM) started life as an indispensable part of the Joint European Torus (JET) nuclear fusion program’s remote handling operation, and it supports the nuclear fuel debris retrieval from a stricken reactor at the Fukushima Daiichi nuclear power plant in Japan. VNS’s DexterTM is a system that is able to repair, maintain and/or re-tool a larger complex mechanical system, including the ability to respond to unforeseeable upset conditions. In essence, this high-value system provides an insurance policy guaranteeing that the system can be kept in service despite not being accessible to humans. In addition, a real-time control system and the RTServiceLibrary provide features like system redundancy and fault tolerance through various services such as high-priority process execution, non-blocking mechanisms for network communication and a tree structure for different levels of fault abstraction [25]. The human operator performs the tasks with the local manipulator, which the remote manipulator replicates in real time at the remote location. The remote manipulator can be positioned up to 8 km from the local manipulator using only data cables; there is no mechanical connection between the two systems. One of the key features of the DexterTM arm is that it can be considered a straightforward replication of the operator’s arms. The human operator can prepare for tasks and carry them out almost exactly as if they were intervening in the environment directly. The extreme smoothness, responsiveness and sensitive force feedback of the arms, coupled with camera feedback, creates a sense of “Telexistence” for the operator.
A single arm of the DexterTM system comprises all revolute joints exhibiting 6-DoF motion. The first three joints are driven by a gear system, while the last three joints and the two-finger end-effector are cable-driven. The electrical actuators responsible for each joint motion are placed together at the base of the DexterTM in a specialised configuration within a protective casing. This is beneficial for reliable operation in a nuclear environment, where radiation and high temperatures affect electrical systems. To achieve a heavy load-carrying capacity, a specialised parallelogram mechanism is also provided between joints one and three. To transform this torque-controlled DexterTM teleoperation system, in which the remote arm joint positions currently follow the local arm joint positions, into an autonomous system, an accurate kinematic and dynamical model has to be obtained. The dynamical model identification of the DexterTM robot is challenging considering the above-mentioned configurations and specialised actuation mechanisms. Further, friction in the joints and the elasticity of the cables bring additional challenges to model identification.

2.2. Sort and Segregation Experimental Setup Overview

The experimental setup comprising the DexterTM local and remote arms is shown in Figure 2a. The remote system is isolated, similar to a nuclear environment, so that complete sort and segregation operations can be performed safely. As shown in Figure 2b, sensors and sorting tables are placed within the workspace of the remote arm for performing autonomous operations. To obtain 3D information about the objects, two industrial-grade SICK sensors are placed facing the robot workspace in an eye-to-hand configuration. Initially, the sorting table is loaded with all the waste objects that need sorting. Once a waste object is picked up and its characteristics identified, it is deposited in the ILW, LLW or recycling tray, as shown in Figure 2b. Since the robot base is stationary, a trolley mechanism manually moves the corresponding trays into the robot’s workspace after the waste item has been characterized.

3. Outline of Proposed Framework

The new framework presented in this paper aims to advance the autonomy and automation of the DexterTM system and integrate it with advanced characterization and tracking technologies to identify, classify, transfer and package waste items commonly found in the nuclear industry. Figure 3 shows a top-level system process flow for a nuclear sort-and-segregation task using the DexterTM system, which comprises various control modules, as shown in Figure 4. The low-level control module consists of the DexterTM kinematic and derived dynamic models and position controllers. Using the ROS framework, various submodules and Task Groups (TGs) are interconnected for task transitions, sensor interfaces, safety, teleoperation and the Digital Twin in the middle-level control. The GUI provides an interface to the various control tasks in the middle-level control and also acts as an information display system for the operator to make decisions. The system is designed such that high priority is given to the system and operational safety of the robot through process control. The process control is also responsible for identifying various failure modes, such as object slip and drop, and failure to generate a feasible trajectory to a desired pose. The failure modes generate corresponding flags in the GUI for the operator to intervene and take the necessary actions.

4. Robot Modelling, Identification and Control

This section initially describes DexterTM’s kinematic model and an analytical formulation of dynamic equations, including friction and joint elasticity components. Next, a system identification procedure is presented for obtaining reliable dynamic parameters. Finally, a joint space position controller is implemented using a pre-computed feed-forward torque controller.
An open kinematic chain of DexterTM joint frames, shown in Figure 5, is considered for obtaining the minimal kinematic model of the robot. The calculated Denavit–Hartenberg (DH) notation-based kinematic parameterization of the DexterTM is provided in Table 1, where $a_i$, $d_i$ and $q_i$ are the corresponding link lengths, offsets and joint angles of the DexterTM, as shown in Figure 5.
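To make the DH parameterization concrete, the following sketch chains standard DH transforms into a forward kinematic model in Python. The DH rows used here are illustrative placeholders, not Dexter's actual Table 1 values.

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Homogeneous transform for one standard DH row (a, alpha, d, theta)."""
    ca, sa = np.cos(alpha), np.sin(alpha)
    ct, st = np.cos(theta), np.sin(theta)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(dh_rows, q):
    """Chain the per-joint transforms; returns the base-to-end-effector pose."""
    T = np.eye(4)
    for (a, alpha, d), qi in zip(dh_rows, q):
        T = T @ dh_transform(a, alpha, d, qi)
    return T

# Placeholder (a_i, alpha_i, d_i) rows for a generic 6-DoF arm -- NOT Dexter's
# identified parameters from Table 1.
dh_rows = [(0.0, np.pi/2, 0.3), (0.4, 0.0, 0.0), (0.0, np.pi/2, 0.0),
           (0.0, -np.pi/2, 0.35), (0.0, np.pi/2, 0.0), (0.0, 0.0, 0.1)]
T = forward_kinematics(dh_rows, np.zeros(6))
```

Substituting the identified DH rows from Table 1 would yield Dexter's own forward kinematics in the same way.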

4.1. DexterTM Kinematic and Dynamic Model

The motion of the 6-DoF DexterTM manipulator can be described by the general rigid-body dynamics as in (1):

$$\tau = M(q)\,\ddot{q} + C(q,\dot{q})\,\dot{q} + G(q) + \tau_f + \tau_e, \tag{1}$$

where $q,\dot{q},\ddot{q} \in \mathbb{R}^6$ represent the joint position, velocity and acceleration vectors, respectively. $M(q) \in \mathbb{R}^{6\times6}$ represents the mass/inertia matrix containing the respective masses $m_i$ and inertia tensors $I_i$. $C(q,\dot{q}) \in \mathbb{R}^{6\times6}$ contains the corresponding Coriolis and centrifugal forces/torques. The terms $G(q)$ and $\tau \in \mathbb{R}^6$ are the gravitational force and input torque vectors, respectively.
To accurately model DexterTM’s behavior, two additional terms, $\tau_f$ and $\tau_e$, are introduced to capture frictional effects and joint elasticity, respectively. Given that DexterTM operates in a constrained nuclear environment where motions are generally slow, the Stribeck effect becomes more pronounced, making a complex friction model necessary [26]. The joint friction term $\tau_f \in \mathbb{R}^6$ is therefore modeled as

$$\tau_f = F_c + (F_s - F_c)\,e^{-|\dot{q}|/\dot{q}_s} + F_v\,\dot{q}, \tag{2}$$

where $F_s$, $F_v$, $F_c$ and $\dot{q}_s$ represent the static, viscous, Coulomb and Stribeck constants, respectively [27].
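A friction model of this form can be sketched in a few lines. The sketch below adds a sign factor so the Coulomb/Stribeck part opposes the motion direction (a common convention that the compact scalar form leaves implicit); the constants are made up for illustration, not Dexter's identified values.

```python
import numpy as np

def stribeck_friction(qdot, F_s, F_c, F_v, qdot_s):
    """Joint friction torque: Coulomb + Stribeck exponential + viscous term.

    Implements tau_f = sgn(qdot) * (F_c + (F_s - F_c) * exp(-|qdot| / qdot_s))
                       + F_v * qdot,
    i.e. friction decays from the static level F_s towards the Coulomb level
    F_c as |qdot| grows past the Stribeck velocity qdot_s, plus viscous drag.
    """
    qdot = np.asarray(qdot, dtype=float)
    stribeck = F_c + (F_s - F_c) * np.exp(-np.abs(qdot) / qdot_s)
    return np.sign(qdot) * stribeck + F_v * qdot

# Illustrative (made-up) constants for a single joint.
tau = stribeck_friction(np.array([-0.5, 0.0, 1e-3, 0.5]),
                        F_s=2.0, F_c=1.2, F_v=0.8, qdot_s=0.05)
```

Near zero velocity the model returns roughly the static level $F_s$, while at high velocity it approaches $F_c + F_v\dot{q}$, which is exactly the low-speed behavior the identification needs to capture.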
Additionally, joint elasticity is considered due to the cable-driven actuation mechanism used in DexterTM. The flexibility in the cables introduces compliance in the joints, which is modeled as a stiffness term:

$$\tau_e = K q, \tag{3}$$

where $K \in \mathbb{R}^{6\times6}$ is a diagonal stiffness matrix.
The identification of dynamic parameters for DexterTM includes these additional friction and elasticity terms, which are essential for achieving accurate torque control. The presence of complex gear systems and cable-driven actuation necessitates a precise estimation of these parameters. Since the parameter identification process involves a constrained optimization problem, we employ a well-established convex optimization algorithm to ensure robust and physically consistent parameter estimation [26]. This optimization approach allows for an efficient solution while maintaining feasibility constraints imposed by the robot’s physical limitations.

4.2. Dynamic Model Parameter Identification

Dynamical parameter identification [28] is a crucial part of the process of transforming the purely teleoperated DexterTM robot into an autonomous system. Direct identification of the model parameters is not straightforward, considering the advanced DexterTM robot mechanisms. Therefore, data collected by actuating the DexterTM manipulator with excitation trajectories are used to identify the dynamical model parameters through convex optimization. An outline of the parameter identification procedure is shown in Figure 6.
From the generic dynamic model of the DexterTM robot (1), the unknown parameters that need to be identified for the $i$th link are the mass $m_i$, the Centre-of-Mass (CoM) $l_i = [l_x, l_y, l_z]$, the inertia parameters $(I_{ixx}, I_{ixy}, I_{ixz}, I_{iyy}, I_{iyz}, I_{izz})$, the friction constants $(F_{is}, F_{ic}, F_{iv})$ and the stiffness constant $K_i$. In total, 84 parameters need to be identified for the 6-DoF DexterTM manipulator.
For simplicity, (1) is represented in the linear form [29] by

$$\tau = H(q,\dot{q},\ddot{q})\,\beta, \tag{4}$$

where $H \in \mathbb{R}^{6\times84}$ represents a regression matrix and $\beta \in \mathbb{R}^{84}$ represents the vector of unknown dynamic parameters. From the DexterTM robot’s physical configuration, it is understandable that not all dynamical parameters contribute to the dynamic model; some become zero. Hence, a minimum set of identifiable parameters is obtained from (4) using the numerical QR decomposition method [30]. The dynamic equation with a reduced number of parameters ($r = 57$) can be represented as

$$\tau = H_r(q,\dot{q},\ddot{q})\,\beta_r, \tag{5}$$
where $H_r$ and $\beta_r$ are the reduced regression matrix and the vector of the identifiable dynamic parameter set. From an excitation trajectory $q_i(t)$, where $t = t_1, t_2, \ldots, t_k$ are the time steps, and using the collected joint position, velocity, acceleration and torque values, the linear equation in (5) can be represented as

$$\tau = \begin{bmatrix} \tau(t_1) \\ \tau(t_2) \\ \vdots \\ \tau(t_k) \end{bmatrix} = H_e\,\beta_r, \tag{6}$$

where $H_e$ is an extended regression matrix

$$H_e = \begin{bmatrix} H_r\big(q(t_1), \dot{q}(t_1), \ddot{q}(t_1)\big) \\ H_r\big(q(t_2), \dot{q}(t_2), \ddot{q}(t_2)\big) \\ \vdots \\ H_r\big(q(t_k), \dot{q}(t_k), \ddot{q}(t_k)\big) \end{bmatrix}. \tag{7}$$
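The stacking of per-sample regressors into $H_e$ and the QR-based reduction to an identifiable parameter set can be sketched numerically. The toy regressor below is a stand-in (random, with deliberately zero and linearly dependent columns), not the symbolic Dexter regressor, but the rank-revealing step is the same idea used to go from 84 to 57 base parameters.

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)

def toy_regressor(sample):
    """Stand-in for H_r: 6 torque rows, 10 candidate parameters, of which
    one column never appears and two are linear combinations of others."""
    H = rng.standard_normal((6, 10))
    H[:, 7] = 0.0                # parameter that never enters the dynamics
    H[:, 8] = 2.0 * H[:, 0]      # linearly dependent column
    H[:, 9] = H[:, 1] - H[:, 2]  # another dependency
    return H

# Stack the extended regression matrix H_e over k samples, as in (7).
H_e = np.vstack([toy_regressor(t) for t in range(50)])

# Rank-revealing QR with column pivoting selects the identifiable columns.
Q, R, piv = qr(H_e, pivoting=True)
tol = abs(R[0, 0]) * 1e-10       # loose numerical rank tolerance
rank = int((np.abs(np.diag(R)) > tol).sum())
base_cols = np.sort(piv[:rank])  # indices of identifiable (base) parameters
H_base = H_e[:, base_cols]
```

For this toy case the numerical rank comes out as 7 of 10 candidates; on the real stacked regressor the same procedure yields the 57 identifiable parameters.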

4.2.1. Identification by Minimization

The dynamical parameter estimation is posed as a minimization problem between the measured torque vector $\tau_m$ and the predicted torque vector $\tau$ in (6) using an excitation trajectory. The parameter identification is performed by solving the following constrained optimization problem:

$$\min_{\beta_r \in D} \; \|\tau_m - H_e\,\beta_r\|^2 \quad \text{subject to} \quad h(\beta_r), \tag{8}$$

where $h(\beta_r)$ represents the constraints and $D$ denotes the search space. To obtain a physically feasible parameter set, $h(\beta_r)$ is defined by the following physical consistency constraints.
  • The masses of the links must be positive: $m > 0$;
  • The inertia tensors must be positive definite: $I > 0$;
  • The eigenvalues of the inertia tensors $(\sigma_x, \sigma_y, \sigma_z)$ must satisfy the triangle inequality conditions $\sigma_x + \sigma_y > \sigma_z$, $\sigma_x + \sigma_z > \sigma_y$, $\sigma_y + \sigma_z > \sigma_x$, as in [31];
  • The mass centre should remain in its convex hull, i.e., $m\,l_{lb} < l_i$ and $m\,l_{ub} > l_i$ [32], where $l_{lb}$ and $l_{ub}$ represent the lower and upper bounds of $l_i$, respectively;
  • The spring stiffness must be positive definite: $K > 0$.
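A minimal sketch of constrained torque-residual minimization is shown below, using SciPy's bounded linear least squares on a toy four-parameter problem (stand-ins for a mass, CoM term, friction and stiffness). Only box constraints are shown; the full physical consistency set (inertia positive definiteness, triangle inequalities) requires a convex/SDP solver such as CVXPY rather than simple bounds.

```python
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(1)

# Toy ground truth: (mass, CoM term, viscous friction, stiffness) stand-ins.
beta_true = np.array([3.0, 0.1, 0.8, 25.0])

# Synthetic extended regressor and noisy "measured" torques, as in (6).
H_e = rng.standard_normal((200, 4))
tau_m = H_e @ beta_true + 0.01 * rng.standard_normal(200)

# Box constraints: mass, friction and stiffness stay positive; the CoM term
# is bounded by the link geometry (illustrative bounds, not Dexter's).
lb = np.array([1e-6, -0.5, 1e-6, 1e-6])
ub = np.array([np.inf, 0.5, np.inf, np.inf])

# Solve min ||tau_m - H_e beta||^2 subject to lb <= beta <= ub.
res = lsq_linear(H_e, tau_m, bounds=(lb, ub))
beta_hat = res.x
```

Because the problem is convex, the bounded solver recovers parameters close to the ground truth while keeping every estimate physically admissible.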

4.2.2. Optimal Excitation Trajectory Generation

It is important to ensure that the excitation trajectories provided to the DexterTM induce all the modeled dynamical effects for accurate parameter identification. Therefore, finite Fourier series-based excitation trajectories are used to obtain variable amplitudes and frequencies. Using Fourier series functions, the angular position trajectory of the $i$th joint can be written as

$$q_i(t) = \sum_{n=1}^{N} \left( \frac{a_{n,i}}{w_f\,n}\,\sin(w_f\,n\,t) - \frac{b_{n,i}}{w_f\,n}\,\cos(w_f\,n\,t) \right) + q_{0,i}, \tag{9}$$

where $w_f$, $N$ and $q_{0,i}$ are the base frequency, the number of harmonics and the initial joint angle, respectively. $a_{n,i}$ and $b_{n,i}$ are the trigonometric term constants, and $2N+1$ parameters have to be identified per joint. Since the joint angle ranges and workspace of the DexterTM manipulator are limited, another constrained nonlinear optimization is performed to generate sufficiently rich trajectories. Minimizing the condition number of the regression matrix $H_e$ [33] is defined as the objective function, subject to joint and pose constraints:

$$\arg\min \; \mathrm{cond}(H_e) \quad \text{subject to} \quad q_{min} \le q \le q_{max},\;\; \dot{q}_{min} \le \dot{q} \le \dot{q}_{max},\;\; \ddot{q}_{min} \le \ddot{q} \le \ddot{q}_{max},\;\; s_{min} \le s \le s_{max}, \tag{10}$$

where $s$ represents the task-space end-effector position.
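Evaluating a Fourier excitation trajectory of the form in (9), together with its analytic velocity and acceleration, can be sketched as follows; the coefficients here are arbitrary illustrations, not the optimized DexterTM values.

```python
import numpy as np

def fourier_trajectory(t, a, b, w_f, q0):
    """Finite Fourier series trajectory for one joint, as in eq. (9).

    a, b : length-N harmonic coefficients; w_f : base frequency [rad/s];
    q0 : initial joint offset. Returns position, velocity, acceleration.
    """
    t = np.atleast_1d(np.asarray(t, dtype=float))
    n = np.arange(1, len(a) + 1)
    wn = w_f * n
    s, c = np.sin(np.outer(t, wn)), np.cos(np.outer(t, wn))
    q = (s * (a / wn) - c * (b / wn)).sum(axis=1) + q0       # position
    qd = (c * a + s * b).sum(axis=1)                         # d/dt of q
    qdd = (-s * a * wn + c * b * wn).sum(axis=1)             # d^2/dt^2 of q
    return q, qd, qdd

# Illustrative coefficients: N = 5 harmonics, 0.1 Hz base frequency,
# so the trajectory is periodic over 10 s.
w_f = 2 * np.pi * 0.1
a = np.array([0.3, -0.1, 0.05, 0.02, -0.01])
b = np.array([0.2, 0.1, -0.05, 0.01, 0.02])
t = np.linspace(0.0, 10.0, 1001)
q, qd, qdd = fourier_trajectory(t, a, b, w_f, q0=0.5)
```

In the full procedure these coefficients would be the decision variables of the condition-number minimization in (10), checked against the joint and workspace bounds at every time sample.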

4.3. Position Control System Design of DexterTM

Advanced control methods are required for highly nonlinear robots like DexterTM for accurate and efficient trajectory execution. The computed torque control method with linearization for compensating the robot dynamics is one of the common nonlinear control methods [34]. Instead of altering the DexterTM’s internal low-level torque control architectures, an offline calculated feed-forward computed torque controller is utilized in this research.
Based on the desired joint trajectories $(q_d, \dot{q}_d, \ddot{q}_d)$, the predicted joint torques are obtained offline using (1). The controller then applies these predicted joint torques to the DexterTM joints together with a Proportional–Integral–Derivative (PID) control, as shown in Figure 7. The joint control command $u$ for the DexterTM system can be written as follows:

$$u = M(q_d)\,\ddot{q}_d + C(q_d,\dot{q}_d)\,\dot{q}_d + G(q_d) + \tau_f(\dot{q}_d) + \tau_e(q_d) + K_p\,e + K_d\,\dot{e} + K_i \int e\,dt, \tag{11}$$

where the joint error $e$ is defined as $q_d - q$ and $K_p$, $K_d$, $K_i$ are the PID gains.
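The feed-forward-plus-PID structure can be illustrated on a toy 1-DoF plant. Everything below is a made-up stand-in (a scalar "mass + damping + gravity" model with a deliberately imperfect identified copy), not Dexter's dynamics; it only shows how the offline feed-forward torque and the PID correction combine as in the control law above.

```python
import numpy as np

# Toy 1-DoF "robot" standing in for the full model (1):
#   tau = m * qddot + d * qdot + g.
m_true, d_true, g_true = 2.0, 0.5, 1.0
# Identified model used for the feed-forward term (deliberately slightly off).
m_hat, d_hat, g_hat = 1.9, 0.45, 0.95

Kp, Kd, Ki = 400.0, 40.0, 100.0   # illustrative PID gains
dt, T = 1e-3, 3.0
q, qdot, e_int = 0.0, 1.0, 0.0    # start on the desired trajectory sin(t)
err = []
for k in range(int(T / dt)):
    t = k * dt
    q_des, qdot_des, qddot_des = np.sin(t), np.cos(t), -np.sin(t)
    # Feed-forward torque computed from the identified model (offline in the
    # real system), plus PID correction on the tracking error.
    tau_ff = m_hat * qddot_des + d_hat * qdot_des + g_hat
    e, edot = q_des - q, qdot_des - qdot
    e_int += e * dt
    u = tau_ff + Kp * e + Kd * edot + Ki * e_int
    # Integrate the *true* plant with semi-implicit Euler.
    qddot = (u - d_true * qdot - g_true) / m_true
    qdot += qddot * dt
    q += qdot * dt
    err.append(abs(e))
max_err = max(err)
```

Even with the mismatched model, the feedback terms keep the tracking error small; the better the identified parameters, the smaller the torque the PID has to supply.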

4.4. Motion Planning and Trajectory Generation

To generate optimal motion trajectories for the DexterTM robot to reach desired positions, the Open Motion Planning Library’s (OMPL’s) Rapidly exploring Random Tree (RRT*) algorithm is utilised through ROS [35]. The implemented motion planner considers the joint limits, singularities and self-collisions of the DexterTM manipulator. To avoid collisions with objects in the DexterTM workspace, a 3D occupancy grid map (Octomap) of the environment is also generated and included in the planner. The real-time point cloud input of the environment is obtained from the depth cameras. Figure 8 presents the point cloud of the environment generated on the test setup described in Section 2.2 for optimal trajectory generation with collision avoidance.
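The core idea of the occupancy-grid step can be sketched without ROS: rasterize the point cloud into voxels, then test candidate motion segments against occupied cells. This is a deliberately minimal illustration (a dense boolean grid and straight-line checks, with a hypothetical box-shaped obstacle), whereas Octomap stores probabilistic occupancy in an octree and the planner checks full link geometry.

```python
import numpy as np

def voxelize(points, origin, size, voxel):
    """Minimal occupancy grid from a point cloud: mark each voxel that
    contains at least one point as occupied."""
    grid = np.zeros(size, dtype=bool)
    idx = np.floor((points - origin) / voxel).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(size)), axis=1)
    grid[tuple(idx[ok].T)] = True
    return grid

def segment_in_collision(grid, origin, voxel, p0, p1, n=50):
    """Sample a straight-line segment and test it against occupied voxels."""
    pts = np.linspace(p0, p1, n)
    idx = np.floor((pts - origin) / voxel).astype(int)
    ok = np.all((idx >= 0) & (idx < np.array(grid.shape)), axis=1)
    return bool(grid[tuple(idx[ok].T)].any())

# Hypothetical scene: a box-shaped obstacle sampled as a point cloud,
# voxelized at 5 cm resolution over a 1 m cube.
rng = np.random.default_rng(2)
obstacle = rng.uniform([0.4, 0.4, 0.0], [0.6, 0.6, 0.3], size=(500, 3))
grid = voxelize(obstacle, origin=np.zeros(3), size=(20, 20, 20), voxel=0.05)

hit = segment_in_collision(grid, np.zeros(3), 0.05,
                           np.array([0.0, 0.5, 0.1]), np.array([1.0, 0.5, 0.1]))
free = segment_in_collision(grid, np.zeros(3), 0.05,
                            np.array([0.0, 0.5, 0.8]), np.array([1.0, 0.5, 0.8]))
```

A sampling planner such as RRT* runs exactly this kind of collision query on every candidate edge it tries to add to the tree.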

5. Autonomous Object Grasping and Packing

Autonomous detection of objects from a pile, their characterisation and their packing using a robotic system require efficient algorithms and their integration into a single framework. The autonomous framework developed in this research (Figure 3) around the DexterTM system utilizes various AI-based methodologies to perform nuclear sort and segregation tasks in a single pipeline. The methodologies used for identifying the objects, determining how and where to grasp them, material characterisation, radiation scanning of the objects and optimal packing are presented in this section.

5.1. Object Detection, Material Characterization and Grasp Point Generation

The vision system of the autonomous DexterTM consists of two stationary SICK Visionary-S RGBD sensors attached to the sensor frames, as shown in Figure 2b. These industrial sensors are traditionally used for verifying box packing and are ideal for identifying a large number of items simultaneously with high-frequency operations. Two SICK sensors are carefully placed in the DexterTM workspace such that a single point cloud of the whole scene can be generated from two RGBD images, which almost completely describes the scene in 3D. The next step in object detection and classification involves using efficient AI-based image processing algorithms.
In order to improve performance and avoid integration complexities in terms of system requirements and dependencies, segmentation, classification and material characterization are implemented together using the SoftGroup [36] AI model. The 3D instance segmentation model, SoftGroup, also provides the flexibility to isolate the point cloud of a target object held in the robot gripper, enabling efficient single-object characterization rather than characterization only from piles. Around 130 different objects commonly seen in nuclear waste sites were considered, and over 400 scans of these objects were carried out to formulate the training dataset for the SoftGroup model. Each scan was created by both SICK cameras taking an RGBD image simultaneously, which was then processed into a single point cloud of the whole scene. Along with the scans of piles of 20–40 objects on the sorting table, scans of objects held by DexterTM above the sorting table were also collected at different orientations. In the preprocessing stage, ground truth segmentation masks and classification tags consisting of object category, material, mass, volume and density were manually annotated. For material categorisation of the objects, the object category and material classifications are merged to encode both geometric/semantic and material information. For example, instead of having “steel” and “aluminium” as material classes and “rod” and “metal scrap” as category classes, “steel rod”, “steel scrap”, “aluminium rod” and “aluminium scrap” are considered separate independent classes. In this way, a single SoftGroup model performs all segmentation and classification tasks for the whole system, significantly reducing the system’s complexity and running time without sacrificing the model’s prediction accuracy.
Once the objects in the pile are detected and identified, the next step is to identify feasible grasp poses [37] on the objects in order to perform grasping by the DexterTM. An object grasp pose generator pipeline is developed using Contact-GraspNet [38], which takes the segmented and classified point cloud as the input.

5.2. Geometry Characterisation and Mass Estimation

Objects’ geometric information, such as length, width, height, surface area, and volume, is another important characteristic required for optimal packing. This information is crucial, along with the object type and name, for keeping a catalogue of objects in a packed container in nuclear applications, especially if the contents are radioactive. The geometric characterisation is performed on objects in a pile and also on objects held by the gripper for revalidation. A sequence of operations is performed on the point cloud generated by the two cameras, which outputs a watertight mesh of the object under consideration for geometric characterisation. Generally, with two RGBD cameras, a complete capture of all sides of the object is not possible, and the corresponding mesh is unreliable for accurate characterisation. In the first step, the combined 3D point cloud obtained from the two cameras is downsampled and an interpolation is performed to fill the vacant space in the object point cloud. Next, a simple mesh (dense 3D geometry) is created from the interpolated point cloud using the ball pivoting surface reconstruction method in Open3D [39]. To calculate the volume of objects, a watertight mesh consisting of a continuous and closed surface is needed. Using the manifold method in Open3D, the watertight meshes are generated, and finally, the geometric information of the object is obtained.
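Watertightness matters because it is what makes the enclosed-volume integral well defined. A minimal NumPy sketch of that computation (equivalent in spirit to what Open3D's `TriangleMesh.get_volume()` returns for a watertight mesh), using the divergence theorem over signed tetrahedra; the example tetrahedron is illustrative.

```python
import numpy as np

def mesh_volume(vertices, triangles):
    """Volume enclosed by a watertight triangle mesh via the divergence
    theorem: sum the signed tetrahedra formed by each face and the origin.
    Triangles must be consistently wound (outward-facing normals)."""
    v = vertices[triangles]  # (T, 3, 3): per-face vertex coordinates
    signed = np.einsum('ij,ij->i', v[:, 0], np.cross(v[:, 1], v[:, 2])) / 6.0
    return abs(signed.sum())

# Unit right tetrahedron: analytical volume = 1/6
verts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
tris = np.array([[0, 2, 1], [0, 1, 3], [0, 3, 2], [1, 2, 3]])
print(mesh_volume(verts, tris))  # ≈ 1/6
```

An open (non-watertight) mesh has no such well-defined interior, which is why the interpolation and manifold steps precede the volume estimate.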
On top of identifying the object mass from camera input, a direct mass estimation model using the DexterTM robot is also developed for better accuracy. For this, 25 different objects with known masses and the torque values of DexterTM joint-1 required to lift them are considered. After the object is grasped, DexterTM is moved to a specific predefined configuration to estimate the mass. Using the ground truth masses and corresponding torque values, a fourth-order function is fitted using the linear least-squares curve fitting method (Figure 9). Using this mapping, the object’s mass m is obtained from the following function:
m = 0.0002 τ₁⁴ − 0.0013 τ₁³ − 0.0154 τ₁² + 0.3462 τ₁ − 0.6673,
where τ₁ is the measured joint-1 torque required to lift the object.

5.3. Radiological Surveying and Decision

In a nuclear sort and segregation application, one of the crucial tasks is the efficient detection of objects’ radioactivity levels. In this research, real radioactive objects were not available for testing; therefore, a radiological surveying pipeline is designed in which random activity levels are assigned to segmented items in the point cloud and radiation measurements are simulated on a mock-up radiation detector held by the DexterTM robot. For radiation scanning, a 2D end-effector trajectory is generated at a fixed height above the tray where the objects to be picked are placed. Although the trajectory planner aims to scan the whole tray, it concentrates on the object locations provided by the point cloud and segmentation outputs to reduce scan time. DexterTM performs slow and detailed scanning movements near the objects and, finally, if the calculated confidence for waste sentencing is low, random scanning is performed over the tray to obtain a more accurate object activity result.
The radiation detector demonstrated here is Createc’s N-Visage Recon. As this is a gamma detector, the alpha and beta content of waste items is determined in this example through the application of radionuclide fingerprints. The use of an alpha/beta monitor was also explored; the positional accuracy of DexterTM showed surface monitoring with this type of monitor to be possible for simple flat surfaces, though challenging for more complex surfaces. The output activity for each waste item is obtained through calculation in Createc’s N-Visage algorithm [40], where measurement uncertainty and Minimum Detectable Activity (MDA) are determined through a Monte Carlo-based random sampling method and application of the Currie equation [41]. DexterTM uses this activity model to sort each object into one of the ILW, LLW, or recycling trays.
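The Currie-equation part of the MDA calculation can be sketched as below. The efficiency-and-time normalisation is a textbook simplification, and the parameter values are illustrative; the actual N-Visage model additionally folds in Monte Carlo sampling of measurement uncertainty.

```python
import math

def currie_mda(background_counts: float, efficiency: float,
               count_time_s: float) -> float:
    """Minimum Detectable Activity (Bq) via the Currie equation.
    Detection limit in counts: L_D = 2.71 + 4.65 * sqrt(B), where B is the
    expected background count. Converting counts to activity divides by
    detector efficiency and counting time (a simplified response model).
    """
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * count_time_s)

# e.g. 400 background counts, 5% detection efficiency, 60 s dwell time
print(round(currie_mda(400, 0.05, 60.0), 2))
```

Slower dwells near objects lower the MDA (longer `count_time_s`), which is one motivation for the detailed scanning passes described above.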

5.4. Bin Packing

Efficient bin packing is another challenge faced in the nuclear industry. The strategy of dropping objects randomly into the bin often results in instability of the heterogeneous objects inside the bin, uneven weight distribution and inefficient space utilization. An efficient container packing algorithm, Tetris, is utilized for autonomous bin packing using the DexterTM robot [42]. The algorithm calculates the quasi-optimal position and orientation at which to place each picked object in the destination container. Initially, using the 2D Tetris algorithm, the bottom layer of the bin is filled using a target function that minimizes the total unoccupied bottom surface area of the container. This target function places the objects as close as possible to the bottom wall and also minimizes holes in the bottom surface area of the container. Next, based on the shape of the object in hand and the landscape of the existing objects in the container, another target function executes Tetris in the z-direction, trying to maximize the number of objects packed while minimizing the gap volume of the container. With this target function, each object is placed in the lowest feasible region of the container while minimizing the void volume created underneath it, as this volume cannot be recovered.
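The z-direction target function can be sketched on a discretised container heightmap. The rectangular-footprint model and the lexicographic (resting height, trapped void) scoring are illustrative simplifications of the Tetris algorithm, not its actual implementation.

```python
import numpy as np

def place_score(heightmap, footprint):
    """Score every axis-aligned placement of a rectangular object footprint
    on a container heightmap: prefer the lowest resting height, then the
    least void trapped underneath. Returns the best (row, col) cell and
    its resting height."""
    fr, fc = footprint
    H, W = heightmap.shape
    best, best_key = None, None
    for r in range(H - fr + 1):
        for c in range(W - fc + 1):
            patch = heightmap[r:r + fr, c:c + fc]
            rest = float(patch.max())            # object settles on the highest point
            void = float((rest - patch).sum())   # unrecoverable gap underneath
            key = (rest, void)                   # lexicographic: height first, then void
            if best_key is None or key < best_key:
                best, best_key = (r, c), key
    return best, best_key[0]

hm = np.array([[2., 2., 2.],
               [2., 0., 0.],
               [2., 0., 0.]])
print(place_score(hm, (2, 2)))  # ((1, 1), 0.0) -- the empty corner
```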

6. Experimental Evaluation

This section presents the experimental evaluation of the proposed sort-and-segregation framework, outlined in Section 3, using the DexterTM manipulator. The experimental setup with cameras and sorting tables is shown in Figure 2. The initial part of this section presents the results of DexterTM dynamical model parameter identification. To verify its effectiveness, the tracking performance is studied by implementing a position controller. Next, the results of the sort and segregation subsystems, presented in Section 5, are provided. Finally, the integrated system performance is evaluated through successful nuclear sort-and-segregation experiments on objects similar to those found in the nuclear industry.

6.1. Parameter Identification and Model Validation

The input excitation trajectories for dynamical model parameter identification of the DexterTM manipulator are generated by solving (10) using convex optimization, as explained in Section 4.2. The generated reference trajectories induce the modelled dynamical effects, friction and elasticity, of the DexterTM manipulator. The corresponding joint positions are shown in Figure 10.
Using these reference trajectories, the DexterTM joints are actuated using the position controller shown in Figure 7, where approximate dynamical parameters obtained from the DexterTM CAD model were used. To find the exact dynamical model parameters, the executed joint torques and angular motion information (q, q̇, q̈) are first collected and stacked across each time step as the training data set. Using these data, a suitable set of dynamic parameters is estimated through convex optimization using (8). The designed position controller (11) is tested along with the updated dynamical parameters to accomplish autonomous DexterTM control, as shown in Figure 11. Although there are minimal deviations during the motion due to the nonlinearity caused by its cable-driven nature and unmodelled dynamic effects, the controller showed good performance, enabling the desired target to be achieved.
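The stacking-and-estimation step can be sketched with a toy regressor. The real identification solves a constrained convex program (Equation (8)) to keep the parameters physically feasible, and the regressor is the manipulator's Y(q, q̇, q̈); here a random matrix and plain least squares stand in for both.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the stacked identification problem tau = Y(q, qd, qdd) @ theta:
# each time step contributes one block row to the regressor Y.
theta_true = np.array([1.2, 0.4, 0.05])             # e.g. inertia, viscous, Coulomb terms
Y = rng.normal(size=(200, 3))                       # stacked regressor over 200 samples
tau = Y @ theta_true + 0.01 * rng.normal(size=200)  # measured torques (noisy)

# Ordinary least squares; the paper instead constrains the solution so the
# recovered inertial parameters remain physically consistent.
theta_hat, *_ = np.linalg.lstsq(Y, tau, rcond=None)
print(np.round(theta_hat, 3))
```

With sufficiently exciting trajectories the stacked regressor is well conditioned and the estimate converges to the true parameters as samples accumulate.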
Table 2 compares the torque tracking errors and their deviations for all joints on the generated training and test trajectories. Even though the joint-2 and joint-3 errors are slightly higher, the overall percentage errors and standard deviations are minimal, resulting in less than 0.7 cm mean end-effector positional error through the feed-forward nonlinear control. These results confirm the accuracy of the identified dynamical model parameters.

6.2. Sort and Segregation Subsystems

The results of the sort and segregation operation performed in a real environment using the experimental setup explained in Section 2.2 are provided below. Integrating the individual subsystems led to the expected full-system performance, which successfully sorted and segregated various objects from a heap.

6.2.1. 3D Instance Segmentation and Classification

The pipeline for the sort and segregation process starts by generating two point clouds of the environment using the two SICK sensors. Initially, the two point clouds are processed into a single point cloud of the whole scene using a conventional Multiview Stereo (MVS) reconstruction method [43], followed by denoising. The denoising is required to overcome the intrinsic device limitation of the SICK sensors, especially when measuring highly reflective surfaces. The developed SoftGroup model takes the combined point cloud as input and performs segmentation and classification of each object, whether in the pile or in DexterTM’s gripper. An example of such a pair of acquired RGB-D images for a scene, the merged single point cloud and the result of the SoftGroup model-based pipeline are shown in Figure 12. It can be seen that most objects were cleanly segmented, and the safety hat in the gripper was correctly classified (other objects were also classified by the model but are not illustrated in the figure for brevity). A second level of object classification, while DexterTM holds the object and rotates it to three predefined orientations, is also performed by extracting its isolated point cloud. The SoftGroup model predicts the probability vector of the target object belonging to each candidate category. The pipeline then reports the most likely category of the target object as the category with the highest probability averaged across the predictions. The target object’s point cloud is extracted from each scan orientation using the segmentation mask predicted by the SoftGroup model and then merged to obtain a complete point cloud of the object. Figure 13 shows the complete extracted point cloud of a safety hat, and also its overall categorical probability histogram as averaged across the detailed scans’ model predictions. A sample of the objects used and their categories are provided in Table 3.
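The orientation-averaged classification step amounts to averaging the per-scan probability vectors and taking the argmax. The class names and probability values below are illustrative, not the model's actual outputs.

```python
import numpy as np

# Per-orientation class probability vectors predicted for one grasped object
# (3 detailed scans x 4 hypothetical candidate categories).
CLASSES = ["safety hat", "steel scrap", "plastic hose", "rod"]
probs = np.array([[0.70, 0.10, 0.15, 0.05],
                  [0.55, 0.25, 0.15, 0.05],
                  [0.80, 0.05, 0.10, 0.05]])

# Average across scan orientations, then report the most likely category.
mean_p = probs.mean(axis=0)
print(CLASSES[int(mean_p.argmax())])  # safety hat
```

Averaging before the argmax makes the decision robust to a single ambiguous viewpoint, which is the point of rotating the object to multiple predefined orientations.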

6.2.2. Radiological Surveying

After scanning the pile and detecting objects, radiological surveying is conducted. Initially, DexterTM grasps the radiation detector and performs the scanning operation according to the scanning trajectory calculated offline. The radiation scanning trajectory is generated based on the segmented object positions in the point cloud of the pile. Figure 14 demonstrates the generated end-effector trajectories, which perform slow scanning near objects and a final random scanning if confidence in radiation measurement and subsequent waste sentencing route is determined to be low.
Due to the inherent safety risks and strict regulatory requirements associated with handling real radioactive waste, conducting experiments in an actual radiological environment is not feasible. Direct testing with nuclear waste would require specialized containment facilities, extensive safety protocols and regulatory approvals, which significantly limit accessibility and experimental flexibility. Additionally, the re-usability of test items is impractical, as exposure to real radiation would render them hazardous for further testing or modification. Given these constraints, experiments relied on simulated radiation data, where random radioactivity levels are assigned to objects to mimic real-world conditions. The implemented model generates synthetic detector measurements to create a radioactivity map superimposed on the point cloud while the trajectory is being executed. Item radiation activity is then matched with associated detected objects. The video (https://youtu.be/v2hlDxKDGcM, (accessed on 10 February 2025)) demonstrates a radiation-scanning experiment post object detection. Future work may explore testing in specialized nuclear facilities to further validate system performance.
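A minimal sketch of such a synthetic measurement model, assuming a simple inverse-square detector response with additive background; the function name, sensitivity and background values are assumptions for illustration, not the implemented simulator.

```python
import numpy as np

def simulated_count_rate(detector_pos, object_pos, activities_bq,
                         sensitivity=1e-4, background=5.0):
    """Synthetic gamma count rate at a detector pose: background plus each
    object's (randomly assigned) activity attenuated by inverse-square
    distance. A stand-in for real measurements along the scan trajectory.
    """
    d2 = np.sum((object_pos - detector_pos) ** 2, axis=1)
    return background + sensitivity * np.sum(activities_bq / np.maximum(d2, 1e-6))

objs = np.array([[0.2, 0.0, 0.0], [1.0, 1.0, 0.0]])  # segmented object centres (m)
acts = np.array([5e4, 1e3])                          # randomly assigned activities (Bq)
det = np.array([0.0, 0.0, 0.1])                      # detector pose on the trajectory
print(round(simulated_count_rate(det, objs, acts), 1))
```

Evaluating this along the planned trajectory yields the radioactivity map superimposed on the point cloud, from which item activity is matched to detected objects.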

6.2.3. Grasp Pose Generation

The core grasp-pose candidate generation method used in this research is Contact-GraspNet (Section 5.1). On top of it, a heuristic system is implemented on the generated grasp candidates to create higher-quality and deterministic/reproducible grasp poses in cluttered piles and extreme cases such as small Lego blocks. This grasp generation system prioritizes grasp poses on target objects that have larger top surface areas and sit higher in the pile. This strategy helped to pick objects without much disturbance to the pile and to identify objects that were hidden in the previous scan. It also prioritizes grasp poses closer to the geometric centre of each object, as such grasp poses give the object a lower chance of rotating, tipping over, or slipping out of the gripper during grasping and after being picked up. Figure 15 shows the top three highest-priority grasp poses generated by the algorithm for two different clutter piles.
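The heuristic ranking can be sketched as a weighted score over the features named above; the weights, feature names and candidate values are assumptions for illustration.

```python
import numpy as np

def rank_grasps(grasps, w_height=1.0, w_area=0.5, w_center=2.0):
    """Rank grasp candidates best-first using the described heuristics:
    higher in the pile and larger top surface score better, while a larger
    offset from the object's geometric centre scores worse.

    grasps: list of dicts with keys 'top_z' (m), 'top_area' (m^2) and
            'center_offset' (m, grasp point to object centroid).
    """
    scores = [w_height * g["top_z"] + w_area * g["top_area"]
              - w_center * g["center_offset"] for g in grasps]
    return [int(i) for i in np.argsort(scores)[::-1]]

candidates = [
    {"top_z": 0.10, "top_area": 0.02, "center_offset": 0.08},  # low, off-centre
    {"top_z": 0.25, "top_area": 0.05, "center_offset": 0.01},  # top of pile, centred
    {"top_z": 0.25, "top_area": 0.01, "center_offset": 0.06},  # top of pile, edge grasp
]
print(rank_grasps(candidates))  # [1, 2, 0]
```

Unlike the raw network scores, this deterministic ranking is reproducible across runs, which matters for auditable nuclear workflows.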

6.2.4. Object Characterisation

The pipeline for object geometry and physical characterization is based on the conversion of the object’s point cloud into a watertight mesh body, as explained in Section 5.2. Figure 16 shows two examples of the extracted point clouds: a wellington boot on the table and a plastic hose held by DexterTM. Using Open3D’s inbuilt functionality, approximate geometric information of the object is obtained from the watertight mesh. On comparing the accuracy against ground truth manual measurements of over 50 objects, close agreement was observed except for objects with complex shapes, in particular those with internal hollowness or folded structures. The estimated masses of objects were calculated using the volume estimated with Open3D, the object material estimated from the category classification and the corresponding material density values. In addition, an estimate of the mass of the object held by the DexterTM gripper is also calculated based on the torque feedback of the motors. The torque-based measurements were closer to the ground truth masses. Hence, the vision-based mass estimate is used as redundant information for secondary validation.

6.3. Bin Packing Evaluation

Four “human benchmark” packing algorithms were analysed and tested in simulation to validate the efficacy of the developed Tetris algorithm. The benchmarks are (1) the Center Algorithm, in which a randomly oriented object is dropped at the centre of the container; (2) the Random Algorithm, in which a randomly oriented object is dropped at a random position in the container; (3) the Lower-Bound Algorithm [44]; and (4) the Orthogonal Low Algorithm [45]. Each algorithm was run ten times on the same pile of objects. Table 4 presents the average performance of the algorithms for comparison.
The unoccupied volume (voidage) of the container is used as the scoring method for comparison and is calculated as voidage = volume of container − total volume of objects. The packed container volume in the experiments is 43.261 L. Although the Tetris algorithm is slower than the other algorithms, it achieves the optimal packing with the minimum voidage rate. The result of bin packing using the developed Tetris algorithm in a real experiment is shown in Figure 17. In this experiment, a total of 40 items are placed into the container of total volume 43.261 L, leaving a voidage of 19.221 L. A container voidage of 44% is obtained in this experiment, with an average pose generation run time of 0.23959 s.
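The reported figures can be checked directly from the voidage definition; this bookkeeping sketch uses the values quoted in the experiment.

```python
# Voidage bookkeeping for the reported experiment: a 43.261 L container
# packed with 40 items leaving 19.221 L unoccupied.
container_l = 43.261
voidage_l = 19.221

objects_l = container_l - voidage_l              # total volume of packed objects
voidage_pct = 100.0 * voidage_l / container_l    # fraction of container left empty
print(f"{objects_l:.3f} L packed, {voidage_pct:.0f}% voidage")  # 24.040 L packed, 44% voidage
```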

6.4. Full System Demonstration

The designed sort and segregation subsystems and the DexterTM control systems are integrated into a full system demonstrator, as shown in Figure 18. It consists of a DexterTM manipulator, two SICK Visionary-S cameras, one sorting tray, and LLW, ILW and recyclable trays. The full system provides a continuous sequential operation of sub-tasks, as shown in Figure 19. The examples provided in Figure 19 show the complete execution of the integrated system, from picking to dropping, for various objects.
Multiple experiments were performed, and the efficacy of the developed system and framework was observed through the successful sorting and segregation of the objects used. The results showcase the successful and efficient segregation of objects into the LLW, ILW and recyclable trays. The database created contains the objects utilized, their properties and parameters (Table 5) and a summary of each container (Table 6). The video (https://youtu.be/iIb8am8WkWs, (accessed on 10 February 2025)) presents a complete sort and segregation system demonstration for a pile of objects. The video shows that the designed system can successfully sort and segregate nuclear waste. It also shows the process being completed by switching to teleoperation mode after a manipulation error caused by a large object in the pile that could not be brought into the scanning position.

7. Conclusions

Efficient and scalable robotic methodologies are required to overcome the entry barriers for autonomous systems in challenging environments like the nuclear industry. An autonomous robotic sort and segregation framework for nuclear waste handling was presented by introducing autonomy into DexterTM, a purely teleoperated nuclear robot. The framework relies on dynamical model parameter identification of DexterTM through Fourier series-based excitation trajectories for autonomous operations, and on the integration of methodologies for object identification, classification, characterisation and radiological surveying through ROS. For more complex grabs and tangled waste, the integrated system can call on the expertise of a skilled operator to take over the task through teleoperation. The effectiveness of this framework was verified by safely performing several sort and segregation operations on a mock-up nuclear waste test-bed using waste materials similar to those found in the nuclear industry. A new database of the objects packed into different bins and their properties was also generated and catalogued for future reference. The results presented in this paper represent a paradigm shift in enabling the adoption of autonomy, in addition to automation, in regulated sectors.
The proposed framework is designed to be scalable and adaptable to larger and more complex waste-handling scenarios. The system identification and dynamic modelling methodology developed for DexterTM can be generalized to other robotic manipulators with similar actuation principles. In highly unstructured and unpredictable waste-handling scenarios, fully autonomous sorting may not always be feasible. The framework facilitates seamless switching between autonomy and teleoperation, ensuring that complex or ambiguous tasks can still benefit from human expertise while leveraging automation where possible. This shared autonomy paradigm enhances safety and efficiency, reducing operator workload while ensuring precise task execution. Future work will investigate automatic impedance learning methodologies, for example [46], to reduce the effort required to guide the robot in dynamic environments, and a performance comparison with similar systems for extreme environments.

Author Contributions

Project technical lead: P.S.; conceptualization: M.R., A.G.E., M.H. and M.M.-R.; control methodology and software development: M.P., O.F.A. and R.K.; image processing software and integration: M.P., W.Y. and L.P.G.; GUI: I.M., S.O. and K.Y.; writing—original draft preparation: M.P. and O.F.A.; review and editing: M.R.; funding acquisition, project administration and supervision: M.R., A.G.E., M.H., P.S., M.M.-R. and A.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Innovate UK SBRI Competition—Sort and Segregate Nuclear Waste (Project Id. 10014065).

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors would like to thank Sarah Banfield, the lead project manager at Veolia Nuclear Services, for her outstanding support.

Conflicts of Interest

The authors Ryan King, Phil Saunderson and Mike Moulin-Ramsden were employed by the company Veolia Nuclear Solutions. The authors Wen Yang, Laura Palacio García, Iain Mackay, Abhishek Mishra, Sho Okamoto and Kelvin Yeung were employed by the company Faculty. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest. The funders had no role in the design of the study; in the collection, analysis, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

Abbreviations

The following abbreviations are used in this manuscript:
ROS	Robot Operating System
AI	Artificial Intelligence
NDA	Nuclear Decommissioning Authority
UK	United Kingdom
HLW	High-Level Waste
ILW	Intermediate-Level Waste
LLW	Low-Level Waste
VLLW	Very Low-Level Waste
JET	Joint European Torus
DoF	Degrees of Freedom
VNS	Veolia Nuclear Solutions
TG	Task Group
GUI	Graphical User Interface
DH	Denavit–Hartenberg
PID	Proportional–Integral–Derivative
RRT	Rapidly Exploring Random Tree
MDA	Minimum Detectable Activity
CAD	Computer-Aided Design
MVS	Multi-View Stereo
OMPL	Open Motion Planning Library
PPE	Personal Protective Equipment

References

1. Yuldashev, N.; Saidov, M. The Economy of the Countries of the World is Experiencing the Need for Nuclear Power Plants. Am. J. Econ. Bus. Manag. 2023, 6, 86–99.
2. Deng, D.; Zhang, L.; Dong, M.; Samuel, R.E.; Ofori-Boadu, A.; Lamssali, M. Radioactive waste: A review. Water Environ. Res. 2020, 92, 1818–1825.
3. Baldwin, N. Remediating Sellafield: A New Focus for the Site. In Proceedings of the International Conference on Radioactive Waste Management and Environmental Remediation, Tucson, AZ, USA, 23–27 February 2003; Volume 37327, pp. 35–40.
4. Nagatani, K.; Kiribayashi, S.; Okada, Y.; Otake, K.; Yoshida, K.; Tadokoro, S.; Nishimura, T.; Yoshida, T.; Koyanagi, E.; Fukushima, M.; et al. Emergency response to the nuclear accident at the Fukushima Daiichi Nuclear Power Plants using mobile rescue robots. J. Field Robot. 2013, 30, 44–63.
5. NDA. UK Radioactive Waste Inventory. Available online: https://ukinventory.nda.gov.uk/ (accessed on 31 January 2024).
6. NDA. Integrated Waste Strategy. Available online: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/838828/Radioactive_Waste_Strategy_September_2019.pdf/ (accessed on 31 January 2024).
7. Teunckens, L.; Walthéry, R.; Lewandowski, P.; Millen, D.; Baumann, S. Individual Protection Equipment and Ergonomics Associated with Dismantling Operations in a Hostile Environment. In Proceedings of the International Conference on Radioactive Waste Management and Environmental Remediation, American Society of Mechanical Engineers, Bruges, Belgium, 30 September–4 October 2001; Volume 80166, pp. 583–587.
8. Tokatli, O.; Das, P.; Nath, R.; Pangione, L.; Altobelli, A.; Burroughes, G.; Jonasson, E.T.; Turner, M.F.; Skilton, R. Robot-assisted glovebox teleoperation for nuclear industry. Robotics 2021, 10, 85.
9. Buckingham, R.; Graham, A. Dexterous manipulators for nuclear inspection and maintenance—Case study. In Proceedings of the 2010 1st International Conference on Applied Robotics for the Power Industry, Montreal, QC, Canada, 5–7 October 2010; pp. 1–6.
10. Iqbal, J.; Tahir, A.M.; ul Islam, R.; Riaz-un-Nabi. Robotics for nuclear power plants—Challenges and future perspectives. In Proceedings of the 2012 2nd International Conference on Applied Robotics for the Power Industry (CARPI), Zurich, Switzerland, 11–13 September 2012; pp. 151–156.
11. Kim, C.H.; Seo, Y.C.; Lee, S.U.; Choi, B.S.; Moon, J.K. Design of a heavy-duty manipulator for dismantling of a nuclear power plant. In Proceedings of the 2015 15th International Conference on Control, Automation and Systems (ICCAS), Busan, Republic of Korea, 13–16 October 2015; pp. 1154–1158.
12. Tsitsimpelis, I.; Taylor, C.J.; Lennox, B.; Joyce, M.J. A review of ground-based robotic systems for the characterization of nuclear environments. Prog. Nucl. Energy 2019, 111, 109–124.
13. Sanders, S. Remote operations for fusion using teleoperation. Ind. Robot. Int. J. Robot. Res. Appl. 2006, 33, 174–177.
14. Bakari, M.J.; Zied, K.M.; Seward, D.W. Development of a multi-arm mobile robot for nuclear decommissioning tasks. Int. J. Adv. Robot. Syst. 2007, 4, 387–406.
15. Montazeri, A.; Ekotuyo, J. Development of dynamic model of a 7DOF hydraulically actuated tele-operated robot for decommissioning applications. In Proceedings of the 2016 American Control Conference (ACC), Boston, MA, USA, 6–8 July 2016; pp. 1209–1214.
16. Trevelyan, J.; Hamel, W.R.; Kang, S.C. Robotics in hazardous applications. In Springer Handbook of Robotics; Springer: Berlin/Heidelberg, Germany, 2016; pp. 1521–1548.
17. Han, Z.; Tian, H.; Meng, F.; Wen, H.; Ma, R.; Duan, X.; Zhang, Y.; Liu, C. Design and experimental validation of a robotic system for reactor core detector removal. In Proceedings of the 2021 IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 2473–2479.
18. Lu, W.; Chen, J. Computer vision for solid waste sorting: A critical review of academic research. Waste Manag. 2022, 142, 1023–1040.
19. Bohg, J.; Morales, A.; Asfour, T.; Kragic, D. Data-driven grasp synthesis—A survey. IEEE Trans. Robot. 2013, 30, 289–309.
20. Xie, C.; Xiang, Y.; Mousavian, A.; Fox, D. Unseen object instance segmentation for robotic environments. IEEE Trans. Robot. 2021, 37, 1343–1359.
21. Kiyokawa, T.; Katayama, H.; Tatsuta, Y.; Takamatsu, J.; Ogasawara, T. Robotic waste sorter with agile manipulation and quickly trainable detector. IEEE Access 2021, 9, 124616–124631.
22. Leveziel, M.; Laurent, G.J.; Haouas, W.; Gauthier, M.; Dahmouche, R. A 4-DoF parallel robot with a built-in gripper for waste sorting. IEEE Robot. Autom. Lett. 2022, 7, 9834–9841.
23. Pan, Z.; Zeng, A.; Li, Y.; Yu, J.; Hauser, K. Algorithms and systems for manipulating multiple objects. IEEE Trans. Robot. 2022, 39, 2–20.
24. Kiyokawa, T.; Takamatsu, J.; Koyanaka, S. Challenges for Future Robotic Sorters of Mixed Industrial Waste: A Survey. IEEE Trans. Autom. Sci. Eng. 2022, 21, 29–43.
25. Hamilton, D.; Preece, G. Development of the MASCOT Telemanipulator Control System; European Fusion Development Agreement: Culham, UK, 2001.
26. Argin, O.F.; Moccia, R.; Iacono, C.; Ficuciello, F. daVinci Research Kit Patient Side Manipulator Dynamic Model Using Augmented Lagrangian Particle Swarm Optimization. IEEE Trans. Med. Robot. Bionics 2024, 6, 589–599.
27. Armstrong-Hélouvry, B.; Dupont, P.; De Wit, C.C. A survey of models, analysis tools and compensation methods for the control of machines with friction. Automatica 1994, 30, 1083–1138.
28. Han, Y.; Wu, J.; Liu, C.; Xiong, Z. An iterative approach for accurate dynamic model identification of industrial robots. IEEE Trans. Robot. 2020, 36, 1577–1594.
29. Siciliano, B.; Sciavicco, L.; Villani, L.; Oriolo, G. Robotics: Modelling, Planning and Control; Springer-Verlag: New York, NY, USA, 2009.
30. Gautier, M. Numerical calculation of the base inertial parameters of robots. J. Robot. Syst. 1991, 8, 485–506.
31. Gaz, C.; Cognetti, M.; Oliva, A.; Giordano, P.R.; De Luca, A. Dynamic identification of the Franka Emika Panda robot with retrieval of feasible parameters using penalty-based optimization. IEEE Robot. Autom. Lett. 2019, 4, 4147–4154.
32. Sousa, C.D.; Cortesao, R. Physical feasibility of robot base inertial parameter identification: A linear matrix inequality approach. Int. J. Robot. Res. 2014, 33, 931–944.
33. Argin, O.F.; Bayraktaroglu, Z.Y. Consistent dynamic model identification of the Stäubli RX-160 industrial robot using convex optimization method. J. Mech. Sci. Technol. 2021, 35, 2185–2195.
34. Khalil, W.; Dombre, E. Modeling Identification and Control of Robots; CRC Press: Boca Raton, FL, USA, 2002.
35. Sucan, I.A.; Moll, M.; Kavraki, L.E. The Open Motion Planning Library. IEEE Robot. Autom. Mag. 2012, 19, 72–82.
36. Vu, T.; Kim, K.; Luu, T.M.; Nguyen, T.; Yoo, C.D. SoftGroup for 3D instance segmentation on point clouds. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 2708–2717.
37. Newbury, R.; Gu, M.; Chumbley, L.; Mousavian, A.; Eppner, C.; Leitner, J.; Bohg, J.; Morales, A.; Asfour, T.; Kragic, D.; et al. Deep learning approaches to grasp synthesis: A review. IEEE Trans. Robot. 2023, 39, 3994–4015.
38. Sundermeyer, M.; Mousavian, A.; Triebel, R.; Fox, D. Contact-GraspNet: Efficient 6-DoF grasp generation in cluttered scenes. In Proceedings of the IEEE International Conference on Robotics and Automation (ICRA), Xi’an, China, 30 May–5 June 2021; pp. 13438–13444.
39. Zhou, Q.Y.; Park, J.; Koltun, V. Open3D: A modern library for 3D data processing. arXiv 2018, arXiv:1801.09847.
40. Shippen, B.A.; Adams, J.; Joyce, M.J.; Mellor, M.P. Inverse radiation modelling for plant characterisation. In Proceedings of the IEEE Nuclear Science Symposium and Medical Imaging Conference Record (NSS/MIC), Anaheim, CA, USA, 27 October–3 November 2012; pp. 284–294.
41. Hilsabeck, J. 3D Gamma Source Mapping and Intervention Analysis-19243; WM Symposia, Inc.: Tempe, AZ, USA, 2019.
42. Shome, R.; Tang, W.N.; Song, C.; Mitash, C.; Kourtev, H.; Yu, J.; Boularias, A.; Bekris, K.E. Towards Robust Product Packing with a Minimalistic End-Effector. In Proceedings of the International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 9007–9013.
43. Scharstein, D.; Szeliski, R. A taxonomy and evaluation of dense two-frame stereo correspondence algorithms. Int. J. Comput. Vis. 2002, 47, 7–42.
44. Korf, R.E. A new algorithm for optimal bin packing. In Proceedings of the Eighteenth National Conference on Artificial Intelligence, Edmonton, AB, Canada, 28 July–1 August 2002; pp. 731–736.
45. Martello, S.; Pisinger, D.; Vigo, D. The three-dimensional bin packing problem. Oper. Res. 2000, 48, 256–267.
46. Xing, X.; Burdet, E.; Si, W.; Yang, C.; Li, Y. Impedance learning for human-guided robots in contact with unknown environments. IEEE Trans. Robot. 2023, 39, 3705–3721.
Figure 1. A schematic of Dexter™ teleoperation system architecture comprising local and remote manipulators.
Figure 2. The experimental setup, comprising a mock-up nuclear waste sorting test-bed. (a) Dexter™ local and remote arms. (b) Remote arm with associated sensors and sorting table.
Figure 3. Top-level system process flow for the Dexter™ system-based nuclear sort and segregation application.
Figure 4. Nuclear sort and segregation system architecture.
Figure 5. Frame definition of the Dexter™ manipulator.
Figure 6. Dexter™ dynamic model parameter identification process.
Figure 7. Joint-space feed-forward nonlinear control scheme.
Figure 8. Octomap of the environment and the ROS-RViz simulation model.
Figure 9. Curve fitting for mass estimation.
Figure 10. Fourier series-based excitation trajectories generated for dynamical parameter identification of the Dexter™ manipulator. Joint trajectories for (a) training and (b) testing.
Figure 11. Predicted and measured torques for the test trajectory using the estimated dynamical parameters of the Dexter™ manipulator.
Figure 12. Result of object detection and classification from two RGB-D images of a scene. Images from left to right show the test objects in the environment from two cameras, their depth images, the Multiview Stereo (MVS) reconstruction and filtered point cloud, and the SoftGroup model-based classification and object detection outputs.
Figure 13. Single-object point cloud reconstruction from three different object poses, performed by Dexter™ after grasping, and the category classification result.
Figure 14. Radiological surveying objects, radiation scan trajectories and radiation levels.
Figure 15. Grasp pose generation results from two object piles.
Figure 16. Geometry characterization of a wellington boot (Row 1) and a plastic hose (Row 2). (Column 1): 3D point cloud of the environment. (Column 2): Watertight mesh generated from the detected object point cloud. (Column 3): Geometrical characterization of the detected object.
Figure 17. Experimental result of the bin packing.
Figure 18. Full system demonstrator.
Figure 19. Execution of the integrated system, from picking to dropping, for four example objects.
Table 1. DH parameters of the Dexter™ robot.

| Joint | a_i | α_i | d_i | θ_i |
|---|---|---|---|---|
| 1 | 0 | π/2 | 0 | q_1 − π/2 |
| 2 | 0 | π/2 | d_2 | q_2 |
| 3 | a_3 | π/2 | 0 | q_3 + π/2 |
| 4 | 0 | π/2 | d_4 | q_4 |
| 5 | a_5 | π/2 | 0 | q_5 − π/2 |
| 6 | 0 | π/2 | d_6 | q_6 |
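Each row of the DH table in Table 1 maps to one homogeneous transform, and the base-to-end-effector pose is their product. A minimal sketch under the standard DH convention; the numerical link constants (`d2`, `d4`, `d6`, `a3`, `a5`) are placeholders, not the Dexter™ robot's actual values, and the ±π/2 joint offsets follow the table:

```python
import numpy as np

def dh_transform(a, alpha, d, theta):
    """Standard DH homogeneous transform: Rz(theta) * Tz(d) * Tx(a) * Rx(alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(q, d2=0.4, d4=0.3, d6=0.1, a3=0.05, a5=0.05):
    """Chain the six joint transforms from Table 1 (link lengths are illustrative)."""
    rows = [
        (0.0, np.pi / 2, 0.0, q[0] - np.pi / 2),
        (0.0, np.pi / 2, d2,  q[1]),
        (a3,  np.pi / 2, 0.0, q[2] + np.pi / 2),
        (0.0, np.pi / 2, d4,  q[3]),
        (a5,  np.pi / 2, 0.0, q[4] - np.pi / 2),
        (0.0, np.pi / 2, d6,  q[5]),
    ]
    T = np.eye(4)
    for a, alpha, d, theta in rows:
        T = T @ dh_transform(a, alpha, d, theta)
    return T  # 4x4 base-to-end-effector pose
```

Whatever joint angles are supplied, the 3×3 rotation block of the result stays orthonormal, which is a useful sanity check on any hand-entered DH table.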
Table 2. Joint torque (Nm) and position (rad) tracking mean percentage error and standard deviation for the training and test trajectories.

| Joint | Torque, Train (%) | Std. (Nm) | Torque, Test (%) | Std. (Nm) | Position, Train (%) | Std. (rad) | Position, Test (%) | Std. (rad) |
|---|---|---|---|---|---|---|---|---|
| q_1 | 6.97 | 0.23 | 7.32 | 0.15 | 1.24 | 0.02 | 1.70 | 0.02 |
| q_2 | 11.71 | 0.15 | 7.98 | 0.29 | 1.89 | 0.08 | 2.10 | 0.07 |
| q_3 | 14.58 | 0.32 | 18.03 | 0.32 | 2.35 | 0.08 | 2.43 | 0.08 |
| q_4 | 8.70 | 0.01 | 11.64 | 0.02 | 1.34 | 0.03 | 1.83 | 0.03 |
| q_5 | 7.22 | 0.02 | 6.62 | 0.01 | 1.72 | 0.04 | 1.73 | 0.04 |
| q_6 | 7.39 | 0.01 | 8.47 | 0.01 | 2.34 | 0.06 | 1.97 | 0.04 |
Table 3. Sample of objects and categories.

| Object Name | Object Category |
|---|---|
| Safety Hard Hat, Plastic Scrap, Glove, Electronic Scrap | Plastic |
| Man-Made Fibres | Man-Made Fibres |
| Wellington Boot | Rubber |
| Steel Rod/Bar | Steel |
| Can | Metal |
| Steel Scrap | Plastic + Metal |
Table 4. Performance comparison of the bin packing algorithms.

| Metric | Center | Random | Lower | Orth. Low | Tetris |
|---|---|---|---|---|---|
| Total packed objects | 22.2 | 27.6 | 31.0 | 34.8 | 38.4 |
| Total objects volume | 12.567 | 14.465 | 15.708 | 18.453 | 20.408 |
| Container voidage | 30.686 | 28.796 | 27.553 | 24.808 | 22.853 |
| % of container voidage | 70.933 | 66.564 | 63.69 | 57.345 | 52.826 |
| Average pose gen. run time | 0.00052 | 0.00428 | 0.0047 | 0.00909 | 0.74237 |
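The voidage figures in Table 4 are internally consistent: for every algorithm, packed volume plus voidage sums to roughly the same total (≈43.26, implied to be the container volume, which is not stated explicitly here), and the percentage column is voidage over that total. A quick check under that assumption:

```python
def voidage_percent(packed_volume, voidage):
    """Voidage as a percentage of total container volume.

    The container volume is recovered as packed volume + void space,
    since the two partitions exhaust the container.
    """
    container_volume = packed_volume + voidage
    return 100.0 * voidage / container_volume

# Tetris heuristic column from Table 4
tetris_pct = voidage_percent(20.408, 22.853)
```

The recomputed percentages reproduce the table's "% of container voidage" row to within rounding.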
Table 5. The database of sorted objects and their characteristics. A record of packed objects is generated as part of the developed framework of the autonomous sort and segregation system.

| Category | Material | Category Prob. (%) | Material Prob. (%) | Est. Volume (cm³) | Surface Area (cm²) | Length (cm) | Width (cm) | Height (cm) | Alpha Activity (MBq/kg) | Beta Activity (MBq/kg) | Est. Mass (g) | Density Avg. (g/cm³) | Destination |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Safety Hard Hat | Plastic | 96.7 | 98.6 | 3050 | 5650 | 31.6 | 28.6 | 18.2 | 1.44 | 10.1 | 484 | 0.159 | ILW |
| Man-Made Fibres | Man-Made Fibres | 95.2 | 95.3 | 1130 | 2590 | 26.2 | 23.8 | 18.2 | 4.93 | 4.2 | 74.1 | 0.0656 | ILW |
| Plastic Scrap | Plastic | 52.8 | 53.1 | 1780 | 2420 | 36 | 30.4 | 9.4 | 7.02 | 48.9 | 436 | 0.244 | ILW |
| Safety Hard Hat | Plastic | 97.2 | 99.1 | 2410 | 4620 | 31 | 27.4 | 19.2 | 4.93 | 4.2 | 469 | 0.195 | ILW |
| Man-Made Fibres | Man-Made Fibres | 95.8 | 96.0 | 2370 | 2160 | 28 | 21.4 | 15 | 2.51 | 17.5 | 18.1 | 0.00765 | ILW |
| Wellington Boot | Rubber | 58.6 | 60.0 | 3190 | 8810 | 46.4 | 36 | 15.8 | 4.93 | 4.2 | 222 | 0.0694 | ILW |
| Steel Rod/Bar | Steel | 87.4 | 99.5 | 31.6 | 150 | 22.4 | 4.2 | 2.6 | 2.51 | 17.5 | 254 | 8.03 | ILW |
| Coke can | Metal | 79.9 | 80.1 | 20.9 | 429 | 13 | 7.2 | 6.6 | 1.12 | 7.83 | 30 | 1.43 | Recycle |
| Steel Scrap | Plastic + Metal | 99 | 48.4 | 113 | 536 | 24 | 10.6 | 6.4 | 4.93 | 4.2 | 334 | 2.95 | Recycle |
| Wood Scrap | Stone | 99 | 38.4 | 66.2 | 336 | 16 | 15.2 | 4.6 | 2.51 | 17.5 | 95.8 | 1.45 | Recycle |
| Medical Glove | Plastic | 99 | 49.5 | 40 | 1270 | 27.6 | 12.4 | 10 | 1.12 | 7.83 | 37.1 | 0.927 | Recycle |
| Electronic Scrap | Plastic | 99 | 24.2 | 56.6 | 278 | 11.2 | 7.2 | 6 | 85 | 85.98 | 42.4 | 0.749 | Recycle |
| Plastic Scrap | Plastic | 51.5 | 53.3 | 330 | 1330 | 26 | 22.8 | 7.6 | 1.12 | 7.83 | 121 | 0.366 | LLW |
Table 6. A record generated as part of the developed sort and segregation framework, summarising each packed container.

| Container | ILW | LLW | Recyclable |
|---|---|---|---|
| Total number of items | 7 | 1 | 5 |
| Total mass (kg) | 2.41 | 0.121 | 0.539 |
| Total net volume (m³) | 0.0148 | 0.00033 | 0.000846 |
| Total surface area (m²) | 2.85 | 0.133 | 0.285 |
| Total alpha activity (MBq/kg) | 38 | 1.12 | 10.5 |
| Total beta/gamma activity (MBq/kg) | 264 | 7.83 | 73.3 |
| Container fill level (%) | 23.4 | 2.2 | 7.9 |
| Container voidage (%) | 13.5 | 77.0 | 83.6 |
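Tables 5 and 6 are linked: each container summary is an aggregation over the per-object records in the sorted-object database. A minimal sketch of that roll-up; the field names are illustrative, and the masses below are those parsed from the Recyclable rows of Table 5 (their sum matches the 0.539 kg total reported in Table 6):

```python
# Each record mirrors a row of the sorted-object database (Table 5);
# only the fields needed for this summary are shown.
recyclable = [
    {"category": "Coke can",         "mass_g": 30.0},
    {"category": "Steel Scrap",      "mass_g": 334.0},
    {"category": "Wood Scrap",       "mass_g": 95.8},
    {"category": "Medical Glove",    "mass_g": 37.1},
    {"category": "Electronic Scrap", "mass_g": 42.4},
]

def summarise(records):
    """Roll per-object records up into a per-container summary (cf. Table 6)."""
    return {
        "total_items": len(records),
        "total_mass_kg": round(sum(r["mass_g"] for r in records) / 1000.0, 3),
    }

summary = summarise(recyclable)
```

The same pattern extends to the other summary rows (net volume, surface area, activity) by summing the corresponding per-object fields.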
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Poozhiyil, M.; Argin, O.F.; Rai, M.; Esfahani, A.G.; Hanheide, M.; King, R.; Saunderson, P.; Moulin-Ramsden, M.; Yang, W.; García, L.P.; et al. A Framework for Real-Time Autonomous Robotic Sorting and Segregation of Nuclear Waste: Modelling, Identification and Control of DexterTM Robot. Machines 2025, 13, 214. https://doi.org/10.3390/machines13030214
