Soft-Sensor System for Grasp Type Recognition in Underactuated Hand Prostheses
Figure 1. Working principle: side-polished angle sensor.
Figure 2. Working principle: jacket removal with cladding and core axial polish angle sensor.
Figure 3. Devices used in the sensor's construction. (a) CNC machine. (b) 3D-printed structure and fiber cladding separated from its jacket. (c) 3D-printed structure used to obtain the polish around the fiber.
Figure 4. Test bench for the angle sensor and first position per fiber configuration. (a) Test bench CAD. (b) Fiber sensor with side polish. (c) Fiber sensor with jacket removed. (d) Fiber sensor with jacket removed and axial polish of cladding and core.
Figure 5. Working principle of the contact force sensor.
Figure 6. Construction of the contact force sensor. (a) 3D-printed mold. (b) Contact force sensor.
Figure 7. Test bench for the contact force sensor. (a) CAD. (b) Real setup.
Figure 8. Prosthesis with the soft sensors integrated, holding one object for the protocol.
Figure 9. Results of the three sensor configurations per rotation: the dotted blue line corresponds to 0°, the dot-dashed orange line to 90°, the squared gray line to 180°, and the dashed yellow line to 270°. (a) Side-polished fiber. (b) Jacket removal. (c) Jacket removal with cladding and core axial polish.
Figure 10. Angle sensor characterization per finger: the dotted blue line represents closing and the dashed orange line opening. (a) Little. (b) Ring. (c) Middle. (d) Index. (e) Thumb.
Figure 11. Contact force sensor characterization per finger: the dotted blue line represents compression and the dashed orange line decompression. (a) Little. (b) Ring. (c) Middle. (d) Index. (e) Thumb.
Figure 12. Accuracy per grasp type for the logistic regression algorithm.
Figure 13. Accuracy per grasp type for the k-NN algorithm.
Figure 14. Accuracy per grasp type for the support vector machine algorithm.
Figure 15. Accuracy per grasp type for the decision tree algorithm.
Figure 16. Accuracy per grasp type for the KMC algorithm.
Figure 17. Accuracy per grasp type for the hierarchical clustering algorithm.
Figure 18. Principal Components Analysis (PCA).
Abstract
1. Introduction
2. Materials and Methods
2.1. Angle Sensor
2.2. Contact Force Sensor
2.3. Grasp Type Recognition Based on Machine Learning Algorithms
2.3.1. Algorithms
- k-Nearest Neighbor (k-NN): The k-NN algorithm classifies a test sample based on its similarity to the k closest samples in the training database. The distance between the new sample and every training sample is computed, and the predicted category is the mode of the categories of the k nearest training samples. To choose the value of k, the training data were divided into two equal parts and values k = 1, 2, 3, …, m were evaluated. Distances between the testing sample and the training samples were computed with the Euclidean distance, d(x, y) = sqrt(Σ_{i=1}^{n} (x_i − y_i)²) (Equation (2)), where x is the training sample, y is the testing sample, and n is the number of attributes.
- Support Vector Machine (SVM): This widely used algorithm constructs a line (or, in higher dimensions, a plane) that separates the database into its categories, allowing new data to be categorized correctly. This separating boundary is called a hyperplane, and it is positioned using the extreme samples of each category as limits.
- Decision Tree (DT): The decision tree is a supervised algorithm that creates rules to predict the value of the target variable. It has a hierarchical tree structure consisting of a root node, branches, internal nodes, and leaf nodes. DT learning employs a “divide and conquer” strategy by performing a greedy search to identify optimal split points within a tree. This splitting process is repeated top-down and recursively until all, or most, records have been classified under specific class labels.
- k-Means Clustering (KMC): This algorithm partitions the dataset into k clusters. Initially, the algorithm selects k random centroids; each sample is then assigned to the closest centroid according to the Euclidean distance between them. Once all samples are assigned to a centroid, each centroid is recalculated as the mean of all samples in its cluster. This process repeats until the centroids no longer change.
- Hierarchical Clustering (HC): In the hierarchical clustering algorithm, similar data points in the dataset are clustered into a tree-like structure called a dendrogram, either by merging data samples into one cluster or by dividing larger clusters into smaller ones. In the divisive hierarchical clustering chosen for this work, all samples start in the same cluster, and the algorithm divides the group into subgroups until each cluster contains a single data point.
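As an illustration of the classification pipeline these algorithms feed into, below is a minimal sketch of grasp recognition with k-NN, including the k-selection procedure described above. It uses scikit-learn and synthetic data standing in for the ten sensor readings; the grasp names, dataset sizes, and choice of library are assumptions for the example, not the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Synthetic stand-in for the real dataset: each sample holds the 10
# sensor readings (5 finger angles + 5 contact forces) for one grasp.
rng = np.random.default_rng(42)
n_per_class, n_features = 50, 10
grasps = ["cylindrical", "spherical", "pulp_pinch"]  # illustrative subset of the 8 grasp types
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(n_per_class, n_features))
               for i in range(len(grasps))])
y = np.repeat(grasps, n_per_class)

# Split the data into two equal parts, as in the k-selection procedure above.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

# Evaluate candidate k values with Euclidean distance and keep the best one.
best_k, best_acc = 1, 0.0
for k in range(1, 8):
    knn = KNeighborsClassifier(n_neighbors=k, metric="euclidean").fit(X_tr, y_tr)
    acc = knn.score(X_te, y_te)
    if acc > best_acc:
        best_k, best_acc = k, acc
print(best_k, best_acc)
```

On well-separated synthetic classes like these, nearly any small k gives close to perfect hold-out accuracy, which mirrors the flat accuracy-versus-k behavior reported later for the real dataset.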
2.3.2. Protocol
2.3.3. Algorithm Evaluation
3. Results and Discussion
3.1. Angle Sensor
| Finger | Closing | R² (%) | Opening | R² (%) | H (%) |
|---|---|---|---|---|---|
| Little | V = 0.0001A² − 0.0189A + 1.5424 | 98.81 | V = 5×10⁻⁵A² + 0.0154A + 1.5289 | 99.96 | 16.62 |
| Ring | V = 0.0002A² − 0.0225A + 1.6044 | 93.98 | V = 0.0003A² − 0.0247A + 1.6698 | 99.33 | 13.60 |
| Middle | V = 5×10⁻⁵A² − 0.0005A + 2.6976 | 99.79 | V = …A² + 0.0007A + 2.6801 | 99.78 | 2.81 |
| Index | V = 0.0002A² − 0.0347A + 2.4522 | 99.24 | V = 5×10⁻⁵A² − 0.0048A + 1.6853 | 92.03 | 13.18 |
| Thumb | V = 5×10⁻⁵A² + 0.0064A + 2.9612 | 99.84 | V = …A² + 0.0356A + 1.3477 | 98.61 | 9.19 |
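The characterization fits above are low-order polynomials of output voltage versus joint angle. Below is a sketch of how such a fit and its R² can be computed with NumPy; the data are synthetic, generated from the little-finger closing fit as we read it from the table (V = 0.0001A² − 0.0189A + 1.5424), so the fit form and tooling are assumptions, not the authors' procedure.

```python
import numpy as np

# Synthetic (angle, voltage) pairs generated from the reported quadratic fit;
# on the real test bench, `volts` would be the measured photodetector output.
angles = np.linspace(0, 90, 19)                      # joint angle (degrees)
volts = 0.0001 * angles**2 - 0.0189 * angles + 1.5424

coeffs = np.polyfit(angles, volts, deg=2)            # returns [a, b, c]
predicted = np.polyval(coeffs, angles)

# Coefficient of determination, expressed in percent as in the table.
ss_res = np.sum((volts - predicted) ** 2)
ss_tot = np.sum((volts - volts.mean()) ** 2)
r2_percent = 100 * (1 - ss_res / ss_tot)
print(np.round(coeffs, 4), round(r2_percent, 2))
```

Because the synthetic data lie exactly on the quadratic, the recovered coefficients match the generating ones; real bench data would show the R² values reported in the table.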
3.2. Contact Force Sensor
3.3. Grasp Type Recognition Based on Machine Learning Algorithms
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
ADLs | Activities of Daily Life |
AHAP | Anthropomorphic Hand Assessment Protocol |
CAD | Computer-Aided Design |
CG | Cylindrical Grip |
DT | Decision Tree |
DoFs | Degrees of Freedom |
DVG | Diagonal Volar Grip |
EG | Extension Grip |
FBG | Fiber Bragg Grating |
H | Hook |
HC | Hierarchical Clustering |
KMC | k-Means Clustering |
k-NN | k-Nearest Neighbor |
ML | Machine Learning |
LED | Light-Emitting Diode |
LP | Lateral Pinch |
LR | Logistic Regression |
PCA | Principal Components Analysis |
DIP | Distal Interphalangeal |
POF | Polymer Optical Fiber |
PP | Pulp Pinch |
sEMG | Surface Electromyography |
SG | Spherical Grip |
SVM | Support Vector Machine |
TP | Tripod Pinch |
References
- DATASUS. Amputação Desarticulação de Membros Superiores. Available online: http://tabnet.datasus.gov.br/cgi/tabcgi.exe?sih/cnv/qiuf.def (accessed on 21 January 2023).
- SISPRO. ASIS Disability Indicators. Available online: http://rssvr2.sispro.gov.co/reportesAsis2 (accessed on 21 January 2023).
- Cordella, F.; Ciancio, A.L.; Sacchetti, R.; Davalli, A.; Cutti, A.G.; Guglielmelli, E.; Zollo, L. Literature review on needs of upper limb prosthesis users. Front. Neurosci. 2016, 10, 209.
- Pomares, G.; Coudane, H.; Dap, F.; Dautel, G. Psychological effects of traumatic upper-limb amputations. Orthop. Traumatol. Surg. Res. 2020, 106, 297–300.
- Jelacic, Z.; Dedic, R.; Dindo, H. Chapter 1: The Challenges of Prosthetic Design and Control. In Active Above-Knee Prosthesis, A Guide to a Smart Prosthetic Leg; Academic Press: Cambridge, MA, USA, 2020.
- Resnik, L.; Ekerholm, S.; Borgia, M.; Clark, M.A. A national study of Veterans with major upper limb amputation: Survey methods, participants, and summary findings. PLoS ONE 2019, 14, e0213578.
- Piazza, C.; Grioli, G.; Catalano, M.; Bicchi, A. A century of robotic hands. Annu. Rev. Control Robot. Auton. Syst. 2019, 2, 1–32.
- Madusanka, D.G.K.; Wijayasingha, L.N.S.; Gopura, R.A.R.C.; Amarasinghe, Y.W.R.; Mann, G.K.I. A review on hybrid myoelectric control systems for upper limb prosthesis. In Proceedings of the 2015 Moratuwa Engineering Research Conference (MERCon), Moratuwa, Sri Lanka, 7–8 April 2015; pp. 136–141.
- Tian, L.; Magnenat Thalmann, N.; Thalmann, D.; Zheng, J. The making of a 3D-printed, cable-driven, single-model, lightweight humanoid robotic hand. Front. Robot. AI 2017, 4, 65.
- Soler, M.A. Diseño de una Prótesis de Mano Adaptable Para Niños. Ph.D. Thesis, ETSEIB, Barcelona, Spain, 2017.
- Trent, L.; Intintoli, M.; Prigge, P.; Bollinger, C.; Walters, L.S.; Conyers, D.; Miguelez, J.; Ryan, T. A narrative review: Current upper limb prosthetic options and design. Disabil. Rehabil. Assist. Technol. 2020, 15, 604–613.
- Mohammadi, A.; Lavranos, J.; Zhou, H.; Mutlu, R.; Alici, G.; Tan, Y.; Choong, P.; Oetomo, D. A practical 3D-printed soft robotic prosthetic hand with multi-articulating capabilities. PLoS ONE 2020, 15, e0232766.
- Gul, J.Z.; Sajid, M.; Rehman, M.M.; Siddiqui, G.U.; Shah, I.; Kim, K.H.; Lee, J.W.; Choi, K.H. 3D printing for soft robotics–A review. Sci. Technol. Adv. Mater. 2018, 19, 243–262.
- Weiner, P.; Neef, C.; Shibata, Y.; Nakamura, Y.; Asfour, T. An embedded, multi-modal sensor system for scalable robotic and prosthetic hand fingers. Sensors 2019, 20, 101.
- Li, G.; Cheng, L.; Gao, Z.; Xia, X.; Jiang, J. Development of an Untethered Adaptive Thumb Exoskeleton for Delicate Rehabilitation Assistance. IEEE Trans. Robot. 2022, 38, 3514–3529.
- Li, L.; Li, Y.; Yang, L.; Fang, F.; Yan, Z.; Sun, Q. Continuous and Accurate Blood Pressure Monitoring Based on Wearable Optical Fiber Wristband. IEEE Sens. J. 2021, 21, 3049–3057.
- Li, J.; Liu, J.; Li, C.; Zhang, H.; Li, Y. Wearable wrist movement monitoring using dual surface-treated plastic optical fibers. Materials 2020, 13, 3291.
- Konstantinova, J.; Stilli, A.; Althoefer, K. Force and proximity fingertip sensor to enhance grasping perception. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 2118–2123.
- Yang, M.; Liu, Q.; Naqawe, H.S.; Fok, M.P. Movement detection in soft robotic gripper using sinusoidally embedded fiber optic sensor. Sensors 2020, 20, 1312.
- Teeple, C.B.; Becker, K.P.; Wood, R.J. Soft curvature and contact force sensors for deep-sea grasping via soft optical waveguides. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 1621–1627.
- Mori, Y.; Zhu, M.; Kim, H.J.; Wada, A.; Mitsuzuka, M.; Tajitsu, Y.; Kawamura, S. Development of a pneumatically driven flexible finger with feedback control of a polyurethane bend sensor. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018; pp. 5952–5957.
- Huang, H.; Lin, J.; Wu, L.; Wen, Z.; Dong, M. Trigger-Based Dexterous Operation with Multimodal Sensors for Soft Robotic Hand. Appl. Sci. 2021, 11, 8978.
- Feng, J.; Jiang, Q. Slip and roughness detection of robotic fingertip based on FBG. Sens. Actuators A Phys. 2019, 287, 143–149.
- Jiang, L.; Low, K.; Costa, J.; Black, R.J.; Park, Y.L. Fiber optically sensorized multi-fingered robotic hand. In Proceedings of the 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Hamburg, Germany, 28 September–2 October 2015; pp. 1763–1768.
- Durini, F.; Terruso, G.; D’Abbraccio, J.; Filosa, M.; Fransvea, G.; Camboni, D.; Aliperta, A.; Palermo, E.; Massari, L.; Oddo, C.M. Soft large area FBG tactile sensors for exteroception and proprioception in a collaborative robotic manipulator. In Proceedings of the 2021 Smart Systems Integration (SSI), Grenoble, France, 27–29 April 2021; pp. 1–4.
- Walker, S.; Loewke, K.; Fischer, M.; Liu, C.; Salisbury, J.K. An optical fiber proximity sensor for haptic exploration. In Proceedings of the 2007 IEEE International Conference on Robotics and Automation, Rome, Italy, 10–14 April 2007; pp. 473–478.
- Ataollahi, A.; Polygerinos, P.; Puangmali, P.; Seneviratne, L.D.; Althoefer, K. Tactile sensor array using prismatic-tip optical fibers for dexterous robotic hands. In Proceedings of the 2010 IEEE/RSJ International Conference on Intelligent Robots and Systems, Taipei, Taiwan, 18–22 October 2010; pp. 910–915.
- Chaudhuri, S.; Bhardwaj, A. Kinesthetic Perception: A Machine Learning Approach; Springer: Berlin/Heidelberg, Germany, 2018.
- Lederman, S.J.; Klatzky, R.L. Extracting object properties through haptic exploration. Acta Psychol. 1993, 84, 29–40.
- Kim, D.; Kim, S.H.; Kim, T.; Kang, B.B.; Lee, M.; Park, W.; Ku, S.; Kim, D.; Kwon, J.; Lee, H.; et al. Review of machine learning methods in soft robotics. PLoS ONE 2021, 16, e0246102.
- Papakostas, C.; Troussas, C.; Krouska, A.; Sgouropoulou, C. Modeling the Knowledge of Users in an Augmented Reality-Based Learning Environment Using Fuzzy Logic. In Proceedings of the Novel & Intelligent Digital Systems: Proceedings of the 2nd International Conference (NiDS 2022), Athens, Greece, 29–30 September 2022; pp. 113–123.
- Li, G.; Zhu, R. A multisensory tactile system for robotic hands to recognize objects. Adv. Mater. Technol. 2019, 4, 1900602.
- da Fonseca, V.P.; Jiang, X.; Petriu, E.M.; de Oliveira, T.E.A. Tactile object recognition in early phases of grasping using underactuated robotic hands. Intell. Serv. Robot. 2022, 15, 513–525.
- Konstantinova, J.; Cotugno, G.; Stilli, A.; Noh, Y.; Althoefer, K. Object classification using hybrid fiber optical force/proximity sensor. In Proceedings of the 2017 IEEE SENSORS, Glasgow, UK, 29 October–1 November 2017; pp. 1–3.
- Kaboli, M.; De La Rosa T, A.; Walker, R.; Cheng, G. In-hand object recognition via texture properties with robotic hands, artificial skin, and novel tactile descriptors. In Proceedings of the 2015 IEEE-RAS 15th International Conference on Humanoid Robots (Humanoids), Seoul, Republic of Korea, 3–5 November 2015; pp. 1155–1160.
- Huang, H.; Lin, J.; Wu, L.; Fang, B.; Wen, Z.; Sun, F. Machine learning-based multi-modal information perception for soft robotic hands. Tsinghua Sci. Technol. 2019, 25, 255–269.
- Diaz, C.A.R.; Leal-Junior, A.G.; Avellar, L.M.; Antunes, P.F.C.; Pontes, M.J.; Marques, C.A.; Frizera, A.; Ribeiro, M.R.N. Perrogator: A Portable Energy-Efficient Interrogator for Dynamic Monitoring of Wavelength-Based Sensors in Wearable Applications. Sensors 2019, 19, 2962.
- Díaz, C.A.; Marques, C.A.; Domingues, M.F.F.; Ribeiro, M.R.; Frizera-Neto, A.; Pontes, M.J.; André, P.S.; Antunes, P.F. A cost-effective edge-filter based FBG interrogator using catastrophic fuse effect micro-cavity interferometers. Measurement 2018, 124, 486–493.
- Díaz, C.A.R.; Leitão, C.; Marques, C.A.; Domingues, M.F.; Alberto, N.; Pontes, M.J.; Frizera, A.; Ribeiro, M.R.N.; André, P.S.B.; Antunes, P.F.C. Low-Cost Interrogation Technique for Dynamic Measurements with FBG-Based Devices. Sensors 2017, 17, 2414.
- Silveira, M.; Frizera, A.; Leal-Junior, A.; Ribeiro, D.; Marques, C.; Blanc, W.; Díaz, C.A.R. Transmission–Reflection Analysis in high scattering optical fibers: A comparison with single-mode optical fiber. Opt. Fiber Technol. 2020, 58, 102303.
- Leal-Junior, A.G.; Diaz, C.A.; Avellar, L.M.; Pontes, M.J.; Marques, C.; Frizera, A. Polymer Optical Fiber Sensors in Healthcare Applications: A Comprehensive Review. Sensors 2019, 19, 3156.
- Leal-Junior, A.G.; Frizera, A.; Marques, C.; Pontes, M.J. Viscoelastic features based compensation technique for polymer optical fiber curvature sensors. Opt. Laser Technol. 2018, 105, 35–40.
- Leal-Junior, A.G.; Díaz, C.R.; Marques, C.; Pontes, M.J.; Frizera, A. 3D-printed POF insole: Development and applications of a low-cost, highly customizable device for plantar pressure and ground reaction forces monitoring. Opt. Laser Technol. 2019, 116, 256–264.
- De Arco, L.; Ramos, O.; Múnera, M.; Moazen, M.; Wurdemann, H.; Cifuentes, C. The PrHand: Functional assessment of an underactuated soft-robotic prosthetic hand. In Proceedings of the IEEE RAS/EMBS International Conference on Biomedical Robotics and Biomechatronics, Seoul, Republic of Korea, 21–24 August 2022; pp. 1–6.
- Leal-Junior, A.G.; Frizera, A.; Pontes, M.J. Sensitive zone parameters and curvature radius evaluation for polymer optical fiber curvature sensors. Opt. Laser Technol. 2018, 100, 272–281.
- Leal Junior, A.G.; Frizera, A.; Pontes, M.J. Analytical model for a polymer optical fiber under dynamic bending. Opt. Laser Technol. 2017, 93, 92–98.
- De Arco, L.; Pontes, M.J.; Vieira Segatto, M.E.; Cifuentes, C.A.; Díaz, C.A.R. Instrumentation of the Prosthesis PrHand Based on Soft-Robotics: Angle sensor with Optical Fiber. In Proceedings of the Latin America Optics & Photonics Conference, OSA, Recife, Brazil, 7–11 August 2022.
- De Arco, L.; Pontes, M.J.; Vieira Segatto, M.E.; Monteiro, M.; Cifuentes, C.A.; Díaz, C.A.R. Optical Fiber Angle Sensors for the PrHand Prosthesis: Development and Application in Grasp Types Recognition with Machine Learning. In Proceedings of the Latin American Electron Devices Conference, Cancun, Mexico, 4–6 July 2022.
- De Arco, L.; Pontes, M.J.; Segatto, M.E.V.; Monteiro, M.E.; Cifuentes, C.A.; Díaz, C.A.R. Pressure and Angle Sensors with Optical Fiber for Instrumentation of the PrHand Hand Prosthesis. J. Phys. Conf. Ser. 2022, 2407, 012010.
- Leal-Junior, A.G.; Diaz, C.R.; Marques, C.; Pontes, M.J.; Frizera, A. Multiplexing technique for quasi-distributed sensors arrays in polymer optical fiber intensity variation-based sensors. Opt. Laser Technol. 2019, 111, 81–88.
- Kavitha, S.; Varuna, S.; Ramya, R. A comparative analysis on linear regression and support vector regression. In Proceedings of the 2016 Online International Conference on Green Engineering and Technologies (IC-GET), Coimbatore, India, 19 November 2016; pp. 1–5.
- Llop-Harillo, I.; Pérez-González, A.; Starke, J.; Asfour, T. The anthropomorphic hand assessment protocol (AHAP). Robot. Auton. Syst. 2019, 121, 103259.
- Refaeilzadeh, P.; Tang, L.; Liu, H. Cross-validation. Encycl. Database Syst. 2009, 5, 532–538.
- De Souza, R.; El-Khoury, S.; Santos-Victor, J.; Billard, A. Recognizing the grasp intention from human demonstration. Robot. Auton. Syst. 2015, 74, 108–121.
- El-Khoury, S.; Li, M.; Billard, A. On the generation of a variety of grasps. Robot. Auton. Syst. 2013, 61, 1335–1349.
| Step | Description |
|---|---|
| 1 | The prosthetic hand is open |
| 2 | Place the object and close the prosthesis |
| 3 | The prosthesis holds the object |
| 4 | Open the prosthesis and remove the object |
| 5 | The prosthetic hand is completely open again |
| Finger | Compression | R² (%) | Decompression | R² (%) | H (%) |
|---|---|---|---|---|---|
| Little | V = 0.0001F² − 0.0158F + 2.3138 | 95.53 | V = −0.0001F² + 0.00864F + 2.3178 | 99.96 | 24.45 |
| Ring | V = 0.0173F + 2.6786 | 99.16 | V = 0.017F + 2.6428 | 99.01 | 0.03 |
| Middle | V = 0.0019F² − 0.0928F + 1.8887 | 98.47 | V = −0.0023F² + 0.0993F + 1.7779 | 95.8 | 3.40 |
| Index | V = −0.0165F + 2.6126 | 98.59 | V = −0.0173F + 2.5671 | 94.75 | 8.04 |
| Thumb | V = −0.0004F² − 0.0022F + 2.9164 | 97.74 | V = −0.0002F² − 0.0109F + 2.972 | 96.69 | 21.06 |
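One plausible reading of the hysteresis column H (%) is the maximum separation between the compression and decompression curves, expressed as a percentage of the sensor's full output span. The sketch below uses made-up linear fits loosely modeled on the index-finger values; the paper's exact hysteresis definition, force range, and coefficients are not assumed here.

```python
import numpy as np

# Illustrative hysteresis estimate for one contact force sensor.
force = np.linspace(0.0, 20.0, 200)      # applied force (N), illustrative range
v_comp = -0.0165 * force + 2.61          # compression (loading) curve, made-up fit
v_decomp = -0.0173 * force + 2.57        # decompression (unloading) curve, made-up fit

# Full output span across both curves, then the worst-case gap between them.
span = v_comp.max() - v_decomp.min()
h_percent = 100.0 * np.abs(v_comp - v_decomp).max() / span
print(round(h_percent, 2))
```

A large H, as in the little and thumb rows, means the same voltage maps to noticeably different forces depending on whether the grip is tightening or releasing, which is why both curves are characterized separately.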
| k | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|
| Accuracy (%) | 97.47732 | 97.47732 | 97.44898 | 97.47732 | 97.44898 | 97.44898 | 97.44898 |
| Algorithm | Accuracy (%) | Standard Deviation |
|---|---|---|
| LR | 21.1 | 0.2 |
| k-NN | 98.5 | 0.01 |
| SVM | 12.5 | 0.0 |
| DT | 93.3 | 0.2 |
| KMC | 10.7 | 5.9 |
| HC | 12.5 | 0.0 |
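Mean-accuracy/standard-deviation pairs like those above are typically produced with k-fold cross-validation. Below is a hedged sketch using scikit-learn on synthetic data standing in for the grasp dataset; the fold count, dataset shape, and library are assumptions for the example.

```python
import numpy as np
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the grasp dataset: 8 classes, 10 sensor features.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(loc=c, scale=0.4, size=(60, 10)) for c in range(8)])
y = np.repeat(np.arange(8), 60)

cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
results = {}
for name, model in [("k-NN", KNeighborsClassifier(n_neighbors=3)),
                    ("DT", DecisionTreeClassifier(random_state=0))]:
    scores = cross_val_score(model, X, y, cv=cv)   # per-fold accuracy
    results[name] = (scores.mean(), scores.std())
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Reporting the per-fold standard deviation alongside the mean, as in the table, distinguishes consistently weak algorithms (SVM, HC) from unstable ones (KMC, whose deviation of 5.9 reflects sensitivity to centroid initialization).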
| Name | Objective | Number of Sensors | Type of Sensors | Sensing Variables | Sensor Location | ML Algorithm | Results |
|---|---|---|---|---|---|---|---|
| This work | Recognition of 8 grasp types with two soft sensors | 10 (2 per finger) | Contact force | Force that the prosthesis applies on the object | Fingertips | LR | 20.80% |
| | | | | | | k-NN | 98.50% |
| | | | | | | SVM | 12.50% |
| | | | Angle | Angle at a joint of the finger | Finger DIP joints | DT | 94.30% |
| | | | | | | KMC | 10.30% |
| | | | | | | HC | 12.50% |
| Konstantinova et al. [34] | A classification algorithm that distinguishes between hard and soft objects | 8 (4 per finger, 2 fingers) | Optical fiber | Force sensor, torque sensor, 2 proximity sensors | Fingertips | ZeroR | 63.60% |
| | | | | | | Perceptron | 69.00% |
| | | | | | | SVM | 87.30% |
| Kaboli et al. [35] | To identify 20 objects of the ADLs through texture properties | 5 (1 per finger) | BioTac | Contact force | Fingertips | SVM | 96.00% |
| | | | | | | PA | 87.00% |
| | | | | | | EM | 80.58% |
| Huang et al. [36] | Recognition of gestures and object shape, size, and weight | 10 (2 per finger) | Optical fiber | Curvature | Along the finger | k-NN | Gestures: 97.96%; shapes: 90.81%; size: 90.79%; weight: 100.00% |
| | | | | | | SVM | Gestures: 96.55%; shapes: 90.56%; size: 90.90%; weight: 100.00% |
| | | | Intelligent digital display pressure transmitter | Pressure | Installed in parallel with the soft finger in each gas pressure channel | LR | Gestures: 95.6%; shapes: 86.94%; size: 79.73%; weight: 100.00% |
| | | | | | | KMC | Gestures: 96.83%; size: 99.37% |
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
De Arco, L.; Pontes, M.J.; Segatto, M.E.V.; Monteiro, M.E.; Cifuentes, C.A.; Díaz, C.A.R. Soft-Sensor System for Grasp Type Recognition in Underactuated Hand Prostheses. Sensors 2023, 23, 3364. https://doi.org/10.3390/s23073364