Yue et al., 2023 - Google Patents
How to achieve human–machine interaction by foot gesture recognition: a review
- Document ID
- 4875479158956632288
- Author
- Yue L
- Zongxing L
- Hui D
- Chao J
- Ziqiang L
- Zhoujie L
- Publication year
- 2023
- Publication venue
- IEEE Sensors Journal
Snippet
Researchers are investigating how to make machines read our body language to make human–machine interaction (HMI) more intelligent and efficient. The lower limbs contain a variety of gestures, and it is also one of the most effective ways to express body information …
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6813—Specially adapted to be attached to a specific body part
- A61B5/6825—Hand
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/68—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
- A61B5/6801—Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
- A61B5/6802—Sensor mounted on worn items
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/11—Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
- A61B5/1116—Determining posture transitions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/04—Detecting, measuring or recording bioelectric signals of the body of parts thereof
- A61B5/0402—Electrocardiography, i.e. ECG
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/45—For evaluating or diagnosing the musculoskeletal system or teeth
- A61B5/4528—Joints
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/48—Other medical applications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Detecting, measuring or recording for diagnostic purposes; Identification of persons
- A61B5/117—Identification of persons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F19/00—Digital computing or data processing equipment or methods, specially adapted for specific applications
- G06F19/30—Medical informatics, i.e. computer-based analysis or dissemination of patient or disease data
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING; COUNTING
- G06F—ELECTRICAL DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/033—Indexing scheme relating to G06F3/033
- G06F2203/0331—Finger worn pointing device
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2562/00—Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
Similar Documents
| Publication | Title |
|---|---|
| Jiang et al. | Emerging wearable interfaces and algorithms for hand gesture recognition: A survey |
| Guo et al. | Human-machine interaction sensing technology based on hand gesture recognition: A review |
| Xiong et al. | Deep learning for EMG-based human-machine interaction: A review |
| Jiang et al. | A novel, co-located EMG-FMG-sensing wearable armband for hand gesture recognition |
| Yang et al. | Gesture interaction in virtual reality |
| McIntosh et al. | Echoflex: Hand gesture recognition using ultrasound imaging |
| Jiang et al. | Exploration of force myography and surface electromyography in hand gesture classification |
| Jiang et al. | Feasibility of wrist-worn, real-time hand, and surface gesture recognition via sEMG and IMU sensing |
| EP3659016B1 (en) | Armband for tracking hand motion using electrical impedance measurement |
| Jiang et al. | Stretchable e-skin patch for gesture recognition on the back of the hand |
| Esposito et al. | Biosignal-based human–machine interfaces for assistance and rehabilitation: A survey |
| Tchantchane et al. | A review of hand gesture recognition systems based on noninvasive wearable sensors |
| Yue et al. | How to achieve human–machine interaction by foot gesture recognition: a review |
| Dong et al. | Wearable sensing devices for upper limbs: A systematic review |
| Zongxing et al. | Human-machine interaction technology for simultaneous gesture recognition and force assessment: A review |
| Tang et al. | From brain to movement: Wearables-based motion intention prediction across the human nervous system |
| Zhu et al. | A contactless method to measure real-time finger motion using depth-based pose estimation |
| Zhang et al. | ViT-LLMR: Vision Transformer-based lower limb motion recognition from fusion signals of MMG and IMU |
| Mao et al. | Simultaneous estimation of grip force and wrist angles by surface electromyography and acceleration signals |
| Li et al. | Human lower limb motion intention recognition for exoskeletons: A review |
| Liu et al. | A practical system for 3-D hand pose tracking using EMG wearables with applications to prosthetics and user interfaces |
| Guo et al. | Human–robot interaction for rehabilitation robotics |
| Xu et al. | Execution and perception of upper limb exoskeleton for stroke patients: a systematic review |
| Zhang et al. | Integrating intention-based systems in human-robot interaction: a scoping review of sensors, algorithms, and trust |
| Carpi et al. | Non invasive brain-machine interfaces |