Wang et al., 2018 - Google Patents
Controlling object hand-over in human–robot collaboration via natural wearable sensing
- Document ID
- 18398619203542972803
- Author
- Wang W
- Li R
- Diekel Z
- Chen Y
- Zhang Z
- Jia Y
- Publication year
- 2018
- Publication venue
- IEEE Transactions on Human-Machine Systems
Snippet
With the deployment of collaborative robots in intelligent manufacturing, object hand-over between humans and robots plays a significant role in human-robot collaborations. In most collaboration studies, human hand-over intentions were usually assumed to be known by …
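The snippet cuts off, but the title and the CPC classes below (gesture-based interaction, G06F3/017; classification techniques, G06K9/6267) point at the core task: inferring a human's hand-over intention from wearable sensor signals rather than assuming it is known. As a loose illustration of that general pattern only, and not the authors' actual pipeline, the sketch below classifies synthetic windowed wearable-sensor data with an off-the-shelf SVM; the channel count, window size, features, and labels are all assumptions made for the demo.

```python
# Illustrative sketch only: the paper's sensing hardware, features, and
# classifier are not reproduced here. This shows the generic pattern of
# wearable-sensing intent recognition: windowed sensor signals ->
# hand-crafted time-domain features -> binary intent classifier.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
N_CHANNELS = 6   # e.g. 3-axis accelerometer + 3-axis gyroscope (assumed)
WINDOW = 128     # samples per sliding window (assumed)
N_WINDOWS = 400  # synthetic stand-in for recorded hand-over / no-hand-over data

def window_features(window: np.ndarray) -> np.ndarray:
    """Simple per-channel time-domain features: mean, std, mean absolute value."""
    return np.concatenate([window.mean(0), window.std(0), np.abs(window).mean(0)])

# Synthetic data: class 1 ("hand-over intent") gets a small mean shift so the
# classifier has something to learn; real data would come from the wearable.
X_raw = rng.normal(size=(N_WINDOWS, WINDOW, N_CHANNELS))
y = rng.integers(0, 2, size=N_WINDOWS)
X_raw[y == 1] += 0.3
X = np.stack([window_features(w) for w in X_raw])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```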
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
    - G06K—RECOGNITION OF DATA; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
      - G06K9/00—Methods or arrangements for reading or recognising printed or written characters or for recognising patterns, e.g. fingerprints
        - G06K9/00335—Recognising movements or behaviour, e.g. recognition of gestures, dynamic facial expressions; Lip-reading
          - G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
        - G06K9/62—Methods or arrangements for recognition using electronic means
          - G06K9/6217—Design or setup of recognition systems and techniques; Extraction of features in feature space; Clustering techniques; Blind source separation
          - G06K9/6267—Classification techniques
            - G06K9/6268—Classification techniques relating to the classification paradigm, e.g. parametric or non-parametric approaches
    - G06N—COMPUTER SYSTEMS BASED ON SPECIFIC COMPUTATIONAL MODELS
      - G06N99/00—Subject matter not provided for in other groups of this subclass
      - G06N3/00—Computer systems based on biological models
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/39—Robotics, robotics to robotics hand
          - G05B2219/40—Robotics, robotics mapping to robotics vision
- B—PERFORMING OPERATIONS; TRANSPORTING
  - B25—HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    - B25J—MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
      - B25J9/00—Programme-controlled manipulators
        - B25J9/16—Programme controls
          - B25J9/1674—Programme controls characterised by safety, monitoring, diagnostic
      - B25J13/00—Controls for manipulators
        - B25J13/08—Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
Similar Documents
Publication | Title |
---|---|
Wang et al. | Controlling object hand-over in human–robot collaboration via natural wearable sensing |
Wang et al. | Predicting human intentions in human–robot hand-over tasks through multimodal learning |
Yang et al. | Haptics electromyography perception and learning enhanced intelligence for teleoperated robot |
Li et al. | Survey on mapping human hand motion to robotic hands for teleoperation |
Fang et al. | Survey of imitation learning for robotic manipulation |
Dillmann et al. | Learning robot behaviour and skills based on human demonstration and advice: the machine learning paradigm |
Kubota et al. | Activity recognition in manufacturing: The roles of motion capture and sEMG+ inertial wearables in detecting fine vs. gross motion |
Hossain et al. | Pick-place of dynamic objects by robot manipulator based on deep learning and easy user interface teaching systems |
Yuan et al. | Robot synesthesia: In-hand manipulation with visuotactile sensing |
Yanik et al. | Use of kinect depth data and growing neural gas for gesture based robot control |
Tortora et al. | Fast human motion prediction for human-robot collaboration with wearable interface |
da Fonseca et al. | Tactile object recognition in early phases of grasping using underactuated robotic hands |
Chen et al. | A human–robot interface for mobile manipulator |
Shin et al. | EMG and IMU based real-time HCI using dynamic hand gestures for a multiple-DoF robot arm |
Nguyen et al. | Merging physical and social interaction for effective human-robot collaboration |
Palm et al. | Recognition of human grasps by time-clustering and fuzzy modeling |
Huang et al. | Tradeoffs in neuroevolutionary learning-based real-time robotic task design in the imprecise computation framework |
Brock et al. | A framework for learning and control in intelligent humanoid robots |
Cai et al. | FedHIP: Federated learning for privacy-preserving human intention prediction in human-robot collaborative assembly tasks |
Amatya et al. | Real time kinect based robotic arm manipulation with five degree of freedom |
Gäbert et al. | Gesture based symbiotic robot programming for agile production |
Perico et al. | Learning robust manipulation tasks involving contact using trajectory parameterized probabilistic principal component analysis |
Skoglund et al. | Programming-by-Demonstration of reaching motions—A next-state-planner approach |
Fresnillo et al. | A method for understanding and digitizing manipulation activities using programming by demonstration in robotic applications |
Chen et al. | Dynamic gesture design and recognition for human-robot collaboration with convolutional neural networks |