Human-robot teaming: Approaches from joint action and dynamical systems (Iqbal et al., 2019) - Google Patents
- Document ID: 6856335625688003019
- Authors: Iqbal T; Riek L
- Publication year: 2019
- Publication venue: Humanoid robotics: A reference
Snippet
As robots start to work alongside people, they are expected to coordinate fluently with humans in teams. Many researchers have explored the problems involved in building more interactive and cooperative robots. In this chapter, we discuss recent work and the main …
Classifications
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06T13/40—3D [Three Dimensional] animation of characters, e.g. humans, animals or virtual beings
- G06N3/02—Computer systems based on biological models using neural network models
- G06N99/005—Learning machines, i.e. computer in which a programme is changed according to experience gained by the machine itself during a complete run
- G06Q10/00—Administration; Management
- G06N5/00—Computer systems utilising knowledge based models
- G09B19/00—Teaching not covered by other main groups of this subclass
Similar Documents
Publication | Title |
---|---|
Iqbal et al. | Human-robot teaming: Approaches from joint action and dynamical systems |
McDuff et al. | Designing emotionally sentient agents |
Moulin-Frier et al. | DAC-h3: a proactive robot cognitive architecture to acquire and express knowledge about the world and the self |
Chernova et al. | Robot learning from human teachers |
Calderita et al. | THERAPIST: towards an autonomous socially interactive robot for motor and neurorehabilitation therapies for children |
Wheatland et al. | State of the art in hand and finger modeling and animation |
Gillies | Understanding the role of interactive machine learning in movement interaction design |
Lee | A survey of robot learning from demonstrations for human-robot collaboration |
Cabibihan et al. | Human-recognizable robotic gestures |
Ondras et al. | Audio-driven robot upper-body motion synthesis |
Mihoub et al. | Graphical models for social behavior modeling in face-to-face interaction |
Krishnaswamy et al. | Communicating and acting: Understanding gesture in simulation semantics |
Saponaro et al. | Robot anticipation of human intentions through continuous gesture recognition |
Shukla et al. | Learning semantics of gestural instructions for human-robot collaboration |
Stoeva et al. | Body language in affective human-robot interaction |
Wu et al. | Communicative learning with natural gestures for embodied navigation agents with human-in-the-scene |
Salehzadeh et al. | Purposeful Communication in Human–Robot Collaboration: A Review of Modern Approaches in Manufacturing |
Williams et al. | Investigating the potential effectiveness of allocentric mixed reality deictic gesture |
Zabala et al. | Modeling and evaluating beat gestures for social robots |
Banerjee et al. | AI enabled tutor for accessible training |
Kopp et al. | The fabric of socially interactive agents: Multimodal interaction architectures |
Zabala et al. | Learning to gesticulate by observation using a deep generative approach |
Hartholt et al. | Platforms and tools for SIA research and development |
Bohus et al. | Situated interaction |
Chella et al. | Imitation learning and anchoring through conceptual spaces |