Vogel et al., 2012 - Google Patents
Exploring the possibilities of supporting robot-assisted work places using a projection-based sensor system
- Document ID: 4663444937528279613
- Authors: Vogel C, Walter C, Elkmann N
- Publication year: 2012
- Publication venue: 2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings
Snippet
We explore different kinds of functions that a projection-based optical safety system may provide in the context of physical human-robot interaction (HRI). A scenario will be presented in which a collaborative workbench equipped with a robot arm is augmented by …
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING; COUNTING
    - G06F—ELECTRICAL DIGITAL DATA PROCESSING
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
            - G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means
              - G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterized by the transducing means by opto-electronic means
          - G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
  - G05—CONTROLLING; REGULATING
    - G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
      - G05B19/00—Programme-control systems
        - G05B19/02—Programme-control systems electric
          - G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS], computer integrated manufacturing [CIM]
      - G05B2219/00—Program-control systems
        - G05B2219/30—Nc systems
          - G05B2219/39—Robotics, robotics to robotics hand
          - G05B2219/40—Robotics, robotics mapping to robotics vision
Similar Documents
Publication | Title | Publication Date |
---|---|---|
Dianatfar et al. | Review on existing VR/AR solutions in human–robot collaboration | |
Vogel et al. | A projection-based sensor system for safe physical human-robot collaboration | |
CN105094005B (en) | Optical security system, the method for controlling motorized industry equipment and computer-readable medium | |
KR20190062171A (en) | Deep learning-based real-time detection and correction of compromised sensors in autonomous machines | |
US10394327B2 (en) | Integration of auxiliary sensors with point cloud-based haptic rendering and virtual fixtures | |
US9535538B2 (en) | System, information processing apparatus, and information processing method | |
JP2019519387A (en) | Visualization of Augmented Reality Robot System | |
Huy et al. | See-through and spatial augmented reality-a novel framework for human-robot interaction | |
Vogel et al. | Exploring the possibilities of supporting robot-assisted work places using a projection-based sensor system | |
US11529737B2 (en) | System and method for using virtual/augmented reality for interaction with collaborative robots in manufacturing or industrial environment | |
CN112828846A (en) | Simulation device and robot system using augmented reality | |
Boschetti et al. | 3D collision avoidance strategy and performance evaluation for human–robot collaborative systems | |
KR20190128988A (en) | Aiding maneuvering of obscured objects | |
Stone | Virtual reality and telepresence | |
Demirtas et al. | Development and implementation of a collaborative workspace for industrial robots utilizing a practical path adaptation algorithm and augmented reality | |
Williams | A framework for robot-generated mixed-reality deixis | |
Liu et al. | A projection-based making-human-feel-safe system for human-robot cooperation | |
Imtiaz et al. | A flexible context-aware assistance system for industrial applications using camera based localization | |
Kumar | Dynamic speed and separation monitoring with on-robot ranging sensor arrays for human and industrial robot collaboration | |
CN109761119B (en) | Elevator control method, device and equipment and elevator system | |
WO2018016192A1 (en) | Virtual sensory evaluation assistance system | |
Vogel et al. | A projection-based sensor system for ensuring safety while grasping and transporting objects by an industrial robot | |
Torkar et al. | RNN-based human pose prediction for human-robot interaction | |
Vogel et al. | Optical workspace monitoring system for safeguarding tools on the mobile manipulator VALERI | |
Schmidt | Real-time collision detection and collision avoidance |