WO2001031424A2 - Arrangement for interaction (Anordnung zur Interaktion) - Google Patents
Arrangement for interaction
- Publication number
- WO2001031424A2 (PCT/DE2000/003716)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- arrangement according
- interface
- surface unit
- wireless interface
- module box
- Prior art date
Links
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
Definitions
- the invention relates to an arrangement for interaction
- the object to be recognized is a pointer unit, e.g. a hand or a pointing stick.
- a user can interact with the graphical interface by gesture control, the gestures triggering the appropriate actions.
- Such an arrangement is also referred to as a gesture computer or a "virtual touch screen".
- the way in which the virtual touch screen works is that the interface is projected onto a predetermined area, the projection preferably having virtual buttons, the actuation of which triggers predetermined actions.
- a trigger mechanism is the pointing of the user with his hand as a pointer unit to one of the buttons. After a predetermined period of time, the action associated with the button is triggered.
- the position of the hand of the user is recorded by a camera, the picture is digitized and a computer determines the position of the hand in relation to the projection. In this way it can be determined whether a virtual button has been operated or not.
- This release mechanism can be compared in its effect with the operation of a computer mouse as an input medium for a graphic interface.
- the mouse pointer on the virtual touch screen is moved by hand; instead of double-clicking, the hand is held over a selected button for a predetermined minimum period.
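The dwell-based trigger described above can be sketched as a small state machine. The button layout, coordinate system, and dwell time below are illustrative assumptions, not values from the patent.

```python
import time

DWELL_SECONDS = 1.5  # assumed "predetermined minimum period"

class DwellTrigger:
    def __init__(self, buttons, dwell=DWELL_SECONDS):
        # buttons: name -> (x0, y0, x1, y1) rectangle in projection coordinates
        self.buttons = buttons
        self.dwell = dwell
        self.current = None   # button the pointer is currently over
        self.since = None     # time at which the pointer entered it

    def update(self, x, y, now=None):
        """Feed the latest pointer position; return a button name when its action fires."""
        now = time.monotonic() if now is None else now
        hit = next((name for name, (x0, y0, x1, y1) in self.buttons.items()
                    if x0 <= x <= x1 and y0 <= y <= y1), None)
        if hit != self.current:           # entered a new region (or left all regions)
            self.current, self.since = hit, now
            return None
        if hit is not None and now - self.since >= self.dwell:
            self.since = now              # re-arm so the action fires once per dwell
            return hit
        return None
```

Feeding the tracked hand position into `update` once per camera frame then replaces the double-click of a conventional mouse, as the passage above explains.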
- the possibility of additional lighting of the user interface with waves in the invisible spectral range is presented.
- the projection can be appropriately differentiated from the pointer unit by illuminating the user interface with waves in the invisible spectral range.
- when illuminated with infrared light, the user interface reflects the infrared light more strongly than the pointer unit (the hand), since the hand absorbs more infrared light than the user interface.
- this effect can be enhanced by additionally providing the user interface with a reflective layer, so that the absorption of the infrared light on the user interface is as low as possible. The hand then appears dark in front of the user interface.
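The contrast principle just described (bright reflective surface, dark absorbing hand) amounts to a simple intensity threshold on the infrared camera image. The threshold and pixel values below are illustrative assumptions.

```python
def segment_hand(frame, threshold=100):
    """Return a boolean mask that is True where the (dark) hand is.

    frame: 2-D list of 8-bit intensity values from the IR camera.
    Pixels darker than the threshold are attributed to the absorbing hand;
    brighter pixels belong to the reflective user interface.
    """
    return [[pixel < threshold for pixel in row] for row in frame]

# Toy frame: bright background (200) with a dark 3x3 "hand" blob (40).
frame = [[200] * 8 for _ in range(6)]
for r in range(2, 5):
    for c in range(3, 6):
        frame[r][c] = 40

mask = segment_hand(frame)
hand_pixels = sum(sum(row) for row in mask)  # counts the 9 dark hand pixels
```

A real system would follow this mask with blob detection to locate the fingertip, but the threshold step is what the reflective layer makes robust.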
- protection against vandalism can thus be achieved for the camera and the projector. However, if input components are also to be provided on the user interface, then these and in particular their lines to the evaluation unit are potentially exposed to vandalism.
- the object of the invention is to provide an arrangement for interaction which also has components on a surface unit, the electrical connections of which are protected against vandalism.
- the module box has in particular the following components:
- a projector by means of which an image can be represented on the surface unit;
- a camera that captures the surface unit;
- a computer that controls the projector and evaluates the recording of the camera in such a way that a movement or a dwelling of an interaction component on or in front of the surface unit can be interpreted as the functionality of an input pointer.
- This wireless interface is in particular a transmission interface for the transmission of data and / or energy.
- the interaction component is a hand or a finger of a user or a pointer unit.
- the user can make an entry on the surface unit, onto which a user interface (graphical user interface, GUI) is expediently projected by the projector.
- GUI graphics user interface
- the user moves his hand over the user interface, a mouse pointer being controlled by the movement of the hand.
- a predetermined action can be triggered on the user interface, for example, by holding the hand for a predetermined period of time at a specific point on the projection. By this dwelling, the action associated with the projected button is triggered. This corresponds to controlling a user interface with a computer mouse on a conventional personal computer.
- At least one wireless interface is a sound interface.
- the sound interface can preferably be an ultrasound interface.
- the ultrasound interface expediently has an ultrasound transmitter and an ultrasound receiver, which can be implemented on the surface unit and in the module box.
- One embodiment consists of the at least one wireless interface being an optical interface.
- the optical interface can have an infrared transmitter and an infrared receiver.
- the infrared transmitter is preferably an infrared light-emitting diode or an infrared laser diode.
- the at least one wireless interface is an electromagnetic interface.
- the transmitter of the electromagnetic interface can be designed as one of the following components: a) antenna coil, b) slot antenna, c) dipole antenna.
- the receiver of the electromagnetic interface can be designed as one of the following components: a) antenna coil, b) slot antenna, c) dipole antenna.
- a further embodiment consists in that the respective transmitter of the at least one wireless interface is designed in the module box and the respective receiver of the at least one wireless interface in the surface unit.
- correspondingly, the respective transmitter of the at least one wireless interface can be designed in the surface unit and the respective receiver of the at least one wireless interface in the module box.
- an input device is provided in the surface unit.
- the input device can include the following: a) joystick, b) button, c) rocker arm, d) distance-sensitive sensor, e) touch-sensitive sensor, f) pressure-dependent resistance, g) spring plate, h) switch.
- the surface unit has a sensor for converting light energy into electrical energy, the sensor supplying the surface unit with current.
- the sensor for converting light energy into electrical energy can be a solar cell.
- Fig. 1 shows an arrangement for interaction;
- Fig. 2 shows a processor unit;
- Fig. 3 shows an arrangement for interaction which has a surface unit with additional functional units;
- Fig. 4 shows a surface unit with possible input devices.
- Fig. 1 shows how a virtual touch screen works.
- the surface unit is shown in the form of a passive user interface BOF.
- Surface unit is understood in particular to mean the physical expression of the base surface onto which a projection is made.
- the content of the projection is e.g. the BOF user interface.
- the user interface BOF (interaction area, image of a graphical user interface GUI) is mapped to a predeterminable area (surface), here a projection display PD (interaction area).
- the projection display PD replaces a conventional screen. The input is made by pointing directly with the interaction component, here a hand H, at the user interface BOF.
- in Fig. 1 the projection display PD is illuminated with infrared light by the infrared light source IRL.
- the infrared light source IRL can advantageously be designed using infrared light-emitting diodes.
- a camera K, preferably configured with a special infrared filter IRF so that it is particularly sensitive in the infrared spectral range, captures the projection display PD including the hand H of the user.
- the operating surface BOF is imaged on the projection display PD.
- the user interface BOF can be configured as a graphical user interface (GUI) on a monitor of the computer R.
- GUI graphical user interface
- a mouse pointer MZ is moved by the hand H of the user.
- a pointer unit can also be used as the interaction component.
- the hand H is moved to the field F, the mouse pointer MZ follows the hand H.
- if the hand H remains above the field F for a predeterminable period of time, the function associated with field F is triggered on computer R. In conventional systems this corresponds to double-clicking the mouse when the mouse pointer is over a virtual switch.
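For the mouse pointer MZ to follow the hand H as described, the hand position found in the camera image must be mapped into the coordinates of the projected user interface BOF. A minimal sketch, assuming the projection display PD occupies a known axis-aligned rectangle in the camera frame (a real system would calibrate a full homography to handle perspective distortion):

```python
def camera_to_gui(cx, cy, pd_rect, gui_size):
    """Map a hand position in camera pixels to GUI coordinates.

    cx, cy:   hand position detected in the camera image
    pd_rect:  (left, top, right, bottom) of the projection display PD,
              in camera pixels (assumed known from calibration)
    gui_size: (width, height) of the GUI in its own coordinate system
    """
    left, top, right, bottom = pd_rect
    width, height = gui_size
    u = (cx - left) / (right - left)   # normalized position inside PD
    v = (cy - top) / (bottom - top)
    return (u * width, v * height)

# A hand seen at camera pixel (320, 240), inside a PD spanning
# (80, 60)-(560, 420), lands at the centre of a 1024x768 GUI.
pointer = camera_to_gui(320, 240, (80, 60, 560, 420), (1024, 768))
```

The returned coordinates drive the mouse pointer MZ; combining this mapping with the dwell timer yields the complete virtual-touch-screen input path.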
- a processor unit PRZE (computer) is shown in FIG.
- the processor unit PRZE comprises a processor CPU, a memory MEM and an input/output interface IOS, which is used in different ways via an interface IFC: via a graphic interface, output becomes visible on a monitor MON, in particular via a projector, and/or is printed on a printer PRT. Input is made using a mouse MAS or a keyboard TAST.
- the processor unit PRZE also has a data bus BUS, which ensures the connection of a memory MEM, the processor CPU and the input / output interface IOS. Additional components can also be connected to the data bus BUS, for example additional memory, data memory (hard disk), camera, frame grabber, detection unit, input devices, network connections or scanners.
- Fig. 3 shows an arrangement for interaction which has a surface unit, which surface unit comprises additional functional units.
- a module box 111 comprises a computer (processor unit, evaluation unit) 112, a camera (detection unit for waves, preferably for the radiation emitted by IRL) 113, a projector 127, and an infrared light source 128 (source of waves in the non-visible spectral range, preferably infrared light).
- the surface unit 121 comprises a multiplicity of components which, in addition to the contactless interaction described above, also enable a touch-sensitive interaction.
- a laser diode 114 which is provided in the module box, radiates specifically onto a photodiode 120, which is arranged in the surface unit 121.
- Optical energy is thus transmitted from laser diode 114 to the associated sensor (photodiode 120).
- energy or data/signals are transmitted from an optical transmitter 119, for example a light-emitting diode or a laser diode, both of which preferably emit light in the non-visible spectral range, to a photodiode 115 in the module box.
- the energy transmitted, in particular from the module box 111 to the surface unit 121, can be used by converting electrical energy at the receiver to operate an input device or the surface unit itself. Parts of the surface unit can be made interchangeable.
- a joystick 118 represents a signal-generating input device that controls the optical transmitter 119.
- the signals generated by interaction with the joystick 118 are transmitted to the evaluation unit 112 via the duplex connection formed by the transmitters 119, 114 and the receivers 120, 115.
- the transmission takes place without electrical lines; it is thus secure against vandalism and invisible to the user.
- the surface unit 121 is supplied with energy via a solar cell 122.
- energy can be transmitted from the module box 111 to the surface unit 121 in a targeted manner by the laser diode 114 feeding the optical receiver 120, the optical receiver 120 converting the light energy into electrical energy for operating the surface unit.
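The electrical power available from such an optical energy link is the product of the transmitted optical power, the optical coupling to the receiver, and its conversion efficiency. The figures below are illustrative assumptions, not values from the patent:

```python
# Hypothetical power budget for the laser-diode-114 -> receiver-120 link.
laser_power_mw = 50.0          # assumed optical output of the laser diode
coupling = 0.8                 # assumed fraction of light reaching the receiver
conversion_efficiency = 0.25   # assumed optical-to-electrical efficiency

electrical_mw = laser_power_mw * coupling * conversion_efficiency
# With these assumptions, 10 mW would be available to operate the
# surface unit's input electronics.
```

Even a budget of this modest size can suffice for low-power input devices, which is why the patent can dispense with electrical supply lines entirely.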
- FIG. 3 additionally shows a full-duplex transmitter-receiver connection based on ultrasound.
- ultrasonic transmitters 123 and 125 or ultrasonic receivers 124 and 126 are provided.
- a wireless interface of an electromagnetic type is shown in the form of the receiving and transmitting coils 130 and 116.
- a keypad 117 generates signals which are transmitted via the transmission coil 116 from the surface unit 121 to the module box 111, where they are picked up by the reception coil 130.
- the computer 112 handles the decoding of the transmitted signals into predetermined actions.
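The decoding step performed by computer 112 can be sketched as a lookup from received signal codes to predetermined actions. The codes and action names below are hypothetical placeholders, not values from the patent:

```python
# Hypothetical mapping from signal codes received over the coil link
# to the predetermined actions that computer 112 executes.
ACTIONS = {
    0x01: "select",
    0x02: "scroll_up",
    0x03: "scroll_down",
}

def decode(signal_code):
    """Map a received signal code to its predetermined action (None if unknown)."""
    return ACTIONS.get(signal_code)
```

Keeping the mapping on the module-box side means the vandalism-exposed surface unit only ever emits opaque codes, never application logic.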
- FIG. 4 shows a surface unit 216 with possible input devices.
- An area 211 is provided as an information area.
- two light-sensitive input devices (photodiodes) are suitable for registering a particular ambient lighting (optical buttons).
- further shown are a joystick 214, a pushbutton or push switch 217, a projection surface 218, a three-dimensional design of an interaction surface in the form of an elevation 219 and a depression 220, and a surface 221 which is recorded by the camera 113 of Fig. 3.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
- Transforming Electric Information Into Light Information (AREA)
- Projection Apparatus (AREA)
Abstract
Description
Claims
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE19951322.8 | 1999-10-25 | ||
DE1999151322 DE19951322A1 (de) | 1999-10-25 | 1999-10-25 | Anordnung zur Interaktion |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2001031424A2 true WO2001031424A2 (de) | 2001-05-03 |
WO2001031424A3 WO2001031424A3 (de) | 2001-11-29 |
Family
ID=7926770
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/DE2000/003716 WO2001031424A2 (de) | 1999-10-25 | 2000-10-20 | Anordnung zur interaktion |
Country Status (2)
Country | Link |
---|---|
DE (1) | DE19951322A1 (de) |
WO (1) | WO2001031424A2 (de) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN1661538B (zh) * | 2004-02-27 | 2010-05-05 | 三星电子株式会社 | 用于具有触摸屏的终端的指示设备和使用该设备的方法 |
Families Citing this family (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE20216904U1 (de) | 2002-11-02 | 2003-01-02 | MAN Roland Druckmaschinen AG, 63075 Offenbach | Dateneingabe für eine Druckmaschine |
DE10260305A1 (de) | 2002-12-20 | 2004-07-15 | Siemens Ag | HMI Einrichtung mit einem optischem Touch Screen |
DE102005001417B4 (de) | 2004-01-29 | 2009-06-25 | Heidelberger Druckmaschinen Ag | Projektionsflächenabhängige Anzeige-/Bedienvorrichtung |
DE102008046092A1 (de) * | 2007-09-06 | 2009-09-03 | Bernd Hopp | Endloser Navigator |
DE102011119082A1 (de) * | 2011-11-21 | 2013-05-23 | Übi UG (haftungsbeschränkt) | Vorrichtungsanordnung zur Schaffung eines interaktiven Bildschirms aus einem Bildschirm |
DE102016224260A1 (de) * | 2016-12-06 | 2018-06-07 | Bayerische Motoren Werke Aktiengesellschaft | Anwenderschnittstelle, Fortbewegungsmittel und Verfahren zur Eingabe von Informationen in ein Fortbewegungsmittel |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE19708240A1 (de) * | 1997-02-28 | 1998-09-10 | Siemens Ag | Anordnung zur Detektion eines Objekts in einem von Wellen im nichtsichtbaren Spektralbereich angestrahlten Bereich |
DE19734511A1 (de) * | 1997-08-08 | 1999-02-11 | Siemens Ag | Kommunikationseinrichtung |
DE19806021A1 (de) * | 1998-02-13 | 1999-08-19 | Siemens Nixdorf Inf Syst | Gerät mit virtueller Eingabeeinrichtung |
WO2000055802A1 (de) * | 1999-03-17 | 2000-09-21 | Siemens Aktiengesellschaft | Anordnung zur interaktion |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5528263A (en) * | 1994-06-15 | 1996-06-18 | Daniel M. Platzker | Interactive projected video image display system |
DE29804165U1 (de) * | 1998-03-09 | 1998-05-07 | Scm Microsystems Gmbh | Vorrichtung zur peripheren Datenkommunikation |
-
1999
- 1999-10-25 DE DE1999151322 patent/DE19951322A1/de not_active Withdrawn
-
2000
- 2000-10-20 WO PCT/DE2000/003716 patent/WO2001031424A2/de active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2001031424A3 (de) | 2001-11-29 |
DE19951322A1 (de) | 2001-04-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A2 Designated state(s): JP US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A2 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DFPE | Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101) | ||
AK | Designated states |
Kind code of ref document: A3 Designated state(s): JP US |
|
AL | Designated countries for regional patents |
Kind code of ref document: A3 Designated state(s): AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2000987005 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: JP |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 2000987005 Country of ref document: EP |
|
122 | Ep: pct application non-entry in european phase |