NL2004333C2 - Method and apparatus for touchlessly inputting information into a computer system.
- Publication number
- NL2004333C2 (application NL2004333A)
- Authority
- NL
- Netherlands
- Prior art keywords
- pointing
- planes
- plane
- faces
- interface object
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
- G06F3/0304—Detection arrangements using opto-electronic means
Description
Title: Method and apparatus for touchlessly inputting information into a computer system
Field of the invention

The present invention relates to a method for touchlessly inputting information into a computer system, and to an input apparatus implementing said method.
Background

Many of today's computer systems are equipped with a graphical user interface (GUI) through which a user can interact with them. A user may be enabled to select items or objects presented to him by the GUI, for example on a video display, by a variety of input means.
Common input means include a trackball type pointing device, e.g. a computer mouse. A user may place his hand on the pointing device so as to move it and to correspondingly displace a position indicator, e.g. a cursor, across the display. The position indicator typically indicates the position of the display that will respond to further input from the pointing device or another input means. Once a cursor has been moved over an item presented on the display, the user may press a button on the pointing device to confirm a selection. Other input means seeing increasing adoption are touch screens and touch panels. Relative to the aforementioned pointing devices, they are considered to be more user-friendly and natural. This is primarily because touch screens enable a user to directly interact with any information presented to them, rather than indirectly via a pointing device. In addition, touch screens render separate and/or intermediate pointing devices, such as pointing sticks or pens, superfluous.
As computer systems become increasingly ubiquitous, the demand for operating systems through which a computer system may be operated in a natural and intuitive manner, and without special hardware such as pointing devices, increases. A case in point is head-up display systems used in the automotive industry. Such display systems may run a GUI that displays a variety of information, e.g. speedometer and night vision data, on a windshield of an automobile. This allows a user/driver to take notice of the information without having to take his eyes off the road.
Providing an operating system that enables natural interaction between the driver and the GUI, however, presents a challenge. On the one hand, trackball type pointing devices are considered neither user-friendly nor conducive to natural interaction. Touch screens, on the other hand, are not an obvious choice either. This is because a capacitive or resistive touch panel is preferably implemented as an overlay, present immediately in front of a display to be operated. In most cars, windshields tend to extend away from the driver. Though the driver may be able to point at objects presented to him by the GUI, it may typically be cumbersome for him to reach forward to actually touch the windshield.
It is an object of the present invention to overcome or mitigate this problem associated with known operating systems, and more particularly, to provide an economical input apparatus that enables a user to interact with, and input coordinates into, an associated computer system.
Summary of the invention
One aspect of the present invention is directed to an input apparatus for touchlessly inputting information into a computer system by pointing a pointing member at an interface object of said system. The apparatus includes monitoring means configured to monitor, within a region of interest, a plurality of planes for intersections by a pointing member, said plurality of planes comprising at least a first plane and a second plane. The apparatus further includes a controller that is operably connected to the monitoring means, and that is configured to detect, based on reference signals outputted by the monitoring means, intersections of said planes by the pointing member. Upon (simultaneous) detection of an intersection of both the first and the second plane, the controller is configured to determine first spatial coordinates at which the pointing member intersects the first plane and second spatial coordinates at which the pointing member intersects the second plane. The controller is also configured to determine from said first and second spatial coordinates a position and a pointing direction of the pointing member, and to determine from the determined position and pointing direction of the pointing member, and from supplementary information relating to the location of the interface object, coordinates of a target point on the interface object at which the pointing member points.
The apparatus according to the present invention provides for a way of touchlessly inputting information into a computer system, e.g. a board computer with a head-up display as described above, a portable navigation system or an industrial machine control system, in particular by pointing a pointing member, e.g. a human finger, at an interface object of said system, e.g. a video display. The pointing action of the pointing member is registered with the aid of a plurality of optically or sonically monitored planes, which may typically be located somewhere in front of the interface object, for example immediately adjacent thereto or at a distance therefrom. A first and second plane of the plurality of planes are employed in particular for assessing both a position and a direction of the pointing member as it points at the interface object. On the basis of these parameters, supplemented by information about the location (including information about spatial distribution or shape) of the interface object, the coordinates of the point on the interface object at which the pointing member points (i.e. the target point) may be determined. These latter coordinates may be passed on/communicated to a computer running an application program, for example one that provides a GUI on the interface object, in order to enable it to respond to the user's input, for example by moving a position indicator to the target point.
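By way of a non-authoritative sketch of the data flow just described, the following Python fragment wires hypothetical monitoring means, controller logic and an application callback together; all names (read_intersections, compute_target_point, on_target) are assumptions made for illustration and do not appear in the patent. The geometric computation itself is sketched further on, in the detailed description.

```python
from typing import Callable, Optional, Sequence, Tuple

Point3D = Tuple[float, float, float]

class InputApparatus:
    """Illustrative wiring of monitoring means, controller logic and an application callback."""

    def __init__(self,
                 read_intersections: Callable[[], Sequence[Optional[Point3D]]],
                 compute_target_point: Callable[[Point3D, Point3D], Optional[Point3D]],
                 on_target: Callable[[Point3D], None]) -> None:
        self._read_intersections = read_intersections      # reference signals from the monitoring means
        self._compute_target_point = compute_target_point  # geometry: (P1, P2) -> target point
        self._on_target = on_target                        # e.g. move a GUI position indicator

    def poll(self) -> None:
        planes = self._read_intersections()   # one entry per monitored plane, None if not intersected
        p1, p2 = planes[0], planes[1]         # first and second plane
        if p1 is None or p2 is None:
            return                            # both planes must be intersected simultaneously
        target = self._compute_target_point(p1, p2)
        if target is not None:
            self._on_target(target)           # hand the target point to the application program
```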
It is understood that the input apparatus according to the present invention interprets a pointing gesture, e.g. an extended index finger, in accordance with its normal meaning as it seeks to identify the location that is pointed at. Accordingly, it provides for a natural way of inputting coordinate information, which is especially suitable for use in situations where it is desirable to control a computer system's user interface from a distance and/or without touching the associated interface object.
The aforementioned planes may be monitored using electromagnetic radiation (EMR) in a part of the electromagnetic spectrum invisible to the human eye, e.g. infrared (IR). Alternatively, monitoring may take place using (ultra)sonic signals. Hence the planes may be fully transparent and permeable, and in fact imperceptible to humans unless indicated by indicating means present in or at the periphery of the planes. Such indicating means may for example include light emitting devices that emit light in a visible part of the electromagnetic spectrum, and within one or more of the planes, such that a pointing member that extends through said plane(s) is partly illuminated at the location of intersection of said plane(s), thereby providing the user with feedback about the location of the plane(s). It is noted that EMR monitoring means configured to monitor a single plane in space are known in the art, for example from the field of electronic (white)boards. In such boards a monitored plane may typically be present immediately adjacent and parallel to a writing surface of the board. Using a number of such single-plane monitoring means in parallel provides for an economical way of monitoring multiple parallel planes as proposed by the present invention. Technical alternatives, such as camera control systems based on stereoscopic imaging, do exist but are typically difficult and relatively costly to implement.
While a first and a second plane may be used in particular to ascertain the coordinates of a target point on the interface object, any number of desired further planes may be used to provide for additional functionality.
The plurality of planes may for example comprise a third plane that is located between, on the one hand, the first and second planes, and on the other hand, the interface object. Intersections of such a third plane may be recorded and communicated to an application program running on a computer connected to the input apparatus. The application program may be capable of executing a range of different actions, and assign to each of these actions, by way of activating command, a specific intersection pattern. A single, brief intersection of the third plane, for example, may be assigned the meaning of a selection-confirmation command, comparable to a single mouse click on conventional home computer systems or a single tap on a touch screen. Alternatively, such a selection-confirmation command may be linked to another intersection pattern, for example including two repetitive intersections within a particular time interval. An application program may in principle assign any intersection pattern of any of the plurality of planes to a specific action it is capable of executing.
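Purely as an illustration of such an assignment of intersection patterns to actions, the sketch below classifies a burst of third-plane intersections into either a single-tap or a double-tap style command; the 0.4 s pairing window and the command names are assumed values, not taken from the patent.

```python
from typing import List

def classify_intersections(timestamps: List[float], double_interval: float = 0.4) -> str:
    """Classify a burst of brief third-plane intersections into a command (illustrative).

    timestamps: increasing times (in seconds) at which the third plane was briefly
    intersected during one interaction; double_interval is an assumed pairing window.
    """
    if len(timestamps) >= 2 and timestamps[1] - timestamps[0] <= double_interval:
        return "double-select"   # two repetitive intersections within the time interval
    if len(timestamps) >= 1:
        return "select-confirm"  # single brief intersection: like a single mouse click or tap
    return "none"                # no classifiable intersection pattern
```

A real application program would map these labels onto whatever actions it is capable of executing, and could of course define further patterns (longer holds, more repetitions) in the same way.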
Another aspect of the present invention relates to a method for touchlessly inputting information into a computer system by pointing a pointing member at an interface object of said system. The method includes defining in a region of interest a plurality of planes, said plurality of planes comprising at least a first plane and a second plane. The method also includes monitoring said planes for intersections by the pointing member, and upon detecting an intersection of both the first and the second plane, determining first spatial coordinates at which the pointing member intersects the first plane and second spatial coordinates at which the pointing member intersects the second plane. The method further includes determining from said first and second spatial coordinates a position and a pointing direction of the pointing member, and determining from the determined position and pointing direction of the pointing member, and from information relating to the location of the interface object, coordinates of a target point on the interface object at which the pointing member points.
These and other features and advantages of the invention will be more fully understood from the following detailed description of certain embodiments of the invention, taken together with the accompanying drawings, which are meant to illustrate and not to limit the invention.
Brief description of the drawings
Fig. 1 schematically illustrates an exemplary embodiment of a computer system including an input apparatus according to the present invention, and shows inter alia the intersection of two parallel planes by an index finger of a user's hand; and
Figs. 2A-D schematically illustrate the intersection of the planes by the index finger as shown in Fig. 1, in a perspective front view, a perspective diagonal-front view, a perspective top view and a perspective side view, respectively.
Detailed description
Fig. 1 schematically illustrates a computer system of a general type. It includes an interface object 10, which is operably connected to and controlled by a computer 8. The computer 8, in turn, is operably connected to an input apparatus 1 according to the present invention comprising monitoring means 2, 4, 6 and a controller (not shown). The different components of the computer system will be briefly elucidated in turn.
The interface object 10 may typically include a video display capable of presenting variable information content to a user. The term video display is to be construed broadly, and is intended to include any type of device that is suited for the purpose of visual information transfer. However, although the interface object may generally be described as a part of the computer system, it need not be an active or controllable part thereof. In fact, it may be any passive object, as long as information about its location/spatial distribution is known to the controller/computer, so as to allow it to extrapolate the pointing direction of the pointing member towards the object in order to determine the coordinates of the target point on the object. The interface object may, for example, be a blind wall that is to be painted, while the computer system including the input apparatus 1 according to the present invention is a spray gun control system, configured to control the aiming and firing of a spray gun that will paint the wall.
The information content presented to a user via the interface object 10 may be controlled by the computer 8. The computer 8 may include a processor for running an application program that provides for a GUI, which is presented to the user via the interface object 10. User input for the application program may be provided via the monitoring means 2, 4, 6.
The monitoring means 2, 4, 6 may be configured to monitor a plurality of planes for the presence of a pointing member 14, e.g. an index finger of a user's hand 12. In Fig. 1, only a first plane V1 and a second plane V2 are shown for clarity. The planes V1, V2 may extend in parallel, but this need not always be the case. Intersections between different planes V1, V2, however, may preferably be avoided. In case the planes V1, V2 do extend in parallel, a perpendicular distance between them is preferably smaller than a length of the pointing member, such that the pointing member may extend through (and thus operate) both planes at the same time. The planes V1, V2 may be disposed immediately adjacent the interface object 10, for example within a frame or bezel surrounding a video display thereof. Alternatively, as in Fig. 1, the planes may be located at a distance from the interface object 10. In either case, information may be inputted by intersecting the planes with a pointing member and without touching the interface object 10.
In one embodiment, the monitoring means 2, 4, 6 may be equipped with an embedded controller adapted to process any acquired monitoring data pertaining to intersections of the planes V1, V2. In general, the controller may be configured to assess which plane is intersected at what time, and at what spatial coordinates. As far as the first V1 and second V2 planes are concerned, the controller may in particular be adapted to process said data to extract therefrom a position and a pointing direction R of the pointing member 14, and to determine, using these parameters, the spatial or interface/display coordinates of a target point 11 on the interface object 10 at which the pointing member 14 points. The processed data, including for example the latter coordinates of the target point, may be outputted to the computer 8. Alternatively, the monitoring means 2 may pass on all or part of the acquired monitoring data to the computer 8 for processing and extraction of the respective parameters. Hence, in this latter case, the computer serves as the controller.
The monitoring means 2, 4, 6 may be implemented in a variety of ways. As in the embodiment depicted in Fig. 1, the monitoring means 2, 4, 6 may for example comprise a planar, rectangular frame 2 laterally bounding a region of interest. Infrared beam generators, including for example infrared light emitting diodes (LEDs) 4, may be disposed on two adjoining inner edges of the frame 2, while infrared beam sensors, such as infrared photosensors 6, may be placed on the opposite inner edges. The LEDs 4 and corresponding photosensors 6 may be arranged such that they create two or more substantially parallel 2D grids of light beams across the frame 2. In the embodiment of Fig. 1, two such grids are created; they extend in the first plane V1 and second plane V2, respectively. A pointing member 14 that extends through the frame 2 may interrupt one or more of the light beams in each of the planes V1, V2, and cause a decrease in measured light intensity at the respective photosensors 6. Reference signals outputted by the photosensors 6 may be communicated to the aforementioned controller, and be used to determine the spatial coordinates of the positions at which the pointing member 14 intersects the planes V1, V2. From these spatial coordinates, both a position and a pointing direction R of the pointing member 14 may be extracted, as will be discussed in some more detail below.
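A rough sketch of how reference signals from such a beam grid might be reduced to intersection coordinates for one plane is given below; the sensor pitch, the intensity normalization and the threshold are assumptions made for illustration and are not specified by the patent.

```python
from typing import Optional, Sequence, Tuple

def plane_intersection(col_intensities: Sequence[float],
                       row_intensities: Sequence[float],
                       pitch_mm: float = 5.0,
                       threshold: float = 0.5) -> Optional[Tuple[float, float]]:
    """Estimate where a pointing member crosses one monitored plane (illustrative).

    col_intensities / row_intensities: normalized light levels (1.0 = unobstructed)
    measured by the photosensors opposite the column and row LEDs respectively.
    pitch_mm and threshold are assumed values.
    Returns (x, y) in millimetres within the frame, or None if nothing is detected.
    """
    blocked_cols = [i for i, v in enumerate(col_intensities) if v < threshold]
    blocked_rows = [i for i, v in enumerate(row_intensities) if v < threshold]
    if not blocked_cols or not blocked_rows:
        return None
    # The centroid of the occluded beams approximates the centre of the pointing member
    x = pitch_mm * sum(blocked_cols) / len(blocked_cols)
    y = pitch_mm * sum(blocked_rows) / len(blocked_rows)
    return (x, y)
```

Running such a reduction once per plane yields in-plane coordinate pairs for the two intersection points; the fixed Z-coordinates of the planes then follow from where the planes are mounted, as discussed with reference to Figs. 2A-D below.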
In another embodiment of the input apparatus 1, infrared beam generators associated with at least one of the plurality of planes V1, V2 may be configured to generate infrared scanning beams that rotatively scan said at least one plane. The infrared beam sensors may detect any directly incident light beams and/or light beams reflected off inner edges of the frame 2 and/or off a pointing member 14, and infer from an absence of expected incident light, a duration of absence and/or a decreased light intensity of incident light whether and where a pointing member 14 is present. Accordingly, a position and a pointing direction of a pointing member 14 extending through the frame 2 may be determined.
One skilled in the art will appreciate that the above implementations of the monitoring means 2, 4, 6 are merely exemplary and susceptible to various modifications. In an alternative embodiment, for example, the optical monitoring means may be replaced with sonic monitoring means. In another alternative embodiment, the infrared-based optical components may be replaced with components that operate in a different part of the electromagnetic spectrum.
Furthermore, it is noted that the number of planes to be monitored by the monitoring means 2, 4, 6 may be chosen as desired. Two planes V1, V2 are considered a minimum for retrieving information about the orientation of the pointing member 14. However, further planes may enable the input apparatus 1 to provide more detailed, three-dimensional information about the position and orientation of the pointing member 14. For example, a third plane that is substantially parallel to the first V1 and second V2 planes, and that is located between, on the one hand, the first and second planes V1, V2, and on the other hand, the interface object 10, may be added. The monitoring means 2, 4, 6 may monitor this third plane for intersections by the pointing member 14 and output data regarding any intersections to the computer 8.
An application program running on the computer 8 may interpret an intersection, or a certain sequence of intersections, of the third plane as a special type of user input. A single intersection may for example be given the meaning of a confirmation of a selection of a graphical item presented on the video display. Likewise, two or more intersections within a certain period of time and/or of a certain duration may be coupled to a specific function/action of the application program.
Figs. 2A-D schematically illustrate the intersection of the planes V1, V2 by the index finger 14, as shown in Fig. 1. Fig. 2A is a perspective front view of the situation depicted in Fig. 1, seen from the point of view of the user; Fig. 2B is a perspective diagonal-front view; Fig. 2C is a perspective top view, and Fig. 2D is a perspective side view.
From Figs. 2A-D, it is clear that the index finger 14 intersects both planes V1, V2 at different spatial positions P1, P2. The spatial coordinates of both P1 and P2 may be recorded relative to any suitable coordinate system.
To this end, Figs. 2A-D illustrate a three-dimensional Cartesian XYZ coordinate system, in which the planes V1 and V2 extend substantially perpendicularly to a Z-axis. Accordingly, any intersection of the first plane V1 occurs at a first Z-coordinate, e.g. Z1, while any intersection of the second plane V2 occurs at a second Z-coordinate, e.g. Z2. Furthermore, the planes V1 and V2 are oriented parallel to the interface object 10, which may be located at its own Z-coordinate, e.g. Zinterface. Intersection point P1 may thus be accorded spatial coordinates (X1, Y1, Z1), while intersection point P2 may be accorded spatial coordinates (X2, Y2, Z2). Any of the points P1, P2 may lend its spatial coordinates to the position of the pointing member 14, while the pointing direction R may be defined as the vectorial difference between the spatial coordinates of P1 and P2, e.g. as (X2-X1, Y2-Y1, Z2-Z1). From the position of the pointing member 14 and the pointing direction R, the spatial coordinates of the target point 11 on the interface object, (X', Y', Zinterface), may be determined by solving for the point at which a P1- or P2-based pointing line 16 (cf. Fig. 1), extending in the direction of vector R, intersects a Z-plane with coordinate Zinterface. Thus determined spatial coordinates of the target point 11 may be mapped onto corresponding video display coordinates, so as to allow a GUI of an application program run by computer 8 to provide for appropriate feedback at the target point. Such feedback may, for example, include moving a position indicator or cursor to the target point.
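The extrapolation just described amounts to a simple line-plane intersection. The sketch below assumes, as in the figures, that both monitored planes and the interface object are perpendicular to the Z-axis; the function and variable names are illustrative, not taken from the patent.

```python
from typing import Optional, Tuple

Point3D = Tuple[float, float, float]

def target_point(p1: Point3D, p2: Point3D, z_interface: float) -> Optional[Point3D]:
    """Extrapolate the pointing line through P1 and P2 to the plane Z = z_interface.

    Returns (X', Y', z_interface), or None when the pointing direction has no
    Z-component (pointing member held parallel to the interface object).
    """
    (x1, y1, z1), (x2, y2, z2) = p1, p2
    rx, ry, rz = x2 - x1, y2 - y1, z2 - z1    # pointing direction R = P2 - P1
    if rz == 0:
        return None
    t = (z_interface - z2) / rz               # parameter along the pointing line, starting from P2
    return (x2 + t * rx, y2 + t * ry, z_interface)
```

For example, with P1 = (0, 0, 0), P2 = (1, 1, 10) and z_interface = 30, the function returns (3, 3, 30). Mapping the resulting (X', Y') onto pixel coordinates of the video display would be a separate, display-specific transform.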
Above, various implementations of a computer system in conjunction with which the input apparatus, or the corresponding method, according to the present invention can be used have already been mentioned in passing. Indeed, particularly advantageous implementations are foreseen in the automotive industry, including the aforementioned head-up displays. It is understood, however, that the input apparatus may in principle be combined with any computer system, including wireless terminals such as PDAs, in which it may be used to mimic touch screen functionality, or audio- and/or television systems, in which the input device may serve as a true remote control. Another example underlining the versatility of the input apparatus is the following. Imagine a conference room featuring a U-shaped table setup for participants, wherein a video display is provided at the head (i.e. at the open end of the U-configuration of tables). First and second, typically vertically extending, parallel planes may then be provided along and over the entire U-shaped table configuration. This allows a participant, independent of the position of his seat, to extend his arm or hand across his table and towards the video display so as to point out some piece of information being shown thereon. By doing so, he would intersect the first and second parallel planes. From the points of intersection, the input apparatus according to the invention can derive a position and a direction of the pointing arm/hand, and determine coordinates of the target point on the video display that the participant points at. An associated computer running a presentation application program may then provide feedback on the video display, for example in the form of a position indicator or a highlight effect, indicating to the other participants the aforementioned piece of information.
Although illustrative embodiments of the present invention have been described above, in part with reference to the accompanying drawings, it is to be understood that the invention is not limited to these embodiments. Variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. Reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, it is noted that particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner to form new, not explicitly described embodiments.
List of elements

1 coordinate inputting apparatus
2 frame with LEDs and photosensors
4 LEDs
6 photosensors
8 computer
10 interface object / video display
11 target point
12 human hand
14 extended index finger / pointing member
16 line-of-pointing
V1 first plane
V2 second plane
P1 intersection point of pointing member with first plane V1
P2 intersection point of pointing member with second plane V2
R pointing direction of pointing member
X1, Y1 XY-coordinates relating to first plane
X2, Y2 XY-coordinates relating to second plane
X', Y' XY-coordinates relating to interface object
Claims (13)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2004333A NL2004333C2 (en) | 2010-03-03 | 2010-03-03 | Method and apparatus for touchlessly inputting information into a computer system. |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NL2004333 | 2010-03-03 | ||
NL2004333A NL2004333C2 (en) | 2010-03-03 | 2010-03-03 | Method and apparatus for touchlessly inputting information into a computer system. |
Publications (1)
Publication Number | Publication Date |
---|---|
NL2004333C2 true NL2004333C2 (en) | 2011-09-06 |
Family
ID=42830199
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
NL2004333A NL2004333C2 (en) | 2010-03-03 | 2010-03-03 | Method and apparatus for touchlessly inputting information into a computer system. |
Country Status (1)
Country | Link |
---|---|
NL (1) | NL2004333C2 (en) |
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020140633A1 (en) * | 2000-02-03 | 2002-10-03 | Canesta, Inc. | Method and system to present immersion virtual simulations using three-dimensional measurement |
US20080030460A1 (en) * | 2000-07-24 | 2008-02-07 | Gesturetek, Inc. | Video-based image control system |
DE20112715U1 (en) * | 2001-08-01 | 2001-10-11 | Irschitz Oliver | Information entry facility |
US20030063115A1 (en) * | 2001-09-10 | 2003-04-03 | Namco Ltd. | Image generation method, program, and information storage medium |
DE102007038359A1 (en) * | 2007-08-10 | 2009-02-12 | Visumotion Gmbh | User interface for determining spatial position of section of object, has two consecutively arranged position detectors, for detection of two dimensional positions, where electronic circuit is provided, which receives output signals |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
20131001 | V1 | Lapsed because of non-payment of the annual fee | Effective date: 20131001 |