US20120056804A1 - Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications - Google Patents
Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications

Info
- Publication number
- US20120056804A1 (application US13/295,340)
- Authority
- US
- United States
- Prior art keywords
- user
- motion
- manipulated
- physical object
- descriptive
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
  - G06—COMPUTING; CALCULATING OR COUNTING
    - G06F—ELECTRIC DIGITAL DATA PROCESSING
      - G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
        - G06F1/16—Constructional details or arrangements
          - G06F1/1613—Constructional details or arrangements for portable computers
            - G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
            - G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
              - G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
                - G06F1/169—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
      - G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
        - G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
          - G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
          - G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
          - G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
            - G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
              - G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
              - G06F3/0354—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
Definitions
- FIG. 10B shows a block diagram of an exemplary device 50 having a display 52 that is capable of recording an image of the user's fingertip(s), such as the images depicted in FIGS. 8 and/or 10A.
- the display 52 in this case may be one constructed in accordance with the Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) display device technology referenced above. Note that advanced scanning (e.g., of text and bar codes) can also be accomplished. In other embodiments a separate camera or cameras may be provided so as to image the user's finger(s)/hand(s), such as through the transparent surface of the display 52.
- the display 52 simultaneously captures images of the user's fingertips at five discrete locations on the surface of the display 52 .
- this particular pattern may be interpreted as being one particular gesture, whereas the presence of four fingertips (e.g., not the thumb) may be interpreted as being another particular gesture.
- the spacing between five or fewer fingertips may be varied to encode a plurality of different gestures, as can differences in angular orientations of the fingertips one to another.
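By way of illustration only (this sketch is not from the patent; the threshold and classification scheme are assumptions), a recognizer could distinguish such gestures from the fingertip count and their mean pairwise spacing:

```python
import math

def classify_hand_pose(tips):
    """tips: detected fingertip (x, y) positions in one captured image.
    Classifies by fingertip count and mean pairwise spacing (threshold assumed)."""
    n = len(tips)
    if n < 2:
        return f"{n}-fingertip gesture"
    pairs = [(a, b) for idx, a in enumerate(tips) for b in tips[idx + 1:]]
    mean_gap = sum(math.dist(a, b) for a, b in pairs) / len(pairs)
    spread = "spread" if mean_gap > 40 else "closed"   # pixels, assumed scale
    return f"{n}-fingertip {spread} gesture"

print(classify_hand_pose([(10, 10), (30, 12), (52, 18), (70, 30), (85, 48)]))
```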
- the program 18A may be adapted to execute a process in accordance with the logic flow diagram shown in FIG. 9 (see also FIG. 6 of the above-referenced publication 59.3, A. Abileah et al., “Integrated Optical Touch Panel in a 14.1″ AMLCD”).
- the tips of the fingers are extracted from the captured image and the extraction results are recorded. Based on these records, the system decides whether to begin the recognition process. Regardless of whether the recognition process begins, the system also needs to determine whether and when to delete stored records (this may be timer based). Whenever a new image is captured, all or at least some of these steps are repeated.
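As a minimal sketch of this per-image loop (all helper functions and the timer threshold below are hypothetical placeholders, not part of the patent disclosure):

```python
import time

# Hypothetical stand-ins for the device's capture, extraction and
# recognition stages (names are illustrative only).
def capture_image(): return None
def extract_fingertips(image): return []            # [(x, y, touching), ...]
def should_start_recognition(records): return False
def recognize(records): return "check_mark"
def execute(command): print("execute:", command)

RECORD_LIFETIME_S = 1.0   # assumed timer threshold for stale records
records = []              # one extraction result per captured image

def process_next_image():
    tips = extract_fingertips(capture_image())
    records.append((time.monotonic(), tips))        # record extraction results
    if should_start_recognition(records):           # decide whether to recognize
        execute(recognize(records))
        records.clear()                             # trace consumed by recognition
    else:                                           # timer-based record deletion
        now = time.monotonic()
        records[:] = [r for r in records if now - r[0] < RECORD_LIFETIME_S]
```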
- the fingertip in the captured image can be considered as an object described by data expressive of, as non-limiting examples, a center of gravity, a bounding edge, and a different brightness than the background.
- there are many image segmentation methods that may be used for fingertip image extraction.
- One exemplary and non-limiting segmentation method is the Watershed method.
- the Watershed is a function that applies a morphological watershed operator to an image (typically a grayscale image).
- the watershed operator segments images into watershed regions and their boundaries. Considering the gray scale image as a surface, each local minimum can be thought of as the point to which water falling on the surrounding region drains. The boundaries of the watersheds lie on the tops of the ridges. The operator labels each watershed region with a unique index, and sets the boundaries to zero.
- morphological gradients, or images containing extracted edges, are used as input to the watershed operator. Noise and small unimportant fluctuations in the original image can produce spurious minima in the gradients, which can lead to oversegmentation.
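One way such a pipeline might look in practice is sketched below with scikit-image; the sigma, the marker thresholds, and the assumption of bright fingertips on a dark background are illustrative, and smoothing is applied first to suppress the spurious gradient minima mentioned above:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.filters import gaussian, sobel
from skimage.segmentation import watershed

def segment_fingertips(gray: np.ndarray) -> np.ndarray:
    """Label candidate fingertip regions in a grayscale image in [0, 1]."""
    smoothed = gaussian(gray, sigma=2)      # smoothing limits oversegmentation
    gradient = sobel(smoothed)              # edge/gradient image as watershed input
    markers = np.zeros_like(gray, dtype=int)
    markers[smoothed < 0.2] = 1             # background seed (threshold assumed)
    markers[smoothed > 0.6] = 2             # bright fingertip seed (assumed)
    regions = watershed(gradient, markers)  # watershed regions and boundaries
    labels, _ = ndi.label(regions == 2)     # one integer label per fingertip
    return labels
```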
- a set with three members can be used to represent the state of one fingertip: two for the coordinates of the tip and one to represent whether it touches the surface or not (touch state).
- a stack or queue is a suitable data structure for recording the coordinates when the finger tip touches the surface.
- a timer or counter may be used to record when the finger tip leaves the surface.
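A minimal sketch of that bookkeeping (names and types assumed) could pair the three-member state with a queue of touch records and a lift timer:

```python
import time
from collections import deque
from dataclasses import dataclass

@dataclass
class FingertipState:
    x: float          # tip coordinate
    y: float          # tip coordinate
    touching: bool    # touch state: does the tip contact the surface?

trace: deque = deque()   # coordinates recorded while the tip touches the surface
lift_time = None         # timer/counter: when the tip left the surface

def update(state: FingertipState) -> None:
    global lift_time
    if state.touching:
        trace.append((state.x, state.y))
        lift_time = None
    elif lift_time is None:
        lift_time = time.monotonic()   # start timing once the tip leaves
```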
- the task of gesture recognition in accordance with the exemplary embodiments of this invention is to select the correct command/operation from a set of candidates, according to the input gesture.
- the conditions for starting the gesture recognition step may depend on the content of the set. For example, if only the X mark and check mark (see FIGS. 8B, 8C) are included in the set, the condition can be set as a threshold on the number of consecutive images which do not contain fingertips. If the zoom in/out gestures are added to the set, a new condition, when two fingertips are detected in one image (see FIG. 8D), can be used to initiate the gesture recognition process.
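Those two start conditions could be expressed as follows (the frame threshold is an assumed value):

```python
NO_FINGER_FRAMES = 5   # assumed threshold of consecutive fingertip-free images

def should_start_recognition(tip_counts_per_image: list) -> bool:
    """tip_counts_per_image: number of fingertips found in each recent image."""
    if not tip_counts_per_image:
        return False
    tail = tip_counts_per_image[-NO_FINGER_FRAMES:]
    # Condition for sets holding only the X mark and check mark:
    if len(tail) == NO_FINGER_FRAMES and all(n == 0 for n in tail):
        return True
    # Additional condition once zoom in/out gestures are in the set:
    return tip_counts_per_image[-1] == 2
```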
- there are many different pattern recognition methods that may be employed for gesture recognition. For example, one based on statistical methods may be used, as such methods are inherently robust. Normalization and/or smoothing techniques may be included as part of the gesture recognition process.
- the ability to record the states of fingertip images facilitates gesture recognition.
- these records should be deleted when they are not useful.
- the records indicating the trace of the fingertip can be deleted as soon as the trace is recognized as a check mark (see FIG. 8C ).
- the trace may be deleted, preferably, only after the fingertips have left the surface of the display 52 .
- an aspect of this invention is the sequential creation of individual ones of a plurality of records, where individual ones of the plurality of records comprise data descriptive of a location of the user-manipulated physical object at a corresponding point in time while the gesture is executed.
- the DP 16 may be any type of suitable data processor, such as one embodied within an integrated circuit, and the memory 18 may be any type of suitable data storage device, including semiconductor-based and magnetic disk- or tape-based data storage devices.
- in FIG. 8 is depicted the “image” recorded by the image-capable display 52 of the user's finger(s) in contact with the top surface of the display 52 when making the corresponding gesture.
- the feature labeled as 40 in FIG. 8 represents the current location (current image) of the user's finger tip(s), while the feature labeled as 42 represents the prior images made during motion of the user's finger tip(s), i.e., the finger tip trace that was referred to above.
- the arrows generally indicate the direction of motion of the finger tip 40 .
- the use of the display 52 can provide for one finger and multiple finger-based gestures to be recorded and processed in accordance with the exemplary embodiments of this invention. Several non-limiting examples are now provided.
- 1. Gesture: … Attributed command: Browsing/Scrolling/Listing applications
- 2. Gesture: Subsequent tapping by a single finger (Tap 1-Tap 1 . . . ). Attributed command: Activate device/phone; Run/Execute pre-selected option
- 3. Gesture: Finger stays motionless (over a certain time threshold) above some object/icon. Attributed command: Select the object/icon
- 4. Gesture: Finger stays above some item/object/icon, followed by slow movement. Attributed command: Select the item/object/icon till the end of the move
- 5. Gesture: Crossed perpendicular lines (X mark, see FIG. 8B). Attributed command: Delete
- 6. Gesture: Perpendicular moving breach (check mark, see FIG. 8C). Attributed command: Acceptance & Verification
- 7. Gesture: Enclosed curve around items/icons to be selected. Attributed command: Select group of items/icons
- 8. Gesture: Linear approaching/digression (fingers approach, then move apart, and vice versa, see FIG. 8D). Attributed command: Zoom-In/Out, Size adjustment
- 9. Gesture: Simultaneous touching of an icon/object by two fingers. Attributed command: Select the icon/object ready for size adjustment
- 10. Gesture: Simultaneous tapping by two fingers (Tap 1 & 2, Tap 1 & 2, repeated). Attributed command: High-level importance Acceptance & Verification
- 11. …
- the protocols described above enable manipulation and/or selection of objects on the display 52 by movements of a user-manipulated physical object, such as one or more fingers.
- the use of these protocols provides a large input capacity as well as design freedom for gesture-based commands and language, which can also be used to exploit the full power of the device 50.
- Gaming devices can also benefit from their use.
- FIG. 11 depicts a method that includes executing a gesture with a user-manipulated physical object in the vicinity of a device (Step 11A); generating data that is descriptive of the motion made by the user-manipulated object when executing the gesture (Step 11B); and interpreting the data as pertaining (e.g., as a command) to at least one object that appears on a display screen (Step 11C).
- suitable input/output (I/O) technologies include 2D detection systems (such as touch screen displays), 3D detection systems (such as the UST embodiments discussed above), camera-based systems and camera-microphone based virtual keyboards.
- Structured light systems such as laser-based light projection/detection systems, can also be used, as may a touch pad input device, as additional non-limiting examples.
- these exemplary embodiments of this invention provide display dominated concept devices with a minimal number of required keys, provide for realizing a gesture-based input device, and further do not require any significant additional hardware.
- the commands and their interpretation can be determined by software protocols.
- the use of these exemplary embodiments of this invention provides a possibility for command customization by the user (personalization).
- the user may define the Delete gesture to be one different than the one shown in FIG. 8B , such as by defining a circle with a diagonal line drawn through it to be the Delete gesture.
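Since the gesture-to-command attribution is just data, personalization can be a simple rebinding; in the sketch below (gesture names are assumed identifiers, not from the patent) the Delete command is reassigned:

```python
# Default attribution table (gesture name -> command); names are illustrative.
bindings = {"x_mark": "Delete", "check_mark": "Acceptance & Verification"}

def rebind(bindings: dict, command: str, new_gesture: str) -> dict:
    """Return a copy with `command` detached from its old gesture and
    attached to the user-defined one."""
    out = {g: c for g, c in bindings.items() if c != command}
    out[new_gesture] = command
    return out

# e.g., the user redefines Delete as a circle with a diagonal line through it.
bindings = rebind(bindings, "Delete", "circle_with_diagonal")
print(bindings)
```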
- the motion made by the user-manipulated physical object may comprise one or more of a substantially circular motion, a substantially linear motion, at least one substantially circular motion in combination with at least one substantially linear motion, at least one of a substantially circular motion and a substantially linear motion in combination with a period of substantially no motion, a substantially curved motion and a tapping motion.
- the motion may comprise movement of one finger relative to at least one other finger.
- the data recorded may be descriptive of at least one of a velocity and an acceleration of the user-manipulated physical object.
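For instance, velocity and acceleration can be derived from the timestamped location records by finite differences, as in this sketch:

```python
def finite_differences(samples):
    """samples: [(t, x, y), ...] location records in time order.
    Returns per-interval velocities and per-interval accelerations."""
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        velocities.append((t1, (x1 - x0) / dt, (y1 - y0) / dt))
    accelerations = []
    for (t0, vx0, vy0), (t1, vx1, vy1) in zip(velocities, velocities[1:]):
        dt = t1 - t0
        accelerations.append(((vx1 - vx0) / dt, (vy1 - vy0) / dt))
    return velocities, accelerations

v, a = finite_differences([(0.00, 0.0, 0.0), (0.02, 1.5, 0.0), (0.04, 4.0, 0.0)])
print(v, a)   # increasing speed between frames shows up as acceleration
```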
- the data recorded may be descriptive of at least a shape assumed by the at least one finger.
- the exemplary embodiments of this invention may provide an initial user training session where the user enters the same gesture several times when prompted by the program 18 A in order to train the gesture recognition process to the particular finger motions and/or velocities, and possibly the finger tip size, that are characteristic of the user. This can be useful for, as an example, establishing the specific threshold or thresholds used by the gesture recognition processes.
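A per-user threshold could then be set from the spread of the prompted repetitions, for example as follows (the match-score representation and the factor k are assumptions):

```python
from statistics import mean, stdev

def calibrate_threshold(match_scores: list, k: float = 2.0) -> float:
    """match_scores: distances between the user's repeated entries of the
    same gesture and the stored template. Future inputs scoring within the
    returned threshold are accepted for this user."""
    return mean(match_scores) + k * stdev(match_scores)

print(calibrate_threshold([0.12, 0.10, 0.15, 0.11, 0.13]))  # ~0.16
```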
- an actual touching of the surface of the display 52 may not be necessary if there is sufficient ambient lighting so that the finger tip image can be acquired even when the finger tip is not actually in contact with the surface of the display device.
- the finger or finger tip image may be acquired optically within the three dimensional space in the vicinity of the device 50 .
- the UST system of FIGS. 1-7 may be used in the same device in conjunction with the hand, finger or finger tip image embodiments of FIGS. 8-10 , and at any given time information derived from one or the other, or both, may be used for gesture recognition.
- two or more similar or different object sensing technologies may be used together in the same device.
- the DP 16 may perform substantially all of the required processing, based on program instructions contained in the program 18 A stored in the memory 18 .
- the actual image generation processing may be performed in the imaging system by a local embedded data processor, and the results may be passed to and processed by the DP 16 for performing the gesture recognition and interpretation operations.
- certain hand/finger gestures may be defined to have a standardized and universal meaning across different devices, applications and languages.
- One non-limiting example may be the index finger and thumb formed into a circle, with the remaining three fingers extended (an OK gesture), which may be interpreted universally as, for example, “save and close a file”.
- the use of the exemplary embodiments of this invention facilitates this type of operation.
- an aspect of the exemplary embodiments of this invention is a method, a computer program product and an apparatus that are responsive to a user executing a gesture with a user-manipulated physical object in the vicinity of a device to generate data that is descriptive of the presence of the user-manipulated physical object when executing the gesture and to interpret the data as pertaining to at least one object.
- the “presence of the user-manipulated physical object” may include, but need not be limited to, the spatial orientation of the user-manipulated physical object in two or three dimensional space, the repose of the user-manipulated physical object in two or three dimensional space, a shape formed by the user-manipulated physical object in two or three dimensional space, the motion being made by the user-manipulated physical object in two or three dimensional space, the velocity of the user-manipulated physical object in two or three dimensional space, the acceleration of the user-manipulated physical object in two or three dimensional space, and combinations thereof.
- the patterns traced by the user's fingertip when executing the gestures shown in FIG. 8A for the clockwise and anticlockwise fingertip motions may be identical (i.e., a circle or oval); however, the two gestures are distinguishable from one another by sensing the direction of the fingertip motion in real or substantially real time.
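One sketch of how that distinction can be made: the sign of the shoelace (signed-area) sum over the time-ordered trace gives the direction of rotation (assuming y increases upward; invert the result for screen coordinates):

```python
def rotation_direction(points):
    """points: time-ordered (x, y) fingertip records tracing a closed curve."""
    area2 = sum(x0 * y1 - x1 * y0
                for (x0, y0), (x1, y1) in zip(points, points[1:] + points[:1]))
    return "anticlockwise" if area2 > 0 else "clockwise"

print(rotation_direction([(0, 0), (1, 0), (1, 1), (0, 1)]))  # anticlockwise
```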
- both fixed and scanning type sensors may be used, such as UST systems/components that scan an ultrasonic beam through the environment.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Abstract
A method includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object, such as an object displayed by the device.
Description
- The teachings in accordance with the exemplary embodiments of this invention relate generally to user interfaces to electronic devices and, more specifically, relate to manually activated user input devices, methods, systems and computer program products.
- Input devices employed in the converging multimedia electronics industry are becoming increasingly important. The human-computing terminal interface has long challenged systems designers, yet has not significantly evolved since the advent of the mouse several decades ago. This is a particularly challenging problem in the area of mobile and wireless devices, where the objectives of device miniaturization and usability directly conflict with one another. A natural and intelligent interaction between humans and computing terminals (CT) can be achieved if the simplest modalities, such as finger movement and/or user gestures, are used to provide basic input information to the CT (non-limiting examples of which can include multimedia terminals, communication terminals, display dominated systems (DDS) and devices, gaming devices and laptop computers).
- Technology related to input devices has conventionally relied on a set of electro-mechanical switches (such as the classic keyboard). Such an approach requires a relatively large area for a set of switches (keyboard keys), which are usually dedicated to only one operation. A more advanced solution is offered by touch screen displays, where touch sensitive switches are embedded into the display itself, such as in Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) technology. In this approach the “single button” trend is evolving towards that of a “distributed sensor system” that may be embedded into the device and/or even directly into the display itself (AMLCD). The physical operation of such a sensor-based input device can be based on mechanical movement of different materials, changes of electrical conductivity/capacity, influences by an electrostatic field, or optical properties (e.g., a finger shadow/reflection from the surface). With regard to AMLCD technology, reference may be made to the documents 56.3, W. den Boer et al., “Active Matrix LCD with Integrated Optical Touch Screen”, SID 03 Digest (Baltimore, 2003), pgs. 1494-1497, and 59.3, A. Abileah et al., “Integrated Optical Touch Panel in a 14.1″ AMLCD”, SID 04 Digest, v. 35, Issue 1, pgs. 1544-1547, which are incorporated by reference herein in their entireties.
- Reference may also be made to U.S. Pat. No. 7,009,663 B2 (Mar. 7, 2006), entitled “Integrated Optical Light Sensitive Active Matrix Liquid Crystal display”, A. Abileah et al., and U.S. Pat. No. 7,053,967 B2 (May 30, 2006), entitled “Light Sensitive Display”, A. Abileah et al. (both assigned to Planar Systems, Inc.), which are incorporated by reference herein in their entireties.
- The current trend in the development of multimedia device equipment involves hardware miniaturization together with a demand to provide a large input capacity. If the input device can be miniaturized then more space can be allocated for the visualization component(s), particularly in display dominated concept (DDC) devices. The situation in gaming devices is even more challenging, since improvements in the input devices may provide new design freedom and additional game-related functionalities.
- Examples of current user input devices include those based on touch-motion, as in certain music storage and playback devices, and certain personal digital assistant (PDA) and similar devices that are capable of recognizing handwritten letters and commands.
- Also of interest may be certain structured light based systems, such as those described in U.S. Pat. No. 6,690,354 B2 (Feb. 10, 2004), entitled “Method for Enhancing Performance in a System Utilizing an Array of Sensors that Sense at Least Two Dimensions”, Sze; U.S. Pat. No. 6,710,770 (Mar. 23, 2004), entitled “Quasi-Three-Dimensional Method and Apparatus to Detect and Localize Interaction of User-Object and Virtual Transfer Device”, Tomasi et al.; and U.S. Pat. No. 7,050,177 B2 (May 23, 2006), entitled “Method and Apparatus for Approximating Depth of an Object's Placement Onto a Monitored Region with Applications to Virtual Interface Devices”, Tomasi et al. (all assigned to Canesta, Inc.), which are incorporated by reference herein in their entireties.
- The foregoing and other problems are overcome, and other advantages are realized, in accordance with the non-limiting and exemplary embodiments of this invention.
- In accordance with one aspect thereof the exemplary embodiments of this invention provide a method that includes executing a gesture with a user-manipulated physical object in the vicinity of a device; generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to at least one object displayed by the device.
- In accordance with another aspect thereof the exemplary embodiments of this invention provide a computer program product embodied in a computer readable medium, execution of the computer program product by at least one data processor resulting in operations that comprise, in response to a user executing a gesture with a user-manipulated physical object in the vicinity of a device, generating data that is descriptive of the presence of the user-manipulated object when executing the gesture; and interpreting the data as pertaining to information displayed to the user.
- In accordance with a further aspect thereof the exemplary embodiments of this invention provide a device that comprises a unit to display information; an imaging system to generate data that is descriptive of the presence of a user-manipulated object when executing a gesture; and a data processor to interpret the data as pertaining to displayed information.
- In accordance with a further aspect thereof the exemplary embodiments of this invention provide a method that includes, in response to a user employing at least one finger to form a gesture in the vicinity of a device, generating data that is descriptive of a presence of the at least one finger in forming the gesture; and interpreting the data as pertaining to at least one object that appears on a display screen.
- In accordance with a still further aspect thereof the exemplary embodiments of this invention provide an apparatus that includes a display to visualize information; a sensor arrangement that is responsive to the user executing a gesture with a user-manipulated physical object in the vicinity of a surface of the apparatus, the sensor arrangement having an output to provide data descriptive of the presence of the user-manipulated object when executing the gesture; and a unit having an input coupled to the output of the sensor arrangement and operating to interpret the data to identify the executed gesture, and to interpret the identified gesture as pertaining in some manner to visualized information.
- The foregoing and other aspects of the teachings of this invention are made more evident in the following Detailed Description, when read in conjunction with the attached Drawing Figures, wherein:
- FIG. 1A shows a device that incorporates a plurality of ultrasonic transducers (USTs) as user input devices;
- FIG. 1B is a simplified block diagram of the device of FIG. 1A;
- FIG. 2A shows a further exemplary embodiment of this invention where the USTs are incorporated into a device that embodies a mini-projector;
- FIG. 2B is a simplified block diagram of the mini-projector device of FIG. 2A;
- FIGS. 3A, 3B, collectively referred to as FIG. 3, FIGS. 4A-4D, collectively referred to as FIG. 4, FIGS. 5A, 5B, collectively referred to as FIG. 5, and FIG. 6 depict exemplary finger-based gestures that may be used to select various commands for execution in accordance with exemplary embodiments of this invention;
- FIG. 7 shows the principles of the ultrasonic observation of finger distance;
- FIGS. 8A-8D, collectively referred to as FIG. 8, show exemplary finger-based gestures that may be used to select various commands for execution in accordance with further exemplary embodiments of this invention;
- FIG. 9 is a logic flow diagram depicting an exemplary finger detection process executed by the device shown in FIG. 10B, and that is suitable for capturing the finger-based gestures shown in FIGS. 8 and 10A;
- FIG. 10A shows an example of the sensing of multiple points of simultaneous touch detected by the device of FIG. 10B;
- FIG. 10B is a simplified block diagram of a device having a display capable of generating an image of one or more fingertips; and
- FIG. 11 is a logic flow diagram that depicts a method in accordance with the exemplary embodiments of this invention.
- Reference is made to FIGS. 1A and 1B, collectively referred to as FIG. 1, that show a device 10, such as a display dominated device having at least one visual display 12 capable of visualizing information, that incorporates a plurality of ultrasonic transducers (USTs) 14A, 14B and 14C (collectively referred to as USTs 14) as user input devices, while FIG. 1B is a simplified block diagram of the device of FIG. 1A. Note in FIG. 1B that the device 10 is assumed to include a data processor (DP) 16 coupled to a memory (MEM) 18 that stores a program 18A that is suitable for use in implementing this exemplary embodiment of the invention. The device 10 may be or may include, as non-limiting examples, a PDA, a wireless communications device, a gaming device, an Internet appliance, a remote control device (such as one suitable for use with a TV set or with public interactive billboards), a music storage and playback device, projectors, a video storage and playback device, a multimedia device, a computer such as a desktop or a laptop computer, or in general any type of electronic device that includes a user interface for presenting information to a user (such as a display screen or display surface) and for receiving commands and/or input information from the user.
- In the exemplary embodiment of FIG. 1 the three USTs 14 are arrayed on a surface 10A of the device 10 and enable the use of triangulation to detect the locations in three dimensional space of the user's fingers 20A, 20B (referred to also as finger a, finger b). The device 10 exploits the ultrasonic field established in the vicinity of the surface of the device 10 by the USTs 14 to provide a perception technology that enables the device 10 to perceive and react to finger position, and possibly movement, in real time.
- In general, a given UST 14 uses high frequency sound energy to conduct examinations and make measurements. To illustrate the general principle, a typical pulse/echo set-up configuration is shown in FIG. 7. A typical UST system includes several functional units, such as a pulser/receiver 15A and the ultrasonic transducer 15B. The pulser/receiver 15A is an electronic device that can produce mechanical movement and/or an electrical pulse, respectively. Driven by the pulser portion, the transducer 15B generates high frequency ultrasonic energy. The sound energy is introduced and propagates through the air in the form of waves. When there is a discontinuity (such as a finger movement) in the wave path, part of the energy is reflected back from the discontinuity. The reflected wave signal is transformed into an electrical signal by the transducer 15B and is processed to provide the distance from the transducer 15B to the discontinuity (based on a round trip time-of-flight measurement, as is well known). The reflected signal strength may be displayed versus the time from signal generation to when an echo was received. Both phase and intensity change of the reflected signal may also be exploited to measure finger-transducer distances.
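By way of illustration (a sketch, not part of the patent text; the speed-of-sound constant is an assumed operating condition), the round-trip time-of-flight conversion is simply:

```python
SPEED_OF_SOUND = 343.0   # m/s in air at ~20 degrees C (assumed)

def echo_distance(round_trip_time_s: float) -> float:
    """One-way transducer-to-finger distance: the pulse travels out and back."""
    return SPEED_OF_SOUND * round_trip_time_s / 2.0

print(echo_distance(1.0e-3))   # an echo after 1 ms implies ~0.17 m
```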
- When the user's finger(s) or more generally hand(s) enter the scanned field in front of the device 10, the UST 14 system measures the distances to the individual fingers. The three UST 14 sensors (which in some exemplary embodiments may have a fixed relative position on the CT) are capable of providing individual finger-sensor distance measurements (a1, a2, a3, b1, b2, b3). Note that the device 10 may be implemented with fewer than three UST 14 sensors; however, by providing the third UST sensor it is possible to use finger movement for execution and basic operational commands (such as, but not limited to, Select; Copy; Paste; Move; Delete) by observation of a change in direction of the finger movement in three dimensional (3D) space. The device 10 may also be implemented using more than three UST 14 sensors, in the form of, for example, a UST sensor array, when/if higher spatial detection resolution is needed.
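As a sketch of the triangulation step (not the patent's implementation; the sensor layout and all names are assumptions), three finger-sensor distances from sensors at known positions on the device surface determine the fingertip location, and readings outside the working envelope discussed in the next paragraph can be discarded:

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Fingertip position from distances r1..r3 to sensors at p1..p3
    (3-vectors on the device surface); returns None if inconsistent."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = ex.dot(p3 - p1)
    ey = p3 - p1 - i * ex
    ey /= np.linalg.norm(ey)
    ez = np.cross(ex, ey)                 # normal pointing away from the surface
    d = np.linalg.norm(p2 - p1)
    j = ey.dot(p3 - p1)
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2 - 2 * i * x) / (2 * j)
    z2 = r1**2 - x**2 - y**2
    if z2 < 0:
        return None                       # measurements do not intersect
    return p1 + x * ex + y * ey + np.sqrt(z2) * ez   # solution in front

# Sensors 10 cm and 6 cm apart (assumed layout), distances in metres:
print(trilaterate((0, 0, 0), (0.10, 0, 0), (0, 0.06, 0), 0.09, 0.09, 0.08))
```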
- In general, it is typically desirable to limit the range of the detection mechanism so that it encompasses a fairly limited volume of space (which may be considered to define a ‘working envelope’) in the vicinity of the sensing surface (whether the sensors be UST sensors or other types of sensors) of the device 10, so as not to, for example, generate unintended inputs due to the presence and/or movement of background objects, such as other parts of the user's body. Typically the sensing range will be less than about a meter, and more typically the value will be about, for example, 10-20 cm (or less). The maximum sensing range may typically be a function of the sensor technology. For example, the UST embodiments of this invention may typically have a greater detection/sensing range than the AMLCD embodiments discussed below. As can be appreciated, when the user places a finger or fingers, or a hand or hands, within the vicinity of the device 10, “within the vicinity of the device” or sensing surface will be a volume of space, or a plane or more generally a surface, contained within the maximum useful sensing range of the sensing device(s), both in depth (away from the sensing surface) and lateral extent (within an area capable of being sensed from the sensing surface).
FIG. 1A that the detected finger position may be translated and presented to the user by displaying two pointers (e.g., two crosses) 12A, 12B on thedisplay 12. - The described UST 14 system may serve to track the finger position of the user in 3D space and in real time. Visualization of the tracking (which may be used to provide perceptual feedback to the user) can be performed by showing one or more of the
pointers display 12. This technique provides visual coordination to the user, and facilitates the manipulation of objects presented on the display 12 (such as icons and command bars). Furthermore, if a standard set of characters is shown on thedisplay 12 the user may be provided with typewriting (keyboarding) capabilities, where a classical keyboard is replaced by a virtual keyboard. Tactile feedback (which appears in mechanical keyboards) can be replaced by, for example, short blinking of a finger “shadow” on thedisplay 12 for indicating that a particular key has been accepted and the character inputted or a corresponding command executed. Furthermore, sound effects may be added to confirm that a certain command has been accepted. - In some applications, instead of detecting particular fingers, all or some of the entire hand can be detected. In other words, a displayed pointer (e.g., 12A) can be associated to the center of gravity of the hand and used to drive/navigate the pointer. Such a configuration may significantly simplify the overall requirements (of hardware and software), and is particularly suitable in those cases when only a single pointer navigation/control is required.
-
FIG. 2 shows a further exemplary embodiment of this invention where the UST 14 system is incorporated into a device that embodies a mini-projector 30, whileFIG. 2B is a simplified block diagram of themini-projector device 30 ofFIG. 2A . Components that are found as well inFIG. 1 are numbered accordingly. Themini-projector device 30 includes a projector orprojection engine 32 coupled to theDP 16 and projects animage 34 for viewing by the user. For the purposes of this invention theimage 34 may be considered to be on a “display screen” or a “display surface”.Pointers fingers mini-projector device 30 may be linked via some wired or awireless interface 36, such as a Bluetooth transceiver, to a phone orother multimedia device 38, and may display data sourced by thedevice 38. The same or a similar UST 14 scanning concept may be employed as inFIG. 1 . Furthermore, the resulting user input system based on finger/hand placement and/or movement, combined with theprojector engine 32, may be exploited for use in, for example, advanced gaming concepts that combine a large projected image and user gesture-based input. The use of a gesture-based language with the larger format displayedimage 34 enables enhancements to be made to gaming concepts, as well as the design of games based on dynamical user movements in 3D. - The use of real-time finger tracing and the presentation of attributed pointers on the display/
projector image 12/34 can be used to determine basic object-oriented or gesture-oriented commands. Commands such as Select, Copy, Paste, Move, Delete and Switch may be applied to different displayed objects (such as icons, boxes, scroll-bars and files). These may be classified as object-oriented and gesture/browsing-oriented operations, as in the following non-limiting examples.
- Select: Finger 1 at a display corner or some reserved area; Finger 2 moves slowly under a displayed object to be selected
- Copy: when selected, click by a single finger on the object
- Paste: fast double click by a single finger
- Move: move slowly two fingers located on the moving object
- Delete: double (fast) click by two fingers on a previously selected object
- Switch: switching (on/off) is based on a change in the direction of finger movement or, alternatively, on a change in finger acceleration.
- Select object attributed to the pointer position: open/closed hand (see FIGS. 3A, 3B)
- Forward/Backward Browsing: anticlockwise/clockwise cyclic rotation by a single finger (see FIGS. 4A-4D)
- Zoom In/Out: expand/close two fingers (see FIGS. 5A, 5B and 8D)
- Run/Execute pre-selected icon/command: make a circle with two fingers (see FIG. 6). The exemplary gesture protocols described above enable manipulation of objects on the display 12 (or the projected display 34) by finger movements or gestures; a non-limiting sketch of such a gesture-to-command mapping is given below. These exemplary protocols provide a large capacity and design freedom for gesture-based commands and language, and may be used to exploit the full spectrum of the multimedia device 10 capabilities, while also providing enhancements for gaming and other similar applications. The exemplary embodiments of this invention are also well suited for use with an Internet browser and similar applications, such as when scrolling through HTML pages and selecting links within a displayed page. - Note that while the use of one or more fingers has been described above, it is also within the scope of the exemplary embodiments to employ, at least in part, a stylus or some other object that is held and manipulated by the user in the energy field of the USTs 14. All such objects, including a human finger or fingers, hand or hands, stick or stylus, may be referred to for convenience as a user-manipulated physical object.
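A minimal sketch of how the gesture protocol listed above could be bound to commands follows. The gesture labels here are hypothetical names for a recognizer's output, not identifiers defined by this description.

```python
# Hypothetical gesture labels mapped to the attributed commands above.
GESTURE_COMMANDS = {
    "finger1_reserved_area_finger2_slide": "SELECT",
    "single_finger_click_on_selected":     "COPY",
    "single_finger_fast_double_click":     "PASTE",
    "two_finger_slow_move":                "MOVE",
    "two_finger_fast_double_click":        "DELETE",
    "direction_or_acceleration_change":    "SWITCH",
    "open_closed_hand":                    "SELECT_AT_POINTER",
    "single_finger_rotation_ccw":          "BROWSE_FORWARD",
    "single_finger_rotation_cw":           "BROWSE_BACKWARD",
    "two_finger_expand":                   "ZOOM_IN",
    "two_finger_close":                    "ZOOM_OUT",
    "two_finger_circle":                   "RUN_EXECUTE",
}

def dispatch(gesture_label):
    """Return the attributed command for a recognized gesture, or None."""
    return GESTURE_COMMANDS.get(gesture_label)
```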
- It should be appreciated that the exemplary embodiments of this invention can be used with, as several non-limiting examples, gesture-based gaming devices, wireless communications devices, computers and appliances containing computers, robotics communication systems, communication systems for handicapped persons, and navigation tables. Note that the ability provided by the exemplary ultrasonic-based embodiments of this invention to significantly reduce the physical size of the user input device(s) enables a corresponding increase in the surface area of the user display device, which is beneficial in small, handheld and portable devices, such as PDAs and cellular telephones, as two non-limiting examples.
- Note further that the use of the foregoing exemplary embodiments of this invention does not require the user to wear any additional hardware on the hands or fingers. Further, the scalability is improved since the size of a “finger/hand” can be reduced arbitrarily and is not limited to any certain finger/stylus size.
- Described now are further exemplary embodiments of this invention that also use user-manipulated object (e.g., finger-based) gestures, wherein the gestures are detected through the use of an imaging-type device or system, such as one incorporated into the display device, for example one constructed in accordance with the Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) display device technology referenced above. These exemplary embodiments also provide for command/data definition and communication with a computation platform by exploiting finger gestures attributed to predefined commands and protocols, and are suitable for use with display-dominated concept (DDC) devices that employ a minimal number of keymats/keyboards and a maximized visual display size, in current and future devices. In these various embodiments, the following exemplary and non-limiting gestures and attributed commands may be employed.
- FIG. 10B shows a block diagram of an exemplary device 50 having a display 52 that is capable of recording an image of the user's finger tip(s), such as the images depicted in FIGS. 8 and/or 10A. The display 52 in this case may be one constructed in accordance with the Active Matrix LCD with Integrated Optical Touch Screen (AMLCD) display device technology referenced above. Note that advanced scanning (e.g., of text and bar codes) may also be accomplished. In other embodiments a separate camera or cameras may be provided so as to image the user's finger(s)/hand(s), such as through the transparent surface of the display 52. - In the example shown in
FIG. 10A the display 52 simultaneously captures images of the user's fingertips at five discrete locations on the surface of the display 52. Note that this particular pattern may be interpreted as being one particular gesture, whereas the presence of four fingertips (e.g., not the thumb) may be interpreted as being another particular gesture. The spacing between five or fewer fingertips may be varied to encode a plurality of different gestures, as can differences in the angular orientations of the fingertips relative to one another. - The
program 18A may be adapted to execute a program in accordance with the logic flow diagram shown in FIG. 9 (see also FIG. 6 of the above-referenced publication: 59.3, A. Abileah et al., "Integrated Optical Touch Panel in a 14.1″ AMLCD"). - In general, the tips of the fingers are extracted from the captured image and the extraction results are recorded. Based on these records, the system decides whether to begin the recognition process. Regardless of whether the recognition process begins, the system also needs to determine whether and when to delete stored records (this may be timer based). Whenever a new image is captured, all or at least some of these steps are repeated.
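The per-image cycle just described (extract, record, decide whether to recognize, decide whether to delete) might be organized as in the following sketch. The time-to-live value and the three stage callables are placeholders, not elements defined by this description.

```python
import time
from collections import deque

RECORD_TTL = 2.0     # seconds; assumed timer for discarding stale records
records = deque()    # (timestamp, fingertip_states) tuples

def process_frame(image, extract_tips, should_recognize, recognize):
    """One capture/extract/record/recognize iteration.

    extract_tips, should_recognize and recognize stand in for the
    segmentation, start-condition and pattern-matching stages.
    """
    now = time.monotonic()
    records.append((now, extract_tips(image)))
    # Timer-based deletion of records that are no longer useful.
    while records and now - records[0][0] > RECORD_TTL:
        records.popleft()
    if should_recognize(records):
        return recognize(records)
    return None
```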
- The fingertip in the captured image (feature 40 in
FIG. 8) can be considered as an object described by data expressive of, as non-limiting examples, a center of gravity, a bounding edge, and a brightness different from that of the background. There are many image segmentation methods that may be used for fingertip image extraction. One exemplary and non-limiting segmentation method is the Watershed method. - Briefly, the Watershed is a function that applies a morphological watershed operator to an image (typically a grayscale image). The watershed operator segments the image into watershed regions and their boundaries. Considering the grayscale image as a surface, each local minimum can be thought of as the point to which water falling on the surrounding region drains. The boundaries of the watersheds lie on the tops of the ridges. The operator labels each watershed region with a unique index, and sets the boundaries to zero. Typically, morphological gradients, or images containing extracted edges, are used as input to the watershed operator. Noise and small unimportant fluctuations in the original image can produce spurious minima in the gradients, which can lead to oversegmentation. Applying a smoothing operator, or manually marking seed points, are two exemplary approaches to avoiding oversegmentation. Further reference with regard to the Watershed function can be made to, for example, Dougherty, "An Introduction to Morphological Image Processing", SPIE Optical Engineering Press, 1992.
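A minimal fingertip-segmentation sketch along the lines of the Watershed approach described above, written here against scikit-image, is shown below; the seed thresholds assume an 8-bit grayscale image and are illustrative only.

```python
import numpy as np
from skimage.filters import sobel
from skimage.segmentation import watershed

def segment_fingertips(gray):
    """Label bright fingertip blobs against a darker background."""
    gradient = sobel(gray)                 # edge image used as watershed input
    markers = np.zeros_like(gray, dtype=np.int32)
    markers[gray < 40] = 1                 # assumed background seed threshold
    markers[gray > 160] = 2                # assumed fingertip seed threshold
    # Manually seeded markers are one of the anti-oversegmentation
    # approaches mentioned above.
    return watershed(gradient, markers)
```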
- A set with three members can be used to represent the state of one fingertip: two members for the coordinates of the tip and one to represent whether or not it touches the surface (the touch state). A stack or queue is a suitable data structure for recording the coordinates while the finger tip touches the surface. A timer or counter may be used to record when the finger tip leaves the surface.
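The three-member fingertip state and the queue/counter bookkeeping described above might look like the following sketch; the names are illustrative, not part of this description.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class FingertipState:
    """Two coordinates plus the touch state, as described above."""
    x: int
    y: int
    touching: bool

trace = deque()          # queue of coordinates recorded while the tip touches
frames_since_lift = 0    # counter standing in for the leave-surface timer

def update(state: FingertipState):
    global frames_since_lift
    if state.touching:
        trace.append((state.x, state.y))
        frames_since_lift = 0
    else:
        frames_since_lift += 1
```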
- The task of gesture recognition in accordance with the exemplary embodiments of this invention is to select the correct command/operation from a set of candidates, according to the input gesture. The conditions for starting the gesture recognition step may depend on the content of the set. For example, if only the X mark and check mark (see
FIGS. 8B, 8C) are included in the set, the condition can be set as a threshold on the number of consecutive images which do not contain fingertips. If the zoom in/out gestures are added to the set, a new condition, namely that two fingertips are detected in one image (see FIG. 8D), can be used to initiate the gesture recognition process. - There are many different pattern recognition methods that may be employed for gesture recognition. For example, one based on statistical methods may be used, as such methods are inherently robust. Normalization and/or smoothing techniques may be included as part of the gesture recognition process.
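The two start conditions just described might be tested as in this sketch; the frame-count threshold is an assumed value.

```python
NO_TIP_FRAMES_THRESHOLD = 5   # assumed run length of fingertip-free images

def should_start_recognition(tip_counts_per_frame):
    """tip_counts_per_frame: fingertips detected in each recent image."""
    # Condition 1: enough consecutive images without fingertips
    # (sufficient when only the X mark and check mark are in the set).
    tail = tip_counts_per_frame[-NO_TIP_FRAMES_THRESHOLD:]
    if len(tail) == NO_TIP_FRAMES_THRESHOLD and all(c == 0 for c in tail):
        return True
    # Condition 2: two fingertips in one image, used once the
    # zoom in/out gestures are added to the candidate set.
    return bool(tip_counts_per_frame) and tip_counts_per_frame[-1] == 2
```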
- The ability to record the states of fingertip images facilitates gesture recognition. However, these records should be deleted when they are not useful. For example, the records indicating the trace of the fingertip can be deleted as soon as the trace is recognized as a check mark (see
FIG. 8C). However, for the zoom in/out gesture (see FIG. 8D), the trace is preferably deleted only after the fingertips have left the surface of the display 52. - In general, it can be appreciated that an aspect of this invention is the sequential creation of individual ones of a plurality of records, where individual ones of the plurality of records comprise data descriptive of a location of the user-manipulated physical object at a corresponding point in time while the gesture is executed.
- Note that in the various exemplary embodiments discussed above the
DP 16 may be any type of suitable data processor, such as one embodied within an integrated circuit, and the memory 18 may be any type of suitable data storage device, including semiconductor-based and magnetic disk- or tape-based data storage devices. - In
FIG. 8 is depicted the "image" recorded by the image-capable display 52 of the user's finger(s) in contact with the top surface of the display 52 when making the corresponding gesture. The feature labeled 40 in FIG. 8 represents the current location (current image) of the user's finger tip(s), while the feature labeled 42 represents the prior images made during motion of the user's finger tip(s), i.e., the finger tip trace referred to above. The arrows generally indicate the direction of motion of the finger tip 40. - The use of the
display 52 can provide for one-finger and multiple-finger based gestures to be recorded and processed in accordance with the exemplary embodiments of this invention. Several non-limiting examples are now provided. - Attributed command: Browsing/Scrolling/Listing applications
2. Gesture: Subsequent tapping by a single finger (Tap1-Tap1 . . . )
Attributed command: Activate device/phone, Run/Execute pre-selected option
3. Gesture: Finger stays motionless (over a certain time threshold) above some object/icon
Attributed command: Select the object/icon
4. Gesture: Finger stays above some item/object/icon, followed by slow movement
Attributed command: Select the item/object/icon until the end of the move
5. Gesture: Crossed perpendicular lines (X mark, see FIG. 8B)
Attributed command: Delete
6. Gesture: Perpendicular moving breach (check mark, see FIG. 8C)
Attributed command: Acceptance & Verification
7. Gesture: Enclosed Curve around items/icons to be selected
Attributed command: Select group of items/icons - 8. Gesture: Linear approaching/digression (fingers approach, then move apart, and vice versa; see FIG. 8D)
Attributed command: Zoom-In/Out, Size adjustment
9. Gesture: Simultaneous touching of an icon/object by two fingers
Attributed command: Select the icon/object ready for size adjustment
10. Gesture: Simultaneous tapping by two fingers (Tap1&2, Tap1&2, repeated.)
Attributed command: High-level importance Acceptance & Verification
11. Gesture: One finger stays above an icon/object until an object-specific menu appears; the other finger performs circular rotations to toggle through the menu options; lifting both fingers up simultaneously selects and executes a menu option
Attributed command/application: Select & Execute a menu option - Appropriate combinations of the basic gestures described above can be used to perform compound gestures such as COPY, CUT, PASTE, etc., as in the following examples (a non-limiting sketch of this composition follows the list):
- COPY = SELECT + Check mark inside (performed in the vicinity of the selected item)
- CUT = SELECT + X mark inside
- PASTE is based on COPY: assuming that an indicator of the clipboard content is visible on the screen after COPY, one TAP on the clipboard may create the PASTE command and paste the content at the pointer or at a pre-selected item/icon
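One non-limiting way to compose the basic gestures into the compound commands above is sketched here; the gesture labels and the clipboard-visibility flag are assumptions.

```python
def compound_command(selection_active, gesture, clipboard_visible):
    """Compose basic gestures into COPY/CUT/PASTE as outlined above."""
    if selection_active and gesture == "CHECK_MARK_INSIDE":
        return "COPY"
    if selection_active and gesture == "X_MARK_INSIDE":
        return "CUT"
    if clipboard_visible and gesture == "TAP_ON_CLIPBOARD":
        return "PASTE"
    return None
```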
- The protocols described above enable manipulation and/or selection of objects on the
display 52 by movements of a user-manipulated physical object, such as one or more fingers. The use of these protocols provides a large input capacity as well as design freedom for gesture-based commands and language, which can also be used to exploit the full power of the device 50. Gaming devices can also benefit from their use. - Referring to
FIG. 11, in accordance with the various embodiments of this invention described above, it can be appreciated that there is provided a method that includes executing a gesture with a user-manipulated physical object in the vicinity of a device (Step 11A); generating data that is descriptive of the motion made by the user-manipulated object when executing the gesture (Step 11B); and interpreting the data as pertaining (e.g., as a command) to at least one object that appears on a display screen (Step 11C). - Note that different input/output (I/O) technologies can be used to implement the gesture-based protocols, from touch screen displays (2D detection systems) to 3D detection systems such as the UST embodiments discussed above, or camera-based systems, or camera-microphone based virtual keyboards. Structured light systems, such as laser-based light projection/detection systems, can also be used, as may a touch pad input device, as additional non-limiting examples.
- The use of these exemplary embodiments of this invention provides display-dominated concept devices with a minimal number of required keys, provides for realizing a gesture-based input device, and further does not require any significant additional hardware. In addition, the commands and their interpretation can be determined by software protocols. Also, the use of these exemplary embodiments of this invention provides a possibility for command customization by the user (personalization). For example, the user may define the Delete gesture to be one different from the one shown in
FIG. 8B, such as by defining a circle with a diagonal line drawn through it to be the Delete gesture. - In general, and as considered herein, the motion made by the user-manipulated physical object may comprise one or more of: a substantially circular motion; a substantially linear motion; at least one substantially circular motion in combination with at least one substantially linear motion; at least one of a substantially circular motion and a substantially linear motion in combination with a period of substantially no motion; a substantially curved motion; and a tapping motion. For a case where the user-manipulated physical object is comprised of at least two fingers of the user, the motion may comprise movement of one finger relative to at least one other finger. The data recorded may be descriptive of at least one of a velocity and an acceleration of the user-manipulated physical object. For the case where the user-manipulated physical object is comprised of at least one finger of the user, the data recorded may be descriptive of at least a shape assumed by the at least one finger.
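A crude classifier for the motion categories enumerated above might compare the traced path length with the endpoint separation, as in this sketch; the numeric thresholds are assumptions.

```python
import math

def classify_trace(points):
    """Heuristically sort a fingertip trace into the motion categories above."""
    if len(points) < 3:
        return "tap"
    path = sum(math.dist(a, b) for a, b in zip(points, points[1:]))
    chord = math.dist(points[0], points[-1])
    if path < 5.0:            # assumed: barely moved at all
        return "no_motion"
    if chord > 0.9 * path:    # endpoints almost as far apart as the path is long
        return "linear"
    if chord < 0.2 * path:    # the trace loops back on itself
        return "circular"
    return "curved"
```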
- Various modifications and adaptations may become apparent to those skilled in the relevant arts in view of the foregoing description, when read in conjunction with the accompanying drawings and the appended claims. As but some examples, the use of other similar or equivalent user input devices and technologies may be employed, such as resistive and/or capacitive-based touch pad or screen devices, as may other gestures and commands be attempted by those skilled in the art. However, all such and similar modifications of the teachings of this invention will still fall within the scope of this invention.
- Further by example, the exemplary embodiments of this invention may provide an initial user training session where the user enters the same gesture several times when prompted by the
program 18A in order to train the gesture recognition process to the particular finger motions and/or velocities, and possibly the finger tip size, that are characteristic of the user. This can be useful for, as an example, establishing the specific threshold or thresholds used by the gesture recognition processes.
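The threshold calibration from such a training session might, as one assumed rule, take the mean of the repeated measurements plus a margin proportional to their spread:

```python
import statistics

def calibrate_threshold(samples, k=2.0):
    """Derive a per-user threshold from repeated training gestures.

    samples might be trace lengths or velocities measured while the
    user repeats the prompted gesture; mean + k*stdev is one assumed
    calibration rule, not a rule specified by this description.
    """
    mean = statistics.fmean(samples)
    spread = statistics.stdev(samples) if len(samples) > 1 else 0.0
    return mean + k * spread
```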
- Further by example, and for the two-dimensional embodiments of FIGS. 8-10, an actual touching of the surface of the display 52 may not be necessary if there is sufficient ambient lighting, so that the finger tip image can be acquired even when the finger tip is not actually in contact with the surface of the display device. In this case the finger or finger tip image may be acquired optically within the three dimensional space in the vicinity of the device 50. - Still further, in some embodiments the UST system of
FIGS. 1-7 may be used in the same device in conjunction with the hand, finger or finger tip image embodiments of FIGS. 8-10, and at any given time information derived from one or the other, or both, may be used for gesture recognition. In general, two or more similar or different object sensing technologies may be used together in the same device. - It can be noted that, in the various illustrative embodiments of this invention described above, the
DP 16 may perform substantially all of the required processing, based on program instructions contained in the program 18A stored in the memory 18. However, it is also within the scope of the exemplary embodiments to perform at least some of the processing in the image acquisition system or subsystem itself, such as in the ultrasonic-based imaging system of FIGS. 1-7 or in the optical-based imaging system of FIGS. 8-10. For example, the actual image generation processing may be performed in the imaging system by a local embedded data processor, and the results may be passed to and processed by the DP 16 for performing the gesture recognition and interpretation operations. - Further, it may be appreciated that certain hand/finger gestures may be defined to have a standardized and universal meaning across different devices, applications and languages. One non-limiting example may be the index finger and thumb formed into a circle, with the remaining three fingers extended (an OK gesture), which may be interpreted universally as, for example, "save and close a file". The use of the exemplary embodiments of this invention facilitates this type of operation.
- In general, it may be appreciated that an aspect of the exemplary embodiments of this invention is a method, a computer program product and an apparatus that are responsive to a user executing a gesture with a user-manipulated physical object in the vicinity of a device to generate data that is descriptive of the presence of the user-manipulated physical object when executing the gesture and to interpret the data as pertaining to at least one object.
- As employed herein the “presence of the user-manipulated physical object” may include, but need not be limited to, the spatial orientation of the user-manipulated physical object in two or three dimensional space, the repose of the user-manipulated physical object in two or three dimensional space, a shape formed by the user-manipulated physical object in two or three dimensional space, the motion being made by the user-manipulated physical object in two or three dimensional space, the velocity of the user-manipulated physical object in two or three dimensional space, the acceleration of the user-manipulated physical object in two or three dimensional space, and combinations thereof.
- For example, the patterns traced by the user's fingertip when executing the gestures shown in
FIG. 8A for the clockwise and counterclockwise fingertip motions may be identical (i.e., a circle or oval); however, the two gestures are distinguishable from one another by sensing the direction of the fingertip motion in real or substantially real time.
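The direction of an otherwise identical circular trace can be recovered from its signed area (the shoelace formula), as in the sketch below; the sign convention assumes an x-right/y-up coordinate frame, which is an assumption about the sensor axes.

```python
def rotation_direction(points):
    """Distinguish clockwise from counterclockwise closed traces."""
    # Shoelace formula: positive twice-area means counterclockwise
    # in an x-right/y-up frame.
    area2 = sum(x1 * y2 - x2 * y1
                for (x1, y1), (x2, y2) in zip(points, points[1:] + points[:1]))
    return "counterclockwise" if area2 > 0 else "clockwise"
```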
- Furthermore, some of the features of the examples of this invention may be used to advantage without the corresponding use of other features. As such, the foregoing description should be considered as merely illustrative of the principles, teachings, examples and exemplary embodiments of this invention, and not in limitation thereof.
Claims (27)
1. A method, comprising:
detecting execution of a gesture performed with a user-manipulated physical object in the vicinity of a device;
generating data that is descriptive of the presence or movement of the user-manipulated object when executing the gesture; and
interpreting the data as pertaining to at least one object displayed by the device.
2. (canceled)
3. The method of claim 1 , wherein generating data comprises generating data that is descriptive of motion being made by the user-manipulated object in either three dimensional space or two dimensional space.
4-5. (canceled)
6. The method of claim 1 , wherein generating data comprises generating data that is descriptive of motion being made by the user-manipulated object over a period of time.
7. The method of claim 1 , wherein generating data comprises sequentially creating individual ones of a plurality of records, wherein individual ones of the plurality of records comprise data descriptive of a location of the user-manipulated physical object at a corresponding point in time while the gesture is executed.
8-10. (canceled)
11. The method of claim 1 , wherein the presence or motion comprises at least one of:
a substantially circular motion, a substantially linear motion, at least one substantially circular motion in combination with at least one substantially linear motion, a substantially circular motion and a substantially linear motion, in combination with a period of substantially no motion, a substantially curved motion, and a tapping motion.
12. The method of claim 1 , where the user-manipulated physical object is comprised of at least two objects, and where the motion comprises movement of one of the at least two objects relative to at least one other of the at least two objects.
13. The method of claim 1 , where the data is descriptive of at least one of a velocity and an acceleration of the user-manipulated physical object.
14. The method of claim 1 , wherein the data is descriptive of at least one of a spatial orientation of at least a part of the user-manipulated physical object in two or three dimensional space, a repose of at least a part of the user-manipulated physical object in two or three dimensional space and a shape formed by at least a part of the user-manipulated physical object in two or three dimensional space.
15. An apparatus, comprising:
at least one processor; and
at least one memory storing computer program instructions configured, working with the at least one processor, to cause the apparatus to perform at least the following:
detecting execution of a gesture performed with a user-manipulated physical object in the vicinity of a device;
generating data that is descriptive of the presence or movement of the user-manipulated object when executing the gesture; and
interpreting the data as pertaining to at least one object displayed by the device.
16. The apparatus of claim 15 , comprising a sensor arrangement configured to output data that is descriptive of motion being made by the user-manipulated physical object or portion of the user-manipulated physical object in two or three dimensional space, and wherein the sensor arrangement is comprised of at least one of a plurality of acoustic sensors and a plurality of light sensors.
17. The apparatus of claim 15 , embodied in a device that comprises means for conducting wireless communications.
18. The apparatus of claim 15 , wherein the data is descriptive of at least one of a spatial orientation of at least a part of the user-manipulated physical object in two or three dimensional space, the repose of at least a part of the user-manipulated physical object in two or three dimensional space and a shape formed by at least a part of the user-manipulated physical object in two or three dimensional space.
19. A non-transitory computer readable medium,
storing computer program instructions that, when performed by at least one processor, cause at least the following to be performed:
detecting execution of a gesture performed with a user-manipulated physical object in the vicinity of a device;
generating data that is descriptive of the presence or movement of the user-manipulated object when executing the gesture; and
interpreting the data as pertaining to at least one object displayed by the device.
20. (canceled)
21. The non-transitory computer readable medium of claim 19 , wherein generating data comprises generating data that is descriptive of a motion made by the user-manipulated object in either three dimensional space or two dimensional space.
22-23. (canceled)
24. The non-transitory computer readable medium of claim 19 , wherein generating data comprises generating data that is descriptive of a motion made by the user-manipulated object over a period of time.
25. The non-transitory computer readable medium of claim 19 , wherein generating data comprises sequentially creating individual ones of a plurality of records, wherein individual ones of the plurality of records comprise data descriptive of a location of the user-manipulated physical object at a corresponding point in time while the gesture is executed.
26-28. (canceled)
29. The non-transitory computer readable medium of claim 19 , wherein generating data comprises generating data that is descriptive of a motion made by the user-manipulated object in at least one of two dimensional space and three dimensional space, wherein the motion comprises at least one of:
a substantially circular motion;
a substantially linear motion;
at least one substantially circular motion in combination with at least one substantially linear motion;
a substantially circular motion and a substantially linear motion, in combination with a period of substantially no motion;
a substantially curved motion; and
a tapping motion.
30. The non-transitory computer readable medium of claim 29 , wherein the user-manipulated physical object is comprised of at least two objects, and wherein the motion comprises movement of one of the at least two objects relative to at least one other of the at least two objects.
31. The non-transitory computer readable medium of claim 19 , wherein the data is descriptive of at least one of a velocity and an acceleration of the user-manipulated physical object.
32. The non-transitory computer readable medium of claim 19 , wherein the data is descriptive of at least one of a spatial orientation of at least a part of the user-manipulated physical object in two or three dimensional space, the repose of at least a part of the user-manipulated physical object in two or three dimensional space and a shape formed by at least a part of the user-manipulated physical object in two or three dimensional space.
33-45. (canceled)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/295,340 US20120056804A1 (en) | 2006-06-28 | 2011-11-14 | Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/477,508 US8086971B2 (en) | 2006-06-28 | 2006-06-28 | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
US13/295,340 US20120056804A1 (en) | 2006-06-28 | 2011-11-14 | Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/477,508 Continuation US8086971B2 (en) | 2006-06-28 | 2006-06-28 | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120056804A1 true US20120056804A1 (en) | 2012-03-08 |
Family
ID=38846028
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/477,508 Active 2027-07-24 US8086971B2 (en) | 2006-06-28 | 2006-06-28 | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
US13/295,340 Abandoned US20120056804A1 (en) | 2006-06-28 | 2011-11-14 | Apparatus, Methods And Computer Program Products Providing Finger-Based And Hand-Based Gesture Commands For Portable Electronic Device Applications |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/477,508 Active 2027-07-24 US8086971B2 (en) | 2006-06-28 | 2006-06-28 | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications |
Country Status (5)
Country | Link |
---|---|
US (2) | US8086971B2 (en) |
EP (2) | EP2038732A4 (en) |
KR (1) | KR101098015B1 (en) |
CN (2) | CN103529942B (en) |
WO (1) | WO2008001202A2 (en) |
Families Citing this family (380)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB9722766D0 (en) | 1997-10-28 | 1997-12-24 | British Telecomm | Portable computers |
US7469381B2 (en) | 2007-01-07 | 2008-12-23 | Apple Inc. | List scrolling and document translation, scaling, and rotation on a touch-screen display |
US7193609B2 (en) | 2002-03-19 | 2007-03-20 | America Online, Inc. | Constraining display motion in display navigation |
US9454269B2 (en) | 2010-10-01 | 2016-09-27 | Z124 | Keyboard fills bottom screen on rotation of a multiple screen device |
US9182937B2 (en) | 2010-10-01 | 2015-11-10 | Z124 | Desktop reveal by moving a logical display stack with gestures |
US7725288B2 (en) * | 2005-11-28 | 2010-05-25 | Navisense | Method and system for object control |
US7788607B2 (en) | 2005-12-01 | 2010-08-31 | Navisense | Method and system for mapping virtual coordinates |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US8018440B2 (en) | 2005-12-30 | 2011-09-13 | Microsoft Corporation | Unintentional touch rejection |
DE102006037156A1 (en) * | 2006-03-22 | 2007-09-27 | Volkswagen Ag | Interactive operating device and method for operating the interactive operating device |
US8059102B2 (en) * | 2006-06-13 | 2011-11-15 | N-Trig Ltd. | Fingertip touch recognition for a digitizer |
WO2008007372A2 (en) | 2006-07-12 | 2008-01-17 | N-Trig Ltd. | Hover and touch detection for a digitizer |
US8180114B2 (en) | 2006-07-13 | 2012-05-15 | Northrop Grumman Systems Corporation | Gesture recognition interface system with vertical display |
US9696808B2 (en) * | 2006-07-13 | 2017-07-04 | Northrop Grumman Systems Corporation | Hand-gesture recognition method |
US8972902B2 (en) * | 2008-08-22 | 2015-03-03 | Northrop Grumman Systems Corporation | Compound gesture recognition |
US8589824B2 (en) | 2006-07-13 | 2013-11-19 | Northrop Grumman Systems Corporation | Gesture recognition interface system |
US8686964B2 (en) * | 2006-07-13 | 2014-04-01 | N-Trig Ltd. | User specific recognition of intended user interaction with a digitizer |
US8234578B2 (en) | 2006-07-25 | 2012-07-31 | Northrop Grumman Systems Corporatiom | Networked gesture collaboration system |
US8432448B2 (en) | 2006-08-10 | 2013-04-30 | Northrop Grumman Systems Corporation | Stereo camera intrusion detection system |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US7940250B2 (en) * | 2006-09-06 | 2011-05-10 | Apple Inc. | Web-clip widgets on a portable multifunction device |
US8564544B2 (en) | 2006-09-06 | 2013-10-22 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US8354997B2 (en) * | 2006-10-31 | 2013-01-15 | Navisense | Touchless user interface for a mobile device |
US8904312B2 (en) * | 2006-11-09 | 2014-12-02 | Navisense | Method and device for touchless signing and recognition |
US7877707B2 (en) * | 2007-01-06 | 2011-01-25 | Apple Inc. | Detecting and interpreting real-world and security gestures on touch and hover sensitive devices |
US7844915B2 (en) | 2007-01-07 | 2010-11-30 | Apple Inc. | Application programming interfaces for scrolling operations |
US7872652B2 (en) * | 2007-01-07 | 2011-01-18 | Apple Inc. | Application programming interfaces for synchronization |
US20080168478A1 (en) | 2007-01-07 | 2008-07-10 | Andrew Platzer | Application Programming Interfaces for Scrolling |
US8813100B1 (en) | 2007-01-07 | 2014-08-19 | Apple Inc. | Memory management |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US8656311B1 (en) | 2007-01-07 | 2014-02-18 | Apple Inc. | Method and apparatus for compositing various types of content |
US7903115B2 (en) * | 2007-01-07 | 2011-03-08 | Apple Inc. | Animations |
US8788954B2 (en) * | 2007-01-07 | 2014-07-22 | Apple Inc. | Web-clip widgets on a portable multifunction device |
US20080168402A1 (en) | 2007-01-07 | 2008-07-10 | Christopher Blumenberg | Application Programming Interfaces for Gesture Operations |
US8060841B2 (en) * | 2007-03-19 | 2011-11-15 | Navisense | Method and device for touchless media searching |
CN101689244B (en) * | 2007-05-04 | 2015-07-22 | 高通股份有限公司 | Camera-based user input for compact devices |
US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
CA2591808A1 (en) * | 2007-07-11 | 2009-01-11 | Hsien-Hsiang Chiu | Intelligent object tracking and gestures sensing input device |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
US9619143B2 (en) * | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
JP4569613B2 (en) * | 2007-09-19 | 2010-10-27 | ソニー株式会社 | Image processing apparatus, image processing method, and program |
US20090102603A1 (en) * | 2007-10-19 | 2009-04-23 | Fein Gene S | Method and apparatus for providing authentication with a user interface system |
US20090109174A1 (en) * | 2007-10-30 | 2009-04-30 | Fein Gene S | Method and Apparatus for User Interface in Electronic Devices With Visual Display Units |
US8477098B2 (en) | 2007-10-31 | 2013-07-02 | Gene S. Fein | Method and apparatus for user interface of input devices |
US20090109215A1 (en) | 2007-10-31 | 2009-04-30 | Fein Gene S | Method and apparatus for user interface communication with an image manipulator |
US8212768B2 (en) * | 2007-10-31 | 2012-07-03 | Fimed Properties Ag Limited Liability Company | Digital, data, and multimedia user interface with a keyboard |
US8139110B2 (en) | 2007-11-01 | 2012-03-20 | Northrop Grumman Systems Corporation | Calibration of a gesture recognition interface system |
US9377874B2 (en) | 2007-11-02 | 2016-06-28 | Northrop Grumman Systems Corporation | Gesture recognition light and video image projector |
US20090125848A1 (en) * | 2007-11-14 | 2009-05-14 | Susann Marie Keohane | Touch surface-sensitive edit system |
US9171454B2 (en) * | 2007-11-14 | 2015-10-27 | Microsoft Technology Licensing, Llc | Magic wand |
AR064377A1 (en) * | 2007-12-17 | 2009-04-01 | Rovere Victor Manuel Suarez | DEVICE FOR SENSING MULTIPLE CONTACT AREAS AGAINST OBJECTS SIMULTANEOUSLY |
FR2925708B1 (en) * | 2007-12-20 | 2009-12-18 | Dav | METHOD FOR DETECTING AN ANGULAR VARIATION OF A CONTROL PATH ON A TOUCH SURFACE AND CORRESPONDING CONTROL MODULE |
US20090189858A1 (en) * | 2008-01-30 | 2009-07-30 | Jeff Lev | Gesture Identification Using A Structured Light Pattern |
US8446373B2 (en) | 2008-02-08 | 2013-05-21 | Synaptics Incorporated | Method and apparatus for extended adjustment based on relative positioning of multiple objects contemporaneously in a sensing region |
US8174502B2 (en) * | 2008-03-04 | 2012-05-08 | Apple Inc. | Touch event processing for web pages |
US8416196B2 (en) | 2008-03-04 | 2013-04-09 | Apple Inc. | Touch event model programming interface |
US8645827B2 (en) | 2008-03-04 | 2014-02-04 | Apple Inc. | Touch event model |
US8717305B2 (en) * | 2008-03-04 | 2014-05-06 | Apple Inc. | Touch event model for web pages |
KR101513023B1 (en) | 2008-03-25 | 2015-04-22 | 엘지전자 주식회사 | Terminal and method of displaying information therein |
KR101012379B1 (en) * | 2008-03-25 | 2011-02-09 | 엘지전자 주식회사 | Terminal and method of displaying information therein |
US20090254855A1 (en) * | 2008-04-08 | 2009-10-08 | Sony Ericsson Mobile Communications, Ab | Communication terminals with superimposed user interface |
US9582049B2 (en) * | 2008-04-17 | 2017-02-28 | Lg Electronics Inc. | Method and device for controlling user interface based on user's gesture |
JP5164675B2 (en) * | 2008-06-04 | 2013-03-21 | キヤノン株式会社 | User interface control method, information processing apparatus, and program |
US8345920B2 (en) | 2008-06-20 | 2013-01-01 | Northrop Grumman Systems Corporation | Gesture recognition interface system with a light-diffusive screen |
DE102008032451C5 (en) * | 2008-07-10 | 2017-10-19 | Rational Ag | Display method and cooking appliance therefor |
DE102008032448B4 (en) * | 2008-07-10 | 2023-11-02 | Rational Ag | Display method and cooking device therefor |
US8847739B2 (en) * | 2008-08-04 | 2014-09-30 | Microsoft Corporation | Fusing RFID and vision for surface object tracking |
US20100031202A1 (en) * | 2008-08-04 | 2010-02-04 | Microsoft Corporation | User-defined gesture set for surface computing |
US8527908B2 (en) * | 2008-09-26 | 2013-09-03 | Apple Inc. | Computer user interface system and methods |
KR101537596B1 (en) * | 2008-10-15 | 2015-07-20 | 엘지전자 주식회사 | Mobile terminal and method for recognizing touch thereof |
US8174504B2 (en) | 2008-10-21 | 2012-05-08 | Synaptics Incorporated | Input device and method for adjusting a parameter of an electronic system |
KR20100048090A (en) * | 2008-10-30 | 2010-05-11 | 삼성전자주식회사 | Interface apparatus for generating control command by touch and motion, interface system including the interface apparatus, and interface method using the same |
US8584031B2 (en) | 2008-11-19 | 2013-11-12 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters |
US20100141592A1 (en) * | 2008-12-09 | 2010-06-10 | Andrei Andrievsky | Digital camera with character based mode initiation |
US8453057B2 (en) * | 2008-12-22 | 2013-05-28 | Verizon Patent And Licensing Inc. | Stage interaction for mobile device |
US8030914B2 (en) * | 2008-12-29 | 2011-10-04 | Motorola Mobility, Inc. | Portable electronic device having self-calibrating proximity sensors |
US8275412B2 (en) * | 2008-12-31 | 2012-09-25 | Motorola Mobility Llc | Portable electronic device having directional proximity sensors based on device orientation |
JP5168161B2 (en) * | 2009-01-16 | 2013-03-21 | ブラザー工業株式会社 | Head mounted display |
DE102009008041A1 (en) * | 2009-02-09 | 2010-08-12 | Volkswagen Ag | Method for operating a motor vehicle with a touchscreen |
US8996995B2 (en) * | 2009-02-25 | 2015-03-31 | Nokia Corporation | Method and apparatus for phrase replacement |
US9684521B2 (en) | 2010-01-26 | 2017-06-20 | Apple Inc. | Systems having discrete and continuous gesture recognizers |
US8566044B2 (en) * | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US8566045B2 (en) * | 2009-03-16 | 2013-10-22 | Apple Inc. | Event recognition |
US9311112B2 (en) * | 2009-03-16 | 2016-04-12 | Apple Inc. | Event recognition |
US8285499B2 (en) | 2009-03-16 | 2012-10-09 | Apple Inc. | Event recognition |
US8686951B2 (en) | 2009-03-18 | 2014-04-01 | HJ Laboratories, LLC | Providing an elevated and texturized display in an electronic device |
US20100271312A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Menu Configuration System and Method for Display on an Electronic Device |
US20100271331A1 (en) * | 2009-04-22 | 2010-10-28 | Rachid Alameh | Touch-Screen and Method for an Electronic Device |
US20100289740A1 (en) * | 2009-05-18 | 2010-11-18 | Bong Soo Kim | Touchless control of an electronic device |
US9417700B2 (en) | 2009-05-21 | 2016-08-16 | Edge3 Technologies | Gesture recognition systems and related methods |
US8542186B2 (en) * | 2009-05-22 | 2013-09-24 | Motorola Mobility Llc | Mobile device with user interaction capability and method of operating same |
US8619029B2 (en) * | 2009-05-22 | 2013-12-31 | Motorola Mobility Llc | Electronic device with sensing assembly and method for interpreting consecutive gestures |
US8294105B2 (en) * | 2009-05-22 | 2012-10-23 | Motorola Mobility Llc | Electronic device with sensing assembly and method for interpreting offset gestures |
US8269175B2 (en) * | 2009-05-22 | 2012-09-18 | Motorola Mobility Llc | Electronic device with sensing assembly and method for detecting gestures of geometric shapes |
US8788676B2 (en) | 2009-05-22 | 2014-07-22 | Motorola Mobility Llc | Method and system for controlling data transmission to or from a mobile device |
US8304733B2 (en) * | 2009-05-22 | 2012-11-06 | Motorola Mobility Llc | Sensing assembly for mobile device |
US8391719B2 (en) * | 2009-05-22 | 2013-03-05 | Motorola Mobility Llc | Method and system for conducting communication between mobile devices |
US8344325B2 (en) * | 2009-05-22 | 2013-01-01 | Motorola Mobility Llc | Electronic device with sensing assembly and method for detecting basic gestures |
US8836648B2 (en) * | 2009-05-27 | 2014-09-16 | Microsoft Corporation | Touch pull-in gesture |
US8386963B2 (en) * | 2009-05-28 | 2013-02-26 | Microsoft Corporation | Virtual inking using gesture recognition |
TW201101198A (en) * | 2009-06-17 | 2011-01-01 | Sonix Technology Co Ltd | Command input method |
TWI398818B (en) * | 2009-06-30 | 2013-06-11 | Univ Nat Taiwan Science Tech | Method and system for gesture recognition |
EP2452258B1 (en) | 2009-07-07 | 2019-01-23 | Elliptic Laboratories AS | Control using movements |
US8319170B2 (en) * | 2009-07-10 | 2012-11-27 | Motorola Mobility Llc | Method for adapting a pulse power mode of a proximity sensor |
US20110022307A1 (en) * | 2009-07-27 | 2011-01-27 | Htc Corporation | Method for operating navigation frame, navigation apparatus and recording medium |
US9092115B2 (en) * | 2009-09-23 | 2015-07-28 | Microsoft Technology Licensing, Llc | Computing system with visual clipboard |
US8963829B2 (en) | 2009-10-07 | 2015-02-24 | Microsoft Corporation | Methods and systems for determining and tracking extremities of a target |
US8564534B2 (en) | 2009-10-07 | 2013-10-22 | Microsoft Corporation | Human tracking system |
US9367178B2 (en) | 2009-10-23 | 2016-06-14 | Elliptic Laboratories As | Touchless interfaces |
KR101639383B1 (en) * | 2009-11-12 | 2016-07-22 | 삼성전자주식회사 | Apparatus for sensing proximity touch operation and method thereof |
US20110119216A1 (en) * | 2009-11-16 | 2011-05-19 | Microsoft Corporation | Natural input trainer for gestural instruction |
US8665227B2 (en) * | 2009-11-19 | 2014-03-04 | Motorola Mobility Llc | Method and apparatus for replicating physical key function with soft keys in an electronic device |
EP2333651B1 (en) * | 2009-12-11 | 2016-07-20 | Dassault Systèmes | Method and system for duplicating an object using a touch-sensitive display |
US20110141013A1 (en) * | 2009-12-14 | 2011-06-16 | Alcatel-Lucent Usa, Incorporated | User-interface apparatus and method for user control |
US8736561B2 (en) | 2010-01-06 | 2014-05-27 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
JP5750875B2 (en) * | 2010-12-01 | 2015-07-22 | ソニー株式会社 | Information processing apparatus, information processing method, and program |
WO2011087328A2 (en) * | 2010-01-15 | 2011-07-21 | 한국전자통신연구원 | Apparatus and method for processing a scene |
US9335825B2 (en) | 2010-01-26 | 2016-05-10 | Nokia Technologies Oy | Gesture control |
US8760631B2 (en) * | 2010-01-27 | 2014-06-24 | Intersil Americas Inc. | Distance sensing by IQ domain differentiation of time of flight (TOF) measurements |
US8239785B2 (en) * | 2010-01-27 | 2012-08-07 | Microsoft Corporation | Edge gestures |
US20110185320A1 (en) * | 2010-01-28 | 2011-07-28 | Microsoft Corporation | Cross-reference Gestures |
US20110185299A1 (en) * | 2010-01-28 | 2011-07-28 | Microsoft Corporation | Stamp Gestures |
US9411504B2 (en) | 2010-01-28 | 2016-08-09 | Microsoft Technology Licensing, Llc | Copy and staple gestures |
US8261213B2 (en) * | 2010-01-28 | 2012-09-04 | Microsoft Corporation | Brush, carbon-copy, and fill gestures |
KR101114750B1 (en) * | 2010-01-29 | 2012-03-05 | 주식회사 팬택 | User Interface Using Hologram |
US20110191704A1 (en) * | 2010-02-04 | 2011-08-04 | Microsoft Corporation | Contextual multiplexing gestures |
US9519356B2 (en) * | 2010-02-04 | 2016-12-13 | Microsoft Technology Licensing, Llc | Link gestures |
US20110191719A1 (en) * | 2010-02-04 | 2011-08-04 | Microsoft Corporation | Cut, Punch-Out, and Rip Gestures |
US20110199342A1 (en) * | 2010-02-16 | 2011-08-18 | Harry Vartanian | Apparatus and method for providing elevated, indented or texturized sensations to an object near a display device or input detection using ultrasound |
US8797278B1 (en) * | 2010-02-18 | 2014-08-05 | The Boeing Company | Aircraft charting system with multi-touch interaction gestures for managing a map of an airport |
US9965165B2 (en) * | 2010-02-19 | 2018-05-08 | Microsoft Technology Licensing, Llc | Multi-finger gestures |
US9310994B2 (en) | 2010-02-19 | 2016-04-12 | Microsoft Technology Licensing, Llc | Use of bezel as an input mechanism |
US9274682B2 (en) * | 2010-02-19 | 2016-03-01 | Microsoft Technology Licensing, Llc | Off-screen gestures to create on-screen input |
US9367205B2 (en) * | 2010-02-19 | 2016-06-14 | Microsoft Technolgoy Licensing, Llc | Radial menus with bezel gestures |
US8799827B2 (en) * | 2010-02-19 | 2014-08-05 | Microsoft Corporation | Page manipulations using on and off-screen gestures |
US8707174B2 (en) * | 2010-02-25 | 2014-04-22 | Microsoft Corporation | Multi-screen hold and page-flip gesture |
US9454304B2 (en) * | 2010-02-25 | 2016-09-27 | Microsoft Technology Licensing, Llc | Multi-screen dual tap gesture |
US9075522B2 (en) * | 2010-02-25 | 2015-07-07 | Microsoft Technology Licensing, Llc | Multi-screen bookmark hold gesture |
US8473870B2 (en) * | 2010-02-25 | 2013-06-25 | Microsoft Corporation | Multi-screen hold and drag gesture |
US8539384B2 (en) | 2010-02-25 | 2013-09-17 | Microsoft Corporation | Multi-screen pinch and expand gestures |
US20110209089A1 (en) * | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen object-hold and page-change gesture |
US8751970B2 (en) * | 2010-02-25 | 2014-06-10 | Microsoft Corporation | Multi-screen synchronous slide gesture |
US20110209101A1 (en) * | 2010-02-25 | 2011-08-25 | Hinckley Kenneth P | Multi-screen pinch-to-pocket gesture |
US20110209058A1 (en) * | 2010-02-25 | 2011-08-25 | Microsoft Corporation | Multi-screen hold and tap gesture |
US8756522B2 (en) | 2010-03-19 | 2014-06-17 | Blackberry Limited | Portable electronic device and method of controlling same |
EP2550579A4 (en) * | 2010-03-24 | 2015-04-22 | Hewlett Packard Development Co | Gesture mapping for display device |
WO2011123833A1 (en) * | 2010-04-01 | 2011-10-06 | Yanntek, Inc. | Immersive multimedia terminal |
US10788976B2 (en) | 2010-04-07 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US10025458B2 (en) | 2010-04-07 | 2018-07-17 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US8810509B2 (en) * | 2010-04-27 | 2014-08-19 | Microsoft Corporation | Interfacing with a computing application using a multi-digit sensor |
US8963845B2 (en) | 2010-05-05 | 2015-02-24 | Google Technology Holdings LLC | Mobile device with temperature sensing capability and method of operating same |
US8396252B2 (en) | 2010-05-20 | 2013-03-12 | Edge 3 Technologies | Systems and related methods for three dimensional gesture recognition in vehicles |
US9103732B2 (en) | 2010-05-25 | 2015-08-11 | Google Technology Holdings LLC | User computer device with temperature sensing capabilities and method of operating same |
US8751056B2 (en) | 2010-05-25 | 2014-06-10 | Motorola Mobility Llc | User computer device with temperature sensing capabilities and method of operating same |
US8552999B2 (en) | 2010-06-14 | 2013-10-08 | Apple Inc. | Control selection approximation |
US20110310005A1 (en) * | 2010-06-17 | 2011-12-22 | Qualcomm Incorporated | Methods and apparatus for contactless gesture recognition |
WO2012007034A1 (en) | 2010-07-13 | 2012-01-19 | Nokia Corporation | Sending and receiving information |
US20120030624A1 (en) * | 2010-07-30 | 2012-02-02 | Migos Charles J | Device, Method, and Graphical User Interface for Displaying Menus |
US8655093B2 (en) | 2010-09-02 | 2014-02-18 | Edge 3 Technologies, Inc. | Method and apparatus for performing segmentation of an image |
US8582866B2 (en) | 2011-02-10 | 2013-11-12 | Edge 3 Technologies, Inc. | Method and apparatus for disparity computation in stereo images |
WO2012030872A1 (en) | 2010-09-02 | 2012-03-08 | Edge3 Technologies Inc. | Method and apparatus for confusion learning |
US8666144B2 (en) | 2010-09-02 | 2014-03-04 | Edge 3 Technologies, Inc. | Method and apparatus for determining disparity of texture |
US8890803B2 (en) * | 2010-09-13 | 2014-11-18 | Samsung Electronics Co., Ltd. | Gesture control system |
CN102446032B (en) * | 2010-09-30 | 2014-09-17 | 中国移动通信有限公司 | Information input method and terminal based on camera |
US9405444B2 (en) | 2010-10-01 | 2016-08-02 | Z124 | User interface with independent drawer control |
CN102446042B (en) * | 2010-10-12 | 2014-10-01 | 谊达光电科技股份有限公司 | Capacitive adjacent induction and touch sensing device and method |
GB2498299B (en) * | 2010-10-22 | 2019-08-14 | Hewlett Packard Development Co | Evaluating an input relative to a display |
KR101169583B1 (en) * | 2010-11-04 | 2012-07-31 | 주식회사 매크론 | Virture mouse driving method |
JP2012104994A (en) * | 2010-11-09 | 2012-05-31 | Sony Corp | Input device, input method, program, and recording medium |
KR101731346B1 (en) * | 2010-11-12 | 2017-04-28 | 엘지전자 주식회사 | Method for providing display image in multimedia device and thereof |
CN105242258A (en) | 2010-11-16 | 2016-01-13 | 高通股份有限公司 | System and method for object position estimation based on ultrasonic reflected signals |
US20120120002A1 (en) * | 2010-11-17 | 2012-05-17 | Sony Corporation | System and method for display proximity based control of a touch screen user interface |
KR101646616B1 (en) * | 2010-11-30 | 2016-08-12 | 삼성전자주식회사 | Apparatus and Method for Controlling Object |
EP2649505B1 (en) * | 2010-12-08 | 2019-06-05 | Nokia Technologies Oy | User interface |
US20120159395A1 (en) | 2010-12-20 | 2012-06-21 | Microsoft Corporation | Application-launching interface for multiple modes |
US9244606B2 (en) | 2010-12-20 | 2016-01-26 | Apple Inc. | Device, method, and graphical user interface for navigation of concurrently open software applications |
US8689123B2 (en) | 2010-12-23 | 2014-04-01 | Microsoft Corporation | Application reporting in an application-selectable user interface |
US8612874B2 (en) | 2010-12-23 | 2013-12-17 | Microsoft Corporation | Presenting an application change through a tile |
US9417696B2 (en) * | 2011-01-27 | 2016-08-16 | Blackberry Limited | Portable electronic device and method therefor |
US8421752B2 (en) | 2011-01-27 | 2013-04-16 | Research In Motion Limited | Portable electronic device and method therefor |
EP2482168A1 (en) | 2011-01-27 | 2012-08-01 | Research In Motion Limited | Portable electronic device and method therefor |
EP2482164B1 (en) | 2011-01-27 | 2013-05-22 | Research In Motion Limited | Portable electronic device and method therefor |
US10025388B2 (en) * | 2011-02-10 | 2018-07-17 | Continental Automotive Systems, Inc. | Touchless human machine interface |
US8970589B2 (en) | 2011-02-10 | 2015-03-03 | Edge 3 Technologies, Inc. | Near-touch interaction with a stereo camera grid and structured tessellations
CN103384872B (en) * | 2011-02-22 | 2016-10-12 | Hewlett-Packard Development Company, L.P. | Method and computing system for facilitating user input
JP2012190215A (en) * | 2011-03-10 | 2012-10-04 | Sony Corp | Input processor, input processing method, and program |
US9857868B2 (en) | 2011-03-19 | 2018-01-02 | The Board Of Trustees Of The Leland Stanford Junior University | Method and system for ergonomic touch-free interface |
US8743244B2 (en) | 2011-03-21 | 2014-06-03 | HJ Laboratories, LLC | Providing augmented reality based on third party information |
CN102693063B (en) * | 2011-03-23 | 2015-04-29 | Lenovo (Beijing) Co., Ltd. | Operation control method and device and electronic equipment
JP5766479B2 (en) * | 2011-03-25 | 2015-08-19 | Kyocera Corporation | Electronic device, control method, and control program
US9298363B2 (en) | 2011-04-11 | 2016-03-29 | Apple Inc. | Region activation for touch sensitive surface |
US8840466B2 (en) | 2011-04-25 | 2014-09-23 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game |
US20120280900A1 (en) * | 2011-05-06 | 2012-11-08 | Nokia Corporation | Gesture recognition using plural sensors |
CN103502912B (en) * | 2011-05-09 | 2017-11-07 | Koninklijke Philips N.V. | Rotating an object on a screen
US20120293404A1 (en) * | 2011-05-19 | 2012-11-22 | Panasonic Corporation | Low Cost Embedded Touchless Gesture Sensor |
TWI466021B (en) * | 2011-05-24 | 2014-12-21 | Asustek Comp Inc | Computer system and control method thereof |
US9104440B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
US8893033B2 (en) | 2011-05-27 | 2014-11-18 | Microsoft Corporation | Application notifications |
US9158445B2 (en) | 2011-05-27 | 2015-10-13 | Microsoft Technology Licensing, Llc | Managing an immersive interface in a multi-application immersive environment |
US9658766B2 (en) | 2011-05-27 | 2017-05-23 | Microsoft Technology Licensing, Llc | Edge gesture |
US9104307B2 (en) | 2011-05-27 | 2015-08-11 | Microsoft Technology Licensing, Llc | Multi-application environment |
WO2012172322A2 (en) * | 2011-06-13 | 2012-12-20 | Elliptic Laboratories As | Touchless interaction |
US8631317B2 (en) * | 2011-06-28 | 2014-01-14 | International Business Machines Corporation | Manipulating display of document pages on a touchscreen computing device |
KR101262700B1 (en) * | 2011-08-05 | 2013-05-08 | Samsung Electronics Co., Ltd. | Method for Controlling Electronic Apparatus based on Voice Recognition and Motion Recognition, and Electronic Apparatus thereof
US20130057587A1 (en) | 2011-09-01 | 2013-03-07 | Microsoft Corporation | Arranging tiles |
US9146670B2 (en) | 2011-09-10 | 2015-09-29 | Microsoft Technology Licensing, Llc | Progressively indicating new content in an application-selectable user interface |
CN102314217B (en) * | 2011-09-29 | 2014-01-29 | Shanghai Huaqin Telecommunication Technology Co., Ltd. | Mobile terminal and mobile terminal control method
CA2792685C (en) * | 2011-10-18 | 2017-08-22 | Research In Motion Limited | Method of modifying rendered attributes of list elements in a user interface |
US9075631B2 (en) | 2011-10-18 | 2015-07-07 | Blackberry Limited | Method of rendering a user interface |
US20130097534A1 (en) | 2011-10-18 | 2013-04-18 | Research In Motion Limited | Method of rendering a user interface |
CA2792895C (en) | 2011-10-18 | 2020-04-28 | Research In Motion Limited | Method of rendering a user interface |
CA2792188A1 (en) | 2011-10-18 | 2013-04-18 | Research In Motion Limited | Method of animating a rearrangement of ui elements on a display screen of an electronic device |
US9672050B2 (en) | 2011-10-18 | 2017-06-06 | Blackberry Limited | Method of distributed layout negotiation in a user interface framework |
CN108762577A (en) | 2011-10-18 | 2018-11-06 | Carnegie Mellon University | Method and apparatus for classifying touch events on a touch-sensitive surface
US9672609B1 (en) | 2011-11-11 | 2017-06-06 | Edge 3 Technologies, Inc. | Method and apparatus for improved depth-map estimation |
US8941619B2 (en) | 2011-11-18 | 2015-01-27 | Au Optronics Corporation | Apparatus and method for controlling information display |
CN102520791A (en) * | 2011-11-28 | 2012-06-27 | Beijing Yingshengtai Technology Co., Ltd. | Wireless gesture recognition device
US8963885B2 (en) | 2011-11-30 | 2015-02-24 | Google Technology Holdings LLC | Mobile device for interacting with an active stylus |
US9063591B2 (en) | 2011-11-30 | 2015-06-23 | Google Technology Holdings LLC | Active styluses for interacting with a mobile device |
EP2602692A1 (en) * | 2011-12-05 | 2013-06-12 | Alcatel Lucent | Method for recognizing gestures and gesture detector |
CN102402290A (en) * | 2011-12-07 | 2012-04-04 | Beijing Yingshengtai Technology Co., Ltd. | Method and system for recognizing body posture
EP2605129B1 (en) * | 2011-12-16 | 2019-03-13 | BlackBerry Limited | Method of rendering a user interface |
WO2013095677A1 (en) | 2011-12-23 | 2013-06-27 | Intel Corporation | Computing system utilizing three-dimensional manipulation command gestures |
EP2795430A4 (en) | 2011-12-23 | 2015-08-19 | Intel Ip Corp | Transition mechanism for computing system utilizing user sensing |
WO2013095679A1 (en) * | 2011-12-23 | 2013-06-27 | Intel Corporation | Computing system utilizing coordinated two-hand command gestures |
US10345911B2 (en) | 2011-12-23 | 2019-07-09 | Intel Corporation | Mechanism to provide visual feedback regarding computing system command gestures |
US9052804B1 (en) * | 2012-01-06 | 2015-06-09 | Google Inc. | Object occlusion to initiate a visual search |
US9230171B2 (en) | 2012-01-06 | 2016-01-05 | Google Inc. | Object outlining to initiate a visual search |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US8693731B2 (en) | 2012-01-17 | 2014-04-08 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging |
US9501152B2 (en) | 2013-01-15 | 2016-11-22 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US11493998B2 (en) | 2012-01-17 | 2022-11-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US9070019B2 (en) | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US8854433B1 (en) | 2012-02-03 | 2014-10-07 | Aquifi, Inc. | Method and system enabling natural user interface gestures with an electronic system |
US20130201161A1 (en) * | 2012-02-03 | 2013-08-08 | John E. Dolan | Methods, Systems and Apparatus for Digital-Marking-Surface Content-Unit Manipulation |
CN102591587A (en) * | 2012-02-06 | 2012-07-18 | Guangxi Jiawei Electronic Technology Co., Ltd. | Non-contact page-turning system and non-contact page-turning method for projector
US20150220149A1 (en) * | 2012-02-14 | 2015-08-06 | Google Inc. | Systems and methods for a virtual grasping user interface |
WO2013136776A1 (en) * | 2012-03-15 | 2013-09-19 | Panasonic Corporation | Gesture input operation processing device
JP2013218549A (en) * | 2012-04-10 | 2013-10-24 | Alpine Electronics Inc | Electronic equipment |
TWI497347B (en) * | 2012-05-09 | 2015-08-21 | Hung Ta Liu | Control system using gestures as inputs |
GB2502087A (en) * | 2012-05-16 | 2013-11-20 | St Microelectronics Res & Dev | Gesture recognition |
US9671566B2 (en) | 2012-06-11 | 2017-06-06 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
CN102759987A (en) * | 2012-06-13 | 2012-10-31 | Hu Jinyun | Information input method
US8907264B2 (en) | 2012-06-14 | 2014-12-09 | Intersil Americas LLC | Motion and simple gesture detection using multiple photodetector segments |
TWI490755B (en) | 2012-06-20 | 2015-07-01 | Pixart Imaging Inc | Input system |
US8934675B2 (en) | 2012-06-25 | 2015-01-13 | Aquifi, Inc. | Systems and methods for tracking human hands by performing parts based template matching using images from multiple viewpoints |
US9111135B2 (en) | 2012-06-25 | 2015-08-18 | Aquifi, Inc. | Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera |
CN109508091A (en) * | 2012-07-06 | 2019-03-22 | PixArt Imaging Inc. | Input system
KR101984154B1 (en) * | 2012-07-16 | 2019-05-30 | Samsung Electronics Co., Ltd. | Control method for terminal using touch and gesture input and terminal thereof
CN103577081B (en) * | 2012-07-30 | 2018-07-03 | Lenovo (Beijing) Co., Ltd. | Method and electronic device for adjusting display output
KR20140019678A (en) * | 2012-08-07 | 2014-02-17 | Samsung Electronics Co., Ltd. | Method and apparatus for creating graphic user interface objects
US8836768B1 (en) | 2012-09-04 | 2014-09-16 | Aquifi, Inc. | Method and system enabling natural user interface gestures with user wearable glasses |
KR101938648B1 (en) * | 2012-10-23 | 2019-01-15 | Samsung Electronics Co., Ltd. | Mobile system including image sensor, method of operating image sensor and method of operating mobile system
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices |
US9582122B2 (en) | 2012-11-12 | 2017-02-28 | Microsoft Technology Licensing, Llc | Touch-sensitive bezel techniques |
US9235342B2 (en) * | 2012-11-28 | 2016-01-12 | International Business Machines Corporation | Selective sharing of displayed content in a view presented on a touchscreen of a processing system |
US9075514B1 (en) * | 2012-12-13 | 2015-07-07 | Amazon Technologies, Inc. | Interface selection element display |
US9001064B2 (en) * | 2012-12-14 | 2015-04-07 | Barnesandnoble.Com Llc | Touch sensitive device with pinch-based archive and restore functionality |
CN103067782B (en) * | 2012-12-21 | 2017-12-22 | Konka Group Co., Ltd. | Two-handed gesture interactive operation processing method and system based on a smart television
WO2014105183A1 (en) * | 2012-12-28 | 2014-07-03 | Intel Corporation | Three-dimensional user interface device |
CN103529930B (en) * | 2013-01-04 | 2016-12-28 | Nubia Technology Co., Ltd. | Menu indexing method based on somatosensory recognition, and device and terminal thereof
CN103914143A (en) * | 2013-01-07 | 2014-07-09 | Eminent Electronic Technology Corp. | Control method of electronic device
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems |
US9465461B2 (en) | 2013-01-08 | 2016-10-11 | Leap Motion, Inc. | Object detection and tracking with audio and optical signals |
US10241639B2 (en) | 2013-01-15 | 2019-03-26 | Leap Motion, Inc. | Dynamic user interactions for display control and manipulation of display objects |
US9459697B2 (en) | 2013-01-15 | 2016-10-04 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control |
US9129155B2 (en) | 2013-01-30 | 2015-09-08 | Aquifi, Inc. | Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map |
US9092665B2 (en) | 2013-01-30 | 2015-07-28 | Aquifi, Inc | Systems and methods for initializing motion tracking of human hands |
WO2014119258A1 (en) * | 2013-01-31 | 2014-08-07 | Panasonic Corporation | Information processing method and information processing device
JP5572851B1 (en) | 2013-02-26 | 2014-08-20 | Panasonic Intellectual Property Corporation of America | Electronic device
CN104714728B (en) * | 2013-02-28 | 2018-10-12 | Lenovo (Beijing) Co., Ltd. | Display method and device
US9524028B2 (en) | 2013-03-08 | 2016-12-20 | Fastvdo Llc | Visual language for human computer interfaces |
US9110541B1 (en) * | 2013-03-14 | 2015-08-18 | Amazon Technologies, Inc. | Interface selection approaches for multi-dimensional input |
WO2014200589A2 (en) | 2013-03-15 | 2014-12-18 | Leap Motion, Inc. | Determining positional information for an object in space |
US10721448B2 (en) | 2013-03-15 | 2020-07-21 | Edge 3 Technologies, Inc. | Method and apparatus for adaptive exposure bracketing, segmentation and scene organization |
KR20140114766A (en) | 2013-03-19 | 2014-09-29 | Qeexo, Co. | Method and device for sensing touch inputs
US9612689B2 (en) | 2015-02-02 | 2017-04-04 | Qeexo, Co. | Method and apparatus for classifying a touch event on a touchscreen as related to one of multiple function generating interaction layers and activating a function in the selected interaction layer |
US9013452B2 (en) | 2013-03-25 | 2015-04-21 | Qeexo, Co. | Method and system for activating different interactive functions using different types of finger contacts |
US9298266B2 (en) | 2013-04-02 | 2016-03-29 | Aquifi, Inc. | Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
US10620709B2 (en) | 2013-04-05 | 2020-04-14 | Ultrahaptics IP Two Limited | Customized gesture interpretation |
US9323338B2 (en) | 2013-04-12 | 2016-04-26 | Usens, Inc. | Interactive input system and method |
US10082935B2 (en) * | 2013-04-15 | 2018-09-25 | Carnegie Mellon University | Virtual tools for use with touch-sensitive surfaces |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
US9747696B2 (en) | 2013-05-17 | 2017-08-29 | Leap Motion, Inc. | Systems and methods for providing normalized parameters of motions of objects in three-dimensional space |
US9733716B2 (en) | 2013-06-09 | 2017-08-15 | Apple Inc. | Proxy gesture recognizer |
US10295338B2 (en) | 2013-07-12 | 2019-05-21 | Magic Leap, Inc. | Method and system for generating map data from an image |
WO2015006784A2 (en) | 2013-07-12 | 2015-01-15 | Magic Leap, Inc. | Planar waveguide apparatus with diffraction element(s) and system employing same |
CN104298342B (en) * | 2013-07-19 | 2019-02-05 | ZTE Corporation | Three-dimensional space coordinate detection method, three-dimensional input method, and related devices
US9798388B1 (en) | 2013-07-31 | 2017-10-24 | Aquifi, Inc. | Vibrotactile system to augment 3D input systems |
JP2015032101A (en) * | 2013-08-01 | 2015-02-16 | 株式会社東芝 | Information terminal apparatus |
US10281987B1 (en) | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction |
TWI505135B (en) * | 2013-08-20 | 2015-10-21 | Utechzone Co Ltd | Control system for display screen, control apparatus and control method |
CN104423578B (en) * | 2013-08-25 | 2019-08-06 | Hangzhou Linggan Technology Co., Ltd. | Interactive input system and method
US10846942B1 (en) | 2013-08-29 | 2020-11-24 | Ultrahaptics IP Two Limited | Predictive information for free space gesture control and communication |
US9632572B2 (en) | 2013-10-03 | 2017-04-25 | Leap Motion, Inc. | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US10168873B1 (en) | 2013-10-29 | 2019-01-01 | Leap Motion, Inc. | Virtual interactions for machine control |
EP3063608B1 (en) | 2013-10-30 | 2020-02-12 | Apple Inc. | Displaying relevant user interface objects |
US9996797B1 (en) * | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Interactions with virtual objects for machine control |
JP2016539413A (en) * | 2013-10-31 | 2016-12-15 | Huawei Technologies Co., Ltd. | Floating or in-air operation method and apparatus
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
KR20150051278A (en) * | 2013-11-01 | 2015-05-12 | Samsung Electronics Co., Ltd. | Object moving method and electronic device implementing the same
CN103558920B (en) * | 2013-11-15 | 2018-06-19 | Nubia Technology Co., Ltd. | Processing method and device for non-contact gestures
ITCO20130068A1 (en) * | 2013-12-18 | 2015-06-19 | Nu Tech S A S Di De Michele Marco & Co | Method for providing user commands to an electronic processor, and related processing program and electronic circuit
CN103713779A (en) * | 2013-12-31 | 2014-04-09 | Chengdu Youer Technology Co., Ltd. | Non-contact touch device and implementation method thereof
US9507417B2 (en) | 2014-01-07 | 2016-11-29 | Aquifi, Inc. | Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects |
KR102214437B1 (en) * | 2014-01-10 | 2021-02-10 | Samsung Electronics Co., Ltd. | Method for copying contents in a computing device, method for pasting contents in a computing device, and the computing device
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
CN103823555A (en) * | 2014-01-23 | 2014-05-28 | Zhuhai Hengyu New Technology Co., Ltd. | System and method for converting 3D (three-dimensional) gestures into key codes
US9619105B1 (en) | 2014-01-30 | 2017-04-11 | Aquifi, Inc. | Systems and methods for gesture based interaction with viewpoint dependent user interfaces |
CN106068201B (en) | 2014-03-07 | 2019-11-01 | Volkswagen AG | User interface and method for signaling a 3D position of an input means during gesture detection
CN104914980A (en) * | 2014-03-10 | 2015-09-16 | Lenovo (Beijing) Co., Ltd. | Information processing method and device
US9477337B2 (en) | 2014-03-14 | 2016-10-25 | Microsoft Technology Licensing, Llc | Conductive trace routing for display and bezel sensors |
CN103914305B (en) * | 2014-04-10 | 2019-02-12 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Method and system for freely controlling applications on a mobile terminal
US9569006B2 (en) * | 2014-04-10 | 2017-02-14 | Mediatek Inc. | Ultrasound-based methods for touchless gesture recognition, and apparatuses using the same |
CN103955277A (en) * | 2014-05-13 | 2014-07-30 | Guangzhou Samsung Communication Technology Research Co., Ltd. | Method and device for controlling cursor on electronic equipment
JP6494926B2 (en) * | 2014-05-28 | 2019-04-03 | Kyocera Corporation | Mobile terminal, gesture control program, and gesture control method
US9639167B2 (en) * | 2014-05-30 | 2017-05-02 | Eminent Electronic Technology Corp. Ltd. | Control method of electronic apparatus having non-contact gesture sensitive region |
KR102303115B1 (en) * | 2014-06-05 | 2021-09-16 | Samsung Electronics Co., Ltd. | Method For Providing Augmented Reality Information And Wearable Device Using The Same
US9696813B2 (en) * | 2015-05-27 | 2017-07-04 | Hsien-Hsiang Chiu | Gesture interface robot |
CN105204610A (en) * | 2014-06-18 | 2015-12-30 | Wang Yuren | Device for manipulating functions by means of motion sensing
CN105320252A (en) * | 2014-06-26 | 2016-02-10 | ZTE Corporation | Interaction method and device for a player
CN104123095B (en) * | 2014-07-24 | 2018-03-30 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Hover touch control method and device based on vector operations
CN204480228U (en) | 2014-08-08 | 2015-07-15 | Leap Motion, Inc. | Motion sensing and imaging device
US10310675B2 (en) * | 2014-08-25 | 2019-06-04 | Canon Kabushiki Kaisha | User interface apparatus and control method |
US9329715B2 (en) | 2014-09-11 | 2016-05-03 | Qeexo, Co. | Method and apparatus for differentiating touch screen users based on touch event analysis |
US11619983B2 (en) | 2014-09-15 | 2023-04-04 | Qeexo, Co. | Method and apparatus for resolving touch screen ambiguities |
US10606417B2 (en) | 2014-09-24 | 2020-03-31 | Qeexo, Co. | Method for improving accuracy of touch screen event analysis by use of spatiotemporal touch patterns |
US10282024B2 (en) | 2014-09-25 | 2019-05-07 | Qeexo, Co. | Classifying contacts or associations with a touch sensitive device |
DE102014017179B4 (en) * | 2014-11-20 | 2022-10-06 | Audi Ag | Method for operating a navigation system of a motor vehicle using an operating gesture |
ES2880342T3 (en) * | 2014-12-15 | 2021-11-24 | Courtius Oy | Acoustic event detection |
CN104503573A (en) * | 2014-12-16 | 2015-04-08 | Qisda (Suzhou) Co., Ltd. | Gesture operating method and gesture operating device
US10248728B1 (en) * | 2014-12-24 | 2019-04-02 | Open Invention Network Llc | Search and notification procedures based on user history information |
US20160202865A1 (en) | 2015-01-08 | 2016-07-14 | Apple Inc. | Coordination of static backgrounds and rubberbanding |
CN104536576B (en) * | 2015-01-12 | 2017-05-31 | Suzhou Chuda Information Technology Co., Ltd. | Ultrasonic-based gesture interaction method between multimedia devices in the same plane
US20160224118A1 (en) * | 2015-02-02 | 2016-08-04 | Kdh-Design Service Inc. | Helmet-used touchless sensing and gesture recognition structure and helmet thereof |
US9696795B2 (en) * | 2015-02-13 | 2017-07-04 | Leap Motion, Inc. | Systems and methods of creating a realistic grab experience in virtual reality/augmented reality environments |
US10429923B1 (en) | 2015-02-13 | 2019-10-01 | Ultrahaptics IP Two Limited | Interaction engine for creating a realistic experience in virtual reality/augmented reality environments |
KR20160101605A (en) * | 2015-02-17 | 2016-08-25 | Samsung Electronics Co., Ltd. | Gesture input processing method and electronic device supporting the same
CN104881192B (en) * | 2015-05-28 | 2018-11-16 | Nubia Technology Co., Ltd. | Operation recognition method and device, and terminal
US10642404B2 (en) | 2015-08-24 | 2020-05-05 | Qeexo, Co. | Touch sensitive device with multi-sensor stream synchronized data |
CN106484087A (en) * | 2015-09-02 | 2017-03-08 | Huang Xiaoming | Portable telemetry motion-sensing input method and device
CN105242861B (en) * | 2015-10-15 | 2019-08-02 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Ultrasound-based parameter adjustment method and device
CN105243316B (en) * | 2015-10-15 | 2018-01-26 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Mobile terminal unlocking method and device
CN105302303A (en) * | 2015-10-15 | 2016-02-03 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Game control method and apparatus and mobile terminal
CN105204649B (en) * | 2015-10-15 | 2018-01-19 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Mobile terminal unlocking method and device
CN105242786A (en) * | 2015-10-15 | 2016-01-13 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Ultrasonic wave-based application control method and device
CN105306820A (en) * | 2015-10-15 | 2016-02-03 | Guangdong OPPO Mobile Telecommunications Corp., Ltd. | Method and device for controlling rotation of camera in mobile terminal and mobile terminal
CN105844216B (en) * | 2016-03-11 | 2020-10-27 | Nanjing University of Aeronautics and Astronautics | Detection and matching mechanism for recognizing handwritten letters using WiFi signals
CN105607745A (en) | 2016-03-16 | 2016-05-25 | BOE Technology Group Co., Ltd. | Display control circuit, display control method and display device
CN105844705B (en) * | 2016-03-29 | 2018-11-09 | Lenovo (Beijing) Co., Ltd. | Three-dimensional object model generation method and electronic device
KR101671831B1 (en) * | 2016-04-11 | 2016-11-03 | Kim Su-min | Apparatus for sharing data and providing reward in accordance with shared data
CN105881548B (en) * | 2016-04-29 | 2018-07-20 | Beijing Kuaile Zhihui Technology Co., Ltd. | Method for waking up an intelligent interactive robot, and intelligent interactive robot
CN106055098B (en) * | 2016-05-24 | 2019-03-15 | Beijing Xiaomi Mobile Software Co., Ltd. | In-air gesture operation method and device
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
TWI634487B (en) * | 2017-03-02 | 2018-09-01 | Heying Optoelectronics Technology Co., Ltd. | Motion gesture recognition system
CN109597405A (en) * | 2017-09-30 | 2019-04-09 | Alibaba Group Holding Limited | Method for controlling movement of a robot, and robot
CN108052202B (en) * | 2017-12-11 | 2021-06-11 | Shenzhen Xingye Information Technology Co., Ltd. | 3D interaction method and device, computer equipment and storage medium
US10585525B2 (en) | 2018-02-12 | 2020-03-10 | International Business Machines Corporation | Adaptive notification modifications for touchscreen interfaces |
US10579099B2 (en) | 2018-04-30 | 2020-03-03 | Apple Inc. | Expandable ring device |
US11875012B2 (en) | 2018-05-25 | 2024-01-16 | Ultrahaptics IP Two Limited | Throwable interface for augmented reality and virtual reality environments |
US11360558B2 (en) | 2018-07-17 | 2022-06-14 | Apple Inc. | Computer systems with finger devices |
US11009989B2 (en) | 2018-08-21 | 2021-05-18 | Qeexo, Co. | Recognizing and rejecting unintentional touch events associated with a touch sensitive device |
CN109634415B (en) * | 2018-12-11 | 2019-10-18 | Harbin Tuobo Technology Co., Ltd. | Gesture recognition control method for controlling an analog quantity
CN109480904A (en) * | 2018-12-25 | 2019-03-19 | Wuxi Chison Medical Technologies Co., Ltd. | Ultrasonic imaging method, apparatus and system
CN109480903A (en) * | 2018-12-25 | 2019-03-19 | Wuxi Chison Medical Technologies Co., Ltd. | Imaging method, apparatus and system for ultrasonic diagnostic equipment
CN109753219B (en) * | 2018-12-29 | 2021-07-20 | Guangzhou Ouke Information Technology Co., Ltd. | Handicraft production system, method and device based on virtual reality
US10818015B2 (en) * | 2019-01-28 | 2020-10-27 | Florida Analytical Imaging Solutions, LLC. | Automatic region of interest selection in centrosome analysis |
CN109947183B (en) * | 2019-03-27 | 2021-12-24 | Lenovo (Beijing) Co., Ltd. | Control method and electronic equipment
US11016643B2 (en) | 2019-04-15 | 2021-05-25 | Apple Inc. | Movement of user interface object with user-specified content |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US10942603B2 (en) | 2019-05-06 | 2021-03-09 | Qeexo, Co. | Managing activity states of an application processor in relation to touch or hover interactions with a touch sensitive device |
US11231815B2 (en) | 2019-06-28 | 2022-01-25 | Qeexo, Co. | Detecting object proximity using touch sensitive surface sensing and ultrasonic sensing |
US11592423B2 (en) | 2020-01-29 | 2023-02-28 | Qeexo, Co. | Adaptive ultrasonic sensing techniques and systems to mitigate interference |
US11360587B1 (en) | 2020-04-07 | 2022-06-14 | Apple Inc. | Deployment systems for computer system finger devices |
WO2022093723A1 (en) * | 2020-10-29 | 2022-05-05 | Intrface Solutions Llc | Systems and methods for remote manipulation of multidimensional models |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4988981B1 (en) * | 1987-03-17 | 1999-05-18 | Vpl Newco Inc | Computer data entry and manipulation apparatus and method |
DE69509637T2 (en) | 1994-02-15 | 2000-01-13 | Breyer Branco | Computer guide arrangement
US5821922A (en) | 1997-05-27 | 1998-10-13 | Compaq Computer Corporation | Computer having video controlled cursor system |
US6720949B1 (en) * | 1997-08-22 | 2004-04-13 | Timothy R. Pryor | Man machine interfaces and applications |
US6950534B2 (en) * | 1998-08-10 | 2005-09-27 | Cybernet Systems Corporation | Gesture-controlled interfaces for self-service machines and other applications |
US6147678A (en) * | 1998-12-09 | 2000-11-14 | Lucent Technologies Inc. | Video hand image-three-dimensional computer interface with multiple degrees of freedom |
US6313825B1 (en) | 1998-12-28 | 2001-11-06 | Gateway, Inc. | Virtual input device |
JP4332649B2 (en) * | 1999-06-08 | 2009-09-16 | National Institute of Information and Communications Technology | Hand shape and posture recognition device, hand shape and posture recognition method, and recording medium storing a program for executing the method
US6710770B2 (en) | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
US7050177B2 (en) | 2002-05-22 | 2006-05-23 | Canesta, Inc. | Method and apparatus for approximating depth of an object's placement onto a monitored region with applications to virtual interface devices |
US6771294B1 (en) * | 1999-12-29 | 2004-08-03 | Petri Pulli | User interface |
US6847354B2 (en) | 2000-03-23 | 2005-01-25 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Three dimensional interactive display |
US6624833B1 (en) * | 2000-04-17 | 2003-09-23 | Lucent Technologies Inc. | Gesture-based input interface system with shadow detection |
US7042442B1 (en) * | 2000-06-27 | 2006-05-09 | International Business Machines Corporation | Virtual invisible keyboard |
WO2002048642A2 (en) | 2000-11-19 | 2002-06-20 | Canesta, Inc. | Method for enhancing performance in a system utilizing an array of sensors that sense at least two-dimensions |
DE10100615A1 (en) * | 2001-01-09 | 2002-07-18 | Siemens Ag | Hand recognition with position determination |
US6775014B2 (en) * | 2001-01-17 | 2004-08-10 | Fujixerox Co., Ltd. | System and method for determining the location of a target in a room or small area |
US7053967B2 (en) | 2002-05-23 | 2006-05-30 | Planar Systems, Inc. | Light sensitive display |
US7009663B2 (en) | 2003-12-17 | 2006-03-07 | Planar Systems, Inc. | Integrated optical light sensitive active matrix liquid crystal display |
SE521283C2 (en) * | 2002-05-10 | 2003-10-14 | Henrik Dryselius | Device for inputting control signals to an electronic device
JP4298407B2 (en) * | 2002-09-30 | 2009-07-22 | キヤノン株式会社 | Video composition apparatus and video composition method |
US7554530B2 (en) | 2002-12-23 | 2009-06-30 | Nokia Corporation | Touch screen user interface featuring stroke-based object selection and functional object activation |
US7231609B2 (en) * | 2003-02-03 | 2007-06-12 | Microsoft Corporation | System and method for accessing remote screen content |
KR100588042B1 (en) * | 2004-01-14 | 2006-06-09 | Korea Institute of Science and Technology | Interactive presentation system
US7519223B2 (en) * | 2004-06-28 | 2009-04-14 | Microsoft Corporation | Recognizing gestures and using gestures for interacting with software applications |
GB2419433A (en) * | 2004-10-20 | 2006-04-26 | Glasgow School Of Art | Automated Gesture Recognition |
KR20060070280A (en) * | 2004-12-20 | 2006-06-23 | Electronics and Telecommunications Research Institute | Apparatus and method for user interface using hand gesture recognition
US20070064004A1 (en) * | 2005-09-21 | 2007-03-22 | Hewlett-Packard Development Company, L.P. | Moving a graphic element |
US20070130547A1 (en) | 2005-12-01 | 2007-06-07 | Navisense, Llc | Method and system for touchless user interface control |
2006
- 2006-06-28 US US11/477,508 patent/US8086971B2/en active Active
2007
- 2007-06-27 CN CN201310480408.9A patent/CN103529942B/en active Active
- 2007-06-27 EP EP07804538A patent/EP2038732A4/en not_active Withdrawn
- 2007-06-27 KR KR1020097001481A patent/KR101098015B1/en active IP Right Grant
- 2007-06-27 EP EP13188745.7A patent/EP2717120B1/en active Active
- 2007-06-27 CN CN2007800245045A patent/CN101730874B/en active Active
- 2007-06-27 WO PCT/IB2007/001757 patent/WO2008001202A2/en active Application Filing
2011
- 2011-11-14 US US13/295,340 patent/US20120056804A1/en not_active Abandoned
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7084859B1 (en) * | 1992-09-18 | 2006-08-01 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
US7227526B2 (en) * | 2000-07-24 | 2007-06-05 | Gesturetek, Inc. | Video-based image control system |
US6903730B2 (en) * | 2000-11-10 | 2005-06-07 | Microsoft Corporation | In-air gestures for electromagnetic coordinate digitizers |
US20050162402A1 (en) * | 2004-01-27 | 2005-07-28 | Watanachote Susornpol J. | Methods of interacting with a computer using a finger(s) touch sensing input device with visual feedback |
US20050271279A1 (en) * | 2004-05-14 | 2005-12-08 | Honda Motor Co., Ltd. | Sign based human-machine interaction |
US8479122B2 (en) * | 2004-07-30 | 2013-07-02 | Apple Inc. | Gestures for touch sensitive input devices |
US20060267966A1 (en) * | 2005-05-24 | 2006-11-30 | Microsoft Corporation | Hover widgets: using the tracking state to extend capabilities of pen-operated devices |
Non-Patent Citations (1)
Title |
---|
Yamakawa, A. et al., "A Pointing Device Using Hand and Fingers Equipped with a Multi-color Tracker", 1997, Biomedical Fuzzy and Human Sciences, Vol. 3, No. 1, pp. 11-20 *
Cited By (40)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9857920B2 (en) * | 2010-02-02 | 2018-01-02 | Samsung Electronics Co., Ltd. | Method and apparatus for providing user interface using acoustic signal, and device including user interface |
US20110191680A1 (en) * | 2010-02-02 | 2011-08-04 | Chae Seung Chul | Method and apparatus for providing user interface using acoustic signal, and device including user interface |
US20120182222A1 (en) * | 2011-01-13 | 2012-07-19 | David Moloney | Detect motion generated from gestures used to execute functionality associated with a computer system |
US8730190B2 (en) * | 2011-01-13 | 2014-05-20 | Qualcomm Incorporated | Detect motion generated from gestures used to execute functionality associated with a computer system |
US9733895B2 (en) | 2011-08-05 | 2017-08-15 | Samsung Electronics Co., Ltd. | Method for controlling electronic apparatus based on voice recognition and motion recognition, and electronic apparatus applying the same |
US20130204457A1 (en) * | 2012-02-06 | 2013-08-08 | Ford Global Technologies, Llc | Interacting with vehicle controls through gesture recognition |
US8866781B2 (en) | 2012-05-21 | 2014-10-21 | Huawei Technologies Co., Ltd. | Contactless gesture-based control method and apparatus |
US20130346893A1 (en) * | 2012-06-21 | 2013-12-26 | Fih (Hong Kong) Limited | Electronic device and method for editing document using the electronic device |
US9864433B2 (en) | 2012-07-13 | 2018-01-09 | Softkinetic Software | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
US11513601B2 (en) * | 2012-07-13 | 2022-11-29 | Sony Depthsensing Solutions Sa/Nv | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
WO2014009561A3 (en) * | 2012-07-13 | 2014-05-01 | Softkinetic Software | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
CN105378593A (en) * | 2012-07-13 | 2016-03-02 | 索夫特克尼特科软件公司 | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
JP2015522195A (en) * | 2012-07-13 | 2015-08-03 | ソフトキネティック ソフトウェア | Method and system for simultaneous human-computer gesture-based interaction using unique noteworthy points on the hand |
US20170097687A1 (en) * | 2012-07-13 | 2017-04-06 | Softkinetic Software | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
EP3007039A1 (en) * | 2012-07-13 | 2016-04-13 | Softkinetic Software | Method and system for human-to-computer gesture based simultaneous interactions using singular points of interest on a hand |
US8923562B2 (en) * | 2012-12-24 | 2014-12-30 | Industrial Technology Research Institute | Three-dimensional interactive device and operation method thereof |
US20140177909A1 (en) * | 2012-12-24 | 2014-06-26 | Industrial Technology Research Institute | Three-dimensional interactive device and operation method thereof |
EP2951670A4 (en) * | 2013-03-06 | 2016-10-26 | Sony Corp | Apparatus and method for operating a user interface of a device |
WO2014138096A1 (en) | 2013-03-06 | 2014-09-12 | Sony Corporation | Apparatus and method for operating a user interface of a device |
US9507425B2 (en) | 2013-03-06 | 2016-11-29 | Sony Corporation | Apparatus and method for operating a user interface of a device |
CN104076974A (en) * | 2013-03-25 | 2014-10-01 | 柯尼卡美能达株式会社 | Device and method for determining gesture, and computer-readable storage medium for computer program |
US9489051B2 (en) | 2013-07-01 | 2016-11-08 | Blackberry Limited | Display navigation using touch-less gestures |
US9256290B2 (en) | 2013-07-01 | 2016-02-09 | Blackberry Limited | Gesture detection using ambient light sensors |
US9398221B2 (en) | 2013-07-01 | 2016-07-19 | Blackberry Limited | Camera control using ambient light sensors |
US9423913B2 (en) | 2013-07-01 | 2016-08-23 | Blackberry Limited | Performance control of ambient light sensors |
US9367137B2 (en) | 2013-07-01 | 2016-06-14 | Blackberry Limited | Alarm operation by touch-less gesture |
US9342671B2 (en) | 2013-07-01 | 2016-05-17 | Blackberry Limited | Password by touch-less gesture |
US9928356B2 (en) | 2013-07-01 | 2018-03-27 | Blackberry Limited | Password by touch-less gesture |
US9323336B2 (en) | 2013-07-01 | 2016-04-26 | Blackberry Limited | Gesture detection using ambient light sensors |
US9865227B2 (en) | 2013-07-01 | 2018-01-09 | Blackberry Limited | Performance control of ambient light sensors |
US9405461B2 (en) | 2013-07-09 | 2016-08-02 | Blackberry Limited | Operating a device using touchless and touchscreen gestures |
US9465448B2 (en) | 2013-07-24 | 2016-10-11 | Blackberry Limited | Backlight for touchless gesture detection |
US9304596B2 (en) | 2013-07-24 | 2016-04-05 | Blackberry Limited | Backlight for touchless gesture detection |
US9194741B2 (en) | 2013-09-06 | 2015-11-24 | Blackberry Limited | Device having light intensity measurement in presence of shadows |
WO2015065341A1 (en) * | 2013-10-29 | 2015-05-07 | Intel Corporation | Gesture based human computer interaction |
US9304597B2 (en) | 2013-10-29 | 2016-04-05 | Intel Corporation | Gesture based human computer interaction |
US11294470B2 (en) | 2014-01-07 | 2022-04-05 | Sony Depthsensing Solutions Sa/Nv | Human-to-computer natural three-dimensional hand gesture based navigation method |
WO2016053320A1 (en) * | 2014-09-30 | 2016-04-07 | Hewlett-Packard Development Company, L.P. | Gesture based manipulation of three-dimensional images |
US10268277B2 (en) | 2014-09-30 | 2019-04-23 | Hewlett-Packard Development Company, L.P. | Gesture based manipulation of three-dimensional images |
WO2019154824A1 (en) * | 2018-02-08 | 2019-08-15 | BSH Hausgeräte GmbH | Domestic refrigeration device |
Also Published As
Publication number | Publication date |
---|---|
US20080005703A1 (en) | 2008-01-03 |
US8086971B2 (en) | 2011-12-27 |
EP2038732A2 (en) | 2009-03-25 |
CN103529942B (en) | 2016-09-28 |
EP2717120A1 (en) | 2014-04-09 |
KR101098015B1 (en) | 2011-12-22 |
WO2008001202A2 (en) | 2008-01-03 |
KR20090029816A (en) | 2009-03-23 |
EP2717120B1 (en) | 2021-05-19 |
WO2008001202A3 (en) | 2008-05-22 |
EP2038732A4 (en) | 2012-01-18 |
CN101730874A (en) | 2010-06-09 |
CN103529942A (en) | 2014-01-22 |
CN101730874B (en) | 2013-11-13 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8086971B2 (en) | Apparatus, methods and computer program products providing finger-based and hand-based gesture commands for portable electronic device applications | |
US11262864B2 (en) | Method and apparatus for classifying finger touch events | |
US11048333B2 (en) | System and method for close-range movement tracking | |
JP6074170B2 (en) | Short range motion tracking system and method | |
CN105116971B (en) | Customization of GUI layout based on usage history | |
US8432301B2 (en) | Gesture-enabled keyboard and associated apparatus and computer-readable storage medium | |
US9448714B2 (en) | Touch and non touch based interaction of a user with a device | |
EP1942399A1 (en) | Multi-event input system | |
US20130082928A1 (en) | Keyboard-based multi-touch input system using a displayed representation of a users hand | |
JP2013037675A5 (en) | ||
US20130257734A1 (en) | Use of a sensor to enable touch and type modes for hands of a user via a keyboard | |
US9454257B2 (en) | Electronic system | |
KR20130129271A (en) | Gesture based user interface for augmented reality | |
EP2575007A1 (en) | Scaling of gesture based input | |
KR19990084901A (en) | Software Keyboard System Using Stylus Traces and Its Key Code Recognition Method | |
US20170192465A1 (en) | Apparatus and method for disambiguating information input to a portable electronic device | |
KR200477008Y1 (en) | Smart phone with mouse module | |
KR101019255B1 (en) | Wireless apparatus and method for space touch sensing and screen apparatus using depth sensor
WO2019134606A1 (en) | Terminal control method, device, storage medium, and electronic apparatus | |
JP6232694B2 (en) | Information processing apparatus, control method thereof, and program | |
Athira | Touchless technology | |
KR20090103384A (en) | Network Apparatus having Function of Space Projection and Space Touch and the Controlling Method thereof | |
Yang | Blurring the boundary between direct & indirect mixed mode input environments |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |