
WO2005064439A2 - Dynamically modifiable virtual keyboard or virtual mouse interface - Google Patents

Dynamically modifiable virtual keyboard or virtual mouse interface Download PDF

Info

Publication number
WO2005064439A2
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
keyboard
interface
user
pattern
Prior art date
Application number
PCT/EP2004/014334
Other languages
French (fr)
Other versions
WO2005064439A3 (en)
Inventor
Stephen Dana Bjorgan
Alfred Chioiu
Original Assignee
France Telecom
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US10/748,146 (published as US20050141752A1)
Application filed by France Telecom
Publication of WO2005064439A2
Publication of WO2005064439A3

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1615 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function
    • G06F1/1616 Constructional details or arrangements for portable computers with several enclosures having relative motions, each enclosure supporting at least one I/O or computing function with folding flat displays, e.g. laptop computers or notebooks having a clamshell configuration, with body parts pivoting to an open position around an axis parallel to the plane they define in closed position
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1662 Details related to the integrated keyboard
    • G06F1/1673 Arrangements for projecting a virtual keyboard
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1686 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00 Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16 Constructional details or arrangements
    • G06F1/1613 Constructional details or arrangements for portable computers
    • G06F1/1633 Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/169 Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated pointing device, e.g. trackball in the palm rest area, mini-joystick integrated between keyboard keys, touch pads or touch stripes
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)

Abstract

A method and system for providing a configurable user-input device in the form of a keyboard input device. In one embodiment, a projection unit projects a dynamically configurable keyboard pattern onto a planar or non-planar surface. Interactions with that pattern are monitored by at least one motion sensor to identify how a user is using the pattern.

Description

TITLE OF THE INVENTION
DYNAMICALLY MODIFIABLE KEYBOARD-STYLE INTERFACE
BACKGROUND OF THE INVENTION
Field of the Invention
[0001] The present invention is directed to a dynamically modifiable keyboard-style interface, and, in one embodiment, to a laser-drawn keyboard interface that dynamically changes according to user preferences or commands.
Discussion of the Background
[0002] Keyboards for personal computers, such as is shown in Figure 1, are known user-input devices. Known keyboards have included numerous keys, including a "standard" style keyboard utilizing approximately 101 keys. However, new keyboard designs have emerged that have required an existing keyboard to be thrown out and replaced by the new design, since the physical arrangement of keys was such that new keys could not simply be added to an existing keyboard.
[0003] Keyboards are also not the only user-input device that a user often interacts with. In the laptop environment, such as is shown in Figure 2, a user also has access to a touch pad that sits between the user and the keyboard keys. Such a positioning of the touch pad is preferable for manipulation of the touch pad, but the touch pad is often accidentally touched while typing. This causes the computer to erroneously believe that the user intended to signal a click or movement of the mouse. The positioning of the touch pad also increases the distance that a user has to reach to get to the keys, and increases the required depth of the computer in order to fit both the touch pad and keys.
[0004] Keyboards are also often bulky and sometimes require wires to connect the keyboard to the computer. Such requirements cause many users to prefer not to carry a keyboard. Keyboards, however, are a more rapid input device than a PDA touch screen or character-recognition solutions. Accordingly, many people would often like to have a keyboard without the hassle and bulk of carrying one. A known concept for a virtual keyboard, for computers and PDAs, has been presented by Canesta. The system includes a pattern projector that is believed to be fixed, an IR light source (behind an engraved film) and an IR sensor module. However, a problem associated with the design of the Canesta system is that, by virtue of the film used, the pattern drawn by the pattern projector and analyzed by the sensor module appears to be fixed and does not allow for dynamic reconfiguration of the drawn pattern and interactions with the pattern.
[0005] Keyboards are also poor input devices in a multi-language environment. For example, in a kiosk in an international airport, it is difficult to have only one keyboard since keyboards are actually language dependent. For example, while the US-style keypad uses a "QWERTY" layout, France uses an "AZERTY" layout. Also, alternative keyboard interfaces (such as Dvorak-style keyboards) exist, and users accustomed to those alternative interfaces may have difficulty in using a "standard" keyboard.
[0006] Some provisions exist to cause a computer to pretend that an existing keyboard with letters and symbols printed on it in one fashion is actually a keyboard corresponding to an alternate language. However, in such an environment, the user does not actually see the letters as they would appear on the alternate keyboard, and the user can become confused.
SUMMARY OF THE INVENTION
[0007] The present invention is directed to a virtual user-input device that enables various input configurations to be utilized dynamically such that at least one of the keyboard layout and keyboard character mappings is changed dynamically.
[0008] One embodiment of a system for achieving such a keyboard includes a dynamic pattern generation module and a motion sensor for determining interactions with the pattern(s) generated by the dynamic pattern generation module. The dynamic pattern generation module may produce either a projector-based image or a monitor-based image.
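As a toy illustration of what changing the keyboard character mappings dynamically can mean in practice, the following sketch (layout tables and class names are illustrative only, not part of the disclosed system) shows the same top-row key position yielding different characters depending on the currently selected layout:

```python
# Toy sketch: dynamic layout/character-mapping switching (top rows of the
# QWERTY and AZERTY layouts only; purely illustrative).
LAYOUTS = {
    "qwerty": "QWERTYUIOP",
    "azerty": "AZERTYUIOP",
}

class VirtualKeyboard:
    def __init__(self, layout="qwerty"):
        self.layout = layout

    def set_layout(self, layout):
        # In the projected-keyboard setting, this would also trigger
        # re-projection of the key labels.
        self.layout = layout

    def char_at(self, col):
        """Character mapped to top-row key position `col` (0-9)."""
        return LAYOUTS[self.layout][col]

kb = VirtualKeyboard()
assert kb.char_at(0) == "Q"
kb.set_layout("azerty")
assert kb.char_at(0) == "A"   # same position, different character mapping
```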
BRIEF DESCRIPTION OF THE DRAWINGS
[0009] These and other advantages of the invention will become more apparent and more readily appreciated from the following detailed description of the exemplary embodiments of the invention, taken in conjunction with the accompanying drawings, where:
[0010] Figure 1 is a schematic illustration of a known, fixed-key keyboard for a desktop-style personal computer;
[0011] Figure 2 is a schematic illustration of a known laptop-configuration with a set of fixed keyboard keys and a touchpad area with corresponding mouse buttons;
[0012] Figure 3 is a schematic illustration of a laptop including a keyboard implemented by a dynamic pattern generation module and a motion sensor according to the present invention;
[0013] Figure 4 is a schematic illustration of a laptop including a keyboard and mousepad implemented by a dynamic pattern generation module and a motion sensor according to the present invention;
[0014] Figure 5 is a schematic illustration of an embodiment of a dynamic user interface based on a projection unit operating remotely from an application server from which the dynamic user interface is delivered over a wired network;
[0015] Figure 6 is a schematic illustration of an embodiment of a dynamic user interface using a monitor-based image and operating remotely from an application server from which the dynamic user interface is delivered over a wired network;
[0016] Figure 7 is a schematic illustration of an embodiment of a dynamic user interface based on a projection unit operating remotely from an application server from which the dynamic user interface is delivered over at least one wireless network; and
[0017] Figure 8 is a schematic illustration of an embodiment of a dynamic user interface using a monitor-based image and operating remotely from an application server from which the dynamic user interface is delivered over at least one wireless network.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
[0018] Figure 3 illustrates an embodiment of a dynamic user interface according to the present invention: a laptop computer 200 including (1) a projection unit 300 at the top of the flip-top of the laptop computer 200 and (2) a motion sensor 310 at the base of the flip-top of the laptop computer 200. The projection unit 300 may be a laser-based projection system and may include at least one mirror. Potential laser-based displays include the laser-based display described in the article entitled "Cell phone captures and projects images" in Laser Focus World, July 2003. An alternate embodiment utilizes the Laser Projection Display from Symbol Technologies, as described in the Symbol Technologies article "Preliminary Concept: Laser Projection Display (LPD)." The contents of both of those articles are incorporated herein by reference. However, the projection unit 300 is designed to display more than one pattern onto the base unit 320. The projection unit 300 may project a keyboard-style image 325 on either a planar surface or a 3D surface. The motion sensor 310 may include an IR beam transmitter and an IR sensor, which may be either separate components or a single integrated component. Other technologies for motion sensing may also be used instead of IR transceivers.
[0019] As shown in the example of Figures 3 and 4, a keyboard-style pattern 325 is projected onto the base unit 320. As would be understood by those skilled in the art, the base unit 320 may include registration marks thereon in order to indicate to a user when the projection unit 300 is aligned with the motion sensor 310. Moreover, the laptop computer 200 may provide a calibration process or method by which points on the projected keyboard-style image 325 are identified to the motion sensor 310. This calibration process enables the image to be projected by the projection unit 300 even if the flip-top is not at the exact angle (compared to the base unit 320) for which the keyboard-style pattern 325 was originally created.
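The patent does not spell out the calibration algorithm. One plausible realization, sketched below under the assumption that at least four registration marks are visible both in the pattern's design coordinates and in the motion sensor's coordinates (function names are illustrative), is to solve for the homography relating the two frames:

```python
# Minimal calibration sketch (assumed approach, not from the patent):
# solve for the 3x3 homography H mapping motion-sensor coordinates to
# keyboard-pattern coordinates from >= 4 registration-mark correspondences,
# so touches can be resolved to keys even when the flip-top angle differs
# from the angle the pattern was designed for.
import numpy as np

def solve_homography(sensor_pts, pattern_pts):
    """Direct linear transform from point pairs ((x, y), (u, v))."""
    rows = []
    for (x, y), (u, v) in zip(sensor_pts, pattern_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    A = np.asarray(rows, dtype=float)
    _, _, vt = np.linalg.svd(A)       # null vector = homography entries
    return vt[-1].reshape(3, 3)

def sensor_to_pattern(H, x, y):
    """Map one sensed touch point into pattern coordinates."""
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w
```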
[0020] In one embodiment of the present invention, shown in Figure 4, the projection unit 300 virtually superimposes or integrates with the keyboard-style image 325 a mousepad 330 and corresponding mouse buttons 340. That is, a portion of the keyboard-style pattern 325 is virtually occluded and the mousepad 330 and buttons 340 are drawn where a portion of the keyboard-style pattern 325 otherwise would have been drawn. (As would be understood by one of ordinary skill in the art, in embodiments utilizing a laser, the keyboard-style pattern 325 is not physically overwritten, but rather a portion of the keyboard-style pattern 325 is suppressed from being projected and the mousepad 330 and buttons 340 are projected in place of that portion of the keyboard-style pattern 325.)
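A minimal sketch of this suppress-and-replace compositing, assuming a hypothetical data model in which a projected pattern is a list of laser strokes (polylines), might look as follows: strokes whose bounding boxes intersect the mousepad region are simply not projected, and the mousepad strokes are projected in their place.

```python
# Sketch of "virtual occlusion" (hypothetical stroke-based pattern model).
from dataclasses import dataclass

@dataclass
class Stroke:
    points: list  # [(x, y), ...] polyline for the projector to trace

    def bbox(self):
        xs = [p[0] for p in self.points]
        ys = [p[1] for p in self.points]
        return min(xs), min(ys), max(xs), max(ys)

def composite(keyboard_strokes, mousepad_strokes, region):
    """Suppress keyboard strokes intersecting `region` and add the
    mousepad strokes in their place. `region` is (x0, y0, x1, y1)."""
    rx0, ry0, rx1, ry1 = region

    def outside(stroke):
        x0, y0, x1, y1 = stroke.bbox()
        return x1 < rx0 or x0 > rx1 or y1 < ry0 or y0 > ry1

    return [s for s in keyboard_strokes if outside(s)] + list(mousepad_strokes)
```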
[0021] Alternatively, the keyboard-style pattern 325 can be created using technology other than a projection unit 300. For example, a fixed pattern can be printed onto the base unit 320. In such a configuration, the user would not be able to see any changes to the interface as it was dynamically updated, but various keyboard configurations could nonetheless be used dynamically. In addition, the base unit 320 could be printed on with a variety of colors and patterns such that the user could, knowing the color corresponding to the current configuration, see several user interfaces simultaneously.
[0022] In yet another embodiment, the "overhead" projection unit of Figures 3 and 4 is replaced with a pattern generation device that is underneath the surface where the user is interacting with the interface. For example, an LCD panel or display is typed on and tracked by the motion sensor 310. In this configuration, no special touch-sensitive material is required for the LCD since the motion sensor can use infrared to pick up the hand motions. Similarly, even a monitor (including flat panel monitors) under glass or other transparent material can be used to generate the keyboard-style pattern 325. The user simply types on the transparent material, protecting the monitor from harm while dynamically being able to update the display. As would be understood by those of ordinary skill in the art, in light of the inexpensive nature of computer monitors, several monitors may be used together to increase the size of the keyboard-style pattern 325 that can be interacted with.
[0023] As illustrated in Figure 5, the dynamic user interface may also be implemented in applications other than local computing environments (e.g., laptop and desktop computer environments). For example, a kiosk 250 may be equipped with a projection unit 300 that generates a user interface (e.g., a keyboard-style pattern 325 and/or a mousepad and corresponding buttons). Interactions with the keyboard-style pattern 325 are picked up by the motion sensor 310. Those interactions are communicated to a control and communications unit 350 over a communications link 345. The control and communications unit 350 may then either process those interactions locally or send them on to an application server 400 connected to the control and communications unit 350 by a LAN or WAN connection across at least one communications link 360. Such communications links may be any one or a combination of wired (e.g., Ethernet, ATM, FDDI, Token Ring) or wireless (e.g., 802.11a, b, g and other follow-on standards, and other RF and IR standards) links. Moreover, the communication protocols may include any one of connection-oriented and connectionless communications protocols using either datagram or error-correcting protocols. Such protocols include TCP/IP, UDP, etc.
[0024] Examples of applications for kiosks 250 include a public pay phone where the user interacts with the keyboard-style pattern 325 instead of a physical telephone interface. In light of the existence of a monitor and a keyboard-style pattern 325, a user of the kiosk 250 can also be provided with Internet-related services or enhanced calling features as well. For example, a user may browse emails or facsimiles corresponding to the user. In a kiosk that implements a phone booth, the kiosk may also include a phone handset or a speakerphone that the user utilizes to communicate with a called party.
[0025] In an environment where a kiosk provider does not want to incur the cost or risk of providing a projection unit 300, either a user can bring his/her own projection unit (e.g., integrated within a PDA or other portable device) or the kiosk provider can provide a base unit 320 with a predefined pattern printed thereon (see Figure 6). In an embodiment where the user brings his/her own projection unit, the projection unit 300 may include an interface for receiving any one or a combination of power and control signals used to drive the projection unit.
Such interfaces can be any wired or wireless communication devices including, but not limited to, Ethernet, serial, USB, parallel, Bluetooth, CDMA, GSM/GPRS, etc. Control signals sent to the projection unit may include which one of plural user interfaces (or partial user interfaces) is currently projected.
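To make the division of labor concrete, here is a minimal sketch of one way the control and communications unit could forward interactions upstream and receive an interface-selection command back; the JSON wire format and field names are assumptions for illustration, not something specified in the patent:

```python
# Sketch of kiosk-to-server communication (hypothetical JSON protocol
# over TCP; a connectionless UDP variant would also fit the description).
import json
import socket

def forward_interaction(server_addr, x, y, event="press"):
    """Send one sensed interaction; return a new interface id if the
    application server answers with a switch command, else None."""
    msg = json.dumps({"type": "interaction", "x": x, "y": y,
                      "event": event}).encode()
    with socket.create_connection(server_addr, timeout=5) as sock:
        sock.sendall(msg + b"\n")
        reply = json.loads(sock.makefile().readline())
    if reply.get("type") == "select_interface":
        return reply["interface_id"]   # controller switches the projection
    return None
```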
[0026] A provider of a kiosk 250 may also elect to utilize an under-mounted display technology, as described above with reference to a monitor or LCD panel covered with a transparent protective material.
[0027] Figures 7 and 8 illustrate embodiments in which the portable device 290 that interacts with a kiosk is either provided to a user by the kiosk provider or brought by the user. The portable device 290 utilizes an RF module 380 (or an optical module such as an IR module) to communicate with an application server 400 across a WAN/LAN using at least one communications link 360. In this fashion, the entire control interface may be transported to and used in or near the kiosk.
[0028] Other possible kiosks or applications can include any interface description accessible by the user that may be transmitted from an application server 400 to a terminal containing the display mechanism, displayed on a surface, and operated upon by the user. Some applications include: web browsers; video conference applications (e.g., mute one or more participants, display a particular image to the audience, control volume, dial in participants); multimedia equipment controls (e.g., wave your hand down to decrease volume and up to increase it; dial a station, play a CD); information kiosks; advertising displays with feedback mechanisms; ticketing services; self-service interfaces (e.g., vending machines); remote device control (e.g., cameras, alarms, locks); remote vehicle control to, for example, control vehicles in hazardous environments; industrial environments (flow controls, heating/ventilation/air conditioning); clean rooms; sterile and medical environments where mechanical equipment placement is prohibitive; test equipment; hazardous environments; remote control of distant objects, e.g., factory equipment, defense applications, building security (alarms, cameras, locking mechanisms); and simulations.
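As one concrete example of the gesture controls listed above, the volume gesture could reduce to something as simple as the following sketch (the threshold and axis convention are assumptions for illustration):

```python
# Toy sketch: map a sensed vertical hand motion to a volume step.
def gesture_to_volume_step(y_positions, threshold=30.0):
    """Return -1, 0 or +1 from a sequence of sensed y coordinates
    (+y assumed to point downward, as in most image coordinates)."""
    if len(y_positions) < 2:
        return 0
    dy = y_positions[-1] - y_positions[0]
    if dy > threshold:
        return -1   # hand moved down: decrease volume
    if dy < -threshold:
        return +1   # hand moved up: increase volume
    return 0
```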
[0029] In order to dynamically generate the keyboard-style pattern 325 and/or determine a location on the keyboard-style pattern 325 that the user is interacting with, the present invention includes at least one of hardware and software for controlling at least one of the projection unit 300 and the motion sensor 310. In one software embodiment, a central processing unit (CPU) interacts with at least one memory (e.g., DRAM, ROM, EPROM, EEPROM, SRAM, SDRAM, and Flash RAM), and other optional special-purpose logic devices (e.g., ASICs) or configurable logic devices (e.g., GAL and reprogrammable FPGA). The kiosk 250 may also include a floppy disk drive; other removable media devices (e.g., compact disc, tape, and removable magneto-optical media); and a hard disk, or other fixed, high-density media drives, connected using an appropriate device bus (e.g., a SCSI bus, an Enhanced IDE bus, an Ultra DMA bus or a Serial ATA interface). The kiosk 250 may further include a compact disc reader, a compact disc reader/writer unit or a compact disc jukebox. In addition, a printer may also provide printed listings of work performed by a user at the kiosk 250.
[0030] As stated above, the software for controlling the kiosk (or the kiosk and the portable device) includes at least one computer readable medium. Examples of computer readable media are compact discs, hard disks, floppy disks, tape, magneto-optical disks, PROMs (EPROM, EEPROM, Flash EPROM), DRAM, SRAM, SDRAM, etc. Stored on any one or on a combination of computer readable media, the present invention includes software both for controlling the hardware of the kiosk 250 and for enabling the kiosk 250 to interact with a human user. Such software may include, but is not limited to, device drivers, operating systems and user applications, such as development tools. Together, the computer readable media and the software thereon form a computer program product of the present invention for providing a virtual user interface. The computer code devices of the present invention can be any interpreted or executable code mechanism, including but not limited to scripts, interpreters, dynamic link libraries, Java classes, and complete executable programs. Such software controls a pattern to be displayed to a user, and the pattern may be dynamically changed in response to configuration information provided to the software. Such changes include changes in keyboard key labels and positions and shapes of individual keys. Such software further includes a dynamically configurable memory for determining which key corresponds to the portion of the keyboard interface that a user is interacting with. For example, if an existing key is split into two parts to make two keys in its place, a computer memory is updated to be able to differentiate interactions with one of the new keys from interactions with the other of the two keys. Similarly, if keys are added where no keys existed before, the software tracks the location and extent of the new key. Such tracking may also occur for a virtual mousepad and virtual mouse buttons.
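A minimal sketch of such a dynamically configurable key map is shown below (class and method names are illustrative, not from the patent): each key is a labeled rectangle, a hit test resolves a sensed touch to a key, and splitting a key replaces one entry with two so that interactions with the two new keys can be differentiated.

```python
# Sketch of a dynamically reconfigurable key map with hit testing.
from dataclasses import dataclass

@dataclass
class Key:
    label: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, px, py):
        return self.x <= px < self.x + self.w and self.y <= py < self.y + self.h

class KeyMap:
    def __init__(self, keys):
        self.keys = list(keys)

    def hit_test(self, px, py):
        """Resolve a sensed touch to the key it falls on, or None."""
        return next((k for k in self.keys if k.contains(px, py)), None)

    def split_key(self, label, left_label, right_label):
        """Replace one key with two half-width keys in its place."""
        old = next(k for k in self.keys if k.label == label)
        self.keys.remove(old)
        half = old.w / 2
        self.keys.append(Key(left_label, old.x, old.y, half, old.h))
        self.keys.append(Key(right_label, old.x + half, old.y, half, old.h))
```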
[0031] In addition, any of the functions described above in terms of software can instead be implemented in special-purpose hardware such as FPGAs, ASICs, PALs, GALs, etc.
[0032] Numerous modifications of the above-teachings can be made by those of ordinary skill in the art without departing from the scope of protection afforded by the appended claims.

Claims

CLAIMS:
1. A dynamically configurable user-input interface for interacting with a user, comprising:
a projection unit for projecting (1) a first virtual interface including at least one of a virtual keyboard, a virtual mousepad and at least one virtual mouse button and (2) a second virtual interface including at least one of a virtual keyboard, a virtual mousepad and at least one virtual mouse button to be displayed in place of at least a portion of said first virtual interface;
a motion sensor for determining a position on the first and second virtual interfaces that is interacted with by a user;
a communications controller for communicating the position on the first and second virtual interfaces outside of the user-input interface; and
a controller for controlling the projection unit to switch from the first virtual interface to the second virtual interface.
2. The dynamically configurable user-input interface as claimed in claim 1, wherein the first virtual interface comprises a keyboard interface in a first language and the second virtual interface comprises a keyboard interface in a second language.
3. The dynamically configurable user-input interface as claimed in claim 1, wherein the first virtual interface comprises a keyboard interface and the second virtual interface comprises a mousepad.
4. The dynamically configurable user-input interface as claimed in claim 1, wherein the first virtual interface comprises a keyboard interface and the second virtual interface comprises a mousepad and at least one mouse button.
5. The dynamically configurable user-input interface as claimed in claim 1, further comprising a telephone interface for communicating by phone between the user and a remotely located telephone customer.
PCT/EP2004/014334 2003-12-31 2004-12-16 Dynamically modifiable virtual keyboard or virtual mouse interface WO2005064439A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US10/748,146 2003-12-31
US10/748,146 US20050141752A1 (en) 2003-12-31 2003-12-31 Dynamically modifiable keyboard-style interface
EP04291783 2004-07-12
EP04291783.1 2004-07-12

Publications (2)

Publication Number Publication Date
WO2005064439A2 true WO2005064439A2 (en) 2005-07-14
WO2005064439A3 WO2005064439A3 (en) 2006-04-20

Family

ID=34740679

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2004/014334 WO2005064439A2 (en) 2003-12-31 2004-12-16 Dynamically modifiable virtual keyboard or virtual mouse interface

Country Status (1)

Country Link
WO (1) WO2005064439A2 (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030174125A1 (en) * 1999-11-04 2003-09-18 Ilhami Torunoglu Multiple input modes in overlapping physical space
WO2001093182A1 (en) * 2000-05-29 2001-12-06 Vkb Inc. Virtual data entry device and method for input of alphanumeric and other data
WO2002057089A1 (en) * 2000-11-17 2002-07-25 Clear Technologies, Inc. Electronic input device
US20030128188A1 (en) * 2002-01-10 2003-07-10 International Business Machines Corporation System and method implementing non-physical pointers for computer devices

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8228315B1 (en) 2011-07-12 2012-07-24 Google Inc. Methods and systems for a virtual input device
US9069164B2 (en) 2011-07-12 2015-06-30 Google Inc. Methods and systems for a virtual input device
WO2014107087A1 (en) * 2013-01-07 2014-07-10 Samsung Electronics Co., Ltd. Method and apparatus for providing mouse function using touch device
US9223414B2 (en) 2013-01-07 2015-12-29 Samsung Electronics Co., Ltd. Method and apparatus for providing mouse function using touch device
US10838504B2 (en) 2016-06-08 2020-11-17 Stephen H. Lewis Glass mouse
US11340710B2 (en) 2016-06-08 2022-05-24 Architectronics Inc. Virtual mouse
CN111026066A (en) * 2019-12-30 2020-04-17 东风小康汽车有限公司重庆分公司 Vehicle control method and device

Also Published As

Publication number Publication date
WO2005064439A3 (en) 2006-04-20

Similar Documents

Publication Publication Date Title
US20050141752A1 (en) Dynamically modifiable keyboard-style interface
CN111107222B (en) Interface sharing method and electronic equipment
US8432362B2 (en) Keyboards and methods thereof
US9977541B2 (en) Mobile terminal and method for controlling the same
JP3952896B2 (en) Coordinate input device, control method therefor, and program
US10140016B2 (en) User input apparatus, computer connected to user input apparatus, and control method for computer connected to user input apparatus, and storage medium
CN102439656B (en) Based on the customization of the GUI layout of use history
US8334837B2 (en) Method for displaying approached interaction areas
US10296127B2 (en) Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
US6437774B1 (en) Display and input device and display and input system
US20070229458A1 (en) Wheel input device and method for four-way key stroke in portable terminal
US20120019488A1 (en) Stylus for a touchscreen display
WO2021057337A1 (en) Operation method and electronic device
WO2009139971A2 (en) Computer vision-based multi-touch sensing using infrared lasers
KR20130113997A (en) Object control method performed in device including transparent display, the device, and computer readable recording medium thereof
US20020041325A1 (en) Interaction configuration
WO2021004327A1 (en) Method for setting application permission, and terminal device
US10126843B2 (en) Touch control method and electronic device
WO2005064439A2 (en) Dynamically modifiable virtual keyboard or virtual mouse interface
KR200477008Y1 (en) Smart phone with mouse module
US20100001951A1 (en) Cursor control device
KR101682527B1 (en) touch keypad combined mouse using thin type haptic module
US6803907B2 (en) Wireless beam-pen pointing device
CN110502302A (en) Control method, terminal device and the storage medium of Application Program Interface
CN113986075B (en) Information display method and device, verification method and device and electronic equipment

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase in:

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

122 Ep: pct application non-entry in european phase