US20210349593A1 - Systems and methods for non-contacting interaction with user terminals - Google Patents
Systems and methods for non-contacting interaction with user terminals
- Publication number: US20210349593A1 (application US 17/007,278)
- Authority: United States
- Prior art keywords
- user terminal
- processing unit
- mobile
- user
- communications device
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G06F3/04883 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
- G07F19/206 — Software aspects at ATMs
- G06F3/005 — Input arrangements through a video camera
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304 — Detection arrangements using opto-electronic means
- G06F3/0425 — Digitisers using a single imaging device, e.g. a video camera, for tracking the absolute position of one or more objects with respect to an imaged reference surface
- G06F3/0481 — GUI interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment
- G06F3/04812 — Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
- G06F3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06Q20/108 — Remote banking, e.g. home banking
- G06Q20/18 — Payment architectures involving self-service terminals [SST], vending machines, kiosks or multimedia terminals
- G07G1/0036 — Checkout procedures
- G09G5/006 — Details of the interface to the display terminal
- H04W4/80 — Services using short range communication, e.g. near-field communication [NFC], radio-frequency identification [RFID] or low energy communication
- G06F2203/04101 — 2.5D digitiser: detects the X/Y position of the input means (finger or stylus) when proximate to the interaction surface and also measures its distance within a short range in the Z direction
- G06F2203/04108 — Touchless 2D digitiser: detects the X/Y position of the proximate input means without distance measurement in the Z direction
- G09G2370/12 — Use of DVI or HDMI protocol in interfaces along the display data pipeline
Definitions
- Touch screen interfaces are becoming ubiquitous as they are increasingly incorporated into a wide variety of devices.
- Example devices that include touch screen interfaces are ATMs, kiosks, elevator touch panels, point of sale terminals, airport check-in kiosks, supermarket self-check-out terminals, vending machines, gas pumps, casino games, in-restaurant ordering devices, ticket dispensers, employee kiosks, health assessment kiosks, wayfinding and general information kiosks, and paid parking machines, among numerous others.
- A vast majority of such devices are in public places and are routinely touched by large numbers of people. Users are becoming increasingly concerned with the possible exchange of bacteria, germs, viruses, and other contagions through physical interaction with commonly touched surfaces, including touch screen interface surfaces.
- Touch screen interfaces are rarely, if ever, disinfected, sanitized, or even cleaned. While other user input technologies, such as voice control, may allow at least some degree of contactless interaction with certain devices, these technologies are often impractical to implement and do not provide an acceptable level of usability.
- FIG. 1 depicts an example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 2 depicts another example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 3 depicts yet another example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 4 depicts an example mobile commands processing unit in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal via a mobile communications device.
- FIG. 5 depicts another mobile commands processing unit in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal via a mobile communications device.
- FIG. 6 depicts another example embodiment of contactless control of a user terminal in accordance with one non-limiting embodiment.
- FIG. 7 depicts an example Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- FIG. 8 depicts another example Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- FIG. 9 depicts a user terminal that includes a built-in gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 10 depicts a user terminal that includes a built-in HID/Bluetooth module allowing for user input from a coupled mobile communications device in accordance with one non-limiting embodiment.
- FIG. 11 depicts an example user terminal that includes a built-in Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- References to components or modules generally refer to items that logically can be grouped together to perform a function or group of related functions. Like reference numerals are generally intended to refer to the same or similar components.
- Components and modules can be implemented in software, hardware, or a combination of software and hardware.
- The term software is used expansively to include not only executable code, but also data structures, data stores, computing instructions in any electronic format, firmware, and embedded software.
- The terms information and data are used expansively and can include a wide variety of electronic information, including but not limited to machine-executable or machine-interpretable instructions; content such as text, video data, and audio data, among others; and various codes or flags.
- The terms information, data, and content are sometimes used interchangeably when permitted by context.
- The presently disclosed systems and methods generally enable users to interact with user terminals having a touch screen interface without requiring the user to physically contact a surface of the touch screen interface.
- The term user terminal broadly refers to any system, device, or component that incorporates a touch screen interface as a means for user input. Examples of user terminals include, but are not limited to, ATMs, kiosks, casino games, elevator touch panels, point of sale (POS) terminals, gas pumps, airport check-in kiosks, supermarket self-check-out terminals, vending machines, in-restaurant ordering devices, ticket dispensers, employee kiosks, health assessment kiosks, and wayfinding and general information kiosks. Accordingly, user terminals in accordance with the present disclosure can have a wide variety of form factors and utilize a variety of different types of touch screen interfaces and/or other forms of user input interfaces requiring physical contact by a user.
- A user can interact with a user terminal through non-contacting interactions.
- A user can have non-contacting interactions with a touch screen interface of a user terminal through a series of hand gestures.
- A user can have non-contacting interactions with a touch screen interface of a user terminal through contacting interactions with the user's mobile communications device.
- The user's interactions with their mobile communications device can be relayed to the user terminal through network communications.
- Contactless interaction functionality can be provided as a retrofit kit or add-on system that is incorporated into an existing user terminal, although this disclosure is not so limited.
- The gesture processing unit 100 can be incorporated into an existing user terminal 190 in order to provide non-contact inputs from a user.
- The gesture processing unit 100 is positioned internal to a housing, case, or cabinet of the user terminal 190.
- The user terminal can have an existing power supply 112, a primary display 114, and a primary computer 116.
- The primary display 114 can incorporate a touch screen interface that is used to track a user's physical interaction with the surface of the interface.
- The user terminal 190 may have other forms of input devices, such as a numeric keypad, a mouse, a stylus, a keyboard, push buttons, and so forth.
- The gesture processing unit 100 can allow a user to provide inputs to the user terminal 190 through non-contacting hand gestures and without requiring physical contact with any surfaces of the user terminal 190. Such hand gestures can be executed within the field of view of a camera 102 that is mounted to the user terminal 190.
- The gesture processing unit 100 can utilize a video feed from an existing camera 102 that was installed in the user terminal 190 at the time of manufacture.
- In some cases, however, the user terminal 190 may not include a camera 102, or the in-situ camera may not be suitable for the needs of the gesture processing unit 100.
- In such cases, a camera 102 can be mounted to the user terminal 190 at the time the gesture processing unit 100 is installed.
- The camera 102 can be a depth sensor camera as depicted, or any other suitable type of camera.
- The user terminal 190 may have multiple cameras 102.
- The gesture processing unit 100 can be connected to the peripheral modules of the user terminal 190.
- The gesture processing unit 100 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example.
- The gesture processing unit 100 can be treated as a peripheral device by the other computers of the user terminal 190.
- The gesture processing unit 100 can be connected to the primary computer 116 of the user terminal 190 via a USB port to provide users with a touchless approach for interacting with the user terminal 190.
- A user can interact with the user terminal 190 without being required to physically touch an input device of the user terminal (such as a keyboard, a mouse, a touchscreen, a keypad, and so forth).
- A video stream can be routed from the primary computer 116 to the gesture processing unit 100 by connecting the HDMI/VGA output from the primary computer 116 to an input port in the gesture processing unit 100.
- The primary display 114 of the user terminal 190 can be connected to the output HDMI/VGA port on the gesture processing unit 100.
- The original video feed from the primary computer 116 can be proxied via the gesture processing unit 100, unaltered, and then displayed on the primary display 114.
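The pass-through-with-overlay behavior described above can be sketched as a simple frame filter: video flows through unaltered, and the unit stamps a cursor marker only when it has feedback to show. The frame model and function names below are illustrative assumptions, not the patent's implementation.

```python
# Minimal sketch of a video proxy that passes frames through unaltered,
# optionally stamping a cursor marker as visual feedback.
# Frame model (an assumption for illustration): a 2D list of pixel values.

def overlay_cursor(frame, x, y, marker=255):
    """Return a copy of the frame with a 3x3 cursor marker at (x, y)."""
    out = [row[:] for row in frame]  # copy so the source feed stays unaltered
    h, w = len(out), len(out[0])
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            px, py = x + dx, y + dy
            if 0 <= py < h and 0 <= px < w:
                out[py][px] = marker
    return out

def proxy_frame(frame, cursor=None):
    """Pass the frame through; draw feedback only when a cursor is active."""
    if cursor is None:
        return frame          # unaltered pass-through
    return overlay_cursor(frame, *cursor)
```

With no active cursor, `proxy_frame(frame)` returns the source frame untouched, mirroring the "proxied, unaltered" path to the primary display.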
- An application executed by a CPU 106 of the gesture processing unit 100 can actively monitor motion via the connected depth camera 102.
- A gesture recognizer module 104 using a tensor processing unit (TPU) can be utilized, although this disclosure is not so limited.
- Relevant gestures can be identified and translated to the equivalent human interface device (HID) commands by an HID module 108 of the gesture processing unit 100.
- The commands can be delivered to the primary computer 116 via a USB connection (or other suitable communications), while the video feed overlay module 110 can provide a user of the user terminal 190 with visual feedback for each relevant action on the primary display 114.
- Gestures of the user can cause the cursor to point to various interactive elements on the screen that would typically be physically touched by a user. When the cursor is pointing to a desired interactive element, the user can execute a gesture for selecting or activating the interactive element.
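The translation from recognized gestures to HID commands might look like the following sketch. The gesture vocabulary, normalized coordinates, and command tuples are all assumptions for illustration; a real HID module would encode these as USB HID reports to the primary computer.

```python
# Hypothetical sketch: translating recognized gestures into HID-style
# pointer commands. Gesture names and command tuples are assumptions.

def gestures_to_hid(events, screen_w=1920, screen_h=1080):
    """Map (gesture, payload) events to (command, args) pointer actions."""
    commands = []
    for gesture, payload in events:
        if gesture == "point":
            # payload: normalized (x, y) in [0, 1] from the recognizer
            nx, ny = payload
            commands.append(("move_abs", (int(nx * screen_w), int(ny * screen_h))))
        elif gesture == "push":        # forward "tap" gesture selects
            commands.append(("click", "left"))
        elif gesture == "swipe_up":
            commands.append(("scroll", 1))
        # unrecognized gestures are ignored rather than guessed at
    return commands
```

For example, a "point" at the center of the screen followed by a "push" yields an absolute cursor move and a left click, matching the point-then-select interaction described above.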
- The gesture processing unit 200 can be incorporated into an existing user terminal 290 in order to provide non-contact inputs from a user.
- The user terminal can have an existing power supply 212, a primary display 214, and a primary computer 216.
- The primary display 214 can incorporate a touch screen interface that is used to track a user's physical interaction with the surface of the interface.
- The user terminal 290 may have other forms of input devices, such as a numeric keypad, a mouse, a stylus, a keyboard, push buttons, and so forth.
- The gesture processing unit 200 can allow a user to provide inputs to the user terminal 290 through non-contacting hand gestures and without requiring physical contact with any surfaces of the user terminal 290.
- Such hand gestures can be executed within the field of view of a camera 202 that is mounted to the user terminal 290.
- The camera 202 can be a depth sensor camera as depicted, or any other suitable type of camera. Furthermore, in some embodiments, multiple cameras 202 can be utilized.
- The gesture processing unit 200 can be connected to the peripheral modules of the user terminal 290.
- The gesture processing unit 200 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example.
- The gesture processing unit 200 can be treated as a peripheral device by the other computers of the user terminal 290.
- The gesture processing unit 200 can be connected to the primary computer 216 of the user terminal 290 via a USB port to provide users with a touchless approach for interacting with the user terminal 290.
- A video stream can be routed from the primary computer 216 to the gesture processing unit 200 by feeding the video output from the primary computer 216 to an input port in the gesture processing unit 200.
- A VGA to HDMI converter 211 can be used to convert the source video.
- A video input module 210 can be used to compress the video and provide the compressed video to the CPU 206 via a camera serial interface (CSI) bus, for example.
- The primary display 214 of the user terminal 290 can be connected to the output HDMI/VGA port on the gesture processing unit 200.
- An HDMI to VGA converter 209 can be used to convert the output video prior to feeding it to the primary display 214.
- The original video feed from the primary computer 216 can be proxied via the gesture processing unit 200, unaltered, and then displayed on the primary display 214.
- An application executed by the CPU 206 of the gesture processing unit 200 can actively monitor motion via the connected camera 202.
- A gesture recognizer module 204 using a tensor processing unit (TPU) can be utilized, although this disclosure is not so limited.
- Relevant gestures can be identified and translated to the equivalent human interface device (HID) commands by an HID module 208 of the gesture processing unit 200.
- The commands can be delivered to the primary computer 216 via a USB connection (or other suitable communications).
- The CPU 206 can also modify the video feed such that visual feedback is provided for each relevant action on the primary display 214.
- Gestures of the user can cause the cursor to point to various interactive elements on the screen that would typically be physically touched by a user. When the cursor is pointing to a desired interactive element, the user can execute a gesture for selecting or activating the interactive element.
- FIG. 3 depicts another example of a gesture processing unit 300 in accordance with one non-limiting embodiment. Similar to FIG. 2, the gesture processing unit 300 can be incorporated into an existing user terminal 390 in order to provide non-contact inputs from a user.
- The user terminal can have an existing power supply 312, a primary display 314, and a primary computer 316.
- The user terminal 390 can also have a camera 302.
- The gesture processing unit 300 can be similar to the gesture processing unit 200 (FIG. 2), having a CPU 306 and an HID module 308.
- The gesture processing unit 300 can also include a video input module 310, a VGA to HDMI converter 311, and an HDMI to VGA converter 309, although this disclosure is not so limited.
- This example embodiment of the gesture processing unit 300 does not necessarily utilize a tensor processing unit (TPU) for gesture recognition.
- The CPU 306 can, however, include a TPU without departing from the scope of the present disclosure.
- Instead, the camera signal is fed to the CPU 306 for processing.
- The camera module 302 can include a depth sensor such that the camera signal provided to the CPU 306 includes depth information.
- Providing depth information in the camera signal can increase the ability of the CPU 306 to provide real-time gesture processing without necessarily needing a TPU.
- The functionality provided by the gesture processing unit 300 can be similar to the functionality provided by the gesture processing unit 200, but without the need for a dedicated TPU.
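One way depth information can enable real-time gesture processing without a dedicated TPU is by reducing recognition to simple geometry on the depth frame: the nearest valid point is taken as the fingertip, and a distance threshold separates a "hover" from a "press". The threshold, frame model, and function names below are hypothetical illustrations, not the disclosed implementation.

```python
# Illustrative sketch: fingertip detection from a depth frame, where each
# cell holds a distance in millimeters and 0 means "no reading".

PRESS_MM = 300  # hypothetical: closer than 30 cm counts as a press

def fingertip(depth_frame):
    """Return (x, y, depth_mm) of the nearest valid point, or None."""
    best = None
    for y, row in enumerate(depth_frame):
        for x, d in enumerate(row):
            if d > 0 and (best is None or d < best[2]):
                best = (x, y, d)
    return best

def classify(depth_frame):
    """Classify the frame as idle, hover, or press with a location."""
    tip = fingertip(depth_frame)
    if tip is None:
        return ("idle", None)
    x, y, d = tip
    return ("press", (x, y)) if d < PRESS_MM else ("hover", (x, y))
```

Because the per-frame work is a single linear scan plus a threshold comparison, a general-purpose CPU can keep up at video frame rates, which is consistent with the point above about depth data reducing the need for a TPU.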
- A mobile commands processing unit 400 is depicted in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal 490 via a mobile communications device 450.
- The mobile commands processing unit 400 can be integrated into an existing user terminal 490.
- The mobile commands processing unit 400 can serve to mirror the images presented on a primary display 414 of the user terminal 490 onto a display screen of the mobile communications device 450 via wireless communications, as schematically depicted in FIG. 4.
- The user can interact with their mobile communications device 450 (i.e., by touching its display screen) and such interactions are relayed to the user terminal 490.
- The user terminal 490 can have a power supply 412, a primary display 414, and a primary computer 416, similar to the user terminals 190, 290, 390 described above, for example.
- The mobile commands processing unit 400 can have a CPU 406 and a video overlay module 410.
- The mobile commands processing unit 400 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example.
- The mobile commands processing unit 400 can be treated as a peripheral device by the other computers of the user terminal 490.
- While the embodiment depicted in FIG. 4, as well as other embodiments, depicts the use of Bluetooth for purposes of illustration, it is to be appreciated that any suitable short-range wireless communication protocol and/or technology can be used without departing from the scope of the present disclosure.
- The mobile commands processing unit 400 can be connected to a primary computer 416 of the user terminal 490 via a USB port to provide users with a touchless approach for interacting with the user terminal 490. Additionally, a video stream can be routed from the primary computer 416 to the mobile commands processing unit 400 by connecting the HDMI/VGA output from the primary computer 416 to an input port of the mobile commands processing unit 400.
- The primary display 414 of the user terminal 490 can be connected to the output HDMI/VGA port on the mobile commands processing unit 400.
- a HID/Bluetooth module 408 is utilized to communicate with a mobile communications device 450 .
- the mobile communications device 450 can be a smart phone, a wearable (i.e., smart watch), tablet computer, laptop computer, or any other suitable device with a screen and that can receive user input.
- the original video feed from the primary computer 416 can be proxied via the mobile commands processing unit 400 , unaltered, and then displayed on the primary display 414 .
- An application running on the CPU 406 of the mobile commands processing unit 400 can actively monitor user connections.
- the display that would conventionally be displayed on the primary display 414 can be displayed on the mobile communications device 450 instead.
- the video display of the primary display 414 can operationally display, for example, the identical content that is displayed on the mobile communication device 450 .
- other content can be provided for display on the primary display 414 , such as an advertisement/marketing materials, or a variety of other suitable communications or messages.
- the primary display 414 may be blank.
- the HID/Bluetooth Module 408 of the mobile commands processing unit 400 can translate such interaction to the equivalent HID (human interface device) commands.
- the HID/Bluetooth Module 408 can deliver the commands to the primary computer 416 via USB connection.
- the video feed overlay module 410 can provide a user with visual feedback for each relevant action.
- the mobile communication device 450 can be paired to the user terminal 490 via any suitable technique.
- the CPU 406 can cause a QR code to be displayed on the primary display 414 .
- the user of the user terminal 490 can capture the QR code with a camera of the mobile communications device 450 . Capturing the QR code can initiate a secure Bluetooth connection between the mobile communications device 450 and the HID/Bluetooth module 408 of the mobile commands processing unit 400 .
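The pairing handshake above can be illustrated with a short sketch. The patent does not specify how the QR code encodes the terminal's identity, so the `pair://` payload format, the field names, and the one-time token below are all hypothetical choices; the point is only that the scanned code can carry everything the phone needs to open a secure connection to the correct terminal.

```python
import secrets
from urllib.parse import parse_qs, urlparse


def make_pairing_payload(terminal_id: str, bt_mac: str) -> tuple[str, str]:
    """Build a hypothetical QR payload carrying the terminal's identity,
    its Bluetooth address, and a single-use token the terminal can verify
    when the mobile device connects."""
    token = secrets.token_hex(8)  # one-time secret, remembered by the terminal
    payload = f"pair://{terminal_id}/{bt_mac}?token={token}"
    return payload, token


def parse_pairing_payload(payload: str) -> tuple[str, str, str]:
    """Recover (terminal_id, bt_mac, token) from a scanned payload."""
    parsed = urlparse(payload)
    token = parse_qs(parsed.query)["token"][0]
    return parsed.netloc, parsed.path.lstrip("/"), token
```

In this sketch the terminal would regenerate the QR code (and token) per session, so a photographed code could not be replayed later.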
- the images presented on the display screen of the mobile communications device 450 can be replicas of the images that would be presented on the primary display 414 . In some embodiments, however, the images presented on the display screen of the mobile communications device 450 can be modified, customized, or otherwise adjusted for proper display on the mobile communications device 450 .
- FIG. 5 depicts another example mobile commands processing unit 500 in accordance with one non-limiting embodiment. Similar to FIG. 4 , the mobile commands processing unit 500 can be incorporated into an existing user terminal 590 in order to provide non-contact inputs from a user.
- the user terminal can have an existing power supply 512 , a primary display 514 , and a primary computer 516 .
- the mobile commands processing unit 500 can be similar to the mobile commands processing unit 400 ( FIG. 4 ), having a CPU 506 , and an HID module 508 .
- the mobile commands processing unit 500 can also include a video input module 510 , a VGA to HDMI converter 511 , and a HDMI to VGA converter 509 , although this disclosure is not so limited.
- This embodiment of the gesture processing unit 300 , however, does not utilize a tensor processing unit (TPU) for gesture recognition.
- the functionality provided by the mobile commands processing unit 500 can be similar to the functionality described above with regard to mobile commands processing unit 400 .
- a user can approach the user terminal 590 and open a specific mobile application on their mobile communications device 550 .
- the mobile application can find the user terminal 590 via Bluetooth connection, for example, and attempt to connect. If the connection is successful, in accordance with some embodiments, the application executing on the mobile communications device 550 can provide the user with a choice of mirror mode or touchpad mode.
- the mobile application can display, for example, a viewfinder box on the screen of the mobile communications device 550 .
- the viewfinder box can have the same aspect ratio as the primary display 514 of the user terminal 590 , for example.
- the user can point a rear-facing camera of mobile communications device 550 towards the primary display 514 , such that the camera feed is displayed in the viewfinder box.
- the user can then interact with the display of the mobile communications device 550 (by touching a location in the viewfinder box that is displaying an interactive element from the primary display, for example).
- These interactions with the touchscreen of the mobile communications device 550 can be sent to the mobile commands processing unit 500 and can be translated to coordinates on the primary display 514 .
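The coordinate translation described above reduces to a uniform scale, since the viewfinder box and the primary display share the same aspect ratio. A minimal sketch (function and parameter names are illustrative; the patent does not give the mapping explicitly):

```python
def viewfinder_to_display(tx: int, ty: int,
                          box_w: int, box_h: int,
                          disp_w: int, disp_h: int) -> tuple[int, int]:
    """Map a touch at (tx, ty) inside a box_w x box_h viewfinder box
    to pixel coordinates on a disp_w x disp_h primary display.
    Assumes matching aspect ratios, so one scale factor per axis."""
    x = round(tx * disp_w / box_w)
    y = round(ty * disp_h / box_h)
    # Clamp so a touch on the very edge of the box never maps off-screen.
    return min(max(x, 0), disp_w - 1), min(max(y, 0), disp_h - 1)
```

For example, a touch at the center of a 320x180 viewfinder box maps to the center of a 1920x1080 display.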
- the HID module 508 can send corresponding commands to the primary computer 516 .
- the mobile application can present a virtual track pad panel on the touchscreen on the mobile communications device 550 , similar in functionality to a touchpad on a laptop computer, for example.
- the mobile application of the mobile communications device 550 can process user interaction (taps, moves and swipes) with the virtual track pad panel and send them via the Bluetooth connection to the HID module 508 , for example.
- the HID module 508 can then send the actions to the primary computer 516 of the user terminal 590 via standard HID protocol.
- the HID module 508 can also send the actions to the CPU 506 via I2C connection, for example.
- Referring now to FIG. 6 , an example embodiment of contactless control of a user terminal 690 is depicted.
- the user can interact with their mobile communication device 650 to control a mouse or pointer on a screen of the user terminal 690 , similar to the “touchpad mode” described above, for example.
- the touchscreen of their mobile communication device 650 can serve as a track pad such that, as a user moves their finger along the touchscreen of their mobile communication device 650 , the pointer on the screen of the user terminal 690 moves in parallel.
- when the user taps or double-taps the touchscreen of their mobile communication device 650 , the tap registers as a "touch," "click," or other suitable action on the user terminal 690 .
- a microcontroller module 606 can include a USB HID/Bluetooth and/or WiFi Module.
- the user can connect their mobile communication device 650 to the microcontroller module 606 via suitable communication protocol and connection process.
- An application running on the microcontroller module 606 can actively monitor users. Users can use the connected mobile communication device 650 as a touchpad, and the microcontroller module 606 can translate such interaction to the equivalent HID (human interface device) commands and deliver the HID commands to the primary computer of the user terminal 690 via USB connection.
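The translation from touchpad interaction to HID commands can be sketched as follows. The patent does not disclose the microcontroller module's report descriptor, so this assumes the common 3-byte boot-protocol mouse report (button bitmap, then signed X and Y deltas), which is one plausible way such a module could present itself to the primary computer:

```python
import struct


def _clamp(v: int) -> int:
    # Boot-protocol deltas are signed bytes; large swipes must be split
    # across multiple reports rather than overflow a single one.
    return max(-127, min(127, v))


def hid_mouse_report(dx: int, dy: int, pressed: bool = False) -> bytes:
    """Pack one touchpad move/press into a 3-byte boot-protocol HID
    mouse report: buttons, signed delta-X, signed delta-Y."""
    buttons = 0x01 if pressed else 0x00  # bit 0 = left button
    return struct.pack("<Bbb", buttons, _clamp(dx), _clamp(dy))


def tap_reports() -> list[bytes]:
    """A tap translates to a button press followed by a release,
    with no motion in either report."""
    return [hid_mouse_report(0, 0, True), hid_mouse_report(0, 0, False)]
```

Because the reports use the standard HID mouse layout, the primary computer would treat the module like any ordinary USB mouse, needing no special drivers.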
- the mobile commands processing unit 700 can be a retrofit kit that can establish a bi-directional connection between a mobile communications device 750 and a user terminal 790 via a websocket protocol.
- a dedicated, secure bi-directional connection between the mobile communications device 750 and the user terminal 790 can be established.
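The webserver's role in this dedicated bi-directional connection can be sketched with an in-memory stand-in: one session id keys a pair of message queues, one per direction. This is only a sketch of the relay logic under that assumption (a real deployment would hold open websocket connections rather than queues, and the class and method names here are invented for illustration):

```python
import queue


class SessionRelay:
    """Minimal in-memory sketch of the relay: pair a terminal and a
    phone under one session id and shuttle messages both ways."""

    def __init__(self) -> None:
        self.sessions: dict[str, tuple[queue.Queue, queue.Queue]] = {}

    def open_session(self, session_id: str) -> None:
        # One queue per direction: phone->terminal and terminal->phone.
        self.sessions[session_id] = (queue.Queue(), queue.Queue())

    def send(self, session_id: str, from_phone: bool, message) -> None:
        to_terminal, to_phone = self.sessions[session_id]
        (to_terminal if from_phone else to_phone).put(message)

    def receive(self, session_id: str, for_terminal: bool):
        to_terminal, to_phone = self.sessions[session_id]
        return (to_terminal if for_terminal else to_phone).get_nowait()
```

The phone pushes taps and swipes toward the terminal, while the terminal pushes screen frames toward the phone, each side reading only its own queue.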
- the user could either see a broadcast version of the screen of the user terminal 790 (similar to FIG. 4 , for example) or the user could instead simply use their mobile communications device 750 as a touchpad to interact with the screen of user terminal 790 (similar to FIG. 6 , for example).
- the user terminal 790 can be similar to user terminals 190 , 490 , and 690 having a power supply 712 , a primary display 714 , and a primary computer 716 .
- the mobile commands processing unit 700 can be similar to the mobile commands processing unit 400 of FIG. 4 , having a CPU 706 , video overlay module 710 , and an HID module 708 .
- an application running on the CPU 706 can display an overlay message directing the user to navigate to a specific website address on their mobile communications device 750 .
- the primary display 714 can also show a unique QR code (or other unique identifier) displayed next to the message. The user can then navigate to the website address on their mobile communications device 750 and be prompted to scan the currently displayed QR code from the primary display 714 , or otherwise enter the unique identifier.
- a unique alphanumeric code can be displayed on the primary display 714 of the user terminal 790 that can be typed into the webpage displayed on the mobile communications device 750 .
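A unique alphanumeric code like this can be generated with a few lines; the length and alphabet below are illustrative choices, not ones the patent specifies. Restricting the alphabet to visually unambiguous characters (no 0/O or 1/I/L) makes the code easier for the user to type into the webpage:

```python
import secrets

# Unambiguous alphabet: digits 2-9 plus uppercase letters, excluding
# O, I, and L, which are easily confused with 0 and 1 on small screens.
ALPHABET = "23456789ABCDEFGHJKMNPQRSTUVWXYZ"


def session_code(length: int = 6) -> str:
    """Generate a short, cryptographically random pairing code to show
    on the primary display (length is an illustrative choice)."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))
```

Using the `secrets` module (rather than `random`) matters here, since the code acts as a short-lived shared secret binding the phone to this particular terminal session.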
- a dedicated websocket connection can be established between the mobile communications device 750 and the user terminal 790 through a webserver 760 .
- the primary display 714 of the user terminal 790 can be set to display alternative content, display a black screen, or display a message that a private connection is in session, for example.
- the mobile commands processing unit 700 can utilize a video compression algorithm and only send static images of the screen to the mobile communications device 750 to minimize the latency and reduce the required bandwidth.
- the website presented at the mobile communications device 750 can capture the user's interactions and send them to the web server 760 , which in turn sends them to the mobile commands processing unit 700 .
- the mobile commands processing unit 700 can translate such interactions to equivalent HID commands, which are delivered via the USB port to the user terminal 790 .
- While FIG. 7 depicts the use of the websocket protocol for illustration purposes, this disclosure is not so limited. Instead, any suitable web-based communication protocol, or combination of multiple web-based communication protocols, can be utilized without departing from the scope of the present disclosure.
- some embodiments of the present disclosure can utilize a RTP Control Protocol (RTCP), an HTTP Live Streaming (HLS) protocol, among others.
- some embodiments can utilize, for example, an HLS protocol for video transmission and a websocket protocol for gesture input transmission.
- FIG. 8 depicts another example mobile commands processing unit 800 in accordance with one non-limiting embodiment. Similar to FIG. 7 , the mobile commands processing unit 800 can be incorporated into an existing user terminal 890 in order to provide non-contact inputs from a user.
- the user terminal can have an existing power supply 812 , a primary display 814 , and a primary computer 816 .
- the mobile commands processing unit 800 can be similar to the mobile commands processing unit 700 ( FIG. 7 ), having a CPU 806 , and an HID module 808 .
- the mobile commands processing unit 800 can also include a video input module 810 , a VGA to HDMI converter 811 , and a HDMI to VGA converter 809 , although this disclosure is not so limited.
- a user can scan a QR code with a mobile communications device 850 .
- the QR code can be associated with the user terminal 890 (i.e. via a unique identifier of the user terminal) and direct a browser of the mobile communications device 850 to a website when scanned.
- the website can send a command to the CPU 806 to generate and show the start-session QR code.
- This command can be sent via a message broker service 858 , such as AWS SQS, for example.
- the CPU 806 can cause a “start-session QR code” to be presented on the primary display 814 via video overlay.
- the mobile communications device 850 can scan the “start-session QR code” presented on the primary display 814 using a QR code scanner.
- the QR Code scanner is provided by the website interface.
- the website can then send the command to the CPU 806 to start the session. This command is sent via the message broker service 858 , for example.
- the CPU 806 can remove the start-session QR code from the primary display 814 and start sending frames with original images of the primary display 814 to the website via a websocket connection.
- the frames are jpeg images, which form an mjpeg stream.
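An mjpeg stream of this kind is conventionally carried as a `multipart/x-mixed-replace` response, with each JPEG frame wrapped in its own part. The patent does not specify the wire format, so the framing below (including the boundary name) is a hedged sketch of the usual convention:

```python
def mjpeg_part(jpeg_bytes: bytes, boundary: str = "frame") -> bytes:
    """Wrap one JPEG frame as a single part of a
    multipart/x-mixed-replace (mjpeg) stream. The browser replaces
    the displayed image each time a new part arrives."""
    header = (
        f"--{boundary}\r\n"
        "Content-Type: image/jpeg\r\n"
        f"Content-Length: {len(jpeg_bytes)}\r\n"
        "\r\n"
    ).encode("ascii")
    return header + jpeg_bytes + b"\r\n"
```

Concatenating successive parts (after an initial response header declaring the boundary) yields a stream the phone's browser can render as live video with no client-side code.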
- the website can show these frames to a user of the mobile communications device 850 .
- the website can register the user's taps, moves, or other interactions with the screen and send them to the CPU 806 via the open websocket connection established between the mobile communications device 850 and the user terminal 890 through a webserver 860 .
- the CPU 806 can receive the coordinates of the user interactions from the websocket connection and send them to the USB HID Module 808 via an I2C connection.
- the USB HID module 808 can receive the coordinates of the user interactions and send them to the primary computer 816 via standard HID protocol.
- While FIGS. 1-8 generally depict existing user terminals that can be retrofitted with contactless interaction capabilities, this disclosure is not so limited.
- In FIGS. 9-11 , example embodiments are depicted in which user terminals are manufactured with contactless interaction capabilities similar to the gesture-based and mobile command-based approaches described above.
- a user terminal 990 that includes a built-in gesture processing unit 900 in accordance with one non-limiting embodiment is depicted. Similar to the user terminal 190 of FIG. 1 , a user can provide input to the user terminal 990 via non-contacting gestures.
- the gesture processing unit 900 can be connected to the peripheral modules of user terminal 990 .
- the gesture processing unit 900 can have a gesture recognizer module 906 with a USB output that is to be treated as a peripheral device by the other computers on-board the user terminal 990 .
- the user terminal 990 can also be built with one (or more) depth sensor camera 902 that is connected to the gesture recognizer module 906 .
- the gesture recognizer module 906 can be connected to the primary computer of the user terminal 990 via the USB port such that touch-based devices (touchscreen, keyboard, mouse, etc.) of the user terminal 990 can be replaced with a touchless approach.
- An application running on the gesture processing unit 900 can actively monitor for motion via the connected depth camera 902 .
- the user terminal 990 can be configured to graphically include a cursor on a device screen 970 to aid the user in gesture-based navigation.
- gestures by the user can be translated by the gesture recognizer module 906 to the equivalent HID (human interface device) commands, delivered to the primary computer via USB connection, and displayed on the screen 970 .
- FIG. 9 schematically depicts a camera module 902 having a depth sensor and a gesture recognizer module 906 having a TPU
- this disclosure is not so limited. Similar to embodiments provided above, for example, the camera module 902 does not necessarily need to have a depth sensor, nor is a TPU required. As such, in some example embodiments, the camera module 902 can include a depth sensor and a TPU is not used for gesture processing. In other example embodiments, the camera module 902 does not have a depth sensor and a TPU is used for gesture processing.
- an example user terminal 1090 that includes a built-in HID/Bluetooth module 1080 allowing for user input from a coupled mobile communications device 1050 .
- the user terminal 1090 can be configured to either receive commands from the mobile communications device 1050 when it is used as a trackpad, or the display of the user terminal 1090 can be mirrored to the mobile communications device 1050 via a local network connection.
- the HID/Bluetooth module 1080 can be a microcontroller with Bluetooth or WiFi module that connects as a HID device to the main computer of the user terminal 1090 .
- a user can connect their mobile communication device 1050 to the HID/Bluetooth module 1080 through a Bluetooth-based connection to enable contactless interaction with the user terminal 1090 .
- an example user terminal 1190 that includes a built-in Internet-based mobile commands processing unit allowing for a bi-directional connection between a mobile communications device 1150 and the user terminal 1190 via a websocket protocol.
- As noted above, however, any suitable web-based communication protocol(s) can be utilized.
- the user terminal 1190 can be configured to allow a user to see a broadcast version of the user terminal 1190 display mirrored on the mobile communications device 1150 .
- the user terminal 1190 can be configured to allow a user to use their mobile communications device 1150 as a touchpad for a display screen 1170 of the user terminal 1190 .
- an application running on the user terminal 1190 can display an overlay message directing the user to navigate to a specific website address on their mobile communications device 1150 .
- the display screen 1170 can also show a unique QR code or other unique identifier. The user can then navigate to the website address on their mobile communications device 1150 and be prompted to scan the currently displayed QR code from the display screen 1170 , or otherwise enter the unique identifier.
- a unique alphanumeric code can be displayed on the display screen 1170 of the user terminal 1190 that can be typed into the webpage displayed on the mobile communications device 1150 .
- a dedicated websocket connection can be established between the mobile communications device 1150 and the user terminal 1190 through a webserver 1160 .
- the display screen 1170 of the user terminal 1190 can be set to display alternative content, display a black screen, or display a message that a private connection is in session, for example.
- the user terminal 1190 can utilize a video compression algorithm and only send static images of the screen to the mobile communications device 1150 to minimize the latency and reduce the required bandwidth.
- the website presented on the mobile communications device 1150 can capture the user's interactions and send them to the web server 1160 , which in turn sends them to the user terminal 1190 .
- the user terminal 1190 can translate such interactions to an HID command.
- user terminals in accordance with the present disclosure can include multiple systems for different types of contactless interaction.
- various user terminals may be retrofitted or manufactured with units that allow users to provide input via gestures (similar to FIGS. 1-3, and 9 , for example), or their mobile communications devices (similar to FIGS. 4-8 and 10-11 , for example).
- a user wishing to interact with such user terminal can optionally choose the type of non-contacting user interface they prefer to use.
- the processes described herein can be executed by programmable equipment, such as computers or computer systems and/or processors.
- Software that can cause programmable equipment to execute processes can be stored in any storage device, such as, for example, a computer system (nonvolatile) memory, an optical disk, magnetic tape, or magnetic disk.
- at least some of the processes can be programmed when the computer system is manufactured or stored on various types of computer-readable media.
- a computer-readable medium can include, for example, memory devices such as optical disk drives or hard disk drives.
- a computer-readable medium can also include memory storage that is physical, virtual, permanent, temporary, semi-permanent, and/or semi-temporary.
- a “computer,” “computer system,” “host,” “server,” or “processor” can be, for example and without limitation, a processor, microcomputer, minicomputer, server, mainframe, laptop, personal data assistant (PDA), wireless e-mail device, cellular phone, pager, processor, fax machine, scanner, or any other programmable device configured to transmit and/or receive data over a network.
- Computer systems and computer-based devices disclosed herein can include memory for storing certain software modules used in obtaining, processing, and communicating information. It can be appreciated that such memory can be internal or external with respect to operation of the disclosed embodiments.
- a single component can be replaced by multiple components and multiple components can be replaced by a single component to perform a given function or functions. Except where such substitution would not be operative, such substitution is within the intended scope of the embodiments.
- the computer systems can comprise one or more processors in communication with memory (e.g., RAM or ROM) via one or more data buses.
- the data buses can carry electrical signals between the processor(s) and the memory.
- the processor and the memory can comprise electrical circuits that conduct electrical current. Charge states of various components of the circuits, such as solid state transistors of the processor(s) and/or memory circuit(s), can change during operation of the circuits.
Abstract
Description
- This application claims priority to U.S. patent application Ser. No. 63/022,774, filed May 11, 2020, entitled SYSTEMS AND METHODS FOR NON-CONTACTING INTERACTION WITH TOUCH SCREEN INTERFACES, the disclosure of which is incorporated herein by reference in its entirety.
- Touch screen interfaces are becoming ubiquitous as they are increasingly incorporated into a wide variety of devices. Example devices that include touch screen interfaces are ATMs, kiosks, elevator touch panels, point of sale terminals, airport check-in kiosks, supermarket self-check-out terminals, vending machines, gas pumps, casino games, in-restaurant ordering devices, ticket dispensers, employee kiosks, health assessment kiosks, wayfinding and general information kiosks, and paid parking machines, among numerous others. A vast majority of such devices are in public places and are routinely touched by large volumes of people. Users are becoming increasingly concerned with the possible exchange of bacteria, germs, viruses, and other contagions through physical interaction with commonly touched surfaces, including touch screen interface surfaces. Touch screen interfaces are rarely, if ever, disinfected, sanitized, or even cleaned. While other user input technologies, such as voice control, may allow at least some degree of contactless interaction with certain devices, these technologies are often not practical to implement nor do they provide an acceptable level of usability.
- The present disclosure will be more readily understood from a detailed description of some example embodiments taken in conjunction with the following figures:
- FIG. 1 depicts an example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 2 depicts another example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 3 depicts yet another example gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 4 depicts an example mobile commands processing unit in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal via a mobile communications device.
- FIG. 5 depicts another mobile commands processing unit in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal via a mobile communications device.
- FIG. 6 depicts another example embodiment of contactless control of a user terminal in accordance with one non-limiting embodiment.
- FIG. 7 depicts an example Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- FIG. 8 depicts another example Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- FIG. 9 depicts a user terminal that includes a built-in gesture processing unit in accordance with one non-limiting embodiment.
- FIG. 10 depicts a user terminal that includes a built-in HID/Bluetooth module allowing for user input from a coupled mobile communications device in accordance with one non-limiting embodiment.
- FIG. 11 depicts an example user terminal that includes a built-in Internet-based mobile commands processing unit in accordance with one non-limiting embodiment.
- Various non-limiting embodiments of the present disclosure will now be described to provide an overall understanding of the principles of the structure, function, and use of the systems and methods as disclosed herein. One or more examples of these non-limiting embodiments are illustrated in the accompanying drawings. Those of ordinary skill in the art will understand that systems and methods specifically described herein and illustrated in the accompanying drawings are non-limiting embodiments. The features illustrated or described in connection with one non-limiting embodiment may be combined with the features of other non-limiting embodiments. Such modifications and variations are intended to be included within the scope of the present disclosure.
- Reference throughout the specification to “various embodiments,” “some embodiments,” “one embodiment,” “some example embodiments,” “one example embodiment,” or “an embodiment” means that a particular feature, structure, or characteristic described in connection with any embodiment is included in at least one embodiment. Thus, appearances of the phrases “in various embodiments,” “in some embodiments,” “in one embodiment,” “some example embodiments,” “one example embodiment,” or “in an embodiment” in places throughout the specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures or characteristics may be combined in any suitable manner in one or more embodiments.
- Throughout this disclosure, references to components or modules generally refer to items that logically can be grouped together to perform a function or group of related functions. Like reference numerals are generally intended to refer to the same or similar components. Components and modules can be implemented in software, hardware, or a combination of software and hardware. The term software is used expansively to include not only executable code, but also data structures, data stores, and computing instructions in any electronic format, firmware, and embedded software. The terms information and data are used expansively and can include a wide variety of electronic information, including but not limited to machine-executable or machine-interpretable instructions; content such as text, video data, and audio data, among others; and various codes or flags. The terms information, data, and content are sometimes used interchangeably when permitted by context.
- The examples discussed herein are examples only and are provided to assist in the explanation of the systems and methods described herein. None of the features or components shown in the drawings or discussed below should be taken as mandatory for any specific implementation of any of these systems and methods unless specifically designated as mandatory. For ease of reading and clarity, certain components, modules, or methods may be described solely in connection with a specific figure. Any failure to specifically describe a combination or sub-combination of components should not be understood as an indication that any combination or sub-combination is not possible. Also, for any methods described, regardless of whether the method is described in conjunction with a flow diagram, it should be understood that unless otherwise specified or required by context, any explicit or implicit ordering of steps performed in the execution of a method does not imply that those steps must be performed in the order presented but instead may be performed in a different order or in parallel.
- In accordance with various embodiments, the presently disclosed systems and methods generally enable users to interact with user terminals having a touch screen interface without requiring the user to physically contact a surface of the touch screen interface. As used herein, “user terminal” broadly refers to any system, device, or component that incorporates a touch screen interface as a means for user input. Examples of user terminals include, but are not limited to, ATMs, kiosks, casino games, elevator touch panels, point of sale (POS) terminals, gas pumps, airport check-in kiosks, supermarket self-check-out terminals, vending machines, in-restaurant ordering devices, ticket dispensers, employee kiosks, health assessment kiosks, and wayfinding and general information kiosks. Accordingly, user terminals in accordance with the present disclosure can have a wide variety of form factors and utilize a variety of different types of touch screen interfaces and/or other forms of user input interfaces requiring physical contact by a user.
- As described in more detail below, the presently disclosed systems and methods allow a user to interact with a user terminal through non-contacting interactions. In some embodiments, for example, a user can have non-contacting interactions with a touch screen interface of a user terminal through a series of hand gestures. In other embodiments, a user can have non-contacting interactions with a touch screen interface of a user terminal through contacting interactions with the user's mobile communication device. The user's interactions with their mobile communications device can be relayed to the user terminal through network communications. Furthermore, in some embodiments, such contactless interaction functionality can be provided as a retrofit kit or add-on system that is incorporated into an existing user terminal, although this disclosure is not so limited.
- Referring now to
FIG. 1 , an examplegesture processing unit 100 is depicted in accordance with one non-limiting embodiment. Thegesture processing unit 100 can be incorporated into anexisting user terminal 190 in order to provide non-contact inputs from a user. In some embodiments, for example, thegesture processing unit 100 is positioned internal to a housing, case, or cabinet of theuser terminal 190. The user terminal can have an existingpower supply 112, aprimary display 114, and aprimary computer 116. Theprimary display 114 can incorporate a touch screen interface that is used to track a user's physical interaction with the surface of the interface. Additionally or alternatively, theuser terminal 190 may have other forms of input devices, such as a numeric keypad, a mouse, a stylus, a keyboard, push buttons, and so forth. Thegesture processing unit 100 can allow a user to provide inputs to theuser terminal 190 through non-contacting hand gestures and without requiring physical contact with any surfaces of theuser terminal 190. Such hand gestures can be executed within the field of view of acamera 102 that is mounted to theuser terminal 190. In some embodiments, thegesture processing unit 100 can utilize a video feed from an existingcamera 102 that was installed in theuser terminal 190 at the time of manufacture. In other embodiments, theuser terminal 190 may not include acamera 102, or the in-situ camera is not suitable for the needs of thegesture processing unit 100. In such embodiments acamera 102 can be mounted to theuser terminal 190 at the time thegesture processing unit 100 is installed. Thecamera 102 can be a depth sensor camera as depicted, or any other any suitable type of camera. Furthermore, in some embodiments, theuser terminal 190 may havemultiple cameras 102. - The
gesture processing unit 100 can be connected to the peripheral modules of theuser terminal 190. In accordance with various embodiments, thegesture processing unit 100 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example. When installed in theuser terminal 190, thegesture processing unit 100 can be treated as a peripheral device by the other computers of theuser terminal 190. - The
gesture processing unit 100 can be connected to theprimary computer 116 of theuser terminal 190 via USB port to provide users with a touchless approach for interacting with theuser terminal 190. As such, a user can interact with theuser terminal 190 without being required to physically touch an input device of the user terminal (such as a keyboard, a mouse, a touchscreen, a keypad, and so forth). Additionally, a video stream can be routed from theprimary computer 116 to thegesture processing unit 100 by connecting HDMI/VGA output from theprimary computer 116 to an input port in thegesture processing unit 100. Theprimary display 114 of theuser terminal 190 can be connected to the output HDMI/VGA port on thegesture processing unit 100. - In an idle state, the original video feed from the
primary computer 116 can be proxied via the gesture processing unit 100, unaltered, and then displayed on the primary display 114. An application executed by a CPU 106 of the gesture processing unit 100 can actively monitor motion via the connected depth camera 102. In some embodiments, a gesture recognizer module 104 using a tensor processing unit (TPU) is utilized, although this disclosure is not so limited. Upon the identification of an "initiate" gesture by a user of the user terminal 190, for example, the CPU 106 can begin to overlay the original video with a semi-transparent image of a cursor by a video overlay module 110 to aid the user in navigation. - During a user's contactless interaction with the
user terminal 190, relevant gestures can be identified and translated to the equivalent human interface device (HID) commands by an HID module 108 of the gesture processing unit 100. The commands can be delivered to the primary computer 116 via USB connection (or other suitable communications), while the video feed overlay module 110 can provide a user of the user terminal 190 with visual feedback for each relevant action on the primary display 114. By way of non-limiting examples, gestures of the user can cause the cursor to point to various interactive elements on the screen that would typically be physically touched by a user. When the cursor is pointing to a desired interactive element, the user can execute a gesture for selecting or activating the interactive element. - Referring now to
FIG. 2 , another example of a gesture processing unit 200 is depicted in accordance with one non-limiting embodiment. Similar to FIG. 1 , the gesture processing unit 200 can be incorporated into an existing user terminal 290 in order to provide non-contact inputs from a user. The user terminal can have an existing power supply 212, a primary display 214, and a primary computer 216. The primary display 214 can incorporate a touch screen interface that is used to track a user's physical interaction with the surface of the interface. Additionally or alternatively, the user terminal 290 may have other forms of input devices, such as a numeric keypad, a mouse, a stylus, a keyboard, push buttons, and so forth. The gesture processing unit 200 can allow a user to provide inputs to the user terminal 290 through non-contacting hand gestures and without requiring physical contact with any surfaces of the user terminal 290. Such hand gestures can be executed within the field of view of a camera 202 that is mounted to the user terminal 290. The camera 202 can be a depth sensor camera as depicted, or any other suitable type of camera. Furthermore, in some embodiments, multiple cameras 202 can be utilized. - The
gesture processing unit 200 can be connected to the peripheral modules of the user terminal 290. In accordance with various embodiments, the gesture processing unit 200 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example. When installed in the user terminal 290, the gesture processing unit 200 can be treated as a peripheral device by the other computers of the user terminal 290. - Similar to the
gesture processing unit 100 of FIG. 1 , the gesture processing unit 200 can be connected to the primary computer 216 of the user terminal 290 via USB port to provide users with a touchless approach for interacting with the user terminal 290. Additionally, a video stream can be routed from the primary computer 216 to the gesture processing unit 200 by feeding the video output from the primary computer 216 to an input port in the gesture processing unit 200. In some embodiments, a VGA to HDMI converter 211 can be used to convert the source video. As shown, a video input module 210 can be used to compress the video and provide the compressed video to the CPU 206 via a camera serial interface (CSI) bus, for example. The primary display 214 of the user terminal 290 can be connected to the output HDMI/VGA port on the gesture processing unit 200. As shown, an HDMI to VGA converter 209 can be used to output video prior to feeding it to the primary display 214. - In an idle state, the original video feed from the
primary computer 216 can be proxied via the gesture processing unit 200, unaltered, and then displayed on the primary display 214. An application executed by the CPU 206 of the gesture processing unit 200 can actively monitor motion via the connected camera 202. In some embodiments, a gesture recognizer module 204 using a tensor processing unit (TPU) is utilized, although this disclosure is not so limited. Upon identifying an "initiate" gesture by a user of the user terminal 290, for example, the CPU 206 can begin to overlay the original video with a semi-transparent image of a cursor by a video overlay module 210 to aid the user in navigation. - During a user's contactless interaction with the
user terminal 290, relevant gestures can be identified and translated to the equivalent human interface device (HID) commands by an HID module 208 of the gesture processing unit 200. The commands can be delivered to the primary computer 216 via USB connection (or other suitable communications). The CPU 206 can also modify the video feed such that visual feedback is provided for each relevant action on the primary display 214. By way of non-limiting examples, gestures of the user can cause the cursor to point to various interactive elements on the screen that would typically be physically touched by a user. When the cursor is pointing to a desired interactive element, the user can execute a gesture for selecting or activating the interactive element. -
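The gesture-to-HID translation step described above can be sketched as a simple lookup. This is an illustrative stand-in rather than the actual implementation of the HID module: the gesture names and command tuples are assumptions introduced here for clarity.

```python
# Hypothetical mapping from recognized gestures to HID-style commands.
# The gesture vocabulary ("point", "select", "initiate") and the command
# tuples are illustrative assumptions, not a documented protocol.

GESTURE_TO_HID = {
    "point":    lambda dx, dy: ("MOUSE_MOVE", dx, dy),
    "select":   lambda dx, dy: ("MOUSE_CLICK", "left"),
    "initiate": lambda dx, dy: ("SESSION_START",),
}

def translate_gesture(gesture, dx=0, dy=0):
    """Translate a recognized gesture into an equivalent HID command tuple."""
    try:
        return GESTURE_TO_HID[gesture](dx, dy)
    except KeyError:
        return None  # unrecognized gestures produce no command
```

In a design like this, the recognizer only emits symbolic gesture labels, and the HID module stays a thin, testable translation layer in front of the USB connection.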
FIG. 3 depicts another example of a gesture processing unit 300 in accordance with one non-limiting embodiment. Similar to FIG. 2 , the gesture processing unit 300 can be incorporated into an existing user terminal 390 in order to provide non-contact inputs from a user. The user terminal can have an existing power supply 312, a primary display 314, and a primary computer 316. The user terminal 390 can also have a camera 302. The gesture processing unit 300 can be similar to the gesture processing unit 200 (FIG. 2 ), having a CPU 306 and an HID module 308. In some embodiments, the gesture processing unit 300 can also include a video input module 310, a VGA to HDMI converter 311, and an HDMI to VGA converter 309, although this disclosure is not so limited. This example embodiment of the gesture processing unit 300 does not necessarily utilize a tensor processing unit (TPU) for gesture recognition. In other embodiments, however, the CPU 306 can include a TPU without departing from the scope of the present disclosure. As shown in FIG. 3 , the camera signal is fed to the CPU 306 for processing. In some embodiments, the camera module 302 can include a depth sensor such that the camera signal provided to the CPU 306 includes depth information. Providing depth information in the camera signal can increase the ability of the CPU 306 to provide real-time gesture processing without necessarily needing a TPU. Thus, the functionality provided by the gesture processing unit 300 can be similar to the functionality provided by the gesture processing unit 200, but without the need for a dedicated TPU. - Referring now to
FIG. 4 , a mobile commands processing unit 400 is depicted in accordance with one non-limiting embodiment that enables contactless interaction with a user terminal 490 via a mobile communications device 450. Similar to the gesture processing units described above, the mobile commands processing unit 400 can be integrated into an existing user terminal 490. Generally, the mobile commands processing unit 400 can serve to mirror the images presented on a primary display 414 of the user terminal 490 onto a display screen of the mobile communications device 450 via wireless communications, as schematically depicted in FIG. 4 . The user can interact with their mobile communications device 450 (i.e., via touching the display screen) and such interactions are relayed to the user terminal 490. - The
user terminal 490 can have a power supply 412, a primary display 414, and a primary computer 416, similar to the user terminals described above. Similar to the gesture processing units, the mobile commands processing unit 400 can have a CPU 406 and a video overlay module 410. In accordance with various embodiments, the mobile commands processing unit 400 can have inputs for HDMI/VGA, Bluetooth/WiFi communications, and USB devices, and outputs for HDMI/VGA and USB, for example. When installed in the user terminal 490, the mobile commands processing unit 400 can be treated as a peripheral device by the other computers of the user terminal 490. Furthermore, while the embodiment depicted in FIG. 4 , as well as other embodiments, depicts the use of Bluetooth for the purposes of illustration, it is to be appreciated that any suitable short-range wireless communication protocol and/or technology can be used without departing from the scope of the present disclosure. - The mobile
commands processing unit 400 can be connected to a primary computer 416 of the user terminal 490 via USB port to provide users with a touchless approach for interacting with the user terminal 490. Additionally, a video stream can be routed from the primary computer 416 to the mobile commands processing unit 400 by connecting HDMI/VGA output from the primary computer 416 to an input port of the mobile commands processing unit 400. The primary display 414 of the user terminal 490 can be connected to the output HDMI/VGA port on the mobile commands processing unit 400. In this embodiment, an HID/Bluetooth module 408 is utilized to communicate with a mobile communications device 450. As is to be appreciated, the mobile communications device 450 can be a smart phone, a wearable (e.g., a smart watch), a tablet computer, a laptop computer, or any other suitable device with a screen that can receive user input. - In the idle state, the original video feed from the
primary computer 416 can be proxied via the mobile commands processing unit 400, unaltered, and then displayed on the primary display 414. An application running on the CPU 406 of the mobile commands processing unit 400 can actively monitor user connections. After establishing a secure connection with the mobile communications device 450, the content that would conventionally be displayed on the primary display 414 can be displayed on the mobile communications device 450 instead. Once successfully mirrored, the primary display 414 can operationally display, for example, the identical content that is displayed on the mobile communications device 450. Alternatively, in some embodiments, other content can be provided for display on the primary display 414, such as advertisements/marketing materials, or a variety of other suitable communications or messages. In other embodiments, the primary display 414 may be blank. - Once mirroring is complete, users can use their
mobile communications device 450 as a touchpad, and the HID/Bluetooth module 408 of the mobile commands processing unit 400 can translate such interaction to the equivalent HID (human interface device) commands. The HID/Bluetooth module 408 can deliver the commands to the primary computer 416 via USB connection. In some embodiments, the video feed overlay module 410 can provide a user with visual feedback of each relevant action. - The
mobile communications device 450 can be paired to the user terminal 490 via any suitable technique. In one example embodiment, the CPU 406 can cause a QR code to be displayed on the primary display 414. The user of the user terminal 490 can capture the QR code with a camera of the mobile communications device 450. Capturing the QR code can initiate a secure Bluetooth connection between the mobile communications device 450 and the HID/Bluetooth module 408 of the mobile commands processing unit 400. Furthermore, the images presented on the display screen of the mobile communications device 450 can be replicas of the images that would be presented on the primary display 414. In some embodiments, however, the images presented on the display screen of the mobile communications device 450 can be modified, customized, or otherwise adjusted for proper display on the mobile communications device 450. -
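The QR-based pairing handshake described above can be sketched as follows, under the assumption that the QR code simply carries a one-time session token that the mobile device echoes back to authorize the connection. The class, field names, and payload format are hypothetical.

```python
# Minimal sketch of QR-code pairing: the terminal renders a one-time token
# as a QR code, and pairing succeeds only when the mobile device reports
# that exact token back. The payload format is an assumption.
import secrets

class PairingSession:
    def __init__(self):
        self.token = secrets.token_hex(8)   # shown to the user as a QR code
        self.paired = False

    def qr_payload(self, terminal_id):
        """Payload the terminal would render as a QR code on its display."""
        return f"terminal:{terminal_id};token:{self.token}"

    def try_pair(self, scanned_token):
        """Called when the mobile device reports the token it scanned."""
        if not self.paired and secrets.compare_digest(scanned_token, self.token):
            self.paired = True
            return True
        return False
```

Using `secrets.compare_digest` avoids timing side channels in the token check, and marking the session as paired makes the token single-use.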
FIG. 5 depicts another example mobile commands processing unit 500 in accordance with one non-limiting embodiment. Similar to FIG. 4 , the mobile commands processing unit 500 can be incorporated into an existing user terminal 590 in order to provide non-contact inputs from a user. The user terminal can have an existing power supply 512, a primary display 514, and a primary computer 516. The mobile commands processing unit 500 can be similar to the mobile commands processing unit 400 (FIG. 4 ), having a CPU 506 and an HID module 508. In some embodiments, the mobile commands processing unit 500 can also include a video input module 510, a VGA to HDMI converter 511, and an HDMI to VGA converter 509, although this disclosure is not so limited. This embodiment of the mobile commands processing unit 500 does not utilize, however, a tensor processing unit (TPU) for gesture recognition. - The functionality provided by the mobile
commands processing unit 500 can be similar to the functionality described above with regard to the mobile commands processing unit 400. In one example embodiment, a user can approach the user terminal 590 and open a specific mobile application on their mobile communications device 550. The mobile application can find the user terminal 590 via Bluetooth connection, for example, and attempt to connect. If the connection is successful, in accordance with some embodiments, the application executing on the mobile communications device 550 can provide the user with a choice of mirror mode or touchpad mode. - If the mirror mode is selected, the mobile application can display, for example, a viewfinder box on the screen of the
mobile communications device 550. The viewfinder box can have the same aspect ratio as the primary display 514 of the user terminal 590, for example. The user can point a rear-facing camera of the mobile communications device 550 towards the primary display 514, such that the camera feed is displayed in the viewfinder box. The user can then interact with the display of the mobile communications device 550 (by touching a location in the viewfinder box that is displaying an interactive element from the primary display, for example). These interactions with the touchscreen of the mobile communications device 550 can be sent to the mobile commands processing unit 500 and can be translated to coordinates on the primary display 514. Based on the user's interaction with the touchscreen of their mobile communications device 550, the HID module 508 can send corresponding commands to the primary computer 516. - If the touchpad mode is selected, the mobile application can present a virtual track pad panel on the touchscreen of the
mobile communications device 550, similar in functionality to a touchpad on a laptop computer, for example. The mobile application of the mobile communications device 550 can process user interactions (taps, moves, and swipes) with the virtual track pad panel and send them via the Bluetooth connection to the HID module 508, for example. The HID module 508 can then send the actions to the primary computer 516 of the user terminal 590 via standard HID protocol. The HID module 508 can also send the actions to the CPU 506 via I2C connection, for example. - Referring now to
FIG. 6 , an example embodiment of contactless control of a user terminal 690 is depicted. In this embodiment, the user can interact with their mobile communications device 650 to control a mouse or pointer on a screen of the user terminal 690, similar to the "touchpad mode" described above, for example. More specifically, the touchscreen of their mobile communications device 650 can serve as a track pad such that as a user moves their finger along the touchscreen of their mobile communications device 650, they will see the pointer on the device screen move in parallel. When the user taps or double-taps the touchscreen of their mobile communications device 650, it would register as a "touch" or "click", or other suitable action, on the user terminal 690. - Similar to
FIG. 4 , a microcontroller module 606 can include a USB HID/Bluetooth and/or WiFi module. The user can connect their mobile communications device 650 to the microcontroller module 606 via a suitable communication protocol and connection process. An application running on the microcontroller module 606 can actively monitor users. Users can use the connected mobile communications device 650 as a touchpad, and the microcontroller module 606 can translate such interaction to the equivalent HID (human interface device) commands and deliver the HID commands to the primary computer of the user terminal 690 via USB connection. Thus, a cursor provided by the operating system of the user terminal 690 can be controlled by the user through interaction with the connected mobile communications device 650 without requiring the user to physically contact the user terminal 690. - Referring now to
FIG. 7 , an example Internet-based mobile commands processing unit 700 is depicted in accordance with one non-limiting embodiment. Generally, the mobile commands processing unit 700 can be a retrofit kit that can establish a bi-directional connection between a mobile communications device 750 and a user terminal 790 via a websocket protocol. Thus, instead of using a direct Bluetooth connection or a WiFi connection between the mobile communications device 750 and the user terminal 790, a dedicated, secure bi-directional connection between the mobile communications device 750 and the user terminal 790 can be established. Using this technique for contactless interaction, the user could either see a broadcast version of the screen of the user terminal 790 (similar to FIG. 4 , for example) or the user could instead simply use their mobile communications device 750 as a touchpad to interact with the screen of the user terminal 790 (similar to FIG. 6 , for example). - The
user terminal 790 can be similar to the user terminals described above, having a power supply 712, a primary display 714, and a primary computer 716. The mobile commands processing unit 700 can be similar to the mobile commands processing unit 400 of FIG. 4 , having a CPU 706, a video overlay module 710, and an HID module 708. - When a user approaches the
user terminal 790 in its idle state, an application running on the CPU 706 can display an overlay message directing the user to navigate to a specific website address on their mobile communications device 750. The primary display 714 can also show a unique QR code (or other unique identifier) displayed next to the message. The user can then navigate to the website address on their mobile communications device 750 and be prompted to scan the currently displayed QR code from the primary display 714, or otherwise enter the unique identifier. For example, in some embodiments, a unique alphanumeric code can be displayed on the primary display 714 of the user terminal 790 that can be typed into the webpage displayed on the mobile communications device 750. Once the unique code is entered, a dedicated websocket connection can be established between the mobile communications device 750 and the user terminal 790 through a webserver 760. The primary display 714 of the user terminal 790 can be set to display alternative content, display a black screen, or display a message that a private connection is in session, for example. - In accordance with various embodiments, the mobile
commands processing unit 700 can utilize a video compression algorithm and only send static images of the screen to the mobile communications device 750 to minimize the latency and reduce the required bandwidth. The website presented at the mobile communications device 750 can capture the user's interactions and send them to the web server 760, which in turn sends them to the mobile commands processing unit 700. The mobile commands processing unit 700 can translate such interactions to HID commands, which are delivered via USB port to the user terminal 790. - Although
FIG. 7 , and other embodiments described herein (such as FIG. 11 , below) depict the use of the websocket protocol for illustration purposes, this disclosure is not so limited. Instead, any suitable web-based communication protocol, or combination of multiple web-based communication protocols, can be utilized without departing from the scope of the present disclosure. By way of non-limiting examples, some embodiments of the present disclosure can utilize the RTP Control Protocol (RTCP) or an HTTP Live Streaming (HLS) protocol, among others. Further, some embodiments can utilize, for example, an HLS protocol for video transmission and a websocket protocol for gesture input transmission. -
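The web-based relay described above (browser interactions forwarded through a web server and translated to HID commands) can be illustrated with a small sketch. The JSON event schema and command tuples here are assumptions introduced for illustration, not the actual wire format.

```python
# Hypothetical translation of a browser interaction event, as relayed by a
# web server over a websocket, into an HID-style command for the terminal.
# The {"type", "x", "y"} event schema is an assumption.
import json

def web_event_to_hid(raw):
    """Turn a JSON interaction event from the website into an HID command."""
    event = json.loads(raw)
    if event.get("type") == "tap":
        # A tap maps to an absolute cursor position plus a click.
        return ("MOUSE_ABS", event["x"], event["y"], "click")
    if event.get("type") == "move":
        # A move maps to an absolute cursor position with no button event.
        return ("MOUSE_ABS", event["x"], event["y"], None)
    return None  # unknown event types are ignored
```

Keeping the translation stateless like this means the same function can serve both the mirror-mode coordinates and the touchpad-mode events, with only the upstream producer differing.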
FIG. 8 depicts another example mobile commands processing unit 800 in accordance with one non-limiting embodiment. Similar to FIG. 7 , the mobile commands processing unit 800 can be incorporated into an existing user terminal 890 in order to provide non-contact inputs from a user. The user terminal can have an existing power supply 812, a primary display 814, and a primary computer 816. The mobile commands processing unit 800 can be similar to the mobile commands processing unit 700 (FIG. 7 ), having a CPU 806 and an HID module 808. In some embodiments, the mobile commands processing unit 800 can also include a video input module 810, a VGA to HDMI converter 811, and an HDMI to VGA converter 809, although this disclosure is not so limited. - In accordance with one example implementation, a user can scan a QR Code with a
mobile communications device 850. The QR code can be associated with the user terminal 890 (i.e., via a unique identifier of the user terminal) and direct a browser of the mobile communications device 850 to a website when scanned. The website can send a command to the CPU 806 to generate and show the start-session QR code. This command can be sent via a message broker service 858, such as AWS SQS, for example. The CPU 806 can cause a "start-session QR code" to be presented on the primary display 814 via video overlay. The mobile communications device 850 can scan the "start-session QR code" presented on the primary display 814 using a QR code scanner. In some embodiments, the QR code scanner is provided by the website interface. - The website can then send a command to the CPU 806 to start the session. This command is sent via the
message broker service 858, for example. Once the session is started, the CPU 806 can remove the start-session QR code from the primary display 814 and start sending frames with original images of the primary display 814 to the website via a websocket connection. In some embodiments, the frames are JPEG images, which form an MJPEG stream. The website can show these frames to a user of the mobile communications device 850. The website can register the user's taps, moves, or other interactions with the screen and send them to the CPU 806 via an open websocket connection established between the mobile communications device 850 and the user terminal 890 through a webserver 860. - The CPU 806 can receive the coordinates of the user interactions from the websocket connection and send them to the USB HID
module 808 via an I2C connection. The USB HID module 808 can receive the coordinates of the user interactions and send them to the primary computer 816 via standard HID protocol. - While
FIGS. 1-8 generally depict existing user terminals that can be retrofitted with contactless interaction capabilities, this disclosure is not so limited. Referring now to FIGS. 9-11 , example embodiments are depicted in which user terminals are manufactured with contactless interaction capabilities similar to the gesture-based and mobile command-based approaches described above. - Referring first to
FIG. 9 , a user terminal 990 that includes a built-in gesture processing unit 900 in accordance with one non-limiting embodiment is depicted. Similar to the user terminal 190 of FIG. 1 , a user can provide input to the user terminal 990 via non-contacting gestures. - At the time of manufacture, the
gesture processing unit 900 can be connected to the peripheral modules of the user terminal 990. The gesture processing unit 900 can have a gesture recognizer module 906 with a USB output that is to be treated as a peripheral device by the other computers on-board the user terminal 990. The user terminal 990 can also be built with one (or more) depth sensor cameras 902 connected to the gesture recognizer module 906. The gesture recognizer module 906 can be connected to the primary computer of the user terminal 990 via the USB port such that touch-based devices (touchscreen, keyboard, mouse, etc.) of the user terminal 990 can be replaced with a touchless approach. - An application running on the
gesture processing unit 900 can actively monitor for motion via the connected depth camera 902. Upon identifying an "initiate" gesture, the user terminal 990 can be configured to graphically include a cursor on a device screen 970 to aid the user in gesture-based navigation. As described above, gestures by the user can be translated by the gesture recognizer module 906 to the equivalent HID (human interface device) commands, delivered to the primary computer via USB connection, and displayed on the screen 970. - Furthermore, while
FIG. 9 schematically depicts a camera module 902 having a depth sensor and a gesture recognizer module 906 having a TPU, this disclosure is not so limited. Similar to embodiments provided above, for example, the camera module 902 does not necessarily need to have a depth sensor, nor is a TPU required. As such, in some example embodiments, the camera module 902 can include a depth sensor and a TPU is not used for gesture processing. In other example embodiments, the camera module 902 does not have a depth sensor and a TPU is used for gesture processing. - Referring now to
FIG. 10 , an example user terminal 1090 is depicted that includes a built-in HID/Bluetooth module 1080 allowing for user input from a coupled mobile communications device 1050. Similar to the embodiments described above, the user terminal 1090 can be configured to either receive commands from the mobile communications device 1050 when it is used as a trackpad, or the display of the user terminal 1090 can be mirrored to the mobile communications device 1050 via a local network connection. In particular, the HID/Bluetooth module 1080 can be a microcontroller with a Bluetooth or WiFi module that connects as an HID device to the main computer of the user terminal 1090. A user can connect their mobile communications device 1050 to the HID/Bluetooth module 1080 through a Bluetooth-based connection to enable contactless interaction with the user terminal 1090. - Referring now to
FIG. 11 , an example user terminal 1190 is depicted that includes a built-in Internet-based mobile commands processing unit allowing for a bi-directional connection between a mobile communications device 1150 and the user terminal 1190 via a websocket protocol. As noted above, however, any suitable web-based communication protocol(s) can be utilized. The user terminal 1190 can be configured to allow a user to see a broadcast version of the user terminal 1190 display mirrored on the mobile communications device 1150. Alternatively, the user terminal 1190 can be configured to allow a user to use their mobile communications device 1150 as a touchpad for a display screen 1170 of the user terminal 1190. - When a user approaches the
user terminal 1190 in its idle state, an application running on the user terminal 1190 can display an overlay message directing the user to navigate to a specific website address on their mobile communications device 1150. The display screen 1170 can also show a unique QR code or other unique identifier. The user can then navigate to the website address on their mobile communications device 1150 and be prompted to scan the currently displayed QR code from the display screen 1170, or otherwise enter the unique identifier. For example, in some embodiments, a unique alphanumeric code can be displayed on the display screen 1170 of the user terminal 1190 that can be typed into the webpage displayed on the mobile communications device 1150. Once the unique code is entered, a dedicated websocket connection can be established between the mobile communications device 1150 and the user terminal 1190 through a webserver 1160. The display screen 1170 of the user terminal 1190 can be set to display alternative content, display a black screen, or display a message that a private connection is in session, for example. - In accordance with various embodiments, the
user terminal 1190 can utilize a video compression algorithm and only send static images of the screen to the mobile communications device 1150 to minimize the latency and reduce the required bandwidth. The website presented on the mobile communications device 1150 can capture the user's interactions and send them to the web server 1160, which in turn sends them to the user terminal 1190. The user terminal 1190 can translate such interactions to an HID command. - It is noted that user terminals in accordance with the present disclosure can include multiple systems for different types of contactless interaction. By way of example, various user terminals may be retrofitted or manufactured with units that allow users to provide input via gestures (similar to
FIGS. 1-3 and 9 , for example), or via their mobile communications devices (similar to FIGS. 4-8 and 10-11 , for example). As such, a user wishing to interact with such a user terminal can optionally choose the type of non-contacting user interface they prefer to use. - The processes described herein can be executed by programmable equipment, such as computers or computer systems and/or processors. Software that can cause programmable equipment to execute processes can be stored in any storage device, such as, for example, a computer system (nonvolatile) memory, an optical disk, magnetic tape, or magnetic disk. Furthermore, at least some of the processes can be programmed when the computer system is manufactured or stored on various types of computer-readable media.
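A terminal that supports multiple contactless input paths, as described above, can be thought of as dispatching events from several sources into one HID command stream. The following sketch illustrates that idea; the source names, translator callables, and command tuples are all hypothetical.

```python
# Sketch of a terminal accepting HID commands from multiple contactless
# input sources (e.g., a gesture unit, a Bluetooth touchpad, a web session).
# Each source registers its own event-to-command translator; the terminal
# appends resulting commands to a log standing in for USB HID delivery.

class ContactlessTerminal:
    def __init__(self):
        self.sources = {}
        self.log = []        # stand-in for commands delivered over USB HID

    def register(self, name, translate):
        """Register a source with a translator: raw event -> command or None."""
        self.sources[name] = translate

    def handle(self, name, event):
        """Translate an event from the named source and dispatch the command."""
        command = self.sources[name](event)
        if command is not None:
            self.log.append(command)
        return command

terminal = ContactlessTerminal()
terminal.register("gesture", lambda e: ("CLICK",) if e == "select" else None)
terminal.register("touchpad", lambda e: ("MOVE", e[0], e[1]))
```

Because each source produces the same command tuples, the primary computer sees a single uniform HID stream regardless of which contactless interface the user chose.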
- It can also be appreciated that certain portions of the processes described herein can be performed using instructions stored on a computer-readable medium or media that directs a computer system to perform the process steps. A computer-readable medium can include, for example, memory devices such as optical disk drives or hard disk drives. A computer-readable medium can also include memory storage that is physical, virtual, permanent, temporary, semi-permanent, and/or semi-temporary.
- A “computer,” “computer system,” “host,” “server,” or “processor” can be, for example and without limitation, a processor, microcomputer, minicomputer, server, mainframe, laptop, personal data assistant (PDA), wireless e-mail device, cellular phone, pager, processor, fax machine, scanner, or any other programmable device configured to transmit and/or receive data over a network. Computer systems and computer-based devices disclosed herein can include memory for storing certain software modules used in obtaining, processing, and communicating information. It can be appreciated that such memory can be internal or external with respect to operation of the disclosed embodiments.
- In various embodiments disclosed herein, a single component can be replaced by multiple components and multiple components can be replaced by a single component to perform a given function or functions. Except where such substitution would not be operative, such substitution is within the intended scope of the embodiments. The computer systems can comprise one or more processors in communication with memory (e.g., RAM or ROM) via one or more data buses. The data buses can carry electrical signals between the processor(s) and the memory. The processor and the memory can comprise electrical circuits that conduct electrical current. Charge states of various components of the circuits, such as solid state transistors of the processor(s) and/or memory circuit(s), can change during operation of the circuits.
- The foregoing description of embodiments and examples has been presented for purposes of illustration and description. It is not intended to be exhaustive or to limit the embodiments to the forms described. Numerous modifications are possible in light of the above teachings. Some of those modifications have been discussed, and others will be understood by those skilled in the art. The embodiments were chosen and described in order to best illustrate the principles of various embodiments as suited to the particular uses contemplated. The scope is, of course, not limited to the examples set forth herein, but can be employed in any number of applications and equivalent devices by those of ordinary skill in the art.
Claims (22)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/007,278 US20210349593A1 (en) | 2020-05-11 | 2020-08-31 | Systems and methods for non-contacting interaction with user terminals |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063022774P | 2020-05-11 | 2020-05-11 | |
US17/007,278 US20210349593A1 (en) | 2020-05-11 | 2020-08-31 | Systems and methods for non-contacting interaction with user terminals |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210349593A1 true US20210349593A1 (en) | 2021-11-11 |
Family
ID=78412689
Family Applications (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/007,262 Active US11194468B2 (en) | 2020-05-11 | 2020-08-31 | Systems and methods for non-contacting interaction with user terminals |
US17/007,278 Abandoned US20210349593A1 (en) | 2020-05-11 | 2020-08-31 | Systems and methods for non-contacting interaction with user terminals |
US17/517,752 Active US11409433B2 (en) | 2020-05-11 | 2021-11-03 | Systems and methods for non-contacting interaction with user terminals |
US17/855,962 Active US11693557B2 (en) | 2020-05-11 | 2022-07-01 | Systems and methods for non-contacting interaction with user terminals |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/007,262 Active US11194468B2 (en) | 2020-05-11 | 2020-08-31 | Systems and methods for non-contacting interaction with user terminals |
Family Applications After (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/517,752 Active US11409433B2 (en) | 2020-05-11 | 2021-11-03 | Systems and methods for non-contacting interaction with user terminals |
US17/855,962 Active US11693557B2 (en) | 2020-05-11 | 2022-07-01 | Systems and methods for non-contacting interaction with user terminals |
Country Status (1)
Country | Link |
---|---|
US (4) | US11194468B2 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11194468B2 (en) | 2020-05-11 | 2021-12-07 | Aron Ezra | Systems and methods for non-contacting interaction with user terminals |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050104852A1 (en) * | 2003-11-18 | 2005-05-19 | Emerson Theodore F. | Generating pointer position data from position data of a pointing device of a remote console |
US20110246904A1 (en) * | 2010-04-01 | 2011-10-06 | Gus Pinto | Interacting with Remote Applications Displayed Within a Virtual Desktop of a Tablet Computing Device |
US20140006451A1 (en) * | 2012-06-30 | 2014-01-02 | At&T Mobility Ii Llc | Remote-Controllable Kiosk |
US9098846B2 (en) * | 2011-03-04 | 2015-08-04 | Bank Of America Corporation | Contactless automated teller machine |
US20160216774A1 (en) * | 2015-01-27 | 2016-07-28 | I/O Interconnect Inc. | Method for Generating a Cursor on an External Monitor Connected to a Handheld Computer |
US20160364710A1 (en) * | 2014-02-11 | 2016-12-15 | Bixolon Co., Ltd. | Pos peripherals controlling device, mobile host device, and control method thereof |
US20200184531A1 (en) * | 2018-12-11 | 2020-06-11 | T-Mobile Usa, Inc. | Mirrored display and proximal control of autonomous retail systems |
US20210200501A1 (en) * | 2019-12-27 | 2021-07-01 | Abalta Technologies, Inc. | Projection, control, and management of user device applications using a connected resource |
US20210392054A1 (en) * | 2020-06-11 | 2021-12-16 | Darren David | Remote controlled information display systems |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7036094B1 (en) | 1998-08-10 | 2006-04-25 | Cybernet Systems Corporation | Behavior recognition system |
US6950534B2 (en) | 1998-08-10 | 2005-09-27 | Cybernet Systems Corporation | Gesture-controlled interfaces for self-service machines and other applications |
EP1596271A1 (en) | 2004-05-11 | 2005-11-16 | Hitachi Europe S.r.l. | Method for displaying information and information display system |
US8180396B2 (en) | 2007-10-18 | 2012-05-15 | Yahoo! Inc. | User augmented reality for camera-enabled mobile devices |
US9740293B2 (en) | 2009-04-02 | 2017-08-22 | Oblong Industries, Inc. | Operating environment with gestural control and multiple client devices, displays, and users |
US20140365895A1 (en) | 2008-05-13 | 2014-12-11 | Apple Inc. | Device and method for generating user interfaces from a template |
US9311115B2 (en) | 2008-05-13 | 2016-04-12 | Apple Inc. | Pushing a graphical user interface to a remote device with display rules provided by the remote device |
US8907894B2 (en) | 2009-10-20 | 2014-12-09 | Northridge Associates Llc | Touchless pointing device |
US9152373B2 (en) | 2011-04-12 | 2015-10-06 | Apple Inc. | Gesture visualization and sharing between electronic devices and remote displays |
FR2975516B1 (en) | 2011-05-19 | 2014-02-28 | Sncf | SELF-SERVICE TERMINAL COMPRISING A GESTURE RECOGNITION DEVICE AND A DIRECTIONAL SOUND TRANSMITTING DEVICE. |
EP2587432A1 (en) | 2011-10-31 | 2013-05-01 | NCR Corporation | Self-service terminal transactions |
US9164779B2 (en) | 2012-02-10 | 2015-10-20 | Nokia Technologies Oy | Apparatus and method for providing for remote user interaction |
JP2013196047A (en) * | 2012-03-15 | 2013-09-30 | Omron Corp | Gesture input apparatus, control program, computer-readable recording medium, electronic device, gesture input system, and control method of gesture input apparatus |
US9119156B2 (en) | 2012-07-13 | 2015-08-25 | Microsoft Technology Licensing, Llc | Energy-efficient transmission of content over a wireless connection |
US10353581B1 (en) | 2012-07-27 | 2019-07-16 | Merge Healthcare Solutions Inc. | Mobile computer input devices |
US9939905B2 (en) | 2012-11-09 | 2018-04-10 | Ross Conrad Labelson | Optical control of display screens |
US20140331189A1 (en) | 2013-05-02 | 2014-11-06 | Jpmorgan Chase Bank, N.A. | Accessible self-service kiosk with enhanced communication features |
US20150058216A1 (en) | 2013-08-21 | 2015-02-26 | Sergio Luciani | ATM Enabling Interface with Mobile Technology |
US9922488B2 (en) | 2013-10-16 | 2018-03-20 | Redbox Automated Retail, Llc | Wireless communication for consumer-operated kiosks |
US10257341B2 (en) | 2013-11-01 | 2019-04-09 | Ebay Inc. | Using a smartphone for remote interaction with visual user interfaces |
US9477317B1 (en) | 2014-04-22 | 2016-10-25 | sigmund lindsay clements | Sanitarily operating a multiuser device using a touch free display |
CN105094224A (en) * | 2014-05-14 | 2015-11-25 | 高胜科技国际控股有限公司 | Attachable computer |
US20160034058A1 (en) | 2014-07-31 | 2016-02-04 | Microsoft Corporation | Mobile Device Input Controller For Secondary Display |
US10614442B2 (en) | 2014-12-03 | 2020-04-07 | Mastercard International Incorporated | System and method of facilitating cash transactions at an ATM system without an ATM card using mobile |
US10304412B1 (en) | 2016-06-30 | 2019-05-28 | Google Llc | Device synchronization |
US10168767B2 (en) * | 2016-09-30 | 2019-01-01 | Intel Corporation | Interaction mode selection based on detected distance between user and machine interface |
WO2018189603A1 (en) | 2017-04-13 | 2018-10-18 | RxPrism Health Systems Pvt. Ltd | A system for providing access to a kiosk using a mobile device |
US10366291B2 (en) * | 2017-09-09 | 2019-07-30 | Google Llc | Systems, methods, and apparatus for providing image shortcuts for an assistant application |
US11194468B2 (en) | 2020-05-11 | 2021-12-07 | Aron Ezra | Systems and methods for non-contacting interaction with user terminals |
2020
- 2020-08-31 US US17/007,262 patent/US11194468B2/en active Active
- 2020-08-31 US US17/007,278 patent/US20210349593A1/en not_active Abandoned
2021
- 2021-11-03 US US17/517,752 patent/US11409433B2/en active Active
2022
- 2022-07-01 US US17/855,962 patent/US11693557B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20220057930A1 (en) | 2022-02-24 |
US11409433B2 (en) | 2022-08-09 |
US11194468B2 (en) | 2021-12-07 |
US11693557B2 (en) | 2023-07-04 |
US20220334713A1 (en) | 2022-10-20 |
US20210349626A1 (en) | 2021-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20230161417A1 (en) | Sharing Across Environments | |
EP2854010B1 (en) | Method and apparatus for displaying messages | |
US9189147B2 (en) | Ink lag compensation techniques | |
US9361019B2 (en) | Apparatus and method for providing web browser interface using gesture in device | |
KR102109617B1 (en) | Terminal including fingerprint reader and method for processing a user input through the fingerprint reader | |
US8982066B2 (en) | Automatic ending of interactive whiteboard sessions | |
US8493341B2 (en) | Optical touch display device and method thereof | |
JP6090413B2 (en) | Automatic operation at login | |
US20120110516A1 (en) | Position aware gestures with visual feedback as input method | |
JP2017508218A (en) | Conductive trace routing for display and bezel sensors | |
WO2015168977A1 (en) | Screen control method and device | |
CN103207698B (en) | Touch display, method for switching operation mode of touch display and computer system | |
CN102708329B (en) | Data safety management system and method | |
JP2005517226A (en) | Collaborative handwriting input system and method | |
US11693557B2 (en) | Systems and methods for non-contacting interaction with user terminals | |
US20210133363A1 (en) | Display apparatus, display method, and image processing system | |
US20140372915A1 (en) | Method and system for operating display device | |
US20060154652A1 (en) | Method and system for direct communication between mobile device and application on display | |
CN115348328B (en) | Handwriting data processing method and handwriting data processing system | |
US20230251895A1 (en) | Information processing apparatus and control method | |
CN204270284U (en) | A kind of display device | |
KR20030081741A (en) | Personal hand-held terminal and image cognition method thereof | |
US20210027750A1 (en) | Display apparatus, display system, and display method | |
US9389778B2 (en) | Image capturing method of touch display module and electronic device | |
CN102455858A (en) | Cursor management method and related management system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION