
WO2015017670A2 - Methods and systems for managing multi-device interaction using the doppler effect - Google Patents

Methods and systems for managing multi-device interaction using the Doppler effect

Info

Publication number
WO2015017670A2
WO2015017670A2 (PCT/US2014/049175)
Authority
WO
WIPO (PCT)
Prior art keywords
receiving
gesture
user
receiving device
doppler
Prior art date
Application number
PCT/US2014/049175
Other languages
French (fr)
Other versions
WO2015017670A3 (en)
Inventor
Sidhant Gupta
Shwetak N. Patel
Tanvir Islam AUMI
Mayank Goel
Eric Cooper LARSON
Original Assignee
Sidhant Gupta
Patel Shwetak N
Aumi Tanvir Islam
Mayank Goel
Larson Eric Cooper
Priority date
Filing date
Publication date
Application filed by Sidhant Gupta, Patel Shwetak N, Aumi Tanvir Islam, Mayank Goel, Larson Eric Cooper filed Critical Sidhant Gupta
Publication of WO2015017670A2 publication Critical patent/WO2015017670A2/en
Publication of WO2015017670A3 publication Critical patent/WO2015017670A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/18Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using ultrasonic, sonic, or infrasonic waves

Definitions

  • the user 200 performs a sweeping gesture in which the transmitter 105 is moved laterally with respect to the first receiver 103a and second receiver 103b. Again the transmitter 105 emits a continuous tone prior to and throughout the gesture performed by the user 200. As the user 200 sweeps the transmitter 105 from left to right, each of the receivers 103a-b will register a Doppler shift. However, the first receiver 103a will register a Doppler shift first, and the second receiver 103b will register a Doppler shift second. By comparing the times of the detected Doppler shifts, the relative positions of the first and second receivers 103a-b can be determined.
  • the detection by the first receiver 103a of a Doppler shift prior to detection by the second receiver 103b indicates that the first receiver 103a is left of the second receiver 103b.
  • the transmitter 105 can comprise a computing device (e.g., smartphone, tablet, or similar device), and a user interface can be provided.
  • the user can receive, via the user interface on the computing device, instructions to initiate transmission of an audio signal.
  • the user may then initiate transmission of the continuous tone, for example by pressing a "start" button on the user interface.
  • the computing device may then initiate transmission of the continuous tone.
  • the user interface may then optionally provide instructions for the user to perform a gesture with the computing device. For example, the user interface may provide instructions for the user to point the computing device towards the intended receiver device (as in Figure 2A).
  • the computing device can receive, from the intended receiving device, an indication to establish a connection with the receiving device.
  • the intended receiving device can be a laptop computer, and the computing device can initiate connection with the laptop computer via WiFi, Bluetooth, RFID, near-field communication, or other suitable communication technology.
  • Figures 3A-3F are frequency graphs showing audio frequencies detected by receivers.
  • Figures 3A and 3B illustrate the detected signals at the first receiver R1 and second receiver R2, respectively, when the transmitter is not in motion. The peak signal detected by R1 is shown at 301a in Figure 3A, and the peak signal detected by R2 at 301b in Figure 3B.
  • Figures 3C and 3D illustrate the detected signals at R1 and R2, respectively, when the transmitter is moved towards R1.
  • FIG. 4 is a block diagram of a method for managing multi-device interaction according to one embodiment of the present technology in which a user points the transmitter towards one of the receivers.
  • Process 400 begins with the first receiver R1 in block 401 receiving an audio signal.
  • the audio signal can be a continuous tone emitted by a transmitter, for example, a smartphone held by a user.
  • the receiver R1 can be a laptop, another smartphone, or any electronic device configured to detect a Doppler shift in a received audio signal.
  • a transmitter device generates a continuous tone played through the device's speakers at 18 kHz.
  • the continuous tone can be as low as 6 kHz, and in some embodiments the tone may be higher than 18 kHz.
  • a continuous tone at 18 kHz is generally inaudible while also being detectable by nearly all commodity hardware. Additionally, higher transmitter frequencies result in greater Doppler shifts, making it computationally easier to estimate motion at a given frequency resolution.
  • prior to the user performing a gesture with the transmitter, all receivers can continuously sample their microphones; for example, in some embodiments they may sample their microphones at approximately 44.1 kHz.
  • the incoming audio signal can be buffered into 4096-point, non-overlapping frames, and an FFT can be computed over each frame.
  • the receiver R1 can scan the frequency range of interest, for example the range between 17-19 kHz, to find the bin containing the peak with the largest magnitude that spans no more than 1-2 bins, i.e., a sharp peak resulting from the transmitter tone. These bins should also remain consistent across multiple FFT vectors; for example, the peak could be considered stable when the same FFT bins are selected consecutively for 5 frames. This bin index can be saved as N_transmitter, and the receivers can continue to sample their microphones to wait for frequency shifts.
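The tone-locking step above can be illustrated with a short Python sketch. This is a non-authoritative reconstruction, not the patent's implementation: the function names, the use of NumPy, and the Hanning window are assumptions. It scans the 17-19 kHz band of each 4096-point frame for the strongest bin and locks in N_transmitter once the same bin wins 5 consecutive frames.

```python
import numpy as np

FS = 44100           # microphone sample rate (Hz)
N_FFT = 4096         # non-overlapping frame length
BIN_HZ = FS / N_FFT  # ~10.8 Hz per frequency bin

def find_tone_bin(frame, f_lo=17000.0, f_hi=19000.0):
    """Return the index of the strongest FFT bin in the band of interest."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    lo, hi = int(f_lo / BIN_HZ), int(f_hi / BIN_HZ)
    return lo + int(np.argmax(spectrum[lo:hi]))

def lock_transmitter_bin(frames, consistency=5):
    """Lock in N_transmitter once the same bin wins `consistency` frames in a row."""
    last, streak = None, 0
    for frame in frames:
        bin_idx = find_tone_bin(frame)
        streak = streak + 1 if bin_idx == last else 1
        last = bin_idx
        if streak >= consistency:
            return bin_idx  # saved as N_transmitter
    return None  # tone never stabilized
```

For an 18 kHz tone the locked bin lands near 18000 / 10.8, i.e., around bin 1672.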
  • the first receiver R1 determines the Doppler shift (D1) of the received audio signal.
  • movement of a transmitter with respect to the receiver R1 will generate a Doppler shift that can be detected by the receiver.
  • gestures can be conservatively bounded to a maximum velocity, such as 6 m/s, in which case the maximum shift, in frequency bins, would be approximately 31 bins.
  • the signal of interest can be limited to 31 bins.
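The bin bound above can be sanity-checked with a short calculation. This is an illustrative sketch assuming c ≈ 343 m/s and the 44.1 kHz, 4096-point parameters described above; the function name is hypothetical.

```python
def max_doppler_bins(f_t=18000.0, v_max=6.0, c=343.0, fs=44100, n_fft=4096):
    """Worst-case Doppler shift, expressed in FFT bins, for a gesture bounded at v_max."""
    delta_f = f_t * v_max / (c - v_max)  # moving-source Doppler shift in Hz
    bin_hz = fs / n_fft                  # ~10.8 Hz per frequency bin
    return delta_f / bin_hz
```

With the defaults this evaluates to roughly 30 bins, in line with the approximately 31 bins cited above.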
  • the receiver R1 can store the frequency bin with the largest magnitude, N_peak, and compare it to N_transmitter. The difference between these peaks captures the observed Doppler shift, ΔN.
  • each user may perform a pointing gesture with slightly different velocities and durations.
  • the frequency shift will be averaged over that duration by the FFT, and thus, a consistent shift may not be seen in consecutive frames.
  • the start and end of the gesture can be estimated by storing the Doppler shift over all frames during the detected gesture, and summing the observed Doppler shifts over the entire gesture.
  • a gesture can be considered to begin when ΔN exceeds a threshold, ΔN_thresh. This threshold can be determined experimentally, and should be great enough that when a continuous tone is played without any gestures performed, the maximum observed deviation ΔN remains less than ΔN_thresh.
  • the end of the gesture can be determined, for example, when ΔN stays below the threshold for 4 consecutive frames (about 400 ms).
  • the sum of the ΔN values for each frame during the detected "gesture event" is then calculated. This sum, ΔN_sum, is transmitted to the server as D1.
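The gesture segmentation and summing described above might be sketched as follows. This is an illustrative reconstruction rather than the patent's implementation; the function name and parameter defaults are assumptions, and the input is the sequence of per-frame Doppler shifts ΔN already computed by the receiver.

```python
def sum_gesture_shift(delta_ns, dn_thresh=4, end_frames=4):
    """Segment one gesture from per-frame Doppler shifts (in bins) and return
    the summed shift, or None if no gesture is detected.

    A gesture starts when |dn| exceeds dn_thresh, and ends after end_frames
    consecutive quiet frames (~400 ms at 4096-point frames and 44.1 kHz)."""
    in_gesture, quiet, total = False, 0, 0
    for dn in delta_ns:
        if not in_gesture:
            if abs(dn) >= dn_thresh:   # gesture start: shift above threshold
                in_gesture, total = True, dn
        else:
            total += dn
            quiet = quiet + 1 if abs(dn) < dn_thresh else 0
            if quiet >= end_frames:    # sustained quiet ends the gesture
                break
    return total if in_gesture else None
```

For example, sum_gesture_shift([0, 0, 6, 8, 7, 5, 0, 0, 0, 0]) returns 26, the summed shift over the detected gesture event; a trace that never crosses the threshold returns None.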
  • the second receiver R2 receives an audio signal.
  • the audio signal from a single transmitter is received by both R1 and R2.
  • the second receiver R2 likewise determines the Doppler shift (D2), which depends on the relative movement of the transmitter and the second receiver R2.
  • the magnitudes of the first Doppler shift D1 and the second Doppler shift D2 are compared by the server. By comparing the magnitudes of D1 and D2, it can be determined which receiver is being pointed to by the transmitter. As noted above, the receiver pointed to will register the greatest Doppler shift. Once it is determined that the transmitter is pointed towards one of the receivers, a connection can be established between the transmitter and that receiver. Referring to Figure 4, if the magnitude of D1 is greater than D2 (D1>D2), then in block 411 a connection is established between the first receiver R1 and the transmitter.
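On the server side, the comparison above reduces to picking the receiver that reported the largest shift magnitude. The following minimal sketch uses hypothetical names and assumes each receiver has reported its summed Doppler shift to the server.

```python
def select_receiver(reports):
    """Pick the intended receiver: the largest reported Doppler shift wins.

    reports: dict mapping a receiver id to the summed Doppler shift (in bins)
    that the receiver sent to the server."""
    return max(reports, key=lambda rid: abs(reports[rid]))
```

For example, select_receiver({"R1": 26, "R2": 9}) returns "R1", so the connection in block 411 would be established between R1 and the transmitter.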
  • the user can quickly point toward another device and add many devices to the same "group" in rapid succession.
  • multi-pairing can be done easily using this technique.
  • the receiver starts listening for the tone without establishing a connection to the server. When it hears a shift, it sets up a connection and sends its summed Doppler shift to the server. In this way, no initial synchronization is needed between the devices other than being aware of the central server.
  • all devices are continually listening for a tone from an outside device. A new device can be quickly added to the group simply by gesturing at any device in the group.
  • FIG. 5 is a block diagram of a method for managing multi-device interaction according to another embodiment of the present technology in which a user performs a sweeping motion with the transmitter.
  • Process 500 begins with the first receiver R1 in block 501 receiving an audio signal.
  • the audio signal can be a continuous tone emitted by a transmitter, for example a smartphone held by a user.
  • the receiver R1 can be a laptop, another smartphone, or any electronic device configured to detect a Doppler shift in a received audio signal.
  • the first receiver R1 determines the Doppler shift (D1) of the received audio signal.
  • movement of a transmitter with respect to the receiver R1 will generate a Doppler shift that can be detected by the receiver.
  • the second receiver R2 receives an audio signal.
  • the audio signal from a single transmitter is received by both R1 and R2.
  • the second receiver R2 likewise determines the Doppler shift (D2), which depends on the relative movement of the transmitter and the second receiver R2.
  • the time stamps of the first Doppler shift D1 and the second Doppler shift D2 are compared at the server. By comparing the time stamps of D1 and D2, in block 511 the relative positions of R1 and R2 can be determined.
  • a single device (the transmitter) is being used to ascertain the relative positions of other devices (receivers R1 and R2).
  • instead of pointing, the user initiates a scanning gesture across the devices of interest, for example across two computer displays. All the receivers that are in front of the transmitter will sense a shift and report it to the server, as before. However, the time stamps at which they detect the gesture will differ. If scanning is performed from left to right, then the leftmost device will observe the shift first (and vice versa for scanning right to left), then the second device, and so on.
  • in this case the server, instead of calculating the maximum, can organize the devices in a sequence based on their arrival times. The server then sends the whole sequence to all the receivers so that they are aware of their position relative to the other devices.
  • this relative positioning can be used to stitch multiple displays into a single display.
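The timestamp-ordering step above can be sketched as follows. This is illustrative only; the function name and the representation of arrival times as per-receiver timestamps are assumptions.

```python
def order_by_sweep(first_shift_times, left_to_right=True):
    """Order receivers along the sweep direction by Doppler-shift arrival time.

    first_shift_times: dict mapping a receiver id to the timestamp (seconds)
    at which that receiver first detected the shift. For a left-to-right
    sweep, earlier detection means the device sits further left."""
    ordered = sorted(first_shift_times, key=first_shift_times.get)
    return ordered if left_to_right else ordered[::-1]
```

For example, order_by_sweep({"R2": 1.32, "R1": 1.05}) returns ["R1", "R2"], i.e., R1 is left of R2 for a left-to-right sweep; the server can then broadcast this sequence to all receivers.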
  • the receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
  • connection comprises at least one of: WiFi, Bluetooth, RFID, or near-field communication.
  • the transmitting device comprises a smartphone.
  • the first receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
  • first and second receiving devices comprise displays, and wherein the method further comprises stitching the displays together based on the determined gesture.
  • comparing the first and second Doppler shifts comprises comparing the magnitudes of the first and second Doppler shifts.
  • comparing the first and second Doppler shifts comprises comparing the timing of the first and second Doppler shifts.
  • a computer-readable medium storing computer-executable instructions for managing multiple device interactions, the computer-executable instructions comprising instructions that, if executed by a computing system having a processor, cause the computing system to perform operations, the operations comprising:

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Human Computer Interaction (AREA)
  • Measurement Of Velocity Or Position Using Acoustic Or Ultrasonic Waves (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present technology relates generally to systems and methods for managing multi-device interactions using the Doppler effect. In some embodiments, for example, a method for managing multiple device interactions in accordance with the technology comprises receiving an indication of a first Doppler shift detected by a first receiving device of an audio signal from a transmitting device, receiving an indication of a second Doppler shift detected by a second receiving device of the audio signal from the transmitting device, comparing the first and second Doppler shifts and, based at least in part on comparing the first and second Doppler shifts, determining a gesture made with the transmitting device.

Description

METHODS AND SYSTEMS FOR MANAGING MULTI-DEVICE INTERACTION USING THE DOPPLER EFFECT
CROSS-REFERENCE TO RELATED APPLICATION
[0001] This application claims the benefit of U.S. Provisional Patent Application No. 61/860,682, filed July 31, 2013, which is incorporated herein by reference in its entirety.
TECHNICAL FIELD
[0002] The present technology is generally related to methods and systems for managing interactions between multiple electronic devices. In particular, several embodiments are directed to methods and systems utilizing the Doppler effect to manage multi-device interactions.
BACKGROUND
[0003] With the proliferation of multiple smart computing devices in the environment— personal computers, smartphones, smart TVs, and other network-enabled devices (i.e., the smart home)— it is becoming increasingly important for these personal mobile devices to connect to each other and share information. However, there are no easy approaches for selecting which devices to connect to, nor is there a way to select a subset of devices easily at a distance. In particular, selecting, controlling, and sharing information between devices still requires a significant effort from a user. For example, if a user wants to select one or more co-located wireless devices, she needs to know their addresses or has to physically access them, in contrast to accomplishing the task in a more natural way, such as simply pointing at the device. As the number of these devices increases, this becomes tedious and unmanageable.
BRIEF DESCRIPTION OF THE DRAWINGS
[0004] Figure 1 is a schematic representation of a multi-device interaction management system configured in accordance with an embodiment the present technology.
[0005] Figures 2A and 2B illustrate example gestures made by a user with a transmitting device.
[0006] Figures 3A-3F are frequency graphs showing audio frequencies detected by receivers.
[0007] Figure 4 is a block diagram of a method for managing multi-device interaction according to one embodiment of the present technology.
[0008] Figure 5 is a block diagram of a method for managing multi-device interaction according to another embodiment of the present technology.
DETAILED DESCRIPTION
[0009] Numerous techniques have been proposed for selecting objects in a physical environment. A number of these techniques require extra sensors to identify which pair of devices is to be connected, or they require simultaneous movements of both devices, which is not feasible for larger devices (e.g., TVs, printers) or devices embedded into the environment (e.g., in furniture). There are also many vision-based approaches for household device selection, but these can be sensitive to lighting conditions and require line-of-sight. More importantly, they do not blend or disappear into the environment (e.g., blinking LEDs or QR-code-based systems).
[0010] To address these and other limitations, a Doppler-based device selection approach is presented. As described in more detail below, low-cost, commodity audio hardware (often already embedded in smart devices) can be used to determine when a user gestures with a device with respect to another device (e.g., the user moves her mobile phone at the other device in a pointing motion). This technique relies on a well-understood phenomenon known as the "Doppler Effect," which characterizes the change in observed frequency of a sound wave as a transmitter moves towards or away from the receiver. Contrary to vision-based approaches, embodiments of the present technology are much less susceptible to occlusions between the controller and receiver, and are invisible to the user. The present technology is simple, robust, and easy to implement— it can be developed on cheap hardware at low cost or on top of existing commodity hardware. Additionally, by implementing a "scanning gesture", this approach can be used to find the relative position of each device with respect to one another. Example applications of how the system can be used include (1) rapidly pairing and un-pairing devices for grouping or sharing information, (2) controlling home appliances with minimal instrumentation, and (3) ascertaining relative positions of separate monitors to stitch together a multi-display system.
[0011] Specific details of several embodiments of the present technology are described below with reference to Figures 1-5. Although many of the embodiments are described below with respect to devices, systems, and methods for managing multiple device interactions using the Doppler Effect, other embodiments are within the scope of the present technology. Additionally, other embodiments of the present technology can have different configurations, components, and/or procedures than those described herein. For example, other embodiments can include additional elements and features beyond those described herein, or other embodiments may not include several of the elements and features shown and described herein. Some embodiments described below determine the Doppler shift of received audio signals by using Fast Fourier Transform (FFT). Other approaches are possible, for example using a hardware solution could use a local oscillator coupled with an analog phase detector (like those commonly found in low cost FM demodulators) to sense Doppler shift with an 8-bit microcontroller using a low sample rate ADC.
[0012] For ease of reference, throughout this disclosure identical reference numbers are used to identify similar or analogous components or features, but the use of the same reference number does not imply that the parts should be construed to be identical. Indeed, in many examples described herein, the identically numbered parts are distinct in structure and/or function.
Selected Embodiments of Multi-Device Interaction Management Systems and Methods
[0013] The present technology leverages the Doppler shift of an audio tone to manage interactions between multiple devices. In one embodiment, for example, when the user wants to connect to a particular device, she holds a button to initiate an audio tone (which can be sufficiently high-pitched to be inaudible to human hearing) from her mobile phone and then makes a pointing gesture with her phone towards a target. Since the velocity of the sound transmitter changes, a Doppler shift can be observed by all the potential target devices in the vicinity. Certain embodiments of the present technology work on the hypothesis that the intended target device will observe the maximum frequency shift compared to the other potential target devices. By comparing the peak frequency shift in all the devices, the intended recipient can be inferred.
[0014] The device being used to gesture is referred to herein as the "transmitter" and the devices sensing the Doppler shift are referred to as the "receivers." The observed frequency shift sensed by a receiver is proportional to the transmitter frequency (f_T) and to the velocity at which the transmitter moves relative to the receivers. Initially the transmitter and receiver can be stationary, with no change in the receiving frequency. When a user moves the transmitter towards the receiver, it causes a shift in frequency to be observed. The amount of this shift is measured by the receiver (f_R) and can be described by equation (1):
f_R = f_T (c + v_R) / (c - v_T)    (1)
[0015] Here c, v_T, and v_R are the velocities of sound in air, of the transmitter, and of the receiver, respectively. In embodiments in which the receiver is stationary (v_R = 0), the received frequency (f_R) increases as the transmitter velocity (v_T) increases.
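Equation (1) with a stationary receiver can be checked numerically. The following is a minimal sketch assuming c = 343 m/s for the speed of sound; the function name is illustrative and not part of the patent.

```python
def received_freq(f_t, v_t=0.0, v_r=0.0, c=343.0):
    """Observed frequency per equation (1): f_R = f_T * (c + v_R) / (c - v_T).

    Positive velocities denote motion of the transmitter and receiver
    towards each other; with both at rest, f_R equals f_T."""
    return f_t * (c + v_r) / (c - v_t)
```

For an 18 kHz tone and a transmitter moving towards a stationary receiver at 1 m/s, received_freq(18000.0, v_t=1.0) gives about 18052.6 Hz, an upward shift of roughly 53 Hz.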
[0016] Figure 1 is a schematic representation of a multi-device interaction management system configured in accordance with an embodiment of the present technology. The system 100 includes a server 101, one or more receivers 103 (three are shown in the present embodiment as a first receiver 103a, a second receiver 103b, and third receiver 103c). A transmitter 105 emits a tone that can be detected by the receivers 103a-c. As the transmitter 105 is moved relative to the receivers 103a-c, each of the receivers 103a-c can detect the Doppler shift in the tone emitted by the transmitter 105. These detected Doppler shifts can then be communicated to the server 101 for analysis and comparison.
[0017] In one application, the server 101 may compare the magnitudes of the detected Doppler shifts in order to determine which device the transmitter 105 was pointed towards. For example, when the transmitter 105 is moved towards, or pointed towards, receiver 103b, the receiver 103b will detect a greater Doppler shift than that detected by the other receivers 103a and 103c. In some embodiments, the transmitter 105 can then be paired with the receiver 103b. In another application, the server 101 may compare the time stamps of the detected Doppler shifts in order to determine the relative positions of the three receivers 103a-c. For example, if a user sweeps the transmitter 105 with respect to the receivers 103a-c, the time at which each receiver 103a-c detects the Doppler shift can be used to determine the relative locations of the receivers 103a-c. The determined relative positions can then be used, in one example, to stitch together multiple displays. Other uses of the detected Doppler shifts are possible; for example, more complex gesture vocabulary (beyond pointing and sweeping) can be developed. For example, the user may perform a "pulling" gesture, in which the transmitter is moved away from the devices. Detection of this Doppler shift (in which the frequency would be shifted downward rather than upward) could be used to unpair the transmitter from the intended receiver. In some embodiments, the detected Doppler shifts can be used not only to select a receiver or to indicate relative positions, but also to transmit control logic to one or more of the receivers. It will also be appreciated that a variety of other gestures and corresponding actions may be possible using embodiments of the present technology.
[0018] Communication between the receivers 103a-c and the server 101 can be wired or wireless. In some embodiments, the central server 101 can be replaced with a distributed and decentralized architecture.
In some embodiments, the server 101 can be physically integrated in the same device in which any one of the receivers 103a-c is positioned. For example, receiver 103a may be a laptop computer including a microphone for detecting the audio signal emitted by the transmitter 105, and the server 101 may also reside on the laptop. In other embodiments, the server 101 can be physically separate from the receivers 103a-c. In various embodiments, the receivers can be computers, computer displays, routers, modems, smartphones, tablets, printers, televisions, stereo systems, smart appliances (e.g., ovens, washers, dryers, dishwashers, refrigerators, microwaves, toasters, alarm clocks, home security systems), or other electronic devices capable of detecting audio signals and determining Doppler shifts. In some embodiments, electronic devices can be retrofitted to enable them to detect audio signals and determine Doppler shifts. On many electronic devices, such as most smartphones and computers, no additional hardware may be needed. On other electronic devices, a microphone and phase shift sensing hardware may be added to enable them to determine Doppler shifts. As described in more detail below, the Doppler shifts can be determined using a software approach with an FFT. Alternatively, a hardware solution could be implemented, for example a local oscillator coupled with an analog phase detector to sense the Doppler shift with an 8-bit microcontroller using a low sample rate ADC. In various embodiments, the transmitter can be a smartphone, tablet, remote control, computer, speaker, or other electronic device capable of emitting a continuous audio tone.
[0019] Figures 2A and 2B illustrate example gestures made by a user with a transmitting device. Referring to Figure 2A, the user 200 performs a pointing gesture, in which the transmitter 105 is moved towards the first receiver 103a and second receiver 103b. In the illustrated embodiment, the transmitter 105 is a smartphone and the first receiver 103a and second receiver 103b are each laptop computers. As noted previously, in other embodiments the receivers can be any number of electronic devices capable of sensing the audio signal and determining a Doppler shift, and the transmitter can be any electronic device capable of emitting a continuous tone. The transmitter 105 can emit a continuous tone prior to and during the pointing gesture performed by the user 200. Due to the pointing movement, each of the receivers 103a-b may detect a Doppler shift in the audio signal received from the transmitter 105. By comparing the magnitudes of the detected Doppler shifts, it can be determined which of the receivers 103a-b the user 200 pointed towards. As noted above, the receiver pointed towards by the transmitter will register the greatest Doppler shift.
[0020] Referring to Figure 2B, the user 200 performs a sweeping gesture in which the transmitter 105 is moved laterally with respect to the first receiver 103a and second receiver 103b. Again, the transmitter 105 emits a continuous tone prior to and throughout the gesture performed by the user 200. As the user 200 sweeps the transmitter 105 from left to right, each of the receivers 103a-b will register a Doppler shift. However, the first receiver 103a will register a Doppler shift first, and the second receiver 103b will register a Doppler shift second. By comparing the times of the detected Doppler shifts, the relative positions of the first and second receivers 103a-b can be determined. For example, if the user performs a left-to-right sweeping motion, the detection by the first receiver 103a of a Doppler shift prior to detection by the second receiver 103b indicates that the first receiver 103a is to the left of the second receiver 103b.
[0021] In some embodiments, the transmitter 105 can comprise a computing device (e.g., a smartphone, tablet, or similar device), and a user interface can be provided. The user can receive, via the user interface on the computing device, instructions to initiate transmission of an audio signal. The user may then initiate transmission of the continuous tone, for example by pressing a "start" button on the user interface, and the computing device then begins emitting the tone. The user interface may then optionally provide instructions for the user to perform a gesture with the computing device. For example, the user interface may provide instructions for the user to point the computing device towards the intended receiving device (as in Figure 2A). After the user has performed the pointing gesture with the computing device, the computing device can receive, from the intended receiving device, an indication to establish a connection with the receiving device. For example, the intended receiving device can be a laptop computer, and the computing device can initiate connection with the laptop computer via WiFi, Bluetooth, RFID, near-field communication, or other suitable communication technology.
[0022] Figures 3A-3F are frequency graphs showing audio frequencies detected by receivers. Figures 3A and 3B, for example, illustrate the detected signals at the first receiver R1 and second receiver R2, respectively, when the transmitter is not in motion. As shown, the peak signal detected by R1 (301a in Figure 3A) and the peak signal detected by R2 (301b in Figure 3B) are similar. These peak signals can be utilized as a baseline against which to calculate the Doppler shifts detected by each receiver. Figures 3C and 3D illustrate the detected signals at R1 and R2, respectively, when the transmitter is moved towards R1. As shown, the peak signal detected at R1 (301c in Figure 3C) is shifted higher, whereas the peak signal detected at R2 (301d in Figure 3D) is substantially identical to that of Figure 3B, in which the transmitter was not in motion. Figures 3E and 3F illustrate the detected signals at R1 and R2, respectively, when the transmitter is moved towards R2. As shown, the peak signal detected by R2 (301f in Figure 3F) is shifted higher, whereas the peak signal detected by R1 (301e in Figure 3E) is substantially identical to the case in which the transmitter was not moved (Figure 3A). In some embodiments, both R1 and R2 will detect some Doppler shift in the peak signal, but the magnitudes may be different.
[0023] Figure 4 is a block diagram of a method for managing multi-device interaction according to one embodiment of the present technology in which a user points the transmitter towards one of the receivers. Process 400 begins in block 401 with the first receiver R1 receiving an audio signal. As indicated above, the audio signal can be a continuous tone emitted by a transmitter, for example, a smartphone held by a user. The receiver R1 can be a laptop, another smartphone, or any electronic device configured to detect a Doppler shift in a received audio signal. In one particular embodiment of this process, a transmitter device generates a continuous tone played through the device's speakers at 18 kHz. In some embodiments, the continuous tone can be as low as 6 kHz, and in some embodiments the tone may be higher than 18 kHz. A continuous tone at 18 kHz is generally inaudible while also being detectable by nearly all commodity hardware. Additionally, higher transmitter frequencies result in greater Doppler shifts, making it computationally easier to estimate motion at a given frequency resolution. Prior to the user performing a gesture with the transmitter, all receivers can continuously sample their microphones, for example at approximately 44.1 kHz. In one exemplary process, the incoming audio signal can be buffered into 4096-point, non-overlapping frames and the |FFT| of each frame can be computed. This yields a frequency resolution of 10.76 Hz per bin and a frame rate of about 10 Hz.
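The framing arithmetic above can be checked directly. The sketch below, assuming NumPy is available and substituting a synthetic frame for real microphone input, reproduces the per-bin resolution (44100 / 4096 is approximately 10.77 Hz, quoted as 10.76 Hz above after truncation) and locates the FFT bin of the 18 kHz tone:

```python
import numpy as np

# Frame/bin arithmetic for the 44.1 kHz / 4096-point analysis described above.
# The synthetic tone stands in for a real microphone capture (an assumption).
FS = 44_100          # sample rate (Hz)
N = 4_096            # FFT length (non-overlapping frames)
BIN_HZ = FS / N      # frequency resolution per bin, about 10.77 Hz
FRAME_RATE = FS / N  # frames per second, "about 10 Hz"

t = np.arange(N) / FS
frame = np.sin(2 * np.pi * 18_000 * t)    # one frame of an 18 kHz tone
spectrum = np.abs(np.fft.rfft(frame))     # |FFT| magnitude spectrum
peak_bin = int(np.argmax(spectrum))
print(round(BIN_HZ, 2), peak_bin)         # the peak lands in the bin nearest 18000 / BIN_HZ
```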
[0024] Initially, the receiver R1 can scan the frequency range of interest, for example the range between 17-19 kHz, to find the bin containing the peak with the largest magnitude that spans no more than 1-2 bins, i.e., a sharp peak resulting from the transmitter tone. Also, these bins should remain consistent across multiple FFT vectors; for example, this can be defined as the same FFT bin being selected consecutively for 5 frames. This bin index can be saved as Ntransmitter, and the receivers can continue to sample their microphones to wait for frequency shifts.
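One way to implement this lock-on step, sketched with an illustrative stream of per-frame peak bins (the helper name and the sample data are assumptions, not from the text):

```python
# Locking onto the transmitter's FFT bin: within the 17-19 kHz band, take the
# strongest bin per frame and accept it once it repeats for 5 consecutive frames.
BIN_HZ = 44_100 / 4_096                               # about 10.77 Hz per bin
LO, HI = int(17_000 / BIN_HZ), int(19_000 / BIN_HZ)   # band of interest, in bins

def lock_transmitter_bin(peak_bins_per_frame, needed=5):
    """Return the bin index once it has been the in-band peak 5 frames running."""
    run, candidate = 0, None
    for b in peak_bins_per_frame:
        if LO <= b <= HI and b == candidate:
            run += 1
        else:
            candidate, run = b, 1
        if run >= needed:
            return candidate
    return None   # no stable tone found yet

# Simulated per-frame peak bins: noisy at first, then a steady tone near 18 kHz.
frames = [1600, 1672, 1671, 1672, 1672, 1672, 1672, 1672]
print(lock_transmitter_bin(frames))   # 1672
```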
[0025] In block 403 the first receiver R1 determines the Doppler shift (D1) of the received audio signal. As noted above, movement of a transmitter with respect to the receiver R1 will generate a Doppler shift that can be detected by the receiver. In one approach, to determine the gestures of interest, gestures can be conservatively bound to a maximum velocity, such as 6 m/sec, in which case the maximum shift, in frequency bins, would be approximately 31 bins. Thus, in this example the signal of interest can be limited to 31 bins. For each FFT frame, the receiver R1 can store the frequency bin with the largest magnitude, Npeak, and compare it to Ntransmitter. The difference between these peaks captures the observed Doppler shift, ΔN. However, each user may perform a pointing gesture with slightly different velocities and durations. Moreover, because each frame contains 100 ms of data, the frequency shift will be averaged over that duration by the FFT, and thus a consistent shift may not be seen in consecutive frames. As such, the start and end of the gesture can be estimated by storing the Doppler shift over all frames during the detected gesture and summing the observed Doppler shifts over the entire gesture.
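The quoted bound of roughly 31 bins can be sanity-checked from the numbers above. This sketch assumes a speed of sound of 343 m/s and the source-approaching form of the Doppler relation; it lands near, though not exactly on, the quoted figure, with the small difference plausibly due to rounding or a slightly different formula:

```python
# Upper bound on the Doppler shift, in FFT bins, for a gesture capped at 6 m/s.
# Assumptions: c = 343 m/s, the 18 kHz tone, 44.1 kHz / 4096-point frames.
C, F_TX, V_MAX = 343.0, 18_000.0, 6.0
BIN_HZ = 44_100 / 4_096

max_shift_hz = F_TX * V_MAX / (C - V_MAX)   # source approaching at V_MAX
max_shift_bins = max_shift_hz / BIN_HZ
print(round(max_shift_bins, 1))             # close to the quoted ~31 bins
```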
[0026] To detect the start of the gesture, a shift greater than a predetermined threshold, for example ΔNthresh = 4 bins, can be used. This threshold can be determined experimentally, and should be great enough that when a continuous tone is played without any gestures performed, the maximum observed deviation ΔN is less than ΔNthresh. The end of the gesture can be determined, for example, when ΔN < 4 for 4 consecutive frames (about 400 ms). The sum of the ΔN values for each frame during the detected "gesture event" is then calculated. This sum is transmitted to the server as D1.
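A minimal sketch of this start/end detection and summation, using the thresholds from the text and a simulated sequence of per-frame ΔN values (the function name and sample data are illustrative):

```python
# Gesture segmentation over per-frame shifts dN = N_peak - N_transmitter:
# start when |dN| exceeds 4 bins, end after 4 consecutive sub-threshold
# frames, then report the summed shift over the gesture event.
START_THRESH = 4   # bins (the text's threshold of 4)
END_FRAMES = 4     # about 400 ms at roughly 10 frames/sec

def gesture_shift_sum(delta_n_frames):
    """Sum dN over one detected gesture event, or None if no gesture was seen."""
    total, in_gesture, quiet = 0, False, 0
    for dn in delta_n_frames:
        if not in_gesture:
            if abs(dn) > START_THRESH:
                in_gesture, total, quiet = True, dn, 0
        else:
            total += dn
            quiet = quiet + 1 if abs(dn) < START_THRESH else 0
            if quiet >= END_FRAMES:
                return total
    return total if in_gesture else None

# Pointing gesture: shifts ramp up, then fall back below threshold.
frames = [0, 1, 6, 9, 12, 7, 2, 1, 0, 0]
print(gesture_shift_sum(frames))   # 6+9+12+7+2+1+0+0 = 37
```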
[0027] Similar steps are carried out in parallel by the second receiver R2. In block 405 the second receiver R2 receives an audio signal. The audio signal from a single transmitter is received by both R1 and R2. In block 407 the second receiver R2 likewise determines the Doppler shift (D2), which depends on the relative movement of the transmitter and the second receiver R2. Both receivers R1 and R2 report the value of the peak frequency shift to the server (which can also reside on one of the receivers).
[0028] In block 409, the magnitudes of the first Doppler shift D1 and the second Doppler shift D2 are compared by the server. By comparing the magnitudes of D1 and D2, it can be determined which receiver is being pointed to by the transmitter. As noted above, the receiver pointed to will register the greatest Doppler shift. Once it is determined that the transmitter is pointed towards one of the receivers, a connection can be established between the transmitter and that receiver. Referring to Figure 4, if the magnitude of D1 is greater than D2 (D1>D2), then in block 411 a connection is established between the first receiver R1 and the transmitter. If, on the other hand, the magnitude of D2 is greater than D1 (D1<D2), then in block 413 a connection is established between the second receiver R2 and the transmitter. As the user points her device at receiver R1, both R1 and R2 may observe a frequency shift. However, the frequency shift observed by R1 will be larger than that observed by R2; conversely, pointing at R2 results in a greater frequency shift at R2. Thus, by comparing these two shifts, the system can infer which of the two devices the user pointed towards, and initiate a connection.
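The server-side comparison reduces to picking the report with the largest magnitude. A sketch, with illustrative receiver identifiers and shift sums (none taken from the text):

```python
# Server-side arbitration: the receiver reporting the larger summed Doppler
# shift magnitude is taken to be the one the user pointed at.

def select_pointed_receiver(reports):
    """reports: {receiver_id: summed Doppler shift}. Returns the pointed-at id."""
    return max(reports, key=lambda rid: abs(reports[rid]))

reports = {"R1": 37, "R2": 9}             # R1 saw the larger shift
print(select_pointed_receiver(reports))   # R1
```

Using the absolute value means a "pulling" gesture (negative shifts) is arbitrated the same way, with the sign left available to distinguish pairing from unpairing.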
[0029] In one embodiment, once the first device connects, the user can quickly point toward another device and add many devices to the same "group" in rapid succession. Hence, multi-pairing can be done easily using this technique. In this example, initially the receiver starts listening for the tone without establishing a connection to the server. When it hears a shift, it sets up a connection and sends ΔN to the server. In this way, no initial synchronization is needed between the devices other than being aware of the central server. Moreover, once a group is established, all devices continually listen for a tone from an outside device. A new device can be quickly added to the group simply by gesturing at any device in the group.
[0030] Figure 5 is a block diagram of a method for managing multi-device interaction according to another embodiment of the present technology in which a user performs a sweeping motion with the transmitter. Process 500 begins in block 501 with the first receiver R1 receiving an audio signal. As indicated above, the audio signal can be a continuous tone emitted by a transmitter, for example, a smartphone held by a user. The receiver R1 can be a laptop, another smartphone, or any electronic device configured to detect a Doppler shift in a received audio signal. In block 503 the first receiver R1 determines the Doppler shift (D1) of the received audio signal. As noted above, movement of a transmitter with respect to the receiver R1 will generate a Doppler shift that can be detected by the receiver.
[0031] Similar steps are carried out in parallel by the second receiver R2. In block 505 the second receiver R2 receives an audio signal. The audio signal from a single transmitter is received by both R1 and R2. In block 507 the second receiver R2 likewise determines the Doppler shift (D2), which depends on the relative movement of the transmitter and the second receiver R2. In block 509, the time stamps of the first Doppler shift D1 and the second Doppler shift D2 are compared at the server. By comparing the time stamps of D1 and D2, in block 511 the relative positions of R1 and R2 can be determined.
[0032] Referring still to Figure 5, a single device (the transmitter) is being used to ascertain the relative position of other devices (receivers R1 and R2). Here, instead of pointing, the user initiates a scanning gesture across the devices of interest, for example across two computer displays. All the receivers that are in front of the transmitter will sense a shift and report it to the server, as before. However, the time stamp of when each receiver detected the gesture will be different. If scanning is performed from left to right, the leftmost device will observe the shift first, then the second device, and so on (and vice versa for scanning from right to left). The server in this case, instead of calculating the maximum, can organize the devices in a sequence based on their arrival times. The server then sends the whole sequence to all the receivers so that they are aware of their position relative to the other devices. One example application of this technique uses this relative positioning to stitch multiple displays into a single display.
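The sequencing step can be sketched as a sort on the report timestamps; the receiver identifiers and times below are illustrative:

```python
# Relative positioning from a sweep: sort receivers by the time at which each
# reported its Doppler shift. A left-to-right sweep yields left-to-right order.

def order_receivers(reports, left_to_right=True):
    """reports: {receiver_id: report timestamp}. Returns ids in spatial order."""
    ordered = sorted(reports, key=reports.get)
    return ordered if left_to_right else ordered[::-1]

reports = {"R2": 10.42, "R1": 10.05}   # R1 heard the shift first
print(order_receivers(reports))        # R1 comes first, so R1 is leftmost
```

The resulting sequence is what the server would broadcast to all receivers, for example so that adjacent displays can be stitched together in the right order.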
Examples
1. A method, performed by a computing system having a processor, for managing multiple device interactions, the method comprising:
receiving, via a user interface displayed by the computing system, instructions to initiate transmission of an audio signal;
causing, by the computing system, the audio signal to be transmitted by the computing system; and
after a user has performed a gesture with the computing system while the audio signal is transmitted, receiving, from a receiving device, an indication to establish a connection with the receiving device.
2. The method of example 1 wherein the computing system comprises a smartphone.
3. The method of example 1 or example 2, further comprising displaying, via the user interface, instructions for the user to perform a gesture with the computing system.
4. The method of example 3 wherein the instructions for the user to perform a gesture comprise instructions for the user to move the computing device towards the receiving device.
5. The method of any one of examples 1-4 wherein the receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
6. A method, performed by a computing system having a processor, for managing multiple device interactions, the method comprising:
receiving an indication of a first Doppler shift detected by a first receiving device of an audio signal from a transmitting device;
receiving an indication of a second Doppler shift detected by a second receiving device of the audio signal from the transmitting device;
comparing the first and second Doppler shifts; and
based at least in part on comparing the first and second Doppler shifts, determining a gesture made with the transmitting device.
7. The method of example 6, further comprising determining relative positions of the first receiving device and the second receiving device, based on the determined gesture.
8. The method of example 6, further comprising, based on the determined gesture, establishing a connection between the transmitting device and the first receiving device.
9. The method of example 8 wherein the connection comprises at least one of: WiFi, Bluetooth, RFID, or near-field communication.
10. The method of any one of examples 6-9 wherein the transmitting device comprises a smartphone.
11. The method of any one of examples 6-10 wherein the first receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
12. The method of any one of examples 6-10 wherein the first and second receiving devices comprise displays, and wherein the method further comprises stitching the displays together based on the determined gesture.
13. The method of any one of examples 6-12 wherein comparing the first and second Doppler shifts comprises comparing the magnitudes of the first and second Doppler shifts.
14. The method of any one of examples 6-12 wherein comparing the first and second Doppler shifts comprises comparing the timing of the first and second Doppler shifts.
15. The method of any one of examples 6-14 wherein the first and second Doppler shifts are caused by a user moving the transmitting device towards the first device.
16. The method of any one of examples 6-15 wherein the first and second Doppler shifts are caused by a user moving the transmitting device away from the first device.
17. The method of any one of examples 6-16 wherein the first and second Doppler shifts are caused by a user scanning the transmitting device laterally relative to the first and second devices.
18. A computer-readable medium storing computer-executable instructions for managing multiple device interactions, the computer-executable instructions comprising instructions that, if executed by a computing system having a processor, cause the computing system to perform operations, the operations comprising:
receiving an indication of a first Doppler shift detected by a first receiving device of an audio signal from a transmitting device;
receiving an indication of a second Doppler shift detected by a second receiving device of a second audio signal from the transmitting device;
comparing the first and second Doppler shifts; and
based at least in part on comparing the first and second Doppler shifts, determining a gesture made with the transmitting device.
19. The computer-readable medium of example 18, wherein the operations further comprise determining relative positions of the first receiving device and the second receiving device, based on the determined gesture.
20. The computer-readable medium of example 18, wherein the operations further comprise, based on the determined gesture, establishing a connection between the transmitting device and the first receiving device.
Conclusion
[0033] The above detailed descriptions of embodiments of the technology are not intended to be exhaustive or to limit the technology to the precise form disclosed above. Although specific embodiments of, and examples for, the technology are described above for illustrative purposes, various equivalent modifications are possible within the scope of the technology, as those skilled in the relevant art will recognize. For example, while steps are presented in a given order, alternative embodiments may perform steps in a different order. The various embodiments described herein may also be combined to provide further embodiments.
[0034] From the foregoing, it will be appreciated that specific embodiments of the invention have been described herein for purposes of illustration, but well-known structures and functions have not been shown or described in detail to avoid unnecessarily obscuring the description of the embodiments of the technology. Where the context permits, singular or plural terms may also include the plural or singular term, respectively.
[0035] Moreover, unless the word "or" is expressly limited to mean only a single item exclusive from the other items in reference to a list of two or more items, then the use of "or" in such a list is to be interpreted as including (a) any single item in the list, (b) all of the items in the list, or (c) any combination of the items in the list. Additionally, the term "comprising" is used throughout to mean including at least the recited feature(s) such that any greater number of the same feature and/or additional types of other features are not precluded. It will also be appreciated that specific embodiments have been described herein for purposes of illustration, but that various modifications may be made without deviating from the technology. Further, while advantages associated with certain embodiments of the technology have been described in the context of those embodiments, other embodiments may also exhibit such advantages, and not all embodiments need necessarily exhibit such advantages to fall within the scope of the technology. Accordingly, the disclosure and associated technology can encompass other embodiments not expressly shown or described herein.

Claims

I/We claim:
1. A method, performed by a computing system having a processor, for managing multiple device interactions, the method comprising:
receiving, via a user interface displayed by the computing system, instructions to initiate transmission of an audio signal;
causing, by the computing system, the audio signal to be transmitted by the computing system; and
after a user has performed a gesture with the computing system while the audio signal is transmitted, receiving, from a receiving device, an indication to establish a connection with the receiving device.
2. The method of claim 1 wherein the computing system comprises a smartphone.
3. The method of claim 1, further comprising displaying, via the user interface, instructions for the user to perform a gesture with the computing system.
4. The method of claim 3 wherein the instructions for the user to perform a gesture comprise instructions for the user to move the computing device towards the receiving device.
5. The method of claim 1 wherein the receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
6. A method, performed by a computing system having a processor, for managing multiple device interactions, the method comprising:
receiving an indication of a first Doppler shift detected by a first receiving device of an audio signal from a transmitting device;
receiving an indication of a second Doppler shift detected by a second receiving device of the audio signal from the transmitting device;
comparing the first and second Doppler shifts; and
based at least in part on comparing the first and second Doppler shifts, determining a gesture made with the transmitting device.
7. The method of claim 6, further comprising determining relative positions of the first receiving device and the second receiving device, based on the determined gesture.
8. The method of claim 6, further comprising, based on the determined gesture, establishing a connection between the transmitting device and the first receiving device.
9. The method of claim 8 wherein the connection comprises at least one of: WiFi, Bluetooth, RFID, or near-field communication.
10. The method of claim 6 wherein the transmitting device comprises a smartphone.
11. The method of claim 6 wherein the first receiving device is selected from the group consisting of: a computer, a printer, a television, a dishwasher, a refrigerator, a microwave, an oven, a stove, a washer, a dryer, a thermostat, a home security system, and a stereo system.
12. The method of claim 6 wherein the first and second receiving devices comprise displays, and wherein the method further comprises stitching the displays together based on the determined gesture.
13. The method of claim 6 wherein comparing the first and second Doppler shifts comprises comparing the magnitudes of the first and second Doppler shifts.
14. The method of claim 6 wherein comparing the first and second Doppler shifts comprises comparing the timing of the first and second Doppler shifts.
15. The method of claim 6 wherein the first and second Doppler shifts are caused by a user moving the transmitting device towards the first device.
16. The method of claim 6 wherein the first and second Doppler shifts are caused by a user moving the transmitting device away from the first device.
17. The method of claim 6, wherein the first and second Doppler shifts are caused by a user scanning the transmitting device laterally relative to the first and second devices.
18. A computer-readable medium storing computer-executable instructions for managing multiple device interactions, the computer-executable instructions comprising instructions that, if executed by a computing system having a processor, cause the computing system to perform operations, the operations comprising:
receiving an indication of a first Doppler shift detected by a first receiving device of an audio signal from a transmitting device;
receiving an indication of a second Doppler shift detected by a second receiving device of a second audio signal from the transmitting device;
comparing the first and second Doppler shifts; and
based at least in part on comparing the first and second Doppler shifts, determining a gesture made with the transmitting device.
19. The computer-readable medium of claim 18 wherein the operations further comprise determining relative positions of the first receiving device and the second receiving device, based on the determined gesture.
20. The computer-readable medium of claim 18 wherein the operations further comprise, based on the determined gesture, establishing a connection between the transmitting device and the first receiving device.
PCT/US2014/049175 2013-07-31 2014-07-31 Methods and systems for managing multi-device interaction using the doppler effect WO2015017670A2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361860682P 2013-07-31 2013-07-31
US61/860,682 2013-07-31

Publications (2)

Publication Number Publication Date
WO2015017670A2 true WO2015017670A2 (en) 2015-02-05
WO2015017670A3 WO2015017670A3 (en) 2015-04-23

Family

ID=52432575

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/049175 WO2015017670A2 (en) 2013-07-31 2014-07-31 Methods and systems for managing multi-device interaction using the doppler effect

Country Status (1)

Country Link
WO (1) WO2015017670A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210389455A1 (en) * 2020-06-12 2021-12-16 Aisin Seiki Kabushiki Kaisha Object detector
US20210389446A1 (en) * 2020-06-12 2021-12-16 Aisin Seiki Kabushiki Kaisha Object detector

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9134798B2 (en) * 2008-12-15 2015-09-15 Microsoft Technology Licensing, Llc Gestures, interactions, and common ground in a surface computing environment
US8296151B2 (en) * 2010-06-18 2012-10-23 Microsoft Corporation Compound gesture-speech commands
WO2012135554A1 (en) * 2011-03-29 2012-10-04 Qualcomm Incorporated System for the rendering of shared digital interfaces relative to each user's point of view
EP2575332A1 (en) * 2011-09-30 2013-04-03 France Telecom Method for transferring data between two devices

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20210389455A1 (en) * 2020-06-12 2021-12-16 Aisin Seiki Kabushiki Kaisha Object detector
US20210389446A1 (en) * 2020-06-12 2021-12-16 Aisin Seiki Kabushiki Kaisha Object detector
US11709263B2 (en) * 2020-06-12 2023-07-25 Aisin Corporation Object detector
US11852715B2 (en) * 2020-06-12 2023-12-26 Aisin Corporation Object detector

Also Published As

Publication number Publication date
WO2015017670A3 (en) 2015-04-23

Similar Documents

Publication Publication Date Title
Aumi et al. Doplink: Using the doppler effect for multi-device interaction
Chen et al. AirLink: sharing files between multiple devices using in-air gestures
KR102447438B1 (en) Alarm device and method for informing location of objects thereof
US20230393675A1 (en) Active pen and sensor controller that use data generated from identification data
US8593398B2 (en) Apparatus and method for proximity based input
US9904460B2 (en) Method and system for data transfer with a touch enabled device
US10084649B2 (en) Terminal for internet of things and operation method of the same
CN103210366A (en) Apparatus and method for proximity based input
US20170315631A1 (en) System and method for multimode stylus
KR20140008637A (en) Method using pen input device and terminal thereof
JP6065234B2 (en) Multi-terminal positioning method, and related devices and systems
EP3794426B1 (en) Motion sensor using cross coupling
WO2015017670A2 (en) Methods and systems for managing multi-device interaction using the doppler effect
KR20150079460A (en) Method, device, and system for recognizing gesture based on multi-terminal collaboration
CN107943549B (en) Application processing method and terminal
US8780279B2 (en) Television and control device having a touch unit and method for controlling the television using the control device
US20180145845A1 (en) Device control method and apparatus in home network system
EP2856765B1 (en) Method and home device for outputting response to user input
JP4604571B2 (en) Operation terminal
KR102250856B1 (en) Method for detecting touch-input, apparatus for sensing touch-input, and apparatus for inputting touch-input
CN104866209B (en) A kind of data transmission method and electronic equipment
WO2017185068A1 (en) A system for enabling rich contextual applications for interface-poor smart devices
WO2012001412A1 (en) User control of electronic devices
EP2658292A1 (en) Method and apparatus to detect interaction between devices
EP3286937A2 (en) Location-based services

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14832124

Country of ref document: EP

Kind code of ref document: A2

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14832124

Country of ref document: EP

Kind code of ref document: A2