US11126885B2 - Character recognition in air-writing based on network of radars - Google Patents
- Publication number
- US11126885B2 (application US16/360,284; publication US201916360284A)
- Authority
- US
- United States
- Prior art keywords
- millimeter
- wave radars
- monitoring space
- determining
- wave
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G06K9/46—
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/411—Identification of targets based on measurements of radar reflectivity
- G01S7/412—Identification of targets based on measurements of radar reflectivity based on a comparison between measured values and known or stored values
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/46—Indirect determination of position data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/50—Systems of measurement based on relative movement of target
- G01S13/58—Velocity or trajectory determination systems; Sense-of-movement determination systems
- G01S13/581—Velocity or trajectory determination systems; Sense-of-movement determination systems using transmission of interrupted pulse modulated waves and based upon the Doppler effect resulting from movement of targets
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/87—Combinations of radar systems, e.g. primary radar and secondary radar
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/87—Combinations of radar systems, e.g. primary radar and secondary radar
- G01S13/874—Combination of several systems for attitude determination
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/87—Combinations of radar systems, e.g. primary radar and secondary radar
- G01S13/878—Combination of several spaced transmitters or receivers of known location for determining the position of a transponder or a reflector
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/28—Details of pulse systems
- G01S7/285—Receivers
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/28—Details of pulse systems
- G01S7/285—Receivers
- G01S7/292—Extracting wanted echo-signals
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/415—Identification of targets based on measurements of movement associated with the target
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/417—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section involving the use of neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
- G06F18/20—Analysing
- G06F18/21—Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
- G06F18/217—Validation; Performance evaluation; Active pattern learning techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G06K9/6262—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/045—Combinations of networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/049—Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/764—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/46—Indirect determination of position data
- G01S2013/466—Indirect determination of position data by Trilateration, i.e. two antennas or two sensors determine separately the distance to a target, whereby with the knowledge of the baseline length, i.e. the distance between the antennas or sensors, the position data of the target is determined
Definitions
- The present invention relates generally to an electronic system and method, and, in particular embodiments, to character recognition in air-writing based on a network of radars for a human-machine interface.
- Human-machine interfaces are used by humans to interact with machines.
- Conventional human-machine interfaces have been available for many years. Examples of conventional human-machine interfaces include input hardware, such as keyboards, mice, game pads, and microphones; output hardware, such as computer monitors, printers, and speakers; and input/output hardware, such as touchscreens.
- Human-machine interfaces may include several categories, such as command line interfaces (e.g., receiving an input via a keyboard and providing an output as text via a computer monitor), graphical user interfaces (e.g., receiving an input via a keyboard and a mouse and providing an output using graphics via a computer monitor), voice user interfaces (e.g., using a person's voice to control a device), and gesture interfaces (e.g., using a person's hand gesture captured via a video camera to control a device).
- A human-machine interface may also be used to facilitate interaction between humans. For example, a first human may interact with a second human by interacting with a first computer using a first human-machine interface that includes a microphone, a speaker, a video camera, and a computer monitor. The first computer transmits data associated with such interaction to a second computer, and the second human interacts with the second computer using a second human-machine interface that includes a microphone, a speaker, a video camera, and a computer monitor.
- a method for air-writing character recognition includes: determining a position of an object in a monitoring space using trilateration by using a plurality of millimeter-wave radars, where each millimeter-wave radar of the plurality of millimeter-wave radars has a field of view, and where an intersection of the fields of view of the plurality of millimeter-wave radars forms the monitoring space; tracking the position of the object in the monitoring space over time using the plurality of millimeter-wave radars; determining a character symbol depicted by the tracked position of the object over time using a neural network (NN); and providing a signal based on the determined character symbol.
- an air-writing character recognition system includes: a plurality of millimeter-wave radars, where each millimeter-wave radar of the plurality of millimeter-wave radars is configured to have a field of view, and where an intersection of the fields of view of the plurality of millimeter-wave radars forms a monitoring space; and a controller configured to: determine a position of an object in the monitoring space based on outputs of the plurality of millimeter-wave radars by using trilateration, track the position of the object in the monitoring space over time based on the determined position using the plurality of millimeter-wave radars, determine a character symbol depicted by the tracked position of the object over time using a neural network (NN), and provide a signal based on the determined character symbol.
- a millimeter-wave radar system includes: three millimeter-wave radars, each of the three millimeter-wave radars configured to have a field of view, where an intersection of the fields of view of each of the three millimeter-wave radars forms a monitoring space; and a controller configured to: determine a position of an object in the monitoring space based on output of the three millimeter-wave radars by using trilateration, determine a trajectory of the object over time based on the determined position of the object, apply a filter to the determined trajectory to generate a filtered trajectory, determine a character symbol depicted by the filtered trajectory using a long-short term memory (LSTM), and provide a signal based on the determined character symbol.
- FIG. 1 shows a schematic diagram of a millimeter-wave (mmWave) radar system, according to an embodiment of the present invention
- FIG. 2 shows a schematic diagram of the millimeter-wave radar of FIG. 1 , according to an embodiment of the present invention
- FIG. 3 shows a flowchart of an embodiment method of fine distance estimation, according to an embodiment of the present invention
- FIG. 4 shows a millimeter-wave radar having a field-of-view (FoV), according to an embodiment of the present invention
- FIG. 5 shows a millimeter-wave radar having a FoV, according to an embodiment of the present invention
- FIG. 6 shows a network of millimeter-wave radars and a corresponding monitoring space, according to an embodiment of the present invention
- FIG. 7 shows a schematic diagram of the network of millimeter-wave radars of FIG. 6, according to an embodiment of the present invention.
- FIG. 8 shows a flowchart of an embodiment method of tracking a trajectory of an object in a monitoring space with a Kalman filter, according to an embodiment of the present invention
- FIG. 9 shows a schematic diagram of an LSTM network, according to an embodiment of the present invention.
- FIG. 10 illustrates trajectories of a metal marker in a monitoring space and the corresponding labels generated by an LSTM network, according to an embodiment of the present invention
- FIG. 11 shows a flowchart of an embodiment method for character recognition in air writing, according to an embodiment of the present invention
- FIG. 12 shows a flowchart of an embodiment method for character recognition in air writing using framewise classification, according to an embodiment of the present invention
- FIG. 13 shows a flowchart of an embodiment method for character recognition in air writing using framewise classification, according to an embodiment of the present invention
- FIG. 14 shows a flowchart of an embodiment method for character recognition in air writing using CTC classification, according to an embodiment of the present invention
- FIG. 15 shows a smartphone having the network of millimeter-wave radars of FIG. 6 for character recognition in air writing, according to an embodiment of the present invention.
- FIG. 16 shows a computer monitor having the network of millimeter-wave radars of FIG. 6 for character recognition in air writing, according to an embodiment of the present invention.
- Embodiments of the present invention will be described in a specific context: a system and method of character recognition in air-writing based on a network of millimeter-wave radars for a human-machine interface.
- Embodiments of the present invention may be implemented with other types of radars (e.g., other than millimeter-wave radars) and may be used for other applications, such as for machine-machine interface, for example.
- Although embodiments are illustrated with respect to Latin character symbols and numerical symbols, it is understood that other types of symbols, such as non-Latin characters (e.g., kanji symbols, Hebrew characters, Urdu characters, and sign language gestures), may also be used.
- In an embodiment, an air-writing system that includes a network of millimeter-wave radars uses a two-stage approach for extraction and recognition of handwriting gestures.
- The extraction processing stage uses fine range estimates combined with a trilateration technique to detect and localize the hand of a user, followed by a Kalman filter or other smoothing filter to create a smooth trajectory of the hand gesture movement.
- The recognition stage classifies characters drawn by the user by using a long short-term memory (LSTM) network, such as a bi-directional LSTM (BiLSTM), that is fed with consecutive Kalman filter states along the gesture trajectory.
- Hand-gesture recognition systems may be based on camera modules that use computer vision techniques and optical sensors. Camera-based recognition systems generally require substantial computational power to process graphic images that must be properly illuminated and that include at least portions of a human. Hand-gesture recognition systems may also be implemented using light and time-of-flight (ToF) technology.
- Advantages of some embodiments include accurately detecting character symbols handwritten with hand gestures without processing graphic images and without being impacted by illumination changes of the hand performing the gesture. Some embodiments, therefore, are less computationally costly than conventional camera-based systems and avoid the privacy issues associated with graphic images of humans. Some embodiments advantageously determine character symbols from hand gestures without using wearables that aid in hand gesture recognition. Using millimeter-wave radars instead of conventional ToF technology advantageously allows estimates of Doppler shifts to be used to accurately determine the trajectory of the hand.
- In some embodiments, motion gesture sensing is performed wirelessly by a network of millimeter-wave radars. Trilateration is used to accurately localize the hand in three-dimensional coordinates. A Kalman filter or any other smoothing filter is used to track and smooth the trajectory of the hand in motion, and a bi-directional LSTM generates the text representation from the hand motion sensor data received from the Kalman filter.
- FIG. 1 shows a schematic diagram of millimeter-wave radar system 100 , according to an embodiment of the present invention.
- millimeter-wave radar 102 transmits a plurality of radiation pulses 106 , such as chirps, towards scene 108 .
- the chirps are linear chirps (i.e., the instantaneous frequency of the chirp varies linearly with time).
- the transmitted radiation pulses 106 are reflected by objects (also referred to as targets) in scene 108 .
- the reflected radiation pulses (not shown in FIG. 1 ), which are also referred to as the echo signal, are detected by millimeter-wave radar 102 and processed by processor 104 to, for example, determine the angle of arrival of the echo signal.
- the objects in scene 108 may include, for example, hand 114 of a human.
- Scene 108 may include other objects, such as a finger of a hand, a metallic marker, and other objects.
- Processor 104 analyzes the echo data to determine the location of, e.g., hand 114 using signal processing techniques. For example, in some embodiments, a range FFT is used to estimate the range component of the location of a tip of an index finger of hand 114 (i.e., the distance of the tip of the index finger of hand 114 from the millimeter-wave radar). The azimuth component of the location of the detected object may be determined using angle estimation techniques.
- Processor 104 may be implemented as a general purpose processor, controller or digital signal processor (DSP) that includes, for example, combinatorial circuits coupled to a memory.
- the DSP may be implemented with an ARM architecture, for example.
- processor 104 may be implemented as a custom application specific integrated circuit (ASIC).
- processor 104 includes a plurality of processors, each having one or more processing cores.
- processor 104 includes a single processor having one or more processing cores.
- Other implementations are also possible.
- some embodiments may be implemented as a combination of hardware accelerator and software running on a DSP or general purpose micro-controller.
- Millimeter-wave radar 102 operates as a frequency-modulated continuous-wave (FMCW) radar that includes a millimeter-wave radar sensor circuit, a transmitting antenna, and a receiving antenna. Millimeter-wave radar 102 transmits and receives signals in the 20 GHz to 122 GHz range. Alternatively, frequencies outside of this range, such as frequencies between 1 GHz and 20 GHz, or frequencies between 122 GHz and 300 GHz, may also be used.
- The echo signals received by the receiving antenna of millimeter-wave radar 102 are filtered and amplified using band-pass filters (BPFs), low-pass filters (LPFs), mixers, low-noise amplifiers (LNAs), and intermediate frequency (IF) amplifiers in ways known in the art.
- the echo signals are then digitized using one or more analog-to-digital converters (ADCs) for further processing.
- FIG. 2 shows a schematic diagram of millimeter-wave radar 102 , according to an embodiment of the present invention.
- Millimeter-wave radar 102 includes millimeter-wave radar sensor circuit 200 , transmitting antenna 212 , and receiving antenna 214 .
- Millimeter-wave radar sensor circuit 200 includes phase-locked loop (PLL) 204, voltage controlled oscillator (VCO) 206, divider 208, amplifier 210, mixer 216, low-pass filter (LPF) 218, and ADC 220.
- During normal operation, VCO 206 generates a linear frequency chirp (e.g., from 57 GHz to 64 GHz) that is transmitted by transmitting antenna 212.
- the VCO is controlled by PLL 204 , which receives a reference clock signal (e.g., 80 MHz) from reference oscillator 202 .
- PLL 204 is controlled by a loop that includes divider 208 and amplifier 210 .
- the linear chirp transmitted by transmitting antenna 212 is reflected by hand 114 and received by receiving antenna 214 .
- The echo received by receiving antenna 214 is mixed with the signal transmitted by transmitting antenna 212 using mixer 216 to produce an intermediate frequency (IF) signal x(t) (also known as the beat signal).
- The beat signal x(t) is low-pass filtered by LPF 218 and then sampled by ADC 220.
- ADC 220 is advantageously capable of sampling the beat signal x(t) with a sampling frequency that is much smaller than the frequency of the signal received by receiving antenna 214.
- FMCW radars, therefore, advantageously allow for a compact and low-cost implementation of ADC 220 in some embodiments.
- The propagation delay between the signal transmitted by transmitting antenna 212 and the echo received by receiving antenna 214 may be identified by determining the beat frequency of the beat signal x(t).
- the beat frequency of beat signal x(t) may be identified by using spectral analysis (e.g., by using FFT) using processor 104 .
- a millimeter-wave radar performs fine range estimation based on the frequency and phase of the beat signal.
- the beat frequency may be given by
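- For a linear chirp of bandwidth B and chirp time $T_c$, a standard form of this relation, consistent with the range expression referenced as Equation 2 below, is $f_b = \frac{2BR}{cT_c}$, which rearranges to $R = \frac{cT_c f_b}{2B}$ (stated here as the standard FMCW relation rather than as a quotation of the patent's Equation 1).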
- millimeter-wave radar 102 is characterized by having an elevation field-of-view (FoV) of 70°, an azimuth FoV of 70°, a ramp start frequency $f_{min}$ of 57 GHz, a ramp stop frequency $f_{max}$ of 63 GHz, a bandwidth B of 6 GHz ($f_{max} - f_{min}$), a chirp time $T_c$ of 171.2 μs, a sampling frequency $f_s$ of 0.747664 MHz, and a number of samples per chirp $N_s$ of 128. Since the beat frequency $f_b$ may be estimated, e.g., using spectral analysis of the beat signal x(t), and quantities B, c, and $T_c$ are known, range R may be estimated by using Equation 2.
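- As a rough numerical illustration of Equation 2 with the parameters above, the sketch below simulates an ideal complex-baseband beat signal for a single static target and recovers its coarse range from the peak of a range FFT. It is a simplified model (noise-free, single target, assumed example range of 0.35 m), not the patent's implementation, and the phase-based fine estimation of FIG. 3 is omitted.

```python
import numpy as np

# Assumed example parameters taken from the embodiment described above.
c = 3e8              # speed of light (m/s)
B = 6e9              # chirp bandwidth f_max - f_min (Hz)
Tc = 171.2e-6        # chirp time (s)
fs = 0.747664e6      # ADC sampling frequency (Hz)
Ns = 128             # samples per chirp

R_true = 0.35        # hypothetical target range (m)
f_b = 2 * B * R_true / (c * Tc)          # beat frequency for a static target

# Ideal complex-baseband beat signal as sampled by the ADC (noise-free).
t = np.arange(Ns) / fs
x = np.exp(1j * 2 * np.pi * f_b * t)

# Range FFT: the peak bin gives the coarse beat-frequency estimate.
spectrum = np.abs(np.fft.fft(x, Ns))
peak_bin = int(np.argmax(spectrum[: Ns // 2]))   # keep positive frequencies only
f_b_hat = peak_bin * fs / Ns

# Coarse range from Equation 2; the resolution is c / (2 * B), about 2.5 cm.
R_coarse = c * Tc * f_b_hat / (2 * B)
print(f"coarse range estimate: {R_coarse:.3f} m (true {R_true} m)")
```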
- the beat signal phase $\phi_b$ may be given by
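- A beat-phase expression consistent with Equation 7 below is $\phi_b = \frac{4\pi f_{min} R}{c}$, i.e., the round-trip phase at the ramp start frequency (again the standard FMCW form, assumed here rather than quoted from the patent).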
- FIG. 3 shows a flowchart of embodiment method 300 of fine distance estimation, according to an embodiment of the present invention.
- Method 300 may be implemented, for example, by millimeter-wave radar 102 .
- the beat frequency estimate $\hat{f}_b$ is estimated, e.g., using Equation 1 and a Fourier transform, such as an FFT.
- a coarse range estimate $R_{f_b}$ is estimated, e.g., using Equation 2 and the beat frequency estimate $\hat{f}_b$ estimated in step 302.
- a beat signal phase estimate $\hat{\phi}_b$ is estimated, e.g., using Equation 3 and the coarse range estimate $R_{f_b}$ estimated in step 304.
- the beat signal is demodulated to generate demodulated signal $\tilde{x}(t)$.
- the phase of demodulated signal $\tilde{x}(t)$ in Equation 6 is linear in time, with a slope $\Delta f_b$ and a y-intercept $\Delta\phi_b$.
- the demodulated beat phase $\Delta\phi_b$ is determined, e.g., using demodulated signal $\tilde{x}(t)$.
- range $\Delta R$ is determined, e.g., using Equation 3.
- $\Delta R$ may be determined by
- $\Delta R = \dfrac{c\,\Delta\phi_b}{4\pi f_{min}}$ (7)
- the fine range estimate $R_{f_b,\phi_b}$ is determined based on $\Delta R$.
- three millimeter-wave radars form a network of millimeter-wave radars.
- Each of the three millimeter-wave radars has a respective FoV directed to a monitoring space.
- each of the three millimeter-wave radars performs a fine range estimate to determine the range of hand 114 with respect to the respective millimeter-wave radar.
- A trilateration technique based on the fine range estimates from each of the three millimeter-wave radars is used to determine the location of hand 114 in the monitoring space.
- the network of radars is non-coherent and each of the millimeter-wave radars operates independently from each other when monitoring the same target.
- Processing for localizing the target (e.g., a finger) in the monitoring space using data from the network of radars may be performed centrally (e.g., using a central processor) or in a distributed manner (e.g., using a plurality of processors, such as the processors of the millimeter-wave radars of the network of radars).
- FIG. 4 shows millimeter-wave radar 402 having FoV 404 , according to an embodiment of the present invention.
- Millimeter-wave radar 402 may be implemented in a similar manner as millimeter-wave radar 102 .
- FoV 404 is based on elevation FoV $\theta_{ver}$ and azimuth FoV $\theta_{hor}$.
- Vertical distance $D_{ver}$ may be given by
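- For a radar at a distance h from plane 406, this vertical coverage is commonly expressed as $D_{ver} = 2h\tan\left(\frac{\theta_{ver}}{2}\right)$; this standard geometric form, and the meaning of h, are assumed here rather than quoted from the patent.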
- The FoV of millimeter-wave radar 402 may be extended by steering the millimeter-wave radar by an angle (denoted $\beta$ in the expression below).
- FIG. 5 shows millimeter-wave radar 402 having FoV 504 , according to an embodiment of the present invention.
- Distances $d_h$, $d_{v1}$, and $d_{v2}$ are in plane 406.
- Distance $d_h$ may be given by
- $d_{v2} = 2\left[\dfrac{h}{\cos\left(\beta - \frac{\theta_{ver}}{2}\right)}\right]\tan\left(\dfrac{\theta_{ver}}{2}\right)$
- FIG. 6 shows network of millimeter-wave radars 600 and corresponding monitoring space 608 , according to an embodiment of the present invention.
- Network of millimeter-wave radars 600 includes millimeter-wave radars 602 , 604 , and 606 .
- Each of millimeter-wave radars 602 , 604 , and 606 may be implemented in a similar manner as millimeter-wave radar 102 and may have a FoV based on steering or without using steering.
- network of millimeter-wave radars 600 shows only three millimeter-wave radars, it is understood that network of millimeter-wave radars 600 may include more than three millimeter-wave radars, such as four, five, six, or more.
- Millimeter-wave radars 602, 604, and 606 are respectively located at points $R_1$, $R_2$, and $R_3$, having respective coordinates $(x_1, y_1, z_1)$, $(x_2, y_2, z_2)$, and $(x_3, y_3, z_3)$.
- $R_1$, $R_2$, and $R_3$ do not lie on a straight line (i.e., they are not collinear).
- Monitoring space 608 is formed at an intersection of FoVs FoV 602 , FoV 604 , and FoV 606 of millimeter-wave radars 602 , 604 , and 606 , respectively.
- monitoring space 608 has a volume of about 10 cm³.
- monitoring space 608 may have a volume higher than 10 cm³, such as 11 cm³, 13 cm³, 15 cm³ or more, or lower than 10 cm³, such as 9 cm³, 7 cm³ or less.
- Millimeter-wave radars 602, 604, and 606 monitor monitoring space 608. When an object, such as hand 114 or a finger of hand 114, enters monitoring space 608, millimeter-wave radars 602, 604, and 606 are used to determine the location of such object using trilateration.
- the coordinates of T may be obtained by combining the particular and homogeneous solutions, i.e., $x = x_p + \alpha\,x_h$, where
- $x_p$ is the particular solution of Equation 11,
- $x_h$ is a solution of the homogeneous system $A\,x = 0$,
- and $\alpha$ is a real parameter.
- Real parameter $\alpha$ may be determined by using Equation 12 to generate Equation 13, where $x_p = (x_{p0}, x_{p1}, x_{p2}, x_{p3})^T$, $x_h = (x_{h0}, x_{h1}, x_{h2}, x_{h3})^T$, and $x = (x_0, x_1, x_2, x_3)^T$.
- Some embodiments may implement the trilateration technique in other ways known in the art.
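- As an illustration of the trilateration step (a textbook closed-form three-sphere solution, not the patent's exact Equations 9 to 16), the sketch below recovers the two candidate target positions from three range measurements; the radar positions and target location are hypothetical example values, and the candidate located in front of the device would be kept.

```python
import numpy as np

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Closed-form three-sphere trilateration; returns the two candidate points."""
    p1, p2, p3 = map(np.asarray, (p1, p2, p3))
    # Local orthonormal frame with p1 at the origin and p2 on the x axis.
    ex = (p2 - p1) / np.linalg.norm(p2 - p1)
    i = np.dot(ex, p3 - p1)
    ey = (p3 - p1) - i * ex
    ey = ey / np.linalg.norm(ey)
    ez = np.cross(ex, ey)
    d = np.linalg.norm(p2 - p1)
    j = np.dot(ey, p3 - p1)
    # Target coordinates in the local frame.
    x = (r1**2 - r2**2 + d**2) / (2 * d)
    y = (r1**2 - r3**2 + i**2 + j**2) / (2 * j) - (i / j) * x
    z = np.sqrt(max(r1**2 - x**2 - y**2, 0.0))   # clamp small negatives from noise
    base = p1 + x * ex + y * ey
    return base + z * ez, base - z * ez          # two mirror-image candidates

# Hypothetical radar positions (m) at three corners of a device, and a target.
R1, R2, R3 = [0.0, 0.0, 0.0], [0.15, 0.0, 0.0], [0.0, 0.08, 0.0]
target = np.array([0.05, 0.03, 0.30])
ranges = [np.linalg.norm(target - np.asarray(p)) for p in (R1, R2, R3)]

cand_a, cand_b = trilaterate(R1, R2, R3, *ranges)
print(cand_a, cand_b)   # one candidate matches the target; keep the one in front
```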
- FIG. 7 shows a schematic diagram of network of millimeter-wave radars 600 , according to an embodiment of the present invention.
- Each of millimeter-wave radars of network of millimeter-wave radars 600 may operate in an independent and non-coherent manner when monitoring an object in monitoring space 608 .
- network of millimeter-wave radars 600 includes processor 702 .
- Processor 702 receives the fine range estimate from each of millimeter-wave radars 602 , 604 , and 606 , and determines the location of an object in monitoring space 608 by using trilateration.
- processor 702 receives data from millimeter-wave radars 602 , 604 , and 606 wirelessly.
- Processor 702 may be implemented as a general purpose processor, controller or digital signal processor (DSP) that includes, for example, combinatorial circuits coupled to a memory.
- the DSP may be implemented with an ARM architecture, for example.
- processor 702 may be implemented as a custom application specific integrated circuit (ASIC).
- processor 702 includes a plurality of processors, each having one or more processing cores.
- processor 702 includes a single processor having one or more processing cores.
- Other implementations are also possible.
- some embodiments may be implemented as a combination of hardware accelerator and software running on a DSP or general purpose micro-controller.
- processor 702 may be implemented together with a processor 104 of one of millimeter-wave radars 602 , 604 , or 606 .
- processor 702 determines a plurality of locations of the object over time based on data received from millimeter-wave radars 602 , 604 , and 606 , and uses a filter such as a Kalman filter or any other smoothing filter to generate a smooth trajectory of the object in monitoring space 608 over time in a two-dimensional (2D) plane.
- a Kalman filter may be considered as an iterative process that uses a set of equations and consecutive data inputs to estimate position, velocity, etc., of an object when the data inputs (e.g., measured values) contain unpredicted or random error, uncertainty or variation.
- a constant acceleration model is used for motion sensing and trajectory generation in a 2D plane.
- the observation model that maps the estimated state to the observed space may be given by
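- For a 2D constant-acceleration state vector $[x\ \dot{x}\ \ddot{x}\ y\ \dot{y}\ \ddot{y}]^T$ with position-only measurements, a typical observation matrix is $H = \begin{bmatrix} 1 & 0 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 \end{bmatrix}$; this standard form is assumed for illustration and is not necessarily the patent's exact equation.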
- a processing noise may cause uncertainty in estimating the position of the sensed object through the state transition model of the Kalman filter.
- There may be noise in the location estimated by trilateration.
- the noise may arise due to, e.g., thermal noise from the millimeter-wave radars 602 , 604 , and 606 .
- the measurement noise, due to noisy sensor measurements (e.g., noise associated with generating the fine range estimates), may be given by
- $R = \begin{bmatrix} \sigma_x^2 & 0 \\ 0 & \sigma_y^2 \end{bmatrix}$ (21)
- R represents the measurement noise variance on each axis.
- FIG. 8 shows a flowchart of embodiment method 800 of tracking a trajectory of an object in monitoring space 608 with a Kalman filter, according to an embodiment of the present invention.
- Method 800 includes initialization step 802 , prediction step 804 , and correction/update step 810 .
- Method 800 may be implemented, for example, by processor 702 .
- the covariance matrices P, Q, and R are initialized, where covariance matrix P corresponds to the covariance of the estimates of the location of the object, covariance matrix Q corresponds to the covariance of the processing noise, and covariance matrix R corresponds to the covariance of the measurement noise.
- the initial values of matrices P, Q, and R are systematically determined based on, e.g., an experimental setup.
- the initial values of matrices P, Q, and R are set to predetermined default values, such as scaled identity matrices, or others.
- During step 804, a prediction of the location of the object is made based on the current state of the Kalman filter.
- Step 804 includes steps 806 and 808.
- During step 810, the estimate of the location is updated based on the fine range estimates from network of millimeter-wave radars 600 and on the location estimated during step 804.
- Step 810 includes steps 812, 814, 816, and 818.
- Fine range estimates $z_k$ of the position of the object are received from network of millimeter-wave radars 600.
- The estimate $x_k$ is updated based on fine range estimates $z_k$ and the Kalman gain $K_k$.
- During step 818, k is increased by one and the sequence is repeated, as shown in FIG. 8.
- Some embodiments may implement the Kalman filter in other ways known in the art.
- Using a Kalman filter advantageously allows for smoothing the trajectory of the object in the monitoring space.
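- As a sketch of the tracking and smoothing stage, the code below runs a textbook constant-acceleration Kalman filter over noisy 2D position measurements. The state layout, frame period, and noise covariances are assumed for illustration and are not the patent's Equations 17 to 26.

```python
import numpy as np

dt = 0.02                          # assumed frame period (s)
# State: [x, vx, ax, y, vy, ay]; constant-acceleration transition per axis.
F1 = np.array([[1.0, dt, 0.5 * dt**2],
               [0.0, 1.0, dt],
               [0.0, 0.0, 1.0]])
F = np.block([[F1, np.zeros((3, 3))],
              [np.zeros((3, 3)), F1]])
H = np.zeros((2, 6))
H[0, 0] = H[1, 3] = 1.0            # only x and y positions are observed

Q = 1e-4 * np.eye(6)               # assumed process-noise covariance
R = 1e-4 * np.eye(2)               # assumed measurement-noise covariance (~1 cm std)
P = np.eye(6)
x = np.zeros((6, 1))

def kalman_step(x, P, z):
    # Predict.
    x = F @ x
    P = F @ P @ F.T + Q
    # Update with the position measurement z = [x_meas, y_meas].
    y = z.reshape(2, 1) - H @ x
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ y
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Feed noisy trilateration outputs; the filtered positions form the smooth trajectory.
rng = np.random.default_rng(0)
for k in range(100):
    true_pos = np.array([0.10 + 0.002 * k, 0.20 + 0.001 * k])   # toy motion
    z = true_pos + 0.01 * rng.standard_normal(2)
    x, P = kalman_step(x, P, z)
print("filtered position:", x[0, 0], x[3, 0])
```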
- A neural network, such as an LSTM neural network (e.g., a unidirectional or bidirectional LSTM), is trained, during a training phase, with a collection of gestures of an object (e.g., a metal marker, hand, or finger) moving in monitoring space 608.
- The LSTM neural network is then used to associate (e.g., assign a label to) the smoothened trajectories generated by the Kalman filter with character symbols, such as Latin characters, where the smoothened trajectories correspond to gestures performed by an object in monitoring space 608 during normal operation.
- LSTM neural networks may be understood as recurrent networks that include a memory to model temporal dependencies.
- In a recurrent neural network (RNN), the activation of a neuron is fed back to itself with a weight and a unit time delay, which provides it with a memory (hidden value) of past activations and allows it to learn the temporal dynamics of sequential data.
- $h_t^l = \Phi\left(W_{xh}^l a_t^l + h_{t-1}^l W_{hh}^l + b_h^l\right)$ (27)
- where $\Phi$ is the non-linear activation function,
- $b_h^l$ is the hidden bias vector,
- and the W terms denote weight matrices, $W_{xh}^l$ being the input-hidden weight matrix and $W_{hh}^l$ the hidden-hidden weight matrix.
- LSTMs extend RNNs with memory cells by using gating.
- Gating may be understood as a mechanism based on component-wise multiplication of inputs that control the behavior of each individual memory cell of the LSTM.
- the LSTM updates its cell state according to an activation of the gates.
- the input provided to an LSTM is fed into different gates that control which operation is performed on the cell memory.
- the operations include write (input gate), read (output gate) and reset (forget gate).
- the vectorial representation (vectors denoting all units in a layer) of the update of an LSTM layer may be given by
- $i_t = \sigma_i\left(W_{ai} a_t + W_{hi} h_{t-1} + W_{ci} c_{t-1} + b_i\right)$
- $f_t = \sigma_f\left(W_{af} a_t + W_{hf} h_{t-1} + W_{cf} c_{t-1} + b_f\right)$
- $c_t = f_t \odot c_{t-1} + i_t \odot \sigma_c\left(W_{ac} a_t + W_{hc} h_{t-1} + b_c\right)$
- $o_t = \sigma_o\left(W_{ao} a_t + W_{ho} h_{t-1} + W_{co} c_t + b_o\right)$
- $h_t = o_t \odot \sigma_h\left(c_t\right)$
- Terms $\sigma$ represent non-linear functions (e.g., a sigmoid function).
- the term $a_t$ is the input of the memory cell at time t and, in some embodiments, may be the location output at time k from the Kalman filter.
- $W_{ai}$, $W_{hi}$, $W_{ci}$, $W_{af}$, $W_{hf}$, $W_{cf}$, $W_{ac}$, $W_{hc}$, $W_{ho}$, and $W_{co}$ are weight matrices, where subscripts represent from-to relationships, and terms $b_i$, $b_f$, $b_c$, and $b_o$ are bias vectors.
- the input weights may be the input states to the LSTM and are determined during the training phase.
- the inputs of the LSTM are the Kalman-filtered states of the variables $x$, $y$, $\dot{x}$, $\dot{y}$, $\ddot{x}$, $\ddot{y}$ of Equation 17.
- the loss function of the LSTM may be the loss determined at a single sequence step, which may be the average of the log loss determined separately for each label, and may be given by
- where $\hat{y}$ is the predicted class probability or label probability,
- y is the true class or label,
- and L is the number of possible classes or labels.
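- Written out, an average of per-label log losses of this kind takes the standard cross-entropy form $\text{loss} = -\frac{1}{L}\sum_{j=1}^{L} y_j \log \hat{y}_j$, which is assumed here as an expression consistent with the description above rather than quoted from the patent.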
- FIG. 9 shows a schematic diagram of LSTM network 900 , according to an embodiment of the present invention.
- LSTM network 900 includes a plurality of cells (cell 1, cell 2, ..., cell T).
- Each cell receives data x_i (e.g., location coordinates at each frame of the target) from the output of the Kalman filter. For example, if the Kalman filter generates 100 location estimates over time (where the 100 location estimates are associated with a trajectory of an object in monitoring space 608), then T is equal to 100 and each cell receives a corresponding location estimate x_i.
- LSTM network 900 then associates the trajectory received from the Kalman filter to a label, such as a Latin character symbol or numerical symbol.
- FIG. 10 illustrates trajectories of a metal marker in monitoring space 608 and the corresponding labels generated by LSTM network 900 , according to an embodiment of the present invention.
- the x and y axes illustrate distances in meters with respect to origin (0,0).
- the origin (0,0) corresponds to the location of one of the millimeter-wave radars (e.g., millimeter-wave radar 602). Other reference locations may also be used.
- Although a metal marker was used to generate the trajectories illustrated in FIG. 10, other objects, such as a finger of hand 114, may be used.
- monitoring space 608 may be located between 0.2 m and 0.6 m from one or more of the millimeter-wave radars.
- a closest edge of monitoring space 608 may be closer to one or more of the millimeter-wave radars (e.g., 0.15 m or less), or further (such as 0.5 m or more).
- a furthest edge of monitoring space 608 may be closer than 0.6 m (such as 0.5 m or less) or further than 0.6 m (such as 0.7 m or more).
- the trajectory of an object is a depiction of a character symbol, such as shown in FIG. 10 .
- the neural network, such as LSTM network 900, maps such depiction to a character symbol.
- an approximate depiction of a character symbol, a shorthand depiction of a character symbol, or an arbitrary gesture may be mapped to a character symbol.
- a trajectory of a straight line may be associated with the character symbol a, for example.
- Some embodiments may implement the LSTM network in other ways known in the art.
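- A generic sequence classifier of this kind might look like the PyTorch sketch below. The feature size of 6 assumes the Kalman state $x, y, \dot{x}, \dot{y}, \ddot{x}, \ddot{y}$, the class count of 36 (letters plus digits) is hypothetical, and the model is untrained, so it only illustrates the data flow rather than the patent's trained network.

```python
import torch
import torch.nn as nn

class AirWritingLSTM(nn.Module):
    def __init__(self, num_classes=36, feat_dim=6, hidden=64):
        super().__init__()
        # Bidirectional LSTM over the sequence of Kalman-filtered states.
        self.lstm = nn.LSTM(feat_dim, hidden, batch_first=True, bidirectional=True)
        self.fc = nn.Linear(2 * hidden, num_classes)

    def forward(self, seq):                 # seq: (batch, T, feat_dim)
        out, _ = self.lstm(seq)
        return self.fc(out[:, -1, :])       # classify from the final time step

model = AirWritingLSTM()
traj = torch.randn(1, 100, 6)               # e.g. 100 frames of filtered states
logits = model(traj)                         # (1, 36): one score per character symbol
print(logits.argmax(dim=-1))
```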
- FIG. 11 shows a flowchart of embodiment method 1100 for character recognition in air writing, according to an embodiment of the present invention.
- Method 1100 includes location extraction stage 1101 and recognition stage 1111 .
- millimeter-wave radars 602, 604, and 606 send and receive radar signals towards monitoring space 608 during steps 1102_602, 1102_604, and 1102_606, respectively.
- the received radar signals may be filtered and digitized using, e.g., LPF 218 and ADC 220 .
- Background removal and filtering are performed, e.g., by respective processors 104.
- The digitized radar data generated during steps 1102_602, 1102_604, and 1102_606 may be filtered, and DC components may be removed, e.g., to remove transmitter-receiver self-interference and optionally to pre-filter the interference colored noise.
- filtering includes removing data outliers that have significantly different values from other neighboring range-gate measurements. Thus, this filtering also serves to remove background noise from the radar data.
- a Hampel filter is applied with a sliding window at each range-gate to remove such outliers.
- other filtering for range preprocessing known in the art may be used.
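- A minimal Hampel-style outlier filter over a per-range-gate time series might look like the sketch below; the window size and threshold are assumed values, not the patent's configuration.

```python
import numpy as np

def hampel_filter(x, window=5, n_sigmas=3.0):
    """Replace outliers with the local median using a sliding window."""
    x = np.asarray(x, dtype=float).copy()
    k = 1.4826                      # scale factor relating MAD to std for Gaussian data
    for i in range(len(x)):
        lo, hi = max(0, i - window), min(len(x), i + window + 1)
        med = np.median(x[lo:hi])
        mad = k * np.median(np.abs(x[lo:hi] - med))
        if mad > 0 and abs(x[i] - med) > n_sigmas * mad:
            x[i] = med              # outlier: replace with the local median
    return x

noisy = np.array([1.0, 1.1, 0.9, 9.0, 1.0, 1.2, 1.1])   # 9.0 is an outlier
print(hampel_filter(noisy))
```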
- Millimeter-wave radars 602, 604, and 606 detect the object during steps 1104_602, 1104_604, and 1104_606, respectively, based on the radar signals received during steps 1102_602, 1102_604, and 1102_606, respectively, and filtered during steps 1103_602, 1103_604, and 1103_606, respectively.
- An object may be detected by using a range FFT on the radar data and detecting an object when a range bin is higher than a predetermined threshold. Other methods for object detection may also be used.
- The Doppler velocity of a detected object is determined. If the Doppler velocity is within a predetermined Doppler velocity range (e.g., a velocity range of typical human hand movements), millimeter-wave radars 602, 604, and 606 may select such object for further processing. If the Doppler velocity is outside the predetermined Doppler velocity range, such as too slow (e.g., static) or too fast (e.g., 10 m/s), the object may be ignored and not selected for further processing. In some embodiments, other attributes, such as the size of the detected object, may be used instead of, or in addition to, Doppler velocity to select an object for further processing.
- millimeter-wave radars 602 , 604 and 606 respectively generate fine range estimates of the distance towards the detected and selected object (e.g., a tip of a finger of hand 114 ) using, e.g., method 300 .
- processor 702 determines the location of the object in monitoring space 608 using trilateration, such as explained with respect to Equations 9 to 16.
- Processor 702 also generates traces of the determined location over time (e.g., by storing each determined location in local memory, for example).
- Processor 702 smoothens the trajectory (traces) generated during step 1107 and tracks the object over time, e.g., by using a Kalman filter, such as by using method 800. In some embodiments, other smoothening filters may be used.
- processor 702 associates the trajectory generated by the Kalman filter in step 1108 with a character symbol using an LSTM network, such as LSTM network 900 , and generates an output (label) based on the association.
- The neural network used in step 1112 is implemented as a unidirectional or bidirectional LSTM network in some embodiments. In other embodiments, other types of neural networks may be used.
- A convolutional neural network (CNN) may be used during step 1112 to generate an output (label) based on the trajectory generated by the Kalman filter in step 1108.
- In such embodiments, a 2D image, similar to the images shown in FIG. 10, is generated during step 1108.
- A CNN then generates, during step 1112, a label based on the 2D image generated in step 1108.
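- One simple way to produce such a 2D image from a filtered trajectory is to rasterize the (x, y) points onto a fixed grid, as in the sketch below (the 28x28 grid size and the toy trajectory are assumptions); a standard CNN classifier can then be trained on the resulting images.

```python
import numpy as np

def rasterize_trajectory(points, grid=28):
    """Map a sequence of (x, y) positions onto a grid x grid binary image."""
    pts = np.asarray(points, dtype=float)
    mins, maxs = pts.min(axis=0), pts.max(axis=0)
    span = np.maximum(maxs - mins, 1e-6)            # avoid division by zero
    norm = (pts - mins) / span                      # normalize into [0, 1]
    idx = np.clip((norm * (grid - 1)).astype(int), 0, grid - 1)
    img = np.zeros((grid, grid), dtype=np.float32)
    img[grid - 1 - idx[:, 1], idx[:, 0]] = 1.0      # flip y so "up" points up
    return img

# Toy example: a roughly "L"-shaped trajectory in meters.
traj = [(0.00, 0.30), (0.00, 0.20), (0.00, 0.10), (0.00, 0.00), (0.10, 0.00), (0.20, 0.00)]
print(rasterize_trajectory(traj).sum())             # number of grid cells touched
```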
- processor 702 determines the beginning and end of a trajectory based on fixed boundaries to generate a segment of trajectory. Processor 702 then feeds such segment of trajectory to LSTM network 900 for classification purposes.
- FIG. 12 shows a flowchart of embodiment method 1200 for character recognition in air writing using framewise classification, according to an embodiment of the present invention.
- network of millimeter-wave radars 600 detects an object, such as a tip of a finger of hand 114, entering monitoring space 608.
- a smoothened trajectory of the object in monitoring space 608 is generated for a fixed number of frames, such as 100 frames during 2 seconds.
- the smoothened trajectory may be obtained as described in step 1101 of FIG. 11, for example.
- a different number of frames, such as 128, 200, 96, 64, or other may be used.
- a different duration, such as 1.5 seconds, 2.5 seconds, or other may also be used.
- the LSTM network receives the fixed number of frames and generates an output, as described with respect to step 1112 of FIG. 11 , for example.
- FIG. 13 shows a flowchart of embodiment method 1300 for character recognition in air writing using framewise classification, according to an embodiment of the present invention.
- a smoothened trajectory of the object in monitoring space 608 is generated for a bounded number of frames.
- the bound may be determined, for example, by detecting that the object exits monitoring space 608.
- the number of frames in such a bounded segment may be different for each trajectory.
- the LSTM network receives the bounded number of frames and generates an output, as described with respect to step 1112 of FIG. 11 , for example.
- processor 702 uses a connectionist temporal classification (CTC) layer to process an unsegmented stream of data from the output of LSTM network 900, where the unsegmented stream of data is based on an unsegmented stream of locations of the object over time.
- the output of LSTM network 900 is transformed into a conditional probability distribution over a label sequence. The total probability of any one label sequence may be found by summing the probabilities of different alignments.
- Processor 702 then segments the trajectory based on the output of the CTC layer and associates the segmented trajectory with a character symbol.
- the general operation of a CTC layer is known in the art.
- FIG. 14 shows a flowchart of embodiment method 1400 for character recognition in air writing using CTC classification, according to an embodiment of the present invention.
- a smoothened trajectory of the object in monitoring space 608 is generated continuously and is fed to an LSTM network with a CTC layer.
- the LSTM network with the CTC layer finds, in real time, the frame alignment with the highest probability based on the possible LSTM labels (e.g., the set of possible character symbols plus a blank), segments the frames in accordance with such alignment, and outputs the associated character symbol that corresponds to the segmented trajectory.
- processor 702 is advantageously capable of generating a continuous sequence of character symbols based on the trajectory of an object in monitoring space 608 without having to use predetermined time boundaries.
- processor 702 is advantageously capable of accurately associating character symbols with object trajectory irrespective of whether the object is moving fast or slow in monitoring space 608 .
- Some embodiments may implement the LSTM network with CTC in other ways known in the art.
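- A generic LSTM-with-CTC training step in PyTorch might look like the sketch below; the blank index, alphabet size, sequence length, and label values are assumptions rather than the patent's configuration, and greedy or beam-search decoding over the per-frame label probabilities would be used at inference time.

```python
import torch
import torch.nn as nn

num_classes = 36 + 1                 # e.g. letters and digits plus the CTC blank (index 0)
lstm = nn.LSTM(input_size=6, hidden_size=64, batch_first=True, bidirectional=True)
fc = nn.Linear(128, num_classes)
ctc_loss = nn.CTCLoss(blank=0)

seq = torch.randn(1, 300, 6)                      # unsegmented stream of filtered states
out, _ = lstm(seq)
log_probs = fc(out).log_softmax(dim=-1)           # (batch, T, classes)
log_probs = log_probs.permute(1, 0, 2)            # nn.CTCLoss expects (T, batch, classes)

targets = torch.tensor([[5, 12, 3]])              # hypothetical character-label sequence
input_lengths = torch.tensor([300])
target_lengths = torch.tensor([3])
loss = ctc_loss(log_probs, targets, input_lengths, target_lengths)
loss.backward()                                   # train end to end on labeled gestures
print(loss.item())
```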
- processor 702 may begin monitoring an object when the object enters monitoring space 608 and continuously generate character symbols using an LSTM with CTC for a predetermined number of frames (e.g., 500 or more) or until the object exits monitoring space 608 .
- FIG. 15 shows smartphone 1502 having network of millimeter-wave radars 600 for character recognition in air writing, according to an embodiment of the present invention.
- Network of millimeter-wave radars 600 may implement, e.g., any of methods 1100 , 1200 , 1300 , and 1400 for character recognition in air writing.
- smartphone 1502 may have a millimeter-wave radar in three of the four corners of smartphone 1502.
- smartphone 1502 may have more than three radars, such as four radars (e.g., one in each corner), or more.
- the millimeter-wave radars may be located in other locations, such as in an edge of screen 1504 .
- the outer edge of monitoring space 608 that is closest to screen 1504 is at 10 cm from screen 1504 . In some embodiments, the distance between the outer edge of monitoring space 608 that is closest to screen 1504 and screen 1504 is smaller than 10 cm, such as 9 cm, 8 cm, or smaller. In some embodiments, the distance between the outer edge of monitoring space 608 that is closest to screen 1504 and screen 1504 is larger than 10 cm, such as 11 cm, 14 cm, or larger.
- Screen 1504 may be, for example, a 5.2 inch screen. In some embodiments, smaller screens, such as 5.1 inch, 5 inch, 4.7 inch or smaller may be used. Larger screens, such as 5.3 inch, 5.5 inch or larger may be used.
- other devices such as devices larger than smartphone 1502 may implement network of millimeter-wave radars 600 for character recognition in air writing.
- devices larger than smartphone 1502 may implement network of millimeter-wave radars 600 for character recognition in air writing.
- tablets, computer monitors, or TVs may implement network of millimeter-wave radars 600 in a similar manner as smartphone 1502 .
- FIG. 16 shows computer monitor 1602 having network of millimeter-wave radars 600 for character recognition in air writing, according to an embodiment of the present invention.
- computer monitor 1602 may have a millimeter-wave radar in three of the four corners of computer monitor 1602.
- computer monitor 1602 may have more than three radars, such as four radars (e.g., one in each corner), or more.
- the millimeter-wave radars may be located in other locations, such as in an edge of screen 1604 .
- the outer edge of monitoring space 608 that is closest to screen 1604 is at 20 cm from screen 1604. In some embodiments, this distance is smaller than 20 cm, such as 18 cm, 15 cm, or smaller; in other embodiments, it is larger than 20 cm, such as 25 cm, 30 cm, or larger.
- Screen 1604 may be, for example, a 27 inch screen. In some embodiments, smaller screens, such as 24 inch, 21 inch, 20 inch, or smaller, may be used; larger screens, such as 34 inch or larger, may also be used.
- a method for air-writing character recognition including: determining a position of an object in a monitoring space using trilateration by using a plurality of millimeter-wave radars, where each millimeter-wave radar of the plurality of millimeter-wave radars has a field of view, and where an intersection of the fields of view of the plurality of millimeter-wave radars forms the monitoring space; tracking the position of the object in the monitoring space over time using the plurality of millimeter-wave radars; determining a character symbol depicted by the tracked position of the object over time using a neural network (NN); and providing a signal based on the determined character symbol.
- NN neural network
- NN includes a recurrent NN (RNN).
- CTC connectionist temporal classification
- NN includes a convolutional NN (CNN).
- tracking the position of the object includes: determining a trajectory of the object based on multiple determinations of the position of the object over time; and using a smoothing filter to smooth the determined trajectory.
- An air-writing character recognition system including: a plurality of millimeter-wave radars, where each millimeter-wave radar of the plurality of millimeter-wave radars is configured to have a field of view, and where an intersection of the fields of view of the plurality of millimeter-wave radars forms a monitoring space; and a controller configured to: determine a position of an object in the monitoring space based on outputs of the plurality of millimeter-wave radars by using trilateration, track the position of the object in the monitoring space over time based on the determined position using the plurality of millimeter-wave radars, determine a character symbol depicted by the tracked position of the object over time using a neural network (NN), and provide a signal based on the determined character symbol.
- NN neural network
- CTC connectionist temporal classification
- LSTM long-short term memory
- a millimeter-wave radar system including: three millimeter-wave radars, each of the three millimeter-wave radars configured to have a field of view, where an intersection of the fields of view of each of the three millimeter-wave radars forms a monitoring space; and a controller configured to: determine a position of an object in the monitoring space based on output of the three millimeter-wave radars by using trilateration, determine a trajectory of the object over time based on the determined position of the object, apply a filter to the determined trajectory to generate a filtered trajectory, determine a character symbol depicted by the filtered trajectory using a long-short term memory (LSTM), and provide a signal based on the determined character symbol.
- LSTM long-short term memory
- a smartphone includes a millimeter-wave radar system and a screen.
- the millimeter-wave radar system includes three millimeter-wave radars, each located at a corner of the smartphone. Each of the three millimeter-wave radars is configured to have a field of view, where an intersection of the fields of view of the three millimeter-wave radars forms a monitoring space.
- the smartphone further includes a controller configured to: determine a position of an object in the monitoring space based on output of the three millimeter-wave radars by using trilateration, determine a trajectory of the object over time based on the determined position of the object, apply a filter to the determined trajectory to generate a filtered trajectory, determine a character symbol depicted by the filtered trajectory using a long-short term memory (LSTM), and provide a signal based on the determined character symbol.
- LSTM long-short term memory
- a computer monitor includes a millimeter-wave radar system and a screen.
- the millimeter-wave radar system includes three millimeter-wave radars, each located at a corner of the computer monitor. Each of the three millimeter-wave radars is configured to have a field of view, where an intersection of the fields of view of the three millimeter-wave radars forms a monitoring space.
- the computer monitor further includes a controller configured to: determine a position of an object in the monitoring space based on output of the three millimeter-wave radars by using trilateration, determine a trajectory of the object over time based on the determined position of the object, apply a filter to the determined trajectory to generate a filtered trajectory, determine a character symbol depicted by the filtered trajectory using a long-short term memory (LSTM), and provide a signal based on the determined character symbol.
- LSTM long-short term memory
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Remote Sensing (AREA)
- Radar, Positioning & Navigation (AREA)
- General Engineering & Computer Science (AREA)
- Evolutionary Computation (AREA)
- Computer Networks & Wireless Communication (AREA)
- General Health & Medical Sciences (AREA)
- Artificial Intelligence (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Software Systems (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Medical Informatics (AREA)
- Life Sciences & Earth Sciences (AREA)
- Biophysics (AREA)
- Biomedical Technology (AREA)
- Computational Linguistics (AREA)
- Molecular Biology (AREA)
- Mathematical Physics (AREA)
- Social Psychology (AREA)
- Psychiatry (AREA)
- Bioinformatics & Cheminformatics (AREA)
- Bioinformatics & Computational Biology (AREA)
- Evolutionary Biology (AREA)
- Radar Systems Or Details Thereof (AREA)
Abstract
Description
The beat signal may be expressed as

x(t) = A e^{j(2π f_b t + φ_b)}

where f_b is the beat frequency, i.e., the frequency difference between the signal transmitted by the transmitting antenna and the signal reflected from the object and received by the receiving antenna, and where B is the ramp bandwidth of the linear chirp transmitted by the transmitting antenna. The beat signal may be demodulated with the estimated beat frequency \hat{f}_b and estimated beat phase \hat{φ}_b according to

\tilde{x}(t) = x(t) · e^{−j(2π \hat{f}_b t + \hat{φ}_b)}

where \tilde{x}(t) is the demodulated signal. Substituting beat signal x(t) into the demodulation expression yields

\tilde{x}(t) = A e^{j[2π(f_b − \hat{f}_b)t + (φ_b − \hat{φ}_b)]}     (5)

or

\tilde{x}(t) = A e^{j[2π(Δf_b)t + Δφ_b]}     (6)

where Δf_b = f_b − \hat{f}_b and Δφ_b = φ_b − \hat{φ}_b.
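As a numerical illustration of how a beat frequency maps to a range for a linear chirp, the sketch below uses the standard FMCW relation R = c·f_b/(2S), with chirp slope S = B/T_c; this relation and the radar parameters in the example are assumptions for illustration, not values from the patent.

```python
# Standard FMCW range-from-beat-frequency relation (assumed, not quoted from the patent):
#   f_b = 2 * B * R / (c * T_c)   =>   R = c * f_b * T_c / (2 * B)
C = 3.0e8  # speed of light, m/s

def range_from_beat_frequency(f_b_hz, bandwidth_hz, chirp_duration_s):
    """Convert a measured beat frequency to a range estimate for a linear chirp."""
    slope = bandwidth_hz / chirp_duration_s   # chirp slope S = B / T_c, in Hz/s
    return C * f_b_hz / (2.0 * slope)         # R = c * f_b / (2 * S)

# Hypothetical radar parameters: 1 GHz ramp bandwidth, 100 us chirp duration.
print(range_from_beat_frequency(f_b_hz=67e3, bandwidth_hz=1e9, chirp_duration_s=100e-6))
# -> approximately 1.0 m
```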
The range estimate based on the beat frequency and the beat phase may be expressed as

R_{f_b,φ_b} = R_{f_b} + ΔR     (8)

where R_{f_b} is the range corresponding to the beat frequency and ΔR is the range correction corresponding to the beat phase. Horizontal distance D_hor may be obtained from the estimated ranges, where h is the distance between millimeter-wave radars; distances d_v1 and d_v2 are obtained from the same geometry.
Collecting the range equations of the millimeter-wave radars in matrix form gives a linear system that can be rewritten as

A x = b     (11)

where x ∈ E and E = {(x_0, x_1, x_2, x_3)^T ∈ ℝ^4 : x_0 = x_1^2 + x_2^2 + x_3^2}. The solutions of Equation 11 may be written as

x = x_p + α x_h     (12)

where x_p is the particular solution of Equation 11, x_h is a solution of the homogeneous system A x = 0, α is a real parameter, x_p = (x_p0, x_p1, x_p2, x_p3)^T, x_h = (x_h0, x_h1, x_h2, x_h3)^T, and x = (x_0, x_1, x_2, x_3)^T. Since x ∈ E, then

x_p0 + α·x_h0 = (x_p1 + α·x_h1)^2 + (x_p2 + α·x_h2)^2 + (x_p3 + α·x_h3)^2     (14)

and thus

α^2 (x_h1^2 + x_h2^2 + x_h3^2) + α (2·x_p1·x_h1 + 2·x_p2·x_h2 + 2·x_p3·x_h3 − x_h0) + x_p1^2 + x_p2^2 + x_p3^2 − x_p0 = 0     (15)

which is a quadratic equation in α whose real roots yield the candidate positions of the object.
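A minimal numerical sketch of this trilateration approach is shown below. It builds A and b from the common lifted formulation of the range equations (x_0 = ‖p‖^2), obtains a particular solution and a homogeneous solution, and resolves α through the quadratic of Equation 15; the exact entries of A and b used in the patent are not reproduced above, so this construction is an assumption.

```python
import numpy as np

def trilaterate(radar_positions, ranges):
    """Locate an object from three radar positions and measured ranges.

    Uses the lifted unknown x = (x0, px, py, pz) with the constraint
    x0 = px^2 + py^2 + pz^2, writes the range equations as A x = b, and
    resolves the free parameter alpha of x = x_p + alpha * x_h through the
    quadratic constraint (cf. Equation 15).
    """
    S = np.asarray(radar_positions, dtype=float)   # shape (3, 3): radar corners
    r = np.asarray(ranges, dtype=float)            # shape (3,): measured ranges

    # ||p - s_i||^2 = r_i^2  ->  x0 - 2 s_i . p = r_i^2 - ||s_i||^2
    A = np.hstack([np.ones((3, 1)), -2.0 * S])     # shape (3, 4)
    b = r**2 - np.sum(S**2, axis=1)                # shape (3,)

    x_p, *_ = np.linalg.lstsq(A, b, rcond=None)    # particular solution of A x = b
    x_h = np.linalg.svd(A)[2][-1]                  # null-space direction (A x_h ~ 0)

    # Constraint x0 = px^2 + py^2 + pz^2 gives the quadratic in alpha (Equation 15).
    a2 = np.dot(x_h[1:], x_h[1:])
    a1 = 2.0 * np.dot(x_p[1:], x_h[1:]) - x_h[0]
    a0 = np.dot(x_p[1:], x_p[1:]) - x_p[0]
    roots = [alpha for alpha in np.roots([a2, a1, a0]) if np.isreal(alpha)]
    return [np.real(x_p[1:] + alpha * x_h[1:]) for alpha in roots]  # caller picks the physical root

# Hypothetical radar corners (meters) and an object at (0.05, 0.10, 0.15):
radars = [(0.0, 0.0, 0.0), (0.07, 0.0, 0.0), (0.0, 0.14, 0.0)]
target = np.array([0.05, 0.10, 0.15])
dists = [np.linalg.norm(target - np.array(s)) for s in radars]
print(trilaterate(radars, dists))   # one candidate matches the target, the other is its mirror
```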
The state vector of the tracking filter may be given by

x = [x  y  ẋ  ẏ  ẍ  ÿ]     (17)

and the 2D location z(x, y) may be estimated by network of millimeter-wave radars 600. The process noise covariance may be given by

Q = G · G^T · ρ_a^2     (20)

where G = [0.5δt^2  0.5δt^2  δt  δt  1  1], ρ_a^2 is an acceleration process noise, and R represents the measurement noise variance on each axis. The Kalman filter prediction and update equations may be given by

x̃_{k+1} = A x_k     (22)

P̃_{k+1} = A P_k A^T + Q     (23)

K_k = P̃_{k+1} H^T (H P̃_{k+1} H^T + R)^{−1}     (24)

x_{k+1} = x̃_{k+1} + K_k (z_k − H x̃_{k+1})     (25)

P_{k+1} = (I − K_k H) P̃_{k+1}     (26)
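The sketch below implements the prediction/update recursion of Equations 22 through 26 for the state of Equation 17, with Q built as in Equation 20. The state-transition matrix A, observation matrix H, time step, and noise values shown are standard constant-acceleration choices assumed for illustration; they are not specified in the text above.

```python
import numpy as np

def make_constant_acceleration_model(dt, accel_noise, meas_noise):
    """State x = [x, y, x', y', x'', y''] as in Equation 17 (A and H assumed)."""
    A = np.eye(6)
    A[0, 2] = A[1, 3] = dt
    A[0, 4] = A[1, 5] = 0.5 * dt**2
    A[2, 4] = A[3, 5] = dt
    G = np.array([0.5 * dt**2, 0.5 * dt**2, dt, dt, 1.0, 1.0]).reshape(-1, 1)
    Q = G @ G.T * accel_noise**2                       # Equation 20
    H = np.zeros((2, 6)); H[0, 0] = H[1, 1] = 1.0      # observe 2D position only
    R = np.eye(2) * meas_noise**2
    return A, Q, H, R

def kalman_step(x, P, z, A, Q, H, R):
    x_pred = A @ x                                              # Equation 22
    P_pred = A @ P @ A.T + Q                                    # Equation 23
    K = P_pred @ H.T @ np.linalg.inv(H @ P_pred @ H.T + R)      # Equation 24
    x_new = x_pred + K @ (z - H @ x_pred)                       # Equation 25
    P_new = (np.eye(6) - K @ H) @ P_pred                        # Equation 26
    return x_new, P_new

# Illustrative run over a few noisy 2D position measurements (assumed values).
A, Q, H, R = make_constant_acceleration_model(dt=0.05, accel_noise=1.0, meas_noise=0.01)
x, P = np.zeros(6), np.eye(6)
for z in np.array([[0.00, 0.00], [0.01, 0.02], [0.02, 0.04], [0.03, 0.06]]):
    x, P = kalman_step(x, P, z, A, Q, H, R)
print(x[:2])   # smoothed 2D position estimate
```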
The hidden value of layer l at time t may be given by

h_t^l = ρ(W_{xh}^l a_t^l + h_{t−1}^l W_{hh}^l + b_h^l)     (27)

where ρ is the non-linear activation function, b_h^l is the hidden bias vector, and the W terms denote weight matrices, W_{xh}^l being the input-hidden weight matrix and W_{hh}^l the hidden-hidden weight matrix. The activation for the recurrent units may be given by

a_t^{l+1} = h_t^l W_{ha}^l + b_a^l     (28)

where W_{ha}^l is the hidden-activation weight matrix and b_a^l is the bias activation vector. In the LSTM gate equations, i, f, o, and c are respectively the input gate, forget gate, output gate, and cell activation vectors, all of which may have the same size as vector h, which represents the hidden value. The terms σ represent non-linear functions (e.g., a sigmoid function). The term a_t is the input of the memory cell at time t and, in some embodiments, may be the location output at time k from the Kalman filter. W_{ai}, W_{hi}, W_{ci}, W_{af}, W_{hf}, W_{cf}, W_{ac}, W_{hc}, W_{ho}, and W_{co} are weight matrices, where subscripts represent from-to relationships, and terms b_i, b_f, b_c, and b_o are bias vectors.
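Because the gate equations themselves are only summarized in prose above, the sketch below uses the standard peephole LSTM cell update as an assumed stand-in, reusing the weight naming of the description (plus an input-output matrix W_ao that the standard formulation requires); the weight values and inputs are illustrative only.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(a_t, h_prev, c_prev, W, b):
    """One peephole LSTM step, using the gate/weight naming of the description.

    a_t: input at time t (e.g., the Kalman-filtered location), shape (n_in,)
    W:   dict of weight matrices keyed by from-to name, e.g. W['ai'], W['hi'], ...
         (peephole weights W['ci'], W['cf'], W['co'] are diagonal, stored as vectors)
    b:   dict of bias vectors b['i'], b['f'], b['c'], b['o']
    """
    i = sigmoid(a_t @ W['ai'] + h_prev @ W['hi'] + c_prev * W['ci'] + b['i'])   # input gate
    f = sigmoid(a_t @ W['af'] + h_prev @ W['hf'] + c_prev * W['cf'] + b['f'])   # forget gate
    c = f * c_prev + i * np.tanh(a_t @ W['ac'] + h_prev @ W['hc'] + b['c'])     # cell activation
    o = sigmoid(a_t @ W['ao'] + h_prev @ W['ho'] + c * W['co'] + b['o'])        # output gate
    h = o * np.tanh(c)                                                          # hidden value
    return h, c

# Illustrative shapes: 2D location input, hidden size 8, random (untrained) weights.
rng = np.random.default_rng(0)
n_in, n_h = 2, 8
W = {k: rng.normal(scale=0.1, size=(n_in if k[0] == 'a' else n_h, n_h))
     for k in ('ai', 'hi', 'af', 'hf', 'ac', 'hc', 'ao', 'ho')}
W.update({k: rng.normal(scale=0.1, size=n_h) for k in ('ci', 'cf', 'co')})   # peephole weights
b = {k: np.zeros(n_h) for k in ('i', 'f', 'c', 'o')}
h, c = np.zeros(n_h), np.zeros(n_h)
for location in np.array([[0.00, 0.00], [0.01, 0.02], [0.02, 0.04]]):        # Kalman outputs (assumed)
    h, c = lstm_step(location, h, c, W, b)
print(h.shape)   # (8,)
```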
Claims (23)
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/360,284 US11126885B2 (en) | 2019-03-21 | 2019-03-21 | Character recognition in air-writing based on network of radars |
EP20164180.0A EP3712809A1 (en) | 2019-03-21 | 2020-03-19 | Character recognition in air-writing based on network of radars |
CN202010206608.5A CN111722706B (en) | 2019-03-21 | 2020-03-23 | Method and system for identifying aerial writing characters based on radar network |
US17/392,829 US11686815B2 (en) | 2019-03-21 | 2021-08-03 | Character recognition in air-writing based on network of radars |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US16/360,284 US11126885B2 (en) | 2019-03-21 | 2019-03-21 | Character recognition in air-writing based on network of radars |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/392,829 Continuation US11686815B2 (en) | 2019-03-21 | 2021-08-03 | Character recognition in air-writing based on network of radars |
Publications (2)
Publication Number | Publication Date |
---|---|
US20200302210A1 US20200302210A1 (en) | 2020-09-24 |
US11126885B2 true US11126885B2 (en) | 2021-09-21 |
Family
ID=69846313
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/360,284 Active 2039-12-17 US11126885B2 (en) | 2019-03-21 | 2019-03-21 | Character recognition in air-writing based on network of radars |
US17/392,829 Active US11686815B2 (en) | 2019-03-21 | 2021-08-03 | Character recognition in air-writing based on network of radars |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/392,829 Active US11686815B2 (en) | 2019-03-21 | 2021-08-03 | Character recognition in air-writing based on network of radars |
Country Status (3)
Country | Link |
---|---|
US (2) | US11126885B2 (en) |
EP (1) | EP3712809A1 (en) |
CN (1) | CN111722706B (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220083075A1 (en) * | 2020-09-15 | 2022-03-17 | Infineon Technologies Ag | Robot Guiding System and Method |
US20230003867A1 (en) * | 2021-06-30 | 2023-01-05 | Apple Inc. | Electronic Devices with Low Signal-to-Noise Ratio Range Measurement Capabilities |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11263409B2 (en) * | 2017-11-03 | 2022-03-01 | Board Of Trustees Of Michigan State University | System and apparatus for non-intrusive word and sentence level sign language translation |
US11301672B2 (en) * | 2019-04-16 | 2022-04-12 | The Board Of Trustees Of The University Of Alabama | Radar-based methods and apparatus for communication and interpretation of sign languages |
EP3977184A4 (en) * | 2019-05-24 | 2023-05-17 | 3M Innovative Properties Company | Radar retroreflective article |
WO2021086688A2 (en) * | 2019-10-30 | 2021-05-06 | Google Llc | Smart-device-based radar system performing gesture recognition using a space time neural network |
US12093802B2 (en) * | 2020-10-20 | 2024-09-17 | International Business Machines Corporation | Gated unit for a gated recurrent neural network |
KR20220063862A (en) * | 2020-11-10 | 2022-05-18 | 삼성전자주식회사 | An electronic apparatus and a method of operating the electronic apparatus |
CN112198966B (en) * | 2020-12-08 | 2021-03-16 | 中南大学 | Stroke identification method and system based on FMCW radar system |
CN114661142B (en) * | 2020-12-22 | 2024-08-27 | 华为技术有限公司 | A gesture recognition method and device |
CN117321604A (en) * | 2021-04-09 | 2023-12-29 | 谷歌有限责任公司 | Radar-based gesture detection in a surrounding computer environment using a machine learning module |
WO2022217537A1 (en) * | 2021-04-15 | 2022-10-20 | 刘珲 | Smart door and non-contact control method therefor |
CN113253255A (en) * | 2021-05-11 | 2021-08-13 | 浙江大学 | Multi-point multi-sensor target monitoring system and method |
CN113139349B (en) * | 2021-05-12 | 2022-11-29 | 江西师范大学 | Method, device and equipment for removing atmospheric noise in InSAR time sequence |
CN113720862B (en) * | 2021-08-17 | 2023-01-13 | 珠海格力电器股份有限公司 | Part abnormality detection method, device, equipment and storage medium |
US12235341B2 (en) * | 2021-12-06 | 2025-02-25 | Microsoft Technology Licensing, Llc | Radar tracking with greater than range resolution precision |
CN114863433A (en) * | 2022-03-31 | 2022-08-05 | 北京科技大学 | An online assistant teaching system with real-time handwriting interaction |
Citations (151)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4241347A (en) | 1978-06-28 | 1980-12-23 | International Telephone And Telegraph Corporation | PRC/FM CW Radar system |
GB2247799A (en) | 1990-09-04 | 1992-03-11 | Gec Ferranti Defence Syst | Radar based navigation aid |
US6147572A (en) | 1998-07-15 | 2000-11-14 | Lucent Technologies, Inc. | Filter including a microstrip antenna and a frequency selective surface |
JP2001174539A (en) | 1999-10-06 | 2001-06-29 | Nec Corp | Multi-beam ladar device |
US20030179127A1 (en) | 2000-07-28 | 2003-09-25 | Hans-Theo Wienand | People counter |
US6636174B2 (en) | 2000-06-06 | 2003-10-21 | Altratek Inc. | System and method for detection and tracking of targets |
CN1463161A (en) | 2002-06-01 | 2003-12-24 | 三星电子株式会社 | Device and method of correcting image movement |
JP2004198312A (en) | 2002-12-19 | 2004-07-15 | Mitsubishi Electric Corp | Radar equipment |
US20040238857A1 (en) | 2001-08-28 | 2004-12-02 | Tessera, Inc. | High frequency chip packages with connecting elements |
CN1716695A (en) | 2004-06-30 | 2006-01-04 | 国际商业机器公司 | Apparatus and methods for constructing and packaging printed antenna devices |
US20060049995A1 (en) | 2004-09-01 | 2006-03-09 | Toshikazu Imaoka | Integrated antenna type circuit apparatus |
US20060067456A1 (en) | 2004-09-27 | 2006-03-30 | Point Grey Research Inc. | People counting systems and methods |
US7048973B2 (en) | 2001-08-08 | 2006-05-23 | Mitsubishi Heavy Industries, Ltd. | Metal film vapor phase deposition method and vapor phase deposition apparatus |
US7057564B2 (en) | 2004-08-31 | 2006-06-06 | Freescale Semiconductor, Inc. | Multilayer cavity slot antenna |
JP2006234513A (en) | 2005-02-23 | 2006-09-07 | Toyota Central Res & Dev Lab Inc | Obstacle detection device |
WO2007060069A1 (en) | 2005-11-28 | 2007-05-31 | Nokia Siemens Networks Gmbh & Co. Kg | Method and arrangement for calibrating transmit paths of an antenna array |
US20070210959A1 (en) | 2006-03-07 | 2007-09-13 | Massachusetts Institute Of Technology | Multi-beam tile array module for phased array systems |
US7317417B2 (en) | 2004-07-12 | 2008-01-08 | Orhan Arikan | Methods for detection and tracking of targets |
JP2008029025A (en) | 2002-06-27 | 2008-02-07 | Harris Corp | Highly efficient resonance line |
JP2008089614A (en) | 2007-12-27 | 2008-04-17 | Hitachi Ltd | Radar sensor |
US20080106460A1 (en) | 2006-06-01 | 2008-05-08 | James Lynn Kurtz | Radar microsensor for detection, tracking, and classification |
US20080238759A1 (en) | 2007-03-30 | 2008-10-02 | Honeywell International Inc. | Integrated distance measuring equipment and transponder system and method |
US20080291115A1 (en) | 2007-05-22 | 2008-11-27 | Sibeam, Inc. | Surface mountable integrated circuit packaging scheme |
US20080308917A1 (en) | 2007-06-13 | 2008-12-18 | Infineon Technologies Ag | Embedded chip package |
US20090073026A1 (en) | 2007-09-18 | 2009-03-19 | Mitsubishi Electric Corporation | Radar apparatus |
US20090085815A1 (en) | 2007-09-27 | 2009-04-02 | Jakab Andrew J | Tightly-coupled pcb gnss circuit and manufacturing method |
KR20090063166A (en) | 2007-12-12 | 2009-06-17 | 브로드콤 코포레이션 | Method and system for phased array antenna embedded in integrated circuit package |
CN101490578A (en) | 2006-07-13 | 2009-07-22 | 罗伯特·博世有限公司 | FMCW-radarsensor |
US7596241B2 (en) | 2005-06-30 | 2009-09-29 | General Electric Company | System and method for automatic person counting and detection of specific events |
CN101585361A (en) | 2009-05-25 | 2009-11-25 | 郭文艺 | Anti-collision and lane-departure-prevention control device for automobile |
DE102008054570A1 (en) | 2008-12-12 | 2010-06-17 | Robert Bosch Gmbh | FMCW radar sensor for motor vehicles |
US20100207805A1 (en) | 2009-02-19 | 2010-08-19 | Agd Systems Limited | Obtaining an indication of a number of moving objects passing speed detection apparatus |
US20100313150A1 (en) * | 2009-06-03 | 2010-12-09 | Microsoft Corporation | Separable displays and composable surfaces |
US7873326B2 (en) | 2006-07-11 | 2011-01-18 | Mojix, Inc. | RFID beam forming system |
US7889147B2 (en) | 2007-02-23 | 2011-02-15 | Northrop Grumman Systems Corporation | Modular active phased array |
US20110181509A1 (en) | 2010-01-26 | 2011-07-28 | Nokia Corporation | Gesture Control |
US20110254765A1 (en) * | 2010-04-18 | 2011-10-20 | Primesense Ltd. | Remote text input using handwriting |
JP2011529181A (en) | 2008-07-24 | 2011-12-01 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Distance measurement |
US20110299433A1 (en) | 2010-06-03 | 2011-12-08 | Broadcom Corporation | Rf front-end module |
DE102011100907A1 (en) | 2011-05-09 | 2012-01-12 | Daimler Ag | Device for determining road condition of roadway located in front of vehicle, comprises imaging sensor for detecting road surface, and evaluation unit for evaluation of image data |
US20120087230A1 (en) | 2009-06-17 | 2012-04-12 | Telefonaktiebolaget Lm Ericsson (Publ) | Method For Antenna Calibration In A Wideband Communication System |
US20120092284A1 (en) * | 2010-09-30 | 2012-04-19 | Broadcom Corporation | Portable computing device including a three-dimensional touch screen |
US20120116231A1 (en) | 2009-05-18 | 2012-05-10 | Perception Digital Limited | Earplug-type earphone |
JP2012112861A (en) | 2010-11-26 | 2012-06-14 | Fujitsu Ltd | Fm-cw radar device and pairing method |
US8228382B2 (en) | 2005-11-05 | 2012-07-24 | Ram Pattikonda | System and method for counting people |
US20120195161A1 (en) | 2005-05-03 | 2012-08-02 | Sonosite, Inc. | Systems and methods for ultrasound beam forming data control |
US20120206339A1 (en) | 2009-07-07 | 2012-08-16 | Elliptic Laboratories As | Control using movements |
US20120265486A1 (en) | 2009-12-23 | 2012-10-18 | Endress + Hauser Gmbh + Co. Kg | Method for ascertaining and monitoring fill level of a medium in a container with a travel time measuring method |
US20120268314A1 (en) | 2011-02-11 | 2012-10-25 | Honda Elesys Co., Ltd. | Multibeam radar apparatus for vehicle, multibeam radar method, and multibeam radar program |
US20120280900A1 (en) | 2011-05-06 | 2012-11-08 | Nokia Corporation | Gesture recognition using plural sensors |
DE102011075725A1 (en) | 2011-05-12 | 2012-11-15 | Robert Bosch Gmbh | Method for recognizing gestures |
CN102788969A (en) | 2012-07-04 | 2012-11-21 | 中国人民解放军海军航空工程学院 | Sea surface micromotion target detection and feature extraction method based on short-time fractional Fourier transform |
US20120313900A1 (en) * | 2009-10-07 | 2012-12-13 | Elliptic Laboratories As | User interfaces |
US20120326995A1 (en) * | 2011-06-24 | 2012-12-27 | Ricoh Company, Ltd. | Virtual touch panel system and interactive mode auto-switching method |
WO2013009473A2 (en) | 2011-07-12 | 2013-01-17 | Sensormatic Electronics, LLC | Method and system for people counting using passive infrared detectors |
US20130027240A1 (en) | 2010-03-05 | 2013-01-31 | Sazzadur Chowdhury | Radar system and method of manufacturing same |
CN102967854A (en) | 2012-12-07 | 2013-03-13 | 中国人民解放军海军航空工程学院 | Multi-fractal detection method of targets in FRFT (Fractional Fourier Transformation) domain sea clutter |
US20130106673A1 (en) | 2011-10-20 | 2013-05-02 | Waveconnex, Inc. | Low-profile wireless connectors |
CN103529444A (en) | 2013-09-27 | 2014-01-22 | 安徽师范大学 | A vehicle-mounted millimeter-wave radar moving target recognizer and recognition method |
US20140028542A1 (en) | 2012-07-30 | 2014-01-30 | Microsoft Corporation | Interaction with Devices Based on User State |
US8659369B2 (en) | 2007-12-13 | 2014-02-25 | Broadcom Corporation | IC package with embedded filters |
US20140070994A1 (en) | 2012-09-13 | 2014-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | 3d short range detection with phased array radar |
US8731502B2 (en) | 2009-02-13 | 2014-05-20 | Freescale Semiconductor, Inc. | Integrated circuit comprising frequency generation circuitry for controlling a frequency source |
US20140145883A1 (en) | 2012-11-26 | 2014-05-29 | International Business Machines Corporation | Millimeter-wave radio frequency integrated circuit packages with integrated antennas |
US8836596B2 (en) | 2013-01-15 | 2014-09-16 | Cubic Corporation | Filter antenna |
US8860532B2 (en) | 2011-05-20 | 2014-10-14 | University Of Central Florida Research Foundation, Inc. | Integrated cavity filter/antenna system |
US20140324888A1 (en) | 2011-12-09 | 2014-10-30 | Nokia Corporation | Method and Apparatus for Identifying a Gesture Based Upon Fusion of Multiple Sensor Signals |
CN203950036U (en) | 2014-03-12 | 2014-11-19 | 肖令军 | A kind of have people's depopulated helicopter anti-collision system based on millimetre-wave radar range finding |
US20150185316A1 (en) | 2013-10-25 | 2015-07-02 | Texas Instruments Incorporated | Techniques for angle resolution in radar |
US20150181840A1 (en) | 2013-12-31 | 2015-07-02 | i4c Innovations Inc. | Ultra-Wideband Radar System for Animals |
DE102014118063A1 (en) | 2014-01-29 | 2015-07-30 | Fujitsu Ten Limited | radar device |
US20150243575A1 (en) | 2014-02-27 | 2015-08-27 | Stats Chippac, Ltd. | Semiconductor Device and Method of Forming Encapsulated Wafer Level Chip Scale Package (EWLCSP) |
US20150277569A1 (en) | 2014-03-28 | 2015-10-01 | Mark E. Sprenger | Radar-based gesture recognition |
US9172132B2 (en) | 2011-02-17 | 2015-10-27 | Globalfoundries Inc | Integrated antenna for RFIC package applications |
US9182476B2 (en) | 2009-04-06 | 2015-11-10 | Conti Temic Microelectronic Gmbh | Radar system having arrangements and methods for the decoupling of transmitting and receiving signals and for the suppression of interference radiation |
US20150325925A1 (en) | 2013-12-18 | 2015-11-12 | Telesphor Teles Kamgaing | Embedded millimeter-wave phased array module |
US9202105B1 (en) | 2012-01-13 | 2015-12-01 | Amazon Technologies, Inc. | Image analysis for user authentication |
US20150346820A1 (en) | 2014-06-03 | 2015-12-03 | Google Inc. | Radar-Based Gesture-Recognition through a Wearable Device |
US20150348821A1 (en) | 2012-12-26 | 2015-12-03 | Hitachi Chemical Company, Ltd. | Expansion method, method for manufacturing semiconductor device, and semiconductor device |
US20150364816A1 (en) | 2014-06-16 | 2015-12-17 | Texas Instruments Incorporated | Millimeter wave integrated circuit with ball grid array package including transmit and receive channels |
US20160018511A1 (en) | 2014-07-17 | 2016-01-21 | Texas Instruments Incorporated | Distributed Radar Signal Processing in a Radar System |
US20160041618A1 (en) | 2014-08-07 | 2016-02-11 | Google Inc. | Radar-Based Gesture Sensing and Data Transmission |
US20160041617A1 (en) | 2014-08-07 | 2016-02-11 | Google Inc. | Radar-Based Gesture Recognition |
US20160061947A1 (en) | 2014-08-27 | 2016-03-03 | Texas Instruments Incorporated | Fmcw doppler processing algorithm for achieving cw performance |
US20160061942A1 (en) | 2014-08-27 | 2016-03-03 | Texas Instruments Incorporated | Range resolution in fmcw radars |
US20160098089A1 (en) | 2014-10-02 | 2016-04-07 | Google Inc. | Non-Line-of-Sight Radar-Based Gesture Recognition |
US20160103213A1 (en) | 2014-10-08 | 2016-04-14 | Texas Instruments Incorporated | Three Dimensional (3D) Tracking of Objects in a Radar System |
US20160109566A1 (en) | 2014-10-21 | 2016-04-21 | Texas Instruments Incorporated | Camera Assisted Tracking of Objects in a Radar System |
US20160118353A1 (en) | 2014-10-22 | 2016-04-28 | Infineon Techologies Ag | Systems and Methods Using an RF Circuit on Isolating Material |
US20160135655A1 (en) | 2014-11-17 | 2016-05-19 | Samsung Electronics Co., Ltd. | Robot cleaner, terminal apparatus, and method of controlling the same |
US20160146931A1 (en) | 2014-11-21 | 2016-05-26 | Texas Instruments Incorporated | Techniques for high arrival angle resolution using multiple nano-radars |
US20160146933A1 (en) | 2014-11-25 | 2016-05-26 | Texas Instruments Incorporated | Controlling Radar Transmission to Enable Interference Mitigation |
US20160178730A1 (en) | 2014-12-23 | 2016-06-23 | Infineon Technologies Ag | RF System with an RFIC and Antenna System |
US20160187462A1 (en) | 2014-12-30 | 2016-06-30 | Texas Instruments Incorporated | Multiple Chirp Generation in a Radar System |
US20160191232A1 (en) | 2014-12-31 | 2016-06-30 | Texas Instruments Incorporated | Dynamic measurement of frequency synthesizer noise spurs or phase noise |
US20160223651A1 (en) | 2015-01-29 | 2016-08-04 | Nidec Elesys Corporation | Neural network-based radar system having independent multibeam antenna |
US9413079B2 (en) | 2013-03-13 | 2016-08-09 | Intel Corporation | Single-package phased array module with interleaved sub-arrays |
US20160240907A1 (en) | 2015-02-12 | 2016-08-18 | Texas Instruments Incorporated | Dielectric Waveguide Radar Signal Distribution |
US20160249133A1 (en) | 2013-10-07 | 2016-08-25 | Gn Netcom A/S | Earphone device with optical sensor |
US20160252607A1 (en) | 2015-02-27 | 2016-09-01 | Texas Instruments Incorporated | Gesture Recognition using Frequency Modulated Continuous Wave (FMCW) Radar with Low Angle Resolution |
US20160259037A1 (en) * | 2015-03-03 | 2016-09-08 | Nvidia Corporation | Radar based user interface |
US20160266233A1 (en) | 2013-06-13 | 2016-09-15 | Texas Instruments Incorporated | Kalman filter for indoor positioning |
US20160291130A1 (en) | 2015-04-06 | 2016-10-06 | Texas Instruments Incorporated | Interference Detection in a Frequency Modulated Continuous Wave (FMCW) Radar System |
US20160299215A1 (en) | 2015-04-09 | 2016-10-13 | Texas Instruments Incorporated | Circuit and method for impedance detection in millimeter wave systems |
US20160306034A1 (en) | 2014-12-23 | 2016-10-20 | Infineon Technologies Ag | RF System with an RFIC and Antenna System |
US20160320852A1 (en) | 2015-04-30 | 2016-11-03 | Google Inc. | Wide-Field Radar-Based Gesture Recognition |
US20160320853A1 (en) | 2015-04-30 | 2016-11-03 | Google Inc. | RF-Based Micro-Motion Tracking for Gesture Tracking and Recognition |
US20160327633A1 (en) | 2015-05-05 | 2016-11-10 | Texas Instruments Incorporated | Dynamic Programming of Chirps in a Frequency Modulated Continuous Wave (FMCW) Radar System |
US9495600B2 (en) | 2013-05-31 | 2016-11-15 | Samsung Sds Co., Ltd. | People detection apparatus and method and people counting apparatus and method |
US20160334502A1 (en) | 2015-05-15 | 2016-11-17 | Texas Instruments Incorporated | Low Complexity Super-Resolution Technique for Object Detection in Frequency Modulation Continuous Wave Radar |
US9504920B2 (en) * | 2011-04-25 | 2016-11-29 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game |
US20160349845A1 (en) | 2015-05-28 | 2016-12-01 | Google Inc. | Gesture Detection Haptics and Virtual Tools |
US20160378195A1 (en) * | 2015-06-26 | 2016-12-29 | Orange | Method for recognizing handwriting on a physical surface |
US20170033062A1 (en) | 2015-07-29 | 2017-02-02 | STATS ChipPAC Pte. Ltd. | Antenna In Embedded Wafer-Level Ball-Grid Array Package |
US20170045607A1 (en) | 2015-08-13 | 2017-02-16 | Texas Instruments Incorporated | Chirp frequency non-linearity mitigation in radar systems |
US20170054449A1 (en) | 2015-08-19 | 2017-02-23 | Texas Instruments Incorporated | Method and System for Compression of Radar Signals |
US20170052618A1 (en) | 2014-04-30 | 2017-02-23 | Lg Innotek Co., Ltd. | Touch device, wearable device having the same and touch recognition method |
US20170070952A1 (en) | 2015-09-08 | 2017-03-09 | Texas Instruments Incorporated | Re-sampling with reduced power consumption and complexity |
US20170074974A1 (en) | 2015-09-15 | 2017-03-16 | Texas Instruments Incorporated | Method and apparatus for fmcw radar processing |
US20170074980A1 (en) | 2014-02-24 | 2017-03-16 | Massachusetts Institute Of Technology | Object tracking via radio reflections |
US20170090015A1 (en) | 2015-09-30 | 2017-03-30 | Texas Instruments Incorporated | Multi-Chip Transceiver Testing in a Radar System |
US20170090014A1 (en) | 2015-09-30 | 2017-03-30 | Texas Instruments Incorporated | Measurement of Transceiver Performance Parameters in a Radar System |
US20170115377A1 (en) | 2015-10-23 | 2017-04-27 | Texas Instruments Incorporated | RF/mm-Wave Peak Detector with High-Dynamic Range Calibration |
US20170131395A1 (en) | 2014-06-25 | 2017-05-11 | University Of Washington | Devices, systems, and methods for detecting gestures using multiple antennas and/or reflections of signals transmitted by the detecting device |
US20170139036A1 (en) | 2015-11-12 | 2017-05-18 | Texas Instruments Incorporated | Buffer sample size control for variable chirp radar |
US20170141453A1 (en) | 2015-11-17 | 2017-05-18 | Vega Grieshaber Kg | Antenna device and method for transmitting and/or receiving a signal |
US20170170947A1 (en) | 2015-12-09 | 2017-06-15 | Samsung Electronics Co. Ltd. | Method for operating switch and electronic device supporting the same |
US20170176574A1 (en) | 2015-12-18 | 2017-06-22 | Texas Instruments Incorporated | Circuits and methods for determining chirp signal linearity and phase noise of a fmcw radar |
US20170192847A1 (en) | 2015-12-31 | 2017-07-06 | Texas Instruments Incorporated | Protecting Data Memory in a Signal Processing System |
US20170201019A1 (en) | 2016-01-13 | 2017-07-13 | Infineon Technologies Ag | System and Method for Measuring a Plurality of RF Signal Paths |
US20170212597A1 (en) | 2016-01-27 | 2017-07-27 | Wipro Limited | Method and System for Recommending One or More Gestures to Users Interacting With Computing Device |
US20170364160A1 (en) | 2016-06-17 | 2017-12-21 | Texas Instruments Incorporated | Hidden markov model-based gesture recognition with fmcw radar |
US9886095B2 (en) | 2015-09-24 | 2018-02-06 | Stmicroelectronics Sa | Device and method for recognizing hand gestures using time-of-flight sensing |
US20180046255A1 (en) | 2016-08-09 | 2018-02-15 | Google Inc. | Radar-based gestural interface |
US20180071473A1 (en) | 2015-03-16 | 2018-03-15 | Mario Agostino FERRARIO | Device for tracheal intubation |
US9935065B1 (en) | 2016-12-21 | 2018-04-03 | Infineon Technologies Ag | Radio frequency device packages and methods of formation thereof |
US20180101239A1 (en) | 2016-10-09 | 2018-04-12 | Alibaba Group Holding Limited | Three-dimensional graphical user interface for informational input in virtual reality environment |
US20180157330A1 (en) * | 2016-12-05 | 2018-06-07 | Google Inc. | Concurrent Detection of Absolute Distance and Relative Movement for Sensing Action Gestures |
US10055660B1 (en) * | 2017-09-19 | 2018-08-21 | King Fahd University Of Petroleum And Minerals | Arabic handwriting recognition utilizing bag of features representation |
US10386481B1 (en) * | 2018-07-19 | 2019-08-20 | King Abdullah University Of Science And Technology | Angle-of-arrival-based gesture recognition system and method |
US20190311227A1 (en) * | 2018-04-06 | 2019-10-10 | Dropbox, Inc. | Generating searchable text for documents portrayed in a repository of digital images utilizing orientation and text prediction neural networks |
US20200026361A1 (en) * | 2018-07-19 | 2020-01-23 | Infineon Technologies Ag | Gesture Detection System and Method Using A Radar Sensors |
US20200082196A1 (en) * | 2018-09-10 | 2020-03-12 | Sony Corporation | License plate number recognition based on three dimensional beam search |
US10591998B2 (en) * | 2012-10-03 | 2020-03-17 | Rakuten, Inc. | User interface device, user interface method, program, and computer-readable information storage medium |
US20200150771A1 (en) * | 2018-11-13 | 2020-05-14 | Google Llc | Radar-Image Shaper for Radar-Based Applications |
US20200201443A1 (en) * | 2018-12-19 | 2020-06-25 | Arizona Board Of Regents On Behalf Of Arizona State University | Three-dimensional in-the-air finger motion based user login framework for gesture interface |
US20200234030A1 (en) * | 2019-01-22 | 2020-07-23 | Infineon Technologies Ag | User Authentication Using mm-Wave Sensor for Automotive Radar Systems |
US20200250413A1 (en) * | 2019-02-06 | 2020-08-06 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems and methods for a multifactor user identification and authentication framework for in-air-handwriting with hand geometry and deep hashing |
US10739864B2 (en) * | 2018-12-31 | 2020-08-11 | International Business Machines Corporation | Air writing to speech system using gesture and wrist angle orientation for synthesized speech modulation |
US20200293613A1 (en) * | 2019-03-12 | 2020-09-17 | Wipro Limited | Method and system for identifying and rendering hand written content onto digital display interface |
US20210033693A1 (en) * | 2018-03-19 | 2021-02-04 | King Abdullah University Of Science And Technology | Ultrasound based air-writing system and method |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7084859B1 (en) * | 1992-09-18 | 2006-08-01 | Pryor Timothy R | Programmable tactile touch screen displays and man-machine interfaces for improved vehicle instrumentation and telematics |
WO2008001092A2 (en) * | 2006-06-28 | 2008-01-03 | Cambridge Consultants Limited | Radar for through wall detection |
US8660303B2 (en) * | 2009-05-01 | 2014-02-25 | Microsoft Corporation | Detection of body and props |
GB0916707D0 (en) * | 2009-09-23 | 2009-11-04 | Elliptic Laboratories As | Acoustic motion determination |
WO2014106823A2 (en) * | 2013-01-03 | 2014-07-10 | Meta Company | Extramissive spatial imaging digital eye glass apparatuses, methods and systems for virtual or augmediated vision, manipulation, creation, or interaction with objects, materials, or other entities |
CN106527672A (en) * | 2015-09-09 | 2017-03-22 | 广州杰赛科技股份有限公司 | Non-contact type character input method |
US10157309B2 (en) * | 2016-01-14 | 2018-12-18 | Nvidia Corporation | Online detection and classification of dynamic gestures with recurrent convolutional neural networks |
CN105786185B (en) * | 2016-03-12 | 2019-01-18 | 浙江大学 | Non-contact gesture identifying system and method based on continuous wave micro-doppler radar |
JP6468260B2 (en) * | 2016-08-04 | 2019-02-13 | トヨタ自動車株式会社 | Wireless communication apparatus and wireless communication method |
US10754476B2 (en) * | 2016-08-25 | 2020-08-25 | Tactual Labs Co. | Systems and methods for ultrasonic, millimeter wave and hybrid sensing |
CN107024685A (en) * | 2017-04-10 | 2017-08-08 | 北京航空航天大学 | A kind of gesture identification method based on apart from velocity characteristic |
US10772511B2 (en) * | 2018-05-16 | 2020-09-15 | Qualcomm Incorporated | Motion sensor using cross coupling |
CN109188414A (en) * | 2018-09-12 | 2019-01-11 | 北京工业大学 | A kind of gesture motion detection method based on millimetre-wave radar |
WO2020176105A1 (en) * | 2019-02-28 | 2020-09-03 | Google Llc | Smart-device-based radar system detecting user gestures in the presence of saturation |
- 2019
  - 2019-03-21 US US16/360,284 patent/US11126885B2/en active Active
- 2020
  - 2020-03-19 EP EP20164180.0A patent/EP3712809A1/en active Pending
  - 2020-03-23 CN CN202010206608.5A patent/CN111722706B/en active Active
- 2021
  - 2021-08-03 US US17/392,829 patent/US11686815B2/en active Active
Patent Citations (169)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4241347A (en) | 1978-06-28 | 1980-12-23 | International Telephone And Telegraph Corporation | PRC/FM CW Radar system |
GB2247799A (en) | 1990-09-04 | 1992-03-11 | Gec Ferranti Defence Syst | Radar based navigation aid |
US6147572A (en) | 1998-07-15 | 2000-11-14 | Lucent Technologies, Inc. | Filter including a microstrip antenna and a frequency selective surface |
US6414631B1 (en) | 1999-06-10 | 2002-07-02 | Nec Corporation | Time sharing type multi-beam radar apparatus having alternately arranged transmitting antennas and receiving antennas |
JP2001174539A (en) | 1999-10-06 | 2001-06-29 | Nec Corp | Multi-beam ladar device |
US6636174B2 (en) | 2000-06-06 | 2003-10-21 | Altratek Inc. | System and method for detection and tracking of targets |
US20030179127A1 (en) | 2000-07-28 | 2003-09-25 | Hans-Theo Wienand | People counter |
US7048973B2 (en) | 2001-08-08 | 2006-05-23 | Mitsubishi Heavy Industries, Ltd. | Metal film vapor phase deposition method and vapor phase deposition apparatus |
US20040238857A1 (en) | 2001-08-28 | 2004-12-02 | Tessera, Inc. | High frequency chip packages with connecting elements |
CN1463161A (en) | 2002-06-01 | 2003-12-24 | 三星电子株式会社 | Device and method of correcting image movement |
US7171052B2 (en) | 2002-06-01 | 2007-01-30 | Samsung Electronics Co., Ltd. | Apparatus and method for correcting motion of image |
JP2008029025A (en) | 2002-06-27 | 2008-02-07 | Harris Corp | Highly efficient resonance line |
JP2004198312A (en) | 2002-12-19 | 2004-07-15 | Mitsubishi Electric Corp | Radar equipment |
CN1716695A (en) | 2004-06-30 | 2006-01-04 | 国际商业机器公司 | Apparatus and methods for constructing and packaging printed antenna devices |
US20060001572A1 (en) | 2004-06-30 | 2006-01-05 | Gaucher Brian P | Apparatus and method for constructing and packaging printed antenna devices |
US7317417B2 (en) | 2004-07-12 | 2008-01-08 | Orhan Arikan | Methods for detection and tracking of targets |
US7057564B2 (en) | 2004-08-31 | 2006-06-06 | Freescale Semiconductor, Inc. | Multilayer cavity slot antenna |
US20060049995A1 (en) | 2004-09-01 | 2006-03-09 | Toshikazu Imaoka | Integrated antenna type circuit apparatus |
US20060067456A1 (en) | 2004-09-27 | 2006-03-30 | Point Grey Research Inc. | People counting systems and methods |
JP2006234513A (en) | 2005-02-23 | 2006-09-07 | Toyota Central Res & Dev Lab Inc | Obstacle detection device |
US20120195161A1 (en) | 2005-05-03 | 2012-08-02 | Sonosite, Inc. | Systems and methods for ultrasound beam forming data control |
US7596241B2 (en) | 2005-06-30 | 2009-09-29 | General Electric Company | System and method for automatic person counting and detection of specific events |
US8228382B2 (en) | 2005-11-05 | 2012-07-24 | Ram Pattikonda | System and method for counting people |
WO2007060069A1 (en) | 2005-11-28 | 2007-05-31 | Nokia Siemens Networks Gmbh & Co. Kg | Method and arrangement for calibrating transmit paths of an antenna array |
US20070210959A1 (en) | 2006-03-07 | 2007-09-13 | Massachusetts Institute Of Technology | Multi-beam tile array module for phased array systems |
US20080106460A1 (en) | 2006-06-01 | 2008-05-08 | James Lynn Kurtz | Radar microsensor for detection, tracking, and classification |
US7873326B2 (en) | 2006-07-11 | 2011-01-18 | Mojix, Inc. | RFID beam forming system |
CN101490578A (en) | 2006-07-13 | 2009-07-22 | 罗伯特·博世有限公司 | FMCW-radarsensor |
US20090315761A1 (en) | 2006-07-13 | 2009-12-24 | Thomas Walter | FMCW Radar Sensor |
US7889147B2 (en) | 2007-02-23 | 2011-02-15 | Northrop Grumman Systems Corporation | Modular active phased array |
US20080238759A1 (en) | 2007-03-30 | 2008-10-02 | Honeywell International Inc. | Integrated distance measuring equipment and transponder system and method |
US20080291115A1 (en) | 2007-05-22 | 2008-11-27 | Sibeam, Inc. | Surface mountable integrated circuit packaging scheme |
US20080308917A1 (en) | 2007-06-13 | 2008-12-18 | Infineon Technologies Ag | Embedded chip package |
US20090073026A1 (en) | 2007-09-18 | 2009-03-19 | Mitsubishi Electric Corporation | Radar apparatus |
JP2009069124A (en) | 2007-09-18 | 2009-04-02 | Mitsubishi Electric Corp | Radar system |
US7692574B2 (en) | 2007-09-18 | 2010-04-06 | Mitsubishi Electric Corporation | Radar apparatus |
US20090085815A1 (en) | 2007-09-27 | 2009-04-02 | Jakab Andrew J | Tightly-coupled pcb gnss circuit and manufacturing method |
KR20090063166A (en) | 2007-12-12 | 2009-06-17 | 브로드콤 코포레이션 | Method and system for phased array antenna embedded in integrated circuit package |
US8497805B2 (en) | 2007-12-12 | 2013-07-30 | Broadcom Corporation | IC package with embedded phased array antenna |
US20090153428A1 (en) | 2007-12-12 | 2009-06-18 | Ahmadreza Rofougaran | Method and system for a phased array antenna embedded in an integrated circuit package |
US8659369B2 (en) | 2007-12-13 | 2014-02-25 | Broadcom Corporation | IC package with embedded filters |
JP2008089614A (en) | 2007-12-27 | 2008-04-17 | Hitachi Ltd | Radar sensor |
JP2011529181A (en) | 2008-07-24 | 2011-12-01 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | Distance measurement |
DE102008054570A1 (en) | 2008-12-12 | 2010-06-17 | Robert Bosch Gmbh | FMCW radar sensor for motor vehicles |
US8847814B2 (en) | 2008-12-12 | 2014-09-30 | Robert Bosch Gmbh | FMCW radar sensor for motor vehicles |
US8731502B2 (en) | 2009-02-13 | 2014-05-20 | Freescale Semiconductor, Inc. | Integrated circuit comprising frequency generation circuitry for controlling a frequency source |
US20100207805A1 (en) | 2009-02-19 | 2010-08-19 | Agd Systems Limited | Obtaining an indication of a number of moving objects passing speed detection apparatus |
US9182476B2 (en) | 2009-04-06 | 2015-11-10 | Conti Temic Microelectronic Gmbh | Radar system having arrangements and methods for the decoupling of transmitting and receiving signals and for the suppression of interference radiation |
US20160269815A1 (en) | 2009-05-18 | 2016-09-15 | Well Being Digital Limited | Earplug-Type Earphone |
US20120116231A1 (en) | 2009-05-18 | 2012-05-10 | Perception Digital Limited | Earplug-type earphone |
CN101585361A (en) | 2009-05-25 | 2009-11-25 | 郭文艺 | Anti-collision and lane-departure-prevention control device for automobile |
US20100313150A1 (en) * | 2009-06-03 | 2010-12-09 | Microsoft Corporation | Separable displays and composable surfaces |
US20120087230A1 (en) | 2009-06-17 | 2012-04-12 | Telefonaktiebolaget Lm Ericsson (Publ) | Method For Antenna Calibration In A Wideband Communication System |
US20120206339A1 (en) | 2009-07-07 | 2012-08-16 | Elliptic Laboratories As | Control using movements |
US20120313900A1 (en) * | 2009-10-07 | 2012-12-13 | Elliptic Laboratories As | User interfaces |
US20120265486A1 (en) | 2009-12-23 | 2012-10-18 | Endress + Hauser Gmbh + Co. Kg | Method for ascertaining and monitoring fill level of a medium in a container with a travel time measuring method |
US20110181509A1 (en) | 2010-01-26 | 2011-07-28 | Nokia Corporation | Gesture Control |
US20130027240A1 (en) | 2010-03-05 | 2013-01-31 | Sazzadur Chowdhury | Radar system and method of manufacturing same |
US8976061B2 (en) | 2010-03-05 | 2015-03-10 | Sazzadur Chowdhury | Radar system and method of manufacturing same |
JP2013521508A (en) | 2010-03-05 | 2013-06-10 | ユニバーシティ・オブ・ウィンザー | Radar system and manufacturing method thereof |
US20110254765A1 (en) * | 2010-04-18 | 2011-10-20 | Primesense Ltd. | Remote text input using handwriting |
US20110299433A1 (en) | 2010-06-03 | 2011-12-08 | Broadcom Corporation | Rf front-end module |
US20120092284A1 (en) * | 2010-09-30 | 2012-04-19 | Broadcom Corporation | Portable computing device including a three-dimensional touch screen |
JP2012112861A (en) | 2010-11-26 | 2012-06-14 | Fujitsu Ltd | Fm-cw radar device and pairing method |
US20120268314A1 (en) | 2011-02-11 | 2012-10-25 | Honda Elesys Co., Ltd. | Multibeam radar apparatus for vehicle, multibeam radar method, and multibeam radar program |
US9172132B2 (en) | 2011-02-17 | 2015-10-27 | Globalfoundries Inc | Integrated antenna for RFIC package applications |
US9504920B2 (en) * | 2011-04-25 | 2016-11-29 | Aquifi, Inc. | Method and system to create three-dimensional mapping in a two-dimensional game |
US20120280900A1 (en) | 2011-05-06 | 2012-11-08 | Nokia Corporation | Gesture recognition using plural sensors |
DE102011100907A1 (en) | 2011-05-09 | 2012-01-12 | Daimler Ag | Device for determining road condition of roadway located in front of vehicle, comprises imaging sensor for detecting road surface, and evaluation unit for evaluation of image data |
DE102011075725A1 (en) | 2011-05-12 | 2012-11-15 | Robert Bosch Gmbh | Method for recognizing gestures |
US8860532B2 (en) | 2011-05-20 | 2014-10-14 | University Of Central Florida Research Foundation, Inc. | Integrated cavity filter/antenna system |
US20120326995A1 (en) * | 2011-06-24 | 2012-12-27 | Ricoh Company, Ltd. | Virtual touch panel system and interactive mode auto-switching method |
WO2013009473A2 (en) | 2011-07-12 | 2013-01-17 | Sensormatic Electronics, LLC | Method and system for people counting using passive infrared detectors |
US20130106673A1 (en) | 2011-10-20 | 2013-05-02 | Waveconnex, Inc. | Low-profile wireless connectors |
KR20140082815A (en) | 2011-10-20 | 2014-07-02 | 웨이브코넥스, 아이엔씨. | Low-profile wireless connectors |
US20140324888A1 (en) | 2011-12-09 | 2014-10-30 | Nokia Corporation | Method and Apparatus for Identifying a Gesture Based Upon Fusion of Multiple Sensor Signals |
US9202105B1 (en) | 2012-01-13 | 2015-12-01 | Amazon Technologies, Inc. | Image analysis for user authentication |
CN102788969A (en) | 2012-07-04 | 2012-11-21 | 中国人民解放军海军航空工程学院 | Sea surface micromotion target detection and feature extraction method based on short-time fractional Fourier transform |
US20140028542A1 (en) | 2012-07-30 | 2014-01-30 | Microsoft Corporation | Interaction with Devices Based on User State |
JP2014055957A (en) | 2012-09-13 | 2014-03-27 | Toyota Motor Engineering & Manufacturing North America Inc | 3d short range detection with phased array radar |
US20140070994A1 (en) | 2012-09-13 | 2014-03-13 | Toyota Motor Engineering & Manufacturing North America, Inc. | 3d short range detection with phased array radar |
US10591998B2 (en) * | 2012-10-03 | 2020-03-17 | Rakuten, Inc. | User interface device, user interface method, program, and computer-readable information storage medium |
US20140145883A1 (en) | 2012-11-26 | 2014-05-29 | International Business Machines Corporation | Millimeter-wave radio frequency integrated circuit packages with integrated antennas |
CN102967854A (en) | 2012-12-07 | 2013-03-13 | 中国人民解放军海军航空工程学院 | Multi-fractal detection method of targets in FRFT (Fractional Fourier Transformation) domain sea clutter |
US20150348821A1 (en) | 2012-12-26 | 2015-12-03 | Hitachi Chemical Company, Ltd. | Expansion method, method for manufacturing semiconductor device, and semiconductor device |
US8836596B2 (en) | 2013-01-15 | 2014-09-16 | Cubic Corporation | Filter antenna |
US9413079B2 (en) | 2013-03-13 | 2016-08-09 | Intel Corporation | Single-package phased array module with interleaved sub-arrays |
US9495600B2 (en) | 2013-05-31 | 2016-11-15 | Samsung Sds Co., Ltd. | People detection apparatus and method and people counting apparatus and method |
US20160266233A1 (en) | 2013-06-13 | 2016-09-15 | Texas Instruments Incorporated | Kalman filter for indoor positioning |
CN103529444A (en) | 2013-09-27 | 2014-01-22 | 安徽师范大学 | A vehicle-mounted millimeter-wave radar moving target recognizer and recognition method |
US20160249133A1 (en) | 2013-10-07 | 2016-08-25 | Gn Netcom A/S | Earphone device with optical sensor |
US20150185316A1 (en) | 2013-10-25 | 2015-07-02 | Texas Instruments Incorporated | Techniques for angle resolution in radar |
US20150325925A1 (en) | 2013-12-18 | 2015-11-12 | Telesphor Teles Kamgaing | Embedded millimeter-wave phased array module |
US20150181840A1 (en) | 2013-12-31 | 2015-07-02 | i4c Innovations Inc. | Ultra-Wideband Radar System for Animals |
US20150212198A1 (en) | 2014-01-29 | 2015-07-30 | Fujitsu Ten Limited | Radar apparatus |
DE102014118063A1 (en) | 2014-01-29 | 2015-07-30 | Fujitsu Ten Limited | radar device |
US20170074980A1 (en) | 2014-02-24 | 2017-03-16 | Massachusetts Institute Of Technology | Object tracking via radio reflections |
US20150243575A1 (en) | 2014-02-27 | 2015-08-27 | Stats Chippac, Ltd. | Semiconductor Device and Method of Forming Encapsulated Wafer Level Chip Scale Package (EWLCSP) |
CN203950036U (en) | 2014-03-12 | 2014-11-19 | 肖令军 | A kind of have people's depopulated helicopter anti-collision system based on millimetre-wave radar range finding |
US20150277569A1 (en) | 2014-03-28 | 2015-10-01 | Mark E. Sprenger | Radar-based gesture recognition |
US20170052618A1 (en) | 2014-04-30 | 2017-02-23 | Lg Innotek Co., Ltd. | Touch device, wearable device having the same and touch recognition method |
US20150346820A1 (en) | 2014-06-03 | 2015-12-03 | Google Inc. | Radar-Based Gesture-Recognition through a Wearable Device |
US20150364816A1 (en) | 2014-06-16 | 2015-12-17 | Texas Instruments Incorporated | Millimeter wave integrated circuit with ball grid array package including transmit and receive channels |
US20170131395A1 (en) | 2014-06-25 | 2017-05-11 | University Of Washington | Devices, systems, and methods for detecting gestures using multiple antennas and/or reflections of signals transmitted by the detecting device |
US20160018511A1 (en) | 2014-07-17 | 2016-01-21 | Texas Instruments Incorporated | Distributed Radar Signal Processing in a Radar System |
US20160041618A1 (en) | 2014-08-07 | 2016-02-11 | Google Inc. | Radar-Based Gesture Sensing and Data Transmission |
US20160041617A1 (en) | 2014-08-07 | 2016-02-11 | Google Inc. | Radar-Based Gesture Recognition |
WO2016033361A1 (en) | 2014-08-27 | 2016-03-03 | Texas Instruments Incorporated | Improving the range resolution in fmcw radars |
US20160061942A1 (en) | 2014-08-27 | 2016-03-03 | Texas Instruments Incorporated | Range resolution in fmcw radars |
US20160061947A1 (en) | 2014-08-27 | 2016-03-03 | Texas Instruments Incorporated | Fmcw doppler processing algorithm for achieving cw performance |
US20160098089A1 (en) | 2014-10-02 | 2016-04-07 | Google Inc. | Non-Line-of-Sight Radar-Based Gesture Recognition |
US20160103213A1 (en) | 2014-10-08 | 2016-04-14 | Texas Instruments Incorporated | Three Dimensional (3D) Tracking of Objects in a Radar System |
US20160109566A1 (en) | 2014-10-21 | 2016-04-21 | Texas Instruments Incorporated | Camera Assisted Tracking of Objects in a Radar System |
US20160118353A1 (en) | 2014-10-22 | 2016-04-28 | Infineon Techologies Ag | Systems and Methods Using an RF Circuit on Isolating Material |
US20160135655A1 (en) | 2014-11-17 | 2016-05-19 | Samsung Electronics Co., Ltd. | Robot cleaner, terminal apparatus, and method of controlling the same |
US20160146931A1 (en) | 2014-11-21 | 2016-05-26 | Texas Instruments Incorporated | Techniques for high arrival angle resolution using multiple nano-radars |
US20160146933A1 (en) | 2014-11-25 | 2016-05-26 | Texas Instruments Incorporated | Controlling Radar Transmission to Enable Interference Mitigation |
US20160178730A1 (en) | 2014-12-23 | 2016-06-23 | Infineon Technologies Ag | RF System with an RFIC and Antenna System |
US20160306034A1 (en) | 2014-12-23 | 2016-10-20 | Infineon Technologies Ag | RF System with an RFIC and Antenna System |
US20160187462A1 (en) | 2014-12-30 | 2016-06-30 | Texas Instruments Incorporated | Multiple Chirp Generation in a Radar System |
US20160191232A1 (en) | 2014-12-31 | 2016-06-30 | Texas Instruments Incorporated | Dynamic measurement of frequency synthesizer noise spurs or phase noise |
US20160223651A1 (en) | 2015-01-29 | 2016-08-04 | Nidec Elesys Corporation | Neural network-based radar system having independent multibeam antenna |
US20160240907A1 (en) | 2015-02-12 | 2016-08-18 | Texas Instruments Incorporated | Dielectric Waveguide Radar Signal Distribution |
US20160252607A1 (en) | 2015-02-27 | 2016-09-01 | Texas Instruments Incorporated | Gesture Recognition using Frequency Modulated Continuous Wave (FMCW) Radar with Low Angle Resolution |
US20160259037A1 (en) * | 2015-03-03 | 2016-09-08 | Nvidia Corporation | Radar based user interface |
US20170060254A1 (en) | 2015-03-03 | 2017-03-02 | Nvidia Corporation | Multi-sensor based user interface |
US20180071473A1 (en) | 2015-03-16 | 2018-03-15 | Mario Agostino FERRARIO | Device for tracheal intubation |
US20160291130A1 (en) | 2015-04-06 | 2016-10-06 | Texas Instruments Incorporated | Interference Detection in a Frequency Modulated Continuous Wave (FMCW) Radar System |
US20160299215A1 (en) | 2015-04-09 | 2016-10-13 | Texas Instruments Incorporated | Circuit and method for impedance detection in millimeter wave systems |
US20160320853A1 (en) | 2015-04-30 | 2016-11-03 | Google Inc. | RF-Based Micro-Motion Tracking for Gesture Tracking and Recognition |
US20160320852A1 (en) | 2015-04-30 | 2016-11-03 | Google Inc. | Wide-Field Radar-Based Gesture Recognition |
US20160327633A1 (en) | 2015-05-05 | 2016-11-10 | Texas Instruments Incorporated | Dynamic Programming of Chirps in a Frequency Modulated Continuous Wave (FMCW) Radar System |
US20160334502A1 (en) | 2015-05-15 | 2016-11-17 | Texas Instruments Incorporated | Low Complexity Super-Resolution Technique for Object Detection in Frequency Modulation Continuous Wave Radar |
US20160349845A1 (en) | 2015-05-28 | 2016-12-01 | Google Inc. | Gesture Detection Haptics and Virtual Tools |
US20160378195A1 (en) * | 2015-06-26 | 2016-12-29 | Orange | Method for recognizing handwriting on a physical surface |
US20170033062A1 (en) | 2015-07-29 | 2017-02-02 | STATS ChipPAC Pte. Ltd. | Antenna In Embedded Wafer-Level Ball-Grid Array Package |
US20170045607A1 (en) | 2015-08-13 | 2017-02-16 | Texas Instruments Incorporated | Chirp frequency non-linearity mitigation in radar systems |
US20170054449A1 (en) | 2015-08-19 | 2017-02-23 | Texas Instruments Incorporated | Method and System for Compression of Radar Signals |
US20170070952A1 (en) | 2015-09-08 | 2017-03-09 | Texas Instruments Incorporated | Re-sampling with reduced power consumption and complexity |
US20170074974A1 (en) | 2015-09-15 | 2017-03-16 | Texas Instruments Incorporated | Method and apparatus for fmcw radar processing |
US9886095B2 (en) | 2015-09-24 | 2018-02-06 | Stmicroelectronics Sa | Device and method for recognizing hand gestures using time-of-flight sensing |
US20170090014A1 (en) | 2015-09-30 | 2017-03-30 | Texas Instruments Incorporated | Measurement of Transceiver Performance Parameters in a Radar System |
US20170090015A1 (en) | 2015-09-30 | 2017-03-30 | Texas Instruments Incorporated | Multi-Chip Transceiver Testing in a Radar System |
US20170115377A1 (en) | 2015-10-23 | 2017-04-27 | Texas Instruments Incorporated | RF/mm-Wave Peak Detector with High-Dynamic Range Calibration |
US20170139036A1 (en) | 2015-11-12 | 2017-05-18 | Texas Instruments Incorporated | Buffer sample size control for variable chirp radar |
US20170141453A1 (en) | 2015-11-17 | 2017-05-18 | Vega Grieshaber Kg | Antenna device and method for transmitting and/or receiving a signal |
US20170170947A1 (en) | 2015-12-09 | 2017-06-15 | Samsung Electronics Co. Ltd. | Method for operating switch and electronic device supporting the same |
US20170176574A1 (en) | 2015-12-18 | 2017-06-22 | Texas Instruments Incorporated | Circuits and methods for determining chirp signal linearity and phase noise of a fmcw radar |
US20170192847A1 (en) | 2015-12-31 | 2017-07-06 | Texas Instruments Incorporated | Protecting Data Memory in a Signal Processing System |
US20170201019A1 (en) | 2016-01-13 | 2017-07-13 | Infineon Technologies Ag | System and Method for Measuring a Plurality of RF Signal Paths |
US20170212597A1 (en) | 2016-01-27 | 2017-07-27 | Wipro Limited | Method and System for Recommending One or More Gestures to Users Interacting With Computing Device |
US20170364160A1 (en) | 2016-06-17 | 2017-12-21 | Texas Instruments Incorporated | Hidden markov model-based gesture recognition with fmcw radar |
US20180046255A1 (en) | 2016-08-09 | 2018-02-15 | Google Inc. | Radar-based gestural interface |
US20180101239A1 (en) | 2016-10-09 | 2018-04-12 | Alibaba Group Holding Limited | Three-dimensional graphical user interface for informational input in virtual reality environment |
US20180157330A1 (en) * | 2016-12-05 | 2018-06-07 | Google Inc. | Concurrent Detection of Absolute Distance and Relative Movement for Sensing Action Gestures |
US9935065B1 (en) | 2016-12-21 | 2018-04-03 | Infineon Technologies Ag | Radio frequency device packages and methods of formation thereof |
US10055660B1 (en) * | 2017-09-19 | 2018-08-21 | King Fahd University Of Petroleum And Minerals | Arabic handwriting recognition utilizing bag of features representation |
US20210033693A1 (en) * | 2018-03-19 | 2021-02-04 | King Abdullah University Of Science And Technology | Ultrasound based air-writing system and method |
US20190311227A1 (en) * | 2018-04-06 | 2019-10-10 | Dropbox, Inc. | Generating searchable text for documents portrayed in a repository of digital images utilizing orientation and text prediction neural networks |
US20200026361A1 (en) * | 2018-07-19 | 2020-01-23 | Infineon Technologies Ag | Gesture Detection System and Method Using A Radar Sensors |
US10386481B1 (en) * | 2018-07-19 | 2019-08-20 | King Abdullah University Of Science And Technology | Angle-of-arrival-based gesture recognition system and method |
US20200082196A1 (en) * | 2018-09-10 | 2020-03-12 | Sony Corporation | License plate number recognition based on three dimensional beam search |
US20200150771A1 (en) * | 2018-11-13 | 2020-05-14 | Google Llc | Radar-Image Shaper for Radar-Based Applications |
US20200201443A1 (en) * | 2018-12-19 | 2020-06-25 | Arizona Board Of Regents On Behalf Of Arizona State University | Three-dimensional in-the-air finger motion based user login framework for gesture interface |
US10877568B2 (en) * | 2018-12-19 | 2020-12-29 | Arizona Board Of Regents On Behalf Of Arizona State University | Three-dimensional in-the-air finger motion based user login framework for gesture interface |
US10739864B2 (en) * | 2018-12-31 | 2020-08-11 | International Business Machines Corporation | Air writing to speech system using gesture and wrist angle orientation for synthesized speech modulation |
US20200234030A1 (en) * | 2019-01-22 | 2020-07-23 | Infineon Technologies Ag | User Authentication Using mm-Wave Sensor for Automotive Radar Systems |
US20200250413A1 (en) * | 2019-02-06 | 2020-08-06 | Arizona Board Of Regents On Behalf Of Arizona State University | Systems and methods for a multifactor user identification and authentication framework for in-air-handwriting with hand geometry and deep hashing |
US20200293613A1 (en) * | 2019-03-12 | 2020-09-17 | Wipro Limited | Method and system for identifying and rendering hand written content onto digital display interface |
Non-Patent Citations (49)
Title |
---|
"BT24MTR11 Using BGT24MTR11 in Low Power Applications 24 GHz Rader," Application Note AN341, Revision: Rev 1.0, Infineon Technologies AG, Munich, Germany, Dec. 2, 2013, 25 pages. |
A CNN Based Framework for Unistroke Numeral Recognition in Air-Writing (Year: 2018). * |
Agarwal, C. et al., "Segmentation and Recognition of Text Written in 3D using Leap Motion Interface", 2015 3rd IAPR Asian Conferecne on Pattern Recognition, Nov. 1, 2015, 5 pages. |
Amma, C. et al., "Airwriting: Hands-free Mobile Text Input by Spotting and Continuous Recognition of 3d-Space Handwriting with Inertial Sensors", 2012 16th International Symposium on Wearable Computers, Jun. 18-22, 2012, 8 pages. |
Arsalan, Muhammad et al., "Character Recognition in Air-Writing Based on Network of Radars for Human-Machine Interface", IEEE Sensors Journal, vol. 19, No. 19, Oct. 1, 2019, 10 pages. |
Chen, Xiaolong et al., "Detection and Extraction of Marine Target with Micromotion via Short-Time Fractional Fourier Transform in Sparse Domain," IEEE International Conference on Signal Processing, Communications and Computing, ICSPCC, Aug. 5-8, 2016, 5 pages. |
Chuang, Cheng-Ta et al., "Applying the Kalman Filter to the Infrared-Based Touchless Positioning System with Dynamic Adjustment of Measurement Noise Features", International Microsystems, Packaging, Assembly and Circuits Technology Conference (IMPACT), IEEE Catalog No. CFP1459B-ART, ISBN 978-1-4799-7727-7, Oct. 22, 2014, 4 pages. |
Chuanhua, Du, "FMCW Radar Range-Doppler Processing and Beam Formation Technology," Chinese Doctoral Dissertations & Master's Theses Full Text Database (Masters)—Information Science and Technology Series, China National Knowledge Infrastructure, ISSN 1674-0246, CN 11-9144/G, Dec. 16, 2004-Mar. 2015, 14 pages. |
Deacon, Peter et al., "Frequency Modulated Continuous Wave (FMCW) Radar," Design Team 6 Technical Lecture, Nov. 9, 2011, 27 pages. |
Dham, Vivek "Programming Chirp Parameters in TI Radar Devices," Application Report SWRA553, Texas Instruments, May 2017, 15 pages. |
Diederichs, Kaitlyn et al., "Wireless Biometric Individual Identification Utilizing Millimeter Waves", IEEE Sensors Letters, vol. 1, No. 1, IEEE Sensors Council 3500104, Feb. 2017, 4 pages. |
Dooring Alert Systems, "Riders Matter," http://dooringalertsystems.com, printed Oct. 4, 2017, 16 pages. |
Filippelli, Mario et al., "Respiratory dynamics during laughter," J Appl Physiol, (90), 1441-1446, Apr. 2001, http://jap.physiology.org/content/jap/90/4/1441.full.pdf. |
Fingertip Detection and Tracking for Recognition of Air-Writing in Videos (Year: 2018). * |
Finger-Writing-In-The-Air System Using Kinect Sensor (Year: 2021). * |
Fox, Ben, "The Simple Technique That Could Save Cyclists' Lives," https://www.outsideonline.com/2115116/simple-technique-could-save-cyclists-lives, Sep. 19, 2016, 6 pages. |
Gu, Changzhan et al., "Assessment of Human Respiration Patterns via Noncontact Sensing Using Doppler Multi-Radar System", Sensors Mar. 2015, 15(3), 6383-6398, doi: 10.3390/s150306383, 17 pages. |
Guercan, Yalin "Super-resolution Algorithms for Joint Range-Azimuth-Doppler Estimation in Automotive Radars," Technische Universitet Delft, TUDelft University of Technology Challenge the Future, Jan. 25, 2017, 72 pages. |
Hazra, Souvik et al., "Robust Gesture Recognition Using Millimeter-Wave Radar System", IEEE Sensors Letters, vol. 2, No. 4, Dec. 2018, 4 pages. |
Ikram, M.Z. et al., "High-Accuracy Distance Measurement Using Millimeter-Wave Radar", Texas Instruments Incorporated, Nov. 29, 2017, 5 pages. |
Inac, Ozgur et al., "A Phased Array RFIC with Built-In Self-Test Capabilities," IEEE Transactions on Microwave Theory and Techniques, vol. 60, No. 1, Jan. 2012, 10 pages. |
Killedar, Abdulraheem "XWR1xxx Power Management Optimizations—Low Cost LC Filter Solution," Application Report SWRA577, Texas Instruments, Oct. 2017, 19 pages. |
Kizhakkel, V., "Pulsed Radar Target Recognition Based on Micro-Doppler Signatures Using Wavelet Analysis", A Thesis, Graduate Program in Electrical and Computer Engineering, Ohio State University, Jan. 2013-May 2013, 118 pages. |
Kuehnke, Lutz, "Phased Array Calibration Procedures Based on Measured Element Patterns," 2001 Eleventh International Conference on Antennas and Propagation, IEEE Conf., Publ. No. 480, Apr. 17-20, 2001, 4 pages. |
Leem, Seong Kyu et al., "Detecting Mid-Air Gestures for Digit Writing with Radio Sensors and a CNN", IEEE Transactions on Instrumentation and Measurement, vol. 69, No. 4, Apr. 2020, 16 pages. |
Lim, Soo-Chul et al., "Expansion of Smartwatch Touch Interface from Touchscreen to Around Device Interface Using Infrared Line Image Sensors," Sensors 2015, ISSN 1424-8220, vol. 15, 16642-16653, doi:10.3390/s150716642, www.mdpi.com/journal/sensors, Jul. 15, 2015, 12 pages. |
Lin, Jau-Jr et al., "Design of an FMCW radar baseband signal processing system for automotive application," SpringerPlus a SpringerOpen Journal, (2016) 5:42, http://creativecommons.org/licenses/by/4.0/, DOI 10.1186/s40064-015-1583-5; Jan. 2016, 16 pages. |
Yana, Buntueng; Onoye, Takao, "Fusion Networks for Air-Writing Recognition," Springer, Berlin, Heidelberg, vol. 10705, Chap. 13, ISBN: 978-3-642-17318-9, Jan. 13, 2018, pp. 142-152, XP047460334, DOI: 10.1007/978-3-319-73600-6_13. |
Microwave Journal Frequency Matters, "Single-Chip 24 GHz Radar Front End," Infineon Technologies AG, www.microwavejournal.com/articles/print/21553-single-chip-24-ghz-radar-front-end, Feb. 13, 2014, 2 pages. |
Moazen, D. et al., "AirDraw: Leveraging Smart Watch Motion Sensors for Mobile Human Computer Interactions", 2016 13th IEEE Annual Consumer Communications & Networking Conference (CCNC), Jan. 9-12, 2016, 5 pages. |
Molchanov, P. et al., "Short-Range FMCW Monopulse Radar for Hand-Gesture Sensing", May 10-15, 2015, 6 pages. |
Molchanov, Pavlo et al., "Online Detection and Classification of Dynamic Hand Gestures with Recurrent 3D Convolutional Neural Networks", IEEE Conference on Computer Vision and Pattern Recognition, Jun. 27, 2016, 10 pages. |
Norrdine, A., "An Algebraic Solution to the Multilateration Problem", 2012 International Conference on Indoor Positioning and Indoor Navigation, Nov. 13-15, 2012, 5 pages. |
Qadir, Shahida G., et al., "Focused ISAR Imaging of Rotating Target in Far-Field Compact Range Anechoic Chamber," 14th International Conference on Aerospace Sciences & Aviation Technology, ASAT-14-241-IP, May 24-26, 2011, 7 pages. |
Richards, Mark A., "Fundamentals of Radar Signal Processing," McGraw Hill Electronic Engineering, ISBN: 0-07-144474-2, Jun. 2005, 93 pages. |
Roy, P. et al., "A CNN Based Framework for Unistroke Numeral Recognition in Air-Writing", Computer Vision and Pattern Recognition Unit, Indian Statistical Institute, Feb. 23, 2019, 1 page. |
Schroff, Florian et al., "FaceNet: A Unified Embedding for Face Recognition and Clustering," CVF, CVPR2015, IEEE Computer Society Conference on Computer Vision and Pattern Recognition; Mar. 12, 2015, pp. 815-823. |
Simon, W., et al., "Highly Integrated Ka-Band TX Frontend Module Including 8x8 Antenna Array," IMST GmbH, Germany, Asia Pacific Microwave Conference, Dec. 7-10, 2009, 63 pages. |
Suleymanov, Suleyman, "Design and Implementation of an FMCW Radar Signal Processing Module for Automotive Applications," Master Thesis, University of Twente, Aug. 31, 2016, 61 pages. |
Thayaparan, T. et al., "Intelligent target recognition using micro-Doppler radar signatures," Defence R&D Canada, Radar Sensor Technology XIII, Proc. of SPIE, vol. 7308, 730817, Dec. 9, 2009, 11 pages. |
Thayaparan, T. et al., "Micro-Doppler Radar Signatures for Intelligent Target Recognition," Defence Research and Development Canada, Technical Memorandum, DRDC Ottawa TM 2004-170, Sep. 2004, 73 pages. |
Wang, Saiwen et al., "Interacting with Soli: Exploring Fine-Grained Dynamic Gesture Recognition in the Radio-Frequency Spectrum", ACM, Tokyo, Japan, Oct. 16-19, 2016, 10 pages. |
Wilder, Carol N., et al., "Respiratory patterns in infant cry," Canada Journal of Speech, Human Communication Winter, 1974-75, http://cjslpa.ca/files/1974_HumComm_Vol_01/No_03_2-60/Wilder_Baken_HumComm_1974.pdf, pp. 18-34. |
Xin, Qin et al., "Signal Processing for Digital Beamforming FMCW SAR," Hindawi Publishing Corporation, Mathematical Problems in Engineering, vol. 2014, Article ID 859890, http://dx.doi.org/10.1155/2014/859890, 11 pages. |
Yana, Buntueng et al., "Fusion Networks for Air-Writing Recognition", International Conference on Pervasive Computing, Springer, XP047460334, ISBN: 978-3-642-17318-9, Jan. 13, 2018, 11 pages. |
Zhang, J. et al., "Deformable Deep Convolutional Generative Adversarial Network in Microwave Based Hand Gestrue Recognition System", College of Information Science & Electronic Engineering, Mar. 4, 2019, 6 pages. |
Zhang, J. et al., "Doppler-Radar Based Hand Festrue Recognition System Using Convolutional Neural Networks", College of Information Science & Electronic Engineering, arXiv:1711.02254v3 [cs.CV] Nov. 22, 2017, 8 pages. |
Zhang, X. et al., "A New Writing Experience: Finger Writing in the Air Using a Kinect Sensor", Multimedia at Work, University of Missouri, Jul. 2013, 9 pages. |
Zhou, Z. et al., "Dynamic Gesture Recognition with a Terahertz Radar Based on Range Profile Sequences and Doppler Signatures", School of Electronic Engineering, University of Electronic Science and Technology of China, Sensors, Dec. 21, 2017, 15 pages. |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20220083075A1 (en) * | 2020-09-15 | 2022-03-17 | Infineon Technologies Ag | Robot Guiding System and Method |
US12242277B2 (en) * | 2020-09-15 | 2025-03-04 | Infineon Technologies Ag | Robot guiding system and method |
US20230003867A1 (en) * | 2021-06-30 | 2023-01-05 | Apple Inc. | Electronic Devices with Low Signal-to-Noise Ratio Range Measurement Capabilities |
US12044769B2 (en) * | 2021-06-30 | 2024-07-23 | Apple Inc. | Electronic devices with low signal-to-noise ratio range measurement capabilities |
Also Published As
Publication number | Publication date |
---|---|
CN111722706A (en) | 2020-09-29 |
EP3712809A1 (en) | 2020-09-23 |
CN111722706B (en) | 2025-01-17 |
US11686815B2 (en) | 2023-06-27 |
US20210365711A1 (en) | 2021-11-25 |
US20200302210A1 (en) | 2020-09-24 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11686815B2 (en) | Character recognition in air-writing based on network of radars | |
Arsalan et al. | Character recognition in air-writing based on network of radars for human-machine interface | |
Wang et al. | A gesture air-writing tracking method that uses 24 GHz SIMO radar SoC | |
Vo et al. | Multi-sensor joint detection and tracking with the Bernoulli filter | |
US10481696B2 (en) | Radar based user interface | |
US11069365B2 (en) | Detection and reduction of wind noise in computing environments | |
CN110941331A (en) | Gesture Recognition Using 3D mmWave Radar | |
KR102254331B1 (en) | Non-contact type mid-air gesture recognization apparatus and method | |
US20120274498A1 (en) | Personal electronic device providing enhanced user environmental awareness | |
JP2017156219A (en) | Tracking device, tracking method, and program | |
CN114972416A (en) | Radar-based object tracking using neural networks | |
US12189021B2 (en) | Radar-based target tracker | |
Kwon et al. | Detection scheme for a partially occluded pedestrian based on occluded depth in lidar–radar sensor fusion | |
US12307821B2 (en) | Radar-based gesture classification using a variational auto-encoder neural network | |
Arsalan et al. | Air-writing with sparse network of radars using spatio-temporal learning | |
Amin et al. | Radar classifications of consecutive and contiguous human gross‐motor activities | |
Santhalingam et al. | Expressive asl recognition using millimeter-wave wireless signals | |
US11644560B2 (en) | Tracking radar targets represented by multiple reflection points | |
Ninos et al. | Multi-user macro gesture recognition using mmwave technology | |
Yao et al. | Exploring radar data representations in autonomous driving: A comprehensive review | |
Park et al. | Bidirectional LSTM-based overhead target classification for automotive radar systems | |
Lin et al. | Airwrite: An aerial handwriting trajectory tracking and recognition system with mmwave | |
CN117197628A (en) | Millimeter wave radar false target judging method and device based on machine learning | |
Qiu et al. | A Survey of gesture recognition using frequency modulated continuous wave radar | |
US12183015B2 (en) | Interacting multi-model tracking algorithm using rest state model |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: INFINEON TECHNOLOGIES AG, GERMANY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SANTRA, AVIK;ARSALAN, MUHAMMAD;REEL/FRAME:048661/0104 Effective date: 20190321 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1551); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 4 |