
Hand Gesture Echo Based on Millimeter Wave Radar; JUST; China

Citation Author(s): Biao Jin
Submitted by: Biao Jin
Last updated: Sat, 03/12/2022 - 09:47
DOI: 10.21227/sc9v-4k68

Abstract 

Millimeter-wave radar can sense subtle hand movements. However, traditional hand gesture recognition methods are not robust in scenarios with dynamic interference. To address this issue, a robust hand gesture recognition method is proposed based on self-attention time-series neural networks. First, the original radar echo is organized into frames, sequences, and channels at the input of the network. To extract features from each frame sequence independently, a one-dimensional time-series neural network is built, with a time-distributed layer used as the wrapper. The self-attention mechanism is then employed to assign appropriate weights to the frames entered in parallel, capturing inter-frame correlation and suppressing random interference. Finally, a global average-pooling layer reduces the number of channels, and a fully connected layer outputs the gesture label. Experimental results show that the proposed method achieves a high recognition rate in the presence of 25% random dynamic interference.
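The pipeline described above (per-frame 1-D feature extraction applied time-distributed, self-attention across the frame sequence, global average pooling, and a fully connected classifier) can be sketched in PyTorch. This is a minimal illustration only: all layer sizes, the frame/sample counts, and the class name `AttenTsNNSketch` are assumptions, not the authors' exact configuration.

```python
import torch
import torch.nn as nn

class AttenTsNNSketch(nn.Module):
    """Illustrative sketch of the described pipeline; all sizes are assumed."""
    def __init__(self, n_classes=5, channels=16):
        super().__init__()
        # Per-frame 1-D time-series feature extractor, applied to every
        # frame independently (i.e. "time-distributed").
        self.frame_net = nn.Sequential(
            nn.Conv1d(1, channels, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        # Self-attention over the frame sequence: weights frames and
        # captures inter-frame correlation.
        self.attn = nn.MultiheadAttention(embed_dim=channels, num_heads=2,
                                          batch_first=True)
        self.fc = nn.Linear(channels, n_classes)

    def forward(self, x):
        # x: (batch, frames, samples) raw echo sequence
        b, f, s = x.shape
        z = self.frame_net(x.reshape(b * f, 1, s))  # (b*f, channels, 1)
        z = z.reshape(b, f, -1)                     # (b, frames, channels)
        z, _ = self.attn(z, z, z)                   # frame-wise weighting
        z = z.mean(dim=1)                           # global average pooling
        return self.fc(z)                           # gesture logits

logits = AttenTsNNSketch()(torch.randn(2, 32, 128))
print(logits.shape)  # torch.Size([2, 5])
```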

Instructions: 

****** Atten-TsNN Model ******

 

This is the running program for the Atten-TsNN model. The files are described as follows:

 

* main.py

 

  Load data and labels from the DataSets folder. 

 

  Preprocessing is done by calling 'FileHandle.py'. 

 

  Initialize the Atten-TsNN model by calling 'Model_torch.py'.

 

  Train and validate by calling 'Trainer.py'.

 

* FileHandle.py

 

  Implements data labeling, tensor transformation, dimensional transformation, and so on.

 

* Model_torch.py

 

  Initialize model parameters

 

* Trainer.py

 

  Training, validation, and testing.
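The four steps that main.py performs (load data, preprocess, build the model, train and validate) can be outlined with the toy stand-ins below. The function bodies here are hypothetical placeholders so the outline runs end to end; in the repository these roles are played by FileHandle.py, Model_torch.py, and Trainer.py, whose actual APIs may differ.

```python
import torch
import torch.nn as nn

def load_data():
    # Placeholder for loading from DataSets/: 20 random samples of
    # 32 frames x 128 points each, with labels for 5 gesture classes.
    x = torch.randn(20, 32, 128)
    y = torch.randint(0, 5, (20,))
    return x, y

def preprocess(x):
    # Placeholder for FileHandle.py: tensor/dimension transforms,
    # here just a global normalization.
    return (x - x.mean()) / x.std()

def build_model():
    # Placeholder for Model_torch.py: any classifier mapping the
    # frame sequence to 5 gesture labels.
    return nn.Sequential(nn.Flatten(), nn.Linear(32 * 128, 5))

def train(model, x, y, epochs=3):
    # Placeholder for Trainer.py: a standard cross-entropy training loop.
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        opt.step()
    return loss.item()

x, y = load_data()
model = build_model()
final_loss = train(model, preprocess(x), y)
print(f"final training loss: {final_loss:.3f}")
```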

 

 

 

The following is the folder description:

 

* DataSets

 

  Data set, divided into 5 categories by folder, with class.txt as the label file. Because the full data set is too large, only a single sample is given here. For instance, 'adc1_1_Raw_0.bin' is data without interference, and 'Interference1_1.bin' is data with interference.

 

* hook

 

  If you want to visualize, please set hook_flag = True, and the input and output of the Attention layer will be stored in the hook folder in numpy format.
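The hook_flag behavior described above can be reproduced with a standard PyTorch forward hook. The sketch below attaches a hook to a stand-in attention layer and saves its input and output as .npy files in a hook folder; the layer, flag handling, and file names here are illustrative assumptions, not the repository's exact code.

```python
import os
import numpy as np
import torch
import torch.nn as nn

hook_flag = True  # mirrors the repository's visualization switch

# Stand-in attention layer; in the repository the hook would be attached
# to the model's actual Attention layer.
attn = nn.MultiheadAttention(embed_dim=16, num_heads=2, batch_first=True)

saved = {}

def save_io(module, inputs, output):
    # Capture the layer's input (query tensor) and output as numpy arrays
    # and persist them to the hook folder.
    saved["input"] = inputs[0].detach().numpy()
    saved["output"] = output[0].detach().numpy()  # output is (attn_out, weights)
    os.makedirs("hook", exist_ok=True)
    np.save(os.path.join("hook", "attention_input.npy"), saved["input"])
    np.save(os.path.join("hook", "attention_output.npy"), saved["output"])

if hook_flag:
    handle = attn.register_forward_hook(save_io)

x = torch.randn(2, 8, 16)      # (batch, frames, embed_dim)
attn(x, x, x)                  # triggers the hook
print(saved["input"].shape, saved["output"].shape)  # (2, 8, 16) (2, 8, 16)
```

Calling `handle.remove()` detaches the hook again once the arrays have been collected.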