- Environment: Check the `requirements.txt` file, which was generated for the environment requirements using `pip list --format=freeze > requirements.txt`. The file has been lightly filtered by hand, so some redundant packages may remain.
- Dataset: Download the dataset (training and testing) from this link. The password is `conditional_biometrics`. Ensure that the datasets are located in the `data` directory, then configure the `datasets_config.py` file to point to this data directory by changing the main path (see the sketch after this list).
- Pre-trained models (optional): The pre-trained MobileFaceNet model for fine-tuning or testing can be downloaded from this link.
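For reference, a minimal sketch of the `datasets_config.py` change described above, assuming the `main_path` dictionary with a `'main'` key mentioned in the directory structure below; verify the key names against the actual file, and note that the subfolder names here are only hypothetical:

```python
# configs/datasets_config.py (sketch) -- usually only the 'main' entry needs editing.
# Key names follow the description in the directory structure below; verify them
# against the actual file before relying on this.
main_path = {
    'main': '/home/ael_net/data',  # absolute path to the dataset directory, no trailing slash
}

# Dataset subdirectories would then typically be resolved relative to the main path:
import os

train_dir = os.path.join(main_path['main'], 'train')  # hypothetical subfolder name
test_dir = os.path.join(main_path['main'], 'test')    # hypothetical subfolder name
```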
- Change the hyperparameters in the `params.py` file as needed. The values provided there are the defaults, but they can also be overridden when running the Python file (see the sketch after this list).
- Run `python training/main.py`. Training should start immediately.
- Testing is performed automatically after training is done, but it is also possible to test an already trained model (see the next section).
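As an illustration of overriding the defaults at run time, a minimal argparse-style sketch is shown below; the argument names and default values are assumptions, not the actual contents of `params.py`:

```python
# Sketch of how params.py-style defaults can be exposed as command-line overrides.
# The argument names and default values here are illustrative only.
import argparse

# Hypothetical defaults mirroring what params.py might define.
DEFAULTS = {'lr': 0.001, 'batch_size': 128, 'epochs': 50}

parser = argparse.ArgumentParser(description='AELNet training (illustrative)')
parser.add_argument('--lr', type=float, default=DEFAULTS['lr'])
parser.add_argument('--batch_size', type=int, default=DEFAULTS['batch_size'])
parser.add_argument('--epochs', type=int, default=DEFAULTS['epochs'])
args = parser.parse_args()

# main.py would then pass these values to the dataloader and training loop, e.g.:
#   python training/main.py --lr 0.0005 --batch_size 64   (flag names hypothetical)
```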
- Based on the (pre-)trained models in the `models` (or `models/pretrained`) directory, load the correct model and architecture (in the `network` directory) using the `load_model.py` file. Adjust the file accordingly in case of different layer names, etc. (see the sketch after this list).
- Evaluation:
    - Identification / Cumulative Matching Characteristic (CMC) curve: Run `cmc_eval_identification.py`. Based on the generated `.pt` files in the `data` directory, run `plot_cmc_roc_sota.ipynb` to generate the CMC graph.
    - Verification / Receiver Operating Characteristic (ROC) curve: Run `roc_eval_verification.py`. Based on the generated `.pt` files in the `data` directory, run `plot_cmc_roc_sota.ipynb` to generate the ROC graph.
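A minimal loading sketch is given below, assuming the architecture class in `network/ael_net.py` can be imported directly and the checkpoint stores a plain `state_dict`; the actual class name, constructor arguments, and any remapping done in `load_model.py` may differ:

```python
# Minimal loading sketch; class and file names are assumptions, not the repo's API.
import torch
from network.ael_net import AELNet  # hypothetical class name inside ael_net.py

model = AELNet()  # constructor arguments, if any, are omitted here
checkpoint = torch.load('models/pretrained/mobilefacenet.pt', map_location='cpu')

# Some checkpoints wrap the weights under a key such as 'state_dict'; unwrap if present.
state_dict = checkpoint.get('state_dict', checkpoint) if isinstance(checkpoint, dict) else checkpoint

# If layer names differ (e.g. a 'module.' prefix from DataParallel), remap them first.
state_dict = {k.replace('module.', '', 1): v for k, v in state_dict.items()}

missing, unexpected = model.load_state_dict(state_dict, strict=False)
print('missing keys:', missing)
print('unexpected keys:', unexpected)
model.eval()
```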
- Visualization:
    - Gradient-weighted Class Activation Mapping (Grad-CAM): Run `grad_cam.py`, based on the selected images that are stored in a directory. The images will be generated in the `graphs` directory.
    - t-distributed Stochastic Neighbor Embedding (t-SNE): Run the Jupyter notebook accordingly. Based on the included text file in `data/visualization/tsne/img_lists`, 10 toy identities are selected to plot the t-SNE points, which will be generated in the `graphs` directory (see the sketch after this list).
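For reference, a minimal t-SNE sketch over pre-computed embeddings is shown below; it assumes the embeddings and identity labels are already available as arrays, whereas `plot_tSNE.ipynb` also handles the image lists and model inference:

```python
# Minimal t-SNE sketch over pre-computed embeddings; variable names are illustrative.
import numpy as np
import matplotlib.pyplot as plt
from sklearn.manifold import TSNE

# Assume `embeddings` is (N, D) and `labels` holds one of 10 toy identities per row.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(500, 512))   # placeholder for real features
labels = rng.integers(0, 10, size=500)     # placeholder identity labels

points = TSNE(n_components=2, perplexity=30, random_state=0).fit_transform(embeddings)

plt.figure(figsize=(6, 6))
scatter = plt.scatter(points[:, 0], points[:, 1], c=labels, cmap='tab10', s=10)
plt.legend(*scatter.legend_elements(), title='identity', loc='best', fontsize='small')
plt.savefig('graphs/tsne_sketch.png', dpi=200)
```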
| Method | Rank-1 IR (%) (Periocular Gallery) | Rank-1 IR (%) (Face Gallery) | EER (%) (Periocular-Face) |
|---|---|---|---|
|  | 77.26 | 68.22 | 9.80 |
|  | 64.72 | 64.01 | 13.14 |
|  | 90.77 | 88.93 | 6.50 |
|  | 89.34 | 85.35 | 9.41 |
|  | 92.47 | 90.71 | 6.31 |
```
├── configs: Dataset path configuration file and hyperparameters.
│   ├── datasets_config.py - Contains the directory path for dataset files. Change 'main' in the 'main_path' dictionary to point to the dataset, e.g., /home/ael_net/data (without a trailing slash).
│   └── params.py - Adjust hyperparameters and arguments in this file for training.
├── data: Dataloader functions and preprocessing.
│   ├── [INSERT DATASET HERE.]
│   └── data_loader.py - Generates the training and testing PyTorch dataloaders. Adjust the augmentations etc. in this file. The batch size is also determined here, based on the values set in params.py.
├── eval: Evaluation metrics (identification and verification). Also contains CMC and ROC evaluations.
│   ├── cmc_eval_identification.py - Evaluates the Rank-1 Identification Rate (IR) and generates the Cumulative Matching Characteristic (CMC) curve, which are saved as .pt files in the data directory. Use these .pt files to generate CMC curves (see the sketch below this directory structure).
│   ├── plot_cmc_roc_sota.ipynb - Notebook to plot CMC and ROC curves side by side, based on the .pt files generated by cmc_eval_identification.py and roc_eval_verification.py. The graph is generated in the graphs directory.
│   ├── plot_tSNE.ipynb - Notebook to plot t-SNE images based on the 10 identities of the periocular-face toy examples. Example text files (which correspond to the image paths) are in data/visualization/tsne/img_lists.
│   └── roc_eval_verification.py - Evaluates the Verification Equal Error Rate (EER) and generates the Receiver Operating Characteristic (ROC) curve, which are saved as .pt files in the data directory. Use these .pt files to generate ROC curves.
├── graphs: Directory where graphs are generated.
│   └── CMC and ROC curve files are generated in this directory. Some evaluation images are also generated here.
├── logs: Directory where logs are generated.
│   └── Logs will be generated in this directory. Each log folder contains backups of the training files, network files, and hyperparameters used.
├── models: Directory to store pre-trained models. Trained models are also generated in this directory.
│   ├── [INSERT PRE-TRAINED MODELS HERE.]
│   ├── Trained models will also be stored in this directory.
│   └── The base MobileFaceNet for fine-tuning AELNet can be downloaded from this link.
├── network: Contains loss functions and network-related files.
│   ├── ael_net.py - Architecture file for AELNet.
│   ├── load_model.py - Loads pre-trained weights based on a given model.
│   └── logits.py - Contains the loss functions that are used.
└── training: Main files for training AELNet.
    ├── main.py - Main file to run for training. Settings and hyperparameters are based on the files in the configs directory.
    └── train.py - Training file that is called from main.py. Gets batches from the dataloader and contains the criterion for loss back-propagation.
```
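As a rough sketch of how a generated `.pt` file might be consumed to plot a CMC curve (the file name and the assumption that it stores a 1-D tensor of per-rank identification rates are illustrative; the actual outputs of `cmc_eval_identification.py` may differ):

```python
# Sketch of plotting a CMC curve from a saved .pt file; the file name and the
# assumed 1-D tensor of per-rank accuracies are illustrative only.
import torch
import matplotlib.pyplot as plt

cmc = torch.load('data/cmc_periocular_gallery.pt')  # hypothetical file name
cmc = torch.as_tensor(cmc).flatten()

ranks = range(1, len(cmc) + 1)
plt.plot(ranks, cmc.numpy() * 100, marker='o', markersize=3)
plt.xlabel('Rank')
plt.ylabel('Identification rate (%)')
plt.title('CMC curve (sketch)')
plt.grid(True)
plt.savefig('graphs/cmc_sketch.png', dpi=200)
```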
@ARTICLE{ael_net,
title = {Attention-aware ensemble learning for face-periocular cross-modality matching},
journal = {Applied Soft Computing},
volume = {175},
pages = {113044},
year = {2025},
issn = {1568-4946},
doi = {https://doi.org/10.1016/j.asoc.2025.113044},
author = {Tiong-Sik Ng and Andrew Beng Jin Teoh}
}