Stars
Source code for the ICML 2022 paper: "Orchestra: Unsupervised Federated Learning via Globally Consistent Clustering"
FedBE: Making Bayesian Model Ensemble Applicable to Federated Learning
Pre-training without Natural Images (ACCV 2020 Best Paper Honorable Mention Award)
Elegant PyTorch implementation of paper Model-Agnostic Meta-Learning (MAML)
Pretrained TorchVision models on CIFAR10 dataset (with weights)
A small React app to monitor ARK funds daily transactions
[NeurIPS 2020] Balanced Meta-Softmax for Long-Tailed Visual Recognition
[CVPR 2020] Rethinking Class-Balanced Methods for Long-Tailed Visual Recognition from a Domain Adaptation Perspective
higher is a PyTorch library that lets users obtain higher-order gradients over losses spanning entire training loops rather than individual training steps.
NeurIPS'19: Meta-Weight-Net: Learning an Explicit Mapping For Sample Weighting (Pytorch implementation for noisy labels).
Salvaging Federated Learning by Local Adaptation
Implementation of Transformer model (originally from Attention is All You Need) applied to Time Series.
Python library of various financial technical indicators
Code for Non-convex Learning via Replica Exchange Stochastic Gradient MCMC, ICML 2020.
Personalized Federated Learning with Moreau Envelopes (pFedMe) using Pytorch (NeurIPS 2020)
Machine learning and deep learning models for stock forecasting, including trading bots and simulations
InstaHide: Instance-hiding Schemes for Private Distributed Learning
Bioinformatics'2020: BioBERT: a pre-trained biomedical language representation model for biomedical text mining
On-the-fly image data augmentation by adding new transform classes to PyTorch and torchvision.
BlueBERT, pre-trained on PubMed abstracts and clinical notes (MIMIC-III).
Continual Learning with Hypernetworks. A continual learning approach that has the flexibility to learn a dedicated set of parameters, fine-tuned for every task, that doesn't require an increase in …