Stars
This is a note on matrix derivatives, describing my own experience in detail. Hope you'll like it.
A curated list of pretrained sentence and word embedding models
Use tensorflow.contrib.slim to train a simple CNN classification model for multi-task learning
Multi-Task Deep Neural Networks for Natural Language Understanding
ALBERT: A Lite BERT for Self-supervised Learning of Language Representations
Swift Core ML 3 implementations of GPT-2, DistilGPT-2, BERT, and DistilBERT for question answering. Other Transformers coming soon!
TextAttack 🐙 is a Python framework for adversarial attacks, data augmentation, and model training in NLP https://textattack.readthedocs.io/en/master/
Visualizer for neural network, deep learning and machine learning models
An example project for running Pose Estimation inference using Core ML
[ACL 2021] LM-BFF: Better Few-shot Fine-tuning of Language Models https://arxiv.org/abs/2012.15723
Quantized Neural Networks (QNNs) on PYNQ
Official code of our work, PolicyQA: A Reading Comprehension Dataset for Privacy Policies [Findings of EMNLP 2020].
An open-source Python framework for hybrid quantum-classical machine learning.
A Python framework for sequence labeling evaluation (named-entity recognition, POS tagging, etc.)
PrivacyQA, a resource to support question-answering over privacy policies.
Pretrained language model and its related optimization techniques developed by Huawei Noah's Ark Lab.
Code and dataset of EMNLP 2020 paper "Infusing Disease Knowledge into BERT for Health Question Answering, Medical Inference and Disease Name Recognition"
UnifiedQA: Crossing Format Boundaries With a Single QA System
Code for "Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks"
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
A model library for exploring state-of-the-art deep learning topologies and techniques for optimizing Natural Language Processing neural networks
Block-sparse movement pruning
Ongoing research training transformer models at scale
MobileBERT: a Compact Task-Agnostic BERT for Resource-Limited Devices