Tsinghua University
Hangzhou, China
https://ericli0419.github.io/
https://scholar.google.com/citations?user=_Me9AmsAAAAJ&hl=en
Starred repositories
Accelerate Molecular Biology Research with Machine Learning
Unified Efficient Fine-Tuning of 100+ LLMs & VLMs (ACL 2024)
Development repository for the Triton language and compiler
Implementation of Rotary Embeddings, from the Roformer paper, in Pytorch
Accelerated, Python-only, single-cell integration benchmarking metrics
Orthrus is a mature-RNA model for RNA property prediction. It uses a Mamba encoder backbone, a variant of state-space models designed for long-sequence data such as RNA.
BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.)
Benchmarking DNA Language Models on Biologically Meaningful Tasks
Python wrapper -- and more -- for BEDTools (bioinformatics tools for "genome arithmetic")
Explore a comprehensive collection of basic theory, applications, papers, and best practices for Large Language Models (LLMs) in genomics.
GENA-LM is a transformer masked language model trained on human DNA sequence.
Tutorial on large language models for genomics
🧬 Nucleotide Transformer: Building and Evaluating Robust Foundation Models for Human Genomics
From unknown beginner to large language model (LLM) hero; stay tuned for updates!
End-to-end analysis of spatial multi-omics data
GENERator: A Long-Context Generative Genomic Foundation Model
Understanding Deep Learning - Simon J.D. Prince
Jupyter notebooks for the Natural Language Processing with Transformers book
We want to create a repo to illustrate the usage of Transformers, in Chinese.
PyTorch Tutorial for Deep Learning Researchers
Arc Virtual Cell Atlas
Course to get into Large Language Models (LLMs) with roadmaps and Colab notebooks.