- Netherlands
- https://stephanheijl.com/
Stars
Implementation of ST-MoE, the latest incarnation of MoE after years of research at Brain, in PyTorch
An AI-powered research assistant that performs iterative, deep research on any topic by combining search engines, web scraping, and large language models. The goal of this repo is to provide the si…
⚡ TabPFN: Foundation Model for Tabular Data ⚡ (a usage sketch follows after this list)
LayerNorm(SmallInit(Embedding)) in a Transformer to improve convergence (a minimal sketch follows after this list)
Collection of my work and learnings throughout my Master's in Applied Data Science from 2023 to 2025
A curated list of awesome things related to Applied Data Science
GEITje 7B: a large open Dutch language model
Official repository for the ProteinGym benchmarks
Reimplementation of RaSP, a deep neural network for rapid protein stability prediction, in PyTorch.
Making Protein Design accessible to all via Google Colab!
The official PyTorch implementation of the paper "Human Motion Diffusion Model"
Official repository for the paper "Tranception: Protein Fitness Prediction with Autoregressive Transformers and Inference-time Retrieval"
FauxPilot - an open-source alternative to GitHub Copilot server
Implementation of Tranception, an attention network, paired with retrieval, that is SOTA for protein fitness prediction
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2
Implementation of a memory efficient multi-head attention as proposed in the paper, "Self-attention Does Not Need O(n²) Memory"
Making Protein folding accessible to all!
Implementation of SE3-Transformers for Equivariant Self-Attention, in PyTorch. This specific repository is geared towards integration with an eventual AlphaFold2 replication.
Get protein embeddings from protein sequences
Open source code for AlphaFold 2.
NFNets and Adaptive Gradient Clipping for SGD implemented in PyTorch. Find explanation at tourdeml.github.io/blog/
Dataframes powered by a multithreaded, vectorized query engine, written in Rust
ProtTrans provides state-of-the-art pretrained language models for proteins. ProtTrans was trained on thousands of GPUs from Summit and hundreds of Google TPUs using Transformer models.
Listing of papers about machine learning for proteins.
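
TabPFN usage sketch: a minimal example of fitting the pretrained tabular foundation model, assuming the pip-installable `tabpfn` package and its scikit-learn-style `fit`/`predict` interface; the dataset and split here are illustrative, not from the repository.

```python
# pip install tabpfn scikit-learn
from sklearn.datasets import load_breast_cancer
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from tabpfn import TabPFNClassifier

# Small tabular classification task for demonstration purposes.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# TabPFN is a pretrained foundation model: no per-dataset hyperparameter
# tuning, predictions come from the pretrained network conditioned on the
# training set.
clf = TabPFNClassifier()
clf.fit(X_train, y_train)
preds = clf.predict(X_test)
print("test accuracy:", accuracy_score(y_test, preds))
```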
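LayerNorm(SmallInit(Embedding)) sketch: a minimal PyTorch illustration of the idea named in the list above, i.e. an embedding layer initialized with a very small scale and normalized with LayerNorm before entering the Transformer. The module name and the `init_scale` value are illustrative assumptions, not the repository's exact code.

```python
import torch
import torch.nn as nn

class SmallInitEmbedding(nn.Module):
    """Token embedding with small-scale initialization followed by LayerNorm."""

    def __init__(self, vocab_size: int, dim: int, init_scale: float = 1e-4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, dim)
        # SmallInit: shrink the usual embedding initialization to a tiny scale.
        nn.init.normal_(self.embed.weight, mean=0.0, std=init_scale)
        # LayerNorm applied directly on top of the embedding output.
        self.norm = nn.LayerNorm(dim)

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        return self.norm(self.embed(token_ids))

# Example usage
tokens = torch.randint(0, 1000, (2, 16))           # (batch, seq_len)
emb = SmallInitEmbedding(vocab_size=1000, dim=64)
out = emb(tokens)                                   # (2, 16, 64), unit scale after LayerNorm
```

The LayerNorm rescales the tiny embeddings back to unit scale, so the rest of the Transformer sees well-conditioned inputs from the first training step.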