Starred repositories
Code for studying the super weight in LLMs
Notes from the Latent Space paper club. Follow along or start your own!
Supporting PyTorch FSDP for optimizers
A list of tools and clients available for the Bluesky platform
The Programmable Cypher-based Neuro-Symbolic AGI that lets you program its behavior using Graph-based Prompt Programming: for people who want AI to behave as expected
Meta Lingua: a lean, efficient, and easy-to-hack codebase to research LLMs.
Entropy Based Sampling and Parallel CoT Decoding
Official code for "RB-Modulation: Training-Free Personalization of Diffusion Models using Stochastic Optimal Control"
Code for "Chameleon: Plug-and-Play Compositional Reasoning with Large Language Models"
An extremely fast Python package and project manager, written in Rust.
The AI Scientist: Towards Fully Automated Open-Ended Scientific Discovery 🧑‍🔬
Official inference repo for FLUX.1 models
A collection of AWESOME things about mixture-of-experts
[ICML2024 (Oral)] Official PyTorch implementation of DoRA: Weight-Decomposed Low-Rank Adaptation
Toolkit to segment text into sentences or other semantic units in a robust, efficient and adaptable way.
Implementation of Diffusion Transformer (DiT) in JAX
Video+code lecture on building nanoGPT from scratch
Want to contribute to an open-source project but don't know how? Here is how to get started.
Lumina-T2X is a unified framework for Text to Any Modality Generation
Pax is a Jax-based machine learning framework for training large scale models. Pax allows for advanced and fully configurable experimentation and parallelization, and has demonstrated industry lead…
Orbax provides common checkpointing and persistence utilities for JAX users