Stars
React Native On-Device Machine Learning w/ Google ML Kit
A set of beautifully-designed, accessible components and a code distribution platform. Works with your favorite frameworks. Open Source. Open Code.
Starter App to Build Your Own App to Query Doc Collections with Large Language Models (LLMs) using LlamaIndex, Langchain, OpenAI and more (MIT Licensed)
An Excel-like grid component for React with custom cell editors, performant scroll & resizable columns
Welcome to the Llama Cookbook! This is your go-to guide for Building with Llama: Getting started with Inference, Fine-Tuning, and RAG. We also show you how to solve end-to-end problems using Llama mode…
⚡LLM Zoo is a project that provides data, models, and evaluation benchmarks for large language models.⚡
骆驼 (Luotuo): Open-sourced Chinese language models. Developed by 陈启源 @ Central China Normal University (华中师范大学), 李鲁鲁 @ SenseTime (商汤科技), and 冷子昂 @ SenseTime (商汤科技)
⚛️ Fast 3kB React alternative with the same modern API. Components & Virtual DOM.
Adapted from https://note.com/kohya_ss/n/nbf7ce8d80f29 for easier cloning
🤗 Diffusers: State-of-the-art diffusion models for image, video, and audio generation in PyTorch and FLAX.
Caption-Anything is a versatile tool combining image segmentation, visual captioning, and ChatGPT, generating tailored captions with diverse controls for user preferences. https://huggingface.co/sp…
Official codebase for I-JEPA, the Image-based Joint-Embedding Predictive Architecture. First outlined in the CVPR paper, "Self-supervised learning from images with a joint-embedding predictive architecture."
A GPT-4 AI Tutor Prompt for customizable personalized learning experiences.
AirLLM: 70B inference with a single 4GB GPU
Manipulate audio with a simple and easy high-level interface
Image to prompt with BLIP and CLIP
(CVPR 2023) Pytorch implementation of “T2M-GPT: Generating Human Motion from Textual Descriptions with Discrete Representations”
A universal Stable-Diffusion toolbox
Nightly release of ControlNet 1.1
Interact with your documents using the power of GPT, 100% privately, no data leaks
This repo contains the data preparation, tokenization, training and inference code for BLOOMChat. BLOOMChat is a 176 billion parameter multilingual chat model based on BLOOM.
LLM training code for Databricks foundation models
ChatGLM-6B: An Open Bilingual Dialogue Language Model | 开源双语对话语言模型
StableLM: Stability AI Language Models