Lumina-T2X is a unified framework for Text-to-Any-Modality Generation (updated Aug 6, 2024; Python)
[NeurIPS 2024🔥] DreamClear: High-Capacity Real-World Image Restoration with Privacy-Safe Dataset Curation
OpenMusic: state-of-the-art text-to-music (TTM) generation
Implementation of F5-TTS in MLX
Taming FLUX for image inversion & editing; OpenSora for video inversion & editing (official implementation of "Taming Rectified Flow for Inversion and Editing")
Official codebase of "DiT-3D: Exploring Plain Diffusion Transformers for 3D Shape Generation"
[ICCV 2023] Efficient Diffusion Training via Min-SNR Weighting Strategy
Adaptive Caching for Faster Video Generation with Diffusion Transformers
The official implementation of "CAME: Confidence-guided Adaptive Memory Optimization"
arXiv paper "Progressive Autoregressive Video Diffusion Models": https://arxiv.org/abs/2410.08151
Implementation of the Diffusion Transformer model in PyTorch
Implementation of F5-TTS in Swift using MLX
FORA introduces a simple yet effective caching mechanism in the Diffusion Transformer architecture for faster inference sampling.
This repo implements Diffusion Transformers (DiT) in PyTorch and provides training and inference code on the CelebHQ dataset
Implementation of the Latent Diffusion Transformer model in TensorFlow/Keras
[NeurIPS 2024 (Spotlight)] "Unified Gradient-Based Machine Unlearning with Remain Geometry Enhancement" by Zhehao Huang, Xinwen Cheng, JingHao Zheng, Haoran Wang, Zhengbao He, Tao Li, Xiaolin Huang
A modified version of the Diffusion Transformer
This repo implements a video generation model using Latent Diffusion Transformers (Latte) in PyTorch and provides training and inference code on the Moving MNIST and UCF101 datasets
PyTorch and JAX implementations of "Scalable Diffusion Models with Transformers" (Diffusion Transformers, DiT)
A diffusion transformer implementation in Flax
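A building block shared by many of the DiT-style implementations listed above is a sinusoidal timestep embedding, which turns the scalar diffusion step into a vector the transformer can condition on. A minimal plain-Python sketch (the function name and dimensions are illustrative, not taken from any specific repository above):

```python
import math

def timestep_embedding(t, dim, max_period=10000.0):
    # Embed a scalar diffusion timestep t into a `dim`-dimensional vector
    # using sinusoids at geometrically spaced frequencies, in the style of
    # Transformer positional encodings (illustrative sketch, not any
    # repo's exact API).
    half = dim // 2
    # Frequencies decay from 1 down toward 1/max_period.
    freqs = [math.exp(-math.log(max_period) * i / half) for i in range(half)]
    return [math.cos(t * f) for f in freqs] + [math.sin(t * f) for f in freqs]

emb = timestep_embedding(10, 8)  # 8-dimensional embedding for timestep 10
```

In the DiT architecture itself, an embedding like this is typically passed through a small MLP and used to modulate the transformer blocks (e.g. via adaptive layer norm), but the details vary across the implementations above.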