Kalman Variational Autoencoder (K-VAE)

This repository contains a PyTorch implementation of the Kalman Variational Autoencoder (K-VAE) from the paper "A Disentangled Recognition and Nonlinear Dynamics Model for Unsupervised Learning" (Fraccaro et al., NeurIPS 2017; arXiv:1710.05741). The model disentangles an object's appearance, encoded by a variational autoencoder, from its dynamics, modelled by a linear Gaussian state-space model, allowing both to be learned from video without supervision.
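The dynamics component of the K-VAE is a linear Gaussian state-space model, whose latent states can be inferred in closed form by Kalman filtering. As a toy illustration of that inference (a scalar sketch only, not the repository's implementation), a filter step looks like:

```python
# Toy scalar Kalman filter: illustrates the closed-form latent-state
# inference performed by the K-VAE's linear Gaussian state-space model.
# This is a 1D sketch for intuition, NOT the repository's code.

def kalman_filter(ys, a=1.0, c=1.0, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """Filter observations ys under x_t = a*x_{t-1} + noise(q),
    y_t = c*x_t + noise(r). Returns filtered means and variances."""
    means, variances = [], []
    m, p = m0, p0
    for y in ys:
        # Predict: propagate mean and variance through the dynamics.
        m_pred = a * m
        p_pred = a * p * a + q
        # Update: correct the prediction with the observation
        # via the Kalman gain.
        k = p_pred * c / (c * p_pred * c + r)
        m = m_pred + k * (y - c * m_pred)
        p = (1.0 - k * c) * p_pred
        means.append(m)
        variances.append(p)
    return means, variances
```

Given noisy observations of a constant signal, the filtered mean converges to the signal while the posterior variance settles to a steady state.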

Getting Started

Follow these steps to set up the environment, install dependencies, and run the training and evaluation scripts.

Prerequisites

Clone the Repository

Clone the repository with submodules:

git clone --recursive https://github.com/nkiyohara/kalman-vae.git
cd kalman-vae

Set Up Environment and Install Dependencies

Create a new Conda environment and install the required packages:

conda create --name kvae-env python=3.11
conda activate kvae-env

# Install dependencies from conda-forge
conda install -c conda-forge opencv pygame pymunk

# Install other specific dependencies
conda install matplotlib~=3.8.0 numpy~=1.26.0 pandas~=2.1.1 Pillow~=10.0.1 tqdm~=4.65.0 wandb~=0.15.12

# For PyTorch installation, refer to the official website to select the appropriate version and CUDA support
# Visit https://pytorch.org for instructions
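PyTorch is installed separately because the correct build depends on your platform and CUDA version. As one example (an assumption for CPU-only setups; copy the exact command generated at pytorch.org for anything else):

```shell
# Example only: CPU build of PyTorch via pip.
# For CUDA builds, use the command generated at https://pytorch.org
# for your CUDA version instead.
pip install torch torchvision
```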

Install the K-VAE Package

Install the K-VAE package using pip:

pip install .

Training

Modify examples/run_training.sh as needed, then run the training script:

cd examples
bash run_training.sh

Evaluation

After training, modify examples/run_evaluation.sh as needed, then run the evaluation script to assess performance:

cd examples
bash run_evaluation.sh --checkpoint_dir [YOUR_CHECKPOINT_DIR] --epoch [EPOCH_NUMBER]

Evaluation videos and performance tables will be saved in the videos/ and tables/ directories under the specified checkpoint directory. For an example of the output, see the evaluation video idx_2_mask_length_30.mp4.
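The mask_length in the video filename suggests the evaluation hides a stretch of frames and lets the model impute them from the latent dynamics. As a toy illustration of that idea (a scalar sketch under simplified assumptions, not the repository's code), a Kalman smoother can fill a gap of missing observations:

```python
# Toy scalar Kalman smoother with missing observations: sketches how a
# masked stretch of a sequence can be imputed from the latent dynamics.
# Illustrative 1D example only, NOT the repository's implementation.

def smooth(ys, a=1.0, q=0.1, r=0.5, m0=0.0, p0=1.0):
    """ys: list of floats or None (masked). Observation model y = x + noise.
    Returns smoothed means for every time step, masked ones included."""
    # Forward (filter) pass, skipping the update where y is None.
    ms, ps, m_preds, p_preds = [], [], [], []
    m, p = m0, p0
    for y in ys:
        m_pred, p_pred = a * m, a * p * a + q
        m_preds.append(m_pred)
        p_preds.append(p_pred)
        if y is None:
            m, p = m_pred, p_pred        # no observation: predict only
        else:
            k = p_pred / (p_pred + r)    # Kalman gain (c = 1)
            m = m_pred + k * (y - m_pred)
            p = (1.0 - k) * p_pred
        ms.append(m)
        ps.append(p)
    # Backward (Rauch-Tung-Striebel) pass: refine each estimate
    # using information from the future.
    sm = ms[-1]
    out = [sm]
    for t in range(len(ys) - 2, -1, -1):
        g = ps[t] * a / p_preds[t + 1]
        sm = ms[t] + g * (sm - m_preds[t + 1])
        out.append(sm)
    out.reverse()
    return out
```

Smoothing a sequence with a masked gap fills the gap with values interpolated between the observed segments on either side, which is the same mechanism a state-space model uses to impute masked video frames in latent space.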

Usage

After completing the setup, you can use the K-VAE model for your research and experiments. Feel free to modify the training and evaluation scripts to explore different configurations.

Acknowledgments

License

This project is licensed under the MIT License - see the LICENSE file for details.
