WorldMem: Long-term Consistent World Simulation with Memory

Zeqi Xiao¹, Yushi Lan¹, Yifan Zhou¹, Wenqi Ouyang¹, Shuai Yang², Yanhong Zeng³, Xingang Pan¹
¹S-Lab, Nanyang Technological University
²Wangxuan Institute of Computer Technology, Peking University
³Shanghai AI Laboratory

Demo video: demo.1.1.mp4

Installation

conda create python=3.10 -n worldmem
conda activate worldmem
pip install -r requirements.txt
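
After installation, a quick way to confirm the environment is usable is the short Python check below. This is a minimal sketch, assuming requirements.txt pulls in PyTorch with CUDA support (which GPU inference for a diffusion-based world model typically needs); the file name sanity_check.py is hypothetical and not part of this repository.

# sanity_check.py: hypothetical helper, not part of this repository
import torch

print("PyTorch version:", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    # Report the GPU that inference would run on
    print("GPU:", torch.cuda.get_device_name(0))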

Quick start

python app.py

TODO

  • Release inference models and weights;
  • Release training pipeline on Minecraft;
  • Release training data on Minecraft;

🔗 Citation

If you find our work helpful, please cite:

@misc{xiao2025worldmemlongtermconsistentworld,
      title={WORLDMEM: Long-term Consistent World Simulation with Memory}, 
      author={Zeqi Xiao and Yushi Lan and Yifan Zhou and Wenqi Ouyang and Shuai Yang and Yanhong Zeng and Xingang Pan},
      year={2025},
      eprint={2504.12369},
      archivePrefix={arXiv},
      primaryClass={cs.CV},
      url={https://arxiv.org/abs/2504.12369}, 
}

👏 Acknowledgements

  • Diffusion Forcing: Diffusion Forcing provides flexible training and inference strategies for our method.
  • Minedojo: We collect our Minecraft dataset from Minedojo.
  • Open-oasis: Our model architecture is based on Open-oasis. We also use the pretrained VAE and DiT weights from it.
