

ICML · arXiv · arXiv · License: AGPL-v3 · All Contributors

AstroPT: a Large Observation (foundation) Model for astronomy 🔭

Welcome to our simple repository for training astronomical large observation models. This repository began its life as Andrej Karpathy's nanoGPT, and has been altered so that it is usable for astronomical observation data. Within `train.py` you will find a ~300-line boilerplate training loop and within `model.py` you will find a ~300-line GPT model definition with an MLP tokeniser and a regressive loss.

Check out the UniverseTBD Discord for updates: discord.gg/MNEVegvfJq

Read the docs here: astropt.readthedocs.io

How does AstroPT work?

AstroPT is an autoregressive transformer under the hood.

Just as language models predict the next word in a sentence, AstroPT processes sequences of astronomical data chunks to predict what comes next.

The intuition here is that this next-token-prediction task requires the model to internalise some understanding of the physical processes underlying the training data.

This is just like how a text GPT needs to have some knowledge of geography to guess a country's capital given a description of that country, or some knowledge of coding to write compilable Fortran.
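Concretely, next-token prediction turns each sequence into a set of (prefix, target) training pairs. The sketch below illustrates the idea with placeholder string "chunks"; it is a conceptual toy, not AstroPT's actual data pipeline:

```python
def next_token_pairs(sequence):
    """For an autoregressive model, every prefix of the sequence is an
    input and the element that follows it is the prediction target.
    Conceptual illustration only -- AstroPT trains on real data chunks,
    not strings."""
    return [(sequence[:i], sequence[i]) for i in range(1, len(sequence))]

pairs = next_token_pairs(["chunk1", "chunk2", "chunk3"])
print(pairs)
# -> [(['chunk1'], 'chunk2'), (['chunk1', 'chunk2'], 'chunk3')]
```

At training time the model sees each prefix and is penalised (here, via a regressive loss) for how far its prediction falls from the true next chunk.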

Below we can see this principle applied to a galaxy image, where we split the image into chunks and pass them into an AstroPT model:

[Figure: a galaxy image split into chunks and fed through the AstroPT architecture]
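The chunking step can be sketched as follows. This is an illustration only, not AstroPT's actual tokeniser (the real one is an MLP over pixel patches, defined in `model.py`), and the 16-pixel patch size is an assumption for the example:

```python
def patchify(image, patch_size=16):
    """Split a 2D image (a list of pixel rows) into a sequence of
    flattened patches, read left-to-right then top-to-bottom -- the
    'token' order an autoregressive model would consume."""
    h, w = len(image), len(image[0])
    patches = []
    for top in range(0, h, patch_size):
        for left in range(0, w, patch_size):
            patch = [image[top + r][left + c]
                     for r in range(patch_size)
                     for c in range(patch_size)]
            patches.append(patch)
    return patches

# A toy 32x32 "galaxy image" becomes a sequence of four 16x16 patches;
# the model is then trained to predict patch k+1 from patches 1..k.
image = [[0.0] * 32 for _ in range(32)]
tokens = patchify(image)
print(len(tokens), len(tokens[0]))  # -> 4 256
```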

Because next-token prediction is agnostic to what the tokens represent, we can apply it across many modalities.

Check out our work on Euclid data for an example, where we chain galaxy image tokens and spectral energy distribution data and pass them into a single, unified AstroPT model.
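The chaining idea reduces to concatenating the token sequences from each modality so one model is trained over both. The sketch below is a simplification under that assumption; the real ordering and any separator handling for AstroPT-Euclid live in the repository's training scripts:

```python
def chain_modalities(image_tokens, sed_tokens):
    """Concatenate galaxy image tokens and spectral energy distribution
    (SED) tokens into a single sequence, so one autoregressive model
    learns to predict across the modality boundary. Illustrative only."""
    return image_tokens + sed_tokens

image_tokens = [[0.1] * 256 for _ in range(16)]  # e.g. 16 image patches
sed_tokens = [[0.2] * 256 for _ in range(4)]     # e.g. 4 SED chunks
sequence = chain_modalities(image_tokens, sed_tokens)
print(len(sequence))  # -> 20 tokens in the unified sequence
```

Training on such a chained sequence means the model must predict the first SED chunk from the image tokens alone, which encourages a shared representation across modalities.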

I just want to run it! 🗣️

Okay I hear you! First you need to install the model:

Install

You can install via pip from PyPI:

```shell
pip install astropt
```

Or, if you have cloned the repository locally, you can install it with uv:

```shell
uv sync
```

Load a pre-trained model

To load and run a pre-trained AstroPT model from HuggingFace, use the `load_astropt` function:

```python
from astropt.model_utils import load_astropt

model, model_args = load_astropt(
    repo_id="smith42/astropt_sparse",
    path="astropt/p16k10",
    weights_filename="ckpt.pt",
)
model = model.to("cuda")
```

where `repo_id` is the HuggingFace repository ID, and `path` is the path within the repository that contains the AstroPT model checkpoint.

Pre-trained models

Below are some pre-trained models you can load with the code snippet above. Please make sure that you are using the correct version of AstroPT to load these!

| Survey | Modalities | AstroPT version | Model weights | Dataset | Paper |
| --- | --- | --- | --- | --- | --- |
| DESI Legacy Survey | JPG galaxy imagery | v1.0.0 | AstroPT | Galaxies Dataset | arXiv:2405.14930 |
| Euclid | FITS VIS, NISP galaxy imagery and SED data | v1.0.2 | AstroPT-Euclid | Euclid Training Dataset | arXiv:2503.15312 |

Scripts for pre-training and processing data

Check out `scripts/` for a collection of all the scripts we have used to get the results in these papers, and `scripts/train.py` for an example boilerplate script for training your own AstroPT. `config/` contains example user configurations.

Contributors

- Ryan Roberts: 💻 🤔 🖋
- Mike Smith: 💻 🤔 🖋 🔣
- mhuertascompany: 🤔 🖋
- Malgorzata Siudek: 🤔 🖋 💻 🔣
- gimarso: 🤔 💻

Add your contributions
