
# openai_batch

Some internal tooling for OpenAI batch inference. Output data ultimately lands in the `merge_dir`, with each response stored in the `openai_response` field.

## Usage

- Create a config JSON in the `experiments/` directory.
- Then either run the `sandbox` command to do everything end to end, or run the `upload`, `check`, and `merge` steps individually.

## Example

Use the interactive Jupyter notebook to build a config file. Then run either:

```
--command sandbox \
--config experiments/example/config.json \
--status-file experiments/example/status.json \
--experiment exp_name_goes_here \
--wait \
--interval 10
```

Or if you want to do this in several steps:

```
--command upload \
--config experiments/example/config.json \
--status-file experiments/example/status.json \
--experiment-description exp_name_goes_here
```
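The status file presumably carries whatever identifiers the later `check` and `merge` steps need to find the batch again. A minimal sketch of recording it after upload — the schema here (`experiment`, `file_id`, `batch_id`, `state` keys) is an assumption, not the repo's actual format:

```python
import json
from pathlib import Path

def write_status(status_path, experiment, file_id, batch_id):
    """Record the uploaded file and submitted batch IDs so later
    check/merge steps can find them. The key names are a guess at
    what such a status file might contain, not this repo's schema."""
    status = {
        "experiment": experiment,
        "file_id": file_id,
        "batch_id": batch_id,
        "state": "submitted",
    }
    Path(status_path).write_text(json.dumps(status, indent=2))
    return status
```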

then

```
--command check \
--config experiments/example/config.json \
--status-file experiments/example/status.json \
--wait \
--interval 10
```
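The `--wait`/`--interval` flags suggest a polling loop that re-checks the batch every few seconds until it reaches a terminal state. A sketch of such a loop, with the status lookup injected as a callable (`check_status` and `wait_for_batch` are placeholders, not this repo's API):

```python
import time

def wait_for_batch(check_status, interval=10,
                   terminal=("completed", "failed", "expired", "cancelled")):
    """Poll check_status() every `interval` seconds until the batch
    reaches a terminal state, then return that state. check_status is
    any zero-argument callable returning the current status string
    (e.g. a thin wrapper around an OpenAI Batches lookup)."""
    while True:
        state = check_status()
        if state in terminal:
            return state
        time.sleep(interval)
```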

and finally

```
--command merge \
--config experiments/example/config.json \
--status-file experiments/example/status.json
```
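Merging plausibly works the way the OpenAI batch output format encourages: each output line carries the `custom_id` of the request it answers, so responses can be joined back onto the source records as the `openai_response` field the README mentions. A sketch under that assumption (`merge_responses` is hypothetical, not a function in this repo):

```python
import json

def merge_responses(input_records, output_jsonl_lines):
    """Attach each batch response to its source record by custom_id.
    The join key and field handling are assumptions based on the
    OpenAI batch output format, not this repo's actual merge logic."""
    # Index responses by the custom_id of the request they answer.
    by_id = {}
    for line in output_jsonl_lines:
        row = json.loads(line)
        by_id[row["custom_id"]] = row.get("response")
    # Copy each record, adding the matching response when one exists.
    merged = []
    for rec in input_records:
        rec = dict(rec)
        resp = by_id.get(rec.get("custom_id"))
        if resp is not None:
            rec["openai_response"] = resp
        merged.append(rec)
    return merged
```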

## Some TODOs/Improvements

Here are some features that might be nice to have in the future:

- S3 support
- More sophisticated tracking than a JSON file (could matter once there are many files)
- Estimated cost before submitting jobs (or a cost summary after running)
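For the cost-estimation TODO, a rough sketch: multiply expected token counts by per-million-token prices, which the caller supplies since prices change over time (and Batch API usage is priced at a discount relative to synchronous calls). The function and its parameters are assumptions about how such an estimator might look, not anything in this repo:

```python
def estimate_batch_cost(n_requests, avg_input_tokens, avg_output_tokens,
                        input_price_per_m, output_price_per_m):
    """Rough pre-submission cost estimate in dollars. Prices are
    caller-supplied per million tokens; consult current OpenAI
    pricing, including the batch discount."""
    input_cost = n_requests * avg_input_tokens / 1e6 * input_price_per_m
    output_cost = n_requests * avg_output_tokens / 1e6 * output_price_per_m
    return input_cost + output_cost
```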
