[FR]: Support openai responses api in track_openai #1960
Comments
Hi @wezham-f10, I have good news for you: our Python SDK already supports the responses API. I updated our documentation to reflect that in #1961. Have you already tried monitoring OpenAI responses calls with Opik? Did you face an error?
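For reference, a minimal sketch of the pattern being discussed, assuming the `track_openai` wrapper from `opik.integrations.openai`; the project name, model, and prompt below are placeholders rather than values from this thread:

```python
# Minimal sketch: wrap an OpenAI client with track_openai and call the responses API.
# Assumes OPENAI_API_KEY is set in the environment; project name and model are placeholders.
from openai import OpenAI
from opik.integrations.openai import track_openai

client = track_openai(OpenAI(), project_name="demo-project")

response = client.responses.create(
    model="gpt-4o",
    input="Say hello in one short sentence.",
)
print(response.output_text)  # convenience accessor for the plain-text output
```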
Awesome, thank you. I didn't get any errors. However, I wasn't seeing any trace capture without manually adding `@track`.
@wezham-f10 Do you mean that you added `@track` manually? Our OpenAI integration should support both sync and async calls. Can you share a bit of code? Is your function async as well?
Yes, correct. This is the setup:

Client setup

```python
def run():
    openapi_client = track_openai(
        openai.AsyncOpenAI(api_key=get_secret_string("openapi-api-key")),
        project_name="project",
    )
    limit = asyncio.Semaphore(600)
    query_coroutines = [
        predict(
            openapi_client
            ...
    ]
    result: list[Result] = asyncio.run(asyncio.gather(*query_coroutines))
```

Call to OpenAI

```python
@track(project_name="project", name="project")
async def predict(
    openapi_client: openai.AsyncOpenAI,
    prompt: str,
    search_query: SearchQuery,
    limit: asyncio.Semaphore,
):
    logger.info("Current limit", limit=limit._value)
    async with limit:
        if limit.locked():
            logger.info("Sleeping for 1 minute")
            await asyncio.sleep(60)
        response = await openapi_client.responses.parse(
            model=DEFAULT_MODEL,
            instructions=prompt,
            input=search_query.model_dump_json(),
            text_format=Recommendation,
            reasoning={"effort": "medium"},
        )
        return response.output_parsed
```
@wezham-f10 At the moment we don't have an integration with `responses.parse`.
Okay great, good to know. Do you have a PR you can share that implements it for
@wezham-f10 Sure: https://github.com/comet-ml/opik/pull/1722/files#diff-8c8a7c4d7bbdabdc8690756946beed447473bfd110da349d4ccdf367750e809e There are more than OpenAI changes in this PR, but the main piece of work is to implement a customized decorator (inside
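For readers following along, here is an illustrative sketch of the general wrap-and-record pattern such a decorator follows; it is not the actual code from the linked PR, and `track_async_method` is a hypothetical name:

```python
# Illustrative only: the general shape of wrapping an async client method so its inputs
# and outputs can be recorded as a trace. Not the actual Opik implementation from the PR.
import functools
from typing import Any, Awaitable, Callable

def track_async_method(fn: Callable[..., Awaitable[Any]], name: str) -> Callable[..., Awaitable[Any]]:
    @functools.wraps(fn)
    async def wrapper(*args: Any, **kwargs: Any) -> Any:
        # A real integration would start a span here and record the call arguments.
        print(f"[trace] {name} called with kwargs: {sorted(kwargs)}")
        result = await fn(*args, **kwargs)
        # ...then record the (possibly parsed) output and close the span.
        print(f"[trace] {name} returned a {type(result).__name__}")
        return result
    return wrapper

# Hypothetical usage: patch parse on an already-constructed AsyncOpenAI client.
# client.responses.parse = track_async_method(client.responses.parse, "responses.parse")
```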
Hi @wezham-f10, we are going to implement support for `responses.parse`.
Awesome, thanks very much.
Thanks for getting this done.
Proposal summary

As noted here, OpenAI has released the responses API, which now includes the methods `responses.parse` and `responses.create`. The `track_openai` method should be extended to support these new methods.
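For concreteness, a small sketch of the two calls that `track_openai` would need to capture; the model name and the `ExampleOutput` schema are placeholders, not taken from the proposal:

```python
# Sketch of the two responses-API methods the proposal mentions.
# The model name and the ExampleOutput schema are placeholders.
from openai import OpenAI
from pydantic import BaseModel

class ExampleOutput(BaseModel):
    title: str
    reason: str

client = OpenAI()

# responses.create: free-form text generation.
created = client.responses.create(model="gpt-4o", input="Recommend a book about asyncio.")
print(created.output_text)

# responses.parse: structured output parsed into a Pydantic model via text_format.
parsed = client.responses.parse(
    model="gpt-4o",
    input="Recommend a book about asyncio.",
    text_format=ExampleOutput,
)
print(parsed.output_parsed)
```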
Motivation

No response