Add Tool Use Streaming Support and c10-tooluse-streaming Example #58
Conversation
Implement tooluse-streaming functionality, allowing real-time capture and handling of tool calls in streaming responses. This enables applications to:
- Stream tool calls as they arrive from LLMs
- Detect tool calls during streaming using ChatStreamEvent::ToolCallChunk
- Capture all tool calls with the new ChatOptions.capture_tool_calls flag
- Access tool calls in stream end events via captured_tool_calls

Adds a new c10-tooluse-streaming.rs example demonstrating how to use these features with both streamed tool calls and subsequent tool-response handling. Updates the streamers across the openai, deepseek, and gemini adapters to support the new tool-call streaming functionality.
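The chunk-aggregation pattern described above can be sketched in plain Rust. Note this is a minimal illustration, not the actual genai API: the type and field names (ToolCallChunk, args_fragment, aggregate, etc.) are simplified stand-ins, and the real ChatStreamEvent carries more variants and metadata.

```rust
// Simplified stand-ins for the streaming types; the real genai API differs.
#[derive(Debug, Clone, PartialEq)]
struct ToolCallChunk {
    // The index ties partial chunks back to the same logical tool call.
    index: usize,
    name: Option<String>,      // usually present only on the first chunk
    args_fragment: String,     // a fragment of the JSON arguments
}

#[derive(Debug, Clone, PartialEq)]
struct ToolCall {
    name: String,
    arguments: String, // full JSON arguments, concatenated from fragments
}

enum ChatStreamEvent {
    ContentChunk(String),
    ToolCallChunk(ToolCallChunk),
    End, // in the real API, the end event carries captured_tool_calls when enabled
}

/// Fold a stream of events into the final text and the complete tool calls.
fn aggregate(events: impl IntoIterator<Item = ChatStreamEvent>) -> (String, Vec<ToolCall>) {
    let mut text = String::new();
    let mut slots: Vec<(Option<String>, String)> = Vec::new();
    for ev in events {
        match ev {
            ChatStreamEvent::ContentChunk(s) => text.push_str(&s),
            ChatStreamEvent::ToolCallChunk(c) => {
                if slots.len() <= c.index {
                    slots.resize(c.index + 1, (None, String::new()));
                }
                let slot = &mut slots[c.index];
                if let Some(n) = c.name {
                    slot.0 = Some(n);
                }
                slot.1.push_str(&c.args_fragment);
            }
            ChatStreamEvent::End => break,
        }
    }
    let calls = slots
        .into_iter()
        .filter_map(|(name, arguments)| name.map(|name| ToolCall { name, arguments }))
        .collect();
    (text, calls)
}

fn main() {
    let events = vec![
        ChatStreamEvent::ToolCallChunk(ToolCallChunk {
            index: 0,
            name: Some("get_weather".into()),
            args_fragment: "{\"city\":".into(),
        }),
        ChatStreamEvent::ToolCallChunk(ToolCallChunk {
            index: 0,
            name: None,
            args_fragment: "\"Paris\"}".into(),
        }),
        ChatStreamEvent::End,
    ];
    let (_text, calls) = aggregate(events);
    assert_eq!(calls.len(), 1);
    assert_eq!(calls[0].name, "get_weather");
    assert_eq!(calls[0].arguments, "{\"city\":\"Paris\"}");
    println!("{:?}", calls);
}
```

With capture_tool_calls enabled, the library does this aggregation for you and hands back the completed tool calls at stream end; without it, consumers aggregate ToolCallChunk events themselves roughly as above.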
@ClanceyLu Thanks. This is a big one. There was a PR that did this a while back, but it broke other things, so I will check this one. Now, I don't understand the
@ClanceyLu Ok, on a first scan, this looks very well made and follows everything. Just curious, why is
I did that for a reason, since some or most users might not care about it. But for tool_calls, what would be a use case where someone wouldn't care about it? If I do not capture it, then the response is incomplete and not coherent.
You're absolutely right: if tool_calls aren't captured, the response would indeed be incomplete unless the user handles ChatStreamEvent::ToolCallChunk manually. That's actually why I made capture_tool_calls an Option, to keep things flexible for advanced users who may already be aggregating tool calls from the stream themselves. Setting this option to true just gives them a ready-to-use captured_tool_calls at the end of the stream, without taking away control from users who want to process chunks incrementally. Happy to revisit this if default-true behavior makes more sense going forward.
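The opt-in semantics discussed here can be sketched as follows. This is a hypothetical illustration: the field and method names mirror the PR description, but the real genai ChatOptions API differs.

```rust
// Hypothetical sketch of the opt-in capture flag; not the actual genai API.
#[derive(Default)]
struct ChatOptions {
    // None/Some(false): user aggregates ToolCallChunk events manually.
    // Some(true): the library collects captured_tool_calls for the end event.
    capture_tool_calls: Option<bool>,
}

impl ChatOptions {
    /// Builder-style setter for the capture flag.
    fn with_capture_tool_calls(mut self, on: bool) -> Self {
        self.capture_tool_calls = Some(on);
        self
    }

    /// Resolve the Option to the effective behavior (off by default).
    fn should_capture(&self) -> bool {
        self.capture_tool_calls.unwrap_or(false)
    }
}

fn main() {
    // Default: no capture; advanced users process chunks incrementally.
    assert!(!ChatOptions::default().should_capture());
    // Opt in: the stream end event would carry the captured tool calls.
    assert!(ChatOptions::default()
        .with_capture_tool_calls(true)
        .should_capture());
    println!("ok");
}
```

Keeping the flag an Option preserves a three-way distinction (unset, explicitly off, explicitly on), which lets a future default change without breaking callers who set it explicitly.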
Ah, I see, thanks for pointing that out. I noticed there were new commits in main, so I did a rebase to keep the branch up to date. The gemini model-names change came in from there.
@ClanceyLu Oh, you are right about the

Yes, when doing the stream call, users get the tool calls as part of the stream. If they want the convenience of having them at the end as well, they use

This is perfect!!! Great work!
@ClanceyLu So, this looks very good. I will merge it. Now the question is whether it should be a new version bump to

It does (rightfully) change some of the public API, so it probably qualifies for a version bump. What do you think?
I think I am going to do a version bump. |
This PR introduces streaming support for tool use and provides a working example under examples/c10-tooluse-streaming.