Add streaming support for zero shot inference #3878
Conversation
Very cool!
LGTM
@arnavgarg1 This is great -- I just made some minor comments (please let me know whether or not they will help). Thank you!
@alexsherstinsky I think I had a poor interface defined! Can you take a look now and see if it makes sense and your comments are addressed?
LGTM -- so nice!
This PR introduces a new boolean flag called streaming to the LudwigModel.generate() API, which allows users to see streaming output when performing zero-shot inference on single or multiple samples.
Demo

Screen.Recording.2024-01-11.at.7.43.05.AM.mov
Config to reproduce the demo video
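The config itself is not reproduced above. As a rough illustration only, a minimal zero-shot LLM config passed to LudwigModel might look like the sketch below; the base model and feature names are placeholder assumptions, not the values used in the demo video.

```python
from ludwig.api import LudwigModel

# Hypothetical minimal zero-shot LLM config; base_model and the
# feature names are placeholders, not the config from the demo.
config = {
    "model_type": "llm",
    "base_model": "facebook/opt-350m",
    "input_features": [{"name": "question", "type": "text"}],
    "output_features": [{"name": "answer", "type": "text"}],
}

model = LudwigModel(config)

# Zero-shot inference with streaming output enabled.
model.generate("What is the capital of France?", streaming=True)
```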