finished custom url call by Jasonqi146 · Pull Request #142 · sotopia-lab/sotopia · GitHub

finished custom url call #142


Merged: 2 commits merged into main from feature/custom_api on Jul 23, 2024

Conversation

@Jasonqi146 (Member) commented Jul 18, 2024

Closes #141

📑 Description

Added a custom API inference option to the obtain_chain function. The API model name and base URL are passed in directly via the agent_model param.
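
As a rough sketch of how such a spec might be routed (the "custom/" prefix, "@" delimiter, and ChatOpenAI wiring below are illustrative assumptions, not the exact code in this PR):

```python
# Hypothetical sketch, not the exact PR code: parse an agent_model spec
# of the form "custom/<model_name>@<base_url>" and build a chat model
# pointed at the custom OpenAI-compatible endpoint.
from langchain_openai import ChatOpenAI

def build_custom_chat_model(agent_model: str) -> ChatOpenAI:
    # e.g. agent_model = "custom/my-model@http://localhost:8000/v1"
    spec = agent_model.removeprefix("custom/")
    model_name, base_url = spec.split("@", 1)
    return ChatOpenAI(
        model=model_name,
        base_url=base_url,  # send requests to the custom server
        api_key="EMPTY",    # many self-hosted servers ignore the key
    )
```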

✅ Checks

  • My pull request adheres to the code style of this project
  • My code requires changes to the documentation
  • I have updated the documentation as required
  • All the tests have passed
  • Branch name follows type/descript (e.g. feature/add-llm-agents)
  • Ready for code review

ℹ Additional Information

codecov bot commented Jul 18, 2024

Codecov Report

Attention: Patch coverage is 36.36364% with 7 lines in your changes missing coverage. Please review.

Project coverage is 60.78%. Comparing base (1bdd155) to head (b5985cc).
Report is 2 commits behind head on main.

@@            Coverage Diff             @@
##             main     #142      +/-   ##
==========================================
- Coverage   60.92%   60.78%   -0.14%     
==========================================
  Files          55       55              
  Lines        2782     2790       +8     
==========================================
+ Hits         1695     1696       +1     
- Misses       1087     1094       +7     
Files                                   Coverage Δ
sotopia/generation_utils/generate.py    49.62% <36.36%> (-1.16%) ⬇️

@Jasonqi146 requested a review from ProKil July 18, 2024 05:00
@ProKil (Member) commented Jul 18, 2024

Hi @Jasonqi146, thanks for your PR.

  1. Could you help change all of the `"..." in model_name` checks to `model_name.startswith("...")`? (See the sketch below.)
  2. Is there any way to mock a custom-model OpenAI API server and add a testcase?
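
A minimal illustration of the requested change (hypothetical lines, assuming a "custom" prefix; the real checks live in sotopia/generation_utils/generate.py):

```python
# Illustrative only: a substring check matches too broadly, while a
# prefix check matches only genuine custom-API model specs.
model_name = "custom/my-model@http://localhost:8000/v1"

# Before: also true for names that merely contain "custom" anywhere.
if "custom" in model_name:
    pass

# After: true only when the spec actually begins with the prefix.
if model_name.startswith("custom"):
    pass
```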

@Jasonqi146 (Member, Author) commented

> 2. Is there any way to mock a custom-model OpenAI API server and add a testcase?

For 2, do we need to deploy an actual OpenAI-compatible server on localhost?

@ProKil (Member) commented Jul 22, 2024

OK, I think I can improve the testing of this later, so let's just do 1 and we can merge this.

For 2, I want to set up a small LM in the background as something we can mock and test. I think llama.cpp is pretty easy to set up for this. <-- not in this PR.
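
As a rough illustration of the mocking idea (an assumption, not code from this PR): a testcase could stand up a tiny in-process HTTP server that speaks just enough of the OpenAI chat-completions protocol to answer a request from the custom-API code path.

```python
# Hypothetical test helper: a fake OpenAI-compatible endpoint that
# returns a canned chat completion, so no real model server is needed.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeOpenAIHandler(BaseHTTPRequestHandler):
    def do_POST(self) -> None:
        # Mimic POST /v1/chat/completions with a fixed response body.
        body = {
            "id": "chatcmpl-test",
            "object": "chat.completion",
            "model": "fake-model",
            "choices": [{
                "index": 0,
                "message": {"role": "assistant", "content": "hello"},
                "finish_reason": "stop",
            }],
        }
        payload = json.dumps(body).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(payload)))
        self.end_headers()
        self.wfile.write(payload)

def start_fake_server(port: int = 8001) -> HTTPServer:
    # Serve in a daemon thread so the test process can exit cleanly.
    server = HTTPServer(("127.0.0.1", port), FakeOpenAIHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A test would then point the custom model spec at http://127.0.0.1:8001/v1 and assert on the canned reply.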

@ProKil merged commit 276456d into main Jul 23, 2024
8 checks passed
@ProKil deleted the feature/custom_api branch July 23, 2024 00:29
@Kong-Aobo commented

The issue addressed in that PR is different from ours. However, we have found a solution on our own, and the problem has been resolved. Thank you for your patient response.

Labels: none yet
Projects: none yet

Development
Successfully merging this pull request may close these issues:
[FEAT]: Support for Custom Model Inference

3 participants