[Bug]: `AgentOptimizer` error when using `LLMConfig` · Issue #1780 · ag2ai/ag2 · GitHub

[Bug]: AgentOptimizer error when using LLMConfig #1780


Open
giorgossideris opened this issue May 5, 2025 · 3 comments
Labels
bug (Something isn't working), help from community wanted

Comments

@giorgossideris
Collaborator
giorgossideris commented May 5, 2025

Describe the bug

When an `LLMConfig` object is passed as an argument to the `AgentOptimizer` constructor, its contents are never used. This is because the constructor handles `llm_config` as a plain dict:

if self.llm_config in [{}, {"config_list": []}, {"config_list": [{"model": ""}]}]:
    raise ValueError(
        "When using OpenAI or Azure OpenAI endpoints, specify a non-empty 'model' either in 'llm_config' or in each config of 'config_list'."
    )
self.llm_config["config_list"] = filter_config(llm_config["config_list"], {"model": [self.optimizer_model]})
self._client = OpenAIWrapper(**self.llm_config)

This issue is related to #1774; to address both, I think `ConversableAgent`'s `_validate_llm_config` method should become a classmethod of `LLMConfig`, so that it can be reused outside of agents.
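To illustrate the proposed direction, here is a minimal, self-contained sketch. `FakeLLMConfig` and its `validate` classmethod are illustrative assumptions, not the real ag2 API: the point is only that a classmethod validator on the config class can normalize either a plain dict or a config object before any dict-style logic runs.

```python
# Illustrative sketch only -- FakeLLMConfig and field names are assumptions,
# not the real ag2 classes.
from dataclasses import dataclass, field

@dataclass
class FakeLLMConfig:
    config_list: list = field(default_factory=list)

    @classmethod
    def validate(cls, llm_config):
        """Accept a dict or a FakeLLMConfig and return a normalized dict."""
        if isinstance(llm_config, cls):
            llm_config = {"config_list": llm_config.config_list}
        if not llm_config.get("config_list"):
            raise ValueError("'config_list' must be non-empty")
        return llm_config

# Both call styles now yield the same normalized dict:
as_obj = FakeLLMConfig(config_list=[{"model": "gpt-4o"}])
as_dict = {"config_list": [{"model": "gpt-4o"}]}
assert FakeLLMConfig.validate(as_obj) == FakeLLMConfig.validate(as_dict)
```

With this shape, consumers like `AgentOptimizer` could call one shared validator regardless of which form the caller passed in.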

Steps to reproduce

from autogen import LLMConfig
from autogen.agentchat.contrib.agent_optimizer import AgentOptimizer

llm_config = LLMConfig.from_json(path="OAI_CONFIG_LIST")
optimizer = AgentOptimizer(llm_config=llm_config, max_actions_per_step=3, optimizer_model="gpt-4o")

Raises:

OpenAIError: The api_key client option must be set either by passing api_key to the client or by setting the OPENAI_API_KEY environment variable
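For what it's worth, here is a toy reproduction of the symptom (these classes are illustrative only, not the real ag2 internals): if the object handed to `**` unpacking exposes an empty or incomplete mapping protocol, no keyword arguments reach the client at all, and the OpenAI SDK then falls back to the `OPENAI_API_KEY` environment variable, producing exactly this kind of error when it is unset.

```python
# Toy illustration only: FakeLLMConfig stands in for an object that is not a
# real dict, and fake_wrapper stands in for OpenAIWrapper.
class FakeLLMConfig:
    def __init__(self, config_list):
        self.config_list = config_list

    def keys(self):
        # An incomplete mapping protocol: ** unpacking finds no keys here.
        return []

def fake_wrapper(**kwargs):
    # With no kwargs, no api_key or config_list ever reaches the client.
    return kwargs

cfg = FakeLLMConfig([{"model": "gpt-4o"}])
print(fake_wrapper(**cfg))  # {} -- the configuration is silently lost
```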

Model Used

No response

Expected Behavior

No response

Screenshots and logs

No response

Additional Information

No response

giorgossideris added the bug label on May 5, 2025
@harishmohanraj
Collaborator

Hi @giorgossideris: Thanks for flagging this issue and sharing your suggestion.

@marklysze and @davorrunje, would appreciate your thoughts on the proposed approach.
@giorgossideris, if the feedback looks good, would you be open to submitting a PR with the fix?

@harishmohanraj
Collaborator

cc @kumaranvpl

@qingyun-wu
Contributor

@skzhang1 Please take a look!
