feat: add discovery call to the llm gateway service #1152
Conversation
```python
    """Initialize the LLM discovery service."""
    super().__init__(config=config, execution_context=execution_context)

def get_available_models(self, agenthub_config: str) -> list[dict[str, Any]]:
```
@radu-mocanu maybe we should make AgentHub service a first-level service? reference #1151
yes, good idea
#1152 (comment)
This is actually a good idea, given that we have multiple use cases for AgentHub.
@ionut-mihalache-uipath let's wait a bit with this PR if possible. I'll expose AgentHub as a platform service and we can rebase on top of it.
Summary
Changes