LM Studio OSS Support #2312
Conversation
All contributors have signed the CLA ✍️ ✅

I have read the CLA Document and I hereby sign the CLA

Please merge ASAP, thanks!
Force-pushed from c2d54c4 to e133f11
@codex review
Codex Review: Here are some suggestions.
Force-pushed from e133f11 to fa8c2bf
I have read the CLA Document and I hereby sign the CLA

Hey @easong-openai @aibrahim-oai, we recently added […]; please update it.
Force-pushed from 1f21203 to 8214707
Thanks for the contribution, and apologies for the slow response. We had a large number of PRs submitted, and we're just now digging out from our backlog. If you're still interested in pursuing this PR, could you please resolve the merge conflicts?
* Implement LM Studio as an OSS provider
* Initial implementation of LM Studio as an OSS provider
* Clean up lib.rs
* Save oss-provider in the config
* Code cleanup
* Auto-select the running provider
* Tests for the LMS client
* Linter fixes
* Fall back to the explicit lms path if lms is not bootstrapped
* Fix ordering and touch up test w/ validation
* Make sure profiles work with oss-provider
* Separate out the oss and oss-provider flags
* Add CTRL+C support in OSS selection and change oss-provider to local-provider in the CLI arg only
* Refactor client code
* Update README and config
* Better comments in README
* Graceful handling for cancellation from the OSS selection screen, and warmup logic
* Make the warmup logic non-blocking
* Fix Cargo.lock
* Update config.md and linter fixes
Force-pushed from 8214707 to d70c3d9
Run formatter
Force-pushed from d70c3d9 to b6c8718
Hello, no issues. I just fixed the conflicts; it should be conflict-free and good to review!

Thanks @etraut-openai. Confirming we are still interested in pursuing the PR.
etraut-openai left a comment
Overall, this looks good to me. I left a few suggestions.
In addition to reviewing the code, I also did some manual testing and confirmed that it works as advertised with both LM Studio and Ollama config settings.
Making an isolated commit to move the rest of the shared code to `common`, to avoid circular dependencies and to make it easy to revert if this isn't the correct place.
Overview
Adds LM Studio OSS support. Closes #1883
Changes
This PR enhances the behavior of the `--oss` flag to support LM Studio as a provider. Additionally, it introduces a new flag, `--local-provider`, which can take `lmstudio` or `ollama` as values if the user wants to explicitly choose which one to use. If no provider is specified, `codex --oss` will auto-select the provider based on whichever is running; see the usage sketch below.
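A quick usage sketch of the flags described above (the flag names are taken from this PR; exact behavior may differ by version):

```sh
# Auto-select whichever supported local provider (LM Studio or Ollama) is running
codex --oss

# Explicitly choose LM Studio as the local provider
codex --oss --local-provider lmstudio

# Explicitly choose Ollama
codex --oss --local-provider ollama
```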
Additional enhancements

The default can be set using `oss-provider` in the config, as in the sketch below. Non-interactive users will need to either provide the provider as an arg or have it in their `config.toml`.
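A minimal sketch of what that might look like in `config.toml`, assuming `oss-provider` is a top-level key (the exact placement is an assumption; only the key name comes from this PR):

```toml
# ~/.codex/config.toml
# Assumed top-level key; accepts "lmstudio" or "ollama"
oss-provider = "lmstudio"
```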
Notes
For best performance, set the default context length for gpt-oss to the maximum your machine can support.
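One way to apply this note from the command line with LM Studio's `lms` CLI, as a sketch (the model key and the `--context-length` flag are assumptions that depend on your LM Studio version):

```sh
# Load gpt-oss with an explicit context window; the value here is illustrative,
# so pick the largest size your hardware can actually handle
lms load openai/gpt-oss-20b --context-length 131072
```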