
Conversation

@rugvedS07
Contributor

Overview

Adds LM Studio OSS support. Closes #1883

Changes

This PR enhances the behavior of the `--oss` flag to support LM Studio as a provider. It also introduces a new flag, `--local-provider`, which accepts `lmstudio` or `ollama` as values if the user wants to explicitly choose which one to use.

If no provider is specified, `codex --oss` will auto-select the provider based on whichever is running.
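The flag behavior described above, sketched as shell commands (flag names are taken from this PR; the exact interactive UX may differ):

```shell
# Auto-select: picks whichever of LM Studio / Ollama is currently running
codex --oss

# Explicit selection via the new flag
codex --oss --local-provider lmstudio
codex --oss --local-provider ollama
```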

Additional enhancements

The default can be set using `oss_provider` in the config, like:

`oss_provider = "lmstudio"`

Non-interactive users will need to either pass the provider as an argument or set it in their config.toml.
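For reference, a minimal config fragment (the `~/.codex/config.toml` location is an assumption; the key name comes from this PR):

```toml
# Default local OSS provider used by `codex --oss`
oss_provider = "lmstudio"  # or "ollama"
```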

Notes

For best performance, set the default context length for gpt-oss to the maximum your machine can support.
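As an illustration of that tip for Ollama users (this uses Ollama's documented `num_ctx` Modelfile parameter, not anything added by this PR; the model tag and context size are assumptions for your own hardware):

```shell
# Build a gpt-oss variant with a larger context window in Ollama
cat > Modelfile <<'EOF'
FROM gpt-oss:20b
PARAMETER num_ctx 32768
EOF
ollama create gpt-oss-32k -f Modelfile
```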

@github-actions

github-actions bot commented Aug 14, 2025

All contributors have signed the CLA ✍️ ✅
Posted by the CLA Assistant Lite bot.

@rugvedS07
Contributor Author

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Aug 14, 2025
@rugvedS07 rugvedS07 changed the title Implement LM Studio as a OSS Provider LM Studio OSS Support Aug 14, 2025
@DocMAX

DocMAX commented Aug 29, 2025

please merge asap, thanks!

@rugvedS07 rugvedS07 force-pushed the feat/lmstudio branch 2 times, most recently from c2d54c4 to e133f11 Compare September 16, 2025 18:51
@dkundel-openai
Contributor

@codex review

Contributor

@chatgpt-codex-connector chatgpt-codex-connector bot left a comment


Codex Review: Here are some suggestions.


@mattjcly
Contributor

mattjcly commented Oct 6, 2025

I have read the CLA Document and I hereby sign the CLA

github-actions bot added a commit that referenced this pull request Oct 6, 2025
@yagil

yagil commented Oct 13, 2025

Hey @easong-openai @aibrahim-oai we recently added /v1/responses compatibility support (see tweet) and updated this PR to use it. Would be really awesome and appreciated for it to be reviewed by y'all! 🙏 Cheers (cc @dkundel-openai)

@Bauvater

pls update it

@rugvedS07 rugvedS07 force-pushed the feat/lmstudio branch 3 times, most recently from 1f21203 to 8214707 Compare October 21, 2025 19:03
@etraut-openai
Collaborator

Thanks for the contribution, and apologies for the slow response. We had a large number of PRs submitted, and we're just now digging out from our backlog.

If you're still interested in pursuing this PR, could you please resolve the merge conflicts?

@etraut-openai etraut-openai added the needs-response Additional information is requested label Nov 4, 2025
rugvedS07 and others added 4 commits November 4, 2025 09:14
* Implement LM Studio as a OSS Provider
* Initial implementation of LM Studio as a OSS provider

* Cleanup lib.rs

* Save oss-provider in the config

* Code cleanup

* Auto-select running provider

* Tests for LMS Client

* Linter fixes

* Fallback to lms explicit path if lms is not bootstrapped

* Fix ordering and touch up test w/ validation

* Make sure profiles work with oss-provider

* Separate out oss and oss provider flags

* Add CTRL+C support in oss selection and change oss-provider to
local-provider in the cli arg only

* Refactor client code

* Update README and config

* Better comments in README

* Graceful handling for cancellation from oss selection screen and
warmup logic

* Make the warmup logic non-blocking

* Fix Cargo.lock

* Update config.md and linter fixes

* Run formatter
@rugvedS07
Contributor Author

> Thanks for the contribution, and apologies for the slow response. We had a large number of PRs submitted, and we're just now digging out from our backlog.
>
> If you're still interested in pursuing this PR, could you please resolve the merge conflicts?

Hello, no issues. I just fixed the conflicts, should be conflict free and good to review!

@yagil

yagil commented Nov 4, 2025

Thanks @etraut-openai. Confirming we are still interested in pursuing the PR

@etraut-openai etraut-openai removed the needs-response Additional information is requested label Nov 5, 2025
youta7 added a commit to youta7/ta-codex that referenced this pull request Nov 10, 2025
youta7 added a commit to youta7/ta-codex that referenced this pull request Nov 10, 2025
Collaborator

@etraut-openai etraut-openai left a comment


Overall, this looks good to me. I left a few suggestions.

In addition to reviewing the code, I also did some manual testing and confirmed that it works as advertised with both LM Studio and Ollama configurations.

Making an isolated commit to move rest of the shared code to common to
avoid circular dependencies and make it easy to revert if not the
correct place
@etraut-openai etraut-openai merged commit 837bc98 into openai:main Nov 17, 2025
25 checks passed
@github-actions github-actions bot locked and limited conversation to collaborators Nov 17, 2025
@rugvedS07 rugvedS07 deleted the feat/lmstudio branch November 17, 2025 20:51
