Conversation

@aibrahim-oai (Collaborator)

Summary

  • update documentation, example configs, and automation defaults to reference gpt-5.1 / gpt-5.1-codex
  • bump the CLI and core configuration defaults, model presets, and error messaging to the new models while keeping the model-family/tool coverage for legacy slugs
  • refresh tests, fixtures, and TUI snapshots so they expect the upgraded defaults

Testing

  • cargo test -p codex-core config::tests::test_precedence_fixture_with_gpt5_profile

Codex Task


  #[cfg(target_os = "windows")]
- pub const OPENAI_DEFAULT_MODEL: &str = "gpt-5";
+ pub const OPENAI_DEFAULT_MODEL: &str = "gpt-5.1";

Collaborator: let's double check with @dylan-hurd-oai

Collaborator: :shipit:

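For reference, a minimal sketch of the cfg-gating pattern in the hunk above: each target OS gets its own constant and only one definition is compiled in. The non-Windows value shown here is an assumption for illustration, not something taken from this diff.

// Sketch only. The non-Windows slug below is an assumed value, not from this PR.
#[cfg(target_os = "windows")]
pub const OPENAI_DEFAULT_MODEL: &str = "gpt-5.1";

#[cfg(not(target_os = "windows"))]
pub const OPENAI_DEFAULT_MODEL: &str = "gpt-5.1-codex";

fn main() {
    // Exactly one definition survives compilation, so the constant always resolves.
    println!("default model: {}", OPENAI_DEFAULT_MODEL);
}
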
  } else if slug.starts_with("gpt-3.5") {
      model_family!(slug, "gpt-3.5", needs_special_apply_patch_instructions: true)
- } else if slug.starts_with("test-gpt-5-codex") {
+ } else if slug.starts_with("test-gpt-5") {

Collaborator: ??

Collaborator Author: to include test-gpt-5.1

Collaborator: it has codex prompt

Collaborator Author: oh yeah I wanted to support gpt-5.1-codex, just for tests

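The ordering question in this hunk comes from prefix matching being evaluated top to bottom: a broad starts_with("test-gpt-5") arm also matches "test-gpt-5-codex", so any codex-specific arm has to be checked first. A self-contained sketch of that pattern, using invented names rather than the real model_family! machinery:

// Illustrative sketch of prefix-ordered slug matching; FamilyKind and
// family_for_slug are hypothetical names, not from the codebase.
#[derive(Debug, PartialEq)]
enum FamilyKind {
    Gpt5Codex, // would carry the codex prompt
    Gpt5,      // covers test-gpt-5, test-gpt-5.1, ...
    Other,
}

fn family_for_slug(slug: &str) -> FamilyKind {
    // More specific prefixes must come before broader ones, otherwise
    // "test-gpt-5" would swallow "test-gpt-5-codex".
    if slug.starts_with("test-gpt-5-codex") {
        FamilyKind::Gpt5Codex
    } else if slug.starts_with("test-gpt-5") {
        FamilyKind::Gpt5
    } else {
        FamilyKind::Other
    }
}

fn main() {
    assert_eq!(family_for_slug("test-gpt-5-codex"), FamilyKind::Gpt5Codex);
    assert_eq!(family_for_slug("test-gpt-5.1"), FamilyKind::Gpt5);
    println!("prefix ordering behaves as expected");
}
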
  fn model_mapping_builds_tokenizer() -> Result<(), TokenizerError> {
      // Choose a long-standing model alias that maps to cl100k_base.
-     let tok = Tokenizer::for_model("gpt-5")?;
+     let tok = Tokenizer::for_model("gpt-5.1")?;

Collaborator: does this actually work or do we get a fallback?

Collaborator Author: fallback

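Per the exchange above, "gpt-5.1" has no dedicated tokenizer mapping and the lookup falls back to a default encoding. A rough sketch of that fallback shape; the mapping table and encoding names are illustrative, not the real Tokenizer internals:

// Hypothetical sketch of a model-name -> encoding lookup with a fallback.
fn exact_encoding(model: &str) -> Option<&'static str> {
    match model {
        "gpt-4o" => Some("o200k_base"),
        "gpt-4" | "gpt-3.5-turbo" => Some("cl100k_base"),
        _ => None, // "gpt-5.1" has no exact entry in this toy table
    }
}

fn encoding_for_model(model: &str) -> &'static str {
    // Unknown slugs fall back to a default encoding instead of erroring,
    // which matches the "fallback" answer in the thread above.
    exact_encoding(model).unwrap_or("cl100k_base")
}

fn main() {
    assert_eq!(encoding_for_model("gpt-5.1"), "cl100k_base");
    println!("gpt-5.1 -> {}", encoding_for_model("gpt-5.1"));
}
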
@dylan-hurd-oai (Collaborator) left a comment: Should we introduce const variables for default gpt and gpt-x-codex?

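One way to read that suggestion: centralize the two slugs in named constants so the next model bump touches a single place. A hedged sketch, with invented constant names:

// Sketch of the suggestion above; the constant names are hypothetical.
pub const DEFAULT_GPT_MODEL: &str = "gpt-5.1";
pub const DEFAULT_GPT_CODEX_MODEL: &str = "gpt-5.1-codex";

fn main() {
    // A future bump would then only edit the two constants above.
    println!("default: {}, codex: {}", DEFAULT_GPT_MODEL, DEFAULT_GPT_CODEX_MODEL);
}
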
@pakrym-oai (Collaborator): Is there a bad merge with this PR? Looks like a lot of unrelated changes.

@aibrahim-oai merged commit ddcc60a into main on Nov 18, 2025 (25 checks passed).
@aibrahim-oai deleted the codex/update-models-to-gpt-5.1-and-gpt-5.1-codex branch on November 18, 2025 at 01:40.
github-actions bot locked and limited the conversation to collaborators on Nov 18, 2025.