fix: respect contextLength from YAML model config #11807
Merged
Conversation
Three fixes for contextLength not being respected:

1. Add contextLength to the YAML model schema at the model level (not just inside defaultCompletionOptions), matching JSON config behavior.
2. Read contextLength from both model-level and defaultCompletionOptions, with model-level taking precedence.
3. Fix the GUI selector fallback, which used DEFAULT_MAX_TOKENS (4096) instead of DEFAULT_CONTEXT_LENGTH (32768).

Closes #4638
Closes #4304
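A minimal sketch of fix 3, using the constant and selector names mentioned in the description; the surrounding types are illustrative, not the actual Continue source:

```typescript
// Constants named in the PR description; values are the ones it cites.
const DEFAULT_MAX_TOKENS = 4096;
const DEFAULT_CONTEXT_LENGTH = 32768;

// Illustrative shape of the selected model (the real type has more fields).
interface SelectedModel {
  contextLength?: number;
}

function selectSelectedChatModelContextLength(
  model: SelectedModel | undefined,
): number {
  // Before the fix, the fallback here was DEFAULT_MAX_TOKENS (4096),
  // silently shrinking the usable context when no contextLength was set.
  return model?.contextLength ?? DEFAULT_CONTEXT_LENGTH;
}
```

With no configured contextLength, the selector now falls back to the 32768-token default rather than the 4096-token output cap.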
Contributor
Test Coverage Review

This PR fixes how contextLength is resolved from YAML config, but the new precedence logic has no direct test.

Suggested Test Addition
```typescript
it("should prioritize model.contextLength over defaultCompletionOptions.contextLength", async () => {
  const model: ModelConfig = {
    name: "test-model",
    provider: "openai",
    model: "gpt-4",
    contextLength: 128000, // Should take precedence
    defaultCompletionOptions: {
      contextLength: 64000, // Should be ignored
    },
  };
  const result = await llmsFromModelConfig({
    model,
    uniqueId: "test-id",
    llmLogger: mockLLMLogger,
    config: mockConfig,
  });
  expect(result).toHaveLength(1);
  expect(result[0].contextLength).toBe(128000);
});

it("should fall back to defaultCompletionOptions.contextLength when model.contextLength is not set", async () => {
  const model: ModelConfig = {
    name: "test-model",
    provider: "openai",
    model: "gpt-4",
    defaultCompletionOptions: {
      contextLength: 64000,
    },
  };
  const result = await llmsFromModelConfig({
    model,
    uniqueId: "test-id",
    llmLogger: mockLLMLogger,
    config: mockConfig,
  });
  expect(result).toHaveLength(1);
  expect(result[0].contextLength).toBe(64000);
});
```

This would ensure the prioritization logic is exercised directly.
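The prioritization the tests above check can be sketched as a single nullish-coalescing chain; the function name and stripped-down types here are illustrative, not the actual source:

```typescript
// Default cited in the PR description.
const DEFAULT_CONTEXT_LENGTH = 32768;

interface CompletionOptions {
  contextLength?: number;
}

// Reduced to the two fields that matter for this sketch.
interface ModelConfig {
  contextLength?: number;
  defaultCompletionOptions?: CompletionOptions;
}

// Model-level wins, then defaultCompletionOptions, then the global default.
function resolveContextLength(model: ModelConfig): number {
  return (
    model.contextLength ??
    model.defaultCompletionOptions?.contextLength ??
    DEFAULT_CONTEXT_LENGTH
  );
}
```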
Replace find callbacks with includes() to offset complexity from contextLength nullish coalescing.
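The shape of that refactor, shown on a hypothetical capability list (the real check lives in modelConfigToBaseLLM):

```typescript
// Hypothetical capability data for illustration.
const capabilities: string[] = ["tool_use", "image_input"];

// Before: find() with a callback, which adds a closure to the
// function's complexity score.
const hasToolsBefore =
  capabilities.find((c) => c === "tool_use") !== undefined;

// After: includes() expresses the same membership test with no callback.
const hasToolsAfter = capabilities.includes("tool_use");
```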
… YAML examples use defaultCompletionOptions.contextLength, JSON examples use model-level contextLength. Also fixes the Ollama guide using the wrong key name (completionOptions -> defaultCompletionOptions).
Extract the repetitive env-to-options mapping from modelConfigToBaseLLM into a separate applyEnvOptions function, bringing complexity well under the limit of 36.
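One way such an extraction could look; the option fields and env names here are hypothetical, standing in for whatever modelConfigToBaseLLM actually maps:

```typescript
// Illustrative subset of LLM options (the real type has more fields).
interface LLMOptions {
  apiKey?: string;
  apiBase?: string;
}

// Hypothetical helper in the spirit of the refactor: the repeated
// env-to-options mapping lives in one function instead of inline,
// lowering the caller's cyclomatic complexity.
function applyEnvOptions(
  options: LLMOptions,
  env: Record<string, string | undefined>,
): LLMOptions {
  return {
    ...options,
    apiKey: env.API_KEY ?? options.apiKey,
    apiBase: env.API_BASE ?? options.apiBase,
  };
}
```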
Patrick-Erichsen
approved these changes
Mar 25, 2026
Summary

Fixes contextLength not being respected when set in YAML config, addressing a long-standing issue reported by multiple users.

- Adds contextLength to the YAML model schema at the model level — previously it was only recognized inside defaultCompletionOptions, and zod silently stripped it if placed at the model level (where JSON config supports it)
- Reads contextLength from both locations — model-level takes precedence over defaultCompletionOptions.contextLength
- selectSelectedChatModelContextLength was falling back to DEFAULT_MAX_TOKENS (4096) instead of DEFAULT_CONTEXT_LENGTH (32768)

Both of these YAML configs now work:
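A sketch of the two shapes, with illustrative model values (only the placement of contextLength matters here):

```yaml
# Form 1: model-level contextLength (now supported in YAML, matching JSON config)
models:
  - name: my-model
    provider: openai
    model: gpt-4
    contextLength: 128000
---
# Form 2: inside defaultCompletionOptions (model-level wins if both are set)
models:
  - name: my-model
    provider: openai
    model: gpt-4
    defaultCompletionOptions:
      contextLength: 128000
```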
Closes #4638
Closes #4304
Test plan

- contextLength at model level in YAML config — verify it's respected (no truncation errors)
- contextLength inside defaultCompletionOptions — verify it still works
- contextLength at both levels — verify model-level takes precedence
- maxTokens too close to contextLength at either level — verify the validation warning

Summary by cubic
Respect contextLength from YAML configs. Model-level contextLength is supported and takes precedence; GUI fallback, validation, and docs now match.

Bug Fixes

- Added contextLength to the YAML model schema at the model level in @continuedev/config-yaml, matching JSON config.
- Read contextLength from the model level or defaultCompletionOptions, with model-level taking precedence.
- GUI fallback uses DEFAULT_CONTEXT_LENGTH instead of DEFAULT_MAX_TOKENS.
- Validation checks contextLength and maxTokens, warning when they’re too close.
- YAML docs examples show defaultCompletionOptions.contextLength; JSON examples show model-level contextLength. Fixed Ollama guide key name to defaultCompletionOptions.

Refactors

- Replaced find with includes for capability checks in modelConfigToBaseLLM.
- Extracted applyEnvOptions to reduce complexity.

Written for commit c254c9f. Summary will update on new commits.