Conversation

@ServeurpersoCom
Collaborator

server/webui: add CLI support for configuring WebUI defaults

Summary

Implements issue #17940: adds a '--webui-config-file' CLI argument and a 'LLAMA_WEBUI_CONFIG' environment variable to configure WebUI default settings from the server side.

Problem

WebUI settings are hardcoded in the frontend with no server-side configuration. This causes problems:

  • Server operators can't set sensible defaults for their deployment
  • Users must reconfigure settings every time localStorage is cleared
  • Multi-user servers can't enforce defaults (like disabling auto-file-paste)

Example: issue #17940 requests the ability to set 'pasteLongTextToFileLen: 0' by default, but currently every user must configure this manually in every new browser session.

The TODO that guided this implementation

While exploring the codebase, I found this intentional placeholder in 'server-models.cpp' (lines 6-16):

// TODO: add support for this on web UI
{"role", "router"},
// this is a dummy response to make sure webui doesn't break
{"default_generation_settings", {
    {"params", json{}},  // <-- Empty JSON waiting for implementation
    {"n_ctx", 0},
}},

This empty JSON was left by the original developers as a clear signal for where webui settings should go. This PR completes that TODO.

Backend Changes (C++)

What was added

1. CLI argument ('common/arg.cpp')

  • New flag: '--webui-config-file'
  • Loads a JSON file with WebUI default settings

2. Environment variable ('server.cpp')

  • New env var: 'LLAMA_WEBUI_CONFIG'
  • Accepts a JSON string; overrides the file config
  • Example: LLAMA_WEBUI_CONFIG='{"pasteLongTextToFileLen":0}'

3. JSON parsing logic ('server.cpp')

  • Parses config at startup, before any router/child fork
  • Simple override: env var OR file (env var takes precedence)
  • Error handling: warn on missing file, error on invalid JSON

4. Expose in /props ('server-models.cpp' + 'server-context.cpp')

  • Added 'webui_settings' field to '/props' response
  • Works in both router mode and child mode
  • Fills in the TODO placeholder mentioned above
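
For illustration, roughly what the '/props' response looks like once the placeholder is filled, written here as a TypeScript literal (field names per this PR; all other '/props' fields are elided):

// Hypothetical sketch of the relevant slice of a /props response
// after this PR; surrounding fields are omitted.
const propsResponse = {
  default_generation_settings: {
    params: {},
    n_ctx: 0,
  },
  // pass-through of --webui-config-file / LLAMA_WEBUI_CONFIG
  webui_settings: {
    pasteLongTextToFileLen: 0,
    pdfAsImage: true,
  },
};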

How it works

Server startup
  |
  v
Parse LLAMA_WEBUI_CONFIG env var (if set) OR config.json (if provided)
  |
  v
Store result in params.webui_config
  |
  v
Return in /props as webui_settings

The C++ backend is completely agnostic: it doesn't know which settings exist. It just:

  • Reads the JSON
  • Passes it through to the frontend

This means adding new WebUI settings never requires C++ changes.

Frontend Changes (TypeScript)

What was added

1. API type ('types/api.d.ts')

  • Added optional 'webui_settings' field to props response
  • Type: 'Record<string, string | number | boolean>'

2. Syncable parameters ('services/parameter-sync.ts')

  • Added 14 WebUI settings to 'SYNCABLE_PARAMETERS':
    • pasteLongTextToFileLen
    • pdfAsImage
    • showThoughtInProgress
    • showToolCalls
    • disableReasoningFormat
    • keepStatsVisible
    • showMessageStats
    • askForTitleConfirmation
    • disableAutoScroll
    • renderUserContentAsMarkdown
    • autoMicOnEmpty
    • pyInterpreterEnabled
    • enableContinueGeneration
    • showSystemMessage
  • Updated 'extractServerDefaults()' to merge webui_settings (sketched after this list)

3. Store integration ('stores/server.svelte.ts' + 'stores/settings.svelte.ts')

  • Added getter for 'webuiSettings'
  • Plumbed through to settings store

4. Tests ('parameter-sync.spec.ts')

  • Test for webui_settings extraction
  • Validates user-specific settings are filtered out
  • Uses 'pasteLongTextToFileLen: 0' example from issue
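
A minimal sketch of the extraction-and-filtering step from item 2 above (and the exclusion list in the next section), assuming 'SYNCABLE_PARAMETERS' is a set of allowed keys; this is illustrative, not the actual code in 'services/parameter-sync.ts':

// Illustrative sketch only; the real implementation lives in
// services/parameter-sync.ts and may be shaped differently.
const SYNCABLE_PARAMETERS = new Set<string>([
  'pasteLongTextToFileLen',
  'pdfAsImage',
  'showThoughtInProgress',
  // ...the rest of the 14 settings listed above
]);

type WebuiSettings = Record<string, string | number | boolean>;

// Keep only whitelisted keys so user-specific settings such as
// apiKey or theme can never be injected from the server side.
function extractWebuiDefaults(webuiSettings?: WebuiSettings): WebuiSettings {
  const result: WebuiSettings = {};
  for (const [key, value] of Object.entries(webuiSettings ?? {})) {
    if (SYNCABLE_PARAMETERS.has(key)) {
      result[key] = value;
    }
  }
  return result;
}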

Settings intentionally excluded

These are user-specific and should NOT be server-configurable:

  • apiKey (personal credentials)
  • systemMessage (personal prompt)
  • theme (UI preference)
  • mcpServers (personal MCP config)
  • custom (user's custom JSON)

Priority Order

Server operators set defaults, but users keep control. From lowest to highest priority (sketched in code below):

1. WebUI hardcoded defaults ('settings-config.ts')
2. Server config: 'LLAMA_WEBUI_CONFIG' env var OR '--webui-config-file'
3. User's localStorage (always wins)

This means:

  • Server can provide sensible defaults
  • Users can still override anything
  • No breaking changes
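
A hedged sketch of that precedence; the function name and the hardcoded values here are illustrative, not the settings store's actual code:

// Illustrative only: the real merge happens in the settings store.
// Values in HARDCODED_DEFAULTS are placeholders, not the real defaults.
const HARDCODED_DEFAULTS: Record<string, unknown> = {
  pasteLongTextToFileLen: 2500, // placeholder value
  pdfAsImage: false,            // placeholder value
};

function resolveSettings(
  serverDefaults: Record<string, unknown>, // /props webui_settings
  userOverrides: Record<string, unknown>,  // user's localStorage
): Record<string, unknown> {
  // Later spreads win: hardcoded < server config < localStorage.
  return { ...HARDCODED_DEFAULTS, ...serverDefaults, ...userOverrides };
}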

Usage Examples

Config file

# webui-defaults.json
{
  "pasteLongTextToFileLen": 0,
  "pdfAsImage": true,
  "showThoughtInProgress": false
}

./llama-server --webui-config-file webui-defaults.json

Environment variable

LLAMA_WEBUI_CONFIG='{"pasteLongTextToFileLen":0}' ./llama-server

Both (env var overrides file)

LLAMA_WEBUI_CONFIG='{"pdfAsImage":true}' \
  ./llama-server --webui-config-file webui-defaults.json

Check it works

curl http://localhost:8080/props | jq .webui_settings

Testing

Backend

# Create test config
echo '{"pasteLongTextToFileLen":0}' > test.json

# Test file loading
./llama-server --webui-config-file test.json

# Test env var
LLAMA_WEBUI_CONFIG='{"pdfAsImage":true}' ./llama-server

# Check /props
curl http://localhost:8080/props | jq .webui_settings

Frontend

cd tools/server/webui
npm test -- parameter-sync.spec.ts
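
A test along the lines described earlier might look like this; vitest-style to match the '.spec.ts' naming, and both the import path and the 'extractServerDefaults' signature are assumptions, not copied from the suite:

// Hypothetical sketch of a parameter-sync test, not the actual spec.
import { describe, expect, it } from 'vitest';
import { extractServerDefaults } from './parameter-sync'; // path assumed

describe('extractServerDefaults', () => {
  it('merges webui_settings and filters user-specific keys', () => {
    const props = {
      webui_settings: {
        pasteLongTextToFileLen: 0,       // the example from issue #17940
        apiKey: 'must-not-pass-through', // user-specific, filtered out
      },
    };
    const defaults = extractServerDefaults(props);
    expect(defaults.pasteLongTextToFileLen).toBe(0);
    expect(defaults).not.toHaveProperty('apiKey');
  });
});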

Backward Compatibility

No breaking changes:

  • 'webui_settings' is optional in /props
  • Old servers work fine (webui_settings: null or missing)
  • WebUI falls back to hardcoded defaults
  • No recompilation needed for existing deployments

What this enables

Now server operators can:

  • Set deployment-wide defaults without forking
  • Disable problematic features (like auto-file-paste)
  • Provide better defaults for their use case
  • Support multiple configurations via env vars

And adding new configurable settings only requires TypeScript changes: no C++ recompilation needed.
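
For example, making a hypothetical new setting server-configurable would only mean extending the whitelist in 'services/parameter-sync.ts' (sketch; the real declaration may differ):

// Sketch: the only change needed for a new server-configurable
// WebUI setting; 'myNewWebuiSetting' is a made-up key.
const SYNCABLE_PARAMETERS = new Set<string>([
  // ...the existing 14 settings...
  'myNewWebuiSetting',
]);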

Fixes #17940

Add --webui-config-file and LLAMA_WEBUI_CONFIG to configure WebUI defaults from the server side. Exposes webui_settings in /props for both router and child modes.

Completes the TODO placeholder in server-models.cpp.

Adds webui_settings support in the parameter sync service and adds the WebUI params to SYNCABLE_PARAMETERS.

User-specific settings (apiKey, theme, etc.) are excluded.
@ServeurpersoCom
Collaborator Author

ServeurpersoCom commented Dec 14, 2025

Testing...

cd /var/www/ia/models
/root/llama.cpp.pascal/build/bin/llama-server \
 -ngl 999 -ctk q8_0 -ctv q8_0 -fa on --mlock -np 4 -kvu --port 8082 \
 --models-max 1 --models-preset backend.ini --webui-config-file frontend.json \
 > llama-server.log 2>&1 &

frontend.json

{
 "showSystemMessage": true,
 "showThoughtInProgress": true,
 "showToolCalls": true,
 "disableReasoningFormat": false,
 "keepStatsVisible": true,
 "showMessageStats": true,
 "askForTitleConfirmation": false,
 "pasteLongTextToFileLen": 0,
 "pdfAsImage": false,
 "disableAutoScroll": false,
 "renderUserContentAsMarkdown": false,
 "autoMicOnEmpty": false,
 "samplers": "top_k;typ_p;top_p;min_p;temperature",
 "temperature": 0.8,
 "dynatemp_range": 0.0,
 "dynatemp_exponent": 1.0,
 "top_k": 40,
 "top_p": 0.95,
 "min_p": 0.05,
 "xtc_probability": 0.0,
 "xtc_threshold": 0.1,
 "typ_p": 1.0,
 "repeat_last_n": 64,
 "repeat_penalty": 1.0,
 "presence_penalty": 0.0,
 "frequency_penalty": 0.0,
 "dry_multiplier": 0.0,
 "dry_base": 1.75,
 "dry_allowed_length": 2,
 "dry_penalty_last_n": -1,
 "max_tokens": -1,
 "pyInterpreterEnabled": false,
 "enableContinueGeneration": true
}
[screenshot: Sans titre]

@ServeurpersoCom ServeurpersoCom marked this pull request as ready for review December 14, 2025 15:39
@ServeurpersoCom
Collaborator Author

It works.

@allozaur allozaur left a comment
Collaborator

Looking good on my end
