We built this partly in response to what happened to LiteLLM this week. A supply-chain attack that turned a trusted Python library into a machine-wide backdoor is a good reminder that the Python packaging ecosystem has structural risks other languages don't.
liter-llm () provides the same provider coverage (142 LLMs) with a Rust core. No interpreter-startup side effects, no .pth files, no telemetry. API keys are stored as SecretString (zeroed on drop, redacted in debug output), so the credential harvesting that made the LiteLLM incident so damaging is structurally impossible here. For anyone asking "who built this": we're the team behind , a polyglot open-source Rust document intelligence framework supporting 91+ document formats.
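To make the SecretString claim concrete, here is a minimal self-contained sketch of the pattern (this is a hypothetical illustration of the technique, not liter-llm's actual type; real implementations such as the `secrecy`/`zeroize` crates use volatile writes so the compiler cannot optimize the wipe away):

```rust
use std::fmt;

// Hypothetical SecretString sketch: the secret never appears in Debug
// output, and its bytes are zeroed when the value is dropped.
struct SecretString(String);

impl SecretString {
    fn new(s: impl Into<String>) -> Self {
        SecretString(s.into())
    }

    // Explicit accessor: callers must opt in to reading the secret,
    // so accidental leaks via logging or formatting are harder.
    fn expose(&self) -> &str {
        &self.0
    }
}

// `{:?}` prints a placeholder, so debug logs cannot leak the key.
impl fmt::Debug for SecretString {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        f.write_str("SecretString([REDACTED])")
    }
}

// Best-effort zeroing on drop (zero bytes keep the String valid UTF-8).
impl Drop for SecretString {
    fn drop(&mut self) {
        unsafe {
            for b in self.0.as_bytes_mut() {
                *b = 0;
            }
        }
    }
}

fn main() {
    let key = SecretString::new("sk-example-key");
    // Formatting the value shows only the placeholder...
    assert_eq!(format!("{:?}", key), "SecretString([REDACTED])");
    // ...while deliberate access still works.
    assert_eq!(key.expose(), "sk-example-key");
    println!("debug view: {:?}", key);
}
```

The point of the pattern is that a compromised dependency that walks and dumps in-memory objects via their Debug/repr output gets a redacted placeholder instead of the raw key.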
Bindings exist for 11 languages, including Python, but also languages that had no comparable library before: Rust, Ruby, Java, Go, PHP, Elixir, C#, C, TypeScript/Node.js, and WASM. Every binding exposes the full API surface: chat, streaming, embeddings, images, speech, transcription, tool calling, and more. Caching is powered by OpenDAL (40+ backends), cost calculation uses an embedded pricing registry derived from the same source as LiteLLM, and streaming supports both SSE and AWS EventStream binary framing.
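For readers unfamiliar with the wire formats mentioned above: SSE (Server-Sent Events) is the text-based framing most LLM APIs stream over. A minimal sketch of how SSE events are delimited (a hypothetical standalone parser, not liter-llm's internal one; `data:` lines carry payloads and a blank line ends an event):

```rust
// Hypothetical SSE parser sketch: collects the `data:` payload of each
// event; a blank line marks the end of an event.
fn parse_sse(raw: &str) -> Vec<String> {
    let mut events = Vec::new();
    let mut data = String::new();
    for line in raw.lines() {
        if let Some(payload) = line.strip_prefix("data:") {
            if !data.is_empty() {
                data.push('\n'); // multi-line data fields join with \n per the SSE spec
            }
            data.push_str(payload.trim_start());
        } else if line.is_empty() && !data.is_empty() {
            events.push(std::mem::take(&mut data)); // blank line dispatches the event
        }
    }
    events
}

fn main() {
    // A typical LLM streaming response: JSON deltas, then a sentinel.
    let stream = "data: {\"delta\":\"Hel\"}\n\ndata: {\"delta\":\"lo\"}\n\ndata: [DONE]\n\n";
    let events = parse_sse(stream);
    assert_eq!(events, vec!["{\"delta\":\"Hel\"}", "{\"delta\":\"lo\"}", "[DONE]"]);
    println!("{} events", events.len());
}
```

AWS EventStream, by contrast, is a binary length-prefixed framing with per-message CRC checksums (used by Bedrock), which is why a client that speaks both can cover providers a plain SSE client cannot.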
liter-llm is a client library, not a proxy: it does not include a proxy server, admin dashboard, or team management. If that's what you need, LiteLLM's proxy is still the reference there.
And of course, full credit and thanks to LiteLLM for the provider configurations we derived from their work.
GitHub:
Part of the Kreuzberg org
Join our Discord and let us know what you think: