one-api-nginx (OpenResty LLM Gateway)

A minimal, runnable LLM gateway built on OpenResty:

  • Authorization: Bearer <key> authentication (decoupled from the upstream keys)
  • Automatic routing to an upstream provider that supports the model named in the request body
  • Whitelisted passthrough endpoints
  • GET /v1/models returns an aggregated list of available models

Architecture

                +----------------------+
                |      Client/User     |
                |  OpenAI-compatible   |
                +----------+-----------+
                           |
                           | HTTP: /v1/* (model in JSON)
                           v
                 +---------+----------+
                 |   OpenResty GW     |
                 |  (Nginx + Lua)     |
                 |--------------------|
                 |  access_by_lua:    |
                 |   - whitelist      |
                 |   - parse model    |
                 |   - route provider |
                 |   - inject upstream|
                 +----+----------+----+
                      |          |
                      |          | Redis (control-plane)
                      |          v
                      |   +------+------+
                      |   |    Redis    |
                      |   | gw:key:*    |
                      |   | gw:model:*  |
                      |   | gw:provider:|
                      |   +-------------+
                      |
                      | proxy_pass (transparent)
                      v
        +-------------------+   +-------------------+   +-------------------+
        |  Provider A       |   |  Provider B       |   |  Provider C       |
        | (OpenAI/...)      |   | (Anthropic/...)   |   | (Self-host/...)   |
        +-------------------+   +-------------------+   +-------------------+
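
The data-plane path above can be read as a single access_by_lua handler. The following is a minimal sketch of that phase, assuming lua-resty-redis and cjson; the hash field names (base_url, auth_value), the $upstream_base variable, and the Redis address are illustrative assumptions, not this repo's actual code:

-- Sketch of the access phase from the diagram (whitelist check omitted).
-- Assumed: nginx.conf declares `set $upstream_base "";` and proxies with
-- `proxy_pass $upstream_base;`.
local cjson = require "cjson.safe"
local redis = require "resty.redis"

local red = redis:new()
red:set_timeout(1000)
assert(red:connect("127.0.0.1", 6379))

-- 1) Parse the model from the JSON request body
ngx.req.read_body()
local body = cjson.decode(ngx.req.get_body_data() or "") or {}
if not body.model then
  return ngx.exit(ngx.HTTP_BAD_REQUEST)
end

-- 2) Resolve model -> provider name (e.g. gw:model:gpt-4o-mini -> "openai")
local provider = red:get("gw:model:" .. body.model)
if not provider or provider == ngx.null then
  return ngx.exit(ngx.HTTP_NOT_FOUND)
end

-- 3) Load the provider record and inject the real upstream credentials
local conf = red:array_to_hash(red:hgetall("gw:provider:" .. provider))
ngx.var.upstream_base = conf.base_url
ngx.req.set_header("Authorization", "Bearer " .. conf.auth_value)

The sketch only shows the happy path; the real gateway would also honor each key's provider list and routing policy before picking an upstream.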

Startup

docker compose up -d
export OPENAI_REAL_KEY="<your upstream key>"
./scripts/redis_seed.sh   # seed the gw:* control-plane keys in Redis

The gateway listens on http://localhost:8080.

Quick verification

curl -sS http://localhost:8080/v1/models | jq
curl -sS http://localhost:8080/health | jq
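
Since the endpoint is OpenAI-compatible, the aggregated model list should look roughly like the standard list shape (illustrative output; the actual entries depend on the seeded providers):

{
  "object": "list",
  "data": [
    { "id": "gpt-4o-mini", "object": "model", "owned_by": "openai" }
  ]
}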

Passthrough example (requires a real upstream key)

curl -sS http://localhost:8080/v1/chat/completions \
  -H "Authorization: Bearer demo" \
  -H "Content-Type: application/json" \
  -d '{"model":"gpt-4o-mini","messages":[{"role":"user","content":"hi"}]}'

Design doc

docs/design/openresty-llm-gateway.md

Ops CLI (gwctl)

The repository ships a small ops tool for maintaining the key/provider/model mappings in Redis:

./ops/gwctl help
./ops/gwctl redis ping

./ops/gwctl provider add openai --base-url https://api.openai.com --auth-type bearer --auth-value "$OPENAI_REAL_KEY"
./ops/gwctl provider list
./ops/gwctl provider models openai
./ops/gwctl provider edit openai --auth-value "$OPENAI_REAL_KEY"
./ops/gwctl provider del openai --purge
./ops/gwctl key add demo --providers openai --status active --policy rr --model-allow "*"
./ops/gwctl model bind gpt-4o-mini --provider openai
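
On the gateway side, the record written by `gwctl key add` is what the auth step consults. Below is a minimal sketch of that check, assuming the hash fields mirror the CLI flags (status, policy, model_allow); the field names, the Redis address, and ngx.ctx.model are assumptions, not the repo's actual schema:

-- Sketch: validate "Authorization: Bearer <key>" against gw:key:<key>.
local redis = require "resty.redis"

local red = redis:new()
red:set_timeout(1000)
assert(red:connect("127.0.0.1", 6379))

local auth = ngx.req.get_headers()["Authorization"] or ""
local key = auth:match("^Bearer%s+(%S+)$")
if not key then
  return ngx.exit(ngx.HTTP_UNAUTHORIZED)
end

local rec = red:array_to_hash(red:hgetall("gw:key:" .. key))
if rec.status ~= "active" then
  return ngx.exit(ngx.HTTP_FORBIDDEN)
end

-- "--model-allow *" permits every model; otherwise treat the field as a
-- comma-separated allow-list.
local function model_allowed(allow, model)
  if allow == "*" then return true end
  for m in (allow or ""):gmatch("[^,]+") do
    if m == model then return true end
  end
  return false
end

-- ngx.ctx.model is assumed to have been set by the routing step above.
if not model_allowed(rec.model_allow, ngx.ctx.model) then
  return ngx.exit(ngx.HTTP_FORBIDDEN)
end

The --policy rr flag suggests round-robin selection among the key's providers; that selection step is omitted from the sketch.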

Debug logging

Set GATEWAY_DEBUG=1 to enable routing debug logs (visible via docker compose logs gateway):

docker compose up -d
docker compose exec -T gateway sh -lc 'echo $GATEWAY_DEBUG'
docker compose logs -f gateway
