vLLM
vLLM is a high-throughput and memory-efficient inference and serving engine for large language models (LLMs).
Financial Contributions
Top financial contributors
Individuals
$100,000 USD since Jun 2024
$100,000 USD since Aug 2024
$10,000 USD since May 2024
$7,500 USD since Dec 2024
$700 USD since Mar 2025
$151 USD since Jan 2026
$150 USD since May 2025
$111 USD since Jun 2025
$100 USD since May 2024
$100 USD since Jun 2024
$100 USD since Jun 2024
$100 USD since Apr 2025
$100 USD since Oct 2025
$30 USD since Feb 2025
$25 USD since Jul 2024
Organizations
$100,000 USD since Jun 2024
$20,000 USD since Oct 2025
$2,400.08 USD since Jul 2024
$300 USD since May 2025
$200 USD since Aug 2024
vLLM is all of us
Our contributors (32)
Thank you for supporting vLLM.
Simon Mo
Woosuk Kwon
Zhuohan Li
Sequoia: $100,000 USD
Guest: $100,000 USD
SKYWORK AI: $100,000 USD
The House Fund: $20,000 USD
Dropbox Inc.: $10,000 USD
Kindroid: $7,500 USD
GitHub Sponsors: $2,400 USD
Aman Bhargava: $700 USD
Budget
Transparent and open finances.
Credit from Ekagra Ranjan to vLLM
Debit from vLLM to Kaichao You
Today's balance: $218,762.36 USD
Total raised: $307,024.75 USD
Total disbursed: $88,262.39 USD
Estimated annual budget: $23,158.95 USD