AI Document Chat — Ask Your PDFs, EPUBs & More

LocalRAG! is AI for your documents. Ask questions about your PDFs, Office docs, EPUBs, and images. On-device AI (local LLM) works without internet, or use Claude for higher accuracy.

Available on iPhone, iPad, and Android

1-week free trial included


What's New

v2.1 (2026-03-26)

Inline Citations, PDF Highlights & Cross-Language Search

AI answers now show [1][2] source references — tap to jump to the exact page with yellow highlight. On-device embedding model (E5 Multilingual) enables semantic search across languages. Upgraded to BM25 search for more accurate retrieval. Plus: OCR editor, layout-aware PDF table extraction, and search pipeline transparency.

Citations · PDF Highlight · Cross-Language · BM25
v2.0 (2026-03-22)

On-Device AI — Zero Cloud, Full Privacy

Built-in local LLM (Qwen 3.5 4B) runs entirely on your device. No internet, no API calls, no data sent anywhere. Available on iOS and Android (beta). Download the model once (~3 GB) and chat with your documents completely offline.

Local LLM · Offline · Privacy

See What You Can Do

Import your documents, ask questions in plain language, and get AI-powered answers — all processed on your device.

LocalRAG! AI chat screen with document Q&A

Ask Your Documents Anything

"What are the termination clauses?" "Summarize chapter 3." Just type a question in plain language and LocalRAG! searches across all your documents to find the most relevant passages. No need to open files one by one — AI reads everything for you.

LocalRAG! document list showing multiple file formats

15 Formats, One App

PDF, EPUB, Word, Excel, PowerPoint, images, scanned docs — 15 file formats in a single collection, searchable all at once. Whiteboard photos and scanned PDFs are automatically processed with OCR to extract text for AI-powered Q&A.

LocalRAG! answer with source citations

Answers You Can Verify

Every AI response includes page-level source citations like "Service-Agreement.pdf (p.12)" — so you always know exactly which document and page the answer came from. Never blindly trust AI — verify with the original source.

LocalRAG! AI mode selection screen showing on-device option

Offline AI — Zero Cloud

The built-in local LLM (Qwen3 4B) runs entirely on your device. No internet connection needed, zero data sent to any server. Feel safe asking AI about confidential documents and personal files. Need higher accuracy? Switch to Claude API anytime.

Any file. One conversation.

PDF, Word, EPUB — no matter the format, just add them to a collection. AI searches across all your documents at once.

All your project docs in one place

Meeting PDFs, Word minutes, text memos — formats don't matter. Add them to a collection and ask "What was decided about that issue?" No more opening files one by one.

Bought books you haven't read? No problem

Import your unread PDFs and EPUBs together. Without reading a page, ask "What's the key point of this book?" or "Explain the method in Chapter 3." AI reads every page for you.

Search across multiple technical standards

ISO PDFs, RFC text files, internal Word policies — load them all into one collection. Ask "What are the authentication requirements?" and LocalRAG finds relevant sections across every document.

Just add everything. Ask AI later.

Insurance policies, user manuals, contracts, recipes. Add anything to a collection. You may forget what's in there, but LocalRAG never does. When you need it, just ask — it tells you which document has the answer.

See it in action

Watch how LocalRAG! turns your documents into conversations.

On-Device AI vs. Claude API

Two ways to chat with your documents. Choose what fits your needs.

                        On-Device AI            Claude API
Internet required       No — fully offline      Yes
Data sent to server     None — zero cloud       Question + relevant excerpts only
Answer quality          Good (4B model)         Excellent (Opus / Sonnet / Haiku)
Response speed          ~10 sec to a few min    ~3 sec
Model download          ~3 GB (one-time)        Not required
Platform                iOS & Android (beta)    iOS & Android

Both modes process documents (text extraction, indexing, search) entirely on your device. The on-device AI takes this further by generating answers locally too — nothing ever leaves your device.

Built for Every Professional

Students

"I imported 12 textbook PDFs and asked 'Explain the difference between TCP and UDP.' Got a clear answer with page numbers — no more flipping through 3,000 pages."

Researchers

"I loaded 30 papers into one collection and asked 'Which studies use transformer-based models for medical imaging?' Found 8 relevant papers in seconds."

Legal Professionals

"I reviewed a 120-page NDA by asking 'Are there any non-compete clauses?' It found two — on pages 34 and 78 — that I would have missed in a manual review."

Business

"Our team's SOPs, compliance docs, and meeting notes are all in one collection. New hires just ask 'What's the approval process for vendor contracts?' and get the answer instantly."

Everything you need

Powerful document AI that respects your privacy.

📄

15 Formats Supported

PDF, EPUB, DOCX, XLSX, PPTX, TXT, MD, CSV, RTF, HTML, JPG, PNG, HEIC, WebP — plus scanned and copy-protected PDFs.

💬

Natural Language Chat

Ask questions about your documents in plain language. Get answers with source citations.

📍

Inline Citations & PDF Jump

AI answers show [1][2] source references you can tap. Jump directly to the cited page with yellow highlight on the exact passage.

🔒

Privacy First

All document processing happens on your device. Your files never leave your phone or tablet.

🔍

BM25 + Cross-Language Search

Okapi BM25 search with on-device embedding model (E5 Multilingual) for semantic search across languages. Ask in Japanese, find answers in English documents.
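For the curious: Okapi BM25, the ranking function named above, scores a passage by how often the query terms appear in it, discounted by how common those terms are across the whole collection and by passage length. A minimal illustrative sketch in Python — this is the textbook formula, not LocalRAG!'s actual implementation:

```python
import math
from collections import Counter

def bm25_rank(query, docs, k1=1.5, b=0.75):
    """Score each document against the query with Okapi BM25.

    Higher scores = more relevant. k1 and b are the standard tuning
    constants (term-frequency saturation and length normalization).
    """
    tokenized = [d.lower().split() for d in docs]
    N = len(tokenized)
    avgdl = sum(len(d) for d in tokenized) / N
    # Document frequency: how many docs contain each term.
    df = Counter()
    for d in tokenized:
        df.update(set(d))
    scores = []
    for d in tokenized:
        tf = Counter(d)
        score = 0.0
        for term in query.lower().split():
            if term not in tf:
                continue
            idf = math.log(1 + (N - df[term] + 0.5) / (df[term] + 0.5))
            score += idf * tf[term] * (k1 + 1) / (
                tf[term] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(score)
    return scores
```

A document that never mentions a query term scores zero for it, which is why the embedding model is paired with BM25: semantic search can still match "automobile" to "car", or a Japanese question to an English passage.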

🖼

Image OCR + Editor

Import photos of whiteboards, receipts, or scanned documents. On-device OCR extracts text — and you can edit the extracted text before indexing.

📂

Collections (Folders)

Organize documents in nested folders up to 3 levels. Parent scope includes all child documents for focused Q&A.

📴

Offline AI Mode

Built-in local LLM runs entirely on your device — no internet, no API calls, no data sent anywhere. True zero-cloud document AI.

🤖

Model Selection

Use the on-device AI for fully offline operation, or choose Claude Sonnet 4.5 / Haiku 4.5 / Opus 4 for higher accuracy.

How it works

Three simple steps to chat with your documents.

1

Import

Add documents from your file manager or other apps. 15 formats supported.

2

Index

LocalRAG! extracts text and builds a search index — entirely on your device.

3

Ask

Ask anything about your documents. Get AI-powered answers with citations.
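The three steps above follow the standard retrieval-augmented generation loop: split extracted text into chunks, rank chunks against the question, then hand the best ones to the language model. A simplified sketch — function names and the word-overlap ranking are illustrative stand-ins, not LocalRAG! internals:

```python
def chunk(text, size=500):
    """Steps 1-2 (Import & Index): split extracted text into ~500-char chunks."""
    return [text[i:i + size] for i in range(0, len(text), size)]

def retrieve(question, chunks, top_k=3):
    """Step 3 (Ask): rank chunks by naive word overlap with the question.

    A real pipeline would use BM25 plus embeddings here; overlap keeps
    the sketch self-contained.
    """
    q = set(question.lower().split())
    ranked = sorted(chunks,
                    key=lambda c: len(q & set(c.lower().split())),
                    reverse=True)
    return ranked[:top_k]
```

The retrieved chunks, each tagged with its source document and page, are what the model sees when generating an answer — which is also what makes page-level citations possible.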

Supported formats

All the document types you work with, in one app.

PDF · EPUB · DOCX · XLSX · PPTX · TXT · MD · CSV · RTF · HTML · JPG/PNG · HEIC/WebP · Scanned PDF

Frequently Asked Questions

Is LocalRAG! truly local? What data is sent to the cloud?

Document processing — text extraction, chunking, and search indexing — happens entirely on your device. Your files are never uploaded anywhere. When you use the on-device AI (local LLM), everything runs completely offline — zero data leaves your device. When you use Claude, only your question and the top 10 most relevant text snippets (~500 characters each) are sent to the Claude API. The full document is never transmitted.
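To make the "top 10 snippets" point concrete, here is a sketch of what such a request payload could look like. This is an assumed illustration of the shape described above, not LocalRAG!'s actual API code:

```python
def build_payload(question, ranked_chunks, top_n=10, max_chars=500):
    """Assemble the only data that would leave the device in cloud mode.

    ranked_chunks: passages already scored on-device, best first.
    Only the question plus short excerpts are included — never a file.
    """
    excerpts = [c[:max_chars] for c in ranked_chunks[:top_n]]
    return {"question": question, "excerpts": excerpts}
```

Even for a 120-page contract, the request contains at most ten ~500-character excerpts; the document itself stays on the device.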

How is this different from ChatPDF, PDF.ai, or other cloud-based tools?

Cloud-based services require you to upload your entire document to their servers. LocalRAG! keeps your files on your device and only sends small, relevant excerpts when you ask a question. Plus, with the built-in on-device AI, you can go fully offline — zero data leaves your device. This makes it suitable for sensitive documents like contracts, medical records, or confidential research.

Does Anthropic use my data for AI training?

No. Anthropic's API usage policy explicitly states that API inputs and outputs are not used for model training. This is different from the free consumer Claude chat product. Your document excerpts are processed and discarded.

Can I use LocalRAG! without a subscription?

Yes. The free plan includes 5 questions per day with Claude Haiku and the on-device AI. You can also bring your own Anthropic API key (BYOK) for unlimited access to all models at no subscription cost — you only pay Anthropic directly for API usage.

How does the offline AI (local LLM) work?

LocalRAG! includes a built-in on-device language model that runs entirely on your iPhone, iPad, or Android device (beta). Once you download the model (~3 GB), you can ask questions about your documents with zero internet connection. Answers are generated on your device — nothing is sent to any server. It's available on all plans.

What document formats are supported?

LocalRAG! supports 15 formats: PDF, EPUB, DOCX, XLSX, PPTX, TXT, Markdown, CSV, RTF, HTML, JPG, PNG, HEIC, and WebP. Scanned and copy-protected PDFs, as well as images, are processed with on-device OCR to extract their text.

Start chatting with your documents

Start free with 5 questions/day, or subscribe for more.
Free, Standard ($4.99/mo), and Pro ($9.99/mo) plans available. Cancel anytime.
Terms of Use · Privacy Policy