Enterprise RAG on AWS

Private AI that answers questions about your company's documents.

No third-party training. No leaving AWS.

Isolated multi-tenancy by default. Dedicated deployment in your own AWS account for Enterprise. Built for multinationals, regulated industries, and mid-sized organizations that cannot afford to let their knowledge leave the perimeter.

The problem

Critical knowledge is scattered

Methodologies, manuals, and processes live across SharePoint, Drive, Confluence, PDFs, and people's heads. New hires take months to onboard.

ChatGPT is not an option

Customer data, contracts, and regulated information (GDPR, HIPAA-adjacent, financial) make uploading to public SaaS a legal and reputational risk.

No traceability, no compliance

Without verifiable citations and forensic audit, no recommendation passes legal, risk, or internal compliance review.

What we offer

A complete, governed, enterprise-ready RAG platform

We don't sell a chatbot. We sell the full pipeline that turns your private knowledge into audited, evidence-backed answers — inside your own AWS infrastructure.

01

Secure ingestion

PDFs, DOCX, Markdown, CSV, JSON. Parsing, chunking with overlap, metadata extraction, vector embeddings with Amazon Titan. All inside your AWS VPC.
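
Under the hood, chunking with overlap is a sliding window over the parsed text. A minimal Python sketch with illustrative `chunk_size` and `overlap` values (the actual pipeline's parsers and parameters are not shown here):

```python
def chunk_text(text: str, chunk_size: int = 800, overlap: int = 100) -> list[str]:
    """Split text into fixed-size chunks that overlap, so content that
    straddles a chunk boundary still appears whole in at least one chunk."""
    if overlap >= chunk_size:
        raise ValueError("overlap must be smaller than chunk_size")
    step = chunk_size - overlap
    chunks = []
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break
    return chunks
```

Each chunk (with its source-document metadata) is then embedded and stored; the overlap is what lets a citation later point at a chunk that contains the full supporting passage.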

02

Retrieval with Row-Level Security

Semantic search over pgvector in Aurora PostgreSQL with multi-tenant isolation verified by automated tests. Metadata filters for your vertical or tenant.
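
As one illustration of how that isolation can be enforced in PostgreSQL, a Row-Level Security policy over a pgvector table might look like the following sketch. The schema, the `app.tenant_id` setting name, and the embedding dimension are hypothetical, not the actual rags.cc schema:

```sql
CREATE EXTENSION IF NOT EXISTS vector;

-- Hypothetical schema: every chunk row carries its tenant_id.
CREATE TABLE chunks (
    id          bigserial PRIMARY KEY,
    tenant_id   uuid   NOT NULL,
    document_id bigint NOT NULL,
    content     text   NOT NULL,
    embedding   vector(1024)       -- pgvector column
);

ALTER TABLE chunks ENABLE ROW LEVEL SECURITY;
ALTER TABLE chunks FORCE ROW LEVEL SECURITY;

-- Rows are visible only when tenant_id matches the tenant bound to the
-- connection, e.g. via SET app.tenant_id = '...'.
CREATE POLICY tenant_isolation ON chunks
    USING (tenant_id = current_setting('app.tenant_id')::uuid);

-- Semantic search: nearest neighbours by cosine distance.
-- RLS filters rows before the query ever sees them.
SELECT id, content
FROM chunks
ORDER BY embedding <=> $1   -- $1 = query embedding
LIMIT 8;
```

The point of `FORCE ROW LEVEL SECURITY` is that even the table owner cannot bypass the policy, which is what the automated isolation tests verify.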

03

Answers with Claude via Bedrock

Claude Haiku 4.5 / Sonnet 4.6 / Opus 4.6 with automatic cost-aware routing. Guardrails against cross-tenant leaks, prompt injection, and evidence-free responses. Real-time streaming.
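
Cost-aware routing can be as simple as defaulting to the cheapest model and escalating only when the query shows signals that demand more reasoning. A toy Python sketch with made-up heuristics and illustrative model names, not the actual routing logic:

```python
def route(query: str, context_chunks: int) -> str:
    """Pick the cheapest model tier that plausibly matches the query.
    Thresholds and keywords here are illustrative placeholders."""
    long_query = len(query.split()) > 60
    heavy_context = context_chunks > 12
    needs_reasoning = any(
        k in query.lower() for k in ("compare", "why", "explain", "summarize")
    )
    if heavy_context and needs_reasoning:
        return "claude-opus"        # largest model: rare, expensive queries
    if long_query or needs_reasoning:
        return "claude-sonnet"      # mid tier
    return "claude-haiku"           # default: cheapest tier
```

A production router would also weigh per-tenant budgets and observed answer quality, but the shape is the same: cost is a first-class input to model selection.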

04

Verifiable citations

Every answer links to the exact chunk in the source document. Forensic auditability reconstructible months later: who asked, which chunks were shown, which model answered, how much it cost.
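
A minimal shape for such an audit record, sketched in Python. Field names and the hashing choice are illustrative, not the actual rags.cc schema:

```python
import hashlib
import json
from dataclasses import asdict, dataclass

@dataclass(frozen=True)
class AuditRecord:
    """Everything needed to reconstruct a query months later:
    who asked, which chunks were shown, which model answered, the cost."""
    user_id: str
    tenant_id: str
    query: str
    chunk_ids: tuple[str, ...]
    model: str
    cost_usd: float
    timestamp: str

    def fingerprint(self) -> str:
        """Stable SHA-256 of the record, suitable for an append-only log."""
        payload = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()
```

Because the record is immutable and deterministically hashed, any later tampering with a stored entry is detectable by re-hashing it.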

05

Enterprise governance

Workspace RBAC, per-tenant policies, append-only audit log, per-query cost tracking, configurable retention, GDPR export, SAML/OIDC SSO, BYO KMS keys.

06

BYO-LLM and data residency

Plug in your own Anthropic, OpenAI, or private model credentials. Deploy in us-east-1 or eu-west-1. Zero vendor lock-in, zero transatlantic data transfer.

Who it's for

Built from day one for organizations that see their data as competitive advantage and legal obligation — not fuel for public SaaS.

Multinationals with regulated data

Banking, insurance, healthcare, energy, corporate legal. Industries under GDPR, HIPAA-adjacent, SOX, PCI-DSS. They need AI over their knowledge without a single record leaving their AWS perimeter.

Large enterprises with sensitive IP

Manufacturing, pharma, engineering, defense, strategy consulting. Patents, formulas, proprietary methods, strict NDAs. ChatGPT Enterprise is not legally viable.

Chains and franchises with proprietary methodology

50+ locations that need operational consistency, standardized training, and answers aligned with HQ methodology at every site. AI distributes the corporate know-how.

Mid-sized organizations scaling up

200–2000 employees with enterprise-equivalent compliance but without the Glean budget ($50k+ entry). The exact gap rags.cc fills.

Public and semi-public sector

Municipalities, regulators, public universities, public hospitals. Non-negotiable data sovereignty, tender budgets, multi-year contracts.

Teams already deep in AWS

Any organization that already chose AWS as its primary cloud and requires its AI tooling to respect its perimeter, IAM roles, KMS keys, and CloudTrail.

Who it's NOT for

  • Individual users chatting with personal notes (use NotebookLM)
  • Startups under 50 people with no regulatory requirements
  • Companies comfortable uploading data to ChatGPT Enterprise or Copilot
  • Teams not on AWS or with cloud-agnostic strategy
  • Cases where LLM creativity matters more than traceability

Deployment models

Two ways to keep your data protected

Pick the model that fits your risk profile, compliance, and budget. Both honor the fundamental rule: your content never trains a third party and inference never leaves AWS.

Starter · Pro · Business

Shared multi-tenant

Operated by rags.cc in AWS us-east-1

The fastest way to get started. Sign up, pay by card, start uploading documents in minutes. We run the infra.

  • Infra running in our dedicated rags.cc AWS account
  • Multi-tenant isolation with Row-Level Security verified by automated tests
  • Inference via Amazon Bedrock with private VPC endpoint (data never touches public internet)
  • Encryption at rest with KMS managed by rags.cc
  • Append-only audit log, exportable on request
  • Anthropic does NOT use your data to train (Bedrock policy)
  • rags.cc NEVER sells, transfers, or trains models with your content
  • Optional BYO-LLM on Business (your own Anthropic / OpenAI credentials)
Who it's for: Mid-sized companies, scaling organizations, and anyone that prioritizes time-to-value over owning the physical perimeter.
Enterprise+

Dedicated deployment in your AWS account

Your account. Your VPC. Your KMS. Your CloudTrail.

We deploy the rags.cc stack inside your AWS account via Terraform and a limited cross-account IAM role. Your data never exists outside your perimeter.

  • Infra inside the AWS account you authorize
  • Your data physically lives in your Aurora, your S3, your VPC
  • BYO KMS keys: the cryptographic root belongs to you, rotate whenever
  • Your CloudTrail sees every operation; your security team audits everything
  • rags.cc operates via temporary, auditable assume-role, with no direct data access
  • You can revoke our access at any moment without losing your data
  • SSO SAML/OIDC with your corporate IdP
  • Negotiable SLA, 24/7 support, dedicated CSM
Who it's for: Banking, insurance, healthcare, defense, public sector, regulated multinationals, contracts with mandatory external audit.
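
The cross-account arrangement can be sketched in Terraform. The account ID, role name, and external ID below are placeholders, not real values, and the actual module grants a far narrower permission set:

```hcl
# Illustrative only: a cross-account role your security team creates,
# scoped and revocable at will.
resource "aws_iam_role" "rags_operator" {
  name                 = "rags-cc-operator"
  max_session_duration = 3600  # temporary credentials only

  assume_role_policy = jsonencode({
    Version = "2012-10-17"
    Statement = [{
      Effect    = "Allow"
      Principal = { AWS = "arn:aws:iam::111111111111:root" } # placeholder rags.cc account
      Action    = "sts:AssumeRole"
      Condition = { StringEquals = { "sts:ExternalId" = "your-external-id" } }
    }]
  })
}
```

Revoking access is then a single operation on your side: delete the role or remove the trust statement, and every future assume-role call fails while your data stays where it is.
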
Quick comparison

                                          Shared multi-tenant   Dedicated (your AWS account)
Inference never touches public internet   yes                   yes
Data NOT used to train third parties      yes                   yes
Row-Level Security (multi-tenant)         yes                   n/a (single-tenant)
KMS encryption at rest                    rags.cc KMS           your KMS (BYO)
CloudTrail visible to you                 reports               yes, direct
Data in your AWS account                  no                    yes
Revoke access in 1 click                  no                    yes
Minimum contract                          monthly               annual
Time to start                             < 1 hour              2–4 weeks
Price range                               $99–$1,499 USD/mo     $5k–$25k+ USD/mo

Why rags.cc

Private by design

Everything runs in your AWS VPC. Bedrock via private endpoint. Zero egress.

Verifiable citations

Every answer links to the exact chunk. Forensic auditability months later.

True multi-tenant

Row-Level Security in PostgreSQL. Isolation verified by automated tests.

LLM replaceability

Decoupled Model Gateway. Switch from Claude to Llama without rewriting the product.
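
The gateway idea reduces to a narrow interface that every model adapter implements, so callers never depend on a specific vendor. A hedged Python sketch; class and method names are illustrative, and the real adapters would call Bedrock or a self-hosted endpoint where the stubs return canned strings:

```python
from typing import Protocol

class ChatModel(Protocol):
    """The only surface the product depends on."""
    def complete(self, prompt: str, max_tokens: int = 512) -> str: ...

class ClaudeAdapter:
    def complete(self, prompt: str, max_tokens: int = 512) -> str:
        # would call Bedrock InvokeModel here
        return f"[claude] {prompt[:20]}"

class LlamaAdapter:
    def complete(self, prompt: str, max_tokens: int = 512) -> str:
        # would call a self-hosted inference endpoint here
        return f"[llama] {prompt[:20]}"

def answer(model: ChatModel, question: str) -> str:
    """Callers take the interface, never a concrete vendor class."""
    return model.complete(question)
```

Swapping Claude for Llama then means writing one new adapter, not touching retrieval, citations, or the audit trail.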

Simple public pricing

All prices in USD. No surprises. Monthly or annual billing with a discount.

Starter

$99 USD/mo
  • 1 workspace
  • 500 documents
  • 1,000 queries / mo
  • 10M LLM tokens / mo
  • Claude Haiku 4.5
  • 30-day audit log
★ Popular

Pro

$499 USD/mo
  • 5 workspaces
  • 5,000 documents
  • 10,000 queries / mo
  • 50M LLM tokens / mo
  • Haiku 4.5 + Sonnet 4.6
  • 90-day audit log
  • 99.5% SLA

Business

$1,499 USD/mo
  • 25 workspaces
  • 50,000 documents
  • 50,000 queries / mo
  • 200M LLM tokens / mo
  • Haiku + Sonnet + Opus 4.6
  • BYO-LLM
  • 1-year audit log
  • 99.9% SLA

Prices shown in US dollars (USD). Taxes not included. Monthly or annual billing with a discount. Enterprise tier with SSO, dedicated deployment, and 99.95% SLA available under contract.