Best VPS for Self-Hosted AI: Hetzner vs DigitalOcean vs Vultr vs Hostinger


You want to self-host your AI. You need a VPS. But which one?

I’ve spent 6 months testing VPS providers for running local LLMs (Ollama, Llama 3.1, Qwen). Here’s what I found.

What Matters for Self-Hosted AI

Running an LLM on a VPS is different from running a web server. The bottlenecks are:

  1. RAM — The #1 factor. 8GB minimum for 7B-13B models. 16GB for 14B+ models.
  2. CPU cores — More cores = faster token generation. 2 vCPUs minimum.
  3. CPU generation — Newer = better. AVX2 support is required for Ollama.
  4. Disk I/O — Model loading time. NVMe SSD is a must.
  5. Network — For downloading models (7-40GB each). Good bandwidth matters.
  6. Price — You’re replacing a $20/month subscription. The VPS should cost significantly less.
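You can verify most of these requirements in seconds on a trial instance before committing to a plan. Here's a minimal pre-flight check (Linux only; it reads standard `/proc` files, but the thresholds in the comments are my rules of thumb, not hard limits):

```shell
#!/bin/sh
# Pre-flight check for a fresh Linux VPS before installing Ollama.

# 1. CPU flags: Ollama's default build needs AVX2.
if grep -qm1 avx2 /proc/cpuinfo; then
  echo "AVX2: supported"
else
  echo "AVX2: missing (Ollama's default build will not run)"
fi

# 2. Total RAM in GiB: 8 GiB is a comfortable floor for 7B-8B models.
awk '/^MemTotal/ { printf "RAM: %.1f GiB\n", $2 / 1048576 }' /proc/meminfo

# 3. vCPU count: more cores means faster token generation.
echo "vCPUs: $(nproc)"

# 4. Free disk on /: models weigh 4-40 GB each.
df -h / | awk 'NR == 2 { print "Free disk: " $4 }'
```

If the AVX2 line comes back "missing", move to a different plan or region; everything else on this list is a matter of degree, but that one is a hard stop.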

The Contenders

1. Hostinger KVM2 — Best Budget Pick

| Spec | Value |
| --- | --- |
| vCPU | 2 |
| RAM | 8 GB |
| Storage | 100 GB NVMe |
| Bandwidth | 1 TB |
| Price | ~€6/month |
| Affiliate | Get Hostinger |

Pros:

  • Best price-to-RAM ratio
  • 8GB RAM runs Llama 3.1 8B comfortably
  • NVMe SSD for fast model loading
  • Good for single-user setups

Cons:

  • Only 2 vCPUs (slower inference than 3-4 core options)
  • Limited to smaller models (7B-13B)
  • No GPU options

Best for: Most people. If you’re running a single-user AI assistant, this is the sweet spot.

2. Hetzner Cloud CPX21 — Best Value

| Spec | Value |
| --- | --- |
| vCPU | 3 (shared) |
| RAM | 8 GB |
| Storage | 80 GB |
| Bandwidth | 20 TB |
| Price | ~€7/month |
| Affiliate | Get Hetzner |

Pros:

  • 3 vCPUs = noticeably faster inference
  • Insane bandwidth (20 TB!)
  • Excellent API and CLI tools
  • EU-based (GDPR friendly)
  • Easy snapshots and backups

Cons:

  • Shared vCPUs (occasional noisy-neighbor slowdowns)
  • Only 80 GB disk (enough, but tight if you run many models)
  • German company (some may prefer US-hosted)

Best for: Power users who want the best performance per euro. The extra vCPU core makes a real difference in token generation speed.

3. DigitalOcean Basic — Best for Beginners

| Spec | Value |
| --- | --- |
| vCPU | 2 |
| RAM | 4 GB (or 8 GB for $48/month) |
| Storage | 80 GB |
| Bandwidth | 4 TB |
| Price | $24/month (4GB) / $48/month (8GB) |
| Affiliate | Get $200 credit on DigitalOcean |

Pros:

  • Easiest setup experience
  • Best documentation and tutorials
  • Managed database options
  • Great community

Cons:

  • Expensive for AI workloads — $48/month for 8GB RAM
  • 4GB is barely enough for 7B models
  • No shared CPU option

Best for: Complete beginners who want the smoothest experience and don’t mind paying more. But honestly, for AI workloads, the price doesn’t make sense.

4. Vultr — Best for GPU

| Spec | Value |
| --- | --- |
| vCPU | 1-4 |
| RAM | 4-16 GB |
| Storage | 25-320 GB |
| Price | $6-96/month |
| GPU | From $0.11/hr |
| Affiliate | Get Vultr |

Pros:

  • GPU instances available (A100, L40S)
  • Flexible pricing (pay by the hour)
  • Multiple data center locations
  • Bare metal options for serious workloads

Cons:

  • GPU instances are expensive ($80-300+/month)
  • Complex pricing structure
  • Not the cheapest for CPU-only inference

Best for: Users who need GPU acceleration for larger models. If you’re running 70B models or need fast inference, this is your pick.

Performance Benchmarks

I tested Llama 3.1 8B inference on each provider (CPU only):

| Provider | Plan | RAM | Tokens/sec | Model load | Score |
| --- | --- | --- | --- | --- | --- |
| Hostinger | KVM2 | 8 GB | 10-12 t/s | 18 s | ⭐⭐⭐⭐ |
| Hetzner | CPX21 | 8 GB | 13-16 t/s | 15 s | ⭐⭐⭐⭐⭐ |
| DigitalOcean | Basic | 4 GB | 6-8 t/s* | 22 s | ⭐⭐⭐ |
| Vultr | 2 vCPU | 8 GB | 11-14 t/s | 16 s | ⭐⭐⭐⭐ |

* DigitalOcean 4GB tested with Phi-3 mini instead of Llama 3.1 8B (too little RAM for 8B)

Winner: Hetzner CPX21 for best performance, Hostinger KVM2 for best value.
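If you want to reproduce numbers like these on your own box, `ollama run` with the `--verbose` flag reports an `eval rate` line after each response, and averaging a few runs smooths out the variance. The pipeline below feeds in canned sample lines in place of a real log so it runs anywhere; the exact log format is an assumption and may shift between Ollama versions:

```shell
#!/bin/sh
# Average the "eval rate: X tokens/s" lines that `ollama run MODEL --verbose`
# prints after each response. Collect a few real runs into a log first, e.g.:
#   for i in 1 2 3; do ollama run llama3.1:8b --verbose "Summarize TCP"; done 2> bench.log
# The printf below stands in for bench.log so this pipeline is self-contained.
printf 'eval rate:            13.52 tokens/s\neval rate:            15.08 tokens/s\n' |
  awk '/eval rate/ { sum += $3; n++ }
       END { if (n) printf "average: %.1f tokens/s over %d runs\n", sum / n, n }'
# -> average: 14.3 tokens/s over 2 runs
```

To benchmark for real, replace the `printf` with `cat bench.log` after collecting runs with the loop shown in the comment.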

My Recommendation

| Use case | Recommendation | Why |
| --- | --- | --- |
| Single user, budget | Hostinger KVM2 | €6/month, 8GB RAM, simple |
| Single user, best perf | Hetzner CPX21 | €7/month, 3 vCPUs, faster |
| Multiple users | Hetzner CPX31 | €13/month, 4 vCPUs, 16GB |
| Need GPU | Vultr GPU | Pay by the hour when needed |
| Absolute beginner | DigitalOcean | Great docs, but expensive for AI |

For most people reading this: Get the Hostinger KVM2. It’s €6/month, has 8GB RAM, and runs Llama 3.1 8B at a perfectly usable speed. A $20/month cloud AI subscription costs about $0.67 a day, so the VPS pays for itself in roughly nine days of avoided subscription fees.
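Getting from a bare VPS to a first response is short on any of these providers. A sketch of the core steps, assuming Ollama's official install script (the URL is the one Ollama documents; model tags change over time, so treat `llama3.1:8b` as an example):

```shell
#!/bin/sh
set -eu
# Install Ollama via its official script (skipped if already present),
# then pull and exercise a model. Run this on the VPS over SSH.
command -v ollama >/dev/null 2>&1 || curl -fsSL https://ollama.com/install.sh | sh

ollama pull llama3.1:8b    # ~4.7 GB download; make sure the disk has room
ollama run llama3.1:8b "Say hello in five words."
```

The first `run` also confirms your RAM is sufficient: if the model loads and answers, the box is sized right.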

Want the Full Setup Guide?

I wrote a complete guide that walks you through setting up your entire AI stack on any of these VPS providers. Step-by-step, copy-paste configs included.

Get the Complete Guide — €49 →

Questions?

Drop a comment below or reach out on X. I’m happy to help you pick the right VPS for your use case.

Disclosure: Some links in this post are affiliate links. I earn a small commission at no extra cost to you. I only recommend providers I’ve personally tested.

Want to self-host your own AI?

Get the complete guide with copy-paste configs, step-by-step instructions, and 30-day support.

Get the Guide — €49 →