About HardwareHQ

Free tools to help you run AI locally. No signup, no tracking, no BS.

Why This Exists

Every day, thousands of people ask the same question: "Can I run [model] on my [GPU]?"

The answer is usually scattered across Reddit threads, GitHub issues, and Discord servers. We built HardwareHQ to be the single source of truth for AI hardware compatibility.

All our tools are free. We make money through affiliate links when you buy hardware (clearly disclosed), but the tools work exactly the same whether you click those links or not.

What We Offer

280+ AI Models

VRAM requirements for every quantization level. Updated as new models are released.

280+ GPUs & Hardware

Consumer and enterprise specs, from the RTX 4060 to the H100.

Instant Compatibility Checks

Know in seconds if a model fits your VRAM. No guessing.
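As a rough illustration of what such a check involves, a common back-of-the-envelope estimate multiplies parameter count by bytes per weight at the chosen quantization, then adds headroom for activations and KV cache. This is a sketch, not HardwareHQ's actual formula; the 20% overhead factor is an assumption.

```python
def estimate_vram_gb(params_billion: float, bits: int, overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB: weights at the given bit width,
    times a fudge factor for activations/KV cache.
    The 1.2 overhead is an assumption, not HardwareHQ's formula."""
    weight_gb = params_billion * bits / 8  # GB occupied by the weights alone
    return weight_gb * overhead

# e.g. a 7B model at 4-bit quantization needs roughly 4.2 GB,
# so it fits comfortably on an 8 GB card:
# estimate_vram_gb(7, 4)
```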

Daily Updates

Automated pipeline keeps data fresh. New models added within days of release.

Transparency

No Account Required

All tools work without signup. We don't collect personal data or require authentication.

Open Source

The code is public. You can verify exactly what we do with your data (nothing).


Affiliate Disclosure

Some hardware links are affiliate links. We earn a small commission if you purchase, at no extra cost to you. This helps keep the tools free. Affiliate links are always marked and never influence our recommendations.

Public API

All data is available via REST endpoints. Free for non-commercial use.

GET /api/hardware-database - Hardware specs
GET /api/ai-models - AI model data
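The endpoints above can be queried with any HTTP client. A minimal sketch using Python's standard library; the base URL is a placeholder (substitute the actual HardwareHQ host), and the response is assumed to be JSON:

```python
import json
import urllib.request

# Placeholder host -- substitute the actual HardwareHQ domain.
BASE_URL = "https://example.com"

def api_url(endpoint: str) -> str:
    """Build the full URL for one of the documented endpoints."""
    return f"{BASE_URL}{endpoint}"

def fetch(endpoint: str):
    """GET an endpoint and return the decoded JSON payload."""
    with urllib.request.urlopen(api_url(endpoint)) as resp:
        return json.loads(resp.read().decode("utf-8"))

# The two documented endpoints:
# fetch("/api/hardware-database")  # hardware specs
# fetch("/api/ai-models")          # AI model data
```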

Contact

Found a bug? Have a feature request? Want to contribute data?

Built by hardware enthusiasts who got tired of Googling VRAM requirements.