AI Model Hardware Requirements
What hardware do you actually need to run today's most popular AI models locally? We break down VRAM requirements, recommended GPUs, and compatible tools for each model — from lightweight 7B chatbots to 405B research monsters.
Llama 4
Llama 4 Scout 8B
8B parameters · 5 GB – 16 GB VRAM
General-purpose chat, coding assistance, instruction following
Llama 4 Maverick 70B
70B parameters · 40 GB – 140 GB VRAM
Advanced reasoning, long-context analysis, complex coding tasks
Llama 4 Behemoth 405B
405B parameters · 230 GB – 810 GB VRAM
Research-grade reasoning, multi-step problem solving (multi-GPU required)
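Where do these VRAM ranges come from? To a first approximation, weight memory is parameter count times bytes per weight: the upper bounds above correspond to 16-bit weights, and the lower bounds to 4-bit quantization plus some runtime overhead. A minimal sketch in Python (the function name and overhead factor are our own illustration, not an official formula; KV cache and activations add more on top of weights):

```python
def estimate_vram_gb(params_billions: float, bits_per_weight: int,
                     overhead: float = 1.0) -> float:
    """Rough weights-only VRAM estimate in GB.

    params_billions: model size in billions of parameters
    bits_per_weight: 16 for fp16/bf16, 8 for int8, 4 for 4-bit quants
    overhead: multiplier for runtime extras (KV cache, framework buffers);
              roughly 1.1-1.3 is a common rule of thumb
    """
    return params_billions * bits_per_weight / 8 * overhead

# The upper bounds in the cards above match fp16 weights with no overhead:
print(estimate_vram_gb(8, 16))    # 16.0 GB
print(estimate_vram_gb(70, 16))   # 140.0 GB
print(estimate_vram_gb(405, 16))  # 810.0 GB

# 4-bit weights plus ~15% overhead land near the lower bounds:
print(round(estimate_vram_gb(70, 4, overhead=1.15)))  # ~40 GB
```

This is a sizing heuristic, not a guarantee: actual usage depends on the quantization format, context length, and inference runtime.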