VRAM
Video RAM — the dedicated memory on a GPU. VRAM is the single most important spec for AI hardware because the entire model (or a quantized version of it) must fit in it for GPU-accelerated inference. 8 GB runs small models (up to ~7B parameters), 16 GB handles mid-range models, 24 GB is the sweet spot for serious local AI, and 48 GB+ opens up 70B-parameter models. When in doubt, buy more VRAM.
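The sizing rules above follow from simple arithmetic: weights occupy roughly (parameter count × bytes per parameter), plus some headroom for the KV cache and runtime buffers. A minimal sketch, where the 4-bit default and the ~20% overhead factor are assumptions, not fixed properties of any particular runtime:

```python
def estimate_vram_gb(params_billion: float,
                     bits_per_param: float = 4.0,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate in GB for running a model.

    Weights alone: params * (bits / 8) bytes, so a 7B model at 4-bit
    quantization needs ~3.5 GB. The overhead multiplier (~20%, an
    assumed figure) covers KV cache and runtime buffers.
    """
    weight_gb = params_billion * (bits_per_param / 8)
    return weight_gb * overhead

# 7B at 4-bit: ~4.2 GB -> comfortable in 8 GB of VRAM
print(round(estimate_vram_gb(7), 1))

# 70B at 4-bit: ~42 GB -> why 70B models need a 48 GB card
print(round(estimate_vram_gb(70), 1))
```

Longer context windows grow the KV cache, so treat the overhead factor as a floor rather than an exact figure.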