Batch Size

The number of input samples processed simultaneously during training or inference. Larger batch sizes increase GPU utilization and throughput but require more VRAM. During training, batch size directly affects memory requirements: doubling the batch size roughly doubles the activation memory, since the activations saved for the backward pass scale linearly with the number of samples. For inference servers handling multiple users, batch size determines how many concurrent requests a GPU can serve at once.
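The linear relationship between batch size and activation memory can be sketched with a back-of-the-envelope estimate. This is an illustrative approximation, not a precise accounting: the function name, the assumption of one saved activation tensor per layer, and the example dimensions are all hypothetical, and real totals depend on the architecture, optimizer state, and whether activation checkpointing is used.

```python
def activation_memory_bytes(batch_size, seq_len, hidden_dim, num_layers, dtype_bytes=2):
    # Rough estimate: assume one activation tensor of shape
    # (batch_size, seq_len, hidden_dim) is saved per layer for the
    # backward pass, stored in a 2-byte dtype (e.g. fp16/bf16).
    return batch_size * seq_len * hidden_dim * num_layers * dtype_bytes

# Hypothetical transformer-like dimensions for illustration.
small = activation_memory_bytes(batch_size=8, seq_len=2048, hidden_dim=4096, num_layers=32)
large = activation_memory_bytes(batch_size=16, seq_len=2048, hidden_dim=4096, num_layers=32)

print(f"batch 8:  {small / 2**30:.1f} GiB")
print(f"batch 16: {large / 2**30:.1f} GiB")
print(f"ratio: {large / small:.1f}")  # doubling the batch doubles activation memory
```

Under these assumptions the ratio is exactly 2.0, which is why out-of-memory errors during training are most often resolved by reducing the batch size (optionally recovering the effective batch via gradient accumulation).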
