Current LLaMA Memory Usage

LLaMA and VRAM

LLaMA-65B and 70B perform optimally when paired with GPUs providing a minimum of 40GB of VRAM. More than 48GB of VRAM will be needed for 32k context, as 16k is the maximum that fits in 2x 4090 (2x 24GB); see here.
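As a rough sanity check on those figures, total VRAM is approximately the model weights plus the KV cache, and the KV cache grows linearly with context length. Below is a minimal sketch of that arithmetic; the function name is mine, and the architecture defaults are assumptions based on Llama-2-70B (80 layers, 8 grouped-query KV heads, head dim 128). Activation and framework overhead are ignored.

# Rough VRAM estimate for a LLaMA-style model: weights + KV cache.
# Defaults are assumed Llama-2-70B values; adjust for other models.

def estimate_vram_gb(
    n_params_b: float,       # parameter count in billions
    bytes_per_param: float,  # 2.0 for fp16, ~0.5 for 4-bit quantization
    context_len: int,        # tokens held in the KV cache
    n_layers: int = 80,
    n_kv_heads: int = 8,     # grouped-query attention KV heads (assumed)
    head_dim: int = 128,
    kv_bytes: float = 2.0,   # fp16 KV cache
) -> float:
    weights = n_params_b * 1e9 * bytes_per_param
    # KV cache: 2 tensors (K and V) per layer, per KV head,
    # per head dimension, per token in the context window
    kv_cache = 2 * n_layers * n_kv_heads * head_dim * kv_bytes * context_len
    return (weights + kv_cache) / 1e9

if __name__ == "__main__":
    for ctx in (4096, 16384, 32768):
        gb = estimate_vram_gb(70, 0.5, ctx)  # 70B weights at 4-bit
        print(f"70B 4-bit, {ctx:>5} ctx: ~{gb:.0f} GB")

Under these assumptions a 4-bit 70B model takes roughly 35GB for weights plus about 0.3MB of KV cache per token, so 16k context lands near 40GB (inside 2x 24GB) while 32k pushes past 45GB before any overhead, which is consistent with the numbers quoted above.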

Hans-Ekbrand Comment

Hans-ekbrand commented on Jul 20, 2023.

