Description from extension meta
The best VRAM Calculator for Hugging Face models
Image from store
Description from store
It can be pretty hard to figure out what hardware you need to run LLMs and other AI models.
Now you can easily see the requirements to run inference on or fine-tune a model. If it doesn't fit your hardware, you will even get suggestions on how to get it working anyway (e.g. quantization or QLoRA).
This tool is a browser extension for the Hugging Face website. It is still a beta version, so please leave feedback if you can think of improvements.
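For context, here is a minimal sketch of the kind of back-of-the-envelope estimate such a calculator makes. The function name, the 20% overhead factor, and the bytes-per-parameter figures are illustrative assumptions, not the extension's actual formula.

```python
def estimate_inference_vram_gb(num_params_billions: float,
                               bytes_per_param: float = 2.0,   # FP16/BF16 weights (assumption)
                               overhead_factor: float = 1.2) -> float:
    """Rough VRAM needed to load a model for inference, in GB (illustrative only)."""
    weight_gb = num_params_billions * bytes_per_param  # 1e9 params * N bytes ~= N GB
    return weight_gb * overhead_factor                 # add headroom for KV cache, activations, etc.

# Example: a 7B model in FP16 vs. 4-bit quantization
print(f"7B FP16 : ~{estimate_inference_vram_gb(7):.1f} GB")       # ~16.8 GB
print(f"7B 4-bit: ~{estimate_inference_vram_gb(7, 0.5):.1f} GB")  # ~4.2 GB
```

This is why the extension can suggest quantization or QLoRA when a model does not fit: lowering the bytes per parameter shrinks the footprint several-fold.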
Statistics
Installs: 21
Category:
Rating: 5.0 (5 votes)
Last update / version: 2025-05-28 / 0.5
Listing languages: en