Update src/app.py
src/app.py  +3 -12  CHANGED
@@ -37,22 +37,13 @@ def get_results(model_name: str, library: str, precision: list, training: list,
 with gr.Blocks() as demo:
     with gr.Column():
         gr.Markdown(
-            """<img src="https://huggingface.co/spaces/andstor/model-memory-usage/resolve/main/
+            """<img src="https://huggingface.co/spaces/andstor/model-memory-usage/resolve/main/measure_model_size_deepspeed.svg" style="float: left;" width="250" height="250"><h1>🤗 DeepSpeed Model Memory Calculator</h1>
 
-            This tool will help you calculate how much
-            on a model hosted on the 🤗 Hugging Face Hub. The minimum recommended vRAM needed for a model
-            is denoted as the size of the "largest layer", and training of a model is roughly 4x its size (for Adam).
-
-            These calculations are accurate within a few percent at most, such as `bert-base-cased` being 413.68 MB and the calculator estimating 413.18 MB.
-
-            When performing inference, expect to add up to an additional 20% to this as found by [EleutherAI](https://blog.eleuther.ai/transformer-math/).
-            More tests will be performed in the future to get a more accurate benchmark for each model.
-
-            Currently this tool supports all models hosted that use `transformers` and `timm`.
+            This tool will help you calculate how much memory is required for the various Zero Redundancy Optimizer (ZeRO) stages, given a model hosted on the 🤗 Hugging Face Hub and a hardware setup. The optimizer states assume that Adam is used.
 
             To use this tool pass in the URL or model name of the model you want to calculate the memory usage for,
             select which framework it originates from ("auto" will try and detect it from the model metadata), and
-            what precisions you want to use."""
+            what precisions you want to use. Then select the desired ZeRO configuration."""
         )
         out_text = gr.Markdown()
         out = gr.DataFrame(
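For reference, DeepSpeed ships estimator utilities that produce the kind of per-stage ZeRO model-state figures the new description refers to. The snippet below is only a sketch of that workflow, not the code this Space uses in src/app.py; the model name and the single-node, 2-GPU hardware setup are placeholder assumptions.

# Sketch only: reproduce ZeRO-2 / ZeRO-3 model-state memory estimates with
# DeepSpeed's documented helpers (assumes torch, transformers and deepspeed
# are installed; not necessarily what src/app.py does internally).
from transformers import AutoModel
from deepspeed.runtime.zero.stage_1_and_2 import estimate_zero2_model_states_mem_needs_all_live
from deepspeed.runtime.zero.stage3 import estimate_zero3_model_states_mem_needs_all_live

# Placeholder Hub model; any transformers checkpoint works the same way.
model = AutoModel.from_pretrained("bert-base-cased")

# Placeholder hardware setup: one node with 2 GPUs. Each call prints the
# per-GPU and CPU memory needed for parameters, gradients and Adam optimizer
# states under the corresponding ZeRO stage and offload options.
estimate_zero2_model_states_mem_needs_all_live(model, num_gpus_per_node=2, num_nodes=1)
estimate_zero3_model_states_mem_needs_all_live(model, num_gpus_per_node=2, num_nodes=1)

These estimators reflect the same assumption stated in the updated description: optimizer-state sizing is computed for Adam.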