Instructions for using Turkish-NLP/t5-efficient-small-turkish with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Turkish-NLP/t5-efficient-small-turkish with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Turkish-NLP/t5-efficient-small-turkish")
model = AutoModelForSeq2SeqLM.from_pretrained("Turkish-NLP/t5-efficient-small-turkish")
```
- Notebooks
- Google Colab
- Kaggle
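Once the tokenizer and model are loaded as shown above, text can be generated with the standard `generate` API. The sketch below is illustrative: the Turkish input sentence and the generation settings are assumptions, not taken from the model card, and PyTorch is assumed to be installed.

```python
# Minimal inference sketch for a T5-style seq2seq model.
# The input sentence and max_new_tokens value are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Turkish-NLP/t5-efficient-small-turkish")
model = AutoModelForSeq2SeqLM.from_pretrained("Turkish-NLP/t5-efficient-small-turkish")

# Tokenize a Turkish input and generate a continuation.
inputs = tokenizer("Merhaba dünya", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Note that this checkpoint is a pretrained T5 encoder-decoder; for a specific task (summarization, translation, etc.) it would typically be fine-tuned first, so raw generations may not be meaningful.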
- Xet hash: 6e97d40b290d128947ea524a07c6754757345a26bfb4f27068460e46624ea7d1
- Size of remote file: 569 MB
- SHA256: 13d4ab924e42f59e10778ca4a49aa0d9c3e83f3eb16cbe98903da96b13715aa6
Xet efficiently stores large files inside Git by splitting them into unique, deduplicated chunks, accelerating both uploads and downloads.
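The SHA256 digest listed above can be used to verify a downloaded checkpoint locally. The helper below is a generic sketch: it streams the file in chunks so a 569 MB weights file never has to fit in memory at once. The filename `model.safetensors` in the commented usage line is an assumption, not taken from the repository listing.

```python
import hashlib

def sha256sum(path, chunk_size=1 << 20):
    """Stream a file through SHA-256 in 1 MiB chunks to avoid loading it whole."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

# Expected digest published for the remote file (from the listing above).
EXPECTED = "13d4ab924e42f59e10778ca4a49aa0d9c3e83f3eb16cbe98903da96b13715aa6"

# After downloading the checkpoint (filename is an assumption), compare:
# assert sha256sum("model.safetensors") == EXPECTED
```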