Instructions for using lukeleeai/t5-base_multirc_ with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use lukeleeai/t5-base_multirc_ with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="lukeleeai/t5-base_multirc_")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("lukeleeai/t5-base_multirc_")
model = AutoModelForSequenceClassification.from_pretrained("lukeleeai/t5-base_multirc_")
```
- Notebooks
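MultiRC examples pair a passage with a question and a candidate answer, so a text-classification checkpoint like this one expects the three fields flattened into a single input string before being passed to the pipeline. The exact template used during fine-tuning is not documented on the model page, so the layout below is an assumption for illustration only:

```python
# Hypothetical input formatting for a MultiRC-style classifier.
# The exact template this checkpoint was fine-tuned with is not
# documented on the model page; this field order is an assumption.
def format_multirc_input(paragraph: str, question: str, answer: str) -> str:
    return f"question: {question} answer: {answer} paragraph: {paragraph}"

text = format_multirc_input(
    paragraph="The quick brown fox jumps over the lazy dog.",
    question="What does the fox jump over?",
    answer="the lazy dog",
)
print(text)

# The resulting string would then be scored with the pipeline loaded above,
# e.g. pipe(text), which returns a list of {"label": ..., "score": ...} dicts.
```

If the checkpoint's labels look wrong for your inputs, check the model card or the fine-tuning code for the actual serialization format; T5-style models are sensitive to the prompt template they were trained on.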
- Google Colab
- Kaggle
- Xet hash: c2f85a299953ede05c2e3b7246ea944d45df18af270f8337e1a4a3abcbd2c0ba
- Size of remote file: 4.06 GB
- SHA256: 4a1f4e654d8ab7cafd61f20976cde5b3d0deab7ded1d60a5c0df5bbd4aa4ee9b
Xet efficiently stores large files inside Git by splitting them into unique chunks, which accelerates uploads and downloads.