Instructions for using Luyu/condenser with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use Luyu/condenser with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="Luyu/condenser")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("Luyu/condenser")
model = AutoModelForMaskedLM.from_pretrained("Luyu/condenser")
```

- Notebooks
- Google Colab
- Kaggle
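As a minimal sketch of the pipeline usage above: Condenser is a BERT-style masked-language model, so the pipeline fills in the `[MASK]` token and returns candidate tokens with scores. The example sentence here is hypothetical, chosen only for illustration.

```python
from transformers import pipeline

# Build the fill-mask pipeline for Luyu/condenser (downloads weights on first use).
pipe = pipeline("fill-mask", model="Luyu/condenser")

# BERT-style models use the literal [MASK] token as the blank to fill.
results = pipe("Paris is the [MASK] of France.")

# Each result is a dict with the predicted token and its score.
for r in results:
    print(r["token_str"], round(r["score"], 3))
```

Each entry in `results` also includes the full filled-in `sequence`, which is convenient when post-processing the predictions.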