The model is initialized with nomic-ai/modernbert-embed-base-unsupervised.
Fine-tuning details:
- Training steps: 10K
- Training data: Contrastive training on Tevatron/msmarco-passage-new (without titles)
- Batch size: 32 * 2
- Training group size: 8
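
The training group size of 8 means each query is scored against one positive passage and seven negatives. Below is a minimal sketch of the group-wise InfoNCE objective that Tevatron-style trainers use with these settings; the function name, `group_size`, and `temperature` parameters are illustrative, not this model's exact training code.

```python
import torch
import torch.nn.functional as F

def contrastive_loss(q_emb: torch.Tensor, p_emb: torch.Tensor,
                     group_size: int = 8, temperature: float = 1.0) -> torch.Tensor:
    """q_emb: (B, D) query embeddings; p_emb: (B * group_size, D) passage
    embeddings, where the first passage in each group is the positive."""
    q_emb = F.normalize(q_emb, dim=-1)
    p_emb = F.normalize(p_emb, dim=-1)
    # Score every query against every passage in the batch,
    # so other groups also act as in-batch negatives.
    scores = q_emb @ p_emb.T / temperature  # (B, B * group_size)
    # The positive for query i sits at column i * group_size.
    targets = torch.arange(q_emb.size(0), device=q_emb.device) * group_size
    return F.cross_entropy(scores, targets)
```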
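
A minimal usage sketch, assuming the checkpoint is published as a sentence-transformers model like its nomic-ai base; the repo id below is a placeholder for this model's actual Hugging Face id.

```python
from sentence_transformers import SentenceTransformer

# Placeholder repo id; substitute this model's actual Hugging Face id.
model = SentenceTransformer("your-org/your-model")

queries = ["what is contrastive learning?"]
passages = ["Contrastive learning trains encoders to pull matching "
            "query-passage pairs together and push mismatches apart."]

q_emb = model.encode(queries, normalize_embeddings=True)
p_emb = model.encode(passages, normalize_embeddings=True)
print(q_emb @ p_emb.T)  # cosine similarity, since embeddings are normalized
```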