Instructions for using minsik-oh/dpo-model-sample with libraries, inference providers, notebooks, and local apps. Use the sections below to get started.
- Libraries
  - PEFT

How to use minsik-oh/dpo-model-sample with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

# Load the base model, then attach the DPO adapter weights on top of it.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "minsik-oh/dpo-model-sample")
```

A generation sketch with the loaded adapter follows the notebook list below.

- Notebooks
  - Google Colab
  - Kaggle
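Once the adapter is attached, the combined model can be used like any `transformers` causal LM. Below is a minimal generation sketch; it assumes the adapter reuses the base model's tokenizer, and the prompt and `max_new_tokens` value are illustrative only.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model and attach the adapter, as in the PEFT snippet above.
base_model = AutoModelForCausalLM.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "minsik-oh/dpo-model-sample")

# Assumption: the adapter was trained with the base model's tokenizer.
tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.1-8B-Instruct")

# Run a quick generation to confirm the adapter-augmented model works.
inputs = tokenizer("What is direct preference optimization?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

If the adapter is LoRA-based, `model.merge_and_unload()` can fold the adapter weights into the base model for deployment, so inference no longer requires PEFT at runtime.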