Lion: Adversarial Distillation of Proprietary Large Language Models (EMNLP 2023)

arXiv: https://arxiv.org/abs/2305.12870
GitHub: https://github.com/YJiangcm/Lion

Note: To comply with the LLaMA model license, we release Lion weights as delta weights.
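To recover the full Lion model, the delta weights must be merged with the original LLaMA-7B weights. The sketch below illustrates one way this kind of recovery is commonly done, assuming the deltas are plain parameter-wise differences (lion = llama + delta); the paths and merge logic here are illustrative assumptions, and the recovery script in the GitHub repository should be used as the authoritative procedure.

```python
# Minimal sketch: merge released delta weights into base LLaMA-7B weights.
# Assumption: deltas are parameter-wise differences (lion = llama + delta).
# Prefer the official recovery script from the Lion repository in practice.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base_path = "path/to/llama-7b"        # original LLaMA-7B weights (obtained separately)
delta_path = "YuxinJiang/lion-7b"     # released delta weights
output_path = "path/to/lion-7b"       # where the recovered model is saved

base = AutoModelForCausalLM.from_pretrained(base_path, torch_dtype=torch.float16)
delta = AutoModelForCausalLM.from_pretrained(delta_path, torch_dtype=torch.float16)

base_sd = base.state_dict()
delta_sd = delta.state_dict()

# Add each delta tensor to the corresponding base parameter in place.
with torch.no_grad():
    for name in base_sd:
        base_sd[name].add_(delta_sd[name])

base.save_pretrained(output_path)
AutoTokenizer.from_pretrained(delta_path).save_pretrained(output_path)
```

After recovery, the model can be loaded for inference like any standard causal LM via `AutoModelForCausalLM.from_pretrained(output_path)`.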
