- CroissantLLM: A Truly Bilingual French-English Language Model
  Paper • 2402.00786 • Published • 26
- croissantllm/CroissantLLMChat-v0.1
  Text Generation • 1B • Updated • 885 • 52
- croissantllm/CroissantLLMBase
  Text Generation • Updated • 277 • 33
- croissantllm/croissant_dataset
  Viewer • Updated • 16.7B • 1.96k • 7
Collections including paper arxiv:2402.00786
- DocLLM: A layout-aware generative language model for multimodal document understanding
  Paper • 2401.00908 • Published • 189
- Lightning Attention-2: A Free Lunch for Handling Unlimited Sequence Lengths in Large Language Models
  Paper • 2401.04658 • Published • 27
- Weaver: Foundation Models for Creative Writing
  Paper • 2401.17268 • Published • 45
- Efficient Tool Use with Chain-of-Abstraction Reasoning
  Paper • 2401.17464 • Published • 21
- Attention Is All You Need
  Paper • 1706.03762 • Published • 109
- BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
  Paper • 1810.04805 • Published • 25
- RoBERTa: A Robustly Optimized BERT Pretraining Approach
  Paper • 1907.11692 • Published • 9
- DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter
  Paper • 1910.01108 • Published • 21
- BLOOM: A 176B-Parameter Open-Access Multilingual Language Model
  Paper • 2211.05100 • Published • 35
- FlauBERT: Unsupervised Language Model Pre-training for French
  Paper • 1912.05372 • Published
- CroissantLLM: A Truly Bilingual French-English Language Model
  Paper • 2402.00786 • Published • 26
- AION-1: Omnimodal Foundation Model for Astronomical Sciences
  Paper • 2510.17960 • Published • 29
- LLaMA Beyond English: An Empirical Study on Language Capability Transfer
  Paper • 2401.01055 • Published • 55
- YAYI 2: Multilingual Open-Source Large Language Models
  Paper • 2312.14862 • Published • 14
- Order Matters in the Presence of Dataset Imbalance for Multilingual Learning
  Paper • 2312.06134 • Published • 3
- TaCo: Enhancing Cross-Lingual Transfer for Low-Resource Languages in LLMs through Translation-Assisted Chain-of-Thought Processes
  Paper • 2311.10797 • Published