Update README: Fix TiQuAD's language name to Tigrinya (#3)
Commit: 4688b42d026aba547d972ab0589378f4befa71bd
Co-authored-by: Fitsum <[email protected]>
README.md CHANGED
@@ -40,7 +40,7 @@ This phase dramatically expands language coverage to **1833 languages**, impleme
 
 - **Temperature Schedule**: τ=0.3 (most uniform sampling)
 - **Low-resource Focus**: Includes 1723 new languages with minimal data
-- **Rapid Learning**: Demonstrates 68% performance improvement on
+- **Rapid Learning**: Demonstrates 68% performance improvement on Tigrinya and 26% on Faroese
 - **Script Diversity**: Covers virtually all writing systems in FineWeb2
 
 ### Key Innovation: Annealed Language Learning

@@ -68,7 +68,7 @@ Use the script at [this link](https://github.com/JHU-CLSP/mmBERT/blob/main/data/
 ## 🎯 Performance Impact
 
 The decay phase demonstrates remarkable efficiency in low-resource language learning:
-- **
+- **Tigrinya (TiQuAD)**: 68% improvement (12.1 F1 points) from including the language
 - **Faroese (FoQA)**: 26% improvement (15.4 F1 points)
 - **SOTA Performance**: Can even outperform GPT-4o and Gemini 2.5 Pro
 - **Rapid Acquisition**: Significant gains with only 100B tokens of exposure
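For context on the τ=0.3 schedule mentioned in the diff: temperature-scaled language sampling draws from p_i ∝ n_i^τ, where n_i is the amount of data for language i, so τ = 1 reproduces the raw data distribution and τ → 0 approaches uniform sampling over languages. A minimal sketch, assuming hypothetical per-language document counts (the function name and numbers are illustrative, not from the mmBERT codebase):

```python
def sampling_probs(counts, tau):
    """Temperature-scaled sampling probabilities: p_i ∝ counts[i] ** tau.

    tau = 1.0 keeps the raw (skewed) data distribution;
    smaller tau (e.g. 0.3) flattens it toward uniform,
    upweighting low-resource languages.
    """
    weights = [c ** tau for c in counts]
    total = sum(weights)
    return [w / total for w in weights]

# Hypothetical document counts for a high-, mid-, and low-resource language
counts = [1_000_000, 10_000, 100]
p_raw = sampling_probs(counts, 1.0)       # dominated by the largest language
p_annealed = sampling_probs(counts, 0.3)  # much flatter distribution
```

With τ = 0.3 the low-resource language's sampling share rises by orders of magnitude relative to its raw data share, which is the mechanism behind the rapid low-resource gains the README reports.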