Commit 2e0667f (verified) by gsaltintas · Parent: 06bbe96

Uploading tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi subset
README.md CHANGED
@@ -7,6 +7,136 @@ pretty_name: Tokenization Robustness
 
 tags:
 - multilingual
 - tokenization
+configs:
+- config_name: tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi
+  data_files:
+  - split: test
+    path: tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi/test-*
+dataset_info:
+  config_name: tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi
+  features:
+  - name: question
+    dtype: string
+  - name: choices
+    list: string
+  - name: answer
+    dtype: int64
+  - name: answer_label
+    dtype: string
+  - name: split
+    dtype: string
+  - name: subcategories
+    dtype: string
+  - name: category
+    dtype: string
+  - name: lang
+    dtype: string
+  - name: second_lang
+    dtype: string
+  - name: notes
+    dtype: string
+  - name: id
+    dtype: string
+  - name: set_id
+    dtype: string
+  - name: variation_id
+    dtype: string
+  - name: vanilla_cos_sim_to_canonical
+    struct:
+    - name: CohereLabs/aya-expanse-8b
+      dtype: float64
+    - name: Qwen/Qwen3-8B
+      dtype: float64
+    - name: bigscience/bloom
+      dtype: float64
+    - name: common-pile/comma-v0.1-1t
+      dtype: float64
+    - name: facebook/xglm-564M
+      dtype: float64
+    - name: google-bert/bert-base-multilingual-cased
+      dtype: float64
+    - name: google/byt5-small
+      dtype: float64
+    - name: google/gemma-2-2b
+      dtype: float64
+    - name: gpt2
+      dtype: float64
+    - name: meta-llama/Llama-3.2-1B
+      dtype: float64
+    - name: microsoft/Phi-3-mini-4k-instruct
+      dtype: float64
+    - name: mistralai/tekken
+      dtype: float64
+    - name: tiktoken/gpt-4o
+      dtype: float64
+    - name: tokenmonster/englishcode-32000-consistent-v1
+      dtype: float64
+  - name: trimmed_cos_sim_to_canonical
+    struct:
+    - name: CohereLabs/aya-expanse-8b
+      dtype: float64
+    - name: Qwen/Qwen3-8B
+      dtype: float64
+    - name: bigscience/bloom
+      dtype: float64
+    - name: common-pile/comma-v0.1-1t
+      dtype: float64
+    - name: facebook/xglm-564M
+      dtype: float64
+    - name: google-bert/bert-base-multilingual-cased
+      dtype: float64
+    - name: google/byt5-small
+      dtype: float64
+    - name: google/gemma-2-2b
+      dtype: float64
+    - name: gpt2
+      dtype: float64
+    - name: meta-llama/Llama-3.2-1B
+      dtype: float64
+    - name: microsoft/Phi-3-mini-4k-instruct
+      dtype: float64
+    - name: mistralai/tekken
+      dtype: float64
+    - name: tiktoken/gpt-4o
+      dtype: float64
+    - name: tokenmonster/englishcode-32000-consistent-v1
+      dtype: float64
+  - name: token_counts
+    struct:
+    - name: CohereLabs/aya-expanse-8b
+      dtype: int64
+    - name: Qwen/Qwen3-8B
+      dtype: int64
+    - name: bigscience/bloom
+      dtype: int64
+    - name: common-pile/comma-v0.1-1t
+      dtype: int64
+    - name: facebook/xglm-564M
+      dtype: int64
+    - name: google-bert/bert-base-multilingual-cased
+      dtype: int64
+    - name: google/byt5-small
+      dtype: int64
+    - name: google/gemma-2-2b
+      dtype: int64
+    - name: gpt2
+      dtype: int64
+    - name: meta-llama/Llama-3.2-1B
+      dtype: int64
+    - name: microsoft/Phi-3-mini-4k-instruct
+      dtype: int64
+    - name: mistralai/tekken
+      dtype: int64
+    - name: tiktoken/gpt-4o
+      dtype: int64
+    - name: tokenmonster/englishcode-32000-consistent-v1
+      dtype: int64
+  splits:
+  - name: test
+    num_bytes: 23529
+    num_examples: 40
+  download_size: 40935
+  dataset_size: 23529
 ---
 
 # Dataset Card for Tokenization Robustness
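Per the `features` block added to the frontmatter, each test row is a flat record whose last three columns are structs mapping tokenizer/model names to per-row metrics. A minimal sketch of that row shape in Python — the values below are purely hypothetical illustrative data, not records from the dataset:

```python
# Feature names transcribed from the dataset_info block above.
FEATURES = [
    "question", "choices", "answer", "answer_label", "split",
    "subcategories", "category", "lang", "second_lang", "notes",
    "id", "set_id", "variation_id",
    "vanilla_cos_sim_to_canonical", "trimmed_cos_sim_to_canonical",
    "token_counts",
]

# Hypothetical example row (real rows come from the parquet file).
example_row = {
    "question": "hypothetical Farsi completion prompt",
    "choices": ["option 1", "option 2", "option 3", "option 4"],
    "answer": 0,                     # int64 index into choices
    "answer_label": "A",
    "split": "test",
    "subcategories": "arabic_keyboard_for_farsi",
    "category": "hypothetical-category",
    "lang": "farsi",
    "second_lang": "",
    "notes": "",
    "id": "example-0",
    "set_id": "set-0",
    "variation_id": "var-0",
    # Struct columns: one entry per tokenizer/model listed in the schema.
    "vanilla_cos_sim_to_canonical": {"gpt2": 0.91},   # float64 per model
    "trimmed_cos_sim_to_canonical": {"gpt2": 0.93},   # float64 per model
    "token_counts": {"gpt2": 17},                     # int64 per tokenizer
}

# Every declared feature should be present as a key.
missing = [name for name in FEATURES if name not in example_row]
```

With the `configs` entry above, `datasets.load_dataset` would take the config name `tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi` and `split="test"` to select these rows.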
tokenizer_robustness_completion_farsi_arabic_keyboard_for_farsi/test-00000-of-00001.parquet ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:8f065b7b4bf43b3c7aea826eb5a85fec012b2dde02e55b45b70e80c585178eb5
+size 40935
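The three added lines are a Git LFS pointer, not the parquet data itself: the repository stores only the object id and size, and the actual file is fetched from LFS storage. A minimal sketch of parsing that key/value pointer format, using the pointer text from this commit:

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>".
        key, _, value = line.partition(" ")
        fields[key] = value
    # size is a byte count; oid carries a "<algo>:<digest>" prefix.
    fields["size"] = int(fields["size"])
    algo, _, digest = fields["oid"].partition(":")
    fields["oid"] = {"algo": algo, "digest": digest}
    return fields

pointer = """version https://git-lfs.github.com/spec/v1
oid sha256:8f065b7b4bf43b3c7aea826eb5a85fec012b2dde02e55b45b70e80c585178eb5
size 40935"""

info = parse_lfs_pointer(pointer)
```

The `size` field (40935 bytes) matches the `download_size` recorded in the dataset card frontmatter, as expected for a single-shard parquet split.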