---
language:
- en
tags:
- sentence-transformers
- cross-encoder
- reranker
- generated_from_trainer
- dataset_size:39770704
- loss:MarginMSELoss
base_model: jhu-clsp/ettin-encoder-32m
datasets:
- sentence-transformers/msmarco
pipeline_tag: text-ranking
library_name: sentence-transformers
metrics:
- map
- mrr@10
- ndcg@10
co2_eq_emissions:
emissions: 2412.1447613578944
energy_consumed: 6.534597983371423
source: codecarbon
training_type: fine-tuning
on_cloud: false
cpu_model: AMD EPYC 7R13 Processor
ram_total_size: 1999.9855308532715
hours_used: 1.932
hardware_used: 8 x NVIDIA H100 80GB HBM3
model-index:
- name: CrossEncoder based on jhu-clsp/ettin-encoder-32m
results:
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoMSMARCO R100
type: NanoMSMARCO_R100
metrics:
- type: map
value: 0.638
name: Map
- type: mrr@10
value: 0.6319
name: Mrr@10
- type: ndcg@10
value: 0.6951
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNFCorpus R100
type: NanoNFCorpus_R100
metrics:
- type: map
value: 0.3547
name: Map
- type: mrr@10
value: 0.6102
name: Mrr@10
- type: ndcg@10
value: 0.4138
name: Ndcg@10
- task:
type: cross-encoder-reranking
name: Cross Encoder Reranking
dataset:
name: NanoNQ R100
type: NanoNQ_R100
metrics:
- type: map
value: 0.6748
name: Map
- type: mrr@10
value: 0.6919
name: Mrr@10
- type: ndcg@10
value: 0.7294
name: Ndcg@10
- task:
type: cross-encoder-nano-beir
name: Cross Encoder Nano BEIR
dataset:
name: NanoBEIR R100 mean
type: NanoBEIR_R100_mean
metrics:
- type: map
value: 0.5558
name: Map
- type: mrr@10
value: 0.6446
name: Mrr@10
- type: ndcg@10
value: 0.6128
name: Ndcg@10
---
# CrossEncoder based on jhu-clsp/ettin-encoder-32m
This is a [Cross Encoder](https://www.sbert.net/docs/cross_encoder/usage/usage.html) model finetuned from [jhu-clsp/ettin-encoder-32m](https://huggingface.co/jhu-clsp/ettin-encoder-32m) on the [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) dataset using the [sentence-transformers](https://www.SBERT.net) library. It computes scores for pairs of texts, which can be used for text reranking and semantic search.
## Model Details
### Model Description
- **Model Type:** Cross Encoder
- **Base model:** [jhu-clsp/ettin-encoder-32m](https://huggingface.co/jhu-clsp/ettin-encoder-32m) <!-- at revision 1b8ba06455dd44f80fc9c1ca9e22806157a57379 -->
- **Maximum Sequence Length:** 512 tokens
- **Number of Output Labels:** 1 label
- **Training Dataset:**
- [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco)
- **Language:** en
<!-- - **License:** Unknown -->
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Documentation:** [Cross Encoder Documentation](https://www.sbert.net/docs/cross_encoder/usage/usage.html)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Cross Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=cross-encoder)
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import CrossEncoder
# Download from the 🤗 Hub
model = CrossEncoder("tomaarsen/ms-marco-ettin-32m-reranker")
# Get scores for pairs of texts
pairs = [
    ['what is honor society', 'In the United States, an honor society is a rank organization that recognizes excellence among peers. Numerous societies recognize various fields and circumstances. The Order of the Arrow, for example, is the national honor society of the Boy Scouts of America.'],
    ['what happens to blood pressure when you raise your arm', 'Well, you measured blood pressure in that arm would drop because while the pressure that your heart puts out will stay pretty constant, your arm is now higher and requires more energy to reach the end, so the pressure seen there will be lower. The Doc · 7 years ago.'],
    ['what country is the name astrid from', 'Comment by silkfire. Astrid, Estrid, Æstriðr æstriðr Ástríðr astriðr ástríðr is a given Name Of north. Germanic origin it Comes From Ásfríðr (Norse Asfriðr), ásfríðr divine (beauty) + from (ass). áss god friðr Fríðr beautiful names derived From, astrid include the name astrida which is a somewhat common name for Girls. in the country of latviat comes from Old Norse Ásfríðr (ásfríðr Divine), beauty from (ass) + áss (god). Friðr fríðr beautiful Names derived from astrid Include, the name astrida which is a somewhat common name for girls in The. country of latvia'],
    ['where the latin people came from', 'Latina is a city in Italy and the Latin people come from southern Italy around Rome.'],
    ['how long do cocker spaniels', "the cocker spaniel's life span is 12-15 years. a dog that is kept fit and on nutritious dog food has a better chance of living longer than those who aren't. I would assume roughly 10-15 years. My cocker spaniel is 13.5 years old and just starting to lose her health."],
]
scores = model.predict(pairs)
print(scores.shape)
# (5,)
# Or rank different texts based on similarity to a single text
ranks = model.rank(
    'what is honor society',
    [
        'In the United States, an honor society is a rank organization that recognizes excellence among peers. Numerous societies recognize various fields and circumstances. The Order of the Arrow, for example, is the national honor society of the Boy Scouts of America.',
        'Well, you measured blood pressure in that arm would drop because while the pressure that your heart puts out will stay pretty constant, your arm is now higher and requires more energy to reach the end, so the pressure seen there will be lower. The Doc · 7 years ago.',
        'Comment by silkfire. Astrid, Estrid, Æstriðr æstriðr Ástríðr astriðr ástríðr is a given Name Of north. Germanic origin it Comes From Ásfríðr (Norse Asfriðr), ásfríðr divine (beauty) + from (ass). áss god friðr Fríðr beautiful names derived From, astrid include the name astrida which is a somewhat common name for Girls. in the country of latviat comes from Old Norse Ásfríðr (ásfríðr Divine), beauty from (ass) + áss (god). Friðr fríðr beautiful Names derived from astrid Include, the name astrida which is a somewhat common name for girls in The. country of latvia',
        'Latina is a city in Italy and the Latin people come from southern Italy around Rome.',
        "the cocker spaniel's life span is 12-15 years. a dog that is kept fit and on nutritious dog food has a better chance of living longer than those who aren't. I would assume roughly 10-15 years. My cocker spaniel is 13.5 years old and just starting to lose her health.",
    ]
)
# [{'corpus_id': ..., 'score': ...}, {'corpus_id': ..., 'score': ...}, ...]
```
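Under the hood, `model.rank` scores each (query, document) pair and sorts the candidates by descending score. Below is a minimal, self-contained sketch of that ranking step, using dummy scores in place of real `model.predict` output; the `rerank` helper and the toy documents are illustrative, not part of the library.

```python
# Illustrative only: `dummy_scores` stands in for the output of
# model.predict(pairs); the real CrossEncoder.rank method likewise
# returns the candidates ordered by score.

def rerank(documents, scores, top_k=3):
    """Return (index, document, score) tuples for the top_k highest-scoring documents."""
    order = sorted(range(len(documents)), key=lambda i: scores[i], reverse=True)
    return [(i, documents[i], scores[i]) for i in order[:top_k]]

docs = ["doc a", "doc b", "doc c", "doc d"]
dummy_scores = [1.2, 7.9, -0.3, 4.4]  # pretend cross-encoder logits
for idx, doc, score in rerank(docs, dummy_scores):
    print(idx, doc, score)  # doc b (score 7.9) comes first
```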
<!--
### Direct Usage (Transformers)
<details><summary>Click to see the direct usage in Transformers</summary>
</details>
-->
<!--
### Downstream Usage (Sentence Transformers)
You can finetune this model on your own dataset.
<details><summary>Click to expand</summary>
</details>
-->
<!--
### Out-of-Scope Use
*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->
## Evaluation
### Metrics
#### Cross Encoder Reranking
* Datasets: `NanoMSMARCO_R100`, `NanoNFCorpus_R100` and `NanoNQ_R100`
* Evaluated with [<code>CrossEncoderRerankingEvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderRerankingEvaluator) with these parameters:
```json
{
"at_k": 10,
"always_rerank_positives": true
}
```
| Metric | NanoMSMARCO_R100 | NanoNFCorpus_R100 | NanoNQ_R100 |
|:------------|:---------------------|:---------------------|:---------------------|
| map | 0.6380 (+0.1484) | 0.3547 (+0.0937) | 0.6748 (+0.2552) |
| mrr@10 | 0.6319 (+0.1544) | 0.6102 (+0.1104) | 0.6919 (+0.2652) |
| **ndcg@10** | **0.6951 (+0.1547)** | **0.4138 (+0.0888)** | **0.7294 (+0.2287)** |
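For reference, ndcg@10 (the bolded metric above) discounts the gain of each relevant document by its rank and normalizes by the ideal ordering. The sketch below shows one common formulation as an illustration; the evaluator's exact implementation may differ in details such as gain definition or tie handling.

```python
import math

def dcg_at_k(relevances, k):
    # Gain rel_i discounted by log2(rank + 1), with ranks starting at 1.
    return sum(rel / math.log2(i + 2) for i, rel in enumerate(relevances[:k]))

def ndcg_at_k(relevances, k=10):
    # Normalize by the DCG of the ideal (best possible) ordering.
    ideal = dcg_at_k(sorted(relevances, reverse=True), k)
    return dcg_at_k(relevances, k) / ideal if ideal > 0 else 0.0

# One relevant document ranked second out of four candidates:
print(round(ndcg_at_k([0, 1, 0, 0]), 4))  # 0.6309
```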
#### Cross Encoder Nano BEIR
* Dataset: `NanoBEIR_R100_mean`
* Evaluated with [<code>CrossEncoderNanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/cross_encoder/evaluation.html#sentence_transformers.cross_encoder.evaluation.CrossEncoderNanoBEIREvaluator) with these parameters:
```json
{
"dataset_names": [
"msmarco",
"nfcorpus",
"nq"
],
"rerank_k": 100,
"at_k": 10,
"always_rerank_positives": true
}
```
| Metric | Value |
|:------------|:---------------------|
| map | 0.5558 (+0.1658) |
| mrr@10 | 0.6446 (+0.1766) |
| **ndcg@10** | **0.6128 (+0.1574)** |
<!--
## Bias, Risks and Limitations
*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->
<!--
### Recommendations
*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->
## Training Details
### Training Dataset
#### msmarco
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83)
* Size: 39,770,704 training samples
* Columns: <code>query_id</code>, <code>positive_id</code>, <code>negative_id</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | query_id | positive_id | negative_id | score |
|:--------|:------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------|
| type | string | string | string | float |
| details | <ul><li>min: 11 characters</li><li>mean: 33.98 characters</li><li>max: 124 characters</li></ul> | <ul><li>min: 55 characters</li><li>mean: 348.53 characters</li><li>max: 926 characters</li></ul> | <ul><li>min: 80 characters</li><li>mean: 340.54 characters</li><li>max: 818 characters</li></ul> | <ul><li>min: -3.12</li><li>mean: 13.18</li><li>max: 22.49</li></ul> |
* Samples:
| query_id | positive_id | negative_id | score |
|:----------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------|
| <code>which artist started fauves movement</code> | <code>Les Fauves (Wild Beasts) • Fauvism was the first twentieth-century movement in modern art. Inspired by. the examples of van Gogh, Gauguin, Seurat, and Cézanne. • It grew out of a loosely allied group of French painters with shared interests. Henri Matisse was eventually recognized as the leader of Les Fauves, or The. Wild Beasts”.</code> | <code>Let’s get a reflection onto your glass, for added realism. Architectural 2D to 3D visualizer and graphic artist Jonathan Pagaduan Ignas shows you how in this great little tutorial he has contributed to us here at SketchUpArtists.Here are a few easy steps to a glass reflection effect using Google SketchUp and V-Ray for SketchUp..let’s get started!Step 1 - Import Image File. Step 2 - Place Image in Glass. Step 3 - Position Texture. Step 4 - Sample Texture with Dropper.rchitectural 2D to 3D visualizer and graphic artist Jonathan Pagaduan Ignas shows you how in this great little tutorial he has contributed to us here at SketchUpArtists. Here are a few easy steps to a glass reflection effect using Google SketchUp and V-Ray for SketchUp..let’s get started!</code> | <code>18.45222806930542</code> |
| <code>what foods are high in uric acid list</code> | <code>Avoid high-purine foods. See attached lists. Avoid or limit alcohol. Alcohol increases purine production, leading to higher uric acid levels in your blood and urine. Limit meat to 3 ounces per meal. Limit high-fat foods such as salad dressings, ice cream, fried foods, gravies, and dressings.Fat holds onto uric acid in your kidneys.lcohol increases purine production, leading to higher uric acid levels in your blood and urine. Limit meat to 3 ounces per meal. Limit high-fat foods such as salad dressings, ice cream, fried foods, gravies, and dressings. Fat holds onto uric acid in your kidneys.</code> | <code>You can develop gout if you have too much uric acid in your body. Uric acid is a chemical that everyone has in their blood. It's a waste product that forms from substances called purines. These are found in every cell in your body, and in certain foods, such as seafood and liver.</code> | <code>7.823118130366007</code> |
| <code>what battle was the turning point in the pacific theatre</code> | <code>The campaign lasted for nearly six months and cost thousands of American and Japanese lives. This campaign, from the naval perspective, and indeed from a strategic perspective, was the turning point of World War Two in the Pacific Theater. The experienced and well trained Japanese fleet was still superior to the United States fleet after the Battle of Midway, and was still a threat to the US fleet until it was checked in the actions off Guadalcanal.</code> | <code>It is famous as the setting of the February–March 1945 Battle of Iwo Jima involving the United States and a small number of elements of the British Pacific Fleet versus the Empire of Japan during World War II.</code> | <code>7.8578096230824785</code> |
* Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#marginmseloss) with these parameters:
```json
{
"activation_fn": "torch.nn.modules.linear.Identity"
}
```
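MarginMSELoss distills a teacher's score margin into the student: the student scores the positive and the negative passage for the same query, and the squared difference between the student's margin and the teacher's margin (the `score` column above) is minimized. Below is a plain-Python sketch of that quantity on toy numbers; actual training computes it on batched PyTorch model outputs.

```python
def margin_mse(student_pos, student_neg, teacher_margins):
    """Mean squared error between student score margins and teacher margins."""
    diffs = [
        ((sp - sn) - t) ** 2
        for sp, sn, t in zip(student_pos, student_neg, teacher_margins)
    ]
    return sum(diffs) / len(diffs)

# Toy batch of two (query, positive, negative) triplets:
# student margins are 6.0 - 1.0 = 5.0 and 10.0 - 1.0 = 9.0,
# teacher margins are 7.8 and 7.9, so the loss is ((-2.8)^2 + 1.1^2) / 2.
print(margin_mse([6.0, 10.0], [1.0, 1.0], [7.8, 7.9]))
```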
### Evaluation Dataset
#### msmarco
* Dataset: [msmarco](https://huggingface.co/datasets/sentence-transformers/msmarco) at [9e329ed](https://huggingface.co/datasets/sentence-transformers/msmarco/tree/9e329ed2e649c9d37b0d91dd6b764ff6fe671d83)
* Size: 10,000 evaluation samples
* Columns: <code>query_id</code>, <code>positive_id</code>, <code>negative_id</code>, and <code>score</code>
* Approximate statistics based on the first 1000 samples:
| | query_id | positive_id | negative_id | score |
|:--------|:------------------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------|
| type | string | string | string | float |
| details | <ul><li>min: 11 characters</li><li>mean: 34.75 characters</li><li>max: 124 characters</li></ul> | <ul><li>min: 69 characters</li><li>mean: 346.11 characters</li><li>max: 920 characters</li></ul> | <ul><li>min: 79 characters</li><li>mean: 350.68 characters</li><li>max: 1014 characters</li></ul> | <ul><li>min: -2.41</li><li>mean: 13.41</li><li>max: 22.48</li></ul> |
* Samples:
| query_id | positive_id | negative_id | score |
|:--------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:--------------------------------|
| <code>what is honor society</code> | <code>In the United States, an honor society is a rank organization that recognizes excellence among peers. Numerous societies recognize various fields and circumstances. The Order of the Arrow, for example, is the national honor society of the Boy Scouts of America.</code> | <code>Pi Gamma Mu Headlines. Pi Gamma Mu is the oldest and preeminent honor society in the social sciences. Our mission is to encourage and recognize superior scholarship in social science disciplines and to foster cooperation and social service among its members.</code> | <code>7.578772008419037</code> |
| <code>what happens to blood pressure when you raise your arm</code> | <code>Well, you measured blood pressure in that arm would drop because while the pressure that your heart puts out will stay pretty constant, your arm is now higher and requires more energy to reach the end, so the pressure seen there will be lower. The Doc · 7 years ago.</code> | <code>Constriction of blood vessels and increase in heart rate does raise blood pressure, but only temporarily; when the stress reaction goes away, blood pressure returns to its pre-stress level.This is called situational stress, and its effects are generally short-lived and disappear when the stressful event is over.hronic (constant) stress causes our bodies to go into high gear on and off for days or weeks at a time. The links between chronic stress and blood pressure are not clear. Although stress does not clearly cause heart disease, it can play a role in general wellness.</code> | <code>8.200530961155891</code> |
| <code>what country is the name astrid from</code> | <code>Comment by silkfire. Astrid, Estrid, Æstriðr æstriðr Ástríðr astriðr ástríðr is a given Name Of north. Germanic origin it Comes From Ásfríðr (Norse Asfriðr), ásfríðr divine (beauty) + from (ass). áss god friðr Fríðr beautiful names derived From, astrid include the name astrida which is a somewhat common name for Girls. in the country of latviat comes from Old Norse Ásfríðr (ásfríðr Divine), beauty from (ass) + áss (god). Friðr fríðr beautiful Names derived from astrid Include, the name astrida which is a somewhat common name for girls in The. country of latvia</code> | <code>» Follow author. » Share. Astrid Ayrianto was on omeegle one day on the video chat, little did she know, she had almost everything in common with the lead singer of her favorite band Get Scared, Nicholas Matthews.</code> | <code>10.449922064940136</code> |
* Loss: [<code>MarginMSELoss</code>](https://sbert.net/docs/package_reference/cross_encoder/losses.html#marginmseloss) with these parameters:
```json
{
"activation_fn": "torch.nn.modules.linear.Identity"
}
```
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: steps
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `learning_rate`: 2e-05
- `num_train_epochs`: 1
- `warmup_ratio`: 0.1
- `seed`: 12
- `bf16`: True
- `load_best_model_at_end`: True
#### All Hyperparameters
<details><summary>Click to expand</summary>
- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 256
- `per_device_eval_batch_size`: 256
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 2e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 12
- `data_seed`: None
- `jit_mode_eval`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `parallelism_config`: None
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch_fused
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `project`: huggingface
- `trackio_space_id`: trackio
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: no
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: True
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}
</details>
### Training Logs
<details><summary>Click to expand</summary>
| Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_R100_ndcg@10 | NanoNFCorpus_R100_ndcg@10 | NanoNQ_R100_ndcg@10 | NanoBEIR_R100_mean_ndcg@10 |
|:----------:|:---------:|:-------------:|:---------------:|:------------------------:|:-------------------------:|:--------------------:|:--------------------------:|
| -1 | -1 | - | - | 0.0694 (-0.4711) | 0.2964 (-0.0286) | 0.0281 (-0.4725) | 0.1313 (-0.3241) |
| 0.0001 | 1 | 209.9035 | - | - | - | - | - |
| 0.0020 | 39 | 207.8315 | - | - | - | - | - |
| 0.0040 | 78 | 204.3864 | - | - | - | - | - |
| 0.0060 | 117 | 197.3435 | - | - | - | - | - |
| 0.0080 | 156 | 188.2296 | - | - | - | - | - |
| 0.0100 | 195 | 176.2927 | 169.8326 | 0.1222 (-0.4182) | 0.2790 (-0.0460) | 0.0840 (-0.4166) | 0.1618 (-0.2936) |
| 0.0121 | 234 | 161.1845 | - | - | - | - | - |
| 0.0141 | 273 | 145.5738 | - | - | - | - | - |
| 0.0161 | 312 | 122.6948 | - | - | - | - | - |
| 0.0181 | 351 | 81.3822 | - | - | - | - | - |
| 0.0201 | 390 | 43.1117 | 35.8988 | 0.5569 (+0.0164) | 0.3702 (+0.0452) | 0.4567 (-0.0439) | 0.4613 (+0.0059) |
| 0.0221 | 429 | 32.5851 | - | - | - | - | - |
| 0.0241 | 468 | 27.7726 | - | - | - | - | - |
| 0.0261 | 507 | 24.6193 | - | - | - | - | - |
| 0.0281 | 546 | 22.2331 | - | - | - | - | - |
| 0.0301 | 585 | 20.8158 | 19.3642 | 0.6007 (+0.0603) | 0.3694 (+0.0444) | 0.5467 (+0.0460) | 0.5056 (+0.0502) |
| 0.0321 | 624 | 19.5039 | - | - | - | - | - |
| 0.0341 | 663 | 18.151 | - | - | - | - | - |
| 0.0362 | 702 | 17.1317 | - | - | - | - | - |
| 0.0382 | 741 | 16.6977 | - | - | - | - | - |
| 0.0402 | 780 | 15.6749 | 15.1013 | 0.6167 (+0.0763) | 0.3794 (+0.0544) | 0.5682 (+0.0675) | 0.5214 (+0.0661) |
| 0.0422 | 819 | 15.2219 | - | - | - | - | - |
| 0.0442 | 858 | 14.7046 | - | - | - | - | - |
| 0.0462 | 897 | 14.1086 | - | - | - | - | - |
| 0.0482 | 936 | 13.5811 | - | - | - | - | - |
| 0.0502 | 975 | 13.2202 | 12.5601 | 0.6616 (+0.1212) | 0.3763 (+0.0513) | 0.5839 (+0.0833) | 0.5406 (+0.0852) |
| 0.0522 | 1014 | 12.7417 | - | - | - | - | - |
| 0.0542 | 1053 | 12.4341 | - | - | - | - | - |
| 0.0562 | 1092 | 12.1748 | - | - | - | - | - |
| 0.0582 | 1131 | 11.8355 | - | - | - | - | - |
| 0.0603 | 1170 | 11.3288 | 11.0595 | 0.6582 (+0.1178) | 0.3847 (+0.0596) | 0.6125 (+0.1118) | 0.5518 (+0.0964) |
| 0.0623 | 1209 | 11.2907 | - | - | - | - | - |
| 0.0643 | 1248 | 10.9028 | - | - | - | - | - |
| 0.0663 | 1287 | 10.627 | - | - | - | - | - |
| 0.0683 | 1326 | 10.4322 | - | - | - | - | - |
| 0.0703 | 1365 | 10.0831 | 9.7591 | 0.6565 (+0.1161) | 0.3910 (+0.0659) | 0.6118 (+0.1111) | 0.5531 (+0.0977) |
| 0.0723 | 1404 | 10.0484 | - | - | - | - | - |
| 0.0743 | 1443 | 9.7691 | - | - | - | - | - |
| 0.0763 | 1482 | 9.5597 | - | - | - | - | - |
| 0.0783 | 1521 | 9.47 | - | - | - | - | - |
| 0.0803 | 1560 | 9.2528 | 8.9910 | 0.6667 (+0.1263) | 0.3815 (+0.0564) | 0.6247 (+0.1240) | 0.5576 (+0.1023) |
| 0.0823 | 1599 | 9.1445 | - | - | - | - | - |
| 0.0844 | 1638 | 9.0069 | - | - | - | - | - |
| 0.0864 | 1677 | 8.6523 | - | - | - | - | - |
| 0.0884 | 1716 | 8.5901 | - | - | - | - | - |
| 0.0904 | 1755 | 8.715 | 8.3166 | 0.6402 (+0.0997) | 0.3853 (+0.0603) | 0.6448 (+0.1442) | 0.5568 (+0.1014) |
| 0.0924 | 1794 | 8.457 | - | - | - | - | - |
| 0.0944 | 1833 | 8.3209 | - | - | - | - | - |
| 0.0964 | 1872 | 8.1242 | - | - | - | - | - |
| 0.0984 | 1911 | 8.1453 | - | - | - | - | - |
| 0.1004 | 1950 | 8.0561 | 7.6634 | 0.6806 (+0.1401) | 0.3843 (+0.0592) | 0.6495 (+0.1489) | 0.5715 (+0.1161) |
| 0.1024 | 1989 | 7.855 | - | - | - | - | - |
| 0.1044 | 2028 | 7.7573 | - | - | - | - | - |
| 0.1064 | 2067 | 7.5976 | - | - | - | - | - |
| 0.1085 | 2106 | 7.4529 | - | - | - | - | - |
| 0.1105 | 2145 | 7.372 | 7.0798 | 0.6667 (+0.1263) | 0.3952 (+0.0702) | 0.6386 (+0.1379) | 0.5668 (+0.1115) |
| 0.1125 | 2184 | 7.1673 | - | - | - | - | - |
| 0.1145 | 2223 | 7.2075 | - | - | - | - | - |
| 0.1165 | 2262 | 7.1261 | - | - | - | - | - |
| 0.1185 | 2301 | 7.0716 | - | - | - | - | - |
| 0.1205 | 2340 | 7.0182 | 6.7560 | 0.6557 (+0.1153) | 0.3876 (+0.0626) | 0.6544 (+0.1537) | 0.5659 (+0.1105) |
| 0.1225 | 2379 | 6.861 | - | - | - | - | - |
| 0.1245 | 2418 | 6.8366 | - | - | - | - | - |
| 0.1265 | 2457 | 6.6857 | - | - | - | - | - |
| 0.1285 | 2496 | 6.665 | - | - | - | - | - |
| 0.1305 | 2535 | 6.6564 | 6.2824 | 0.6796 (+0.1392) | 0.3834 (+0.0584) | 0.6637 (+0.1630) | 0.5756 (+0.1202) |
| 0.1326 | 2574 | 6.5436 | - | - | - | - | - |
| 0.1346 | 2613 | 6.4091 | - | - | - | - | - |
| 0.1366 | 2652 | 6.2974 | - | - | - | - | - |
| 0.1386 | 2691 | 6.3251 | - | - | - | - | - |
| 0.1406 | 2730 | 6.2799 | 5.9852 | 0.6719 (+0.1314) | 0.3829 (+0.0579) | 0.6711 (+0.1704) | 0.5753 (+0.1199) |
| 0.1426 | 2769 | 6.2637 | - | - | - | - | - |
| 0.1446 | 2808 | 6.1648 | - | - | - | - | - |
| 0.1466 | 2847 | 6.1382 | - | - | - | - | - |
| 0.1486 | 2886 | 6.1291 | - | - | - | - | - |
| 0.1506 | 2925 | 5.9413 | 5.8402 | 0.6665 (+0.1261) | 0.3950 (+0.0700) | 0.6448 (+0.1441) | 0.5688 (+0.1134) |
| 0.1526 | 2964 | 5.9748 | - | - | - | - | - |
| 0.1546 | 3003 | 5.8973 | - | - | - | - | - |
| 0.1567 | 3042 | 5.8654 | - | - | - | - | - |
| 0.1587 | 3081 | 5.6654 | - | - | - | - | - |
| 0.1607 | 3120 | 5.8549 | 5.6918 | 0.6765 (+0.1361) | 0.3948 (+0.0697) | 0.6639 (+0.1632) | 0.5784 (+0.1230) |
| 0.1627 | 3159 | 5.6934 | - | - | - | - | - |
| 0.1647 | 3198 | 5.7285 | - | - | - | - | - |
| 0.1667 | 3237 | 5.5589 | - | - | - | - | - |
| 0.1687 | 3276 | 5.6317 | - | - | - | - | - |
| 0.1707 | 3315 | 5.5741 | 5.4521 | 0.6545 (+0.1141) | 0.3988 (+0.0738) | 0.6863 (+0.1856) | 0.5799 (+0.1245) |
| 0.1727 | 3354 | 5.4948 | - | - | - | - | - |
| 0.1747 | 3393 | 5.4782 | - | - | - | - | - |
| 0.1767 | 3432 | 5.5595 | - | - | - | - | - |
| 0.1787 | 3471 | 5.417 | - | - | - | - | - |
| 0.1808 | 3510 | 5.4339 | 5.2730 | 0.6224 (+0.0820) | 0.3970 (+0.0720) | 0.6666 (+0.1660) | 0.5620 (+0.1066) |
| 0.1828 | 3549 | 5.3723 | - | - | - | - | - |
| 0.1848 | 3588 | 5.2479 | - | - | - | - | - |
| 0.1868 | 3627 | 5.2665 | - | - | - | - | - |
| 0.1888 | 3666 | 5.2302 | - | - | - | - | - |
| 0.1908 | 3705 | 5.1863 | 5.1419 | 0.6791 (+0.1387) | 0.4037 (+0.0786) | 0.6941 (+0.1934) | 0.5923 (+0.1369) |
| 0.1928 | 3744 | 5.1855 | - | - | - | - | - |
| 0.1948 | 3783 | 5.1529 | - | - | - | - | - |
| 0.1968 | 3822 | 5.2058 | - | - | - | - | - |
| 0.1988 | 3861 | 5.098 | - | - | - | - | - |
| 0.2008 | 3900 | 5.0176 | 4.9869 | 0.6590 (+0.1186) | 0.3849 (+0.0599) | 0.6844 (+0.1837) | 0.5761 (+0.1207) |
| 0.2028 | 3939 | 5.0708 | - | - | - | - | - |
| 0.2049 | 3978 | 5.0215 | - | - | - | - | - |
| 0.2069 | 4017 | 4.974 | - | - | - | - | - |
| 0.2089 | 4056 | 4.9687 | - | - | - | - | - |
| 0.2109 | 4095 | 4.9689 | 4.8506 | 0.6734 (+0.1330) | 0.3995 (+0.0745) | 0.7023 (+0.2016) | 0.5917 (+0.1364) |
| 0.2129 | 4134 | 4.8809 | - | - | - | - | - |
| 0.2149 | 4173 | 4.9176 | - | - | - | - | - |
| 0.2169 | 4212 | 4.7451 | - | - | - | - | - |
| 0.2189 | 4251 | 4.7807 | - | - | - | - | - |
| 0.2209 | 4290 | 4.8157 | 4.7150 | 0.6269 (+0.0865) | 0.3948 (+0.0698) | 0.6938 (+0.1932) | 0.5719 (+0.1165) |
| 0.2229 | 4329 | 4.7986 | - | - | - | - | - |
| 0.2249 | 4368 | 4.7942 | - | - | - | - | - |
| 0.2269 | 4407 | 4.7008 | - | - | - | - | - |
| 0.2290 | 4446 | 4.7572 | - | - | - | - | - |
| 0.2310 | 4485 | 4.7616 | 4.6657 | 0.6577 (+0.1172) | 0.4022 (+0.0772) | 0.7019 (+0.2013) | 0.5873 (+0.1319) |
| 0.2330 | 4524 | 4.7014 | - | - | - | - | - |
| 0.2350 | 4563 | 4.6512 | - | - | - | - | - |
| 0.2370 | 4602 | 4.6997 | - | - | - | - | - |
| 0.2390 | 4641 | 4.5655 | - | - | - | - | - |
| 0.2410 | 4680 | 4.5727 | 4.5367 | 0.6826 (+0.1422) | 0.3937 (+0.0687) | 0.6958 (+0.1951) | 0.5907 (+0.1353) |
| 0.2430 | 4719 | 4.5258 | - | - | - | - | - |
| 0.2450 | 4758 | 4.6012 | - | - | - | - | - |
| 0.2470 | 4797 | 4.5785 | - | - | - | - | - |
| 0.2490 | 4836 | 4.5415 | - | - | - | - | - |
| 0.2510 | 4875 | 4.4921 | 4.4462 | 0.6689 (+0.1285) | 0.3977 (+0.0727) | 0.7103 (+0.2096) | 0.5923 (+0.1369) |
| 0.2531 | 4914 | 4.4911 | - | - | - | - | - |
| 0.2551 | 4953 | 4.4795 | - | - | - | - | - |
| 0.2571 | 4992 | 4.4027 | - | - | - | - | - |
| 0.2591 | 5031 | 4.3652 | - | - | - | - | - |
| 0.2611 | 5070 | 4.3868 | 4.3909 | 0.6535 (+0.1130) | 0.3868 (+0.0617) | 0.6815 (+0.1809) | 0.5739 (+0.1185) |
| 0.2631 | 5109 | 4.4055 | - | - | - | - | - |
| 0.2651 | 5148 | 4.3968 | - | - | - | - | - |
| 0.2671 | 5187 | 4.3333 | - | - | - | - | - |
| 0.2691 | 5226 | 4.3369 | - | - | - | - | - |
| 0.2711 | 5265 | 4.3079 | 4.3355 | 0.6550 (+0.1145) | 0.3895 (+0.0644) | 0.6957 (+0.1950) | 0.5800 (+0.1247) |
| 0.2731 | 5304 | 4.3211 | - | - | - | - | - |
| 0.2751 | 5343 | 4.2841 | - | - | - | - | - |
| 0.2772 | 5382 | 4.2753 | - | - | - | - | - |
| 0.2792 | 5421 | 4.221 | - | - | - | - | - |
| 0.2812 | 5460 | 4.2146 | 4.2056 | 0.6590 (+0.1186) | 0.3796 (+0.0545) | 0.7028 (+0.2021) | 0.5804 (+0.1251) |
| 0.2832 | 5499 | 4.2692 | - | - | - | - | - |
| 0.2852 | 5538 | 4.2236 | - | - | - | - | - |
| 0.2872 | 5577 | 4.1555 | - | - | - | - | - |
| 0.2892 | 5616 | 4.1684 | - | - | - | - | - |
| 0.2912 | 5655 | 4.1731 | 4.1304 | 0.6734 (+0.1329) | 0.3865 (+0.0615) | 0.6950 (+0.1944) | 0.5850 (+0.1296) |
| 0.2932 | 5694 | 4.1562 | - | - | - | - | - |
| 0.2952 | 5733 | 4.1689 | - | - | - | - | - |
| 0.2972 | 5772 | 4.1617 | - | - | - | - | - |
| 0.2992 | 5811 | 4.1256 | - | - | - | - | - |
| 0.3013 | 5850 | 4.0592 | 4.0723 | 0.6694 (+0.1290) | 0.3870 (+0.0619) | 0.7079 (+0.2073) | 0.5881 (+0.1327) |
| 0.3033 | 5889 | 4.0894 | - | - | - | - | - |
| 0.3053 | 5928 | 4.103 | - | - | - | - | - |
| 0.3073 | 5967 | 4.0083 | - | - | - | - | - |
| 0.3093 | 6006 | 4.03 | - | - | - | - | - |
| 0.3113 | 6045 | 3.9931 | 4.0058 | 0.6695 (+0.1290) | 0.3914 (+0.0664) | 0.7024 (+0.2018) | 0.5878 (+0.1324) |
| 0.3133 | 6084 | 4.0186 | - | - | - | - | - |
| 0.3153 | 6123 | 3.9312 | - | - | - | - | - |
| 0.3173 | 6162 | 4.0398 | - | - | - | - | - |
| 0.3193 | 6201 | 3.9672 | - | - | - | - | - |
| 0.3213 | 6240 | 3.9879 | 3.9322 | 0.6696 (+0.1292) | 0.3932 (+0.0681) | 0.7075 (+0.2068) | 0.5901 (+0.1347) |
| 0.3233 | 6279 | 3.879 | - | - | - | - | - |
| 0.3254 | 6318 | 3.9123 | - | - | - | - | - |
| 0.3274 | 6357 | 3.9144 | - | - | - | - | - |
| 0.3294 | 6396 | 3.903 | - | - | - | - | - |
| 0.3314 | 6435 | 3.9447 | 3.9070 | 0.6440 (+0.1036) | 0.3849 (+0.0599) | 0.7099 (+0.2092) | 0.5796 (+0.1242) |
| 0.3334 | 6474 | 3.9082 | - | - | - | - | - |
| 0.3354 | 6513 | 3.8405 | - | - | - | - | - |
| 0.3374 | 6552 | 3.8633 | - | - | - | - | - |
| 0.3394 | 6591 | 3.8301 | - | - | - | - | - |
| 0.3414 | 6630 | 3.8188 | 3.8524 | 0.6611 (+0.1206) | 0.3804 (+0.0554) | 0.6935 (+0.1929) | 0.5783 (+0.1230) |
| 0.3434 | 6669 | 3.8292 | - | - | - | - | - |
| 0.3454 | 6708 | 3.8502 | - | - | - | - | - |
| 0.3474 | 6747 | 3.8649 | - | - | - | - | - |
| 0.3495 | 6786 | 3.7942 | - | - | - | - | - |
| 0.3515 | 6825 | 3.7324 | 3.7958 | 0.6732 (+0.1328) | 0.3816 (+0.0565) | 0.7011 (+0.2005) | 0.5853 (+0.1299) |
| 0.3535 | 6864 | 3.7916 | - | - | - | - | - |
| 0.3555 | 6903 | 3.7299 | - | - | - | - | - |
| 0.3575 | 6942 | 3.8087 | - | - | - | - | - |
| 0.3595 | 6981 | 3.7821 | - | - | - | - | - |
| 0.3615 | 7020 | 3.7395 | 3.7438 | 0.6585 (+0.1181) | 0.3875 (+0.0625) | 0.7004 (+0.1998) | 0.5821 (+0.1268) |
| 0.3635 | 7059 | 3.7345 | - | - | - | - | - |
| 0.3655 | 7098 | 3.7472 | - | - | - | - | - |
| 0.3675 | 7137 | 3.7277 | - | - | - | - | - |
| 0.3695 | 7176 | 3.6535 | - | - | - | - | - |
| 0.3715 | 7215 | 3.6586 | 3.6982 | 0.6616 (+0.1212) | 0.3823 (+0.0573) | 0.7002 (+0.1996) | 0.5814 (+0.1260) |
| 0.3736 | 7254 | 3.6653 | - | - | - | - | - |
| 0.3756 | 7293 | 3.7074 | - | - | - | - | - |
| 0.3776 | 7332 | 3.6542 | - | - | - | - | - |
| 0.3796 | 7371 | 3.5972 | - | - | - | - | - |
| 0.3816 | 7410 | 3.6499 | 3.6283 | 0.6569 (+0.1165) | 0.3796 (+0.0545) | 0.7079 (+0.2073) | 0.5815 (+0.1261) |
| 0.3836 | 7449 | 3.6373 | - | - | - | - | - |
| 0.3856 | 7488 | 3.6253 | - | - | - | - | - |
| 0.3876 | 7527 | 3.6441 | - | - | - | - | - |
| 0.3896 | 7566 | 3.6278 | - | - | - | - | - |
| 0.3916 | 7605 | 3.6291 | 3.6008 | 0.6695 (+0.1291) | 0.3928 (+0.0677) | 0.7187 (+0.2180) | 0.5936 (+0.1383) |
| 0.3936 | 7644 | 3.5957 | - | - | - | - | - |
| 0.3956 | 7683 | 3.6031 | - | - | - | - | - |
| 0.3977 | 7722 | 3.5544 | - | - | - | - | - |
| 0.3997 | 7761 | 3.5823 | - | - | - | - | - |
| 0.4017 | 7800 | 3.5426 | 3.6026 | 0.6732 (+0.1328) | 0.3851 (+0.0600) | 0.7170 (+0.2163) | 0.5918 (+0.1364) |
| 0.4037 | 7839 | 3.5943 | - | - | - | - | - |
| 0.4057 | 7878 | 3.4955 | - | - | - | - | - |
| 0.4077 | 7917 | 3.5072 | - | - | - | - | - |
| 0.4097 | 7956 | 3.5529 | - | - | - | - | - |
| 0.4117 | 7995 | 3.5523 | 3.4759 | 0.6666 (+0.1262) | 0.3826 (+0.0576) | 0.7048 (+0.2041) | 0.5847 (+0.1293) |
| 0.4137 | 8034 | 3.4947 | - | - | - | - | - |
| 0.4157 | 8073 | 3.4753 | - | - | - | - | - |
| 0.4177 | 8112 | 3.431 | - | - | - | - | - |
| 0.4197 | 8151 | 3.4871 | - | - | - | - | - |
| 0.4218 | 8190 | 3.5072 | 3.4841 | 0.6703 (+0.1299) | 0.3880 (+0.0630) | 0.7216 (+0.2210) | 0.5933 (+0.1380) |
| 0.4238 | 8229 | 3.4812 | - | - | - | - | - |
| 0.4258 | 8268 | 3.4408 | - | - | - | - | - |
| 0.4278 | 8307 | 3.4781 | - | - | - | - | - |
| 0.4298 | 8346 | 3.4667 | - | - | - | - | - |
| 0.4318 | 8385 | 3.4618 | 3.4575 | 0.6709 (+0.1305) | 0.3785 (+0.0535) | 0.7062 (+0.2055) | 0.5852 (+0.1298) |
| 0.4338 | 8424 | 3.4711 | - | - | - | - | - |
| 0.4358 | 8463 | 3.4586 | - | - | - | - | - |
| 0.4378 | 8502 | 3.4246 | - | - | - | - | - |
| 0.4398 | 8541 | 3.4322 | - | - | - | - | - |
| 0.4418 | 8580 | 3.3901 | 3.4289 | 0.6657 (+0.1253) | 0.3780 (+0.0530) | 0.7118 (+0.2111) | 0.5852 (+0.1298) |
| 0.4438 | 8619 | 3.3776 | - | - | - | - | - |
| 0.4459 | 8658 | 3.3819 | - | - | - | - | - |
| 0.4479 | 8697 | 3.393 | - | - | - | - | - |
| 0.4499 | 8736 | 3.3559 | - | - | - | - | - |
| 0.4519 | 8775 | 3.4004 | 3.2981 | 0.6748 (+0.1344) | 0.3870 (+0.0620) | 0.7121 (+0.2114) | 0.5913 (+0.1359) |
| 0.4539 | 8814 | 3.426 | - | - | - | - | - |
| 0.4559 | 8853 | 3.3938 | - | - | - | - | - |
| 0.4579 | 8892 | 3.3304 | - | - | - | - | - |
| 0.4599 | 8931 | 3.3455 | - | - | - | - | - |
| 0.4619 | 8970 | 3.3314 | 3.3531 | 0.6700 (+0.1296) | 0.3775 (+0.0524) | 0.7173 (+0.2166) | 0.5882 (+0.1329) |
| 0.4639 | 9009 | 3.3455 | - | - | - | - | - |
| 0.4659 | 9048 | 3.3363 | - | - | - | - | - |
| 0.4679 | 9087 | 3.3492 | - | - | - | - | - |
| 0.4700 | 9126 | 3.3272 | - | - | - | - | - |
| 0.4720 | 9165 | 3.3253 | 3.2839 | 0.6963 (+0.1558) | 0.3831 (+0.0581) | 0.7113 (+0.2107) | 0.5969 (+0.1415) |
| 0.4740 | 9204 | 3.3214 | - | - | - | - | - |
| 0.4760 | 9243 | 3.2383 | - | - | - | - | - |
| 0.4780 | 9282 | 3.3198 | - | - | - | - | - |
| 0.4800 | 9321 | 3.2898 | - | - | - | - | - |
| 0.4820 | 9360 | 3.2641 | 3.2287 | 0.6832 (+0.1428) | 0.3880 (+0.0630) | 0.7206 (+0.2200) | 0.5973 (+0.1419) |
| 0.4840 | 9399 | 3.2974 | - | - | - | - | - |
| 0.4860 | 9438 | 3.3147 | - | - | - | - | - |
| 0.4880 | 9477 | 3.2911 | - | - | - | - | - |
| 0.4900 | 9516 | 3.2745 | - | - | - | - | - |
| 0.4920 | 9555 | 3.2168 | 3.2729 | 0.6730 (+0.1326) | 0.3920 (+0.0669) | 0.7173 (+0.2167) | 0.5941 (+0.1387) |
| 0.4941 | 9594 | 3.2036 | - | - | - | - | - |
| 0.4961 | 9633 | 3.2618 | - | - | - | - | - |
| 0.4981 | 9672 | 3.2786 | - | - | - | - | - |
| 0.5001 | 9711 | 3.2214 | - | - | - | - | - |
| 0.5021 | 9750 | 3.1759 | 3.2106 | 0.6764 (+0.1360) | 0.3852 (+0.0602) | 0.7226 (+0.2220) | 0.5948 (+0.1394) |
| 0.5041 | 9789 | 3.2159 | - | - | - | - | - |
| 0.5061 | 9828 | 3.1856 | - | - | - | - | - |
| 0.5081 | 9867 | 3.209 | - | - | - | - | - |
| 0.5101 | 9906 | 3.2239 | - | - | - | - | - |
| 0.5121 | 9945 | 3.2142 | 3.2159 | 0.6697 (+0.1292) | 0.3913 (+0.0662) | 0.7009 (+0.2003) | 0.5873 (+0.1319) |
| 0.5141 | 9984 | 3.2156 | - | - | - | - | - |
| 0.5161 | 10023 | 3.1568 | - | - | - | - | - |
| 0.5182 | 10062 | 3.1344 | - | - | - | - | - |
| 0.5202 | 10101 | 3.1754 | - | - | - | - | - |
| 0.5222 | 10140 | 3.1414 | 3.2009 | 0.6913 (+0.1509) | 0.3913 (+0.0663) | 0.7180 (+0.2173) | 0.6002 (+0.1448) |
| 0.5242 | 10179 | 3.2359 | - | - | - | - | - |
| 0.5262 | 10218 | 3.149 | - | - | - | - | - |
| 0.5282 | 10257 | 3.1741 | - | - | - | - | - |
| 0.5302 | 10296 | 3.081 | - | - | - | - | - |
| 0.5322 | 10335 | 3.1457 | 3.1939 | 0.6685 (+0.1281) | 0.3858 (+0.0608) | 0.7304 (+0.2297) | 0.5949 (+0.1395) |
| 0.5342 | 10374 | 3.1322 | - | - | - | - | - |
| 0.5362 | 10413 | 3.1307 | - | - | - | - | - |
| 0.5382 | 10452 | 3.1212 | - | - | - | - | - |
| 0.5402 | 10491 | 3.105 | - | - | - | - | - |
| 0.5423 | 10530 | 3.1339 | 3.1324 | 0.6693 (+0.1289) | 0.3942 (+0.0692) | 0.7101 (+0.2094) | 0.5912 (+0.1358) |
| 0.5443 | 10569 | 3.111 | - | - | - | - | - |
| 0.5463 | 10608 | 3.1045 | - | - | - | - | - |
| 0.5483 | 10647 | 3.0923 | - | - | - | - | - |
| 0.5503 | 10686 | 3.075 | - | - | - | - | - |
| 0.5523 | 10725 | 3.062 | 3.1363 | 0.6931 (+0.1527) | 0.4079 (+0.0828) | 0.7233 (+0.2227) | 0.6081 (+0.1527) |
| 0.5543 | 10764 | 3.1167 | - | - | - | - | - |
| 0.5563 | 10803 | 3.0903 | - | - | - | - | - |
| 0.5583 | 10842 | 3.0604 | - | - | - | - | - |
| 0.5603 | 10881 | 3.0724 | - | - | - | - | - |
| 0.5623 | 10920 | 3.1052 | 3.1181 | 0.6856 (+0.1452) | 0.4003 (+0.0753) | 0.7249 (+0.2242) | 0.6036 (+0.1482) |
| 0.5643 | 10959 | 3.0984 | - | - | - | - | - |
| 0.5664 | 10998 | 3.0779 | - | - | - | - | - |
| 0.5684 | 11037 | 2.9911 | - | - | - | - | - |
| 0.5704 | 11076 | 3.0431 | - | - | - | - | - |
| 0.5724 | 11115 | 3.0793 | 3.1106 | 0.6709 (+0.1305) | 0.3998 (+0.0748) | 0.7186 (+0.2180) | 0.5965 (+0.1411) |
| 0.5744 | 11154 | 3.0504 | - | - | - | - | - |
| 0.5764 | 11193 | 3.0118 | - | - | - | - | - |
| 0.5784 | 11232 | 3.0704 | - | - | - | - | - |
| 0.5804 | 11271 | 3.0498 | - | - | - | - | - |
| 0.5824 | 11310 | 3.0393 | 3.0475 | 0.6806 (+0.1401) | 0.4091 (+0.0841) | 0.7122 (+0.2116) | 0.6006 (+0.1453) |
| 0.5844 | 11349 | 2.9919 | - | - | - | - | - |
| 0.5864 | 11388 | 3.0291 | - | - | - | - | - |
| 0.5884 | 11427 | 3.0119 | - | - | - | - | - |
| 0.5905 | 11466 | 3.0201 | - | - | - | - | - |
| 0.5925 | 11505 | 3.0319 | 3.0064 | 0.6852 (+0.1448) | 0.4061 (+0.0811) | 0.7163 (+0.2156) | 0.6025 (+0.1472) |
| 0.5945 | 11544 | 3.0309 | - | - | - | - | - |
| 0.5965 | 11583 | 3.0097 | - | - | - | - | - |
| 0.5985 | 11622 | 2.9759 | - | - | - | - | - |
| 0.6005 | 11661 | 2.9937 | - | - | - | - | - |
| 0.6025 | 11700 | 2.9885 | 3.0262 | 0.6729 (+0.1325) | 0.3985 (+0.0735) | 0.7227 (+0.2221) | 0.5980 (+0.1427) |
| 0.6045 | 11739 | 3.0092 | - | - | - | - | - |
| 0.6065 | 11778 | 2.9569 | - | - | - | - | - |
| 0.6085 | 11817 | 2.9665 | - | - | - | - | - |
| 0.6105 | 11856 | 2.9774 | - | - | - | - | - |
| 0.6125 | 11895 | 2.9915 | 2.9842 | 0.6857 (+0.1452) | 0.4055 (+0.0804) | 0.7379 (+0.2372) | 0.6097 (+0.1543) |
| 0.6146 | 11934 | 2.9555 | - | - | - | - | - |
| 0.6166 | 11973 | 2.9833 | - | - | - | - | - |
| 0.6186 | 12012 | 2.9858 | - | - | - | - | - |
| 0.6206 | 12051 | 2.9743 | - | - | - | - | - |
| 0.6226 | 12090 | 2.9686 | 2.9523 | 0.6821 (+0.1417) | 0.4044 (+0.0794) | 0.7238 (+0.2232) | 0.6035 (+0.1481) |
| 0.6246 | 12129 | 2.9867 | - | - | - | - | - |
| 0.6266 | 12168 | 2.9548 | - | - | - | - | - |
| 0.6286 | 12207 | 2.9557 | - | - | - | - | - |
| 0.6306 | 12246 | 2.9506 | - | - | - | - | - |
| 0.6326 | 12285 | 2.9692 | 2.9323 | 0.6862 (+0.1458) | 0.4063 (+0.0813) | 0.7257 (+0.2251) | 0.6061 (+0.1507) |
| 0.6346 | 12324 | 2.9458 | - | - | - | - | - |
| 0.6366 | 12363 | 2.9131 | - | - | - | - | - |
| 0.6387 | 12402 | 2.9118 | - | - | - | - | - |
| 0.6407 | 12441 | 2.8914 | - | - | - | - | - |
| 0.6427 | 12480 | 2.8845 | 2.9832 | 0.6796 (+0.1392) | 0.4009 (+0.0759) | 0.7250 (+0.2244) | 0.6019 (+0.1465) |
| 0.6447 | 12519 | 2.9312 | - | - | - | - | - |
| 0.6467 | 12558 | 2.9344 | - | - | - | - | - |
| 0.6487 | 12597 | 2.9033 | - | - | - | - | - |
| 0.6507 | 12636 | 2.892 | - | - | - | - | - |
| 0.6527 | 12675 | 2.8944 | 2.9286 | 0.6874 (+0.1469) | 0.4015 (+0.0764) | 0.7247 (+0.2241) | 0.6045 (+0.1491) |
| 0.6547 | 12714 | 2.9269 | - | - | - | - | - |
| 0.6567 | 12753 | 2.8988 | - | - | - | - | - |
| 0.6587 | 12792 | 2.9167 | - | - | - | - | - |
| 0.6607 | 12831 | 2.8703 | - | - | - | - | - |
| 0.6628 | 12870 | 2.8619 | 2.8724 | 0.6829 (+0.1425) | 0.4005 (+0.0754) | 0.7176 (+0.2170) | 0.6003 (+0.1450) |
| 0.6648 | 12909 | 2.868 | - | - | - | - | - |
| 0.6668 | 12948 | 2.8775 | - | - | - | - | - |
| 0.6688 | 12987 | 2.866 | - | - | - | - | - |
| 0.6708 | 13026 | 2.8877 | - | - | - | - | - |
| 0.6728 | 13065 | 2.896 | 2.9117 | 0.6846 (+0.1442) | 0.4023 (+0.0773) | 0.7198 (+0.2192) | 0.6022 (+0.1469) |
| 0.6748 | 13104 | 2.8351 | - | - | - | - | - |
| 0.6768 | 13143 | 2.8679 | - | - | - | - | - |
| 0.6788 | 13182 | 2.9197 | - | - | - | - | - |
| 0.6808 | 13221 | 2.822 | - | - | - | - | - |
| 0.6828 | 13260 | 2.8443 | 2.9010 | 0.6775 (+0.1371) | 0.4039 (+0.0789) | 0.7116 (+0.2109) | 0.5977 (+0.1423) |
| 0.6848 | 13299 | 2.8646 | - | - | - | - | - |
| 0.6869 | 13338 | 2.8645 | - | - | - | - | - |
| 0.6889 | 13377 | 2.8659 | - | - | - | - | - |
| 0.6909 | 13416 | 2.8264 | - | - | - | - | - |
| 0.6929 | 13455 | 2.8222 | 2.8632 | 0.6947 (+0.1543) | 0.4057 (+0.0807) | 0.7314 (+0.2307) | 0.6106 (+0.1552) |
| 0.6949 | 13494 | 2.8565 | - | - | - | - | - |
| 0.6969 | 13533 | 2.8385 | - | - | - | - | - |
| 0.6989 | 13572 | 2.8305 | - | - | - | - | - |
| 0.7009 | 13611 | 2.8368 | - | - | - | - | - |
| 0.7029 | 13650 | 2.8416 | 2.8309 | 0.6784 (+0.1380) | 0.4113 (+0.0863) | 0.7270 (+0.2264) | 0.6056 (+0.1502) |
| 0.7049 | 13689 | 2.8007 | - | - | - | - | - |
| 0.7069 | 13728 | 2.8565 | - | - | - | - | - |
| 0.7089 | 13767 | 2.8893 | - | - | - | - | - |
| 0.7110 | 13806 | 2.844 | - | - | - | - | - |
| 0.7130 | 13845 | 2.8293 | 2.8423 | 0.6693 (+0.1289) | 0.4053 (+0.0803) | 0.7244 (+0.2238) | 0.5997 (+0.1443) |
| 0.7150 | 13884 | 2.8424 | - | - | - | - | - |
| 0.7170 | 13923 | 2.7951 | - | - | - | - | - |
| 0.7190 | 13962 | 2.8004 | - | - | - | - | - |
| 0.7210 | 14001 | 2.7833 | - | - | - | - | - |
| 0.7230 | 14040 | 2.8133 | 2.8021 | 0.6788 (+0.1383) | 0.4013 (+0.0763) | 0.7285 (+0.2278) | 0.6028 (+0.1475) |
| 0.7250 | 14079 | 2.8245 | - | - | - | - | - |
| 0.7270 | 14118 | 2.7995 | - | - | - | - | - |
| 0.7290 | 14157 | 2.7859 | - | - | - | - | - |
| 0.7310 | 14196 | 2.8067 | - | - | - | - | - |
| 0.7330 | 14235 | 2.7606 | 2.8098 | 0.6750 (+0.1346) | 0.4012 (+0.0762) | 0.7381 (+0.2375) | 0.6048 (+0.1494) |
| 0.7351 | 14274 | 2.7662 | - | - | - | - | - |
| 0.7371 | 14313 | 2.8081 | - | - | - | - | - |
| 0.7391 | 14352 | 2.8159 | - | - | - | - | - |
| 0.7411 | 14391 | 2.7604 | - | - | - | - | - |
| 0.7431 | 14430 | 2.7721 | 2.7912 | 0.6759 (+0.1355) | 0.4040 (+0.0790) | 0.7283 (+0.2276) | 0.6027 (+0.1474) |
| 0.7451 | 14469 | 2.7772 | - | - | - | - | - |
| 0.7471 | 14508 | 2.8183 | - | - | - | - | - |
| 0.7491 | 14547 | 2.7821 | - | - | - | - | - |
| 0.7511 | 14586 | 2.7434 | - | - | - | - | - |
| 0.7531 | 14625 | 2.812 | 2.7940 | 0.6673 (+0.1268) | 0.4034 (+0.0783) | 0.7277 (+0.2271) | 0.5995 (+0.1441) |
| 0.7551 | 14664 | 2.7847 | - | - | - | - | - |
| 0.7571 | 14703 | 2.7604 | - | - | - | - | - |
| 0.7592 | 14742 | 2.7271 | - | - | - | - | - |
| 0.7612 | 14781 | 2.7663 | - | - | - | - | - |
| 0.7632 | 14820 | 2.7731 | 2.7489 | 0.6694 (+0.1290) | 0.3963 (+0.0713) | 0.7380 (+0.2373) | 0.6012 (+0.1459) |
| 0.7652 | 14859 | 2.8013 | - | - | - | - | - |
| 0.7672 | 14898 | 2.762 | - | - | - | - | - |
| 0.7692 | 14937 | 2.7646 | - | - | - | - | - |
| 0.7712 | 14976 | 2.762 | - | - | - | - | - |
| 0.7732 | 15015 | 2.77 | 2.7367 | 0.6815 (+0.1411) | 0.4066 (+0.0816) | 0.7262 (+0.2256) | 0.6048 (+0.1494) |
| 0.7752 | 15054 | 2.7827 | - | - | - | - | - |
| 0.7772 | 15093 | 2.7027 | - | - | - | - | - |
| 0.7792 | 15132 | 2.7395 | - | - | - | - | - |
| 0.7812 | 15171 | 2.7425 | - | - | - | - | - |
| 0.7833 | 15210 | 2.7757 | 2.7380 | 0.6720 (+0.1315) | 0.4057 (+0.0807) | 0.7374 (+0.2367) | 0.6050 (+0.1497) |
| 0.7853 | 15249 | 2.7185 | - | - | - | - | - |
| 0.7873 | 15288 | 2.7287 | - | - | - | - | - |
| 0.7893 | 15327 | 2.7311 | - | - | - | - | - |
| 0.7913 | 15366 | 2.7281 | - | - | - | - | - |
| 0.7933 | 15405 | 2.7039 | 2.7142 | 0.6854 (+0.1450) | 0.4034 (+0.0784) | 0.7364 (+0.2358) | 0.6084 (+0.1531) |
| 0.7953 | 15444 | 2.7028 | - | - | - | - | - |
| 0.7973 | 15483 | 2.7182 | - | - | - | - | - |
| 0.7993 | 15522 | 2.688 | - | - | - | - | - |
| 0.8013 | 15561 | 2.7562 | - | - | - | - | - |
| 0.8033 | 15600 | 2.6968 | 2.7413 | 0.6669 (+0.1264) | 0.4118 (+0.0867) | 0.7269 (+0.2263) | 0.6018 (+0.1465) |
| 0.8053 | 15639 | 2.6802 | - | - | - | - | - |
| 0.8074 | 15678 | 2.6936 | - | - | - | - | - |
| 0.8094 | 15717 | 2.7059 | - | - | - | - | - |
| 0.8114 | 15756 | 2.7052 | - | - | - | - | - |
| 0.8134 | 15795 | 2.7178 | 2.6968 | 0.6738 (+0.1334) | 0.4064 (+0.0813) | 0.7283 (+0.2277) | 0.6028 (+0.1475) |
| 0.8154 | 15834 | 2.669 | - | - | - | - | - |
| 0.8174 | 15873 | 2.6881 | - | - | - | - | - |
| 0.8194 | 15912 | 2.6973 | - | - | - | - | - |
| 0.8214 | 15951 | 2.6861 | - | - | - | - | - |
| 0.8234 | 15990 | 2.6695 | 2.7110 | 0.6821 (+0.1417) | 0.4104 (+0.0853) | 0.7304 (+0.2298) | 0.6076 (+0.1523) |
| 0.8254 | 16029 | 2.6974 | - | - | - | - | - |
| 0.8274 | 16068 | 2.7043 | - | - | - | - | - |
| 0.8294 | 16107 | 2.6929 | - | - | - | - | - |
| 0.8315 | 16146 | 2.667 | - | - | - | - | - |
| 0.8335 | 16185 | 2.7035 | 2.6746 | 0.6891 (+0.1487) | 0.4084 (+0.0834) | 0.7262 (+0.2256) | 0.6079 (+0.1526) |
| 0.8355 | 16224 | 2.6472 | - | - | - | - | - |
| 0.8375 | 16263 | 2.6956 | - | - | - | - | - |
| 0.8395 | 16302 | 2.6823 | - | - | - | - | - |
| 0.8415 | 16341 | 2.695 | - | - | - | - | - |
| 0.8435 | 16380 | 2.6439 | 2.6841 | 0.6835 (+0.1430) | 0.4015 (+0.0764) | 0.7288 (+0.2282) | 0.6046 (+0.1492) |
| 0.8455 | 16419 | 2.6767 | - | - | - | - | - |
| 0.8475 | 16458 | 2.6556 | - | - | - | - | - |
| 0.8495 | 16497 | 2.6431 | - | - | - | - | - |
| 0.8515 | 16536 | 2.6594 | - | - | - | - | - |
| 0.8535 | 16575 | 2.6403 | 2.6657 | 0.6709 (+0.1304) | 0.4044 (+0.0794) | 0.7323 (+0.2316) | 0.6025 (+0.1471) |
| 0.8556 | 16614 | 2.6687 | - | - | - | - | - |
| 0.8576 | 16653 | 2.6887 | - | - | - | - | - |
| 0.8596 | 16692 | 2.6761 | - | - | - | - | - |
| 0.8616 | 16731 | 2.6371 | - | - | - | - | - |
| 0.8636 | 16770 | 2.6368 | 2.6791 | 0.6768 (+0.1364) | 0.4068 (+0.0818) | 0.7314 (+0.2308) | 0.6050 (+0.1497) |
| 0.8656 | 16809 | 2.6325 | - | - | - | - | - |
| 0.8676 | 16848 | 2.641 | - | - | - | - | - |
| 0.8696 | 16887 | 2.6614 | - | - | - | - | - |
| 0.8716 | 16926 | 2.7 | - | - | - | - | - |
| 0.8736 | 16965 | 2.678 | 2.6354 | 0.6938 (+0.1533) | 0.4108 (+0.0857) | 0.7274 (+0.2268) | 0.6107 (+0.1553) |
| 0.8756 | 17004 | 2.6619 | - | - | - | - | - |
| 0.8776 | 17043 | 2.681 | - | - | - | - | - |
| 0.8797 | 17082 | 2.6465 | - | - | - | - | - |
| 0.8817 | 17121 | 2.6977 | - | - | - | - | - |
| 0.8837 | 17160 | 2.6476 | 2.6324 | 0.6969 (+0.1565) | 0.4118 (+0.0868) | 0.7293 (+0.2287) | 0.6127 (+0.1573) |
| 0.8857 | 17199 | 2.6405 | - | - | - | - | - |
| 0.8877 | 17238 | 2.6645 | - | - | - | - | - |
| 0.8897 | 17277 | 2.6635 | - | - | - | - | - |
| 0.8917 | 17316 | 2.6038 | - | - | - | - | - |
| **0.8937** | **17355** | **2.6807** | **2.642** | **0.6951 (+0.1547)** | **0.4138 (+0.0888)** | **0.7294 (+0.2287)** | **0.6128 (+0.1574)** |
| 0.8957 | 17394 | 2.6679 | - | - | - | - | - |
| 0.8977 | 17433 | 2.6063 | - | - | - | - | - |
| 0.8997 | 17472 | 2.6215 | - | - | - | - | - |
| 0.9017 | 17511 | 2.6141 | - | - | - | - | - |
| 0.9038 | 17550 | 2.6324 | 2.6330 | 0.6963 (+0.1559) | 0.4117 (+0.0866) | 0.7287 (+0.2280) | 0.6122 (+0.1568) |
| 0.9058 | 17589 | 2.6405 | - | - | - | - | - |
| 0.9078 | 17628 | 2.619 | - | - | - | - | - |
| 0.9098 | 17667 | 2.6153 | - | - | - | - | - |
| 0.9118 | 17706 | 2.6323 | - | - | - | - | - |
| 0.9138 | 17745 | 2.5891 | 2.6280 | 0.6886 (+0.1482) | 0.4110 (+0.0860) | 0.7275 (+0.2268) | 0.6090 (+0.1537) |
| 0.9158 | 17784 | 2.6385 | - | - | - | - | - |
| 0.9178 | 17823 | 2.6138 | - | - | - | - | - |
| 0.9198 | 17862 | 2.6181 | - | - | - | - | - |
| 0.9218 | 17901 | 2.6414 | - | - | - | - | - |
| 0.9238 | 17940 | 2.6362 | 2.6224 | 0.6907 (+0.1502) | 0.4085 (+0.0835) | 0.7360 (+0.2354) | 0.6117 (+0.1564) |
| 0.9258 | 17979 | 2.6156 | - | - | - | - | - |
| 0.9279 | 18018 | 2.597 | - | - | - | - | - |
| 0.9299 | 18057 | 2.6254 | - | - | - | - | - |
| 0.9319 | 18096 | 2.6434 | - | - | - | - | - |
| 0.9339 | 18135 | 2.6474 | 2.6014 | 0.6884 (+0.1479) | 0.4133 (+0.0882) | 0.7248 (+0.2242) | 0.6088 (+0.1534) |
| 0.9359 | 18174 | 2.6214 | - | - | - | - | - |
| 0.9379 | 18213 | 2.6145 | - | - | - | - | - |
| 0.9399 | 18252 | 2.617 | - | - | - | - | - |
| 0.9419 | 18291 | 2.6209 | - | - | - | - | - |
| 0.9439 | 18330 | 2.5976 | 2.6057 | 0.6871 (+0.1467) | 0.4072 (+0.0821) | 0.7290 (+0.2284) | 0.6078 (+0.1524) |
| 0.9459 | 18369 | 2.61 | - | - | - | - | - |
| 0.9479 | 18408 | 2.614 | - | - | - | - | - |
| 0.9499 | 18447 | 2.6187 | - | - | - | - | - |
| 0.9520 | 18486 | 2.6004 | - | - | - | - | - |
| 0.9540 | 18525 | 2.6515 | 2.6052 | 0.6871 (+0.1467) | 0.4091 (+0.0841) | 0.7270 (+0.2263) | 0.6077 (+0.1524) |
| 0.9560 | 18564 | 2.6141 | - | - | - | - | - |
| 0.9580 | 18603 | 2.6016 | - | - | - | - | - |
| 0.9600 | 18642 | 2.5918 | - | - | - | - | - |
| 0.9620 | 18681 | 2.5684 | - | - | - | - | - |
| 0.9640 | 18720 | 2.6087 | 2.5941 | 0.6797 (+0.1393) | 0.4105 (+0.0854) | 0.7284 (+0.2277) | 0.6062 (+0.1508) |
| 0.9660 | 18759 | 2.5961 | - | - | - | - | - |
| 0.9680 | 18798 | 2.6121 | - | - | - | - | - |
| 0.9700 | 18837 | 2.5896 | - | - | - | - | - |
| 0.9720 | 18876 | 2.6101 | - | - | - | - | - |
| 0.9740 | 18915 | 2.6106 | 2.5921 | 0.6856 (+0.1452) | 0.4093 (+0.0843) | 0.7284 (+0.2277) | 0.6078 (+0.1524) |
| 0.9761 | 18954 | 2.6046 | - | - | - | - | - |
| 0.9781 | 18993 | 2.6155 | - | - | - | - | - |
| 0.9801 | 19032 | 2.6166 | - | - | - | - | - |
| 0.9821 | 19071 | 2.5866 | - | - | - | - | - |
| 0.9841 | 19110 | 2.6369 | 2.5943 | 0.6838 (+0.1433) | 0.4117 (+0.0867) | 0.7284 (+0.2277) | 0.6079 (+0.1526) |
| 0.9861 | 19149 | 2.6319 | - | - | - | - | - |
| 0.9881 | 19188 | 2.6035 | - | - | - | - | - |
| 0.9901 | 19227 | 2.565 | - | - | - | - | - |
| 0.9921 | 19266 | 2.6071 | - | - | - | - | - |
| 0.9941 | 19305 | 2.5908 | 2.5866 | 0.6838 (+0.1433) | 0.4116 (+0.0866) | 0.7284 (+0.2277) | 0.6079 (+0.1525) |
| 0.9961 | 19344 | 2.5983 | - | - | - | - | - |
| 0.9981 | 19383 | 2.5995 | - | - | - | - | - |
| -1 | -1 | - | - | 0.6951 (+0.1547) | 0.4138 (+0.0888) | 0.7294 (+0.2287) | 0.6128 (+0.1574) |
* The bold row denotes the saved checkpoint.
</details>
### Environmental Impact
Carbon emissions were measured using [CodeCarbon](https://github.com/mlco2/codecarbon).
- **Energy Consumed**: 6.535 kWh
- **Carbon Emitted**: 2.412 kg of CO2
- **Hours Used**: 1.932 hours
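As a sanity check, the implied average carbon intensity of the training run can be derived from the two CodeCarbon figures above (a rough back-of-the-envelope calculation, not an official CodeCarbon output):

```python
# Implied grid carbon intensity from the CodeCarbon figures reported above
emissions_kg = 2.412   # kg CO2 (from this card)
energy_kwh = 6.535     # kWh (from this card)

intensity = emissions_kg / energy_kwh
print(f"~{intensity:.3f} kg CO2 per kWh")  # ~0.369 kg CO2/kWh
```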
### Training Hardware
- **On Cloud**: No
- **GPU Model**: 8 x NVIDIA H100 80GB HBM3
- **CPU Model**: AMD EPYC 7R13 Processor
- **RAM Size**: 1999.99 GB
### Framework Versions
- Python: 3.10.14
- Sentence Transformers: 5.1.2
- Transformers: 4.57.1
- PyTorch: 2.9.1+cu126
- Accelerate: 1.12.0
- Datasets: 4.4.1
- Tokenizers: 0.22.1
## Citation
### BibTeX
#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```
#### MarginMSELoss
```bibtex
@misc{hofstätter2021improving,
    title={Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation},
    author={Sebastian Hofstätter and Sophia Althammer and Michael Schröder and Mete Sertkan and Allan Hanbury},
    year={2021},
    eprint={2010.02666},
    archivePrefix={arXiv},
    primaryClass={cs.IR}
}
```
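The Margin-MSE objective from the paper above distills a teacher reranker into the student by matching score *margins*: for each (query, positive, negative) triple, the student's score difference between the positive and negative passage is regressed onto the teacher's. A minimal, framework-free sketch of the computation (the actual training uses `sentence_transformers`' `MarginMSELoss`, which operates on batched tensors):

```python
def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
    """Mean squared error between student and teacher score margins.

    Each argument is a list of per-example relevance scores; the loss
    compares (pos - neg) margins rather than absolute scores.
    """
    diffs = [
        (sp - sn) - (tp - tn)
        for sp, sn, tp, tn in zip(student_pos, student_neg, teacher_pos, teacher_neg)
    ]
    return sum(d * d for d in diffs) / len(diffs)


# One triple: student margin = 1.0, teacher margin = 2.0 -> loss = 1.0
print(margin_mse_loss([2.0], [1.0], [3.0], [1.0]))
```

Because only margins are matched, the student is free to produce scores on a different absolute scale than the teacher, which is why the training-loss values in the table above are not directly comparable across models.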