Phoenix21 committed (verified)
Commit 0f042bc · 1 Parent(s): ca55eb0

Upload Distance-Aware Chronos model

README.md ADDED
@@ -0,0 +1,73 @@
+ ---
+ license: apache-2.0
+ tags:
+ - time-series
+ - forecasting
+ - chronos
+ - distance-aware
+ library_name: transformers
+ ---
+
+ # Distance-Aware Chronos
+
+ This is a distance-aware enhancement of the Chronos time series forecasting model.
+
+ ## Model Description
+
+ This model extends the original [Chronos](https://github.com/amazon-science/chronos-forecasting)
+ architecture with distance-aware loss functions and output layers that explicitly consider the
+ ordinal nature of quantized time series bins.
+
+ - **Base Model:** amazon/chronos-t5-small
+ - **Number of Bins:** 4096
+ - **Training Epoch:** 10
+ - **Validation Loss:** 2.4703
+
+ ## Key Features
+
+ - **Distance-Aware Loss:** Combines ordinal cross-entropy, smooth label loss, and Earth Mover's Distance (sketched below)
+ - **Ordinal Output Layer:** Uses Gaussian kernels and sinusoidal position encodings
+ - **Improved Bin Predictions:** Better handling of relationships between nearby bins
+
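+ The exact loss and output-layer code ship with the repository; as a rough illustration of the loss idea only (the function name, `sigma`, and the EMD weight below are illustrative, not this model's actual API), a distance-aware objective can be sketched as:
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ def distance_aware_loss(logits, target_bins, sigma=2.0, emd_weight=0.1):
+     """Illustrative sketch: cross-entropy against Gaussian-smoothed bin labels
+     plus an Earth Mover's Distance term on the cumulative distributions."""
+     num_bins = logits.shape[-1]
+     bins = torch.arange(num_bins, device=logits.device, dtype=torch.float32)
+     # Soft targets: a Gaussian over bin indices centred on the true bin,
+     # so nearby bins share probability mass instead of a one-hot spike.
+     dist = bins.unsqueeze(0) - target_bins.unsqueeze(-1).float()
+     soft_targets = torch.softmax(-0.5 * (dist / sigma) ** 2, dim=-1)
+
+     log_probs = F.log_softmax(logits, dim=-1)
+     smooth_ce = -(soft_targets * log_probs).sum(dim=-1).mean()
+
+     # EMD via squared CDF differences: far-off bins pay more than near misses.
+     emd = ((log_probs.exp().cumsum(-1) - soft_targets.cumsum(-1)) ** 2).sum(-1).mean()
+     return smooth_ce + emd_weight * emd
+
+ # e.g. distance_aware_loss(torch.randn(8, 4096), torch.randint(0, 4096, (8,)))
+ ```
+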
+ ## Installation
+ ```bash
+ pip install torch transformers chronos-forecasting
+ ```
+
+ ## Usage
+ ```python
+ from distance_aware_chronos import DistanceAwareChronos
+ import numpy as np
+
+ # Load model
+ model = DistanceAwareChronos.from_pretrained("Phoenix21/distance-aware-chronos-t")
+
+ # Prepare your time series
+ context = np.array([1.0, 2.0, 3.0, 4.0, 5.0])  # Your historical data
+
+ # Generate forecasts
+ predictions = model.predict(context, horizon=24, num_samples=100)
+
+ print(f"Forecast shape: {predictions.shape}")
+ ```
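+
+ The forecast is made up of sampled paths. Assuming the sample dimension comes first in `predictions` (check the shape printed above; this is an assumption, not a guarantee of the API), the samples reduce to quantile forecasts with:
+
+ ```python
+ low, median, high = np.quantile(predictions, [0.1, 0.5, 0.9], axis=0)
+ ```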
+
+ ## Training Data
+
+ Trained on the [Chronos datasets](https://huggingface.co/datasets/autogluon/chronos_datasets)
+ collection on Hugging Face.
+
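+ A minimal sketch for pulling one of those subsets with the `datasets` library (subset names vary, so list them rather than guessing):
+
+ ```python
+ from datasets import get_dataset_config_names, load_dataset
+
+ subsets = get_dataset_config_names("autogluon/chronos_datasets")
+ print(subsets)  # pick a subset from this list
+ ds = load_dataset("autogluon/chronos_datasets", subsets[0], split="train")
+ ```
+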
+ ## Citation
+
+ If you use this model, please cite:
+ ```bibtex
+ @article{chronos2024,
+   title={Chronos: Learning the Language of Time Series},
+   author={Ansari, Abdul Fatir and others},
+   journal={Transactions on Machine Learning Research},
+   year={2024}
+ }
+ ```
+
+ ## License
+
+ Apache 2.0
base_model/config.json ADDED
@@ -0,0 +1,49 @@
+ {
+   "architectures": [
+     "T5ForConditionalGeneration"
+   ],
+   "chronos_config": {
+     "context_length": 512,
+     "eos_token_id": 1,
+     "model_type": "seq2seq",
+     "n_special_tokens": 2,
+     "n_tokens": 4096,
+     "num_samples": 20,
+     "pad_token_id": 0,
+     "prediction_length": 64,
+     "temperature": 1.0,
+     "tokenizer_class": "MeanScaleUniformBins",
+     "tokenizer_kwargs": {
+       "high_limit": 15.0,
+       "low_limit": -15.0
+     },
+     "top_k": 50,
+     "top_p": 1.0,
+     "use_eos_token": true
+   },
+   "classifier_dropout": 0.0,
+   "d_ff": 2048,
+   "d_kv": 64,
+   "d_model": 512,
+   "decoder_start_token_id": 0,
+   "dense_act_fn": "relu",
+   "dropout_rate": 0.1,
+   "dtype": "float32",
+   "eos_token_id": 1,
+   "feed_forward_proj": "relu",
+   "initializer_factor": 0.05,
+   "is_encoder_decoder": true,
+   "is_gated_act": false,
+   "layer_norm_epsilon": 1e-06,
+   "model_type": "t5",
+   "n_positions": 512,
+   "num_decoder_layers": 6,
+   "num_heads": 8,
+   "num_layers": 6,
+   "pad_token_id": 0,
+   "relative_attention_max_distance": 128,
+   "relative_attention_num_buckets": 32,
+   "transformers_version": "4.57.1",
+   "use_cache": true,
+   "vocab_size": 4096
+ }
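The `chronos_config` above pins down the quantizer: after mean-absolute scaling, values are kept within `low_limit`/`high_limit` and mapped onto 4096 uniform bins. A rough sketch of that idea follows; it is not the library's actual `MeanScaleUniformBins` code and ignores details such as the two special tokens.

```python
import numpy as np

def mean_scale_uniform_bins(context, n_tokens=4096, low=-15.0, high=15.0):
    """Illustrative quantizer: scale by mean absolute value, then bin uniformly."""
    scale = float(np.abs(context).mean()) or 1.0
    scaled = np.clip(context / scale, low, high)
    edges = np.linspace(low, high, n_tokens - 1)   # uniform bin boundaries
    return np.digitize(scaled, edges), scale       # bin index per value, plus the scale

tokens, scale = mean_scale_uniform_bins(np.array([1.0, 2.0, 3.0, 4.0, 5.0]))
print(tokens, scale)
```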
base_model/generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "_from_model_config": true,
+   "decoder_start_token_id": 0,
+   "eos_token_id": 1,
+   "pad_token_id": 0,
+   "transformers_version": "4.57.1"
+ }
base_model/model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:099347ca49e954892070992dae2fa31abecf5e54fcfffe876d69d9ddc3602ed2
+ size 184632360
config.json ADDED
@@ -0,0 +1,8 @@
+ {
+   "model_type": "distance_aware_chronos",
+   "base_model": "amazon/chronos-t5-small",
+   "num_bins": 4096,
+   "training_epoch": 10,
+   "val_loss": 2.470318087331065,
+   "timestamp": "2025-11-21T04:18:03.413552"
+ }
distance_output.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6cc68ee4f559f59b1aca371fa339dc84269f7649d688f1074328eb6233b321fc
+ size 10678922
example_usage.py ADDED
@@ -0,0 +1,28 @@
+ # Example: Using Distance-Aware Chronos
+
+ import numpy as np
+ from distance_aware_chronos import DistanceAwareChronos
+
+ # Load the model (replace with the actual repo id; the model card uses
+ # "Phoenix21/distance-aware-chronos-t").
+ model = DistanceAwareChronos.from_pretrained("YOUR_USERNAME/distance-aware-chronos")
+
+ # Example 1: Simple forecasting
+ context = np.random.randn(100)  # Your time series
+ forecast = model.predict(context, horizon=24, num_samples=100)
+
+ print(f"Forecast shape: {forecast.shape}")
+ print(f"Mean forecast: {forecast.mean()}")
+
+ # Example 2: With visualization
+ import matplotlib.pyplot as plt
+
+ # Reduce the sampled paths to a single point forecast before plotting.
+ # This assumes any sample dimension comes first; np.atleast_2d also covers
+ # the case where predict() already returns a single series.
+ point_forecast = np.median(np.atleast_2d(forecast), axis=0)
+
+ plt.figure(figsize=(12, 6))
+ plt.plot(range(len(context)), context, label='Historical', color='blue')
+ plt.plot(range(len(context), len(context) + len(point_forecast)),
+          point_forecast, label='Forecast', color='red', linestyle='--')
+ plt.xlabel('Time')
+ plt.ylabel('Value')
+ plt.title('Time Series Forecast')
+ plt.legend()
+ plt.grid(True, alpha=0.3)
+ plt.show()
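+
+ # Example 3 (illustrative sketch): plot an uncertainty band from the samples.
+ # Assumes `forecast` stacks the sampled paths along axis 0; skip this block
+ # if your build returns a single series.
+ samples = np.atleast_2d(forecast)
+ low, high = np.quantile(samples, [0.1, 0.9], axis=0)
+ steps = range(len(context), len(context) + samples.shape[1])
+
+ plt.figure(figsize=(12, 6))
+ plt.plot(range(len(context)), context, color='blue', label='Historical')
+ plt.fill_between(steps, low, high, color='red', alpha=0.2, label='10-90% band')
+ plt.plot(steps, np.median(samples, axis=0), color='red', label='Median forecast')
+ plt.legend()
+ plt.show()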