Update README.md

task_categories:
- robotics
---

<div align="center">

<h2>Magma: A Foundation Model for Multimodal AI Agents</h2>

[Jianwei Yang](https://jwyang.github.io/)<sup>*</sup><sup>1</sup><sup>†</sup>
[Reuben Tan](https://cs-people.bu.edu/rxtan/)<sup>1</sup><sup>†</sup>
[Qianhui Wu](https://qianhuiwu.github.io/)<sup>1</sup><sup>†</sup>
[Ruijie Zheng](https://ruijiezheng.com/)<sup>2</sup><sup>‡</sup>
[Baolin Peng](https://scholar.google.com/citations?user=u1CNjgwAAAAJ&hl=en&oi=ao)<sup>1</sup><sup>‡</sup>
[Yongyuan Liang](https://cheryyunl.github.io)<sup>2</sup><sup>‡</sup>

[Yu Gu](http://yu-gu.me/)<sup>1</sup>
[Mu Cai](https://pages.cs.wisc.edu/~mucai/)<sup>3</sup>
[Seonghyeon Ye](https://seonghyeonye.github.io/)<sup>4</sup>
[Joel Jang](https://joeljang.github.io/)<sup>5</sup>
[Yuquan Deng](https://scholar.google.com/citations?user=LTC0Q6YAAAAJ&hl=en)<sup>5</sup>
[Lars Liden](https://sites.google.com/site/larsliden)<sup>1</sup>
[Jianfeng Gao](https://www.microsoft.com/en-us/research/people/jfgao/)<sup>1</sup><sup>▽</sup>

<sup>1</sup> Microsoft Research; <sup>2</sup> University of Maryland; <sup>3</sup> University of Wisconsin-Madison
<sup>4</sup> KAIST; <sup>5</sup> University of Washington

<sup>*</sup> Project lead <sup>†</sup> First authors <sup>‡</sup> Second authors <sup>▽</sup> Leadership

\[[arXiv Paper](https://www.arxiv.org/pdf/2502.13130)\] \[[Project Page](https://microsoft.github.io/Magma/)\] \[[Hugging Face Paper](https://huggingface.co/papers/2502.13130)\] \[[Github Repo](https://github.com/microsoft/Magma)\] \[[Video](https://www.youtube.com/watch?v=SbfzvUU5yM8)\]

</div>

## Introduction

This dataset contains the robotic manipulation data used in Magma pretraining. For a fair comparison, we follow OpenVLA and use its "siglip-224px+mx-oxe-magic-soup" data mix.

The dataset is organized by the following source datasets, with each source containing one or more Arrow files (a quick way to inspect this layout is sketched after the table):

| Folder | Number of Shards |
|:-------|-----------------:|
| austin_buds_dataset_converted_externally_to_rlds | 1 |
| austin_sailor_dataset_converted_externally_to_rlds | 4 |
| austin_sirius_dataset_converted_externally_to_rlds | 3 |
| berkeley_autolab_ur5 | 1 |
| berkeley_cable_routing | 1 |
| berkeley_fanuc_manipulation | 1 |
| bridge_orig | 17 |
| cmu_stretch | 1 |
| dlr_edan_shared_control_converted_externally_to_rlds | 1 |
| fractal20220817_data | 21 |
| furniture_bench_dataset_converted_externally_to_rlds | 4 |
| iamlab_cmu_pickup_insert_converted_externally_to_rlds | 2 |
| jaco_play | 1 |
| kuka | 21 |
| language_table | 8 |
| nyu_franka_play_dataset_converted_externally_to_rlds | 1 |
| roboturk | 3 |
| stanford_hydra_dataset_converted_externally_to_rlds | 4 |
| taco_play | 3 |
| toto | 3 |
| ucsd_kitchen_dataset_converted_externally_to_rlds | 1 |
| utaustin_mutex | 4 |
| viola | 1 |
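
If you want to verify the folder/shard layout yourself, one option is to list the repository files with the standard `huggingface_hub` API. This is only an illustrative sketch (it assumes the shards are stored as `.arrow` files under these top-level folders), not something the dataset requires:

```py
from collections import Counter

from huggingface_hub import list_repo_files

# Count the Arrow shards under each top-level folder of the dataset repo.
files = list_repo_files("MagmaAI/Magma-OXE-ToM", repo_type="dataset")
shards = Counter(path.split("/")[0] for path in files if path.endswith(".arrow"))
for folder, n in sorted(shards.items()):
    print(f"{folder}: {n} shard(s)")
```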

### Features

In addition to the default features, we extracted the visual traces of the 16 future frames for each frame. The dataset contains the following fields (a decoding sketch follows the list):

- `dataset_name`: Original source dataset name
- `image`: Image of the robot scene (binary)
- `task_string`: Description of the task
- `traj_index`: Index of the trajectory in the dataset
- `action`: Robot action vector (serialized numpy array)
- `trace`: Robot trajectory trace (serialized numpy array)
- `trace_visibility`: Visibility mask for the trace (serialized numpy array)
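
The card describes `image` as binary and the array fields as serialized numpy arrays without pinning down the exact encoding. Below is a minimal decoding sketch with a hypothetical helper, assuming `image` holds encoded image bytes (e.g., PNG/JPEG) and the arrays were written with `np.save`; adjust it if the actual format differs:

```py
import io

import numpy as np
from PIL import Image


def decode_sample(sample: dict) -> dict:
    """Hypothetical helper: deserialize the binary fields of one record."""
    out = dict(sample)
    # Assumption: `image` holds encoded image bytes.
    out["image"] = Image.open(io.BytesIO(sample["image"]))
    # Assumption: the array fields were serialized with np.save.
    for key in ("action", "trace", "trace_visibility"):
        out[key] = np.load(io.BytesIO(sample[key]))
    return out
```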

## Dataset Loading

We can load the full dataset with:

```py
from datasets import load_dataset
dataset = load_dataset("MagmaAI/Magma-OXE-ToM", streaming=True, split="train")
```
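
Since `streaming=True` returns an iterable dataset, a quick sanity check (a minimal sketch, not part of the original card) is to pull a single record and inspect a few of its fields:

```py
# Grab one record from the stream and print two of the lightweight fields.
sample = next(iter(dataset))
print(sample["dataset_name"], sample["task_string"])
```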

Alternatively, load a single source dataset by passing its folder name as `data_dir`:

```py
from datasets import load_dataset
dataset = load_dataset("MagmaAI/Magma-OXE-ToM", data_dir="austin_buds_dataset_converted_externally_to_rlds", streaming=True, split="train")
```
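
To train on a mixture of several sources rather than one, one option is to load the folders separately and interleave them with the standard `datasets` API. This is only a sketch; the folder names are example entries taken from the table above:

```py
from datasets import interleave_datasets, load_dataset

# Example mixture: any folders from the table above can be used here.
folders = ["bridge_orig", "taco_play", "jaco_play"]
sources = [
    load_dataset("MagmaAI/Magma-OXE-ToM", data_dir=folder, streaming=True, split="train")
    for folder in folders
]
# Sample from the sources with equal probability until one of them is exhausted.
mixed = interleave_datasets(sources, probabilities=[1 / len(sources)] * len(sources), seed=42)
```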
|