Dataset Preview
The full dataset viewer is not available, so only a preview of the rows is shown.
The dataset generation failed
Error code:   DatasetGenerationError
Exception:    TypeError
Message:      Couldn't cast array of type
struct<base_pos: list<element: double>, base_quat: list<element: double>, parent: string, type: string>
to
{'base_pos': Sequence(feature=Value(dtype='float64', id=None), length=-1, id=None), 'base_quat': Sequence(feature=Value(dtype='float64', id=None), length=-1, id=None), 'type': Value(dtype='string', id=None)}
Traceback:    Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1492, in compute_config_parquet_and_info_response
                  fill_builder_info(builder, hf_endpoint=hf_endpoint, hf_token=hf_token, validate=validate)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 683, in fill_builder_info
                  ) = retry_validate_get_features_num_examples_size_and_compression_ratio(
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 602, in retry_validate_get_features_num_examples_size_and_compression_ratio
                  validate(pf)
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 640, in validate
                  raise TooBigRowGroupsError(
              worker.job_runners.config.parquet_and_info.TooBigRowGroupsError: Parquet file has too big row groups. First row group has 1894850886 which exceeds the limit of 300000000
              
              During handling of the above exception, another exception occurred:
              
              Traceback (most recent call last):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1995, in _prepare_split_single
                  for _, table in generator:
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 797, in wrapped
                  for item in generator(*args, **kwargs):
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 97, in _generate_tables
                  yield f"{file_idx}_{batch_idx}", self._cast_table(pa_table)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/packaged_modules/parquet/parquet.py", line 75, in _cast_table
                  pa_table = table_cast(pa_table, self.info.features.arrow_schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2302, in table_cast
                  return cast_table_to_schema(table, schema)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in cast_table_to_schema
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2261, in <listcomp>
                  arrays = [cast_array_to_feature(table[name], feature) for name, feature in features.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in wrapper
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1802, in <listcomp>
                  return pa.chunked_array([func(chunk, *args, **kwargs) for chunk in array.chunks])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2020, in cast_array_to_feature
                  arrays = [_c(array.field(name), subfeature) for name, subfeature in feature.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2020, in <listcomp>
                  arrays = [_c(array.field(name), subfeature) for name, subfeature in feature.items()]
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1804, in wrapper
                  return func(array, *args, **kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2025, in cast_array_to_feature
                  casted_array_values = _c(array.values, feature[0])
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 1804, in wrapper
                  return func(array, *args, **kwargs)
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/table.py", line 2122, in cast_array_to_feature
                  raise TypeError(f"Couldn't cast array of type\n{_short_str(array.type)}\nto\n{_short_str(feature)}")
              TypeError: Couldn't cast array of type
              struct<base_pos: list<element: double>, base_quat: list<element: double>, parent: string, type: string>
              to
              {'base_pos': Sequence(feature=Value(dtype='float64', id=None), length=-1, id=None), 'base_quat': Sequence(feature=Value(dtype='float64', id=None), length=-1, id=None), 'type': Value(dtype='string', id=None)}
              
              The above exception was the direct cause of the following exception:
              
              Traceback (most recent call last):
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1505, in compute_config_parquet_and_info_response
                  parquet_operations, partial, estimated_dataset_info = stream_convert_to_parquet(
                File "/src/services/worker/src/worker/job_runners/config/parquet_and_info.py", line 1099, in stream_convert_to_parquet
                  builder._prepare_split(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 1882, in _prepare_split
                  for job_id, done, content in self._prepare_split_single(
                File "/src/services/worker/.venv/lib/python3.9/site-packages/datasets/builder.py", line 2038, in _prepare_split_single
                  raise DatasetGenerationError("An error occurred while generating the dataset") from e
              datasets.exceptions.DatasetGenerationError: An error occurred while generating the dataset
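Two distinct problems appear in this trace. First, the parquet shards use very large row groups (the first one is reported at 1894850886, well above the viewer's limit of 300000000), so the viewer falls back to stream-converting the files. That conversion then fails because a stored struct (apparently the robot entries, with fields base_pos, base_quat, parent and type) carries a parent field that the declared features omit, and the cast raises the TypeError shown above. The following is a minimal sketch of how one could confirm both points on a locally downloaded shard; the file name is a placeholder.

# Minimal sketch: inspect a downloaded shard to confirm the stored schema and
# row-group sizes. "shard.parquet" is a placeholder for a real file path.
import pyarrow.parquet as pq

pf = pq.ParquetFile("shard.parquet")
print(pf.schema_arrow)                            # stored schema, including the extra parent field
print(pf.metadata.row_group(0).total_byte_size)   # size of the first row group in bytes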

Need help to make the dataset viewer work? Make sure to review how to configure the dataset viewer, and open a discussion for direct support.
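If you maintain this dataset, the trace suggests two fixes: declare features that include the parent field where the stored struct has it, and rewrite the shards with smaller row groups so each stays under the viewer's limit. Below is a rough sketch of the rewrite, assuming pyarrow; the file names are placeholders, and the row_group_size value (rows per group) is only illustrative and would need tuning to the actual row sizes.

# Rough sketch: rewrite a shard with smaller row groups so the viewer can stream it.
# File names are placeholders; row_group_size counts rows per group and is only a guess.
import pyarrow.parquet as pq

table = pq.read_table("shard.parquet")
pq.write_table(table, "shard_small_groups.parquet", row_group_size=50)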

Columns and types reported by the preview:

obj_file (dict)
robot_file (dict)
metadata (dict)
plan (list)
scene (dict)
sequence (dict)
trajectory (dict)
scene_file (string)
obstacles_file (string)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":50369,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a0_","tasks":[{"algorithm":"","end":179.0,"name":"handover","object_index":0,"start":122.(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":2,"primitive":"pickpick1","robots":["a2_","a3_"]},{"object":2,"primitive":"pickp(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":16546,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a0_","tasks":[{"algorithm":"rrt","end":265.0,"name":"pick","object_index":1,"start":184.0(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":0,"primitive":"pickpick1","robots":["a3_","a2_"]},{"object":2,"primitive":"hando(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":9157,"folder":"out/run_id_202405271352/success/envId_614/rand(...TRUNCATED)
[{"robot":"a3_","tasks":[{"algorithm":"","end":218.0,"name":"handover","object_index":2,"start":165.(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":0,"primitive":"pick","robots":["a0_"]},{"object":2,"primitive":"handover","robot(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":31439,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a3_","tasks":[{"algorithm":"","end":73.0,"name":"handover","object_index":2,"start":28.0}(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":2,"primitive":"handover","robots":["a0_","a3_"]},{"object":1,"primitive":"pickpi(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":1442,"folder":"out/run_id_202405271352/success/envId_614/rand(...TRUNCATED)
[{"robot":"a0_","tasks":[{"algorithm":"rrt","end":147.0,"name":"pick","object_index":1,"start":73.0}(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":2,"primitive":"pick","robots":["a3_"]},{"object":0,"primitive":"pick","robots":[(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":52652,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a1_","tasks":[{"algorithm":"rrt","end":91.0,"name":"pick","object_index":2,"start":58.0},(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":1,"primitive":"handover","robots":["a3_","a0_"]},{"object":0,"primitive":"pick",(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":53760,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a0_","tasks":[{"algorithm":"","end":365.0,"name":"handover","object_index":1,"start":321.(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":2,"primitive":"handover","robots":["a2_","a3_"]},{"object":0,"primitive":"handov(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":32753,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a1_","tasks":[{"algorithm":"rrt","end":238.0,"name":"pick","object_index":2,"start":195.0(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":0,"primitive":"pick","robots":["a2_"]},{"object":1,"primitive":"handover","robot(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":2538,"folder":"out/run_id_202405271352/success/envId_614/rand(...TRUNCATED)
[{"robot":"a0_","tasks":[{"algorithm":"rrt","end":133.0,"name":"pick","object_index":0,"start":108.0(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":0,"primitive":"pickpick1","robots":["a3_","a0_"]},{"object":2,"primitive":"hando(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
{"objects":[{"goal_pos":[-1.005004830126797,-0.4490026515700889,0.08],"goal_quat":[0.967977824185825(...TRUNCATED)
{"robots":[{"base_pos":[-0.5,-0.4,0.0],"base_quat":[1.0,0.0,0.0,0.0],"type":"ur5_vacuum"},{"base_pos(...TRUNCATED)
{"metadata":{"cumulative_compute_time":29896,"folder":"out/run_id_202405271352/success/envId_614/ran(...TRUNCATED)
[{"robot":"a2_","tasks":[{"algorithm":"","end":285.0,"name":"handover","object_index":0,"start":244.(...TRUNCATED)
{"Objects":{"obj1":{"goal":{"abs_pos":[-1.005004830126797,-0.3990026515700889,0.6299999999999999],"a(...TRUNCATED)
{"tasks":[{"object":2,"primitive":"handover","robots":["a0_","a3_"]},{"object":1,"primitive":"pickpi(...TRUNCATED)
{"objs":[{"name":"obj1","steps":[{"pos":[0.013139616294503442,0.2339829839707323,0.6299999999999999](...TRUNCATED)
"World \t{ X:<[0, 0, 0, 1, 0, 0, 0]> } \n\ntable_base (World) {\n Q:[0 0(...TRUNCATED)
End of preview.

Dataset Card for TAPAS πŸ₯˜πŸ€πŸ΄πŸ§€

Dataset Summary

TAPAS is a simulated dataset for Task Assignment and Planning for Multi-Agent Systems.

The dataset consists of task and motion plans for multiple robots in different scenarios, acting asynchronously in the same workspace and modifying the same environment.
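
Until the viewer issue is resolved, one way to look at the data is to download the parquet shards and read them directly, bypassing the cast that fails above. Below is a rough sketch, assuming the huggingface_hub and pyarrow packages are installed; the repo id is a placeholder and must be replaced with the actual repository name.

# Rough sketch: download the dataset files and iterate rows directly with pyarrow,
# bypassing the features cast that the viewer fails on. The repo id is a placeholder.
import glob
import os

import pyarrow.parquet as pq
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="user/tapas", repo_type="dataset")  # placeholder repo id

for path in glob.glob(os.path.join(local_dir, "**", "*.parquet"), recursive=True):
    pf = pq.ParquetFile(path)
    # Rows (trajectories, scenes) are large, so read one at a time; with very
    # large row groups this may still use substantial memory.
    for batch in pf.iter_batches(batch_size=1):
        row = batch.to_pylist()[0]
        print(sorted(row.keys()))  # expect the columns listed in the preview
        break
    break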

Downloads last month: 72