API Documentation
Sensor-agnostic tactile and vision embeddings delivered via REST API. Fetch training-ready datasets in three lines of code.
What it does
The haptal API gives you programmatic access to pre-processed robotics datasets — compressed 450x from raw sensor data into dense embeddings that work across sensor types (GelSight, DIGIT, BioTac, custom). No collection rigs, no preprocessing, no sensor-specific code.
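To put the 450x ratio in concrete terms, here is a back-of-envelope sketch (not an official spec) using the (7, 128) float32 embedding shape shown in the fetch example below:

```python
# Back-of-envelope check of the 450x compression claim.
# Assumes the (7, 128) embedding shape from the fetch example, stored as float32.
EMBED_SHAPE = (7, 128)
BYTES_PER_FLOAT32 = 4
COMPRESSION_RATIO = 450

embedding_bytes = EMBED_SHAPE[0] * EMBED_SHAPE[1] * BYTES_PER_FLOAT32  # 3,584 bytes
raw_bytes = embedding_bytes * COMPRESSION_RATIO                        # 1,612,800 bytes

print(f"embedding: {embedding_bytes / 1024:.1f} KiB")       # 3.5 KiB per sample
print(f"raw input: {raw_bytes / 1024 / 1024:.1f} MiB")      # ~1.5 MiB per sample
```

So each training sample stands in for roughly 1.5 MiB of raw sensor data you never have to store or preprocess.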
Why use it
- Skip infrastructure — A tactile data collection rig costs $50K+ and takes months to build. One API call replaces it.
- Sensor-agnostic — Embeddings normalize across hardware. Your model generalizes without retraining per sensor.
- Multiple formats — NumPy, PyTorch, TensorFlow, HDF5. Stream in real-time or batch download.
- Version pinning — Lock dataset versions for reproducible experiments across your team.
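One way to make version pinning concrete is a shared lock file pattern. This is a team-convention sketch, not part of the SDK — `DATASET_LOCK`, `pinned_id`, and the `@version` suffix are names invented here for illustration:

```python
# Hypothetical helper: record dataset versions in one place so every
# teammate fetches identical data. The lock dict, version string, and
# "name@version" convention are illustrative, not the SDK's real API.
DATASET_LOCK = {
    "humanoid-manipulation-v2": "2.3.1",  # hypothetical version number
}

def pinned_id(dataset_id: str) -> str:
    """Resolve a dataset name to its locked, fully qualified version."""
    version = DATASET_LOCK[dataset_id]
    return f"{dataset_id}@{version}"

print(pinned_id("humanoid-manipulation-v2"))  # humanoid-manipulation-v2@2.3.1
```

Checking a lock dict like this into version control keeps experiments reproducible across the team.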
Install
pip install haptal-sdk
Authenticate
from haptal import HaptalClient
client = HaptalClient(api_key="your_api_key_here")
print(client.ping().ok) # True
Fetch a dataset
data = client.datasets.fetch(
    dataset_id="humanoid-manipulation-v2",
    split="train",
    format="numpy",
)
print(data.observations.shape) # (50000, 7, 128)
print(data.labels.unique()) # ['grasp', 'lift', 'place', ...]
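The shape contract above can be illustrated with synthetic NumPy arrays (stand-ins only; the real `data` object comes from `fetch`): 50,000 samples, 7 sensor channels, 128 embedding dimensions per channel.

```python
import numpy as np

# Stand-in for data.observations, matching the shape printed above:
# (samples, sensor channels, embedding dims).
observations = np.zeros((50_000, 7, 128), dtype=np.float32)

# Flatten per-sample for a model that expects 2-D input.
flat = observations.reshape(len(observations), -1)
print(flat.shape)  # (50000, 896)
```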
Stream into training loop
stream = client.streams.connect(
    dataset_id="humanoid-manipulation-v2",
    batch_size=64,
    shuffle=True,
)

for batch in stream:
    # model is your own trainer; each batch carries observations, actions, rewards
    loss = model.train_step(batch.observations, batch.actions, batch.rewards)
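To test a training loop offline, the batch interface can be mimicked with a stub generator. The `Batch` dataclass, `fake_stream`, and the action dimension below are stand-ins invented here, not SDK objects:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class Batch:
    """Mirrors the fields each streamed batch exposes."""
    observations: np.ndarray  # (batch, 7, 128) embeddings
    actions: np.ndarray       # (batch, action_dim); action_dim assumed here
    rewards: np.ndarray       # (batch,)

def fake_stream(n_batches: int = 3, batch_size: int = 64):
    """Stand-in generator with the same shape contract as the real stream."""
    for _ in range(n_batches):
        yield Batch(
            observations=np.zeros((batch_size, 7, 128), dtype=np.float32),
            actions=np.zeros((batch_size, 8), dtype=np.float32),
            rewards=np.zeros(batch_size, dtype=np.float32),
        )

for batch in fake_stream():
    assert batch.observations.shape == (64, 7, 128)
```

Swapping `fake_stream()` for `client.streams.connect(...)` should then require no changes to the loop body.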