Robotics data platform
Manipulation data, compressed and ready to train.
haptal.ai turns raw tactile + vision logs into sensor-agnostic embeddings so teams can train policies without building data rigs.
Pipeline
End-to-end tactile data — from custom hardware to training-ready embeddings.
Custom Tactile Data Collection
We work with you to map your target manipulation tasks — grasping, assembly, insertion — and identify the exact tactile modalities your robot needs: force, pressure, slip detection, and texture recognition.
Custom Rig Manufacturing
We design and build a data collection rig tailored to your robot platform — integrating visuo-tactile sensors with synchronized multi-modal capture and sub-degree pose tracking accuracy.
Raw Data Collection
We deploy the rig to capture synchronized streams — cameras, tactile sensors, and joint positions — collecting thousands of demonstration trajectories across your target tasks.
Optimize
All sensor streams are time-aligned, quality-checked, and augmented. Bad recordings are filtered, sparse tactile data is synthetically expanded, and signals are normalized across sensor types.
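As an illustrative sketch of the time-alignment and normalization described above (function and parameter names are hypothetical, not haptal.ai's actual pipeline), each sensor stream can be resampled onto a shared clock and z-scored so force, pressure, and vision-derived signals share a comparable scale:

```python
import numpy as np

def align_and_normalize(streams, timestamps, target_hz=100.0):
    """Resample each sensor stream onto a common clock and z-score it.

    streams: dict of name -> 1-D array of samples (hypothetical format)
    timestamps: dict of name -> 1-D array of sample times in seconds
    """
    # Use only the interval covered by every sensor.
    t_start = max(ts[0] for ts in timestamps.values())
    t_end = min(ts[-1] for ts in timestamps.values())
    common_t = np.arange(t_start, t_end, 1.0 / target_hz)

    aligned = {}
    for name, values in streams.items():
        # Linear interpolation onto the shared timeline.
        resampled = np.interp(common_t, timestamps[name], values)
        # Normalize so different sensor types are on a comparable scale.
        aligned[name] = (resampled - resampled.mean()) / (resampled.std() + 1e-8)
    return common_t, aligned
```

In a real pipeline the interpolation and normalization would be per-modality (slip events, for instance, should not be linearly interpolated), but the shape of the step is the same: one common timeline, one consistent scale.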
Embeddings
Raw sensor files are compressed into compact, dense numerical vectors that preserve task-relevant information — delivered as training-ready datasets hundreds of times smaller than the originals.
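A minimal sketch of what "compressing raw sensor files into dense vectors" can mean, using a PCA-style projection as a stand-in for a learned encoder (the function, window size, and embedding dimension here are illustrative assumptions, not the actual haptal.ai model):

```python
import numpy as np

def embed_windows(raw, window=256, dim=8):
    """Compress a raw 1-D sensor signal into dense embedding vectors.

    Splits the signal into fixed-size windows and projects each window
    onto its top principal components. A learned encoder would replace
    the SVD, but the input/output contract is the same.
    """
    n = (len(raw) // window) * window
    X = raw[:n].reshape(-1, window)       # one row per window
    X = X - X.mean(axis=0)                # center each feature
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    Z = X @ Vt[:dim].T                    # dense, compact embeddings
    return Z                              # shape: (n_windows, dim)
```

Each 256-sample window becomes an 8-dimensional vector, a 32x reduction in this toy setup; stacking modalities and using a trained encoder is what pushes the ratio into the hundreds claimed above.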
Team
Built at Berkeley
Daniel Gabriel Dapula
Co-Founder & Head of Hardware
Patent-worthy tactile sensing research from a capstone project with Prof. Lining Yao. UC Berkeley Mechanical Engineering.
Aarav Bedi
CEO & Co-Founder
Hardware experience at Rigetti Computing, SkyDeck & Berkeley National Lab. UC Berkeley Mechanical Engineering.
Arif Razack
Co-Founder & Head of ML
Production ML experience deploying models at scale; builds end-to-end data pipelines and embedding architecture. Data Science at UC Berkeley.