We capture how humans physically interact with the real world, so robots can learn to do the same.
Abu Dhabi, UAE

When a person picks up a mug, opens a door, or mops a floor, a camera can record every frame of that movement. What it cannot record is how hard they gripped, how their weight shifted, or how much force passed through their wrist. That invisible layer, the measured forces of contact between a human body and the physical world, is exactly what robotics companies need. Startups are racing to collect video of human tasks. No one is collecting the forces.
We deploy lightweight sensor kits (pressure-sensing gloves, smart insoles, and body-worn cameras) through partnerships with facility management and service companies, collecting synchronized video and force data during everyday tasks in real homes, offices, and commercial spaces. The result is a dataset no one else provides: real-world human interaction data with measured contact forces, captured at scale, outside the lab.
Licensed human interaction datasets with real-world force measurements. The training data that robots need and simulation cannot provide.
AI-powered injury risk screening from video, trained on the same real-world force data. No lab visit required.
A 989-muscle neuromusculoskeletal model transforms raw sensor data into physics-complete annotations: joint torques, muscle activations, and force distributions. Built on the most detailed whole-body musculoskeletal model in the academic literature, these annotations cannot be reconstructed from video alone.
Pressure-sensing gloves and smart insoles capture measured contact forces, not estimates from video. No startup collects this. Without force data, robots can learn where to reach but not how hard to grip.
We deploy through facility management companies already operating in thousands of real environments. No per-site negotiation needed. No competitor uses this channel. Every other startup pays gig workers or rents access.
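The synchronized video-plus-force capture described above can be pictured as a simple per-frame record. The sketch below is purely illustrative, assuming a shared timestamp clock across sensors; all field names, channel counts, and units are invented for this example and do not describe the actual data format.

```python
from dataclasses import dataclass

@dataclass
class CaptureSample:
    """Hypothetical schema for one synchronized sample; fields are illustrative."""
    timestamp_us: int               # shared clock across camera, gloves, insoles
    video_frame_id: int             # body-worn camera frame index
    glove_pressure_pa: list[float]  # per-taxel glove pressures, pascals
    insole_force_n: list[float]     # per-cell insole normal forces, newtons

def peak_grip_pressure(samples: list[CaptureSample]) -> float:
    """Largest single-taxel glove pressure observed across a recording."""
    return max(p for s in samples for p in s.glove_pressure_pa)

# Two example samples at ~30 fps (invented numbers)
samples = [
    CaptureSample(0, 0, [800.0, 1200.0], [350.0, 310.0]),
    CaptureSample(33_333, 1, [950.0, 1500.0], [340.0, 320.0]),
]
print(peak_grip_pressure(samples))  # 1500.0
```

The point of the shared `timestamp_us` field is that force channels can be aligned to the exact video frame in which a contact occurs, which is what makes the combined dataset more than video alone.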
Co-Founder & Chief Scientist
Co-Founder & CEO