Builds a general robot brain across embodiments, with a larger funding base and broader platform posture.
Works on foundation models for real-world robot control, with stronger public investor attention and research visibility.
Targets robot foundation models and physical automation, likely competing for talent, compute, and early robot customers.
No hard moat yet; the credible path is proprietary demonstration data plus tactile capture hardware that improves with each robot task recorded.
Using robot evaluation loops to compare model architectures, measure manipulation failures, and feed real-world rollout data back into training.
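The evaluation-loop idea above can be sketched minimally: run each candidate architecture over a task set, tally manipulation failures, and bank every rollout so it can be fed back into training. All names (`EvalLoop`, `RolloutResult`, the policy callback) are illustrative, not from any real codebase.

```python
from dataclasses import dataclass, field

@dataclass
class RolloutResult:
    """One recorded rollout: which model, which task, did it succeed."""
    architecture: str
    task: str
    success: bool

@dataclass
class EvalLoop:
    # Buffer of all rollouts, kept so real-world data can be
    # replayed into the next training run (hypothetical design).
    buffer: list = field(default_factory=list)

    def run(self, architectures, tasks, policy_fn, n_trials=5):
        """Compare architectures by failure count per n_trials rollouts.
        policy_fn(arch, task) -> bool stands in for a real robot rollout."""
        failures = {arch: 0 for arch in architectures}
        for arch in architectures:
            for task in tasks:
                for _ in range(n_trials):
                    success = policy_fn(arch, task)
                    self.buffer.append(RolloutResult(arch, task, success))
                    if not success:
                        failures[arch] += 1
        return failures
```

In practice the buffer would hold full observation/action trajectories rather than a success bit, but the shape of the loop is the same.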
Building models that can transfer manipulation skills across different robot embodiments instead of being locked to one arm or gripper.
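One common ingredient of cross-embodiment transfer is a shared, normalized action space, so a single policy head can drive arms with different joint counts and limits. The sketch below assumes made-up joint limits purely for illustration; it is not the company's actual interface.

```python
import numpy as np

# Hypothetical per-robot action specs; real limits would come from URDFs.
EMBODIMENTS = {
    "arm_7dof": {"low": np.full(7, -2.9), "high": np.full(7, 2.9)},
    "arm_6dof": {"low": np.full(6, -3.1), "high": np.full(6, 3.1)},
}

def normalize_action(robot: str, action: np.ndarray) -> np.ndarray:
    """Scale a robot-specific joint command into the shared [-1, 1] space."""
    spec = EMBODIMENTS[robot]
    return 2.0 * (action - spec["low"]) / (spec["high"] - spec["low"]) - 1.0

def denormalize_action(robot: str, norm: np.ndarray) -> np.ndarray:
    """Map a shared [-1, 1] policy output back to this robot's joint range."""
    spec = EMBODIMENTS[robot]
    return spec["low"] + (norm + 1.0) / 2.0 * (spec["high"] - spec["low"])
```

The design choice here is that the model only ever sees `[-1, 1]` actions; embodiment-specific scaling lives at the edges, which is what keeps the policy from being locked to one arm or gripper.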
Training manipulation foundation models from human demonstrations with tactile and force feedback so robots can learn contact-heavy tasks across objects.
The company trains robot manipulation models on multimodal human demos, with force and tactile signals as its possible edge over video-only robot learning.
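As a sketch of what one multimodal demonstration frame might contain, the record below pairs a camera image with tactile arrays, a wrist force-torque reading, and the demonstrated action. Field names and array shapes are assumptions for illustration, not the company's actual schema.

```python
import numpy as np
from dataclasses import dataclass

@dataclass
class DemoFrame:
    """One time step of a human demonstration (shapes are illustrative)."""
    rgb: np.ndarray      # (H, W, 3) camera image
    tactile: np.ndarray  # (2, 16, 16) per-finger taxel pressure grids
    wrench: np.ndarray   # (6,) wrist force (N) and torque (Nm)
    action: np.ndarray   # (7,) demonstrated joint deltas

def contact_features(frame: DemoFrame) -> np.ndarray:
    """Concatenate the low-dimensional contact signals; the image would go
    through a separate vision encoder in a real pipeline. It is these
    force/tactile channels that video-only datasets lack."""
    return np.concatenate([frame.tactile.ravel(), frame.wrench])
```

For a contact-heavy task like inserting a connector, the `tactile` and `wrench` channels carry the contact events the RGB stream alone cannot disambiguate.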