Ships US-made plug-and-play robotics hardware with Python SDKs for AI-powered robot development, built on real-time ML path planning, edge neural network inference on Nvidia Jetson, and automated hardware-ML integration testing.

Robotics Hardware | YC W26

Last Updated: March 19, 2026

Builds a US-based plug-and-play robotics hardware ecosystem (actuators, motors, FOC control boards, Nvidia Jetson companion boards, and Python SDKs), enabling robotics startups to rapidly develop AI-powered robots such as quadrupeds, humanoids, and robotic arms without sourcing fragmented overseas components.
HLabs has publicly announced an expanded modular hardware lineup, including FOC motor control boards, Nvidia Jetson companion boards, wireless modules, and US-manufactured actuators. The company is shipping preorders and actively building out Python libraries and developer documentation to lower the barrier to ML-powered robotics prototyping. Its website signals a commitment to scaling domestic manufacturing capacity and broadening support for medium-to-large robot platforms.
Patent filing US12405283B1 (Aug 2024) reveals an AI-driven robotic arm system for lab automation using 3D sensors, LiDAR, ToF cameras, and real-time ML path planning, claiming up to 62x throughput improvement. GitHub activity under HarmonicLabs shows modular control system tooling and real-time feedback loop libraries. Founder Paul Hetherington's prior company Mystic AI (YC W21) focused on low-latency ML model deployment, suggesting deep infrastructure expertise in edge inference optimization. Hiring signals point to upcoming engineering and manufacturing roles as production scales. Conference and community engagement hints at partnerships with robotics labs and early enterprise customers in lab automation.
AI-Powered Robotic Arm for Autonomous Lab Automation
A robot arm that watches, learns, and moves through a lab like a seasoned technician—except it never gets tired and works 62 times faster.
HLabs' patent (US12405283B1) describes an AI-driven robotic arm system that integrates 3D cameras, LiDAR, and time-of-flight sensors to build a real-time spatial model of its environment. Machine learning models continuously process this sensor data to plan optimal movement paths, avoid collisions, and adapt to changing conditions on the fly. The system uses a closed feedback loop where each action's outcome is fed back into the ML model, enabling continuous self-improvement. This architecture allows the arm to autonomously execute multi-step lab protocols—such as cell isolation, purification, and sample transfer—that traditionally require skilled human technicians. The patent claims up to 62x throughput improvement, which would be transformative for pharmaceutical, biotech, and academic research labs facing labor shortages and reproducibility challenges. The system's edge inference capability (likely leveraging HLabs' own Nvidia Jetson boards) ensures low-latency decision-making critical for precise physical manipulation.
It's like replacing a meticulous lab technician with an octopus that has perfect memory, laser eyes, and never needs a coffee break.
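The sense-plan-act-feedback architecture described in the patent can be illustrated with a minimal sketch. Everything here is hypothetical: `plan_step` stands in for the ML path planner, the "environment" is a one-dimensional toy, and the gain adaptation is a crude stand-in for feeding action outcomes back into the model. It shows the loop shape, not HLabs' actual system.

```python
def plan_step(position, target, gain):
    """Hypothetical stand-in for the ML planner: propose the next move."""
    error = target - position
    return gain * error


def run_closed_loop(position, target, gain=0.5, max_steps=20, tol=1e-3):
    """Sense -> plan -> act -> feed outcome back, until within tolerance.

    The feedback step (adapting `gain` from the observed outcome) is a toy
    analogue of retraining/updating the ML model from each action's result.
    """
    for step in range(max_steps):
        action = plan_step(position, target, gain)   # plan
        position += action                           # act
        outcome_error = abs(target - position)       # sense the outcome
        if outcome_error < tol:
            return position, step + 1
        gain = min(0.9, gain * 1.05)                 # feedback: adapt planner
    return position, max_steps


final_pos, steps_used = run_closed_loop(0.0, 1.0)
```

The point of the sketch is that each iteration's observed error drives the next plan, which is what lets the real system adapt to changing lab conditions rather than replaying a fixed trajectory.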
Edge AI Inference for Real-Time Quadruped and Humanoid Robot Control
A robot brain that thinks fast enough to catch itself before it trips—no Wi-Fi required.
HLabs' Nvidia Jetson companion board is purpose-built to run neural network inference directly on the robot, eliminating the latency and reliability risks of cloud-based ML processing. For locomotion-intensive robots like quadrupeds and humanoids, real-time control is non-negotiable—a 50ms delay can mean the difference between a graceful step and a catastrophic fall. By providing a plug-and-play edge AI board with pre-integrated Python libraries, HLabs enables robotics developers to deploy reinforcement learning locomotion policies, visual SLAM, and obstacle avoidance models directly at the edge. The board abstracts away the complexity of GPU driver configuration, power management, and communication protocols, so ML engineers can focus on model architecture rather than embedded systems plumbing. This is particularly valuable for startups iterating rapidly on sim-to-real transfer, where models trained in simulation need to be deployed and tested on physical hardware with minimal friction. The companion board's tight integration with HLabs' FOC motor controllers creates a seamless pipeline from neural network output to precise motor torque commands.
It's like giving a robot its own on-board Formula 1 pit crew that makes split-second decisions without ever radioing back to headquarters.
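Why the latency budget matters can be made concrete with a fixed-rate control loop sketch. At 100 Hz, sensor reads, policy inference, and torque commands must all fit in a 10 ms tick; any overrun is a missed deadline. The function names (`dummy_policy`, `control_loop`) and the toy proportional policy are illustrative assumptions, not HLabs' API.

```python
import time


def dummy_policy(obs):
    """Stand-in for an on-board neural network (e.g. an RL locomotion policy)."""
    return [-0.1 * x for x in obs]  # toy proportional torque response


def control_loop(policy, read_sensors, send_torques, hz=100, duration_s=0.1):
    """Run a fixed-rate loop; each tick must finish within its 1/hz budget."""
    period = 1.0 / hz
    ticks = 0
    deadline_misses = 0
    next_tick = time.perf_counter()
    end = next_tick + duration_s
    while time.perf_counter() < end:
        obs = read_sensors()
        send_torques(policy(obs))        # on-board inference + actuation
        ticks += 1
        next_tick += period
        slack = next_tick - time.perf_counter()
        if slack > 0:
            time.sleep(slack)
        else:
            deadline_misses += 1         # overran the per-tick budget
            next_tick = time.perf_counter()
    return ticks, deadline_misses


ticks, misses = control_loop(dummy_policy,
                             read_sensors=lambda: [0.1, -0.2],
                             send_torques=lambda torques: None,
                             hz=100, duration_s=0.05)
```

A cloud round-trip of 50 ms would blow through five consecutive ticks here, which is exactly why locomotion policies run on the edge.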
Plug-and-Play ML Development Kit for Rapid Robotics Prototyping
A robotics starter kit so well-integrated that going from "I trained a model" to "my robot is walking" takes hours, not months.
One of the biggest bottlenecks in robotics R&D is the gap between training an ML model in simulation and deploying it on physical hardware. Traditionally, this requires weeks of custom electronics integration, driver development, communication protocol debugging, and power system design before a single inference can run on a real robot. HLabs collapses this entire pipeline into a modular, plug-and-play ecosystem. Their hardware stack—actuators, motors, FOC boards, Jetson companion boards—is designed from the ground up to work together seamlessly, with standardized connectors and communication protocols. Their Python libraries provide high-level APIs that let ML engineers send torque commands, read sensor data, and deploy trained models with just a few lines of code. This dramatically accelerates the sim-to-real transfer loop, enabling robotics teams to test more hypotheses, iterate faster, and reach product-market fit sooner. For early-stage robotics startups operating on limited runway, this time compression is existential—it can mean the difference between shipping a product and running out of funding. The open-source GitHub tooling (HarmonicLabs) further lowers the barrier by providing reference implementations and community-contributed integrations.
It's like IKEA furniture for robots—everything fits together with clear instructions, except instead of an Allen wrench you use Python, and instead of a bookshelf you get a walking machine.
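The "few lines of code" claim can be sketched against a mock interface. `MockArm`, `read_joint_positions`, `send_torques`, and `move_to` are all invented for illustration; HLabs' actual SDK names and signatures are not public in this writeup. The toy dynamics (position integrates torque directly) exist only so the example runs standalone.

```python
class MockArm:
    """Minimal mock of a plug-and-play robot interface (illustrative only)."""

    def __init__(self, joints=3):
        self.positions = [0.0] * joints

    def read_joint_positions(self):
        return list(self.positions)

    def send_torques(self, torques):
        # Toy dynamics: each joint's position integrates its torque directly.
        self.positions = [p + t for p, t in zip(self.positions, torques)]


def move_to(arm, targets, kp=0.5, steps=50):
    """The kind of high-level loop an ML engineer would write against the SDK."""
    for _ in range(steps):
        pos = arm.read_joint_positions()
        arm.send_torques([kp * (t - p) for t, p in zip(targets, pos)])
    return arm.read_joint_positions()


arm = MockArm()
final = move_to(arm, [1.0, -0.5, 0.25])
```

The value proposition is that everything below `move_to` (drivers, bus protocols, power sequencing) is someone else's problem, so the sim-to-real loop reduces to swapping the mock for real hardware.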
Paul Hetherington is a second-time YC founder who previously built Mystic AI (YC W21), a low-latency ML deployment company. That background gives him rare dual expertise in robotics hardware engineering and production-grade ML infrastructure, allowing HLabs to design hardware that is natively optimized for AI workloads from the silicon up rather than bolted on as an afterthought.