How Is Seeing Systems Using AI?

Modular strike drones with AI swarm control for single-operator autonomous missions.

Using agentic multi-drone autonomy for swarm coordination, combat-adaptive path planning, and on-edge target classification.

Company Overview

Seeing Systems builds modular strike and reconnaissance drones (Bandit) paired with an AI-driven swarm control platform (Aerie) that enables a single operator to coordinate autonomous drone swarms for military missions.

Product Roadmap & Public Announcements

Seeing Systems has publicly showcased Bandit, a modular FPV drone for training and one-way strike missions, and Aerie, an agentic swarm control system. The company has confirmed field testing with UK Royal Marine Commandos and four NATO forces, as well as active prototype deployment in Ukraine. Its public messaging emphasizes radical hardware modularity, >2x lifecycle cost reduction vs. incumbents, and forward-deployed engineering for rapid iteration cycles driven by real operator feedback.

Signals & Private Analysis

Non-traditional signals suggest a strong push toward deeper agentic autonomy, moving beyond swarm coordination toward fully autonomous mission planning and dynamic threat response. The founders' backgrounds (Jane Street quant rigor + military EOD hardware) hint at sensor fusion and edge ML inference on-drone. The absence of public job postings and of funding announcements beyond YC suggests either stealth fundraising or bootstrapped growth via early defense contracts. GitHub and conference silence is consistent with classified or export-controlled work. The company is likely pursuing UK MOD and NATO procurement pathways, with Ukraine serving as a live proving ground that accelerates product-market fit in ways peacetime testing cannot.


Machine Learning Use Cases

Agentic multi-drone autonomy
For: Operational Efficiency (Operations)

Agentic Swarm Coordination: AI-driven multi-drone autonomous mission execution enabling a single operator to control an entire drone swarm in contested environments.

Layman's Explanation

Instead of needing a pilot for every drone, one person tells the AI what the mission is and the swarm figures out how to do it together.

Use Case Details

Seeing Systems' Aerie platform uses agentic AI to transform drone swarm operations from a many-operators-to-many-drones model into a single-operator-to-many-drones paradigm. The system ingests mission objectives, environmental data, and real-time sensor feeds to autonomously allocate tasks across the swarm—assigning reconnaissance roles, strike priorities, and fallback behaviors without requiring granular human input for each drone. The AI agents negotiate roles among themselves, dynamically re-plan when drones are lost or threats emerge, and surface only critical decision points to the human operator. This dramatically reduces cognitive load, training requirements, and the personnel footprint needed for complex multi-drone operations, which is essential for resource-constrained military units operating in denied or degraded communications environments.
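Aerie's actual allocation logic is not publicly documented. As a rough sketch of the general idea described above, here is a minimal greedy auction-style task assignment in Python; the class names, the distance-only cost model, and the single-task-per-drone assumption are all illustrative assumptions, not Seeing Systems' implementation:

```python
import math
from dataclasses import dataclass


@dataclass(frozen=True)
class Task:
    name: str
    x: float
    y: float


@dataclass(frozen=True)
class Drone:
    name: str
    x: float
    y: float


def auction_assign(drones: list[Drone], tasks: list[Task]) -> dict[str, str]:
    """Greedy single-item auction: each round, award the globally
    cheapest (drone, task) pairing until every task is claimed or
    no free drones remain. Cost here is plain Euclidean distance."""
    assignments: dict[str, str] = {}
    free_drones = list(drones)
    open_tasks = list(tasks)
    while open_tasks and free_drones:
        drone, task = min(
            ((d, t) for d in free_drones for t in open_tasks),
            key=lambda pair: math.hypot(pair[0].x - pair[1].x,
                                        pair[0].y - pair[1].y),
        )
        assignments[task.name] = drone.name
        free_drones.remove(drone)
        open_tasks.remove(task)
    return assignments


# Example: each drone claims the objective nearest to it.
drones = [Drone("d1", 0.0, 0.0), Drone("d2", 10.0, 0.0)]
tasks = [Task("recon", 1.0, 0.0), Task("overwatch", 9.0, 0.0)]
print(auction_assign(drones, tasks))  # → {'recon': 'd1', 'overwatch': 'd2'}
```

A production system would replace the distance cost with a richer utility (battery, payload, threat exposure) and re-run the auction whenever a drone is lost, which is the "dynamic re-planning" behavior the paragraph above describes.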

Analogy

It's like being a restaurant manager who just says "busy Friday night, 200 covers" and the entire kitchen staff self-organizes who's on grill, who's on desserts, and who covers when someone burns their hand.

Combat-adaptive path planning
For: Risk Reduction (Engineering)

Battlefield Adaptive Autonomy: Real-time ML-driven flight path optimization and threat avoidance using live combat data from Ukraine deployments.

Layman's Explanation

The drone teaches itself to dodge electronic jamming and enemy fire by learning from what happened to drones before it on the same battlefield.

Use Case Details

Seeing Systems' forward-deployed engineering model in Ukraine creates a uniquely powerful feedback loop: drones encounter real electronic warfare (EW) jamming, kinetic threats, and environmental challenges, and that operational data is fed back into ML models that continuously improve autonomous flight behaviors. The system likely uses imitation learning from skilled FPV pilots combined with reinforcement learning in simulation to develop robust navigation policies that can adapt to GPS denial, RF jamming, and visual obscurants. Each deployment generates training data that makes the next generation of autonomy more resilient. This is a decisive advantage over competitors relying solely on simulation or controlled test environments—Seeing Systems' models are trained on the ground truth of modern electronic and kinetic warfare, producing drones that can autonomously navigate to targets even when communications are degraded or severed.
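The routing layer of such a system is not public, but the core idea of planning around a learned threat map can be sketched with a standard Dijkstra search over a 2D cost grid, where each cell's cost stands in for estimated exposure (e.g. observed jamming intensity). Everything below, including the grid representation, is an assumption for illustration only:

```python
import heapq


def plan_path(grid: list[list[float]],
              start: tuple[int, int],
              goal: tuple[int, int]) -> list[tuple[int, int]]:
    """Dijkstra over a 2D cost grid. Each cell value is the cost of
    entering that cell (a proxy for threat exposure); the planner
    returns the minimum-total-cost route from start to goal."""
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev: dict[tuple[int, int], tuple[int, int]] = {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + grid[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk predecessors back from the goal to reconstruct the route.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1]


# A high-cost center cell (simulated jamming zone) is routed around.
threat_grid = [[1.0, 1.0, 1.0],
               [1.0, 9.0, 1.0],
               [1.0, 1.0, 1.0]]
route = plan_path(threat_grid, (0, 0), (2, 2))
```

The feedback loop described above would show up here as updates to the grid values between sorties: cells where previous drones encountered EW effects get higher costs, so later plans route around them automatically.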

Analogy

It's like a delivery driver who doesn't just use Google Maps but actually remembers every pothole, speed trap, and road closure from thousands of previous drivers' dashcam footage—and reroutes before you even hit the problem.

On-edge target classification
For: Product Differentiation (Product)

Intelligent Target Recognition and Strike Optimization: ML-powered onboard computer vision for autonomous target identification, classification, and optimal strike angle computation on the Bandit drone platform.

Layman's Explanation

The drone's onboard AI recognizes what it's looking at—vehicle, structure, decoy—and calculates the best angle of attack all by itself, even if it loses contact with the operator.

Use Case Details

The Bandit drone platform integrates onboard computer vision models that perform real-time target detection, classification, and prioritization without requiring continuous operator input. Using lightweight convolutional neural networks optimized for edge deployment, the system can distinguish between military vehicles, structures, personnel, and decoys under varying lighting, weather, and occlusion conditions. Once a target is classified, the AI computes an optimal terminal approach vector—factoring in target geometry, armor profile, warhead type, and wind conditions—to maximize strike effectiveness. This capability is critical for one-way effector missions where communication links may be severed during the terminal phase. The modular hardware architecture of Bandit allows rapid swapping of vision modules and ML accelerators as new, more capable chips become available, ensuring the platform stays ahead of adversary countermeasures like visual camouflage and IR decoys.
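The model internals are proprietary, but the confidence-gating step that any onboard classifier needs, deferring to an "unknown" outcome rather than committing to a low-confidence call when the link is degraded, can be illustrated generically. The class list, logit values, and threshold below are hypothetical:

```python
import math

# Hypothetical label set for illustration only.
CLASSES = ("vehicle", "structure", "decoy")


def softmax(logits: list[float]) -> list[float]:
    """Numerically stable softmax: shift by the max logit first."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def classify(logits: list[float], threshold: float = 0.8) -> tuple[str, float]:
    """Return (label, confidence) from raw model logits, falling back
    to 'unknown' whenever the top probability is below the threshold,
    so an uncertain prediction is never silently committed to."""
    probs = softmax(logits)
    conf = max(probs)
    label = CLASSES[probs.index(conf)]
    return (label, conf) if conf >= threshold else ("unknown", conf)


confident = classify([4.0, 0.5, 0.2])    # clear winner → labeled
ambiguous = classify([1.0, 0.9, 0.8])    # near-tie → 'unknown'
```

On a real edge platform this post-processing would sit behind a quantized CNN running on an ML accelerator; the modular-hardware point in the paragraph above is precisely that the accelerator can be swapped while this interface stays stable.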

Analogy

It's like a smart bowling ball that not only rolls itself down the lane but picks which pin to hit and adjusts its spin mid-roll for a guaranteed strike—even after you've already let go.

Key Technical Team Members

  • Matthew Le Maitre, Co-Founder & Software Lead
  • Alex Le Maitre, Co-Founder & Hardware Lead

The Le Maitre brothers combine elite quantitative software engineering (Jane Street, Cambridge, autonomous robotics) with hands-on military-grade hardware experience (EOD systems, drone manufacturing), enabling them to iterate on both the AI brain and the physical platform simultaneously, with live battlefield feedback from Ukraine and NATO forces that no competitor can easily replicate.


Funding History

  • 2025-2026 | Founded by Matthew and Alex Le Maitre.
  • 2025-2026 | Accepted into Y Combinator W26 batch (~$500K standard investment).
  • 2025-2026 | Prototypes shipped to Ukraine for field testing.
  • 2025-2026 | Engagements with UK Royal Marine Commandos and four NATO forces.
  • No additional public funding rounds disclosed.


Competitors

  • Hardware Competitors: Shield AI, Anduril (Ghost), Skydio (X2/X10), Turkish Baykar (TB2/Bayraktar), Ukrainian Brave1 ecosystem startups.
  • Swarm Software: Shield AI (Hivemind), Anduril (Lattice), Palantir (AIP for defense).
  • Low-Cost FPV/Strike Drones: Various Ukrainian manufacturers, DJI-based modified platforms, Auterion (open-source autopilot).
  • AI-Native Defense Startups: Helsing, Elbit Systems (AI division), various YC/stealth defense AI companies.
