AI Usage at a Glance
Aug 1, 2024 · Productivity · Practice documented: Foundation deployed the Phantom MK1 humanoid robot, which uses an onboard AI system to autonomously perform repetitive physical tasks, such as moving cases and racking car parts, in manufacturing and logistics facilities. The robot uses camera-based perception and a proprietary physics-action model to navigate environments and execute tasks without continuous human intervention.

Aug 1, 2024 · Other · Practice documented: Foundation announced a planned AI system called "fleet coherence" that would allow multiple Phantom robots to share real-time task and environment data, coordinate their actions across a shared world graph, and dynamically allocate work to avoid duplication. As of early 2025, this system is in early development and robots largely operate independently.

Aug 22, 2024 · Productivity · New evidence: Founder of failed fintech Synapse says he's raised $11M for new robotics startup

Mar 1, 2025 · Productivity · New evidence: Foundation funding, news & analysis

Mar 1, 2025 · Data Analysis · Practice documented: Foundation uses data generated during Phantom robot deployments to retrain its proprietary AI action models, according to CEO Sankaet Pathak in a March 2025 interview. When a human operator remotely intervenes to correct a robot's behavior, that intervention is labeled and fed back into the model. Sacra reports that this feedback loop is built into Foundation's business model, with larger fleets producing more training data and improving model accuracy over time.

Mar 9, 2025 · Other · New evidence: Sankaet Pathak, CEO of Foundation, on why humanoids win in robotics

Mar 9, 2025 · Data Analysis · New evidence: Sankaet Pathak, CEO of Foundation, on why humanoids win in robotics

Sep 1, 2025 · Productivity · New evidence: Inside Foundation Robotics: The Untold Story Behind Their Success

Nov 1, 2025 · Productivity · New evidence: Foundation Emerges With 'Phantom' Humanoid, Betting on Novel Actuators and Hybrid AI

Mar 9, 2026 · Other · Practice documented: Foundation tested two Phantom MK-1 humanoid robots in Ukraine starting February 2026 in a frontline-reconnaissance support role, per co-founder Mike LeBlanc's statements to TIME. The robots use a camera-first visual perception system and LLM-based task-to-motion software that converts operator commands into movement sequences, and they operate with a human in the loop for any lethal decisions.

Mar 13, 2026 · Other · New evidence: Company Testing Humanoid Robot Soldiers on Frontlines of Ukraine

Mar 13, 2026 · Other · New evidence: Two humanoid soldier robots delivered to Ukraine by US company – Time
Foundation sent two Phantom MK-1 units to Ukraine in February 2026 for a frontline-reconnaissance support pilot, with the company stating intent to prepare Phantoms for further Pentagon deployment. The robot uses visual sensors to interpret its surroundings; an LLM-based task-to-motion layer converts high-level operator commands into movement sequences. Under current U.S. Department of Defense protocols, automated systems may engage only with explicit human authorization, and Foundation states it applies that same principle to Phantom. The specific Ukrainian operator, location, and mission outcomes are not disclosed in any reviewed source.
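The pattern described, an LLM-based layer that turns operator commands into movement sequences, with a human authorization gate on any lethal action, can be sketched as follows. This is a minimal illustration of the general human-in-the-loop pattern, not Foundation's proprietary software; every name here (`MotionStep`, `plan_from_command`, the canned steps) is hypothetical.

```python
from dataclasses import dataclass

# Hypothetical motion primitive emitted by a task-to-motion layer.
@dataclass
class MotionStep:
    action: str
    lethal: bool = False

def plan_from_command(command: str) -> list:
    """Stand-in for the LLM-based task-to-motion layer: map a high-level
    operator command to a sequence of movement steps. A real system would
    query a language model here; this stub returns a canned recon sequence."""
    return [
        MotionStep("advance_to_waypoint"),
        MotionStep("scan_surroundings"),
        MotionStep("report_observations"),
    ]

def execute(steps, authorize):
    """Run steps in order, gating any lethal step on explicit human approval."""
    executed = []
    for step in steps:
        if step.lethal and not authorize(step):
            continue  # human in the loop withheld authorization; skip
        executed.append(step.action)
    return executed

steps = plan_from_command("scout the ridge and report back")
done = execute(steps + [MotionStep("engage_target", lethal=True)],
               authorize=lambda step: False)  # operator declines
print(done)  # only the non-lethal recon steps run
```

The point of the gate is structural: no code path reaches a lethal action without a fresh, affirmative human decision, which mirrors the engagement-authorization principle the DoD protocols described above require.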
AI Trace is free and nonprofit.
According to Sacra, Foundation's business model is structured so that each robot deployment generates training data that feeds back into its proprietary AI models. When a human operator intervenes via teleoperation to correct a robot's behavior, that intervention is labeled and used as training data to improve the model, as described by CEO Sankaet Pathak in a March 2025 interview. This approach means that AI model performance is directly tied to the scale of the robot fleet; a larger deployed fleet produces more diverse operational data, enabling more robust model generalization over time.
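The intervention-to-training-data loop described above can be sketched in a few lines, in the spirit of DAgger-style imitation learning: only timesteps where a human took over are kept, with the operator's corrective action as the label. All names and the log format here are hypothetical; this is not Foundation's actual pipeline.

```python
def collect_training_batch(episodes):
    """Keep only timesteps where a human operator intervened; the
    operator's corrective action becomes the training label."""
    batch = []
    for episode in episodes:
        for step in episode:
            if step["intervened"]:
                batch.append((step["observation"], step["operator_action"]))
    return batch

# Logs from two robots; only the intervened steps are labeled for retraining.
fleet_logs = [
    [{"observation": "obs_a", "intervened": False, "operator_action": None},
     {"observation": "obs_b", "intervened": True, "operator_action": "regrasp"}],
    [{"observation": "obs_c", "intervened": True, "operator_action": "slow_down"}],
]
batch = collect_training_batch(fleet_logs)
print(batch)  # [('obs_b', 'regrasp'), ('obs_c', 'slow_down')]
```

This makes the fleet-scale incentive concrete: each additional robot contributes its own intervention-labeled pairs, so the training batch grows with deployment size, which is the flywheel Sacra describes.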
Foundation's master plan and a March 2025 Sacra interview with CEO Sankaet Pathak describe fleet coherence as a long-term goal, not a current deployment. Pathak noted that robots in 2025 deployments work largely independently and do not coordinate with each other. The envisioned system, analogous to GPU cluster coordination per Pathak, would give each robot in a fleet real-time awareness of what every other robot is doing, enabling dynamic task reallocation as work progresses via a shared world graph. Sacra's editorial analysis describes the intended software architecture as coordinating actions across a fleet using this shared world graph to prevent task duplication and optimize resource allocation.
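The coordination idea described, a shared world graph in which every robot sees all claimed tasks and so never duplicates work, can be illustrated with a toy allocator. Everything below is an assumption-laden sketch of the general concept (a shared claim table over tasks on a 1-D line); the class and task names are invented and do not reflect Foundation's design.

```python
class SharedWorldGraph:
    """Toy fleet-coherence model: a task map plus a claims table that
    every robot reads and writes, so no task is worked on twice."""

    def __init__(self, tasks):
        self.tasks = dict(tasks)   # task name -> position (toy 1-D world)
        self.claims = {}           # task name -> robot id, visible fleet-wide

    def claim_nearest(self, robot_id, position):
        """Claim the closest task that no other robot has already taken."""
        open_tasks = [t for t in self.tasks if t not in self.claims]
        if not open_tasks:
            return None
        best = min(open_tasks, key=lambda t: abs(self.tasks[t] - position))
        self.claims[best] = robot_id
        return best

graph = SharedWorldGraph({"rack_parts": 2, "move_cases": 9, "restock": 5})
a = graph.claim_nearest("phantom_1", position=1)  # nearest open: rack_parts
b = graph.claim_nearest("phantom_2", position=3)  # rack_parts claimed -> restock
print(a, b, graph.claims)
```

The second robot would greedily pick rack_parts too, but the shared claims table reroutes it to restock; that deconfliction-by-visibility is the duplication-avoidance property the planned system is meant to provide at fleet scale.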