Humanoid robots are being trained, in part, by gig workers recording their own body movements at home — including a Nigerian medical student who films himself after hospital shifts with a phone strapped to his forehead.
The arrangement, reported by MIT Technology Review, offers a window into an under-examined supply chain underpinning the humanoid robotics boom. Just as large language models required vast quantities of human-labelled text, physical AI systems require vast quantities of human motion — and the industry has found a familiar solution: the gig economy.
A Phone on His Forehead, Data for a Robot
Zeus, a medical student living in a hilltop city in central Nigeria, returns from long hospital shifts, sets up a ring light, and records himself performing specific physical tasks with an iPhone strapped to his head. His outstretched hands mimic a sleepwalker's posture — a deliberate data-capture technique that gives robot-training algorithms a first-person view of human reach, grip, and spatial navigation.
His labour is part of a broader cottage industry that robotics companies are quietly assembling. Workers across multiple countries perform structured physical actions — opening drawers, folding items, navigating domestic spaces — on camera, uploading footage that engineers then use to teach humanoid robots how to move through the physical world.
The people teaching robots to walk, reach, and grasp are often themselves invisible — paid by the clip, not the hour, with no stake in the systems they are building.
The model mirrors the early years of content moderation and AI data labelling, where workers in the Global South performed cognitively demanding tasks for platform wages with little job security or recognition. The difference here is that the data being collected is not text or images — it is the human body itself.
Why Humanoid Robots Need Human Movement Data
Humanoid robots present a data problem that cameras and sensors alone cannot easily solve. Unlike industrial robot arms operating in controlled factory environments, humanoid machines are being designed to function in unpredictable domestic and commercial spaces — kitchens, hospitals, warehouses — alongside people. Training them requires exposure to the full, messy variability of human physical behaviour.
Simulation can generate some of this data, but researchers have found that models trained exclusively in virtual environments often fail to transfer reliably to the real world — a persistent challenge the field calls the sim-to-real gap. Human demonstration data, captured in real homes and real lighting conditions by real people of different body types and movement styles, helps close that gap.
The result is demand for what the industry calls teleoperation data and egocentric video — footage shot from head-mounted or hand-held cameras that approximates a robot's own perspective. Workers like Zeus are, in effect, embodying the robot's future point of view.
The Economics of Motion
The pay structure for this work is typically task-based. Workers are compensated per clip or per session, with rates varying by task complexity and platform. The arrangement offers flexibility — a significant draw for workers like Zeus, who fits recording sessions around medical school — but it also means income is unpredictable and labour protections are minimal.
This model raises questions that researchers studying platform labour have long flagged in other contexts. Workers contribute directly to the creation of systems that could, over time, displace jobs in the very sectors — healthcare, logistics, domestic service — where humanoids are being targeted for deployment. The workers building the dataset rarely share in the downstream commercial value.
The concentration of this workforce in lower-income countries reflects broader patterns in AI data labour. A 2023 study of approximately 1,000 data workers across Kenya, India, and Venezuela, published by researchers at the Oxford Internet Institute, found that a majority earned below their country's median wage despite performing tasks central to AI product development.
What the Robotics Industry Says
Companies building humanoid robots — including Figure AI, Physical Intelligence, 1X Technologies, and Boston Dynamics — have described the data challenge as a central bottleneck in the field. Several have invested in proprietary data-collection pipelines, including purpose-built teleoperation rigs and in-house demonstration labs.
But the economics of scale push toward distributed collection. Building a robot capable of generalising across thousands of household scenarios requires exposure to thousands of household scenarios — a volume that in-house teams struggle to generate at competitive speed. Gig-based collection allows companies to scale data acquisition rapidly, in parallel, across diverse environments.
According to MIT Technology Review's reporting, the specific platforms and pay rates involved in this emerging market vary, and the sector lacks standardised disclosure about where training data originates or how contributors are compensated.
What This Means
The humans teaching humanoid robots to move through the world occupy the same structural position as every previous wave of AI data labourers — essential, undercompensated, and largely uncredited — and the industry's trajectory suggests demand for their work will grow substantially before any of the robots they trained arrive to compete with them.
