Gig workers across the globe are being recruited to train humanoid robots from their homes, donning motion-capture gear after shifts in hospitals, warehouses, and offices to generate the physical movement data that robotics companies need, according to MIT Technology Review.
The report, published on 1 April 2026, profiles workers such as Zeus, a medical student in Nigeria, who returns from long hospital shifts and straps on equipment to record his movements — data that is then used to teach humanoid robots how to navigate and interact with the physical world. The arrangement mirrors the broader gig economy model that already powers much of AI's data-labelling pipeline, now extended into the realm of embodied robotics.
The Hidden Labour Behind Humanoid AI
Humanoid robots require enormous quantities of high-quality motion data to learn tasks that humans perform instinctively — walking, picking up objects, opening doors. Collecting that data in controlled lab environments is expensive and slow. Outsourcing the work to a distributed, on-demand workforce reduces costs and accelerates training timelines for companies moving to commercialize humanoid platforms.
The model closely resembles the crowdsourced data annotation economy that underpins large language models, where platforms like Scale AI and Remotasks have built global workforces — often in lower-income countries — to label images, transcribe audio, and rank AI outputs. The difference with humanoid training is physical: workers are not just clicking on screens but performing and recording bodily movements, raising distinct questions about ergonomic risk and biometric data ownership.
What Workers Are Actually Doing
Based on the MIT Technology Review account, workers like Zeus perform structured physical tasks while wearing motion-capture or sensor equipment, generating datasets that robotics firms feed into training pipelines for humanoid systems. The work is flexible — completed at home, around other commitments — but the compensation structure, working conditions, and contractual terms governing the use of biometric movement data are not detailed in the available reporting.
This lack of transparency is significant. Research on gig data-labelling workforces has repeatedly documented a gap between the value companies extract from this labour and the pay and protections workers receive. A 2023 study of approximately 500 data workers across Sub-Saharan Africa and Southeast Asia, published in the journal Big Data & Society, found that fewer than 30% had access to grievance mechanisms or knew how their data contributions were used downstream.
The Benchmark Problem Running in Parallel
The same MIT Technology Review report also flags a separate but related concern gaining traction in the AI research community: the inadequacy of existing benchmarks for evaluating AI systems. As humanoid robots and advanced AI models become more capable, the tools used to measure that capability are struggling to keep pace.
Current benchmarks often test narrow, well-defined tasks that models can game through pattern recognition rather than genuine generalization. Researchers argue that better evaluation frameworks are urgently needed — not only for language models but for embodied AI systems that must operate safely and reliably in unpredictable physical environments. The stakes are higher when the system in question has a physical body.
Labour Rights in an Embodied AI Economy
The humanoid training story arrives at a sensitive moment for the AI labour debate. Advocacy groups and some legislators have begun pressing for clearer disclosure requirements around AI training supply chains — who performs the work, under what conditions, and with what rights over the data they produce. The European Union's AI Act, which entered phased enforcement in 2025, includes provisions on data governance but does not comprehensively address the labour conditions of training data workers.
For workers in countries without strong digital labour protections, the risks are compounded. Motion and biometric data are among the most sensitive categories of personal information, and workers who generate such data for robotics firms may have limited recourse if that data is later repurposed, sold, or used in ways they did not anticipate.
The robotics industry, for its part, is moving fast. Companies including Figure AI, 1X Technologies, and Agility Robotics are competing to deploy humanoid systems in logistics and manufacturing within the next few years, and demand for high-quality training data is expected to grow accordingly.
What This Means
As humanoid robotics scales toward commercial deployment, the gig workers generating its foundational training data deserve the same scrutiny — on pay, data rights, and working conditions — that has slowly been applied to the rest of the AI labour supply chain.
