Thousands of people are selling their identities to train AI – but at what cost?

by Chief Editor

The Rise of the ‘Gig AI Trainer’: How Everyday People Are Powering Artificial Intelligence—and the Risks They Face

One morning last year, Jacobus Louw set out on his daily neighborhood walk to feed the seagulls. Except this time, he recorded several videos of his feet and the view as he walked. The video earned him $14, about 10 times South Africa’s hourly minimum wage, or half a week’s worth of groceries for the 27-year-old from Cape Town.

The task? “Urban Navigation” on Kled AI, an app that pays contributors for uploading data – videos and photos – to train artificial intelligence models. In a couple of weeks, Louw made $50 by uploading pictures and videos of his everyday life.

Thousands of miles away in Ranchi, India, Sahil Tigga, a 22-year-old student, regularly earns money by letting Silencio, an app that crowdsources audio data for AI training, access his phone’s microphone to capture ambient city noise, such as restaurant chatter or traffic. He also uploads recordings of his voice, traveling to capture distinctive settings. He earns over $100 a month, enough to cover his food expenses.

And in Chicago, Ramelio Hill, an 18-year-old welding apprentice, made a couple hundred dollars by selling his private phone chats with friends and family to Neon Mobile, a conversational AI training platform that pays $0.50 per minute. Hill reasoned that tech companies already capture so much of his private data that he might as well get a cut of the profit.

These “gig AI trainers” – uploading scenes, photos, videos, and audio – are at the forefront of a new global data gold rush. As Silicon Valley’s hunger for high-quality, human-grade data outpaces what can be scraped from the open internet, data marketplaces are emerging to bridge the gap. From Cape Town to Chicago, people are micro-licensing their biometric identities to train the next generation of AI.

The Data Drought Fueling the Demand

AI’s language models, such as ChatGPT and Gemini, demand vast troves of learning material to improve, but they’re facing a data drought. The websites behind the most widely used training datasets, such as C4, RefinedWeb and Dolma, are increasingly restricting generative AI companies from training models on their data. Researchers estimate AI companies will run out of fresh, high-quality text to train on as soon as 2026. Some labs are resorting to feeding their models the synthetic data those models generate, but this can lead to compounding errors and, ultimately, model collapse.

This is where apps like Kled AI and Silencio step in. On these data marketplaces, millions are monetizing their identities to feed and train AI. Beyond Kled AI, Silencio and Neon Mobile, options for AI trainers include Luel AI, backed by Y Combinator, which sources multilingual conversations for about $0.15 a minute. ElevenLabs lets contributors digitally clone their voice for a base rate of $0.02 a minute.

Gig AI training is a new and growing category of work, according to Bouke Klein Teeselink, an economics professor at King’s College London.

The humans fueling the machines, particularly those in developing countries, often need the money and have few other options. For many, it’s a pragmatic response to economic disparity. In countries with high unemployment and devalued currencies, being paid in US dollars offers relative stability. Some struggle to secure entry-level jobs and turn to AI training out of necessity. Even in wealthier nations, the rising cost of living has made selling one’s data a logical financial pivot.

The Trade-offs: Privacy, Exploitation, and a Precarious Future

Yet, the pitfalls of gig AI training can be invisible. On some marketplaces, data trainers grant irrevocable, royalty-free licenses, allowing companies to create derivative works. A 20-minute voice recording today could power an AI customer service bot for years, with the trainer never seeing another cent.

Due to a lack of transparency, a user’s face could end up in a facial recognition database or a predatory advertisement, with virtually no legal recourse.

Louw, the AI trainer in Cape Town, is aware of the privacy trade-offs. Though the income is erratic, he’s willing to accept these conditions to earn money. He struggled with a nervous disorder for years and couldn’t secure a job, but money earned on AI marketplaces allowed him to save for a $500 spa training course to become a masseur.

“As a South African, being paid in USD is more worth it than people think,” Louw said.

Mark Graham, a professor of internet geography at the University of Oxford and author of Feeding the Machine, acknowledges that the money can be meaningful in the short term, but warns that “structurally this work is precarious, non-progressive and effectively a dead end”.

AI marketplaces rely on a “race to the bottom in wages”, added Graham, and a “temporary demand for human data”. Once this demand shifts, “workers are left with no protections, no transferable skills, and no safety net”.

The only winners, Graham said, are “the platforms in the global north [that] capture all the enduring value”.

Carte Blanche Permissions and Seller’s Remorse

Hill, the Chicago-based AI trainer, had conflicting feelings about selling his private phone calls to Neon Mobile. He earned $200 for about 11 hours of calls, but said the app frequently went offline and failed to release overdue payments. “Neon was always shady to me, but I kept using it to get some extra, easy money for bills,” Hill said.

He’s now reconsidering how easy that money was. In September 2025, TechCrunch discovered a security flaw in Neon Mobile that allowed anyone to access users’ phone numbers, call recordings, and transcripts. Hill said Neon Mobile never informed him, and he’s worried how his voice may be misused.

Jennifer King, a data privacy researcher at the Stanford Institute for Human-Centered Artificial Intelligence, finds it concerning that AI marketplaces are unclear about how and where users’ data will be deployed. Without negotiating or knowing their rights, “consumers run a risk of their data being repurposed in ways that they don’t like or didn’t understand or anticipate, and they’ll have little recourse if so”.

When AI trainers share their data on Neon Mobile and Kled AI, they’re granting a carte blanche license (worldwide, exclusive, irrevocable, transferable and royalty-free) to sell, use, publicly display and store their likeness – and even create derivative works.

Kled AI’s founder, Avi Patel, said his company’s data agreements limit use to AI training and research purposes. “The entire business depends on user trust. If contributors believe their data could be misused, the platform stops working.” He said his company vets businesses before selling datasets, to avoid working with those with “questionable intent”, such as pornography, and “government bodies” that they believe could use the data in ways that conflict with that trust.

Neon Mobile did not respond to a request for comment.

According to Enrico Bonadio, a law professor at City St George’s, University of London, the terms of these agreements permit the platforms, as well as their clients, to do “almost anything with that material, forever, with no further payment and no realistic way for the contributor to withdraw consent or meaningfully renegotiate”.

More troubling risks include trainers’ data being used for deepfakes and impersonation. Even though data marketplaces claim to strip the data of any identification, biometric patterns are hard to anonymize.

Adam Coy, an actor from New York, sold his likeness in 2024 for $1,000 to Captions (now Mirage), with an agreement ensuring his identity wouldn’t be used for political means or to sell alcohol, tobacco or pornography, and that the license would expire in a year.

Not long after, friends forwarded him videos featuring his face and voice garnering millions of views. In one Instagram reel, his AI replica claims to be a “vagina doctor” and promotes unproven medical supplements.

“It felt embarrassing to explain it to people,” Coy said. “My feeling [while deciding to sell my likeness] was that most models were going to be scraping the internet for data and likeness [anyway], so may as well be paid for it.”

Coy hasn’t signed up for any AI data gigs since and would only consider it with major compensation.

FAQ: Gig AI Training

What is gig AI training? It’s a new form of work where individuals are paid to provide data – photos, videos, audio, conversations – to train artificial intelligence models.

How much can you earn? Earnings vary widely, from a few dollars for a short task to over $100 a month for regular contributions.

What are the risks? Privacy concerns, potential for data misuse (deepfakes, identity theft), and precarious work conditions with limited protections.

Is my data safe? Data marketplaces claim to anonymize data, but biometric information is difficult to fully anonymize, and there’s always a risk of misuse.

Where can I discover gig AI training opportunities? Kled AI, Silencio, Neon Mobile, Luel AI, and ElevenLabs are some of the platforms offering these opportunities.

Did you know? AI companies are projected to run out of high-quality text data to train on as soon as 2026, increasing the demand for human-generated data.

Pro Tip: Carefully review the terms and conditions of any data marketplace before sharing your data. Understand what rights you are granting and how your data will be used.

