AI can uplift humanity — but not if it starves the very people who feed it

written by Liina Raud
edited by ChatGPT
image by ChatGPT


A few months ago, I was looking around the job market to see what kinds of projects were out there for independent photographers. I spent tens of thousands of dollars on my higher education to become an academically educated creative professional who knows what she’s doing — someone who understands the craft, knows the tools firsthand, and can deliver the highest level of quality. I also spent time I snatched away from being present with my kids and taking care of the household. Instead, I did homework, wrote essays, read about fine art, wrote analyses of visual culture — you name it. Don’t get me wrong, I loved it. I was highly motivated and super successful in my studies.

Only to find that, a few years later, if I don’t learn how AI training works — how licensing deals with platforms like Reddit function, where the training data comes from, what the agreements and loopholes are — my creative work will be exploited. Used not for the common good but for someone else’s profit.

From now on, I will use ChatGPT not to write my articles or do my design work, but to help me think through and organize my thoughts. That’s what I’m doing here.

But back to the job posting.

$650 to photograph headless runners on the Lakefront: eight days, over 40 hours, for an AI company building a sports dataset for training. Anyone who knows photography knows how difficult these situations are. You need a full understanding of sports photography to capture photos that are sharp and neither over- nor underexposed — and that is a skill in itself. That’s what you go to school for.

And they offered minimum pay: $650 for 40 hours works out to $16.25 per hour.

Now, doing the math, I realized this was insane exploitation of creative labor. If I take the gig, I make my $650 — but the dataset I help build takes work away from the thousands of photographers who would otherwise be hired to shoot running events. Meanwhile, the data is sold at a profit that will last generations. The images become part of a dataset used to train models that could replace not only me, but entire teams of photographers.

However, look at this:
“OpenAI’s H1 2025: $4.3B in revenue, $13.5B in loss.”
Source: https://www.reuters.com/technology/openais-first-half-revenue-rises-16-about-43-billion-information-reports-2025-09-30/

So in today’s world, it makes sense to be in the data business. Let OpenAI take the loss while the data-collection companies enjoy the revenue.

This is exactly the problem — it’s not even about training AI. It’s about selling data at enormous profit while giving artists crumbs. It shows no respect for creatives. And yes, we creatives haven’t historically been “about the money,” but we do have dignity.

The legal challenges creators face — complex licensing deals, lack of transparency, style mimicry, disappearing attribution — are overwhelming for most of us. We just want to make art, not war.

If this exploitation at least made us collectively smarter, more educated, or more connected, it would still hurt — but it would have purpose. The truth is harsher: the value generated goes to a tiny cluster of companies while the creators receive crumbs.

Counter-Argument (and the part that actually makes sense)

But then again — let me challenge myself here, because there is a real counterpoint. If I want an image to illustrate this article, what are my options? Pull a photo from someone’s website without permission and risk getting sued? That’s not ethical. That’s not legal. And honestly, it’s stressful. Or buy a photo from a photo bank? I don’t want to pay more than I’m already paying. But if I ask ChatGPT to make me an image of a headless runner — boom, I get something unique that harms no photographer and comes with zero copyright problems. It feels almost liberating. It almost convinces me that maybe this whole AI data collection thing isn’t completely bad after all. And honestly, it isn’t.

The technology itself isn’t the villain here. The problem is how the data is collected, how the companies profit from it, and how the original creators are compensated — or not compensated. The tool can be ethical. The process can be fair. We just need to make sure the people whose work builds these models aren’t left behind or underpaid while everyone else gets rich.


Conclusion

We need fair pay — not symbolic paychecks that pretend to honor creative labor but really just keep an exploitative system running. We need ethical data practices that don’t rip creativity out of the hands of the people who made it and funnel it straight into private profit centers. And we need AI that actually benefits the public, not AI whose foundation is quietly built on underpaid artists who are already struggling to stay afloat. And maybe we need an option to buy someone’s creation through ChatGPT.

What we absolutely do not need is a system built on the exploitation of desperate creatives who are doing everything right — studying, investing, practicing, raising families, pushing culture forward — only to find themselves feeding machines that will eventually replace them. We don’t need more value concentrated into a tiny cluster of companies while the people who provide the raw material of culture are left with nothing. We don’t need innovation that eats its own creators alive.

And let me be clear: artists aren’t anti-technology. We aren’t afraid of progress. We aren’t resisting the future. We just don’t want to be erased by it. If the system keeps taking without giving back — without respect, without fairness — then eventually we’ll stop giving. We’ll stop creating. And that’s not drama — that’s a survival instinct.

AI can uplift humanity — but not if it starves the very people who feed it.
