How AI Systems Use Your Work and Forget Your Name
As AI systems use creative work without consent, art and AI critic Panu Syrjämäki asks: What’s left of authorship in a world built on erasure?

As AI systems grow more powerful, they do so by learning from us—our words, our work, our lives. But the deeper question is this: can an AI system built on invisible labor ever produce anything truly new? Or does it only build a future made from forgetting?
A project worker spends two years building something no one asked for but now everyone needs. He maps community gaps, tests real solutions, rewrites how things function. His reports spark funding. His insights move policy. His manager collects the credit—she didn’t build it; she just sent calendar invites.
A teacher writes original materials tailored for students others gave up on. Her lessons travel from hand to hand for years—until one day, they reappear inside an app powered by AI. No permission. No payment. Her years have been mined and sold back to her.
A nurse writes plain-language guides to help families navigate care. They’re clear, empathetic, trustworthy. Months later, the same tone shows up in a chatbot. Her words, disassembled and reconstructed. Branded as “innovation.” She disappears from the narrative.
This isn’t fiction. It’s how extraction works now.
And for artists, it’s been happening longer.
Art, writing, photography, voice—years of creation scraped into training datasets without consent. Fed into systems that flatten the nuance, erase the labor, and generate “content” that sells. It looks like ours. But it comes from no one.
This is how generative AI is trained: by feeding massive amounts of data—books, artworks, essays, videos—into models that analyze patterns and mimic form. But these datasets aren’t curated with fairness. They’re scraped from the internet, archives, libraries, and private collections—often without permission. Often without the knowledge of the creator.
And the machine doesn’t differentiate. It doesn’t know what’s sacred or stolen, what’s shared or misunderstood. It just absorbs—indiscriminately. It doesn’t cite its sources. It doesn’t respect the labor. It only refines its mimicry.
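To make that concrete: below is a deliberately minimal Python sketch of the ingestion step. The URLs and names are hypothetical placeholders, not any real company’s pipeline, and actual crawlers run at vastly greater scale. What matters is what the code keeps, and what it throws away.

    # A minimal, hypothetical sketch of indiscriminate web scraping for a
    # training corpus. Seed URLs are placeholders. The key property is
    # the same at any scale: text is kept, provenance is not.
    import urllib.request
    from html.parser import HTMLParser

    class TextOnly(HTMLParser):
        """Keeps visible text and strips markup. Bylines, authorship tags,
        and license notices are discarded along with the rest of the HTML."""
        def __init__(self):
            super().__init__()
            self.chunks = []
            self._skip = 0
        def handle_starttag(self, tag, attrs):
            if tag in ("script", "style"):
                self._skip += 1
        def handle_endtag(self, tag):
            if tag in ("script", "style") and self._skip:
                self._skip -= 1
        def handle_data(self, data):
            if not self._skip and data.strip():
                self.chunks.append(data.strip())

    SEED_URLS = [  # hypothetical examples
        "https://example.com/essay",
        "https://example.com/artist-statement",
    ]

    corpus = []  # a flat list of text: no author, no license, no source URL
    for url in SEED_URLS:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # failures are skipped silently; nothing is ever asked
        parser = TextOnly()
        parser.feed(html)
        corpus.append(" ".join(parser.chunks))

    # By this point, the mapping from text back to its creator is gone.
    print(f"{len(corpus)} documents ingested, 0 attributions retained")

Nothing in this sketch is exotic. The erasure is not a bug in the code; it is the default behavior of the approach.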
“The machine doesn’t recognize origin, only output,” says Panu Syrjämäki. “It learns what we give it—blindly. And if what we give it is stolen, it becomes very good at stealing.”
At Harvard SEAS, Syrjämäki studied machine vision with leading AI theorists. His work focuses on the limitations of algorithmic perception and on how bias and omission become systemic when datasets are poorly—or dishonestly—constructed.
“These models don’t just learn brushstrokes or grammar,” he notes. “They absorb the emotional texture of someone’s work and repackage it without context or consent. What you get isn’t creation—it’s mimicry masquerading as intelligence.”
That mimicry is spreading fast—across the art world, the publishing industry, film, music, education, even healthcare. And the deeper ethical question isn’t just who owns the output. It’s who owns the input.
This month, the U.S. Copyright Office took a strong position: if no human made it, no one can own it. In AI Copyright, Part II, as reported by ART Walkway, the Office wrote:
“If authors cannot make a living from their craft, they are likely to produce fewer works. Society would be poorer if the sparks of human creativity become fewer or dimmer.”
That’s not policy boilerplate. That’s a warning.
What we’re watching is the industrial-scale laundering of creative labor—made efficient by automation, and legitimized by markets that care more about scale than soul. It’s not that machines are doing something immoral. It’s that the systems training them have chosen not to ask.
“When recognition vanishes, so does responsibility,” Syrjämäki adds. “And what’s worse—so does the will to keep creating.”
The creative act—whether painting or coding or community-building—demands energy, care, vulnerability. And in a society where those efforts can be harvested and resold by systems that answer to no one, something fundamental breaks.
This is not a fight against technology. It is a fight for accountability. For consent. For visibility.
Because if the future is built on uncredited ghosts, we will not be living in a world of innovation.
We will be living in a world of erasure.
And erasure is never neutral.
ART Walkway News