AI Plagiarism Debate
Isn't AI simply stealing from writers and artists without permission?
Aside 1.2
A reader asked whether AI is simply stealing from writers without permission. It’s a fair question. Authors and artists aren’t imagining the risks: George R.R. Martin, John Grisham, and others sued OpenAI in 2023; Anthropic recently settled a lawsuit over pirated training data; and Microsoft now faces similar claims. A global study predicts musicians and video creators could lose nearly a quarter of their income by 2028, while freelancers, translators, and illustrators are already feeling the pinch. The fear of being copied or replaced is real.
But the answer isn’t so clear-cut. Large Language Models don’t search for and copy stories word-for-word. They learn patterns from vast amounts of text, a bit like how a student learns to write by reading many books. Sometimes that includes publicly available works whose use is now being challenged in court. And when an AI writes “in the style of” an author, it isn’t pulling passages directly from them—it’s imitating tone and rhythm. That still raises tough questions about fairness and compensation, which the legal system hasn’t fully settled.
For me, the point of this newsletter isn’t to impersonate famous writers or find shortcuts to success. It’s to explore how AI can be a partner in creativity. Yes, it can generate text quickly, but the real skill lies in working with it—deciding when to lean on the machine, when to diverge, and how to blend its output with your own voice. These systems are not going away, and the ability to use them thoughtfully may become a defining skill in every industry.

