
AI and copyright law: Osgoode expert describes what’s at stake

Carys Craig, a professor at York University and director of IP Osgoode, doesn’t mince words when it comes to the collision of artificial intelligence (AI) and copyright law.

“Copyright is likely to function not as a safeguard for our culture, but as a tool to advance corporate power,” she says, cutting to the heart of a debate that’s reshaping how society values creativity in an age increasingly defined by algorithms.


Generative AI technologies – systems capable of producing text, images and even music – are trained on vast troves of material, often scraped from the internet without explicit permission from the original creators.

This, argues Craig, a faculty member at Osgoode Hall Law School specializing in intellectual property, has exposed a dangerous fault line in copyright law. It risks privatizing shared cultural knowledge while sidelining independent artists and creators. Tech giants and publishing houses are already cashing in, striking lucrative licensing deals that leave smaller players – grassroots innovators, authors and musicians – locked out. The result? A shrinking cultural commons and an amplification of algorithmic bias, where only the largest, most corporate-friendly datasets shape the future of AI.

Craig’s critique goes beyond the mechanics of copyright enforcement. She challenges the very foundation of the system: the idea that copyright law actually protects creators.

“Copyright casts the author’s work as an alienable commodity,” she explains, noting that the current framework disproportionately benefits intermediaries – the platforms, publishers and corporations that control distribution.

Strengthening copyright, she argues, won’t fix this inherent imbalance. Instead, she calls for a more radical approach: robust public funding for the arts, stronger labour protections for creators and tax policies that support cultural production, not just corporate profit.

One of Craig’s most provocative proposals? Leaving AI-generated works firmly in the public domain. In an era where machines can churn out endless content with the click of a button, she sees this as essential to preserving the value of human creativity.

“Protecting AI-generated outputs with exclusive rights would create unnecessary incentives to mass-produce AI content,” she warns. For Craig, copyright should be reserved for human authorship – a fundamentally social and dialogic act that no algorithm can truly replicate.

This position puts her at odds with some international jurisdictions, including the U.K. and South Africa, that have already created legal provisions for AI-generated works. But in Canada, where copyright law still emphasizes human authorship, Craig sees an opportunity to draw a clear line. She envisions a future where courts and policymakers prioritize transparency, perhaps even requiring a system for registering and authenticating human-authored works to distinguish them from AI-made imitations.

Craig also defends the use of copyrighted material for training AI under the principle of fair dealing – a cornerstone of Canadian copyright law designed to balance creators’ rights with the public interest. While she agrees that AI-generated outputs that closely mimic existing works should be considered copyright infringement, she cautions against giving copyright holders control over how their work is used as data. To do so, she argues, would be to transform copyright into a tool for stifling innovation, rather than fostering creativity.

The stakes are undeniably high. As AI becomes increasingly embedded in our creative lives, courts and lawmakers face fundamental questions about authorship, originality and the very definition of art.

Craig’s research asks society to confront a critical choice: prioritize corporate profits, or fight for a future where human creativity continues to flourish? As she puts it, “We need a copyright system that serves people – not corporations.”
