By John Nyman
Throughout her doctoral studies, Amanda Turnbull has grappled with the legal consequences of “machines doing things with words.” Her timely dissertation, Law, Language, and Authority: The Algorithmic Turn, completed in August 2024, offers a measured yet unflinching reflection on how artificial intelligence is transforming society and the law. Speaking over Zoom from her home in New Zealand, where she is now a lecturer in Cyberlaw at the University of Waikato’s Te Piringa Faculty of Law, Turnbull shared some insights from her research.
“With AI, there’s an algorithm at the end of the hammer.”
At the heart of Turnbull’s thesis is her contention that AI is “more than just a tool.” When we think of a tool, Turnbull suggests, we usually think of something like a hammer. There’s always a person at the end of the hammer, and they’re responsible for what the hammer does. In the context of algorithmic systems, commentators have proposed different alternatives for who that responsible party might be, including the programmer, the end user, and the company that owns the technology. But these approaches obscure the true novelty—and danger—of AI. With AI, Turnbull explained, “there’s an algorithm at the end of the hammer.”
Turnbull’s focus on algorithmically generated language reflects her thesis’s remarkable origins at the University of Ottawa’s Department of English. Although her original supervisor, the late Professor Ian Kerr (Canada Research Chair in Ethics, Law & Technology), soon recognized that it belonged in a faculty of law, Turnbull’s dissertation maintains its indebtedness to the mid-century philosopher of language J.L. Austin—who, Turnbull was surprised to learn, was a close friend of the legal theorist H.L.A. Hart. Austin’s “speech act theory” emphasized what words do in addition to what they mean. Adapting this framework to contemporary technology, Turnbull is less interested in what a generative AI like ChatGPT says than in the difference it makes that a non-human actor says it.
The first of three “pillars” of Turnbull’s dissertation thus explores the consequences of AI’s participation in writing literary works. To be clear, “there is no such thing as an AI author” according to Turnbull—but that doesn’t mean AI should have no legally cognizable role at all. Drawing on her early career as a classical flautist, Turnbull recognized that generative AI’s imitative reproduction of human-authored texts in its training data isn’t so different from the work of human artists. In her words, “there’s an amount of imitation that necessarily occurs when you’re being creative.”
Unexpectedly, Turnbull found inspiration in the “spectrum of authoring” developed by Saint Bonaventure in the 13th century, long before the modern notion of authorship was developed. Generative AI, she asserts, resembles Bonaventure’s “commentator,” a mid-point between an author and a mere scribe, who clarifies and expands on pre-existing texts. By referring to generative AI as a commentator or “expositor,” lawmakers can reserve copyright for human authors without turning a blind eye to the authority embodied in algorithmically generated language.
That authority is at the centre of the second and third “pillars” of Turnbull’s research, which examine the legal implications of algorithmic contracting. As coined by Lauren Henry Scholz in 2017, an algorithmic contract is a contract in which the main terms and conditions are drafted not by human actors but by computer systems.
Key for Turnbull is how the systems behind algorithmic contracts exercise “derivative” authority without legal intent. For this reason, algorithmic contracting is “in no way” similar to earlier technologies such as click-wrap agreements, standard form contracts, or the archetypal pen and paper. In other words, there is no “functional equivalence” between algorithmic contracting and other platforms. Courts should therefore reconsider both the notion of technological neutrality and the application of intent-based contract doctrines, including the doctrine of unconscionability recently revived by the Supreme Court of Canada in Uber v Heller.
In the third “pillar” of her thesis, Turnbull discusses how unconstrained algorithmic contracting creates the conditions for technology-facilitated sexual violence. She focused on Uber and the instances of sexual violence involving drivers and passengers documented in its 2019 safety report. Sadly, Turnbull described this chapter as “the easiest to write,” since it quickly “became obvious that this is a new way of exerting harm.” Yet the solutions to these problems are far from straightforward. In Uber’s case, the issue permeates the firm’s corporate culture and overall attitude toward innovation, she contends, which has failed to truly consider “the whole web of entanglements” impacting algorithmic language.
Ultimately, dealing fairly with AI will require “extraordinary ways of thinking” on the part of courts and regulators. But Turnbull is confident the law can adapt. Contract law and copyright law, for example, can be seen as fields that have constantly adjusted to new technologies. By approaching the algorithmic turn with both bravery and nuance, courts can learn to recognize AI as something that is more than a tool, but no substitute for genuine human authority and intent.
Going forward, Turnbull is keen to use her dissertation, which was supervised by IP Osgoode Director Carys Craig, as a basis for further explorations of technology-facilitated gender-based violence, including platform violence and “onlife” harm—a term coined by Mireille Hildebrandt to describe the intersection between experiences online and in “real life.” At the same time, Turnbull is interested in how algorithms have played a positive role in certain legal contexts. Although, as she says, “we’re hot to jump on technology and focus on the negatives,” in a forthcoming article on the 1999 Canada-US Pacific Salmon Agreement, she and co-author Donald McRae will explore how, “in this case, the algorithm solved the dispute.” Turnbull also plans to publish her dissertation as a book and to return to another book she began writing even before starting her PhD—a novel that is, aptly, about Austin, Hart, and the father of computer science, Alan Turing…
John Nyman is a student at Osgoode Hall Law School (JD ’26) and an IP Osgoode JD Research Fellow.