There’s no shortage of commentary on the brokenness of the On-Campus Interview (OCI) process. It can be stressful and confusing, and many students – for whatever reason – come out of it without having secured a job. Having gone through the process myself, I can confirm that even lawyers and law firm recruiters will tell you there has to be a better way to hire. That said, could artificial intelligence (AI) improve this process?
In recent years, much has been made of the impact AI can have on different areas, including job recruitment. Hiring is expensive: recent estimates put the cost of hiring a new employee at roughly $250,000 USD [1]. With companies accumulating ever more data on their hiring processes and employees, AI is being looked at as a tool that could lower hiring costs and improve hiring efficiency [2]. This all sounds great, right? Unfortunately, AI has the potential to be just as biased during the hiring process as humans are. AI hiring tools are trained by humans, typically on data sets from the real world. Training an AI on data that reflects your current workforce can expose the fact that hiring managers have given preference to certain types of candidates over others [3]. One example is the AI recruiting tool that Amazon developed in 2014 to hire engineers. As soon as the AI went online, it started discriminating against women [4]. The AI was trained on data about the existing engineers in Amazon’s workforce [5]. Due to hiring biases, that workforce was disproportionately male [6]. As a result, the recruiting tool favoured male candidates over female candidates on the basis of gender alone.
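To make the mechanism concrete, here is a minimal, purely hypothetical sketch (not Amazon’s actual system; the feature names and synthetic data are invented for illustration). A simple perceptron is trained on fabricated “historical” hiring records in which one group needed far more experience to be hired, and the fitted model ends up penalizing group membership directly – bias in the data becomes bias in the model:

```python
# Hypothetical illustration of training-data bias. All data is synthetic.
import random

random.seed(0)

def make_history(n=1000):
    """Generate synthetic past hiring decisions.
    gender: 0 = group A, 1 = group B. The historical (biased) rule
    requires group B to have far more experience to be hired."""
    data = []
    for _ in range(n):
        gender = random.randint(0, 1)
        exp = random.uniform(0, 10)          # years of experience
        threshold = 4 if gender == 0 else 8  # biased historical cutoff
        hired = 1 if exp > threshold else 0
        data.append((gender, exp, hired))
    return data

def train_perceptron(data, epochs=50, lr=0.01):
    """Fit a linear classifier to the historical decisions."""
    w_gender, w_exp, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for gender, exp, hired in data:
            pred = 1 if w_gender * gender + w_exp * exp + b > 0 else 0
            err = hired - pred
            w_gender += lr * err * gender
            w_exp += lr * err * exp
            b += lr * err
    return w_gender, w_exp, b

history = make_history()
w_gender, w_exp, b = train_perceptron(history)

def predict(gender, exp):
    return 1 if w_gender * gender + w_exp * exp + b > 0 else 0

# Two candidates with identical experience, differing only in group:
print("group A, 6 yrs experience:", predict(0, 6.0))
print("group B, 6 yrs experience:", predict(1, 6.0))
print("learned weight on the gender feature:", round(w_gender, 2))
```

The model was never told to discriminate; it simply learned that group membership predicted past hiring outcomes, so it typically assigns the gender feature a negative weight and rejects equally qualified candidates from the disadvantaged group.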
As this example demonstrates, selection bias is particularly impactful when AI recruitment tools are deployed in fields traditionally dominated by white men, like law. This brings me back to the OCI process. I said the process is broken; one major reason is that members of minority groups often face discrimination during the selection process [7]. In theory, AI tools should be able to mitigate this issue. However, the Amazon example shows that an AI system will only be as good as the data used to train it.
Large corporations like Amazon have the resources and market power to gain access to high-quality data, and have still managed to run into data-related issues with their AI. What chance do smaller players in the AI industry have of filtering out low-quality data when training their AI, if corporate giants aren’t always successful at it? If Canada wants to foster innovation and combat AI bias, we need to promote quality data. One way to accomplish this goal would be to introduce a Text and Data Mining (TDM) exception into the Copyright Act. As Prof. Pina D’Agostino noted in her submission to the Standing Committee on Industry, Science and Technology, a TDM exception would allow data to be mined for AI training purposes, and would provide Canadian stakeholders with the legal clarity they need to access quality data for developing their AI [8].
The application of AI to law student recruitment highlights the potential for AI to affect our lives in ways we may not anticipate. Law students, as soon-to-be lawyers, are in a unique position to shape how AI is regulated moving forward. The future direction of AI will affect all of us, not just students interested in IP. It’s important to think about how AI interacts with the law, and how the law may need to adapt to regulate AI effectively. For example, is a TDM exception the right approach, or would another approach be better?
At the risk of sounding too preachy, I think AI is something all law students should be informed about. Even if you’re not a law student, AI will have an increasingly direct impact on your everyday life moving forward. The more you know about it, the better equipped you’ll be to adapt and respond to it. In recent years, IP Osgoode has been at the forefront of educating the legal community on the challenges posed by AI through its Bracing for Impact conferences [9]. I encourage you to read over the conference proceedings; they’re a great way to learn about AI from some of the foremost experts on AI in Canada and abroad.
Written by Lucas Colantoni, Osgoode JD Candidate, enrolled in Professors D’Agostino and Vaver’s 2019/2020 IP & Technology Law Intensive Program at Osgoode Hall Law School. As part of the course requirements, students were asked to write a blog on a topic of their choice.
[1] Rudina Seseri, “How AI Is Changing The Game For Recruiting” Forbes (29 January 2018), online: <https://www.forbes.com/sites/valleyvoices/2018/01/29/how-ai-is-changing-the-game-for-recruiting/#78abd1d71aa2>.
[2] Ibid.
[3] Aaron Holmes, “AI could be the key to ending discrimination in hiring, but experts warn it can be just as biased as humans” Business Insider (8 October 2019), online: <https://www.businessinsider.com/ai-hiring-tools-biased-as-humans-experts-warn-2019-10>.
[4] Jonathan Penney, Address (Why is Data so Important to the Development of AI?, presentation delivered at Osgoode Hall Law School, York University, Toronto, 21 March 2019), online: <aichallenge.osgoode.yorku.ca/files/2019/05/Panel-1-Why-is-Data-Transcript.pdf>.
[5] Ibid.
[6] Ibid.
[7] Jane Gerster, “Reality check: Does name-blind hiring help improve diversity?” Global News (17 July 2018), online: <https://globalnews.ca/news/4329893/name-blind-hiring-diversity>.
[8] Pina D’Agostino, “Submission to the Standing Committee on Industry, Science and Technology for the Statutory Review of the Copyright Act” (IP Osgoode, 2018), online: <https://www.ourcommons.ca/Content/Committee/421/INDU/Brief/BR10269431/br-external/DagostinoGiuseppina-e.pdf>.
[9] Osgoode Hall Law School, “Bracing for Impact: The Artificial Intelligence Challenge” (2018 and 2019), online: Osgoode Hall Law School <aichallenge.osgoode.yorku.ca>.