Emily Chow is an IPilogue Writer and a 1L JD Candidate at Osgoode Hall Law School.
Over the past few years, there has been a rise in copyright strikes purportedly aimed at moderating infringing content on YouTube. Although this system may protect copyright holders such as musicians and artists, its potential for misuse can frustrate efforts to hold people and institutions accountable, as the incident discussed here illustrates.
On June 29, 2021, James Burch stood outside the Alameda Courthouse in Oakland, California, in solidarity with the family of Steven Taylor. Taylor, a 33-year-old Black man, was fatally shot while experiencing a mental health crisis at a Walmart in San Leandro. As Burch and other supporters of the Justice 4 Steven Taylor campaign listened intently to the pre-trial hearing from outside the courthouse, officers approached and asked them to move a banner. Burch and a fellow supporter started recording the interaction, exercising their First Amendment right to record or film the police.
The 2:55 video can be found here. At 0:31, the police officer abruptly begins to play Taylor Swift’s Blank Space from his cellphone speakers. Beyond the irony of trying to escape accountability by filling space with copyrighted music, the officer deliberately chose to contaminate the recording in the hope that YouTube’s automated copyright-detection system would strike the video. At 0:53, the officer says: “[the protestors] can record all [they] want, [but] I just know it can’t be posted to YouTube.”
This incident is not the only reported instance of such behaviour by police. These unsettling attempts to evade accountability point to larger issues regarding freedom of speech, anti-Black racism, policing, and opaque algorithms in America and beyond.
The right to record the police in the US has been affirmed in Glik v Cunniffe and ACLU v Alvarez, where the courts determined that recording a public officer’s actions resonates with principles enshrined in the First Amendment: the right to receive information and ideas. The right to record was instrumental in securing a guilty verdict in Derek Chauvin’s trial for the murder of George Floyd.
While the US courts have evidently prioritized the public’s right to film officers, video and live-streaming services like YouTube, TikTok, and Instagram have been far less transparent about how their algorithms target and remove videos. A preliminary search of YouTube’s copyright and fair use policies is unhelpful in determining how videos are assessed on a “case-by-case basis” and instead redirects you to Google’s fair use policies. The Electronic Frontier Foundation (EFF), a non-profit organization whose mandate is to protect user privacy and defend digital civil liberties, has published a more user-friendly guide to YouTube’s removal processes and Content ID algorithm. However, it is important to note that automated processes like Content ID are alleged to disproportionately target independent, marginalized creators, including BIPOC, queer, and disabled creators; several equity-seeking groups have filed lawsuits against YouTube, including a group of Black creators and LGBTQ+ bloggers. TikTok, meanwhile, has come under fire for flagging “vulnerable” creators and preventing their videos from reaching audiences, a practice dubbed “shadow banning.” Moderators identified creators from short clips, and most frequently singled out creators whose bios included tags like “#disabled” or pride flags.
Burch posted the video on July 1, 2021, and it has not been taken down. However, questions remain about the apparent lack of oversight of large US corporations and what this means for activists and creators. Just as the internet has expanded opportunities for mass global viewership and collective action, so too has it extended the reach of existing power structures and inequalities.