15,000 Eyes in New York City

Photo Credits: Lianhao Qu (Unsplash)

Tiffany Wang is an IPilogue Writer, IP Innovation Clinic Fellow, and a 2L JD Candidate at Osgoode Hall Law School.

 

“Always the eyes watching you and the voice enveloping you. Asleep or awake, indoors or out of doors, in the bath or in bed—no escape. Nothing was your own except the few cubic centimeters in your skull.”

Nineteen Eighty-Four by George Orwell

Sprawling throughout New York City, more than 15,000 cameras observe and record faces and movements. These seemingly omnipresent lenses generate facial-recognition data for the New York Police Department (“NYPD”), capturing footage used to identify individuals in criminal enforcement efforts.

Surveillance by security cameras is not unique to the Big Apple. Chinese police forces deploy facial-recognition algorithms in real time, notifying security personnel whenever a person of interest or target comes into view. New York’s situation appears similar. Amnesty International and volunteer researchers mapped cameras throughout three of the city’s five boroughs: Manhattan, Brooklyn, and the Bronx. In total, they counted 15,280 cameras surveilling the population of New York.

Matt Mahmoudi, the team’s lead artificial intelligence researcher, says that a person is “never anonymous” and that faces “can be tracked by facial-recognition technology using imagery from thousands of camera points across New York.” Not only do public cameras blanket the city, but private cameras installed by businesses and homeowners also record footage that the NYPD may access with the owners’ permission.

The Big Apple’s most surveilled neighborhood is East New York, Brooklyn, where 577 cameras record activity in an area of less than two square miles. Over ninety percent of East New York’s residents are racial minorities.

Idemia, a French company specializing in facial-recognition software, supplies United States federal and state police forces. In 2019, the National Institute of Standards and Technology reported that Idemia’s algorithms are prone to confusing the faces of racial minorities. Similarly, the Department of Homeland Security found that darker skin tones reduce the accuracy of computer-vision algorithms.

Government monitoring and generation of facial data raise concerns of racial bias. If commercial algorithms continue to demonstrate significant errors in identifying individuals with varying skin tones, these concerns will quickly escalate into technological racism. Tensions may ultimately lead the NYPD to abandon its use of facial-recognition technology, following in the footsteps of San Francisco.

Could over 15,000 eyes create a dystopia? After all, clocks are still striking thirteen.