US police have gained access to a database that contains 30 billion photos scraped from Facebook, and according to the CEO of the company developing the software, authorities have already accessed it more than 1 million times.
The company behind the software is Clearview AI, and what it built is a facial recognition database designed to track and identify individuals. Clearview AI markets the technology as advancing public safety: it claims the tool reduces crime, fraud, and child abuse and exploitation, and exonerates people wrongfully accused of crimes, making communities safer and commerce more secure.
Despite these claims, Clearview AI has come under scrutiny from digital rights groups, which point to how its database was scraped from Facebook and to instances of authorities wrongfully arresting individuals identified through Clearview AI. Notably, Clearview AI CEO Hoan Ton-That admitted in a BBC interview that the company obtained its 30 billion Facebook photos without users' knowledge, which enabled it to expand rapidly.
Furthermore, Facebook's parent company, Meta, has pushed back against Clearview AI, sending it a cease-and-desist letter in 2020 for violating its users' privacy rights and demanding that it stop accessing any data, photos, or videos from Meta's services. A Meta spokesperson told Insider that it had also banned the company's founder from all of its services.
Digital privacy advocates object as well. Caitlin Seeley George, the director of campaigns and operations for Fight for the Future, a non-profit digital rights advocacy group, told Insider via email that "Clearview is a total affront to peoples' rights, full stop, and police should not be able to use this tool." She added that authorities often use the facial recognition tool without approval from their departments and that there are no laws in place to stop them from doing so.
Notably, CNN reported just last year that Clearview AI claimed a client list of some 3,100 US agencies, including the FBI and the Department of Homeland Security. The Miami Police Department did not hide its use of the software, saying it is used all the time to investigate crimes ranging from shoplifting to murder.
Because Clearview AI's database is built on 30 billion photos of people, everyone is at risk, even people who believe they have nothing to hide from authorities. Matthew Guariglia, a senior policy analyst for the international non-profit digital rights group Electronic Frontier Foundation, explained to Insider that Clearview AI's extensive database creates a "perpetual police line-up" and that "you don't know what you have to hide".
"Governments come and go and things that weren't illegal become illegal. And suddenly, you could end up being somebody who could be retroactively arrested and prosecuted for something that wasn't illegal when you did it," said Guariglia.
"I think the primary example that we're seeing now is abortion. In that people who received abortions in a state where it was legal at the time, suddenly have to live in fear of some kind of retroactive prosecution - and suddenly what you didn't think you had to hide, you actually do have to hide," he continued.