How does Myrtle Beach SC police use facial recognition software?
For more than a year, the Myrtle Beach Police Department used facial recognition software that collects and stores mass amounts of photographs and other information posted online, records obtained by The Sun News reveal.
The Myrtle Beach police’s use of the software, created by the tech company Clearview AI, means that officers potentially had access to photos residents and visitors have posted and appeared in on the internet but never intended to share with law enforcement. Though officers used the program on a trial basis and never paid for it, that trial period gave Myrtle Beach police access to Clearview AI’s search engine and investigative tools, which collect photos posted to Instagram, Facebook, Twitter and other parts of the internet and store them in a massive database with other information. When police officers have a photo of a person and plug it into the software, Clearview AI pulls up possible matches from its database, allowing officers to identify people they might not otherwise have been able to.
Records obtained by The Sun News via a Freedom of Information Act request, in addition to other reporting, reveal Myrtle Beach officers searched Clearview AI’s database dozens of times for people they suspected of crimes. In those cases, police officers were able to take a photo of a person they believed committed a crime, submit that photo to the software, and use information pulled from the database.
Cpl. Thomas Vest, a spokesperson for the police department, said no arrests were made after officers used the software.
Emails obtained by The Sun News show the department used Clearview AI’s program from February 2020 through April 2021. Vest confirmed the department used the program during that time period.
On the whole, Vest said, Myrtle Beach police used the Clearview AI software as part of their investigative work but ultimately decided not to buy it from the company.
“Our intelligence unit and officers who worked with them were authorized to have access to the program as a trial,” Vest wrote in an email in response to questions from The Sun News. “After the trial we chose not to purchase the program, and that was the end of our involvement. In our experience, investigative tactics utilizing technology we currently have has been successful at identifying and holding persons who commit crimes in our city accountable.”
Still, advocates and others said, the police department’s use of the technology raises questions about what constitutes privacy online, and whether law enforcement agents should have ready access to photos and other information people post on personal websites or social media accounts.
“We have certain rights in this country where we can go about our daily lives without surveillance…what this in effect does is just completely undermine that,” said Frank Knaack, the executive director of the American Civil Liberties Union in South Carolina. “We don’t know how they’re using it. There are no state checks on it, there are no court checks on law enforcement being able to use this technology. Officers could spy on people they want to keep an eye on.”
How Clearview AI works, and how Myrtle Beach police used it
In its marketing materials, Clearview AI claims that its software can be used as a powerful tool to track down suspected criminals when all a police officer may have is a photo of the person. In one example touted by the company, police officers in Las Vegas were alerted that a man was sharing child pornography over the internet, and only had a photo of the suspect to go off of. Officers, the company says, were able to plug that photo into Clearview AI’s search engine, which collects mass amounts of photographs posted online, and got a match in return: The man appeared in the background of someone else’s Instagram post. From there, officers were able to track the man down, and arrest him.
Clearview AI has also boasted that its technology was used by federal authorities to track down rioters who stormed the U.S. Capitol on Jan. 6. In those cases, officers could take screenshots of video recordings from inside the building, plug those images into Clearview AI’s search engine, and get results that could help identify the person.
It appears Myrtle Beach police used Clearview AI in a similar way.
The emails obtained by The Sun News show that Cpl. Chris Tyndall, a detective in the Myrtle Beach police’s intelligence unit, created an account with Clearview AI in February 2020 and began using the software then. About a month later, in March, Tyndall invited Lt. Chris Smith, an investigations supervisor, to create an account himself, which Smith appears to have done. Tyndall was the primary user of the Clearview AI software, Vest said.
Between February 29, 2020, and April 3, 2021, the emails show that a user within the Myrtle Beach Police Department logged into the Clearview AI program 13 times. The majority of the logins occurred in the first half of 2020, between February 29 and May 11, though there were two additional logins in November 2020 and April 2021.
Though the recorded logins were sporadic, the emails also suggest officers used the program regularly into 2021. In October 2020, for example, police detective Bryan Stillwell emailed photos of people to Tyndall and asked him to run the photos through Clearview AI’s software. Stillwell made a similar request in January 2021, the emails show.
Data obtained by BuzzFeed News, which first reported that hundreds of local police departments across the country had used Clearview AI, shows that Myrtle Beach police used the software between 101 and 500 times throughout its trial. In messages sent to users, Clearview AI encourages law enforcement officers to conduct as many searches as they can with the software.
“Don’t stop at one search. See if you can reach 100 searches. It’s a numbers game,” a company message sent to Smith when he set up his account said. “Our database is always expanding and you never know when a photo will turn up a lead.”
BuzzFeed News’ data shows that 30 other South Carolina law enforcement agencies used trials of Clearview AI’s software, including the Georgetown County Sheriff’s Department, the Charleston County Sheriff’s Department and the Beaufort police. The state attorney general’s office and SLED also used the technology. The Charleston County Sheriff’s Department and the Spartanburg police used the Clearview AI software most frequently, with the agencies tallying between 501-1,000 and 1,001-5,000 searches, respectively.
“The program was used during investigations on persons involved or believed to be involved in crimes,” Vest said. “It was used alongside standard investigative methods, and the results were not used to make arrests.”
By early January 2021, police officials began considering whether to purchase a subscription to the software, though the department ultimately opted not to do so, Vest said.
During the agency’s trial period with Clearview, Vest said, the chief of police was informed that officers were using the technology, but Mayor Brenda Bethune and members of city council were not. Bethune said it’s her understanding that the software was only used on a trial basis and was never purchased by the city.
“Had we continued the use of the program and intended to purchase the program, we would share the results of the trial and justification for purchase with the City Leadership Council and the Mayor,” Vest said. “We chose not to purchase the program, and our involvement with the company ended there.”
Ethical concerns about facial recognition software
Generally, artificial intelligence software works by collecting a massive amount of data into a centralized database, and then deploying code to detect patterns in that data and draw conclusions based on those patterns. Speech recognition software, like the voice input used in Google Translate, is a good example of how an artificial intelligence program can take a mass amount of data (multiple users all pronouncing the same words slightly differently) and conclude which words best match what a person is saying. Those “conclusions” reached by the software program are then added back into the central database, allowing the program to “learn” as it keeps running over and over.
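That feedback loop can be sketched in a few lines of Python. This is a toy illustration only, not Clearview AI’s actual code: the feature tuples, labels and the nearest-neighbor rule below are all invented for demonstration.

```python
# Toy sketch of an AI "learning loop": a nearest-neighbor classifier whose
# conclusions are fed back into its database, so later queries benefit from
# earlier ones. All features and labels here are made up for illustration.

def distance(a, b):
    # Count how many features differ between two samples.
    return sum(1 for x, y in zip(a, b) if x != y)

def classify(database, sample):
    # Label the sample with the label of its closest stored example.
    _, label = min(database, key=lambda item: distance(item[0], sample))
    return label

# Seed database of (feature tuple, label) pairs.
database = [((0, 0, 1), "hello"), ((1, 1, 0), "goodbye")]

# Classify a new sample, then add the conclusion back into the database.
new_sample = (0, 1, 1)
label = classify(database, new_sample)
database.append((new_sample, label))
```

Each pass grows the database, which is the sense in which such a program “learns” as it keeps running.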
Artificial intelligence software can also be used to analyze photographs and find patterns in the images. Clearview AI’s facial recognition software falls under that category.
Feng Luo, a professor at Clemson University and the founding director of Clemson’s AI Research Institute for Science and Engineering, explained that facial recognition software works by having a computer program recognize patterns in the images, or faces, it’s reading. Those patterns could include everything from how a person’s eyes and nose are shaped, to their facial hair, to the bone structure of their face, based on shadows, creases and indentations visible in a photograph.
“The general idea is to do a classification tool, you want to classify the images into different categories,” Luo said. “We can easily separate the features…(but) it’s a tough task. You need a large training dataset.”
Clearview AI, according to the company, gathers up photos and other information publicly available on the web, from public social media accounts to mugshot websites, and then deploys a facial recognition program to categorize those images and match them to photos police officers submit. That means that if police have a somewhat blurry photo of a person they suspect of committing a crime, perhaps taken from a security camera, they can plug that photo into Clearview AI’s software and receive possible matches in return.
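The matching step can be sketched as a similarity search. In the hedged sketch below, everything is hypothetical: real systems reduce each face to a numeric “embedding” with a neural network, while here the vectors and source URLs are invented by hand.

```python
import math

def cosine_similarity(a, b):
    # Similarity of two feature vectors: values near 1.0 mean near-identical.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(index, query, threshold=0.9):
    # Rank indexed photos by similarity to the query; keep strong matches only.
    hits = [(cosine_similarity(vec, query), url) for url, vec in index.items()]
    return sorted((h for h in hits if h[0] >= threshold), reverse=True)

# Hypothetical index: source URL -> feature vector for the face in that photo.
index = {
    "instagram.com/post/123": [0.90, 0.10, 0.40],
    "mugshots.example/456": [0.20, 0.80, 0.50],
}

# Vector computed from the photo an officer submits (also invented here).
query = [0.88, 0.15, 0.38]
matches = search(index, query)  # possible matches, best first
```

The threshold is the design choice that matters: set it too low and the system returns confident-looking “matches” for strangers; real deployments tune it against large test sets.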
But that type of software, critics have said, is ripe for bias and misuse. Because there are so many minute differences between people’s faces, and because the quality, angle and lighting of a photo can vary, facial recognition software can easily mistake one person for another. And in the criminal justice system, that can mean that people of color are more frequently targeted as matches by that software, especially if the software is drawing from mugshot photos, which more frequently feature Black and brown individuals.
“If you don’t have enough data, your data is biased,” Luo explained. “If you only have a picture from one group, the tools can’t apply to the other groups.”
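Luo’s point shows up even in a tiny sketch: a nearest-neighbor matcher always returns its closest database entry, so a database drawn from only one group still hands back a confident-looking name for a query from outside that group. The names and feature vectors below are invented for illustration.

```python
def nearest(database, query):
    # Return the name of the closest entry by squared distance -- there is
    # no built-in notion of "no match," only a nearest neighbor.
    def sqdist(item):
        return sum((x - y) ** 2 for x, y in zip(item[1], query))
    return min(database, key=sqdist)[0]

# Database whose feature vectors all come from one tight cluster (one group).
database = [("person_a", [0.10, 0.20]), ("person_b", [0.15, 0.25])]

# A query far outside that cluster still yields a named "match."
result = nearest(database, [0.9, 0.9])
```

The failure mode is a false positive, not an error message: the farther a face is from anything in the training data, the less the “nearest” answer means.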
Hoan Ton-That, the CEO of Clearview AI, said in an email that his company’s software only gathers publicly available data and that independent studies have found “no racial bias” in the software.
“Clearview AI’s software only searches publicly available information that is available to any person with a computer and internet connection and does not search any private data,” he wrote. “As a person of mixed race, creating a non-biased technology is important to me.”
Critics of facial recognition software say that even if police use the technology with good intentions, its use essentially allows a government agency to surveil the public based on photographs that appear online.
“The issues with facial recognition are many. It’s a very powerful surveillance tool that allows for the mass surveillance of people, which can be done without their knowledge,” said Jeramie Scott, senior counsel for the Electronic Privacy Information Center in Washington, D.C., an organization that supports digital privacy and engages in litigation against government agencies and other groups to prevent the spread and use of technologies like facial recognition software.
Facial recognition software, Scott said, “essentially creates a digital ID that government and law enforcement can control.” Even if a police department didn’t pay for a subscription to Clearview AI, Scott said, the agency may still have records from the program about people they searched for in the database.
“Presumably if they used it to identify someone they would have the hits that came back,” Scott said. “I assume they probably keep that information in the investigative file.”
In response to questions about individual privacy, Vest said Myrtle Beach police took those concerns seriously and didn’t use Clearview to surveil the public, only to search for people suspected of crimes.
“We take the safety of our community seriously and understand that individual privacy is part of that commitment,” Vest said. “We understand there are privacy concerns with emerging technology, and we do not take those concerns lightly. We hold our officers to a high standard of accountability and integrity and expect our community to do so as well.”
Ton-That said the company didn’t intend for law enforcement agencies to use its software to surveil the public.
“Clearview AI’s database contains only publicly available information, not any private information, and is used for the after-the-crime investigations, not for real-time surveillance,” Ton-That wrote in an email. “Clearview AI’s software only searches publicly available information that is available to any person with a computer and internet connection and does not search any private data.”
Could Clearview AI be banned?
While police agencies around the country regularly seek to use new technologies to aid their police work, those tools can cross privacy boundaries, said Knaack, of the South Carolina ACLU. To illustrate that point, Knaack pointed to fingerprints: When a person goes out in public, they leave their fingerprints all over the place, but that doesn’t mean police officers can follow behind a person, gather up their fingerprints and keep track of every place they’ve been. Clearview AI’s mass collection of photos posted online does essentially that, Knaack said.
“This is the ability of law enforcement to take what we do in public and put it in a database and see everything we’re doing in public,” he said.
Nationally, the ACLU is involved in a lawsuit against Clearview AI in Illinois, alleging that the company’s technology violates the public’s privacy rights.
But even though police use of facial recognition software might raise some people’s concerns, the state legislature may not be willing to ban the technology outright, said state Rep. William Bailey, R-Little River. Though some lawmakers may want to protect people’s privacy, he said, others may be wary of hamstringing law enforcement.
“I can honestly say that I see both sides of it,” Bailey, a former police officer and public safety director in North Myrtle Beach, said. “I can see it from the government using (your photos) for things you didn’t intend you to use (them for), but I can also see that you posted a photo and it’s out there.”
Short of a statewide ban on the use of facial recognition software, city and county councils may be more willing to bar the technology, said Scott, of EPIC. In South Carolina, state Rep. Leonidas Stavrinakis, D-Charleston, introduced a bill earlier this year that would bar police officers from using biometric surveillance technology in conjunction with their body-worn cameras, though that legislation didn’t advance past a legislative committee this session.
Whether any cities or counties in South Carolina will bar the use of facial recognition software remains to be seen. Still, Bailey said, the public knowing police have access to a database that contains all the photos they’ve posted publicly online could cause some “blowback.”
“(Police) have been going on people’s Facebook pages to figure out who they are and who they’re running with. I think the rub is doing it with the database,” Bailey said. “I think that’s where you’re going to have a huge blowback.”