by Beige Luciano-Adams via The Epoch Times (emphasis ours),
LOS ANGELES—As police across California crack down on illegal street racing, takeovers, and sideshows, technology companies are marketing new surveillance tools to meet the demand—prompting questions about the implications for privacy rights and Fourth Amendment protections.
In the Bay Area and Los Angeles, where incidents have become increasingly brazen and violent in recent years, often drawing hundreds of attendees and overwhelming police, agencies already rely on planes, drones, and automatic license plate reader (ALPR) cameras as they aim to reduce the risk to first responders.
And they’ve begun to see results.
On Oct. 25 in the Bay Area, the California Highway Patrol (CHP) reported the seizure of 16 vehicles that had been involved in two separate takeovers a month prior. Officers couldn’t reach the center of the sideshow before it moved to another location, but they collected video evidence from cameras placed around the Bay Bridge. That led investigators to a list of vehicles, allowing them to request seizure orders from a judge.
Armed with these technologies, CHP officers sent to Oakland to crack down on illegal sideshows and rising violent and retail crime have seized more than 2,000 stolen vehicles since February.
And a controversial surveillance system used by police to detect gunshots and fireworks is now being remarketed as a tool to listen for the sounds of illegal street racing, takeovers and sideshows—like screeching tires—according to an Oct. 23 announcement from Flock Safety, an Atlanta-based company that leases surveillance systems to thousands of law enforcement agencies across the United States.
Audio detection offers an additional angle that can be integrated with existing camera networks and analytics, which Flock said in its announcement will provide a “deeper layer of insight, enabling [police] to track repeat offenders and analyze patterns linked to sideshows.”
When the cameras mounted at intersections are used in conjunction with audio detectors, the analytics system generates a report that lists vehicles, ranked by frequency, near confirmed shootings, fireworks, sideshows or takeovers, according to the company.
The selling point is that the AI-powered system identifies, almost instantly, patterns that would typically take humans hours or days to find.
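Flock has not published how this analysis works, but the report it describes amounts, conceptually, to a simple aggregation: count how often each plate is read close in time and place to a confirmed event, then sort. The Python sketch below illustrates that idea under assumed names; the record fields, the 15-minute window, and the same-camera matching rule are hypothetical simplifications, not Flock’s schema or algorithm.

```python
from collections import Counter
from dataclasses import dataclass
from datetime import datetime, timedelta


# Hypothetical records; the field names are illustrative, not Flock's schema.
@dataclass
class PlateRead:
    plate: str
    camera_id: str
    seen_at: datetime


@dataclass
class AudioEvent:
    """A confirmed audio alert, e.g., a sideshow, takeover, or gunshot."""
    camera_id: str
    detected_at: datetime


def rank_vehicles_near_events(reads, events, window_minutes=15):
    """Count how often each plate was read by the same camera within a time
    window of a confirmed event, then rank plates by that count, descending."""
    window = timedelta(minutes=window_minutes)
    counts = Counter()
    for event in events:
        for read in reads:
            if (read.camera_id == event.camera_id
                    and abs(read.seen_at - event.detected_at) <= window):
                counts[read.plate] += 1
    return counts.most_common()  # [(plate, times_seen_near_events), ...]
```

A real deployment would presumably correlate reads across nearby cameras and locations rather than a single device, but the output is the same kind of report: plates ranked by how often they appear near confirmed incidents.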
The newly reconfigured technology raises old questions about the balance between privacy and public safety, which civil rights groups have already been litigating—in the courts and in the public sphere—for years.
For critics, the deployment of such technologies is part of a long march, a stealth encroachment on constitutional rights that has accelerated in the years since 9/11.
“Some of these are mass surveillance technologies that shouldn’t be permitted to operate in a democratic society,” Jay Stanley, a senior policy analyst with the American Civil Liberties Union, told The Epoch Times. “We don’t watch everybody all the time, just in case somebody does something wrong somewhere.”
Technologies like Flock’s cameras and audio detection devices, mounted at public intersections throughout the country in an increasingly dense network, raise questions about the “boundary between what can be done in today’s technology and what should be done,” Stanley said.
According to a February 2020 report by the state auditor, nearly all of California’s law enforcement agencies already use surveillance cameras that automatically read and report license plate data along with other details of the vehicle, time, and location.
These typically use infrared cameras to read license numbers and feed them into databases, but some cameras, like Flock’s, can capture more than license plates—things like car color and make, as well as small identifying details.
According to Flock’s website, police departments in New York, California, Illinois, Texas, and Louisiana are among those already using the company’s Raven system for gunshot detection, which the company claims is 90 percent accurate in identifying gunshots.
Accuracy Claims
Various reports have called such claims into question—including an annual review released in May by the City of San Jose, which initially found that around half of alerts were confirmed gunshots and around a third were false positives. After adjustments to the system, the confirmed share rose to nearly 80 percent.
Critics argue that acoustic gunshot detection’s tendency toward false positives can put people at risk, for example by sending police, primed to expect gunfire, to locations where only innocent people are present. Such systems can also record human voices, and law enforcement agencies have used those recordings in court.
“As is so often the case with police surveillance technologies, a device initially deployed for one purpose (here, to detect gunshots) has been expanded to another purpose (to spy on conversations with sensitive microphones),” said the Electronic Frontier Foundation, a nonprofit focused on the intersection of civil rights and digital technology.
Some cities have canceled contracts with Flock or similar providers after analysis revealed disappointing results.
A 2021 investigation of Flock competitor ShotSpotter found the acoustic gunshot detection system generated more than 40,000 dead-end deployments in Chicago in less than two years, with the vast majority of alerts turning up no evidence of gunfire or related crime.
The Champaign Police Department in Illinois last year opted not to renew its contract with Flock after results fell short of marketing claims. Data obtained by local journalists showed 59 out of 64 alerts were “unfounded,” with 21 of those likely caused by fireworks.
“To date, the system has not yet lived up to performance expectations, including misidentifying some sounds—such as fireworks or a vehicle backfire—as possible gunfire,” a police official told CU Citizen Access.
Flock did not offer an accuracy estimate in its announcement that Raven had been repurposed to listen for vehicular chaos, nor did it respond to an inquiry about how many communities use the system to detect the sounds of street takeovers. But other media outlets have reported that at least two Bay Area law enforcement agencies are already using it.
A Growing Network
Cameras that read license plates and microphones that listen for gunshots have been around for decades, but in recent years, California municipalities have expanded their surveillance networks—and rapidly developing AI-powered technology is adding an unprecedented accelerant.
On Oct. 22, the San Diego Sheriff’s Department announced plans to install 60 additional cameras in unincorporated areas, expanding on deployments in five cities that have already used the technology with “significant investigative success,” including solving homicides, kidnappings, vehicle theft, burglaries, and assaults.
Nodding to privacy and data security concerns, the department said it has implemented “strict protocols,” including adherence to Senate Bill 34, state legislation from 2015 that regulates how data is used, stored, and shared, and requires regular audits to ensure compliance. San Diego keeps ALPR data for a maximum of one year unless it is being used in ongoing investigations.
Earlier this year, San Francisco installed 400 ALPR cameras, and Oakland, in partnership with the California Highway Patrol, installed 480 Flock cameras that read license plates and other identifying details.
“When we’re talking about car break-ins and car theft ... when we’re talking about sideshows and some of the other issues that have happened in our city, automatic license plate readers can play an invaluable role in helping us to track some of the perpetrators of these crimes and hold them accountable,” San Francisco Mayor London Breed said at the time.
In some California cities, police can now also access private security camera networks if neighbors grant them permission.
For example, Sacramento currently has 809 cameras enrolled in a program that lets people register their cameras with the police department, so investigators know where each camera is located and can request video evidence after an incident. Businesses and residents can also choose to “integrate” their cameras, giving the police department direct, live access to the feed.
And “real-time crime centers” in major cities across the United States already combine these modalities. Last month, the Los Angeles Sheriff’s Department opened its first center in Agoura Hills, and LAPD plans to open multiple in the coming months.
These centers can tap into license plate readers and existing cameras at intersections, as well as footage from private cameras if businesses or residents allow it.
Citing low staffing levels and rising crime—including 50 car burglaries over a single weekend in one L.A. City Council district—an LAPD report to the Board of Police Commissioners pointed to “an acute need to explore new measures, like the use of technology, to mitigate these impacts and improve the department’s response to crime.”
Privacy Regulations
In an April memo, Gov. Gavin Newsom’s office said the “crime-fighting cameras” installed at Oakland intersections would protect privacy by limiting data storage to 28 days and not disclosing footage to third parties beyond other law enforcement agencies, while complying with recent bulletins from the California Attorney General’s office outlining state law that governs data collection, storage and use, including SB 34.
Police can use ALPRs to match license plates with those on a “hot list” of known offenders. But even if they don’t match, the data is still stored in a database, prompting questions about how it is protected and used.
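That concern turns on a small but consequential design detail: the hot-list check and the storage of the read are separate steps, and a plate that matches nothing is still written to the database. The minimal Python sketch below illustrates the pattern; the hot list, field names, and in-memory “database” are hypothetical and do not describe any particular vendor’s system.

```python
from datetime import datetime, timezone

# Hypothetical hot list and storage; illustrative only, not any vendor's API.
HOT_LIST = {"8ABC123", "7XYZ999"}   # plates flagged as stolen or wanted
stored_reads = []                   # stands in for the agency's ALPR database


def process_plate_read(plate: str, camera_id: str) -> bool:
    """Check a plate against the hot list and record the read either way.
    Returns True if the plate matched, i.e., an alert would be generated."""
    read = {
        "plate": plate,
        "camera_id": camera_id,
        "seen_at": datetime.now(timezone.utc),
        "hot_list_hit": plate in HOT_LIST,
    }
    stored_reads.append(read)  # the read is retained even when nothing matches
    return read["hot_list_hit"]
```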
The ACLU raised this issue in a 2013 report titled “You Are Being Tracked,” noting that the readers “would pose few civil liberties risks if they only checked plates against hot lists and these hot lists were implemented soundly.” But the networked systems store the compiled data, not just license plates of vehicles that generate hits.
The “enormous databases” of motorists’ location information created as a result, and often pooled among regional systems, are in many cases retained permanently and shared with little to no restriction, the report argued.
The 2020 state auditor report found that while most California law enforcement agencies use the technology, “few have appropriate usage and privacy policies in place.”
The report looked at four agencies—the Fresno and Los Angeles police departments, and the Sacramento and Marin County sheriff’s offices. All of them accumulated a large number of images in their ALPR systems, but most of those did not relate to criminal investigations.
For example, 99.9 percent of the 320 million images Los Angeles stored at the time were for vehicles that were not on a hot list when the photo was taken.
And according to a Sacramento grand jury investigation, a vast ALPR system deployed by the county sheriff’s office and city police departments couldn’t distinguish between cars used for criminal activities and those operated legally.
“And we subsequently learned that both the Sheriff’s Office and Sacramento Police Department have been lax in following state law regarding how ALPR data is shared with other law enforcement entities,” the report said.
In fact, the investigation found that those departments regularly shared license plate data out of state, which is prohibited by SB 34.
In an emailed statement, the California attorney general’s office told The Epoch Times such technological tools “are helpful in deterring and investigating crime, serving both to prevent wrongdoing and ensure accountability for those who violate the law,” but that they must be used with “the utmost respect for ethical and legal standards.”
The attorney general’s office said it has recently been working with local agencies “to ensure that they are using ALPR systems for their intended use.”
Fourth Amendment Concerns
A federal lawsuit filed Oct. 21 against the use of Flock’s surveillance network in Norfolk, Virginia, alleges the city is violating Fourth Amendment rights by tracking “the whole of a person’s public movements,” thus amounting to a search.
The City of Norfolk gathers information about “everyone who drives past any of its 172 cameras to facilitate investigating crimes,” and in doing so, “violates the long-standing societal expectation that people’s movements and associations over an extended period are their business alone,” the complaint states.
With all of this done without a warrant, the complaint continues, “This is exactly the type of ‘too permeating police surveillance’ the Fourth Amendment was adopted to prevent.”
In a statement to media, Flock countered that under Fourth Amendment case law, license plate readers don’t constitute a warrantless search because they photograph cars in public, where there is no reasonable expectation of privacy, and that courts in numerous states have upheld the use of ALPR evidence without requiring a warrant.
Jay Stanley, the ACLU policy analyst, noted courts are still in the relatively early stages of grappling with these technologies.
“But courts have also made a number of rulings that sweeping surveillance technology is not consistent with the Fourth Amendment. ... I think that automatic license plate readers raise a lot of the same concerns that the Supreme Court addressed in some of the big privacy cases in recent years,” he said.
Among those is United States v. Jones, in which the government tracked a suspect’s vehicle with a GPS device for 28 days without a warrant and used the resulting data to secure a conviction; the Supreme Court held that this constituted a search under the Fourth Amendment. A lower court had previously ruled the data admissible because the suspect had no reasonable expectation of privacy while his car was on public streets.
And in Carpenter v. United States, the court held that acquisition of a suspect’s cell-site records—historical location data from cell phone providers, obtained without a warrant—constituted a Fourth Amendment search.
“When you have enough license plate readers out there, it becomes tantamount to being tracked with a GPS. And so it raises the same issues that the court has already ruled on,” Stanley said.
He suggested that communities need time to digest these technologies and their potential consequences before adopting them at such speed and scale.
“Communities need to decide whether they want to allow the police departments that serve them to have the new powers these technologies convey and whether they’re even effective at reducing crime and ultimately making communities a better place—which is the whole point of law enforcement and government,” he said.