Since Flock Safety began partnering with law enforcement, a growing number of officers have been found abusing the surveillance system. In one instance, a Kansas police chief used Flock cameras 164 times while tracking an ex. In another case, a sheriff in Texas claimed to be using Flock to "track a missing person" but was later found to be investigating a possible abortion. In Georgia, a police chief was arrested for using Flock to stalk and harass citizens. In Virginia, a man sued the city of Norfolk alleging privacy violations and discovered that Flock cameras had been used to track him 526 times, around four times per day.
“Flock has built a dangerous platform in which abuse of surveillance data is almost certain,” Wyden wrote. “The company has adopted a see-no-evil approach of not proactively auditing the searches done by its law enforcement customers because, as the company’s Chief Communications Officer told the press, ‘It is not Flock’s job to police the police.’”
“I would limit the allowed uses for ALPR,” Marlow told me. “While some uses, like for toll collection and Amber Alerts, with the right guardrails in place, are not particularly problematic, some ALPRs are used to target communities of color and low-income communities for fine/fee enforcement and for minor crime enforcement, which can exacerbate existing policing inequities.”
This type of harmful ALPR targeting is typically used both to oppress minorities and to bring in greater fee revenue for local law enforcement agencies — problems that existed long before AI recognition cameras, but have been exacerbated by the technology.
https://www.cnet.com/home/security/when-flock-comes-to-town-why-cities-are-axing-the-controversial-surveillance-technology