We Hacked Flock Safety Cameras in under 30 Seconds. 🫥
By Benn Jordan
Summary
## Key takeaways

- **Hack Flock Cameras in Seconds**: By pressing a button sequence on the back of Flock Safety cameras, researchers created a wireless access point to gain shell access, allowing full remote control, data exfiltration, or malware installation in under 30 seconds, as demonstrated to journalists using a simple tool. [04:34], [05:08]
- **No 2FA for Police Access**: Flock Safety does not require two-factor authentication for some police departments accessing sensitive location data of virtually everyone, making it vulnerable to basic attacks like Wi-Fi cloning that bypass push notifications, prompting a US Senator to request an FTC investigation on national security grounds. [08:29], [09:00]
- **Cameras Capture and Store People**: Contrary to Flock Safety's claims of only capturing vehicles, the cameras' radar triggers photos of people, hands, or objects, storing unencrypted images in separate folders, including some older than 7 days and even factory test shots, contradicting their encryption and deletion policies. [11:44], [12:19]
- **Google Dork Exposes Patrol Data**: A simple Google search phrase revealed a misconfigured Flock Safety demo site with a live API key accessing ArcGIS layers, exposing police departments' details, live patrol car locations, officers' personal info, and 6,000 hot-list alerts tracking suspects' movements for months without reason. [13:18], [15:02]
- **Exaggerated Crime-Solving Claims**: Flock Safety claims its services solve 10% of US crimes and boosted Oakland's violent crime clearance by 11%, but these claims cite self-authored papers with misleading data, ignoring national crime drops and actual clearance rates as low as 3%, while studies link ALPRs to increased vehicle thefts in some cities. [22:32], [24:36]
- **Compromise: Vendor Security Audits**: To ensure safety, private companies offering surveillance services to government must pay for independent security audits of hardware and software before contracts, receiving annual ratings like health inspections for businesses, preventing unvetted tech from harvesting public data. [37:50], [38:37]
Topics Covered
- Cameras Hackable with Button Sequence
- No 2FA Exposes Police Logins
- Cameras Store Unencrypted People Images
- Google Dorking Exposes Sensitive Maps
- Surveillance Claims Mask Ineffectiveness
Full Transcript
You may remember me from this video where I told you about 40,000 of these things that your tax dollars pay for that are tracking your every move and repurposing data collected about you every time that you drive past them.
Upon further investigation, it turns out that there are over 80,000 of them. And
um we got some and we hacked them. You
can press a button a few times on the back of these cameras and within a few minutes turn them into your own personal spy device, malware host, honeypot that steals people's login credentials, or cryptocurrency miner. Whatever you
want really. Or alternatively, how you can point an antenna at them and decode the video stream using a technique used by the CIA during the Cold War. Or how
another researcher found a Google search phrase that had the capabilities of showing you the real-time location of these cameras and police patrol cars.
This isn't clickbait or an exaggerated claim with no payoff. Just the other day, weeks before this video's release, US senators and representatives drafted an official letter calling for an investigation that highlights the national security risks associated with our findings. And in
this video, I'm going to show you exactly how they work and even demonstrate them to journalists.
>> No [ __ ] way.
>> And finally, we're going to take a deep data dive into the efficacy, misinformation, and straight-up lies surrounding some private surveillance startups. And we're going to use that momentum to push for protocols and legislation that actually make you safer. Wow, that's a lot for a YouTube video.
>> I can never act. Been trying to stack up racks on racks. Boys in the about wearing all black stain on track and that's on me. Out here getting this be if you work for the devil. Better
retreat.
The meat and potatoes of this video will be mostly in parity with John Gaines' white paper, and that's linked in the description below. Many of these vulnerabilities were recently published with the National Vulnerability Database or are in the process of publication. And to prevent the average viewer from getting lost or falling asleep, I'm going to keep many of the formalities and extensive details to a minimum. But if you find yourself wanting more details at any time in this video, just check the description for a whole bunch of links. Welcome to the world of responsible disclosure. And while I'm up here, let me tell you that, to the best of my knowledge, all of the cameras and hardware seen in this video were acquired legally. I have not shared or redistributed any of the data on them, and as long as they are in my possession, they will not be placed into the wild. At no point in time have I knowingly accessed or interfered with any server or service related to or owned by Flock Safety. There is a chance that all the devices that you see in this video, or all of the devices that we have acquired from multiple different sources, are unique and do not have the same hardware or software that the devices in the wild have. I have no idea how or why that would be the case, but it is technically a possibility. And finally, at the time of me recording this, there are 47 security issues covered, with the vast majority listed in the white paper. In this video, I'm going to be showing you six of them.
Over the last summer, when doing research for my first video on this topic, I started poking around to see if anyone had done an independent audit of flock safety or related services. This
naturally led me to the dark web, where I have access to some semi-private communities dedicated to hacking and open-source intelligence. Some of these communities have well-organized marketplaces for breached data, credentials, exploits, and all that stuff. And that is where I found this.
Please excuse the poor English translation, but these were law enforcement Flock Safety accounts for sale with escrow protection by a reputable vendor. And a few days later, the listings were removed in a way that suggested that someone had bought them.
Not long after that, I got in touch with a professional security researcher regarding other things on this list. And
what do you know? He found something very similar on the dark web.
>> So, in the cyber industry, there are things called access brokers. And some
people specialize in government agencies or maybe local law enforcement.
>> I started digging more to find out where these accounts could have come from.
Were they bought or stolen off of a police officer or a Flock employee? Or
just maybe Flock Safety had some security vulnerabilities.
The most significant, troublesome, and mind-boggling vulnerability on this list was discovered nearly a year ago, in late 2024, when John stumbled down a rabbit hole.
Uh, and then probably a few nights later, I was messing with the buttons and the DIP switch, and then was able to figure out how to get, uh, a shell on it.
Yeah. So, I'm John Gaines. Professionally, I've been in the offensive security field for over a decade. Obtaining shell on a device means that you can remote control it, exfiltrate data, and escalate privileges, which is exactly what John did. As detailed on John's blog and formally published paper, John had found a user named Cager on social media who was trying to recreate some of the disclosures. He reported that by merely pressing the button in a particular sequence on a Flock Safety camera, a wireless access point is created. Hey partner, don't do any of this stuff unless you can legally acquire a Flock Safety camera. If you do this to one of the 80,000 all over the US of A, you'll be put in the clink.
First thing you're going to want to do is go ahead and press the button to turn on that police camera. Then press the button on the back a number of times I can't disclose in this video. There she is, the Flock wireless access point. Go on and connect. Send a command to enable ADB. Connect. And now you can connect to the Flock Safety device and access its data or install whatever the hell you want on it. Have fun.
>> This, along with John's previous discoveries, is encapsulated in an easy-to-use tool that he made so even a novice user could obtain full control of the cameras. We let George Chey from the Guardian do the honors.
>> The longest part actually is waiting for the hotspot to turn on. Um, but realistically, in about 5 seconds. And in fact, with the compute box, uh, you don't need to hit the buttons, uh, because the USB-C ports are exposed. So you can just plug in a rubber ducky and then walk away. A rubber ducky, sometimes referred to as a bad USB, is a USB drive that a computer or device detects as a USB keyboard and that then executes scripts called payloads. One can make a device like this for as little as $5. This quite literally raises the limit of how one could use Flock Safety devices to their imagination. You can clone or decompile the apps. You could send the video stream data to a remote server. You could use it as a botnet client for malware. You could have it capture Wi-Fi handshake credentials and do man-in-the-middle or honeypot attacks, or replace or modify captured footage or images. And if that is the case, this could bring into question the integrity of the data being used as admissible evidence in court, unless of course a prosecutor could prove that no security breach had occurred. And, uh, about that:
>> The apps that are installed that are, uh, custom to the vendor, uh, all have debug enabled, which on these types of devices, on Android devices, means that you can pause them in runtime and modify the memory, right? Um, which gives you system injection and system-writable properties. Um, and in this case, there's one where you can modify a cleanup script, um, that is run as root. You can consider it either a wireless RCE or a gated wireless RCE that goes from no access to root, which is the worst. This means that malicious code can be installed and executed outside of the operating system. So, like when you first turn on a computer and see the BIOS screen where the system does its little self-check, it could exist right there, acting as a superior to Windows or iOS or whatever it is that you're booting into.
Multi-factor authentication, or two-factor authentication, or two-step verification, is part of our daily lives. We use it when we log into everything from Gmail to TikTok to our banks and nearly everything in between. But not all 2FA is equal, and different types have their own strengths and weaknesses.
For example, some 2FA methods simply pop up on your phone or your desktop and ask you if you've just logged in from a certain device or region, in which you can approve or decline the new device. This is an excellent security protocol if you're sitting at home in Kansas and see that somebody from Bangladesh just used your password. But if I'm sitting in your driveway in Kansas, or especially if I'm using your Wi-Fi, using a device like this, I can clone your Wi-Fi signal and then send a deauthorization, or deauth, signal to your device's MAC address. Either you will notice this and try to reconnect, or your computer or phone will automatically reconnect. And now you'd be accessing your own network through my device. I could use a script to clone the login page of whatever credentials I'm trying to get and then feed it to you, capturing your session and, depending on the service, your username and password. Then, as expected, you would get a two-step verification asking you if you just logged in, and you would say yes, granting me access. This means that when it comes to credentials like this, even with 2FA or MFA, a police surveillance company's security can only be as good as the least security-minded person with access to that system. If you're wondering how low this security bar can go with clients, you'll be disappointed to know that Flock Safety doesn't require two-factor authentication with some police departments. Yes, you heard that right. The security process you go through when you log into Disney Plus is just too much to ask some police departments to do when accessing confidential information and the location of, in some cases, virtually everyone. When I first found this out, I simply couldn't believe it. And neither could US Senator Wyden's team, which is why it's among the issues leading to a request for the FTC to open an investigation into the company on the grounds of national security.
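Incidentally, the deauth step in the attack described above is visible on the air: deauthentication frames are unencrypted 802.11 management frames (type 0, subtype 12), and a sudden burst of them aimed at one client is the classic fingerprint of an evil-twin setup. Below is a minimal, hypothetical detector sketch in Python; it only parses the Frame Control byte of already-captured raw frames (the capture source itself, and the sample frames, are invented for illustration):

```python
from collections import Counter

DEAUTH = (0, 12)  # 802.11 management frame (type 0), deauthentication (subtype 12)

def frame_kind(frame: bytes) -> tuple:
    """Extract (type, subtype) from the first Frame Control byte."""
    fc = frame[0]
    return (fc >> 2) & 0b11, (fc >> 4) & 0b1111

def deauth_burst(frames, threshold: int = 10) -> bool:
    """Flag a capture window containing an unusual number of deauth frames."""
    kinds = Counter(frame_kind(f) for f in frames)
    return kinds[DEAUTH] >= threshold

# Simulated window: 12 deauth frames (0xC0) mixed with beacons (0x80)
window = [b"\xc0\x00"] * 12 + [b"\x80\x00"] * 5
print(deauth_burst(window))  # True
```

A real monitor would read frames from an interface in monitor mode, but the classification logic is just this byte-level check.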
Fortunately, there's a super easy solution to this: a USB or NFC authenticator. It costs as little as $10, and you just plug it in or wave it in front of your device for the second layer of authentication. And if this is too much hassle for an able-bodied police officer or employee to use, then maybe they shouldn't have access to secure information. It's really frustrating to spend this much time talking about a problem when a very simple and common-sense solution to that problem has existed since day one.
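Why is a $10 hardware key stronger than a push prompt or an app code? Because a one-time code is just deterministic math over a shared secret: anyone who phishes the six digits can replay them within the time window, while a hardware authenticator signs a challenge bound to the real site and never reveals a reusable code. As a sketch of what those app codes actually are, here is the standard OTP math (RFC 4226 HOTP and RFC 6238 TOTP) in plain Python, using the published RFC test-vector key:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226: HMAC-SHA1 over the counter, then dynamic truncation."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret: bytes, unix_time: int, step: int = 30) -> str:
    """RFC 6238: HOTP keyed by the current 30-second time window."""
    return hotp(secret, unix_time // step)

secret = b"12345678901234567890"  # RFC 4226 test-vector key
print(hotp(secret, 1))  # "287082" -- the code alone grants access if phished
```

Nothing in that math ties the code to the genuine login page, which is exactly what the Wi-Fi-cloning proxy exploits; origin-bound authenticators close that gap.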
A concerning amount of hard-coded information is stored inside Flock Safety cameras, and we'll hear all about it in the next vulnerabilities on this list. But within this information is a list of Wi-Fi network names. So, I set up a dummy network with one of these Wi-Fi names. Then, when removing the SIM card or when the device couldn't find an LTE signal, some of our Flock Safety cameras happily connected to the dummy network and routed upstream traffic through it. Others seemingly prioritized my dummy network by default, regardless of whether they had a SIM card in them or not. So I captured the pcap data being transmitted from one of these cameras for a little while and analyzed it with Wireshark and unblob, which is an open-source extraction suite. And sure enough, there were cleartext credentials in the data. These exact vulnerabilities were originally disclosed by John in April, another in September, and with another pending. This attack requires knowing the name or credentials of the Wi-Fi networks the camera is looking for. But what concerns me more is that this information wasn't in an encrypted upstream to begin with. Which means that by using a professional-grade SDR or IMSI catcher, which is more or less a DIY Stingray device, a malicious hacker could just hijack the LTE connection and then do the exact same thing without needing to know these network names or even being physically near the camera.
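The Wireshark/unblob step above essentially boils down to carving printable strings out of a raw capture and flagging the credential-shaped ones. A toy sketch of that idea, where the capture bytes are fabricated for illustration and not from any real device:

```python
import re

CRED_RE = re.compile(rb"(password|passwd|apikey|api_key|token)\s*[=:]\s*\S+", re.I)

def ascii_runs(blob: bytes, min_len: int = 6):
    """Yield printable-ASCII runs, like the Unix `strings` tool."""
    return re.findall(rb"[\x20-\x7e]{%d,}" % min_len, blob)

def find_cleartext_credentials(blob: bytes):
    """Return substrings that look like unencrypted credentials."""
    hits = []
    for run in ascii_runs(blob):
        m = CRED_RE.search(run)
        if m:
            hits.append(m.group(0).decode())
    return hits

# Simulated capture: binary noise around an unencrypted HTTP request body
capture = b"\x00\x01\xffGET /cfg\r\n\x00user=cam01&password=hunter2\x00\x9c"
print(find_cleartext_credentials(capture))  # ['password=hunter2']
```

If the upstream were encrypted, a scan like this would find nothing, which is the whole point of the concern.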
This could also allow a more modern version of a Tempest attack, which I'll demonstrate in a few minutes. And that's
where a hacker could decode the motion JPEG video stream. I actually tried to accomplish this as I love puzzles, but unfortunately time forced me to choose between decoding the pixel sequence or
finishing this video. And here we are.
When I recreated John's research on these devices, as previously shown, it was as clear to me as it was to John when he first discovered it that they inadequately protected credentials, API keys, passwords, and more. As I
mentioned, the GainSec blog and the papers have a lot more details on this for the technically minded, but I'm going to use this segment to talk about some of the other troubling things that were found stored in the camera. On Flock Safety's website, it is stated that they do not capture or record data of people, but only vehicles. They also state that data and footage are encrypted throughout the entire life cycle, and that data is automatically removed from devices after 7 days. Speaking for myself, when I recreated John's research across multiple devices, I confirmed exactly what he was seeing. If Flock Safety's cameras in the wild are operating like the ones we researched, this would be a clear contradiction of their statements.
Firstly, when I moved in front of the camera, the radar module triggered the camera module to take a picture of me. Then the onboard AI looked for a license plate and didn't find one, but it stored the image anyway in a separate folder. Now, this doesn't seem to target people. It will also take a picture of my hand if I move in front of the lens, or a picture of my desk if I move the device. But what I observed were the devices intentionally saving the footage, not erasing it. Secondly, throughout the entire process of verifying John's research, I didn't crack or decrypt a single thing; any of the information, footage, or data that you see or hear about in this video was unencrypted at runtime. And finally, when going through the files and temp folders of the Falcon cameras, we absolutely found images older than 7 days. In fact, John found stored images that were captured when the camera was triggered inside the factory where the device was made. So, hypothetically, this suggests that if you had a camera deployed and pointed at your front door, one could access this data and figure out when you entered or exited your house.
For about as long as modern search engines have existed, so has dorking. So, Googling looks like this, and dorking looks like this. And if you're really well-versed in dorking, you'll start finding things that weren't necessarily intended to be public. I'm a pretty solid dorker, and I use it constantly for researching videos like this one. Josh, however, is a legitimate expert at dorking.
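For the unfamiliar: a "dork" is just an ordinary search with advanced operators chained together. The queries below are generic, hypothetical illustrations of the syntax, not the actual search used in this research:

```text
site:example.com filetype:pdf           restrict results to one host and one file type
inurl:admin intitle:"index of"          match words in the URL path and the page title
"api_key" filetype:js -site:github.com  find script files mentioning a key, excluding a host
```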
>> I'm Joshua Michael. I'm a technology obsessive, and I founded Next AI, which is an all-source intelligence firm focusing on privacy and personal cybersecurity.
>> This is the exact Google search I used to find an exposed Flock Safety demo site, revealing how their system traces cars, maps patrol vehicles, and can soon build full investigation profiles on people. At first glance, it was a UI demo site meant to just show off how cool their buttons look. As I looked deeper, it contained 5,000 lines of source code for a search platform, and buried in the code was a live API key.
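Spotting a key buried in thousands of lines of front-end source is mostly pattern matching: keys tend to be long, quoted, high-entropy tokens. A hypothetical sketch of that triage step, where the variable names and sample source are invented and deliberately fake, not Flock's actual code:

```python
import re

# Long quoted runs of base64/hex-ish characters are "key-shaped"
KEY_RE = re.compile(r'[\'"]([A-Za-z0-9_\-]{24,})[\'"]')

def find_key_candidates(source: str):
    """Return token-like strings worth reviewing by hand."""
    return [m.group(1) for m in KEY_RE.finditer(source)]

demo_source = '''
var label = "Search";
var apiKey = "AIzaFAKEFAKEFAKEFAKEFAKEFAKEFAKEFAKE";
'''
print(find_key_candidates(demo_source))
```

Anything this flags still needs human judgment; the point is only that a leaked key in shipped front-end code is trivial to find.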
An API stands for application programming interface, which basically allows a computer to contact another computer without having to deal with all the clunky things like user interfaces and buttons. The confidentiality of API keys and tokens like this is sometimes more important than things like usernames and passwords, because in many cases the token alone grants you the same access, but without front-end security measures like CAPTCHA or two-step verification. And it had access to over 50 private layers that I didn't dare touch, because I'm too handsome for prison. I used open-source intelligence to see what data is stored in ArcGIS. Let me show you some of the things I found the police departments and Flock Safety are storing on ArcGIS. This Flock Safety map. It's obviously a demo, but it shows that they'd store registration data. So, names, emails, how many cameras, and a field to attach files, whatever that may be. Carrollton Police Department. This is no good. The exposed API key may have granted us access to track live patrol car locations. Also on the naughty list is Aurora, Colorado. Or maybe Flock Safety. I don't actually know who owns this map. It's Flock Safety, leaking another ArcGIS layer with officers' names, phone numbers, emails, and even their expected patrol areas. This is the worst one, coming out of Dallas, Texas. A map layer with 6,000 records of hot list alerts containing license plates, the reasons why they're on that list, the exact location detected, the camera that caught them, and the time that they went by. Anyone could Google, find this map, and trace these people's movement patterns for 5 months. I'm also going to note the reason category has someone in there for just "suspect," and a bunch of others literally have no reason and are blank. So let's back it up for a second.
If you call 911 and the dispatcher deems it an emergency requiring police, most modern police cars have a GPS module installed that reports back to dispatch. That way, they can efficiently contact the police nearest the event and expedite the response time. Flock Safety and many of its clients use third-party services that make sense of this constant stream of data, and all of that data is handled with an API. Just a few weeks ago, two security researchers, Alexa Feminina and James Zang, wrote a report discovering that ArcGIS had been compromised by a Chinese state-sponsored hacking group called Flax Typhoon. The report from Infosecurity Magazine states, "The hackers allegedly targeted a legitimate public-facing ArcGIS application." This is software that allows organizations to manage spatial data for disaster recovery, emergency management, and other critical functions. This is just a very recent example of what could be compromised with sensitive API information for geospatial platforms. This is probably, in real-world scenarios, the least concerning vulnerability in
this video, but one of the most fascinating ones, and very few consumer camera, display, or network hardware manufacturers have the means or know-how to test for it. At some level, this device that you're watching this video on is leaking non-ionizing electromagnetic radiation. And if the R word is unsettling, non-ionizing means no DNA damage. Phones leak it, monitors leak it, microphones leak it, camera modules leak it; most modern electronic devices leak it. But some of these leaking electromagnetic waves are resonating and modulating in parity with the signal. And if you can isolate the resonating frequencies, you can, with a lot of trial and error, decode the signal, or in other words, spy on the device. The Tempest attack is something that the CIA and NSA have used and experimented with ever since World War II, which is how it got its cool-sounding name. Back when we all used CRT televisions and monitors, there was generally a whole lot more RF leakage. So, it was a much bigger risk to national security. These days, a practical Tempest attack would involve some time finding and decoding the signal and then placing an RF bug on or near the source, which could transmit the data remotely. You need a software-defined radio with a lot of bandwidth, RF probes, a directional antenna, and a spectrum analyzer. On the newer Flock Safety cameras, I noticed that there was an unusual amount of RF leakage coming from the camera module itself, the proprietary coaxial port, and the 8-pin DIN port on the back. Initially, using an RF probing kit, an oscilloscope, a spectrum analyzer, and a HackRF, I was able to isolate a few various ranges of modulated signals. Then I brought the camera into another room to rule out localized interference and tested these signals as well as common integer quotients of those frequencies. Using an RF probe, I found an exploitable leak between 592 and 594 MHz. Then, using a log-periodic antenna and a 20 dB low-noise amplifier, I was able to point the RF gun at the device from as far as 6 ft away and make out what the camera was capturing. In this
case, me. Obviously, the quality and lack of color leave a lot to be desired, but that's just because the software-defined radio I was using to pick this up didn't have the bandwidth for that kind of quality. If someone were to use a professional-grade multi-channel SDR board with higher-resolution sample rates, the quality of the Tempest attack output can be nearly as good as its source. But due to the high cost of equipment and the knowledge and time required to execute an attack like this, most consumer device manufacturers and owners do not need to be super worried about this. However, if a device is being used for something related to national security, government use, or public surveillance, this absolutely needs to be protected against.
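"Common integer quotients" in the hunt described above just means dividing an observed emission by small integers, since a leak is often the n-th harmonic of some internal pixel or data clock. A trivial sketch of that search step; the 593 MHz figure is simply the midpoint of the 592-594 MHz band mentioned earlier, used here as a hypothetical input:

```python
def candidate_fundamentals(leak_hz: float, max_divisor: int = 6):
    """Divide an observed leak frequency by small integers to get
    candidate internal clocks worth probing next."""
    return [leak_hz / n for n in range(1, max_divisor + 1)]

for f in candidate_fundamentals(593e6, 4):
    print(f"{f / 1e6:.2f} MHz")  # 593.00, 296.50, 197.67, 148.25
```

Each candidate then gets re-probed with the SDR to see whether the modulation tracks the video signal.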
This is just my opinion here, but most of these things are entirely preventable and the result of prioritizing growth over painfully obvious, industry-standard security measures like multi-factor authentication. Another really obvious low-hanging fruit here is using mobile phone operating systems and hardware for a government surveillance camera. The Falcon, the Sparrow, and likely Flex LPR devices, or the cameras you most commonly see all over the country, were running Android Things 8 or 8.1, which was discontinued in 2021, and that includes security updates. At this time, there are over 900 published vulnerabilities for this OS. Like, someone please explain to me why there are even cameras in the wild recording public activity that aren't running on supported software. If your phone or your computer or your home security system stopped being supported and you understood how bad this could make your life, you'd probably be inclined to throw them in the garbage.
The way this is supposed to work is, when you discover a vulnerability, you attempt to reach out to the company and you give them a 90-day window to release a patch. No shout-out, reward, compensation, or bounty is required. However, after those 90 days pass, the discoverer can then post a detailed write-up. Alternatively, some go the bug bounty route, where you report it and get a monetary or reputational reward. However, as explained by John and Josh, this route has commonly started to include non-disclosure agreements. This means that even if the company decides not to pay a reward, the discoverer cannot legally disclose the issue publicly. So, they did offer me a bug bounty. It wasn't very specific, and it included an NDA prior to knowing anything more. I don't think you're going to improve cybersecurity as a whole through not talking about it.
>> Early February, I reached out. Yeah, I disclosed, I want to say, 7 to 12 vulnerabilities, maybe, for the license reader and the gunshot detection. They responded within like a day and a half and immediately asked for a video chat, uh, which, you know, I obliged, you know, talking about joint PR statements and so on and so forth. Um, what ended up happening was they released a PR statement, uh, about a month and a half before the 3 months ended. So pretty fairly quickly, uh, without telling me, without referencing me or referencing the issues that I ended up publishing. They also never gave me any confirmation that anything was fixed. There's something really unnerving about going on record with legislators or media and talking about national security. I don't fully understand what defines a threat to national security, and it seems like the type of topic where you don't want to make any mistakes. So, I just didn't mention it and kept that term out of my mouth, and I just provided our research to those who could. And according to Oregon Senator Wyden and Illinois Representative Krishnamoorthi, well, in their words, Flock has "unnecessarily exposed Americans' sensitive personal data to theft by hackers and foreign spies."
Part of my research was trying to figure out just how effective the adoption of Flock Safety and similar ALPR services has been at reducing crime rates or increasing crime clearance rates. And
this is a deceptively difficult task.
For example, let's do what most people concerningly do these days and ask Google and let AI answer it for us.
Well, hey, there you have it. Let's
maybe look at those actual sources, though. By the way, I want to take a moment to congratulate Flock Safety on their search engine optimization skills. It's so good that even services that only exist to help companies improve their SEO are like, "Bro, sorry. It's literally impossible to improve beyond your elite talent." This means that whenever you want to find information about Flock Safety or ALPRs or police cameras, Flock is going to be the most prominent and influential force in your initial results. And subsequently, so will the AI assistant's answer. More on that hellscape in a future video. But hey, look, some studies. Let's check them out. Okay, so only two of these studies took place after Flock Safety was even incorporated, and they tell us nothing about the efficacy of surveillance or data collection. Flock
Safety's website claims that 10% of all crime in America is solved using their services, which is a pretty impressive thing to brag about. However, the source that they cite this claim with was a
research paper created by two Flock Safety employees that doesn't really outline regional analysis. In other
words, crime has been dropping nationally in America since 2021, even in the last year. The first thing that I personally would want to look for is a crime rate and clearance comparison
between cities that use flock safety services and cities that do not. And
when you do it that way, it's extremely difficult to find any meaningful changes related to surveillance technology in general. But there have been other
general. But there have been other studies not directly related to the surveillance industry. The National
surveillance industry. The National Policing Institute did a multi-sight evaluation and said that license plate readers can improve public safety, but the technologies impact depends on its implementation. It could just be me, but
implementation. It could just be me, but this sounds quite a bit different than what Flock CEO is saying. Over 5,000
cities leveraged Flock to solve north of 14% of all crimes in America. But
Flock's story is an Atlanta story. We
can now rest assured that if a crime happens in South Downtown, it will be solved. It wasn't.
solved. It wasn't.
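The city-to-city comparison described above can be sketched in a few lines of pandas. Everything below is a hypothetical placeholder: the cities, crime rates, and clearance figures are invented for illustration and are not drawn from any real dataset.

```python
import pandas as pd

# Hypothetical, illustrative numbers only -- not real crime data.
# Each row: a city, whether it deploys ALPRs, reported crimes per
# 100k residents, and the fraction of reported crimes cleared.
cities = pd.DataFrame({
    "city":       ["A", "B", "C", "D", "E", "F"],
    "uses_alpr":  [True, True, True, False, False, False],
    "crime_rate": [4200, 3900, 5100, 4100, 3800, 5000],
    "clearance":  [0.12, 0.09, 0.11, 0.13, 0.10, 0.11],
})

# Group by ALPR usage and compare means: if the technology had a
# large effect, the two groups should diverge noticeably.
summary = cities.groupby("uses_alpr")[["crime_rate", "clearance"]].mean()
print(summary)
```

A real version of this analysis would need multi-year, per-city data and controls for population, policing levels, and regional trends; the sketch only shows the shape of the comparison.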
>> In 2023, the Berkeley Police Accountability Board did some data diving and found that ALPRs in other California communities were sometimes more correlated with increases in vehicle theft and lower crime clearance rates. In the case of Bakersfield, California, it was only after ALPRs were installed that the city rose to have the highest motor vehicle theft rate in the United States. The board also noticed that Flock Safety had claimed their services were responsible for a 33% decrease in motor vehicle thefts in Vacaville, California, but were citing data from years before the cameras were even installed. Which brings us to Oakland, California. And in full transparency, I've been informally consulting with and sharing some of this research with their city council, who had just delayed a vote on a $2.25 million expansion to its Flock Safety network. Flock Safety's public website claims that their services have helped Oakland's violent crime clearance rate rise by 11%. Not bad. Well, actually kind of bad, because they failed to mention that violent crime decreased by 19% in that period, which is on par with the FBI crime stats for the entire country. They also conveniently failed to mention that in 2023, Oakland had a violent crime clearance rate of 3%. And that was later acknowledged as an error by the police department themselves. And this was a pretty big news story last year, and the Oakland Police Department were the ones to acknowledge and confirm the error in the first place. So, how are so many cities like this failing to call [ __ ] on these types of claims that may result in millions of tax dollars being spent on these services?
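To make the Oakland point concrete, here is a toy calculation, with invented numbers, showing how a clearance *rate* can tick upward even while fewer crimes are actually being solved, simply because reported crime fell faster.

```python
# Invented, illustrative numbers.
# Clearance rate = crimes cleared / crimes reported.
reported_2022, cleared_2022 = 1000, 100
reported_2023 = round(reported_2022 * (1 - 0.19))  # reported crime down 19%
cleared_2023 = 90                                  # ten FEWER crimes solved

rate_2022 = cleared_2022 / reported_2022  # 100 / 1000 = 10.0%
rate_2023 = cleared_2023 / reported_2023  # 90 / 810, about 11.1%

print(f"clearance rate: {rate_2022:.1%} -> {rate_2023:.1%}")
```

So a rising clearance percentage, quoted on its own, says nothing about whether surveillance helped solve more crime.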
This is what I keep running into again and again when researching the efficacy of police surveillance technology. A private business will cite statistics that are often misleading, and somehow nobody from these cities or police departments seems to have actually validated the information quoted in the sales pitches.
So, all of this aside, even if we're not considering all of these questionable data sources in marketing, sociology is an incredibly chaotic field of study, one that typically requires decades of adeptly sourced data to accurately suggest whether a technology or a policy change reduces crime. For decades, we've been trying to figure out if an increase in police can even reduce crime. So at the very least, there should be an independent, robust meta-analysis to figure out how effective private surveillance and data sharing is before we take money away from other resources to pay for it. Right? And without getting too philosophical here, it is worth considering for a second that catching people committing crimes is very different than total antisocial or criminal events. Like, a thief or a serial killer or a drug dealer or a drug addict is not going to see a bunch of police cameras and then just tap out from crime and become a plumber. They're just going to commit crimes in a more obfuscated way, which more often than not complicates the process of finding a reformative solution. By the way, what an exhausting amount of sociology studies have figured out over the last 50 years is that high levels of surveillance drastically decrease well-being, morale, and even workplace productivity. If you think about that for a moment, it shouldn't really surprise anyone. If you have a job where you think your superiors are constantly watching you and judging your every move, you'll be concerned with appearing to be productive instead of learning and developing your skills at your natural speed. Or consider this recent study that strongly suggests that high levels of surveillance cause a steep decline in voluntary visual processing, meaning that it quite literally impairs the brain's ability to process and recognize human faces. And once again, this is one of those things that you hear and you're like, "What?" But when you think about it, it's not all that surprising. When you feel like you're in an environment where you are intrinsically not trusted, you're going to be far less likely to make friendly or meaningful social connections with others. And I'm sorry, but that just doesn't sound anything like a safe environment to me. And that's on top of an exhaustive amount of studies outlining exactly how and why increased surveillance decreases well-being and mental health. But some people seemingly only read the studies conducted by companies trying to sell them something.
Meet Mike Johnston, the mayor of Denver. At first, it seemed like he had the same careful agnosticism about Flock Safety cameras that many researchers have.
>> Our Flock cameras are shut off to every federal agency, everyone outside the state of Colorado, everyone outside the city and county of Denver. Uh, and no one can access them other than Denver Police Department officers.
You could interpret all of the research that I showed you in this video however you like, but the way I interpreted it was that it technically demonstrated and proved that statement to be false. But guys, don't worry about this.
>> Could a federal law enforcement agency use this database to track someone down on an ICE hold and arrest them? Uh, no, cuz the system is not designed to do that.
Except that in just over a year of usage of Denver's Flock Safety services, queries openly admitting to be used for immigration services amounted to over 1,800. Mike even warns us of the grave dangers of cutting this data off from external communities.
>> I want to be clear, this is a risk on public safety. If you have someone that commits a crime in Lakewood and flees into Denver, they will not be able to find that person in Denver. You know, we had a uh a trans woman who was kidnapped and murdered, uh, picked up in Denver, murdered in Lakewood. We solved that crime because Denver and Lakewood could talk to each other across a Flock camera database.
>> Except that didn't happen at all. He's referring to the death of Jax Gratton, which was not a solved murder. Jax's mother was more than happy to speak her mind about this.
>> I'm shocked and appalled that a public official would use my daughter and claim that Flock had anything to do with her being found.
>> Fortunately, as a result of all this [ __ ] and citizen backlash, the Denver City Council overwhelmingly voted to not renew the Flock Safety contract. And in their council letter on the issue, they went as far as calling out Flock Safety's ethics and credibility. And I quote, "We do not believe that the city and county of Denver should continue doing business with a company that has demonstrated such disregard for honesty and accountability." Whoa, now hold on before you celebrate your renewal of faith in humanity. Mayor Mike Johnston sidestepped city council and signed the Flock Safety contract anyway, which council members are now calling a backroom deal with a known bad actor. And all of this [ __ ] happened just in time for police to drive out from Columbine Valley, Colorado to knock on a woman's door in Denver with a summons wrongfully accusing her of stealing a package off of someone's porch. Any guesses on what technology they're citing as evidence?
>> Flock cameras.
>> You know, we have cameras in that town and you can't get a breath of fresh air in or out of that place without us knowing. Correct.
>> To be fair, that seems like a pretty safe prison city. While the officer refused to look at the overwhelming dash cam, porch camera, and Google Maps evidence exonerating the woman, fortunately, the police chief eventually did. But this makes one ponder: what's the result when this happens to a 19-year-old black dude?
Notably, this year, people in communities across the United States have been increasingly concerned about or opposed to the rapid expansion of private surveillance in their communities. A lot of people have been noticing more and more of these little black cameras with the solar panels and just assumed that they were innocently monitoring traffic flow, or maybe giving a dispatcher a better idea of who or what to send to deal with an accident. But now they're finding out what they do and what they're capable of doing. And a lot of people are just like, "Yeah, that." And now every few days I hear about another city pushing back and deciding to take down Flock Safety cameras. However, in some cases, a city formally deciding that they no longer want Flock Safety services or cameras somehow doesn't result in them going away. After finding out that ICE was using their cameras without their knowledge or consent, the Chicago suburb of Evanston decided that they wanted them removed. So, Flock Safety then reinstalled most of them. And since you can't own a Flock Safety device, only lease it, the city or police department would be handling and potentially damaging private property when removing them. So now Evanston, Illinois is spending tax dollars on legal expenses for cease and desist letters and covering the cameras with plastic sheeting to protect residents from being tracked by them. But my previous video about this topic came way too late. Organizations like Lucy Parsons Labs and Sassy South have been pushing back for years. Another great example: Will Freeman, a software engineer who started DeFlock last year.
>> Around a year ago, I was taking um a road trip from Seattle to Huntsville, Alabama. And I ran into so many of these along the way in these like really small towns. I wanted to do what most cities weren't doing and actually tell people what these things are, where they are, and how many there are.
Will is more or less simply trying to keep a map and publicly accessible record of deployed cameras. I think this is something that both local governments and Flock Safety should already be doing, and you'd probably have a hard time finding anyone who disagrees with that, right?
>> I've never gotten a cease and desist before. Uh, but luckily before this even happened, the EFF reached out via email and uh just said that they were there if we needed anything. So I reached out to them and then they were able to send a response, actually two responses, because their lawyers sent another uh another letter back saying basically like, we don't care. Uh, we think you're wrong. Anyway,
>> Flock's cease and desist here was pertaining to trademark, which in my opinion seems like an absurdly frivolous way to try and bully someone to take down a website that simply provides the public with some transparency about surveillance that their tax dollars pay for. But Garrett Langley, the founder and CEO of Flock Safety, doesn't see it this way.
>> And then unfortunately, there's terroristic organizations like DeFlock, whose primary motivation is chaos. They are closer to Antifa than they are anything else.
>> I mean, where do you even begin with this? Firstly, if you're going to live action roleplay The Dark Knight, at least watch the movie up until the point where Lucius Fox and Batman both agree that their mass surveillance system is grossly unethical and intentionally destroy it. Secondly, using buzzwords like Antifa doesn't exactly invite rational or good-faith discourse. You trolled this guy with legal demands and then publicly accused him of being a terrorist. But the key takeaway of this interview for me:
>> But that's why we have a democratically elected process, right? Like I— we're not forcing Flock on anyone.
>> Let me make something clear. My little farm here is not exactly in a dense urban environment. We don't even have sidewalks. I literally cannot leave my neighborhood to go get groceries or ship out a package without passing a Flock camera and having my activities logged into a database that is shared to a much larger regional database. I am not allowed to know who has access to this information. The people sharing my local information regionally or even nationally most likely do not know exactly who has access to it. I didn't sign up for Flock Safety. I never consented to it. I've never had an opportunity to vote for it. I do not have the option to opt out of it. And then, after researching and taking some pictures of the cameras that are constantly photographing me and seeking more information about them, I coincidentally get cops in my driveway waking my family members up, asking weird questions, and freaking my neighbors out. Now I'm shelving educational video projects to pay attorneys and constantly have to make sure someone is around to take care of my animals, because every day I'm not sure if I'm going to be [ __ ] detained for literally not breaking a single law. "We're not forcing Flock on anyone" lacks so much perspective and is packed with so much delusion and cognitive dissonance that even Forbes' senior editor can't manage to keep a straight face through the sentence.
>> We're not forcing Flock on anyone.
>> How you doing?
>> May I help you?
>> Here are just a few of the many strange events regularly happening at Benn's home and lab this month that may or may not be associated with this video.
That's somebody being like, "What the are you doing?"
Literally, all the neighbors are freaked out.
>> I think he's just recording video.
>> Would it be legal if I took my out?
Let's just be honest for a second and state the obvious. These cameras aren't exactly little impenetrable fortresses. They're plastic Android cameras and compute boxes mounted 7 ft off the ground with hose clamps. And in many cases, they can be found in semi-rural areas where one has trouble finding a stop sign that doesn't have bullet holes in it. And there are a whole lot of people who absolutely despise these cameras. But we're being patient and we're taking the high road. We're asking our local, state, and federal governments to not throw our tax dollars at a fast-scaling tech startup before adequately researching the risks and rewards, and especially before vetting the hardware and software services that are harvesting our information. It might seem like my research and videos on this topic are anti-Flock. And to some degree, because of delusional [ __ ] like this, they are. But we also shouldn't deny that there's a long list of companies trying everything they can to take Flock Safety's place on the leaderboard. If somehow we all woke up tomorrow morning and Flock ceased to exist, another startup would quickly be in their place, promising to help police solve crime in exchange for taxpayer money. But do you want to know what I find absolutely outrageous? That over 80,000 surveillance cameras were installed all over the country, and I'm not aware of a single public audit of the devices, the services, or the technology. And if there was, they would have found the exact same low-hanging fruit detailed in this video. And that's the big difference here that we need to constantly be acknowledging. Your government supposedly exists to keep order, security, and safety for society. Flock Safety exists to make money. They're not a charity. They're a $7.5 billion tech startup reportedly preparing to launch an IPO. And by far their biggest investor is Andreessen Horowitz. So this is pretty simple stuff.
Should we trust civilian data with a company that is partially controlled and funded by Andreessen Horowitz? Well, let's see. Marc Andreessen was a board member of Facebook during the Cambridge Analytica scandal and was one of the people on the hook for an over $8 billion privacy violation settlement. a16z's portfolio company, Coinbase, exposed sensitive information of 69,000 customers. Even the a16z website itself had flaws allowing hackers to view sensitive information about their portfolio companies. Ah, there's so much. What else? LendUp was shut down in 2021 for repeatedly breaking the law and cheating its own customers. Tellus deceived their customers into thinking that they were putting their money into FDIC-insured savings accounts. Wise was involved in funding Hamas and pig butchering crypto schemes. But no, I'm sure this time it'll be fine. I'm so sure that it'll be fine that I'm not even going to look under the hood.
I have an idea, and it's a pretty obvious conclusion to all of this, and I think it's an idea that anyone watching this can agree on. Well, except for maybe the people invested in government surveillance. But hear me out. If a private company wants to offer services to the government that are related to national security, public surveillance, or processing data that will be used within the public justice system, they will have to pay an application fee and provide access to any hardware or software that they intend to put in public. This application fee will pay for a small team of independent security researchers who are vetted and unaffiliated with the services or products that they're researching. This team will essentially do exactly what we did in this video, but with adequate resources. If problems are discovered, CVEs will be published and responsible disclosure will be followed, which would presumably help the company tighten things up without having to pay bug bounties. And if no major problems are discovered, then the company will receive a rating that will be valid for one year, until they have to renew their vendor license and get a less intensive inspection. This is not too much to ask. You can't open a hair salon without a license. You can't keep a McDonald's open without a health inspection. You can't legally drive a car past a Flock Safety camera without taking a routine driver's test and having a vehicle that passes basic safety requirements. This is an apolitical, common-sense solution to a really big problem. I'm going to formally propose it to legislators I'm in contact with. And I think it'd be really useful if you, yeah, you, wrote, emailed, and called your representatives, senators, and state attorneys general proposing the same thing. I don't
believe that humans are intrinsically right or left. But if you haven't noticed, finding objectivity in the media right now is like being a grasshopper stuck in the middle of a football field. I'm very familiar with Ground News. I've been a paying customer of the service for years. It's an app and a website that collects news articles from around the world and organizes them by political bias, reliability, and potential conflicts of interest such as media ownership. Ground News themselves are independently owned and funded by subscribers like myself, and they're vetted by three different independent news monitoring organizations. So, here's an example. Here we have this AI-generated drone illustration hovering over protesters, and it's warning them that they're being monitored by predator drones orbiting over Los Angeles. Yes, orbiting. I suppose if people could believe that the earth is flat, then people could believe that LA is a celestial body. Okay, so I head over to Ground News to see if this is even a thing. And kablam, it is. Then you can see these bias filters, which you could use as a sort of political compass or an amusing joy ride to see how cooked we all are. I can easily see who owns the media source and then their factuality score from three different news monitoring organizations. And then if I'm feeling brave, I can use the blind spot feature to see the news stories that my own personal internet echo chamber isn't showing me. And as usual, when I have a sponsor on this channel, any profit from that sponsorship will go to UNICEF Ukraine. Russia is actively targeting Ukraine's infrastructure, and nothing spoils the holidays like knowing that you didn't help children freezing to death. If you're into this, you can subscribe to get 40% off the Vantage plan by scanning this QR code or using my link ground.news/ben.
In my last video on this, I talked a lot about ethics. I talked a lot about how ALPRs and AI cameras could be misused and how they already have been misused. I outlined how I believe it easily violates Americans' Fourth Amendment rights. And here is some feel-good information that you probably don't get to hear much of in 2025: this is a completely nonpartisan issue. One could use this type of surveillance to track people that ICE intends to capture or deport. Or one could use this type of surveillance to track ICE to sabotage and warn people about raids. It could be used to track a woman leaving her state to get an abortion. It can also be used to track someone driving around during a lockdown during a pandemic. There is no civilian anywhere who is always 100% aligned with their government throughout their entire life. If mass surveillance sounds good to you today, then it probably wouldn't have sounded good to you 5 years ago. And I guarantee you that it won't sound good to you at some point in the future. This isn't a right versus left thing or a Republican versus Democrat thing or an empathy versus logic thing. It's an authoritarian versus individual thing. Privacy is a form of power that increases your control over your own destiny. And right now, you're at a junction where you're made to be so scared of your neighbors that you might be willing to give up that power. Or you can simply say, "No, I refuse to pay for my every movement to be tracked by my government through a for-profit company that hasn't even been adequately vetted to protect my security." But for this to stop, you need to use your voice and you need to get involved. There are links in the description to show you exactly how to do that.
This video is by far the most time-consuming, impactful, expensive, and collaborative project I've done on this channel or with this nonprofit. Obviously, it would not have been possible without John Gaines' incredible and thorough research or Joshua Michael's determination, ethics, and skill set. If you work in government and listened to that part a few minutes ago about setting up a research team to verify technology vendors competing for government contracts, both John and Josh's websites with their contact information can be found in the description. I also need to thank the exhaustive level of work from Ed Vogle, Sassy, Lucy Parsons Labs, my legal counsel, Albert Sers LLP, and those incredibly brave individuals who trusted me with information that could get them in a lot of trouble for sharing. They put the safety and common good of society above their own interests. And I want to acknowledge how meaningful and frankly beautiful that is. I want to thank all the legislators, police officers, and commissioners who, instead of calling me a terrorist, participated in a civil discussion on how to serve their communities better. But most of all, I want to thank my Patreon members. It might not seem like it, but this video and the associated research, the fact-checking, and the legal counsel cost tens of thousands of dollars. And quite literally, without the support of my Patreon members, independent research like this would be impossible for me. If you want to join that community and pitch in for more content like this, as well as a whole lot of dorky and sciency and artsy content, and be part of an incredible Discord community and forum full of like-minded folks and monthly songwriting challenges, you can join for as little as $1. Thanks for watching. Keep creating. Bye.