If you're an American, you've probably been seeing a whole bunch of these things. In some places, they're so common that you don't even notice them; they just blend into the background, like trees or street lights. And you've probably correctly assumed that they're recording traffic. They're also recording and logging license plates using AI image recognition. But what if I told you that they are in fact not owned by your local police department or your local government, but are licensed to them by a third-party startup, and that all of your vehicle's whereabouts are being tracked by a third-party data broker? What if I also told you that major retail chains are using them too, and they're combining your vehicle's whereabouts with your personal information, your shopping habits, and even your in-store behavior? And some of them are giving that information to law enforcement. And what if I told you that I just may have come up with a way to break it?

This Colorado family was being driven to a shopping center in a stolen vehicle, and they would have gotten away with it without cutting-edge technology alerting police to their crime. Nah, that was actually an OCR error. Well, this New Mexico woman and her 12-year-old sister were driving to the park with a stolen license plate when AI-enhanced technology allowed police to... That one was another computer vision error. Well, yet another car-thieving woman of color caught red-handed. Nope, just a glitch. This situation keeps happening all over the country. Your tax dollars pay startups to rent AI superpowers to law enforcement, and then the following year, your tax dollars pay the lawsuit settlements when it doesn't work.

Flock Safety is a startup, founded in 2017, that specializes in developing and leasing security cameras with AI capabilities such as license plate recognition and vehicle identification. These cameras feed into databases that law enforcement, private companies, and even private citizens can access and use. And if you own a car in the United States, you have unquestionably been logged in one of these databases.

It works like this: you drive past a Flock Safety camera and it records an image or video. An image segmentation model, or something similar, looks for the license plate itself, or a rectangle with some key identifiers like tail lights or a rear window. Once it is confident enough that it has found a license plate, it sends that portion of the image to an OCR (optical character recognition) model, which is probably the most widely used type of AI for consumers; for example, it's used in PDFs and in automatic data entry, like scanning and cashing checks on your phone with your bank's app. AI like this typically uses a confidence threshold: if the OCR model is above, say, 90% sure that your license plate was read accurately, then the read is saved with a location and timestamp. It can also classify the make and model of your vehicle and note any bumper stickers, add-ons, or cosmetic damage, and that information gets stored in the database too. A law enforcement officer can then run your license plate and see every single time your vehicle has been tagged in the database. So if you're in a city that has a lot of these cameras, it has pretty much the same effect as secretly sticking a GPS device on your car.
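To make that pipeline concrete, here's a minimal sketch in Python of the detect-then-read-then-log loop just described. The detect_plate and read_plate callables are hypothetical stand-ins for the segmentation and OCR models; none of this is Flock's actual code, just the shape of the logic.

```python
# Minimal sketch of the ALPR logging pipeline described above.
# detect_plate and read_plate are hypothetical stand-ins for the real
# segmentation and OCR models.
from dataclasses import dataclass
from datetime import datetime, timezone

OCR_CONFIDENCE_THRESHOLD = 0.90  # only keep reads the OCR model is >=90% sure of

@dataclass
class PlateRead:
    plate: str
    confidence: float
    latitude: float
    longitude: float
    timestamp: str

def process_frame(frame, camera_lat, camera_lon, detect_plate, read_plate):
    """Run one camera frame through detect -> OCR -> log."""
    region = detect_plate(frame)  # segmentation: find the plate-shaped rectangle
    if region is None:
        return None  # nothing plate-like in this frame
    text, confidence = read_plate(region)  # OCR: read the characters off the crop
    if confidence < OCR_CONFIDENCE_THRESHOLD:
        return None  # not sure enough: the read is discarded, not logged
    return PlateRead(
        plate=text,
        confidence=confidence,
        latitude=camera_lat,
        longitude=camera_lon,
        timestamp=datetime.now(timezone.utc).isoformat(),
    )
```

Every record like that, accumulated across thousands of cameras, is what turns a pile of snapshots into a searchable location history. Note the flip side of the threshold, too: a misread that happens to clear the bar gets logged just as confidently as a correct one, which is exactly how the wrong families end up at gunpoint.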
Flock Safety is also happy to provide service to businesses and homeowners associations, and those private clients often allow law enforcement to access the data from their cameras. Then there's the hot list. If law enforcement puts you on the hot list, they are notified every single time Flock detects you. So it's kind of like having a police tail all day, every day, but without the pesky annoyance of requiring a judicial warrant to target and track a citizen who may not even be suspected of any crime. Now you might be asking: what is there to stop, I don't know, Texas law enforcement from tracking women out of state who are suspected of getting illegal abortions? Or how would you prevent a jealous, abusive person who also happens to be a law enforcement officer from using the system to constantly track his ex-girlfriend and her new partner? Well, unfortunately, you wouldn't prevent it, because these are just two of the many examples of things that have actually already happened.

Flock Safety has also exhibited the usual hyper-aggressive startup behavior that we've all grown accustomed to. Near Orlando, Florida, for example, they installed nearly 100 cameras on public roadways without even notifying the county. And this entire industry pitches police departments with cherry-picked success stories: if we didn't live in a surveillance state, this old lady with dementia would never have been found. Or, if it weren't for all these cameras, this homicide suspect would never have been taken off the streets and would still be murdering people right now. Had this license plate reader error not led to a mother and her young children being held at gunpoint, that family would never have received a $1.9 million settlement from Aurora, Colorado taxpayers. Weirdly, I couldn't find that example in any of their sales decks.

In every single agreement I've looked up, the police department or local government does not own the cameras; it pays for their installation, then rents them for an annual subscription fee, usually between $2,000 and $3,000 per camera per year, though a recent price hike has apparently strained the budgets of some police departments. In all of these agreements, Flock Safety's liability policy is extremely tight and protects the company against virtually all claims should something go wrong or should an equipment or data error mislead an investigation. One of the most amusing parts is the section telling the police department to call 911, and not rely on Flock Safety's services, in the event of an emergency. Yikes. Cities, police departments, private businesses, and individuals are not allowed to sell, lease, lend, give away, auction, or otherwise transfer ownership of any of Flock's IP or hardware. At no time are you allowed to decompile, disassemble, reverse engineer, modify, alter, tamper with, or even repair any of it. Later in this video, I'll be disassembling and somewhat reverse engineering some ALPR cameras, but not Flock's. If I did have access to some of Flock's hardware, sitting in a box right over there, my attorney advises me that showing you any of it would jeopardize the existence of this video as a whole. A big reason for this aggressive scaling is that the business model of companies like this is actually quite brilliant. They market themselves as public safety services licensing cutting-edge technology, but license plate readers and security cameras have existed for decades.
Flock Safety and their competitors are data brokers. A city, police department, business, or homeowners association pays to borrow their cameras, and Flock retains the right to make some of that data accessible, behind the same licensing paywall scheme, to other clients and customers. Andreessen Horowitz, the VC of choice for data brokers and big-data startups like Databricks, Fivetran, Scale AI, and Golden, to name a few, has invested nearly half a billion dollars into Flock Safety, adding another $275 million just a few months ago. These are data broker numbers. For comparison, ADT, the largest and most recognizable home security company and brand in the world, went public in 1969 and has a market cap of $6.8 billion. That market cap is eclipsed by Flock Safety's recent Series F valuation of nearly $8 billion. And it needs to be pointed out that Flock Safety and Andreessen Horowitz have spent a combined $92,680,000 on lobbying, the vast majority of it in the last year.

Maybe you're a law enforcement officer and Flock Safety has helped you solve cases. Or maybe you think a poorly substantiated claim about a drop in crime is worth the constant monitoring and tracking of hundreds of millions of taxpaying citizens who are not criminals, and that the little glitches that end with young children held at gunpoint, face down on the hot asphalt of a parking lot, are just some cracked eggs in the omelet society is trying to make. If that's your perspective, fair enough; I strongly disagree with you. But this is a conversation that's not even worth having, because all of this is about to get a whole lot worse for everyone, except for the billionaires extracting wealth from taxpayers, of course. "Hey guys, I'm still in the studio. We just finished our Q1 product launch. Fingers are still a little bit shaky. Flock Nova is going to change the game for criminal investigations. I've heard from so many chiefs that they have this system and that system and that system. They know they have the data, but it's taking their analysts hours and hours to build a case. And now with Flock Nova, it's one click to one case solved." Flock Nova is a brand-new product that is in beta at the time of me recording this. It combines computer-aided dispatch, criminal records, video surveillance, automated license plate reader data, and, here's the kicker, third-party data acquired by Flock Safety. What third-party data, you ask? They haven't exactly been transparent, but 404 Media asked the same question and found that Flock Nova was using data leaked and sold from large-scale hacks and security breaches. Since that story broke, Flock has made an announcement claiming to no longer be using illegally stolen personal data.

Most of us, at face value, don't really seem to mind, or at least don't think it's a big deal, that our data is being farmed from us. Like, is it really that big of a deal that your activity on Pinterest, under a mostly anonymized account, is being stored and analyzed and sold? Well, it becomes a very big deal when it's combined with other data and stops being so anonymous: when your health questionnaire information from BetterHelp is sold to Facebook, where you posted a bunch of photos from bars 10 years ago, and then all of that gets bundled by a data broker with the tracked and logged bad driving habits that your vehicle manufacturer sold. Yeah, that's a thing, too. Guess what?
All of this may explain why your car insurance costs twice as much as anybody else's in your age demographic. Or it may explain why you can't seem to get a good interest rate on a loan. And now, especially in 2025, something you may have posted a decade ago can get you arrested and deported. My point is, data privacy never really seems like a big deal until it is. Some states have various laws regarding this, but nobody really seems to give a [ __ ], because in the few cases where companies do get caught, the fine or settlement is a mere fraction of the income generated from buying and selling your third-party data.

If a large retailer or fast-food chain isn't using Flock Safety, it's probably using some other type of automated license plate reader. But once you walk in those doors, oh my god. Let's just use Walmart's corporate privacy policy as an example. They're logging your personal identifiers: your name, your phone number, your address, your email, your driver's license number, your signature, and device identifiers such as your phone's or smartwatch's MAC address. They're sussing out your age, gender, citizenship, race, marital status, household income, education, employment information, family health, number of children, credit card numbers and other payment information, geolocation history, photographs, audio recordings, video recordings, and, if known, background checks and criminal convictions. Oh, hang on, I haven't even gotten to the creepy part yet: inferences, a.k.a. your behavior and preferences derived from your shopping patterns, your intelligence, and your aptitude. And they reserve the right to share any and all of this with third parties, including but not limited to, you guessed it. It's been reported that police departments have been running queries and sharing Flock Safety's data with ICE. Some Home Depot and Lowe's locations are also Flock Safety customers, and they have also been known to share Flock Safety data with law enforcement. It might be safe to assume that some of the immigration raids happening at Home Depot and Lowe's locations are connected to this honeypot of data. So when an immigrant picks up some construction supplies before work, the store sends the license plate and vehicle identification data to Flock Safety. That triggers an alert on the hot list, and just like that, masked men arrive, pick up the Venezuelan day laborer with no criminal record, and ship him off to a prison camp in El Salvador. Like, why can't y'all just [ __ ] sell tools?

So, we have these creepy AI cameras everywhere, proving every paranoid schizophrenic person right by logging our vehicles' every move for police, retailers, and even gated communities. Then we have retailers building advanced personal and psychological profiles about us, which are then traded to data brokers and combined with our internet shopping, browsing, and social media activity. And now law enforcement data brokers are buying it all up. Here's a hard truth that we all need to consider: even if every single actor in this data brokerage theater were playing by the rules, following every law, and acting ethically, your data would still only be as safe as the database with the weakest security policies, or that one employee who falls for the phishing email. In 2024 alone, counting only data breaches that were formally reported, there were more than 10,000 incidents affecting 4.2 billion compromised records.
But fear not: our legislators are coming to our rescue by banning TikTok. Did I hear that right? It kind of doesn't even matter at this point. Debating the safeguarding of personal data in 2025 is like standing on the burnt remains of a home and debating what kind of fire extinguisher you're going to put in the kitchen. This next section isn't about big data or retailers. We're going to circle back to the data mining that your tax dollars pay for, see just how insecure it is, and try to come up with some ways to defend ourselves from it.

In this chapter, we'll technically go over how ALPR cameras work and examine a few hardware vulnerabilities, but we kind of don't have to. From what I gathered, Flock Safety cameras can be reached over Bluetooth, and, without breaking off onto a completely separate tangent, security is not Bluetooth's strength. To Flock Safety's credit, from what I could tell, all the Bluetooth seems to be doing is relaying hardware and status information about the camera and transmission system. The camera system also has a Wi-Fi radio that acts as a router with WPA2 encryption, which a lot of consumer products still use despite it being a 20-year-old technology with a lot of known flaws. One of the biggest vulnerabilities is in the handshake. If you try to guess the passcode on an iPhone, you get about six tries before the device times out and makes you wait before trying again, and it's the same or very similar for any smartphone, or Gmail, or any modern login page. With WPA2 Wi-Fi, there's a pre-shared key, and an attacker can capture the handshake on their own device and then point a GPU at it offline, with no rate limiting at all. If everything's updated and working properly, and there are no other security vulnerabilities on the router, a modern gaming GPU can crack an 8-digit key in under 3 hours. If your Wi-Fi password is a known word that would be in a password dictionary, I'd give it about 60 seconds.

With some other vendors and services, you don't actually have to crack anything, because they failed to secure their RTSP video feeds properly. In just a few hours, with some specialized search engines, I managed to browse my way into live feeds from over two dozen traffic cameras. What's even more troubling is that in a couple of small towns I audited, if one camera wasn't secured, none of them were, giving anyone the capability of tracking every single vehicle in town indefinitely without ever having to see or write a line of code. Another troubling example: Hikvision, the largest IP camera company in the world. They make all types of cameras, many with ALPR technology and marketed toward police departments. In 2022, hackers were able to not only view the feeds but exploit the firmware used in over 80,000 cameras, allowing them to execute code remotely. And just last year, the Russian military compromised Hikvision cameras in Ukraine to obtain intelligence, plan air strikes on Kyiv, and subvert Ukraine's air defense systems. In fact, right now, at the time of me recording this video, you can download an exploit for Hikvision cameras on GitHub that lets you view the feed, retrieve snapshots, and extract user credentials. Then you have Verkada, a massive security camera company whose chairman has bragged about locking clients into a predatory subscription model: "There is a hardware component and a licensing component.
Now, the license: you might have bought a one-year license, but you, like, literally bolted the hardware to your ceiling. So, like, you're not taking it down." Many of their clients in health care, prisons, police departments, and so on couldn't afford to leave for a more secure ecosystem. So, when every single one of their 150,000-plus cameras was hacked because a corporate administrator account's username and password were publicly exposed on the internet, hackers not only had access to the private video feeds, but to the networks they were connected to, and they were also able to access those cameras' archives, the clips that customers had saved.

If there's one way to get me completely obsessed with something technical, it's to tell me that I'm not allowed to learn about it. And few industries are cagier about technical analysis or pen testing than the security camera industry. One would assume it's because they do not want the general public figuring out the technical function of a device that people depend on for security, and that by forcing customers into tight agreements and cease-and-desisting anybody who posts a disassembly video, they're making you safer. If you're in the market for any type of home, business, or civic security device or solution, you should be looking for one with a high level of openness and transparency. Probably about the worst thing a nerdy YouTube channel can do to malicious hackers is publicly report on security vulnerabilities, because it forces manufacturers to swiftly correct them or lose customers to competitors.

I started my journey here by obtaining one of the most popular license plate recognition cameras mounted on police cars, made by Vigilant Solutions, which is the tough-sounding name Motorola uses when marketing to law enforcement agencies. It truly did feel like more of Motorola's effort went into obfuscating its functionality and making it absurdly proprietary than into the part that, you know, reads license plates. And ironically, it had a microSD card in it with unencrypted data. Yeah, not great. Most of these cameras have two different sensors, one for daytime and one for nighttime, plus a bunch of infrared light emitters. So, for example, if a police officer is driving behind a vehicle at night, the unit sends out an infrared flash and takes a photo at a very short exposure, so the frame is underexposed everywhere except the infrared reflection from the license plate itself. And keep in mind that our human eyeballs cannot see infrared light, so neither you nor the police officer probably even realizes a flash is taking place. I'll demonstrate this really quick: here is a longer exposure with no infrared flash. Here is a longer exposure with an infrared flash. And here's a very fast exposure with the infrared flash. Anyway, this channel's attorney has expressed some concern over the complicated legalities of powering this camera on and using it in this video, considering how I obtained it. And no, I did not steal it off of a police car. So, instead, I took it apart, analyzed just about every chip on it, and then rebuilt something similar to the functionality of the daytime camera to see if I could get a nice license plate reader on my truck, just like the police have.
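As an aside, the underexposure half of that trick is easy to approximate yourself. Here's a minimal sketch for a Raspberry Pi camera using the picamera2 library; it assumes a sensor without an IR-cut filter and an external IR illuminator firing during capture, and the exposure value is just a starting point to tweak.

```python
# Sketch: capture a deliberately underexposed frame so that only strongly
# reflective surfaces (like a retroreflective plate lit by an IR flash)
# survive. Assumes a camera with no IR-cut filter and an external IR emitter.
from picamera2 import Picamera2

picam2 = Picamera2()
picam2.configure(picam2.create_still_configuration())
picam2.set_controls({
    "AeEnable": False,     # turn off auto-exposure so our settings stick
    "ExposureTime": 200,   # microseconds: far too short for ambient night light
    "AnalogueGain": 1.0,   # keep gain at minimum so the scene stays dark
})
picam2.start()
frame = picam2.capture_array()  # plate reflection dominates; background is near-black
picam2.stop()
```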
And then, rather than sending all the data to an external computer server, I figured I might as well try to do everything in one sleek device that I put together. I put a Raspberry Pi 5 behind a 7-inch touchscreen display, then used a USB camera module with the exact same specs as the daytime camera. As mentioned earlier, the first order of business is image segmentation: finding the license plate itself. And this was the most challenging part, the one that required the most tweaking, because rectangles are everywhere. The OCR, the plate-reading part of this project, worked just fine. But if this were my police department and I were the chief, I'd demand better. And ultimately, what's the point of doing all of this if we're not going to create a future-proof method of using technology to let citizens opt out of being logged by these kinds of devices? So, we need some more modern AI. There are quite a few license plate reader models available, and a virtually endless supply of shape recognition models. YOLO (You Only Look Once) performed the most accurately of the bunch, working much better than the OpenALPR-based models that some popular public safety data broker companies use. But YOLO is also pretty power-hungry, so I had to add a Hailo AI accelerator board with 26 TOPS to my little police computer here. And then I decided to switch to a camera that communicates directly with the Pi over its camera interface, so I wouldn't have to deal with the overhead of a USB controller. I know this is so dorky, but look: we have an incredibly powerful computer vision system that's much more accurate than anything on the market for law enforcement, and it probably cost me around $250 if you don't count the trial and error. What's better is that I can reuse an accurate computer vision system for future projects. Or maybe it'll just let me know when one of my chickens escapes.
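For a flavor of what that detection loop looks like, here's a stripped-down sketch using the open-source ultralytics package and OpenCV. The weights file name is hypothetical (a YOLO checkpoint fine-tuned on license plates), and the Hailo offload is left out, so this is not the exact code running on my Pi.

```python
# Stripped-down YOLO plate-detection loop. "plates.pt" is a hypothetical
# checkpoint fine-tuned on license plates; the crop from each box is what
# would get handed to the OCR stage.
import cv2
from ultralytics import YOLO

model = YOLO("plates.pt")
cap = cv2.VideoCapture(0)  # USB camera; a CSI camera would go through picamera2

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    for result in model(frame, verbose=False):
        for box in result.boxes:
            x1, y1, x2, y2 = map(int, box.xyxy[0])  # plate bounding box
            plate_crop = frame[y1:y2, x1:x2]        # ready for an OCR pass
            cv2.rectangle(frame, (x1, y1), (x2, y2), (0, 255, 0), 2)
    cv2.imshow("ALPR", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break

cap.release()
cv2.destroyAllWindows()
```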
But in the meantime, we're going to pump it through Rekor Scout, a company similar to Flock Safety that also offers the option of sharing my data with the police. This extremely secure sharing network allows agencies to access data and records from any other participating partner across the Rekor ecosystem, including residential members of the Rekor Scout home package. Come to think of it, now that I'm saying this out loud, this also presents a pretty big security vulnerability, since I could just rob a bank and then supply input data via an RTSP stream tagging my vehicle 400 miles away from the bank. But anyway. All right, so welcome to Ben's Top Models. AI models, that is. We have Rekor, YOLOv11, Plate Recognizer, and classic OpenALPR as our four license plate and vehicle tracking test subjects. And I even made a custom license plate for this project, which is also what I will make my DJ name. "Ben Jordan is taking a more direct approach. He wants to confuse the AIs using something called adversarial noise, confusing AI models and preventing them from replicating the track." If you've been here before, you may have seen my ongoing project that encodes music files with noise that AI hears but humans cannot. It leaves instrument and genre classifiers completely confused, making the AI hear complete and utter nonsense. The whole point of that project was to develop something that would prevent AI companies from scraping my and others' albums and reselling the data as an AI music service. And so far, the algorithm is working very well and improving.

And the reason I develop things like this is that the people we've elected to protect our intellectual property and our privacy rights aren't doing their jobs. So if we want any sort of defense from this, we need to roll up our sleeves and use technology to defend ourselves. Sometimes videos like this are controversial, because adversarial noise attacks have the potential to become a pretty big, expensive problem for the AI industry, especially since the vast majority of that industry is rapidly feature-hunting while building on top of pre-existing AI models that just work. For example, you can't ask an AI model how exactly it recognizes a face, because it doesn't have a human brain or eyes, nor is it equipped with a consciousness-based intelligence capable of actually recognizing a person. This black-box problem isn't really a problem while the model manages to generate an image or a song or recognize a license plate; it just works. But when somebody introduces an attack that prevents the model from working, you can't just open it up and fix a few things. A few years ago, with something called Nightshade, some tech-savvy artists attempted to poison-pill their artwork by introducing invisible noise that confused models trained for image generation, and it worked incredibly well at first. Luckily for the generative AI companies, they figured out that by blurring the images and then using another AI algorithm to sharpen them again, they could remove most of the confusing noise. That's not exactly a cure; it's more of a bandage, because training image models now takes longer and costs more thanks to those extra steps in the data processing workflow. And that particular bandage isn't going to work well for the optical character recognition in license plate readers, because the whole point of that technology is accurately outputting text from an image, not generating abstract license plate photos.

So, with that said, here's what we're doing: I have a license plate database, and I make 1,000 identical traffic images with a plate superimposed on a car. If a license plate isn't detected at all by both models, meaning neither even sees the plate rectangle, the image goes to class A. If the plate was detected by both models but the characters were read wrong, it goes to class B. If either model detects and reads it correctly, it goes to class C, the control group. Then I repeat this a whole lot of times with a whole lot of different license plates to create a robust data set for training a new model, whose job is to generate abstract, invisible license plate overlay patterns that humans cannot detect but that make license plate recognition systems utterly [ __ ] the bed. I put the Processing application up on GitHub, and once you install Processing, it's pretty easy to use; everything is explained in the code itself. Once you configure everything correctly, you'll have a bunch of images like this, with tiny bits of noise on the plate, some more conspicuous than others. Now we need to test those images, and I created a Python proving ground to do that. Once the GPU libraries are installed and you run the script, it opens a little dialog where you select a folder, goes through every single image in that folder, then creates a new folder and fills it with annotated images telling you where it detected the license plate, what it thinks it read, and a confidence value for each.
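In skeleton form, the triage logic amounts to something like this. The two model callables are hypothetical stand-ins for any pair of ALPR systems (the real script is on GitHub), with None meaning no plate was detected at all:

```python
# Sketch of the class A/B/C triage described above. model_a and model_b
# are placeholder callables returning the plate text they read, or None
# if no plate-like region was detected at all.
import csv
import shutil
from pathlib import Path

def classify(image: Path, true_plate: str, model_a, model_b) -> str:
    read_a, read_b = model_a(image), model_b(image)
    if read_a is None and read_b is None:
        return "A"  # neither model even found the plate rectangle
    if true_plate in (read_a, read_b):
        return "C"  # read correctly by at least one model (control group)
    return "B"      # detected, but the characters came out wrong

def run_proving_ground(folder: Path, true_plate: str, model_a, model_b):
    rows = []
    for image in sorted(folder.glob("*.png")):
        label = classify(image, true_plate, model_a, model_b)
        dest = folder / f"class_{label}"
        dest.mkdir(exist_ok=True)
        shutil.copy(image, dest / image.name)  # sort into class folders
        rows.append([image.name, label])
    # one CSV per run, so the whole experiment can be eyeballed in one table
    with open(folder / "results.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["image", "class"])
        writer.writerows(rows)
```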
We have POO 5000 with the original. We have MOO 5000 with one of the perturbed plates. We have PDQ 5095, 4DO 5000, PO 5D. And then, what I'm really interested in: on some of these, it didn't detect a plate at all. Somehow the noise pattern actually perturbed the image segmentation model, the part that detects the rectangle on the vehicle. And if we could figure out exactly what pattern breaks the image segmentation, the detection of the license plate itself, then it's possible you could make a license plate holder that doesn't obfuscate the plate at all but still makes it much less likely to be read by the AI in license plate readers. Oh, and my little Python app also dumps all the data into a CSV file, so you can analyze everything in front of you at once.

Then it was time to test this in the real world against these prototypes. I thought it would be fun to print the noise onto transparent adhesive sheets for my POO 5000 license plate. This is just an easy way to test the patterns, and definitely not something you would actually want to put on your car and drive on public roadways, unless you miss your friends in prison. Hey officer, it's just a little mud from all my off-roading. 12 mph: I'm definitely throwing these cameras a softball. Let's do one at congested-parking-lot speeds. I mean, that's essentially holding text right up to the camera. That looks like a normal license plate. Some of these ALPR models are not as visually interesting as others, so rather than showing a bash terminal scrolling text, we'll stick with footage of the ones where I could pump live video in and get real-time results. With no adversarial noise, just the raw license plate, it detected POO 5000. This is adversarial noise layer A, and it's definitely pretty confused; it doesn't seem to pass the confidence threshold to even register a plate at all. I also tried an old plate style that hasn't been used in 30 or 40 years and has red text; it's just very different from the plate we were initially attacking with the black noise, and it definitely confuses things quite a bit. All right, this is YOLOv8. Booyah. Not all of them passed the test, which was completely expected, but a lot of them did. And I'm reasonably confident that if I screen-printed some of these onto a plate and slapped it on my vehicle, I'd be a traffic camera ninja. Again, these are probably not street legal anyway, and they're really just training data for the final model, which will ideally be completely invisible to the human eye and not obstruct the license plate in any perceivable way.

When I made the video about HarmonyCloak and introduced Poisonify, my AI music defense algorithm, I got inundated with people asking for a user-friendly app or website where they could run it on their own music. I got VCs wanting to invest in it and companies wanting to own it exclusively. I'm flattered, if not surprised, by that level of enthusiasm, and it's actually really inspiring. But this channel is a nonprofit organization that lets me do independent research on whatever I feel like researching and then share those ideas publicly for free. It's my dream job. And that same set of principles applies to this project. In Greek mythology, there's a venomous, multi-headed serpent that lives in the swamps.
It goes by the name of the Lernaean Hydra. Any time somebody tried to slay this monster by cutting off one of its heads, two heads would instantly grow back in its place. This monster caused many generations to live in fear, until one very powerful and brave man was tasked with destroying it. Donald J. Trump, or Hercules, or John Wick, I'm not really sure. If projects like this have any hope of putting a technological dent in things, or of inspiring legislation that further protects people's IP or privacy, then you're going to need a whole lot of people working creatively on solutions and sharing that information with one another. That way, if AI companies figure out a way around one defensive technology, two more will pop up in its place.

And it's worth pointing out that I have absolutely no problem with my license plate being identifiable or readable by law enforcement officers, or even by the person driving behind me. I don't even have a problem with police using license plate readers mounted on their vehicles. Driving a two-ton vehicle capable of going well over 100 mph on a public roadway or in a parking lot is a pretty big responsibility and privilege. Anywhere in the United States, if law enforcement wants to track your location via your phone for a prolonged period, or even ping you beyond a single cell tower, they are required to have a good reason, and a judge needs to agree that it's a good enough reason by granting them a warrant. This was argued, debated, and decided by the Supreme Court in the landmark 2018 case Carpenter v. United States: time-stamped location data "provides an intimate window into a person's life, revealing not only his particular movements, but through them his familial, political, professional, religious, and sexual associations." These location records "hold for many Americans the privacies of life." So, it would be reasonable to assume that tracking someone's location with AI and no warrant would violate their Fourth Amendment rights, right? Well, that has not been ruled on yet. And if the courts ever do agree that it's a constitutional violation, they'll be leading the Flock Safety unicorn out of its $8 billion stable and shooting it right in the head. Considering the combined $92 million lobbying effort of Flock Safety and their primary investor, maybe it's not so confusing why nobody will straighten this whole constitutional-rights thing out. My point, which has taken forever to get to, is that I have no idea whether using adversarial noise to confuse an AI tracking system is legal in your state. For this reason, if you were inspired by this video or this information, I would advise and ask that you use that inspiration to research this type of technology, and I would not advise or ask that you immediately screen-print a pattern or stick a matte sticker on your license plate, because you could very well be the one they make an example out of.

I want to talk today about my love affair with this video's sponsor, my Patreon members, because without them, I would not be able to tell you that this video would have been a perfect fit for the non-stop sewage-pipe stream of emails I get trying to get me to talk about services like DeleteMe or Incogni, which charge a yearly fee to remove your information from a few data brokers. The problem with these data protection services is that there is no complete data protection service, because there is no complete list of data brokers for them to contact.
And that doesn't even matter, because in the US a data broker is only required to stop displaying the data on its own servers. So when they sell or license that data elsewhere, which is kind of the point of being a broker of data, your data just gets displayed again somewhere else. In fact, Incogni is owned by Nord Security, the NordVPN company, whose marketing affiliates have repeatedly sent me sponsorship offers with key points describing them as hack-proof, despite their having been hacked in a data breach they didn't disclose to their users until a year and a half after it was discovered. By the way, Reject Convenience made an excellent video essay on this that I recommend and will link below.

Everywhere in the European Union, your data privacy is treated and recognized as a fundamental human right. Their General Data Protection Regulation requires you to explicitly opt in before your personal data can be logged or collected, and it provides the framework for similar protections in the UK, Japan, Brazil, Argentina, Switzerland, South Korea, Canada, India, even China, and so on. In the majority of these countries, the protections are extraterritorial, meaning that even though Google and Facebook are American companies, they need to become and remain compliant with those rules if they want to keep doing business there. There's no reasonable need to disrupt license plate readers in those places, because your health information, your social media activity, and your facial expressions and sentiment while standing in line at Walmart aren't being bundled with the readings and leased to law enforcement. Since the boom of the internet in the 1990s, the dangers of personal data brokerage have been apparent to just about all of us, and everyone from sociologists to data scientists to concerned taxpayers has begged legislators to protect them. Even after stolen data was used to manipulate and subvert democracy in 2016, another decade passed without any meaningful protection from the federal government. So yeah, I guess we'll just [ __ ] DIY it.

Hey, video-editing Ben here. If you're a law enforcement officer, or you work for a law enforcement agency, or you're associated or aligned with them, and you've lasted to this point in the video: thank you. I have been rather blunt about my feelings, but I also don't want to make a video that isn't solution-oriented. A big thing that inspired, or I guess frustrated, me into making this video was coming across the social media video of the Flock CEO talking about Flock Nova being "one click to one case solved." Every single person I've met in my personal life who has worked in law enforcement has been an incredibly dedicated, hard worker who I feel has been under-trained and underpaid. And from my perspective, in the industries I've worked in, hearing a tech CEO make your work look easy for their own personal and financial gain pisses me off. I understand that you're interested in this technology because you want your community to be safer, and I think that's a noble reason. I'm going to write a brief open letter/proposal about how your law enforcement agency could use technology like this in a way that is customizable, upgradable, more secure, and, most importantly, built on a framework designed for compatibility with admissible trial evidence.
And furthermore, if you have any technical questions about things you saw in this video, reach out via email and I'm happy to try to answer them or point you toward someone who can. While it may not seem like it, I am actually aligned with your ability to safely reduce crime. It's just that I want the most effective forensic technology to be owned by the justice system, not by venture capital firms. This video took quite a few turns, and hopefully some of you learned something or found it meaningful. It was also pretty expensive, between hardware from government auctions, components, and ALPR licenses. But I'll tell you what: this video would have been impossible to make without my Patreon supporters, so thank you very much to them. And if you want to see more content like this, as well as access to ultrasonic bird recordings and tons of other science-y and weird field recordings and audio assets and unreleased music, or our Discord server full of music and science dorks, or our monthly songwriting challenge, then please join us. It's as little as $1. Thanks for watching. Keep creating. Bye.