How to Apply the 'Tyrant Test' to Technology
Justin Hendrix / Feb 1, 2026

The Tech Policy Press podcast is available via your favorite podcast service.
Just last week, the Department of Homeland Security released its latest AI inventory, revealing more than 200 AI use cases deployed or in development—an almost 40% increase since July. ICE alone has added 24 new AI applications, including tools to process tips, review social media and mobile device data, and deploy facial recognition to confirm identities. At least 23 applications use some form of facial recognition or biometric identification.
As the New York Times reported on Friday, tech companies are enabling a surveillance infrastructure that can target anyone for profit while challenging constitutional principles. And we have few laws that are fit to regulate it.
This is exactly the trap my guest today has been warning us about. In his forthcoming book, Your Data Will Be Used Against You, George Washington University Law School professor Andrew Guthrie Ferguson explores how the rise of sensor-driven technologies, social media monitoring, and artificial intelligence can be weaponized against democratic values and personal freedoms. Smart cars, smart homes, smart watches—these devices track our most private activities, and that data can be accessed by police and prosecutors looking for incriminating clues.
The book is out March 17 from NYU Press and is available for preorder today.

NYU Press, March 2026.
What follows is a lightly edited transcript of the discussion.
Andrew Guthrie Ferguson:
Andrew Guthrie Ferguson. I'm a law professor at the George Washington University Law School. I'm the author of Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance that's coming out this year, in March of 2026.
Justin Hendrix:
Andrew, your 2017 book, The Rise of Big Data Policing, covered, on some level, some of the same issues as the book you're about to publish. This may seem like too simple a question, a softball in a way, but I want to see how you might answer it in terms of how you're thinking about the trajectory of things. What has changed in the nine years since?
Andrew Guthrie Ferguson:
So The Rise of Big Data Policing was the first book to really interrogate how police were using new data-driven technologies to change the power relationships between citizens and police. It studied predictive policing, early forms of video analytics, and big data policing. But two things have changed. One is the technology has gotten a whole lot better. And the second thing is that the regulatory framework hasn't really changed at all, which has meant that the technology has advanced and expanded without corresponding limits in law or regulation. And so we've seen police departments, local and federal, use new technologies in ways that are largely unregulated. We've seen many of the warnings that were discussed in that first book come to fruition.
And so what I do in this new book ... Part of the self-surveillance world I'm talking about is democratically mediated self-surveillance. We are the ones putting cameras on our streets. We are the ones putting video analytics in our policing systems, and we are the ones paying, with our tax dollars, for these new forms of self-surveillance. And I think we need to interrogate whether that is really good for democracy and good for us.
Justin Hendrix:
You also catalog the ways in which we are the ones, of course, outfitting our homes and ourselves with digital devices that are collecting information, collecting evidence. You say you're always one warrant away from having to give all that up. I want to focus in on the home in particular, because I think that's one way into this book is to think about the way you approach this. You talk a bit about what's going on in the home, the types of things we're outfitting our homes with, but also what you call the law of the home.
Andrew Guthrie Ferguson:
Right. So the book talks about the seduction of self-surveillance, this idea that we are putting smart devices in our homes because we've been convinced that they are good for us. I'm not here trying to condemn people who have put Alexas in their bedrooms or Ring doorbells and cameras on their front porch. I understand it's part of the seduction. There's a reason why people have been convinced. What I think people haven't focused on is that all of that data, the most private and privileged data you might imagine, is at best a warrant away from discovery by the police should they think you're involved in a crime. And what I mean by that is there is nothing too private. The data from your smart bed, your period app, the whispers you might say in your bedroom that get picked up by a smart listening device: there is nothing that police cannot get with a warrant. And much of it they can actually get without a warrant, even though we try to protect the home.
And the home is, under our law, under our Fourth Amendment, among the most privileged places, the places where the Supreme Court has consistently tried to at least draw some boundaries. But what we're doing is putting in devices that connect with third parties outside of our home. And the law is a lot fuzzier when we get to data that is connected to third-party providers, data that is available, obviously, to the providers you are paying to store it, but possibly also to law enforcement and government interests.
Justin Hendrix:
I just read a piece in The Guardian today about the extent to which domestic abusers are using data off of these devices. Another piece of that: you talk a little bit about abortion rights and the types of trouble that people seeking an abortion might face. On this podcast, we recently had Chris Gilliard. He talked about the idea of the Apple Watch as being effectively an ankle monitor. It seems like when you talk to people about these things, they're aware of the dangers on some level, but they always make the decision to simply go with the convenience, go with the benefit to themselves, and to push aside the concerns about the broader social impacts.
Andrew Guthrie Ferguson:
And that's precisely the dilemma that the book wrestles with. It is trying to explain and interrogate why we are seduced by these claims of self-surveillance. And many times there are actually good reasons to have it. That smartwatch on your wrist might tell you whether you're sleeping well or not, or maybe give you some clues about how to improve your life. There are reasons why we like that convenience, but what we haven't done is recognize that we are really opening ourselves up, in a really vulnerable way, to having all of that very personal data used against us should the government decide to target us for any reason. And while we as a society might be okay with the police using some of that data, even if it's personal and intimate, for a particular kind of crime, say a child abuse crime or a domestic violence crime inside the home, as we've been seeing with this government currently in control of the federal law enforcement powers, the limits on how police power will be used are pretty much unrestrained.
And I think what is interesting is that, looking at the data we are providing to the world, we are really vulnerable to having that data used against us. And you brought up abortion. So many people, when they find themselves pregnant, do lots of things to get information. Maybe they use Google to search for information, maybe they ask their smart devices, maybe they are tracking their own personal medical information. Maybe they have a period app that's collecting data. All of that information can be recovered by police if there is a predicate "crime". And in communities or states where abortion is getting criminalized, it means all that personal data is at best a warrant away from retrieval. And I'm not sure that's a society we want to live in. I'm not sure that we really trust, no matter who's in charge of the government, Republican or Democrat, that we want that kind of power to go into our homes and take the information that we have given up for good reasons to be used against us in a court of law.
Justin Hendrix:
You talk about the law of the home. You say that in the hierarchy of Fourth Amendment rights, the home stands as our most protected space and has long held a special primacy in US constitutional law. Why isn't the Fourth Amendment serving its purpose? How are the constitutional constraints, the limitations that folks might want to believe are in place, breaking down?
Andrew Guthrie Ferguson:
So the problem with the smart home in the context of the law of the home is that the data in the home doesn't stay in the home. If it just stayed in your home and was all localized there, police would need a warrant. Again, what a crime is, whether that crime is seeking an abortion or dissenting against the government or whatever it would be in this day and age, might change, and the predicate to get a warrant actually isn't that high, but the data would at least require a warrant to get into that privileged space of the home. The complication with smart homes is the data doesn't stay in the home. Part of it's in the home, but part of it's going to the third party that you are paying.
So I always joke that if the government were to say, "Hey, tomorrow we're going to put wiretaps inside your home to listen to what you're saying. We're going to put cameras on your door to see who you're associating with and what you're doing. We're going to track what you buy and what you read. We're going to have a neighborhood of snitches telling people what's going on," I think people would be horrified that this sounds like a true surveillance nightmare. And yet that's exactly what people are paying Amazon for. We have Alexas in our bedrooms. We have Ring doorbells on our front doors. We have the Neighbors app connecting everything that's going on in our neighborhoods. They know what you read on Kindle, they know what you buy at Whole Foods and on Amazon.com. And that's one company. We consumers are giving up this data to private entities without control over how it's being used. And I think we should live in a world where we can have that consumer relationship with those companies, but also have laws in place so that data can't be used against us in a court of law. And that would be the perfect world, where we could actually have the consumer convenience but not be afraid that this data is going to be weaponized against us.
Justin Hendrix:
So of course, we're having this conversation in the context of a moment in this country where we're seeing, as you mentioned, some of the warnings coming true. I feel like someone says that to me every couple of days in conversations around tech and surveillance and policy: that warnings made years ago, especially by civil liberties advocates and folks concerned with privacy, have almost all become real, especially in places like Minneapolis, where we're seeing real-time use of facial recognition. We're seeing the federation of data from data brokers to make predictions about where people will be so that authorities can try to optimize raids at particular parts of the day, just an extraordinary combination of technologies. So I guess, if you had to revise your warning ... You've been making the same warning now for some time. What's the warning now? Where are we headed, in your view, in the near term?
Andrew Guthrie Ferguson:
Two points. So first, I was just reminded that I was on a podcast in the first Trump administration talking about how the use of surveillance data against undocumented individuals was going to be the foothold to expand surveillance technologies in a way that we hadn't seen before. Why? Because people who are in an undocumented status in the US are the least powerful. They're completely powerless. Under the Constitution and other rules, it's much harder for them to push back. And so those of us who were warning years ago that this was going to happen have in some ways been proved right. But I think what we're going to see now is that this is actually the first stage of the federalization of this surveillance power, one that begins with ICE but then expands out. Everything you're seeing happening directed toward protesters in Minneapolis can happen against anyone anywhere in the United States, because that technology will be available to law enforcement unless there are laws put in place to check it now.
One of the interesting, complicated, but I think somewhat profound realizations happening now is that for the last decade, these technologies have been used, but in small places and usually directed against poor people and communities of color in heavily policed areas. Today, we're seeing the aperture of surveillance widen to people who were normally not targeted: more privileged, whiter communities that had largely avoided the surveillance lens directed against certain communities. But I think it's an opening to have a national dialogue about whether we want this for our country, no matter who it's being targeted at. Right now, ICE and the FBI are largely unregulated by Congress in how they can use the technologies of surveillance, be they mobile facial recognition technologies, data broker technologies, or location tracking technologies, and this should be a moment to have a conversation, because those same technologies that are going after protesters in Minneapolis can also target the MAGA people in four years, when the political winds have shifted, or anyone else. And I think this actually should be a bipartisan movement: this isn't the federal power we want directed against citizens, unbounded, unchecked, and able to be targeted against anyone depending on who's in power.
Justin Hendrix:
There are a bunch of different things that you suggest folks might want to get up to in order to advocate around these issues, everything from supporting journalism to generally supporting political movements of resistance. I want to ask you a little bit about what you suggest for individuals as far as a conceptual framework for how they might engage in this. But I also want to make sure I bring in one element of what you cover here, which I have always been interested in and wondered about when we might see more of. Again ... And I don't think you are in this book encouraging this necessarily, but you do have a section on sabotaging surveillance, on the idea of sabotaging data streams, sabotaging the collection of information. What do you think individuals should be doing when they think about how to respond to the phenomenon that you're talking about?
Andrew Guthrie Ferguson:
The book is set up in three sections. The first sets up the problem in terms of how smart data is changing our lives: our smart things, our smart homes, our smart bodies, and even smart cities. The second part talks about the true problems of that, about power and privacy. But the third is offering solutions. And I have solutions directed toward judges, solutions directed toward legislatures, and then, as you mentioned, solutions directed toward individuals. And candidly, individuals are the hardest group to give concrete, hopeful solutions to. Why? Because you and I can't negotiate with Amazon. We can't negotiate with the FBI. As individuals, we are largely powerless against the data-driven technologies directed against us. But the one thing the book is meant to do is encourage people to get involved and get engaged. If you look around your life right now, at the digital devices in your kitchen, in your home, in your car, on your wrist, recognize that that makes you involved in this fight, because the data you are revealing can be used and may be used against you.
And so if you have that stake, you then have to go to the places that have the power. Maybe it's your Congresspeople, who could act. Again, we can recognize that Congress hasn't exactly been all that fruitful, but the book actually proposes not a regulation about data privacy or data protection; it's talking about a narrow subset of that, regulating how the government, the police, can use this data against us in a court of law. It's actually far more limited than the larger problem that has paralyzed Congress. It also talks about encouraging journalists and podcasters and other people who are out there spreading the word. We would not be having this conversation if journalists hadn't been doing their jobs for the last 15 years. Almost everything in my book is from a journalist who did the hard legwork of figuring out what was happening, where the money was flowing, who was benefiting, and exposing this. And so supporting journalism, which is a critical need right now, is important.
And this entire problem is self-directed in the sense that anything we are not doing is also on us, because we have not chosen to tell our representatives to change these laws, and that could change. It's not foreordained who becomes the elected local, state, or federal representative. We can actually push people to say, "We need to have more protection of our data. We should be able to live in a world where we benefit from this consumer wonder, but also not be worried that the government can turn around and use this data against us."
As to the question of how one can do things like sabotage your data: that's a strong word, but what it means is this. We go about our lives thinking that our digital devices, our smart devices, are on our side. We actually think we are aligned with the companies providing them, because normally the reason you bought something is that you think it will help your life, make it more efficient, better, more insightful about your sleep patterns or whatever it is. I think it's important to recognize that we're not aligned, that you are at the whims of a billionaire owner of a company that also has contracts with the government and is happy to sell you and your data out should it threaten those bigger numbers. And to live your life recognizing that the data you give these devices doesn't have to be completely truthful and honest. It's your data, you can do what you want with it, and you might want to take some precautions about giving too much information.
One of the smaller but very current realities right now is this move to agentic AI, where people are uploading all of their personal data to a chatbot to go buy their tickets for their favorite shows, whatever it is. But that idea of giving all your information to something you think you can trust is probably a mistake, because you probably shouldn't trust that this data system will be protected until there are laws that say, "Okay, fine. We're protecting this. The government, the FBI, can't go get all of the data you just gave your new agentic AI." Then maybe you can feel more comfortable giving all that information to the AI.
Justin Hendrix:
I realize I'm going backwards from the order in which you present these in the book, but I hope at least the listener will hear my logic in it. So let's assume that individuals do respond and begin to advocate for themselves and to request better protections from lawmakers. Let's talk about legislatures. You try to square a couple of circles here in the book. You're trying to avoid the trap, as you said, that Congress has fallen into in not being able to get to a comprehensive federal privacy bill over the last few years. What do you recommend that legislators do? How should they be thinking about the problem of privacy, and what should they do in the near term?
Andrew Guthrie Ferguson:
Well, I think if you focus on what data can be used against you in a court of law, there are a couple of already established limits in place. So for example, we do allow, and have for the last several decades allowed, federal law enforcement to literally put a microphone in your house and listen to what you are saying to your family and perhaps your compatriots. This is a wiretap, and there's literally a federal law, the Wiretap Act. It is only used, pretty much, in more serious cases where there's actually a very good reason. There are both procedural limits to get it and procedural limits to constrain it and minimize the data of other people that might be heard. And yet we've been fairly comfortable with this incredibly invasive power. Just think about that: right now, if you are under a warrant for a Wiretap Act investigation, someone could be literally listening in to your conversation in your kitchen. That's invasive. But the reason we've been okay with that is that the balance of power, how we use it, when we use it, and what the procedural protections are, was there.
And so one of the proposals in the book is to have a Wiretap Act-like law. Both strong and weak versions are discussed in the book. It would limit the use of certain technologies and certain access to information, based at least on the use of judges and a higher-than-probable-cause standard and the rest of it. But it's actually very vanilla. It is built around an existing law that has worked fairly well, just applied to new technologies in a new way. I also have a more radical suggestion of privileges.
So one of the things we do in law is we sometimes say there's something else more valuable than evidence to use against you. So when an attorney and their client talk together, it is protected by the attorney-client privilege. When two married people are communicating, the communication is privileged. And it's not that what is being said, "I did the murder," isn't wonderfully helpful evidence to the prosecution; they would love to use it. It's that there's another value that society thinks is more important, namely the marital communications or the attorney-client privilege. And I think we could build out an equivalent with data. There's some data that probably should be privileged such that it can never be used against us. And that's radical, because right now the existing law is that if you create it, it can be used against you. But I don't think that is really the way we go about our lives. I don't think we think that we are constantly giving up data that could be used against us no matter what we do.
And I at least want to have the debate. I want Congress to debate whether there are places that are too privileged. And they, Congresspeople, are just as vulnerable. If you want to track a Congressperson with all this information, if you want to find out what they've been Googling, what they've been buying on Amazon, who they've been meeting with, it's actually all available, and they too should have some protections, despite the fact that they're pretty privileged.
Justin Hendrix:
So I just want to underscore for the listener: what you're talking about here is essentially making the problem right-sized in a way that it can be solved. You talk about the idea that the many complications around thinking about privacy and conceptualizing digital rights are often overwhelming. These things fall apart; there are industry interests, et cetera. We're talking about legislating digital evidence, what literally can be brought into a court.
Andrew Guthrie Ferguson:
Exactly. And the reason is not that I don't want a global federal data privacy law or data protection solutions. We need that, but it's complicated and it's hard. And we've seen for the past decade-plus that Congress is not capable of doing that in the way we'd want them to do it, at least not quickly enough. And so in the short term, and as someone who approaches the world as a criminal procedure and criminal law professor, I think there is a smaller, still hard but smaller, way of moving forward that takes off the table how this data could be used against us in a criminal matter.
Justin Hendrix:
Let's talk about judges. Let's talk about courts. Judges are often on the front lines of government overreach and of dealing with abuses of power and of rights; we're seeing this, for instance, again in the case of Minneapolis. What should judges do? You say they're not technologists. They may not have the time or inclination to rethink foundational theories of constitutional law, but what must they do here?
Andrew Guthrie Ferguson:
Well, fortunately, I have done that thinking for them and it's all in the book. So all you have to do is read the book and then apply it. But my point is that judges can interpret the law, interpret the Fourth Amendment, in ways that are far more protective than they have. And part of the reason we are where we are, in a rather unregulated world where the technology speeds ahead and there's nothing pushing back, is that judges have been reluctant to interpret the Fourth Amendment in a way that protects individuals. Now, there are some exceptions. In the Carpenter case, Chief Justice Roberts said our cell site location data, basically the phone location information that you have, requires a warrant to get access to it. The geofence question is coming up before the Supreme Court this term: whether the equivalent, the Sensorvault, the data that Google collected on all people with a Google-enabled app, is available to police without a warrant, or separately if they go about using the corporate warrant system that Google created.
But it's a big deal, because it means that if the Fourth Amendment doesn't extend to protect locational data in particular places, what you're seeing in Minneapolis or anywhere else can happen without a warrant, without even cause, just because the police are curious about what you're doing. If the Fourth Amendment doesn't apply and there's no federal law protecting against it, it's wide open. People can just do it to track an ex-girlfriend, or to see what a judge is doing, or to develop evidence against someone because they don't like them. And so if there's no constitutional protection of this data, it is fair game for law enforcement to use and potentially abuse.
Justin Hendrix:
You talk about the Fourth Amendment Is Not For Sale Act early in the book. And I want to talk about that in particular, and about data brokers and the way that law enforcement and other entities have access to so much. We're seeing that again in Minneapolis, the federation of that data into some of the systems that ICE and CBP are using. One thing I always remember about the Fourth Amendment Is Not For Sale Act, which seemed to be making a little bit of progress as recently as 2024, I guess when you were writing this book, is that the Biden administration came out announcing that they strongly opposed it. And I'm reading from the letter that the White House sent out of OMB: because it would prohibit the intelligence community and law enforcement from obtaining certain commercially available information. Why in particular do you think both parties refuse to do anything about this particular problem?
Andrew Guthrie Ferguson:
I think there are different reasons for the different administrations, but I think both administrations are at fault for not addressing the dangers of surveillance. The Biden administration, in addition to fighting a very reasonable law that says we should close the loophole whereby police can simply buy the data they would otherwise be required to have a warrant to get, which seems like pretty common sense, spent a ton of money, COVID money, infrastructure money, building out real-time crime centers in many cities that otherwise couldn't afford them, and developing the equivalent of video analytics, true real-time surveillance systems, in places that would otherwise not have them. And they, I believe, thought that the surveillance systems they were building would not be misused for political ends or against protest. But of course, that norm, like many other norms, has disappeared in an administration like the Trump administration that is looking to weaponize these surveillance systems.
The irony, of course, is that Donald Trump's own data was used against him. If he were to read the book, he would actually see that many of the concerns he raised about the Biden administration's perceived weaponization of data against their political enemies actually happened to him. And to me, if you could see that, it might be the place where there's a bipartisan recognition that because we cannot always be in power, we should set up rules beforehand to prevent the misuse and abuse of this data. Because whatever is happening now against perceived progressive people in Minneapolis can be, and likely will be, used against other people in future administrations and future times in America. So we should come together to recognize that there need to be ground rules about how to use this, because the data reveals all of us, everywhere, in everything we're doing. It doesn't have a partisan slant; it will just reveal it. And so I think it's a time to come together to have this debate. And I think more people today should be supportive of the Fourth Amendment Is Not For Sale Act, because it's definitely addressing a very real and pretty obvious loophole in the existing legal structure.
Justin Hendrix:
You've reminded me of an article that sticks in my mind from NBC News in April of 2025. The headline was "Inside the DHS Task Force Scouring Foreign Students' Social Media." And somewhere about halfway through the piece ... The journalist is Julia Ainsley. She writes that the data analytic tools now being used to scour social media were enhanced during the Biden administration, according to a former Biden administration DHS official. And then she has a quote from that official: "We were not targeting political activity or speech. We would only review them if they were inciting violence." And he goes on, apparently, to say that what we're seeing now is different from what the previous administration intended.
Andrew Guthrie Ferguson:
And that's the danger. If you build it, it will be used. And right now, we're building data collection systems everywhere in our lives. Almost anything ... You can't even buy a car now that's not a smart car, one that is revealing everywhere you go and, by inference, what you're doing there. And yet we haven't built systems governing when that data can be used and for what purposes. And I think we might be able to agree that there are carve-outs. Maybe we'd like to privilege First Amendment-protected activity. Maybe we'd like to privilege medical and family activity. Maybe we would like to privilege certain areas that we think shouldn't be targeted. Take automated license plate readers: maybe we could agree not to put them outside churches and mosques. Maybe we could agree not to put them outside gun ranges and places where people might be concerned that their Second Amendment rights were infringed. We could come to terms, if this technology is going to exist, about when it should be used, how it should be used, and when it can't be used. But right now, we don't have any of those rules. And so they're being used by whoever controls the political powers.
Justin Hendrix:
I want to ask you about the tyrant test. This is how you finish this book. This idea of being able to put in place a question that we ask about the technologies we develop and deploy. How do you describe this in your own words? What's the tyrant test?
Andrew Guthrie Ferguson:
The tyrant test is both a metaphor and a practical plan of action to figure out what you would do if the tyrant had your information. It basically says: assume the worst. Assume that the tyrant is reading your Google searches, even the embarrassing ones. Assume that the tyrant has access to everywhere you drove, and your heartbeat as you were driving there because you had a smartwatch on. If that is the norm, what would you do to build out protection? There's not one simple answer. There have to be legislative protections, judicial protections, community protections, individual protections. But as I say in the book, we have a good model, which is America. America was built on the tyrant test. We recognized that the centralization of power would be a danger. And so we built federal, state, and local counterbalances of power. We had juries and grand juries and citizens to respond to criminal concerns. We had the different state and federal powers pushing back and forth. We have a Bill of Rights that gives, or at least did give, actual rights and actionable protections. And we built a system to protect against the perceived tyrant.
Now, of course, when I wrote the book, I didn't think we'd be as close to that authoritarian reality as we are now, but the lesson holds. This is an example in history that we're living through now, but there could be others. And we should be able to recognize that we need to build these structures beforehand, before any one side has control, so we're not doing it in a partisan way. The book is actually filled with lots of stories that are pro-investigation, pro-law enforcement, about how data actually solved a hard murder or a horrible sexual assault. It tells stories about why we want these conveniences.
The story that gets me the most: I think we can all agree that smart medical devices are a revolution, so important for health and our future. Having a smart pacemaker that literally monitors your heartbeat and gives that data to the doctor to save your life is a good. That's a good thing for society. But I think we might also be troubled that detectives can go to the doctor's office, get your smart heartbeat data, and use it against you in a court of law. I think we should be able to have both the creation of a smart pacemaker and a prohibition that the data will not show up in a court of law to be used against us. And it has been used that way, in a case where they were actually trying to undermine someone's insurance claim in an arson case by showing his heartbeat didn't correspond with his story. To me, we should be able to have both.
And what the tyrant test does is say we need to set up these protections before any of this happens: regulations with legislative rules in place, a strengthened understanding of how the Fourth Amendment is supposed to apply, community checks about who green-lights the surveillance on our streets and whether we really need ALPRs in every community and every place, and rights and remedies for when these things get abused. Because of course they're going to get abused; that's the history of all technology and all policing. They always get misused. The question is, what can we do about it? We need to have all of those systems in place, similar and parallel to the systems we tried to put in place in America to limit political power, so that that contest of power will keep a tyrant from misusing it.
Justin Hendrix:
And I like that you suggest the United States itself was built on the tyrant test. I think we have perhaps strayed from that. You talk about the spirit and structure of the US Constitution and how it could apply to the digital world, but it's almost like you're suggesting a wise paranoia about the concentration of power, this idea of trusting no one. And you point to other scholars, Simone Browne, Ruha Benjamin, Chris Gilliard, who we've already mentioned here, and others, on this problem of the surveillance that has, as you say, always shadowed Black lives from before the Civil War to after the Civil Rights Movement. Talk about trust, this idea here. Maybe what we're seeing now is the beginning of a recognition that no one should trust the concentration of power we've allowed to accumulate, whereas for a lot of people it was in the back of their mind. It wasn't a day-to-day problem; no one was coming to their doorstep or stopping them in the street. And now that's beginning to happen.
Andrew Guthrie Ferguson:
So first, I'm deeply indebted to those scholars for their work on race and surveillance and for bringing the lived experience of Black lives under surveillance to the fore. And I think they would acknowledge the surveillance lens has always been directed against certain communities, usually Black and brown communities. What is changing now is, again, that the aperture of surveillance is expanding, such that people who had trusted companies selling them consumer surveillance, or had trusted the government not to misuse its power, are now being betrayed by that trust. And it's a false trust. It never should have been trusted. I think if you were talking to the scholars who write through that lens of racial injustice in America, they would say no one should have trusted it; just look at who was targeted. But what I think is interesting about this moment in history, with ICE and Customs and Border Protection killing otherwise privileged white people on the streets, and other people coming out to protest and being surveilled, is the recognition that no one should trust that police power won't be misused. That distrust is the correct approach, and we all should go back to that distrust mode, which, again, was part of America.
The Bill of Rights in the Constitution was a distrust of federal power. It was a concern that by creating a US Constitution, we'd be replicating the problems that happened under Britain, which is why we had the War of Independence. And the Bill of Rights was a check on government power. I think we have largely ... or, I said we; I think certain parts of our community have largely trusted that the police would not misuse their power and would only use it to go after the really bad guys. And now they're seeing that they themselves are considered part of the really bad guys, because they're speaking out against what the federal law enforcement entities are doing.
Justin Hendrix:
I want to ask you another question about the recent immigration crackdown and what we've seen in Minneapolis and Chicago, in Los Angeles and Charlotte, and in many other communities across the country. Are courts taking steps that you think are moving us in the right direction? Are you seeing enough judicial resistance? Are there specifics you might point to that you're watching more closely than perhaps the average listener?
Andrew Guthrie Ferguson:
In terms of the use of surveillance in immigration enforcement, we haven't seen the courts do all that much. Courts are responding to the physical side, going to people's homes, stopping people in the streets, searches, arrests, detentions, but we haven't seen the surveillance itself litigated. And the thing I hope judges see, and I hope people see, is that the same logic that might allow ICE to have a national database to track anyone everywhere and find people is the same power that will be given to any law enforcement agency. First the feds, but then it'll trickle down to the states, so that police will have that same power to go after people who were protesting, people who were shoplifting, people who were violating the law. And we have to ask whether we really want that centralized power in America.
It feels very un-American. It feels very much like the thing ranchers and farmers would be protesting, that the federal government would have these databases and this information on us. And yet we haven't seen that response, because it feels isolated, initially, to immigration enforcement. But the point I want to make is that everything being used there will be used against you tomorrow. There's no law that stops it. Maybe there's a norm we hadn't broken before, but those norms are changing and breaking in record time.
Justin Hendrix:
I guess my final question for you is, what gives you some optimism that we will perhaps address these things? You've mentioned individual responses, and you're encouraging certain types of individual responses in particular. Clearly we need some recognition and reckoning. This has to become an issue that people care about when they go to the polls as well as when they go to the streets. Are there signs out there, signals, green shoots I guess everyone calls them, things you're looking at that give you some hope that some of this will advance in the next little while?
Andrew Guthrie Ferguson:
So I've been writing this book for several years now, and the thing that concerned me is that my warnings about the potential dangers would not be heard, because there are corresponding benefits when police use the technology to find murderers, or find people they otherwise wouldn't find. And I think what we're seeing now, with the use and misuse of these surveillance technologies by this administration, is proof of the argument that there is a real danger here. That danger hardly sounds like a green shoot. But this danger, which now runs against everyone, is the thing that can build a movement to change it. Again, all of this can be solved if the people in different legislative bodies wanted to do something about it. Some of this could be solved if judges wanted to do something about it. The justices have an opportunity to do something about it when they decide the Chatrie geofence case.
And I think maybe the clear and present danger of surveillance being targeted against people who are protesting and using their First Amendment rights might be a moment where we could push back as a community, in a bipartisan way, and say this is not the world we want to live in, and we need to have some guardrails around how our data can be used against us.
Justin Hendrix:
Well, as they always say, a crisis is a terrible thing to waste. Hopefully this particular crisis won't be wasted, particularly on these issues. Andrew Guthrie Ferguson, thank you very much for joining me to talk about this book. Tell the listener, when's it on sale?
Andrew Guthrie Ferguson:
It comes out March 17th, and you can pre-order it now and support your local bookstore.
Justin Hendrix:
So that is Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance, from NYU Press. Buy it at your local bookstore. I would concur. Thank you very much, sir.
Andrew Guthrie Ferguson:
Thank you.