
Understanding Digital Dragnets: Surveillance in the Age of Smartphones

Audio of this conversation is available via your favorite podcast service.

In this episode of the Tech Policy Press podcast, we’re going to explore how law enforcement and other government agencies in the United States acquire data drawn from commercial data brokers for investigative purposes, and the questions raised by these practices.

This is an issue that is still in question in the nation's courts and under active discussion on Capitol Hill. For instance, this summer the House Judiciary Committee hosted a hearing it titled Digital Dragnets: Examining the Government's Access to Your Personal Data. At the hearing, expert witnesses testified that government agencies at all levels, including federal agencies such as the Department of Homeland Security (DHS), the Central Intelligence Agency (CIA), the Internal Revenue Service (IRS), and the Department of Defense (DOD), as well as state and local law enforcement, are collecting massive amounts of personal data on American citizens, sidestepping the Fourth Amendment's protections against unwarranted search and seizure. The hearing included discussion of the proposed Fourth Amendment Is Not For Sale Act, which would restrict government entities from engaging in such practices.

But while the courts and Congress deliberate, government agencies are acquiring this information from software providers, including one such firm that was the subject of a recent investigative report from the Associated Press titled 'Tech tool offers police mass surveillance on a budget.' Today, I'm joined by the two reporters who spent months trying to understand how a little-known company in Virginia goes about acquiring commercially available data and selling it to police departments across the country: global investigative journalist Garance Burke and national investigative reporter Jason Dearen.

What follows is a lightly edited transcript of the discussion.

Justin Hendrix:

I’m very grateful to both of you for joining me today. Before we get started talking about the specific story that I’ve invited you on to talk through today, I’d love to just know a little bit more about your beats and how you go about the work of investigative reporting at the Associated Press. So, Garance, I’ll start with you since I know you are specifically concerned with the intersection of tech and society.

Garance Burke:

Sure. So, I'm leading an AI initiative at AP, looking at the impacts of surveillance and predictive technologies on our communities, working with colleagues around the world and across the United States to really try to understand how AI is popping up in all of our lives. So, this is part and parcel of one of my main areas of focus as a journalist right now.

Justin Hendrix:

Excellent. I understand you have done some work with the Pulitzer Center and Stanford in this regard as well.

Garance Burke:

So, I conceived of this project after finishing a fellowship at Stanford University. It was a joint fellowship between the John S. Knight Journalism Fellowship and the Stanford Human-Centered AI Institute. So, it just seemed to me really important as an investigative journalist for folks to understand more about AI, how it works, how it doesn’t work, and how it is omnipresent in our various levels of existence these days. So, that’s something that I pitched to AP upon returning and was super glad to have Jason join in on this investigation.

Justin Hendrix:

Jason, your work really runs the gamut from white supremacy to the environment, and you've got a book under your belt.

Jason Dearen:

Yeah. I'm a generalist, but over my career, I've had a number of themes: the environment, public health, criminal justice, and science. I was a 2018-2019 Knight Science Journalism fellow at MIT, where I worked on my book, which is called Kill Shot, for anyone interested. And I've been keeping my fingers in those subject areas: public health during the pandemic, when I did a lot of related investigations, and then criminal justice most recently, as it pertains to white supremacist infiltration of law enforcement agencies. So, those have been the things I've been working on before Garance and I got together on this project.

Justin Hendrix:

So this story, which came out just before Labor Day, 'Tech tool offers police mass surveillance on a budget,' I suppose combines your curiosities nicely, but let's talk a little bit about how it came together. We'll get into the details of Fog Reveal and Fog Data Science, the company behind this product, but tell me a little bit about how you got started down this path. I understand it started with an approach from the Electronic Frontier Foundation, which had come across some records?

Garance Burke:

That's right. So, a contact over at EFF had reached out to me and said they thought they had some documents that might be of interest. That was back in January. And by February, Jason and I had started poring through this cache of thousands of pages of records that EFF had obtained via the Freedom of Information Act. Now, this was really the starting point for our investigation. We had to categorize the records and make sure we understood what they were.

By and large, they were contracts, emails, and presentations from police and law enforcement agencies that had used this particular surveillance tool, Fog Reveal, and they included conversations that police were having with the company and amongst themselves about how to use it to surveil people in their communities. But then we also had to start thinking about how to get beyond the records. So, Jason, do you want to talk a little bit about that?

Jason Dearen:

Obviously, we were reading emails and these records from various law enforcement agencies, and we were looking for any evidence of specific cases that Fog was used in that led to an arrest or anything like that. So, that's where we started. I also started reaching out to defense attorneys to see if they'd heard about this. There had been one interesting legal development when we first started this effort, in Fog's home state of Virginia: a district court ruling earlier this year found geofence warrants unconstitutional because they cast too wide a net. I think it was related to a 2019 bank robbery case there, where police had used a geofence and ended up gathering all this data on bystanders.

So, we had that one legal opinion to look at, but obviously, tech outpaces the court and legal systems so much that there wasn’t a lot of other legal framework to look at here. So, I started talking to defense attorneys around the country to see if they’d heard of this or if it had come up in discovery in any cases. We were really striking out, they hadn’t heard of it, but that piqued our interest even more.

Justin Hendrix:

So ultimately, I understand, you were able to identify at least 40 contracts with this company, across a couple of dozen agencies out there. Tell us a little bit: what is Fog Reveal, and what is this company Fog Data Science, based, as you say, in Leesburg, Virginia?

Garance Burke:

So, we were really interested in this relatively obscure company that, as you said, is based in Leesburg, Virginia and has some related corporate entities in several other states; so little was known about it. We were able to figure out that they were very much networked into law enforcement associations, including associations bringing together Department of Homeland Security-affiliated fusion centers. That was really where they focused the marketing for their tool, called Fog Reveal. We wanted to go deeper into who the people behind this company were, how the technology worked, and most importantly, what its impacts were in our communities.

Justin Hendrix:

So just for the listener's sake, maybe you can give me a sense of how the system works. I understand it's web-based, so quite literally a police department buys a contract. And then what happens?

Jason Dearen:

So, we really tried in many different ways to get someone to demonstrate this for us. We knew from public records that police agencies were using this tool. They had it. They talked to us about it a little bit here and there, but no one would do that. So, what we were able to glean is that there's a web interface where the police officer, detective, the license holder, can go in and put in the coordinates for a specific crime scene, and that will generate something called a Fog ID. What we were able to figure out about the Fog ID is that it was basically the company taking some other information, either a Venntel ID from that company or some other advertising identification number, and just putting a new number on top of it. They would call that the Fog ID.

So, then they could get a sense of which devices were at which locations. What we were also able to learn is that they could search back in time, and it's still unclear to us, I think, exactly how far back they can go. One email says they could go back to June of 2017. The marketing materials say 180 days. And then after the story ran, one of the executives for Fog emailed us and said it could go back three years. So, it can go back in time somewhere between 180 days and three years, it seems. So, that's how it works.

It sounds pretty simple to use. I think where it gets more complicated is when clients, police, want to dig deeper into a specific device ID and start creating a pattern of life and following that around a little bit. You see in the emails that they will go and ask for help from Fog sales reps and executives, and they go back and forth. A lot of times Fog is helping provide some analysis for them as well.

Justin Hendrix:

Well, let's see, maybe I'll come back around to DHS in general later. Let me pick up the thread of this conversation about how the thing works. So, you mentioned in the piece a couple of different types of consumer data that are federated into the system. One example you mention is Starbucks app data. Another is data from Waze, the popular map app, of course. So, essentially, this company is federating all sorts of location data brought to it by Venntel and presumably other sources.

Jason Dearen:

Yeah, that's right. So, the way we understand it, based on what Fog would say in its marketing materials about where it gets its data, is that there are at least hundreds of apps that are gathering, or hoovering up, data: sales data, ad ID data that these apps provide to Venntel. We believe it is Venntel, whose parent company is Gravy Analytics, that Fog obtains its information from.

But when we went to Starbucks and Waze and talked to them about this, the companies said they had never heard of Fog and had given no permissions for their business partners to sell the data to Fog. So, there was a disconnect there. Even though the apps are designed in a way that allows this ad ID data to be gathered by data brokers, the companies claim to have no knowledge whatsoever of the specific end users of that data.

Garance Burke:

That was a point of interest for us, Justin, because we went to Venntel and asked them to explain how it was that Fog got its data, or what the relationship between the companies was. Venntel wouldn't comment, saying the confidential nature of its business relationships prevented it from responding to our questions. Fog wouldn't comment on it either. So, one thing that we're interested in continuing to look at is the ways in which many of these data brokers operate in relative anonymity. We're very much interested in understanding some of the public policy implications of having companies that are so difficult to trace hold such intimate details about people's lives.

Justin Hendrix:

Of course, anybody that’s looked at the ecology of data brokers in this country knows that that’s an incredibly difficult thing to track and to understand.

Garance Burke:

Yeah, absolutely. I mean it’s the subject of multiple different congressional inquiries right now. The Department of Homeland Security’s watchdog is auditing how offices under its control have used commercial data. So, I think it’s definitely an area that we’ll be interested to continue to watch.

Justin Hendrix:

Also, the subject of an FTC rulemaking hearing just this week, and as you say, multiple other hearings in Congress. I believe there was another in the House Judiciary Committee a couple of months ago as well. I understand the company is also promising, at least in some of its materials, that it can do predictive analytics.

Garance Burke:

Right, that's another area of interest for us. Of course, predictive analytics is a buzzword that's often used to describe different kinds of high-tech policing tools that purport to predict crime hot spots or even who may have criminal predilections. The company did not answer our specific questions about its predictive analytics, which is, again, a capability that Fog claimed multiple times in its brochures, even as recently as last year for members of the National Fusion Center Association. But when we asked them, they provided no details about any uses the tool had for predicting crime and said that they had not invested in predictive applications.

Jason Dearen:

Yeah, one thing I'd add to that, too, is that with a company like Fog, and given how relatively young they are, it's hard to know how much of what they were stating in their marketing materials was boasting or plans for the future: were they going to invest later in some predictive analytics, things like that. Also, even in naming Starbucks and Waze in their marketing materials, those are two big brands.

So, one interpretation is that they were using these brand names to show their potential customers what kind of data could be included in what they were gathering up and drawing from, rather than data specifically from these two companies. That was something that we kept in mind as we were reporting. That's the difficulty, as you mentioned, of reporting in this area: the companies don't want to talk about it. The customers don't want to talk about it. Oftentimes the victims don't know they're victims because there's no information about this in the court records. So, yeah, it's definitely a very shadowy area in which to try to glean facts.

Justin Hendrix:

And yet in fairness to the company, you do write that Fog’s Broderick said in an email that the company does not have access to people’s personal information and draws from commercially available data without restrictions to use and from data brokers that legitimately purchase data from apps in accordance with their legal agreements. Is there anything necessarily against the law about what this company is doing when it shares this information with law enforcement?

Garance Burke:

Well, I think one of the interesting things that we found through our reporting is just that this is such an evolving legal landscape. So, as you were saying, Justin, there are multiple hearings that have been going on before Congress. The FTC is really examining this in great depth and, in fact, recently sued a data broker over similar issues. But Fog says that what they do is absolutely within the bounds of the law. Of course, the Electronic Frontier Foundation and other privacy advocacy groups say that's not the case.

Justin Hendrix:

One of the voices that comes forward in the story is Davin Hall, who you say is a former crime data analysis supervisor in the Greensboro, North Carolina, Police Department. I grew up not far from Greensboro, so I noticed that one in particular. Tell us his story a little bit. How did you come across him?

Garance Burke:

So, Davin had written a blog post about the concerns that they had raised while working at the Greensboro Police Department about Fog and about other technologies they felt were really intruding on people's lives, but had then really disappeared from public view. So, we tracked down Davin and had a whole series of conversations with them about the concerns that they had raised over and over again within the Greensboro Police Department. This is someone who held a relatively high post there as a data analysis supervisor.

Davin raised the concerns to lawyers at the Police Department, they said, as well as to the city council. Those fell on deaf ears. Ultimately, Davin ended up quitting in part just out of concern for the uses that this type of technology was being put to in the community.

Jason Dearen:

With so few public mentions of Fog out there, Davin had also written a letter to the city council raising concern about this issue. It was one of the public documents out there that mentioned Fog that would come up in a search. So, that was something else that tied Davin into this.

Justin Hendrix:

So, I understand from your reporting that this service is not totally inexpensive. You report contracts of around $7,000 to $9,000 for access to the service, but of course, that puts it well within the capability of most police departments across the country, I'm sure, to purchase as an enterprise subscription. In terms of the scale of it, how far the company has gotten with its efforts so far, do you have a sense of how big this company is? How many departments it's working with?

Garance Burke:

Of course, we’d love to know more from the company itself about its customers but weren’t able to get a ton of specific details. We do know that agencies as small as rural Rockingham County, North Carolina, which you might know better than we do, Justin, has a license. It’s a rural part of North Carolina. Recently, the Dallas Police Department, one of the top 10 largest in terms of number of sworn officers, signed up for Fog Reveal. So, I think there’s a real span of agencies that use Fog as a part of their investigatory toolkit, but really we’re very interested in hearing from other agencies where this may have cropped up perhaps on a free trial basis that might not be memorialized in those contracts. We also are interested in hearing of other uses perhaps in the private sector where Fog has popped up.

Justin Hendrix:

So, I wanted to ask you just a little bit about the backgrounds of the founders, who I understand are former Department of Homeland Security executives?

Jason Dearen:

Yeah, Robert Liscouski worked in the George W. Bush administration, as did Matthew Broderick (not the actor, but the executive). Broderick was involved in the Hurricane Katrina response and later resigned due to the slow response of the agency under his command. Liscouski has some background in a cyber command department, which we were not familiar with; I'm still not familiar with what they actually did. But other than those pieces of their background, there wasn't a lot of information about why they started Fog, how Fog got off the ground, or where the genesis for the company came from.

Garance Burke:

But we were interested that Liscouski and another Fog official had previously worked at companies focused on predictive analytics or machine learning, different types of software platforms supporting artificial intelligence. So, Fog seems to be definitely within their wheelhouse, but we were just unable to get a ton of details about exactly how it came to be.

Justin Hendrix:

I want to ask you a little bit about the response to the story. It’s only been a week since it published from the time that we’re speaking today, but you mentioned that you’ve had other information that’s come along since. What has been the response, including from some of those privacy advocates or lawmakers? Have you seen signs that this reporting may have impact already?

Jason Dearen:

The response online has been very strong in terms of people being freaked out about the potential for misuse of this tool. But in terms of response from lawmakers or people who could get involved in creating some legal solution to any privacy concerns raised by Fog's tool, we hadn't heard anything back yet, unless I'm missing something, Garance. I don't know if you know of any outreach from people. The Queen dying, I think, has everybody's attention right now, so we can forget about these scary tech tools.

Garance Burke:

Yeah, I mean, I think we're able to say it's been read at very high levels of the US government, and we've been really gratified to see the response and the interest from audiences. One of the things that we worked really hard to do, Justin, was to include examples of where Fog has been used. We spoke with an Arkansas prosecutor who said that this was really very useful for cracking a murder case in Arkansas, the murder of Sydney Sutherland, a nurse. It was also useful, according to emails we got from the Missouri State Highway Patrol, in another high-profile murder case involving a snake breeder.

So, I think as people get a better sense of how this is actually showing up on the ground, we may hear from more local officials who have concerns or perspective that they want to share. I know that the city councilman in Anaheim, California, who we spoke to felt that he never got anything like the details he would've sought before hearing that his city had deployed this tool. So, we're going to stay tuned on this.

Jason Dearen:

We'll be really interested to see, too, what Waze, Starbucks, and other big brands and companies with these kinds of apps do now that they know their data is being used by data brokers, sold, or repackaged. I'd be really interested to see, based on their response to us, if there is any action, cease-and-desist letters or otherwise. Because publicly and in their statements, what they told us was that no permissions were given for their data to be used in this manner, for this type of surveillance by police. So, that's another thing we'll be watching closely.

Garance Burke:

We’re also interested to see what Fog decides to do in terms of its marketing. This is a company that has had a lot of penetration in the law enforcement market, but you never know. They may decide to pursue additional types of customers as well.

Justin Hendrix:

So of course, my listeners are aware that the Fourth Amendment Is Not For Sale Act is in debate at the moment. At least in the last hearing where I understand it was discussed, there seemed to be bipartisan support in the committee for that particular piece of legislation, which would address some of the questions here and maybe, on some level, resolve some of these issues: what are appropriate use cases that truly do lead to good outcomes in law enforcement, and what are the frightening phenomena that give people such cause for concern about law enforcement overreach in this regard.

But right now, I mean, this story does seem to fit a trend of stories about law enforcement agencies adopting technologies, often for relatively little investment, that give these law enforcement entities really extraordinary, almost science-fiction-like surveillance powers. I'm thinking also, of course, of Clearview AI, which is much publicized.

Garance Burke:

Yeah, I mean, for us, it was just relatively shocking to hear that local police had the capability to reach backwards, we understand, three years in time and figure out perhaps where each one of us slept every night. This is something that we talked about years ago as in the realm of the possible, and indeed it is happening in the present day. Fog says, of course, that all the data is anonymized, so there's no reason for people to be concerned, and has called folks who raise issues with this members of a 'cult of privacy,' in the words of one Arkansas prosecutor we spoke with who is tightly tied to the company.

But I think the interesting thing with these privacy issues is that you do see some bipartisan coming together, which is so rare these days, to really interrogate what are appropriate uses of this level of detailed surveillance and what are not.

Jason Dearen:

I would also add that while predictive analytics is a buzz term, like we talked about earlier, finding out where somebody spends their time, where they live, where they work, where they shop, these types of things can in practice be used to predict people's movements too. We were talking about going back in time, but it works going forward as well. So, that was something that I learned through reporting like this: it doesn't take AI or some complicated algorithm that knows where somebody lives to predict where they're going to be. So, it may not be a predictive service in computer jargon or terms, but in practical terms, it is.

The other thing I thought was interesting from this, too, was that geofence warrants, which are oftentimes done in concert with tech companies like Google and Apple, are starting to come under scrutiny by the courts, as I mentioned with Virginia. This tool seems like, at least through the marketing and the way that they talk about it, a way to do the same thing, sometimes without a warrant, depending on the state, because some states are tougher about that, some departments are tougher about that. So, it's just another example of how there's always an end-around, it seems: when one technology, one trend, hits a legal snag, there's always another one waiting in the wings, and then another one. So, this is just the latest iteration of that idea, I think.

Justin Hendrix:

So, if you can’t use cell phone tower data, you go for the commercially available data from the data brokers, the ad tech companies, and get at it the same way.

Jason Dearen:

Exactly.

Garance Burke:

Yeah. I think we've seen this crop up in a whole variety of settings. I've done a lot of reporting in the past about immigration, and this is certainly an issue with Immigration and Customs Enforcement's use of data from data brokers. The ways in which this can also be tied to other records and really paint a very full picture of a person's life and movements is definitely something that we're going to continue to be interested in: how all of these technologies tie together.

Justin Hendrix:

Of course, you bring up as well the possibility of law enforcement, keeping tabs on people seeking reproductive health services in states where it is now illegal. So, a lot to be concerned about here.

What’s next for the two of you on this intersection of technology and law enforcement? Will you continue to report on Fog or is there another story we can look forward to in the near term?

Garance Burke:

Well, we're definitely interested in hearing from any listeners who may have more to share with us about all of these issues. Our reporting absolutely will continue right in this vein, at the intersection of technology and civil rights and technology and human rights, with a number of stories cooking right now. We'll be glad to talk with you about those in the future.

Jason Dearen:

Yeah, and I agree, there are so many unanswered questions. We could only get to so much in this first story. So, I think we're definitely continuing to dig to answer some of those questions. We'll definitely have some stories upcoming in the near future, hopefully.

Justin Hendrix:

Garance, Jason, thank you so much for speaking to me about this story today.

Jason Dearen:

Thanks for having us.

Garance Burke:

Thank you so much for having us.
