Abuse of data comes as no surprise

Post Author: NM Mashurov

At the end of May, five women sat around a table in a hot side room at the New Museum to present a panel about surveillant anxiety. The women were flanked by large posters bearing Jenny Holzer-like proclamations about life under surveillance culture: YOUR SEARCH QUERIES WILL OUTLIVE YOU; ABUSE OF DATA COMES AS NO SURPRISE; PROTECT ME FROM WHAT I FAVE.

The panel, part of the New Museum’s biennial Ideas City festival, was the product of a collaboration between New Inc and Deep Lab, a cyberfeminist research collective formed two years ago, around the time that Edward Snowden and Laura Poitras published the NSA documents, to discuss and develop work around issues of privacy, surveillance, code, art, social hacking, and anonymity. Moderated by Kate Crawford, the panel featured Simone Browne, Jade E. Davis, Biella Coleman, and Karen Levy, who presented case studies about how the lived experience of surveillance is unevenly distributed, reinforcing existing power structures.

Simone Browne and Jade E. Davis both spoke to how surveillance is a deeply racialized institution. Browne traced it from the panopticon-like construction of slave ships to 18th-century lantern laws, which required black, mixed-race, and indigenous people to carry lamps with them if they were to go out after dark. Davis expounded on her piece, “Black Men Being Killed is the New Girls Gone Wild,” which addresses the pleasure taken in the surveillance of others and draws parallels to families recreationally attending public lynchings. Both histories persist in a post-9/11 world where non-white bodies are at once hypervisible and coded as dangerous.

Other topics included branding surveillance as care (for example, when marketing surveillance technologies to new mothers or trucking companies), Anonymous’s strategic use of opacity, and performing freedom practices through techniques such as passing or strategic identification.

Other Deep Lab x New Inc collaborations included panels about Anti-Utopias and Strategic Witnessing, drone painting, and a performance by EMA. Beyond this panel, Deep Lab has delivered other talks, including the Deep Lab lecture series (available online), collectively published a book, and continues to develop open-source privacy tools, including a portable mic jammer, a troll-tracking tool called FoxyDoxxing, and more.

I caught up with media artist and Deep Lab founder Addie Wagenknecht as well as Deep Lab members Simone Browne and Maddy Varner to discuss Deep Lab’s origins, current surveillance-related concerns, strategic visibility, and the work left to be done.

What inspired you to create Deep Lab? Why was it important to have it be a specifically female organization?

Addie: Deep Lab was formed around the time Snowden and Laura Poitras published the NSA documents. I was also part of a few collaborative groups, which were all 95% male. I was researching Cypherpunks and groups like Radical Software and saw this was both a historical and contemporary trend.

At the same time, I knew many highly qualified women doing work in these fields and saw a lack of those contributions appearing in the culture. The thing about collaborations is that many people create a much stronger wave together than any one can alone, and hence, Deep Lab was born.

What were some organizations that inspired you? VNS Matrix and subRosa immediately come to mind but I imagine there are others.

Addie: Although they have a different focus, the Guerrilla Girls were always an inspiration for me. I remember discovering them in the early days of the web, when I was around 13 or 14, and I had never experienced or seen this sort of anonymous female group that was so outspoken and powerful within the art world. Riot grrrl and the Black Panther Party have also been central points of inspiration.

This interview is specifically for the Surveillance Week series. What would you say are the most important issues under the current state of surveillance?

Addie: What I find the most scary right now is the trust people have in corporations that they don’t have in each other. The amount of data and personal information we hand to them every day, which people don’t question, scares me. It’s not what they are doing with it now, which is already bad, but what they will do with all that information they have cached on us in 10, 20, or 40 years.

Maddy: My current fears are:
– continual breakthroughs in machine learning research (because contextualizing and understanding massive piles of data is maybe even worse than just collecting it and storing it on decaying hard drives)
– the rising popularity of self-surveillance tools like Fitbit and Dropcam
– and the intense research interest in frictionless payment systems by corporations like Disney.

Simone: UAVs and drones used to deliver our Amazon purchases, or for targeted killings abroad, or just crashing into cathedrals and other places will continue to be important issues, along with “big data” and Quantified Self technologies. But recently it seems, at least in the US, that surveillance as understood through the Patriot Act or Snowden’s revelations of warrantless wiretapping is, for many people, becoming understood as existing alongside the #BlackLivesMatter movement’s call to “stop killing us,” mass incarceration, and the question of whether police should wear bodycams.

What I find the most scary right now is the trust people have in corporations that they don’t have in each other. The amount of data and personal information we hand to them every day, which people don’t question, scares me.

There was a brief moment of excitement when parts of the Patriot Act were expiring, but not that many people seem to be aware that the USA Freedom Act took its place or that it’s basically the same thing. Do you think anything has changed?

Addie: Yes and no. Our awareness as a society has changed; there are more people talking about it than before. It’s no longer a niche issue, and we have WikiLeaks, Chelsea Manning, and Snowden to thank for that.

In a few of the talks on the site, Deep Lab collective members (Jen Lowe, Maddy Varner) have spoken about the culture of volunteering our information for the sake of convenience—how self-exposure via Facebook and other apps is the norm. It’s so endemic that it seems like any concern about privacy is coded as paranoia. Do you see any possibility of subverting that?

Addie: The more people who come out and say they care, who are ‘normal’ people, the bigger the subversion and f-you to those systems that depend on criminalizing them. What I mean is that as a pretty typical-looking woman in Gap jeans, a mother pushing a stroller at the park, and so on, if I come out and am public about my concerns about privacy, it has the possibility to start changing an image that has been so deeply stigmatized by mass media as something only embraced by terrorists and criminals.

Maddy: I think a lot of it is about contextualizing these violations of privacy in a way people can relate to. The reasons I might care about my chatlogs being read are different than Snowden’s, you know? Facebook and all of these social media sites are just communication infrastructure, places for people to talk and share at different levels of intimacy (the tweet vs the subtweet vs the DM). People care when messages get misdirected and mishandled—a nebulous, higher “other” isn’t as concerning as a specific someone getting ahold of specific thoughts and feelings. A lot of people’s best and worst selves are saved on servers in the middle of nowhere.

Simone: We can look to the EU, as the General Data Protection Regulation is adopted there, to say that perhaps concerns about privacy around data are not really paranoia but could be about the Right to Be Forgotten.

Surveillance as understood through the Patriot Act or Snowden’s revelations of warrantless wiretapping is, for many people, becoming understood as existing alongside the #BlackLivesMatter movement’s call to “stop killing us,” mass incarceration, and the question of whether police should wear bodycams.

Seeing Deep Lab manifest is very inspiring, but you’d think that with mass data collection and predictive policing, having it be such a public project would also be quite dangerous. On the other hand, since rebellion is marketable in America, I feel like in the United States people can get away with performing dissent much more than under other regimes. Where do you see the line there? Is visibility a worthy goal or is visibility a trap?

Maddy: First of all, America totally hates rebellion; look at all the violence toward civilians and the arrests that happened during the #BlackLivesMatter protests. If anything, it values and allows certain expressions of dissent, and I think discussion within, and with the support of, large cultural and academic institutions is one of them. For some people within the group, there’s a definite risk because of the work that they’re doing, but, like, I’m statistically not going to be the victim of predictive policing. I went to a job fair at my school and the NSA recruiters laughed at some of the work I’ve done. I’m not actually at risk here.

Because a part of Deep Lab is about identity, it cannot exist without visibility. It is, in a lot of ways, about visibility, because that’s how these issues are heard and like Addie said, that’s how we can continue to operate and interrogate issues like surveillance and privacy.

Simone: I would echo Maddy here to say that only certain rebellion, the kind that can be contained and is therefore not really rebellion, is marketable. And, I would add, Deep Lab’s concerns are not with market forces and how we can fit into them.

But I do like the “visibility is a trap” quote from Discipline and Punish because it can get us to think about the logic of the Panopticon, now that we are 40 years past Foucault writing those words, and maybe to think about other acts of power that do not necessarily rely on the uncertainty of being seen; sometimes subverting surveillance might start with a demand not just to be seen but to be recognized as having the right to have some rights. Sometimes that requires not just performing dissent, but some actual dissent.

Several of the Deep Lab talks (Harlo’s, Maddy’s) culminate in the participants talking about projects they are developing, many of them related to encryption and resistance. What is the status of those? Have they gotten off the ground / are they available?

Addie: We have Deep Lab-ers like Runa Sandvik, who has been doing a lot of public security research; Allison Burtch, who developed a portable mic jammer; and Harlo Holmes, a sort of metadata genius who has been doing some really brilliant work around encryption and geolocation. We host a lot of our work on our GitHub repo.

As a pretty typical-looking woman in Gap jeans, a mother pushing a stroller at the park, and so on, if I come out and am public about my concerns about privacy, it has the possibility to start changing an image that has been so deeply stigmatized by mass media as something only embraced by terrorists and criminals.

In the “Divorce Your Metadata” zine distributed at New Inc, Laura Poitras talks about the affect of writing about surveillance: specifically, feeling physically ill while working on Citizenfour. What have you experienced the affect of this type of work to be?

Simone: I recently saw Oscar Gandy (author of The Panoptic Sort) give a talk on intersectionality at the Surveillance Studies Center, and he mentioned how he, as a black man, would sometimes comport himself in stores so as not to heighten suspicion. As a kid, and sometimes even now, I am still mindful of what my body means in spaces that weren’t designed for me (shopping while black, the outdoors). The idea of having to do this, or of it becoming an instinctual reaction, is of course ridiculous, but the black body is always already criminalized in the US and many other spaces. We live in a time when Mark Zuckerberg can wear a hoodie every day, no problem, yet on the black body the hoodie gets imbued, for many, with certain qualities—danger, for one. All this is to say, I’ve been aware of being under surveillance since I was a child. That’s probably part of what brought me to this topic in an odd way. I did, however, feel physically ill researching plantation surveillance under chattel slavery. But, of course, I was only writing about it.

Ideas City seemed like it went really well, congratulations! What is Deep Lab working on now?

Addie: Right now we’re working on our internal infrastructure and developing some long-term research initiatives. deeplab.net