Queen's University

David Lyon

Watching the Watchers

Dr. David Lyon, Professor in the Department of Sociology and Director of the Surveillance Studies Centre, leans in to make his point: “You know, we need to talk about surveillance and privacy. We need to talk about it in our schools, in our organizations, on campus. We need to say, ‘Do you know what happens to personal data here? Do you care?’ People’s life chances and the choices they can make in life are affected by surveillance.”

[David Lyon]

Lyon has been talking with me for a good hour or so, unveiling a world that most of us ignore in our day-to-day lives, but which is present and persistent in nearly everything we do. We live in a surveilled society where personal information is routinely collected, sorted and classified, often without our knowledge.

I reflect on my own habits. I tend to think of my own personal data as something I keep in my wallet, or type at the top of a private tax form. But listening to Dr. Lyon, a much richer landscape of personal information emerges. I begin to realize that every day I scatter little bits of myself around a digital world. As I walk to my office, my cell phone is not only broadcasting my location, but the trail is being logged. I browse the web, send email, and post to a social media site. All these data can be retrieved, sorted and classified in ways about which I am completely ignorant.

Dr. Lyon had this same kind of revelation over two decades ago while writing a book about the myths and realities of an emerging information society. As a professor and scholar in sociology and history, he was intensely curious about how modern society came about, and about its key moments of transformation. Looking at the interplay between cultural beliefs and the technologies that develop in that culture led him to look critically at the so-called “information society.” While writing his first book on the subject (The Information Society: Issues and Illusions, 1988), Lyon saw a developing interplay between the organizations and institutions that collect personal data and emerging digital technologies.

“When I was writing the chapter on government collection of personal data, I thought to myself, ‘Oh my goodness, this chapter speaks to me about a whole world of things that I don’t really understand all that well.’ And that really set me on a course that I’ve been on ever since.”

Lyon’s course initially led to the creation of the Surveillance Project at Queen’s with his colleague Dr. Elia Zureik (emeritus). The Project received provisional centre status as the Surveillance Studies Centre in 2009 (official status in 2012). The Centre works across disciplines, with involvement from the School of Business, the Faculty of Law, the School of Computing plus other departments in the Social Sciences and Humanities, such as Film and Media.

The complexity of the field under study can’t be overstated, particularly as surveillance is a continually moving target. It is a dynamic system, in which society both responds to and drives the use of new surveillance technologies.

The main interest of the Centre is to study the trends in surveillance – the context in which new technologies arise – beyond the details of this or that technology or internet platform. By understanding the trends, the Centre hopes to inform policy and legislation. For the last five years, the group has been working on a major project, called The New Transparency, including a report, Transparent Lives: Surveillance in Canada, which is nearing completion.

“It’s a great network of scholars around the country and around the world,” Lyon beams. “It’s wonderful to be with people who are both totally intellectually engaged with really difficult problems that are also politically contentious, and where there’s a demand that one thinks ethically about these kinds of issues.”

A case in point is the recent abandonment of Bill C-30, in which the government sought to change how police and other law enforcement officials could gain access to personal online data from telecom companies and internet service providers (ISPs) without a warrant. Lyon shakes his head and remarks, “The degree of unaccountability and lack of oversight in that process was astonishing.”

So in response, at a New Transparency workshop, the implications of the legislation were discussed, and out of that came a public campaign – led by OpenMedia.ca, but informed, among others, by the academic research of Lyon and his colleagues. “I think I’m correct in thinking that one of the reasons that the government pulled right back on Bill C-30 was because of the extent of interest raised on the issue by the campaign.”

Social Sorting

The challenges of surveillance in society go well beyond the traditional tensions of police, state and the privacy of citizens. “We’re all involved in this,” Lyon remarks. “We’re part of an increasingly surveillant context.”

The Internet gives us an amazing amount of information about one another. Every time you check up on a new acquaintance or job applicant through social media, you are conducting surveillance of a sort. We associate ourselves with some people, but not with others, and thus sort ourselves in very powerful ways. But when our self-sorting is exploited through the algorithms and in the databases of huge organizations, the sorting machine thus created affects us in ways we are not aware of.

Social sorting has been around for ages – we size people up, judge their clothing, their neighbourhood, their accents, where they fit into our conception of society. But the sheer power of databases, and the algorithms created to sort people into groups based on those data, puts modern social sorting on a completely different plane.

“The search engine of Facebook is becoming the kind of model of how to do this because, unlike Google, Facebook is relationship based. And so the clustering is being done by us as we choose friends. So, in a sense, we are ‘friending’ those who will betray us to the world.”

How data are sorted and categorized has very real implications, as people are socially sorted into profiles and groups. This is done to you without your knowledge, and without any way to change it. A simple example is how companies choose to locate retail services near more affluent communities and avoid poorer ones. A more poignant example is how one ends up on a no-fly list, powerless to do anything about it.

[David Lyon]

Social sorting is often done with the best intentions. When certain books or albums are recommended to you on Amazon, it is often quite useful and welcome. But the consequences can be as negative as they are unintentional. Krystle Maki, a PhD student, is doing pioneering work on how the Ontario welfare system’s use of data sorting leads to discrimination against women on welfare as a group.

Lyon is visibly impressed by the students and colleagues who work in this field: “I don’t think anyone that we know in our Centre or in our larger national and international networks is interested in surveillance in a dispassionate way. I mean, it is intellectually fascinating, but most people who are engaged in it are concerned about issues of privacy or civil liberties or human rights.”

Another PhD student at the Centre, Özgün Topak, is studying the effects of surveillance along the Greek and Turkish borders, and the experiences of migrants dealing with frankly inhumane policy. “Law is far, far behind what’s happening in the field. There are crucial issues to do with just basic social justice that are raised by surveillance today.”

The issues surrounding social sorting involve a move from the acts of the individual to the perceived risks of the profiled group. Moreover, you may not know how you got into a group, or why you are now under suspicion, or how to disassociate yourself from it.

“A Kafka understanding of this world is in some ways better than the Orwellian metaphors because it gives that sense of the experience of surveillance. You don’t know why you’re being called. You don’t know how you got on a list. You don’t know what the criteria were that classified you in a particular way. You don’t know how to get out of it. You don’t know what the consequences are for you or your family. It’s very Kafkaesque in that sense.”

And this is perhaps the crux of the work being done at the Surveillance Studies Centre: there is an emerging surveilled context that certainly affects, and perhaps even distorts society, and democracy, and freedoms. And it isn’t a conspiracy, or the result of bad intentions, nor always a negative thing, but it is a context and a dynamic that demands our full attention if we are to maintain the type of society we enjoy today.

The Risk of Risk Assessment

Lyon draws an insightful contrast with the risk mitigation associated with climate change policy. Climate research is based on analyzing and modeling data in order to predict likely outcomes. Those outcomes represent risks, and it makes perfect sense to act on them early, before all the data are in, whether by reducing greenhouse gas emissions or by redesigning infrastructure to deal with more frequent extreme weather.

This is something most people accept as a reasonable course of action; to not act at all in the face of the risks would be negligent. But when the same thinking is applied to the analysis of human data, and to the profiling of groups based on potential social risks, problems arise. Acting early, with incomplete data, means compromising due process and the presumption of innocence – things that, in the current security-surveillance climate, Lyon says we can no longer take for granted in Canada.

Lyon questions the underlying assumptions about how we evaluate risks from the data: “Our risk analysis is based upon very sophisticated analysis of statistics combined with very sophisticated software. But the belief in them and their power and efficacy also has to be considered. The algorithms themselves have to be calibrated in some way. How are they calibrated? Who makes them in the first place? Who constructs them and how does that affect the way that they are calibrated?”

In climate change, the data surveilled are physical things like water, heat and geography, but in surveillance, the data are about people – about one another.

To assess risk and act on that understanding has very different consequences. Our conversation turns to the Steven Spielberg film Minority Report (2002). “In some ways, Minority Report was a very important movie because it came out simultaneously with the after-effects of 9/11. The notion of ‘pre-crime’ in the movie is so close to pre-emptive surveillance. And that’s precisely where potential violations of civil liberties and human rights become most likely.”

The extraordinary rendition of Maher Arar, whose personal data were mishandled, and whose personal associations had flagged him as a risk, is a case in point.

The New Transparency Project

Lyon points out that in Canada, we are much better off than many other countries in that we have a Privacy Commissioner, something that is the envy of other states. While the laws it works under struggle to keep up with the changing state of surveillance, the office nonetheless offers citizens a direct avenue for addressing privacy concerns and violations. To its credit, it has won privacy cases against Facebook and Google.

The New Transparency Project hopes to build on these positive trends through rigorous research and assessment of surveillance. The report is nearing completion of its first draft. “It is a gargantuan task,” Lyon remarks. But the Project is also far more than a report. The Centre, conscious of its identity as a publicly supported institution, plans to launch broad public outreach alongside the publication so that citizens can begin to think critically about these issues and help determine appropriate responses.

As Lyon concludes our talk, I recall his mention of his early academic focus on the development of modern society and how it is connected to cultural beliefs, and I realize that through this research he is engaged directly in that dynamic, at this precise moment in history.

“I feel that it’s a moral obligation to bring the intellectual content of my work in line with my deepest commitments and the things that I really believe about human beings and about the world, and what is truly important about a life worth living. And I discovered that among my students and colleagues, there are many people who are asking the same questions.”

Profile by Lowell Cochrane
(e)Affect Issue 3, Spring 2013