PATTERN RECOGNITION took place virtually on Thursday, 21 January 2021 in collaboration with Northeastern University in Boston. This colloquium considered the multiple ways in which systems of law and governance interface with their publics.

Democratic values such as those contained in the Constitution, the Bill of Rights, and human rights conventions are increasingly being sidelined as print media and verbal communication shift to digitized visual vocabularies and quantitative methods of prediction. In some cases, black letter law and established modes of address—and redress—have been displaced altogether by visual regimes (like surveillance camera networks and facial recognition software) or numeric narratives (including algorithmic assessments of creditworthiness, medical predispositions, and the probability of criminal recidivism).

These parallel universes operate according to their own logic, creating referential and connotative systems in which notions of truth, justice, and fairness are dangerously reconfigured. Such insidious technological and discursive shifts impose new and often impermeable categories of race, gender, and class that reinforce existing lines of segregation, while their decision-making power operates outside the traditional realm of accountability defined by jurisprudential notions of human agency. The aim of the colloquium was to decipher the norms embedded in the syntax and semantics of justice as a system of governance, and to expose the hidden violence operating in these new genres of expression.

PRESENTATIONS:

Pattern Recognition | Gloria Sutton, Associate Professor of Contemporary Art History, Department of Art + Design, Northeastern University

As a contemporary art historian, my research analyzes the ways that visual art has become fundamentally conditioned by the protocols of software. My current book, Pattern Recognition, examines critical issues of linguistic, computational, and cultural translation to critique how the rhetoric of interactivity and participation that has problematically come to define contemporary art also regulates and legislates bodies in real time and real space. Diverse and diffuse in their own right, the artists I present include Rafael Lozano-Hemmer (b. 1967), Christine Sun Kim (b. 1980), Lynn Hershman Leeson (b. 1941), and Marta Minujín (b. 1943), whose artworks combine sophisticated computerized tracking systems, customized software, and sensors with workaday hardware to generate real-time systems that pivot around ordinary human gestures, exposing the disciplinary frameworks that define bodies and control speech, and thus identity.


The Danger of Normalizing Surveillance | Evan Selinger, Professor of Philosophy, Rochester Institute of Technology

Technology governance tends to be reactive. Far too often, regulators wait for bad things to happen and then try to rein in harmful activity. Privacy scholars and activists offer many warnings about the dangers of failing to take a more precautionary approach. One of the most disturbing cautions is that normalizing surveillance leads citizens to view harmful practices as beneficial simply because they’re widely conducted, not because people have good reasons to believe they are socially desirable. In my presentation, I’ll explain why empirical results in moral psychology studies suggest the threat is credible. Addressing the problem requires understanding what the consequences are for governance if the dynamic is real and pervasive. I’ll argue that “favorably disposed normalization” is a serious problem because it can subtly erode autonomy and ignite slippery slope trajectories that undermine privacy and civil liberties.


The Walls Have Ears | Rahul Bhargava, Assistant Professor, School of Journalism and Department of Art + Design, Northeastern University

A recurring truth about the future of technology is that it sneaks up on us. Home “smart speakers” are an example: they have become surprisingly ubiquitous without any substantive reflection on their impacts. Most of these objects are blobs or obelisks meant to blend in with a modern home; they fade into the background so that all that remains is an omnipresent voice that responds when called for. We know that these devices are listening to us, yet their design gives no strong indication that they can hear us. We have started a design research project to interrogate this: how can we design a listening device that prompts the user to think critically about its use? Our goal is to redesign the listening device to support reflection. I will share background and sketches, and invite input on how to critique these devices in the public sphere.


Dataveillance, Informational Mosaics, and the Computational Mind | Jennifer Gradecki, Assistant Professor, Department of Art + Design, Northeastern University

This talk will discuss how dataveillance technologies have been shaped by the mosaic metaphor and the computational metaphor of mind. The mosaic metaphor leads law enforcement and intelligence agencies to collect and process as much information as possible to construct a complete picture, which drives the mass collection of data and produces information overload. The computationalist metaphor conceives of the analyst’s mind as computer-like and of automating software as resembling the analyst’s mind. Data analytics companies market ‘smart’ software that can ‘think’ and ‘learn’ as the solution to the data deluge. Automating systems are regarded as capable of intuitive and probabilistic reasoning, only more accurately, more efficiently, and with a larger channel capacity than human analysts. These assumptions become difficult to see and critique once they are embedded in software. Public oversight of public-private socio-technical dataveillance systems is exceedingly difficult because of classification systems, intellectual property laws, and the need for technical literacies.


Sarah Hodges | Marcel Top, London-based Belgian new media artist

“Is our fundamental right to freedom of expression being threatened?” With this question in mind, Marcel Top began investigating mass surveillance in the United States. Sarah Hodges addresses the artist’s concerns over the safety of democracy in surveilled societies. With his project Sarah Hodges, Top questions the current use of these technologies, exposing the possible threat they represent. Sarah Hodges is a non-existent, algorithmically generated American citizen. To create this fake online persona, Top began by gathering over 50,000 Instagram posts that used the hashtag #iloveamerica. From these posts, Top was able to generate new, non-existent pictures through machine learning. The online presence of Sarah Hodges mirrors the online presence of other ordinary people, who, in the eyes of surveillance technologies, represent the perfect American citizens.
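The pipeline the project describes, training a generative model on scraped hashtag images and then sampling new, non-existent ones from noise, can be sketched roughly as follows. This is a minimal DCGAN-style illustration, not Top’s actual tooling: the folder name, image size, and all hyperparameters are assumptions.

```python
# Minimal sketch: train a generator on a folder of scraped #iloveamerica
# images, then sample new, non-existent images from random noise.
# Folder name and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

nz = 100  # size of the latent (noise) vector
device = "cuda" if torch.cuda.is_available() else "cpu"

# Images resized to 64x64 and scaled to [-1, 1] to match the Tanh output.
# ImageFolder expects the images inside at least one subdirectory.
data = datasets.ImageFolder(
    "iloveamerica_posts/",  # hypothetical folder of scraped images
    transforms.Compose([
        transforms.Resize(64), transforms.CenterCrop(64),
        transforms.ToTensor(),
        transforms.Normalize((0.5,) * 3, (0.5,) * 3),
    ]))
loader = DataLoader(data, batch_size=128, shuffle=True)

def make_generator():  # latent vector (nz x 1 x 1) -> 3 x 64 x 64 image
    return nn.Sequential(
        nn.ConvTranspose2d(nz, 256, 4, 1, 0), nn.BatchNorm2d(256), nn.ReLU(True),
        nn.ConvTranspose2d(256, 128, 4, 2, 1), nn.BatchNorm2d(128), nn.ReLU(True),
        nn.ConvTranspose2d(128, 64, 4, 2, 1), nn.BatchNorm2d(64), nn.ReLU(True),
        nn.ConvTranspose2d(64, 32, 4, 2, 1), nn.BatchNorm2d(32), nn.ReLU(True),
        nn.ConvTranspose2d(32, 3, 4, 2, 1), nn.Tanh())

def make_discriminator():  # image -> probability that it is real
    return nn.Sequential(
        nn.Conv2d(3, 32, 4, 2, 1), nn.LeakyReLU(0.2, True),
        nn.Conv2d(32, 64, 4, 2, 1), nn.LeakyReLU(0.2, True),
        nn.Conv2d(64, 128, 4, 2, 1), nn.LeakyReLU(0.2, True),
        nn.Conv2d(128, 256, 4, 2, 1), nn.LeakyReLU(0.2, True),
        nn.Conv2d(256, 1, 4, 1, 0), nn.Sigmoid(), nn.Flatten(0))

g, d = make_generator().to(device), make_discriminator().to(device)
bce = nn.BCELoss()
opt_g = torch.optim.Adam(g.parameters(), lr=2e-4, betas=(0.5, 0.999))
opt_d = torch.optim.Adam(d.parameters(), lr=2e-4, betas=(0.5, 0.999))

for epoch in range(25):
    for real, _ in loader:
        real = real.to(device)
        b = real.size(0)
        fake = g(torch.randn(b, nz, 1, 1, device=device))
        # Discriminator step: real images labeled 1, generated ones 0.
        d_loss = (bce(d(real), torch.ones(b, device=device)) +
                  bce(d(fake.detach()), torch.zeros(b, device=device)))
        opt_d.zero_grad(); d_loss.backward(); opt_d.step()
        # Generator step: try to make the discriminator output 1 on fakes.
        g_loss = bce(d(fake), torch.ones(b, device=device))
        opt_g.zero_grad(); g_loss.backward(); opt_g.step()

# Sample a batch of new, non-existent "portraits" from random noise.
with torch.no_grad():
    samples = g(torch.randn(16, nz, 1, 1, device=device))
```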


Organizing Chaos and Autopropaganda: A Technocratic Approach to Managing Public Opinion | Derek Curry, Assistant Professor, Department of Art + Design, Northeastern University

In his 1928 book, Propaganda, Edward Bernays openly advocates for the manipulation of public opinion by a small group of technocrats. In a process he described as “organizing chaos,” Bernays explains how it is the role of an “invisible government” to use propaganda to distill public discourse to a few acceptable ideas for the public to choose from. A process similar to what Bernays proposed has recently been implemented on a global scale through the use of clustering and pattern recognition algorithms designed by a small group of technocrats to predict and influence the behavior of users of networked platforms. This presentation will discuss the similarities between Bernays’s ideas and what Eli Pariser has termed “autopropaganda,” as well as the problems with their shared technocratic, solutionist approach to democracy and human agency.
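To make the clustering mechanism concrete, the sketch below segments users by engagement behavior with a generic k-means pass. It is an illustration only, not any platform’s actual system or Curry’s material: every feature name is invented and the data is random stand-in data.

```python
# Illustrative sketch: segment users into behavioral clusters, the basic
# mechanism behind platform-scale audience targeting. Features invented;
# data is random stand-in data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Hypothetical per-user features: posts/day, shares/day, fraction of
# political content clicked, average session minutes.
users = rng.random((1000, 4)) * np.array([10.0, 5.0, 1.0, 120.0])

X = StandardScaler().fit_transform(users)  # put features on a common scale
segments = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Each user now carries a segment label; content can then be targeted at
# the cluster level and each segment's behavioral response measured.
print(np.bincount(segments))  # users per segment
```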


Platforms, Data and Algorithmic Sociality | Zoetanya Sujon, Programme Director for Communications and Media, London College of Communication

With platformization and the rise of global big tech platforms such as Facebook and Google, data extraction continues to grow in both scope and scale across new frontiers of personal information. Social media, for example, may have popularized the aggregation of extracted data across sites, but this has now become common platform practice. In this age, every click, view, or like produces extensive kinds of data, often collated by massive companies for capitalist profit (see also Zuboff 2015, 2019; Couldry and Mejias 2019; Srnicek 2017). While new data frontiers emerge in virtual reality, fitness and calorie trackers, dating apps, filters, tagging, facial recognition, and other sectors using emerging social technologies, it is important to ask what this means for social relations. Specifically, what are these emerging forms of platform-based algorithmic sociality? What kinds of social relations are enabled or disabled? This paper draws upon notions of ‘programmed sociality’ (Bucher 2018), algorithmic cultures (Seaver 2017; Kitchin 2014), and algorithmic bias (Benjamin 2019; Buolamwini and Gebru 2018; Eubanks 2018; Noble 2018) to make sense of algorithms in everyday life and their consequences for emerging social relations.


Conservative “Victims” and Misinformation | Ari Ezra Waldman, Professor of Law and Computer Science, Northeastern University School of Law and Khoury College

Conservative groups like to call themselves victims. They see a War on Christmas and a culture of political correctness. They see affirmative action as creating white victims of discrimination. They argue that they are victims of religious discrimination because equality laws require them to treat queer people the same as heterosexuals. They argue that the 2020 election was stolen, that they are victims of election fraud at the hands of Black voters and Black election officials. Victimization is at the core of the right wing’s rhetorical attack on progressive values. It is also at the core of its legal strategy, which repurposes doctrines traditionally used to protect minorities in order to cement its own power. Misinformation plays an essential role in that legal argument because it forms the foundation of a culture in which they perceive the current system as illegitimate. But of course, it’s all a lie. This project adds a new dimension to the discussion of misinformation: namely, its role in creating patterns of victimization among groups at the top of traditional hierarchies of power, and how the professed belief in victimhood is already affecting and changing the law.


Attunement and the Ethics of Care | Ilya Vidrin, Postdoctoral Associate, Department of Theatre, Northeastern University

As a choreographer, I acknowledge that dance is often portrayed as a visual art form. Yet anyone with movement experience recognizes that there is more at play than what is visible. In this short presentation, I will discuss the significance of the physical, kinesthetic aspects of ethical principles and ask how we can collectively move past the technological prioritization of the visual. In so doing, I will demonstrate how partnered movement can function as a critical response, rendering salient and disrupting the invisible violence of embedded norms that prioritize justice over care.

This colloquium was a collaborative venture between London College of Communication, University of the Arts London, the Visible Justice collective, and Northeastern University’s Humanities Center (CSSH) and Center for the Arts (CAMD).