Mind goes where eyes can’t follow: internalizing the logics of capture | Nora Nahid Khan

The pandemic in the United States has allowed for an experiment, at scale, in how a visually obsessed culture orients itself, frantically, towards the unseen: the virus, its transmission. Capture of the bodies suspected to be infected, or about to be infected, by this unseen has intensified. The theater of security and quarantine has made legible the precise kind of psychological space that surveillance depends on. Surveillance depends on an initial buy-in, whether begrudging or unwitting, of millions of users and subjects of technology. But the buy-in is entrenched and secured beyond vision. It is continued on through rhetoric, through persuasion, through training in the play of unseeing and seeing, such that each of us using technology internalizes the logics of capture.

Over the past decade, both critical activism and legal and academic advocacy have helped cultivate widespread awareness of the trade-offs we make in using consumer technology. In fact, there is more widely available history, research, and ongoing theorizing on surveillance, on the calm design of consumer technology, and on the history of dark patterns than ever before. “Users” seem to largely understand how they—we—rescind our civil liberties and privacy for the convenience and ease of elite design. Business school scholars like Shoshana Zuboff publish international bestsellers on the foundations and strategies of companies pursuing surveillance capitalism, and ploddingly examine each new buttress in their architecture.1 Interface designers and programmers are more wary of their own role. Tech activists and anti-spying coalitions educate teens about the surveillance state. As users, we seem to share an intellectual understanding of being surveilled. Users, citizens, accept corporate and governmental surveillance in exchange for use of a host of platforms, infrastructures, and software tools. Users adjust to a baseline knowledge of how their profile and movement are tracked, their data lifted.

Less evident are the small shifts in public rhetoric, which continue to ensure collective buy-in for surveillance. Through these changes in narrative, which ask that we surveil ourselves and each other, we learn to inhabit the role of the surveilling eye. We sympathize with the surveillant and fail to interrupt our capture of the surveilled. We begin to relate to each other through the act of policing language, expression, bodily movement, intention, motivation, and presence. Taking the world as ours to consume, define, hedge, label, watch, and re-watch, in endless loops, we become police.

Forms of colonizing, imperialist seeing continue online. Everyone that falls across one’s screen belongs to one, and every movement is one’s to possess. Even within the reality of contactless pickup and no-touch sociality, in which one avoids overt touch or bodily intimacy, the more subtle logics of capture persist.

Latent elements of the surveillance state have been activated and extended rapidly in the current economic, epidemiological, and bio-political crisis. Particularly oppressive surveillance has targeted essential workers, who are particularly vulnerable, living at the intersection of socioeconomic, gender, and racial inequities. In April and May of 2020, police across the country echoed the spirit of punitive seventeenth-century “lantern laws”; they detained and arrested individuals going to work, to second and third jobs, or home to nap in between, for violating “curfew,” despite their having papers of excuse around an arbitrary, overnight declaration. In June of 2020, police departments scanned digital images of protestors, individuals critical of the police, and then hunted for them on social media. Journalists from the New York Times to popular podcasters continue to report such insidious efforts breathlessly, as though a massive surveillance architecture has not unfolded around us, and in our hands, for over a decade.

Each event is a new head of a growing hydra. A crisis reveals hidden workings of this Leviathan, a many-eyed, glittering apparatus that flashes in full view, before sinking below the surface of the water. But for the theater of capture to be enacted, a groundwork had to be very methodically dug, tiles laid in steps. For the apparatus of surveillance to become easily acknowledged, visible to us, for it to really take root in our enforced distance from one another, it had to become part of our own, active seeing. It is no longer totally hidden in third-party apps, or purely the purview of black-boxed machine learning systems reading our images for life signatures. One is daily, steadily inculcated, through narrative and media consumption, through public health initiatives and tool updates, to adopt surveillance thinking. In this, one becomes surveillant of a bio-political landscape that consists entirely of at-risk bodies. In a form of scrying, one mines the screen’s display of crowds near and far, sussing out, discerning their hidden intentions.

Sovereigns dream of capture and of the mechanics of that capture being hidden. They dream of this hidden capture becoming ritualized, internalized, done out of the sight of those who would protest. I take up as my focus the many ways that surveillance logics are taken on through consuming media about violence, watching spectacular life unfold in real time online. Caught within the playhouse of algorithmic capture, itself fueled by true, active, physical capture, one risks a slow conversion. Seduced, one naturalizes policing, and then loses sight of one’s own tendencies to police.

I am inspired by thinkers like Simone Browne, Jackie Wang, and Safiya Noble, who level their criticism at the design of the carceral state and the technologies that support it. In their wake, I take up how foundational methods of algorithmic capture are to society. Earlier forms of coding, of measuring movement, through the census and lantern laws, have transformed into an algorithmic supremacy, which fuels predictive capture, entrenching injustice at inconceivable scales. In the phrase “logics of capture,” I evoke the powerful frame of “grammars of capture,” which Hortense Spillers coined while critiquing the Moynihan Report’s surveillance of Black women, men, and families in her essay “Mama’s Baby, Papa’s Maybe: An American Grammar Book” (1987). Building on Spillers’s work, Tina Campt writes that, in enumerating a grammar, or a vocabulary possible for the future, we can create a “politics of prefiguration that involves living in the future now” (2017, 17).2 When it comes to contemporary technology, understanding algorithmic capture can help us move towards grammars of liberation: a prefiguring of less oppressive technological futures in present discourse.

Mapping Psychological Space and Groundwork: Setting the Stage

As the U.S. first grappled at scale with the emergency of the pandemic, a shift of the “burden” of policing, from the police to citizens themselves, took hold. Communities were harnessed, overnight, to police themselves, and those outside, using technology already at hand. Such practices are encouraged by the mediation of images through surveillant digital infrastructure. I hope to map how we become part of the disciplinary eye, take our place in concentric rings of internalized surveillance, turned into lifestyle.

The digital surveillance infrastructure we work within was always ready for self-surveillance through apps, platforms, access to databases. It was ever driven by an ethic of predictive community policing that seeks out and marks potential “risks.” This structure was primed perfectly for both mass policing and oblique social management. Threats to the public good, like a virus, seem easily solved by the “affordances” of technological solutionism.

As writers such as Kim Stanley Robinson have observed, the pandemic’s impact has been equally “abstract and internal. It was [and is] a change in the way we were looking at things” before (2020). This internal, abstract change is my primary subject. As the social pandemic began to unravel alongside the viral, public focus shifted to a view from overhead, to consider the designs of social systems and resource distribution. Pundits discussed urban design as it perpetuates inequality, through racist housing laws, redlining, zoning, and anti-houseless public architecture. The widely uneven infrastructures of public support surfaced, and a lack of preparedness left citizens further dependent on devices—their pervasive infrastructures of techno-surveillance—for guidance, strategies, information. Each person, left to themselves, seized on a digital semblance of stability, reliant on its offerings.

A conscientious, cognitive laborer, in enforced solitude, scrolls her feeds. She zooms in on virus simulations, does her own amateur epidemiological modeling. She researches Sweden and herd immunity, flirts with conspiracy, and gets more and more frustrated about her isolation as a state directly linked to the select actions of individual strangers. Who are these people making it harder for her to leave? From her couch, tucked into a blanket, she scans images of poor people on the subway, crammed shoulder to shoulder, and shudders. People in photos start to exist for her within the language and metrics of the virus; are these people sick, are they immune? Are they asymptomatic or are they potentially spreading their disease everywhere?3

Keeping abreast of the “rules” requires near-constant surveilling of media and surveillance of self. A week offline—assuming one had the internet—could mean missing curfew, or not knowing public sentiment on masks had changed in one’s community and state. Here, an invocation of “We” seems tempting, and would work if we existed within a clear bio-political order, in which all people lived and experienced the world in predictable ways. “We” are instructed to wear masks. “We” are told how we all should navigate urban spaces, approach each other, and stay at bay. “We” are told to strive to be contactless, to avoid touch. There is a dissolution of “We” when the biopolitical order is unclear. Simulations, algorithms, predictions, all change depending on the day’s data input and new findings, and so produce rapid-changing rules.

Alarms were sounded about those who flouted rules made in this cybernetic stronghold, even as those rules changed daily. This is not an expression of sympathy for those who actively put others at risk, but a note on the natural disorientation caused by rules based on public health “interpreters” of rough, emerging, and competing simulations of a poorly understood, unseen virus.4 The theater lit up with fear, anger, and critique of those who did not practice social distancing. Police without masks became a referendum on police abuses (Robbins 2020). From the vantage point of the interior, it became difficult to know precisely what was happening outside of news and social media. Every event came mediated. Small news items were amplified, made ubiquitous within the span of an hour.

And outside, citizens began to actively monitor others, on alert for violations of social distance. Posts about combative, unmasked visitors to stores became headline news. “We” quickly enforced a kind of gaze on each other, a kind of gentle surveillance, a relationship of power that immediately posits the viewer as correct. In the emptied streets, the police walked with new reasons to enact violence. News of New York City police’s choices dominated headlines as the city became a pandemic epicenter. Areas that were already over-policed experienced more policing. Targeted profiling against communities of color intensified. Little empathy, more aggression. As NYC police officer Jorge Trujillo described in the early days of the pandemic, “So on top of the unprecedented problem that we’re dealing with, the police can come and escalate and make things worse” (Speri 2020). Many pointed out the inflexibility of policing, how the absence of oversight meant little to no change in abusive tactics of the police. I was struck by how deeply important citizen counter-surveillance as a tactic would become in this setting. Who else would look back at the state and witness it, but a citizen aware of its rhetoric?

In May of 2020, protests erupted in dozens of cities across America in response to the filmed extrajudicial murder of George Floyd by Minneapolis police officer Derek Chauvin. Late in May, Minnesota Public Safety Commissioner John Harrington announced at a protest that arrestees would be “contact traced,” to determine their associations, political affiliation, levels of organization, platforms, to then build “an information network” (NBC News 2020). That contact tracing, oblique location of sickness, would be then used to trace and criminalize “unseen” sentiments, like anti-police activism, anti-fascism, or anti-racism, was not an ambiguous move. Keen technology critics like Adrian Chen swiftly noted how the “war on COVID is normalizing surveillance in a bad way” (2020). The language tracked.

Kentaro Toyama, a surveillance expert at the University of Michigan, notes that “the normal ethics of surveillance might not apply” in a time of crisis. Here, normal ethics refers to a shared understanding grounded in civil liberties, in which individual privacy should be protected, and surveillance is seen as primed for misuse (2020).

How might critical and academic studies move more swiftly with these tides of sociotechnical change? Given the speed of product design and rollout, the expectations placed on scholarship might need to evolve in response. Insights on the technological apparatus of March 2020 seem ancient by December of that year. This year has shown rapid, live prototyping of competing forms of technological mediation, which both mirrors and actively shapes the social. While it is unorthodox to write about such world-historical events in real time, my analysis of the past five months is rooted in identifying patterns and models central to emerging surveillance studies, criticism of software ideology, and the philosophy of cybernetics. Theorists need to move with the changing tactics of surveillance on all the fronts that it moves, so users can actively question their role and responsibilities in relation. Beyond any one user base, the collective might be able to denaturalize and assess technologies of the state. They might be better positioned to refuse algorithmic curation, as it highlights specific narratives that serve business and the state: defense of proprietary values at all costs, extraction, supremacy.

Theaters of Policing: Spectacular Aims

Surveillance serves the American state, and it also serves civilians whose interest is in maintaining the violent, bankrupting, extractive economic system the American state relies on.5 Their lives, families, and communities are only “safe” if the state is steadied under law and order. Being part of the eye, then, having the ability to capture and evade, is a distinct power within this regime. Being able to see “danger,” name it, call for punishment, is rewarded. For those in the secure central hub of the capitalist state, the spectacle of the weak, the suffering, and the aggrieved fuels an algorithmic economy that cycles through, feeds on, and amplifies rage, grief, and other strong emotions.

“Pandemic theater”6 has been not only a great amplifier, but also a wide-scale simulation of the impact of uneven seeing, capture. Never in history has there been access to so much real-time information about the iterative, compounding injustices in the world. And for some, the potential to wall oneself off from those realities is nearly total. In isolation, the protection of the body—the healthy body—becomes a site of contestation. In sovereign systems like America’s, with weak or absent social welfare systems, the individual must carry the burden of her self-healing: “It is your individual responsibility to keep your health intact, and so, to map and chart possible threats around you from your terminal.” Every healthy body must protect itself and has the right to do so with any tools available. While the pandemic has intensified long-overdue public discussions about structural inequality, the incompetence with which it has been handled has given free rein for many to prioritize personal safety and sovereignty over the rights of others. Policing of the most marginalized intensifies, out of fear of disease, of death.

The management of sickness and plague is where our commitments to relative senses of community, if they existed before, are tested. Paul B. Preciado describes the pandemic as a “great laborator[y] of social innovation, the occasion for the large-scale reconfiguration of body procedures and technologies of power,” along with a “new way of understanding sovereignty.”7 Sovereignty is waged through networks, through the gaming of algorithmic calculus. Some are deemed outside the realm of community, dispensable by definition. Understanding how these definitions are augmented, concentrated, and distributed through algorithmic media is critical, as one continues to experience the pandemic through computational frameworks.

In domestic isolation, videos and streams and feeds pour in. The temptation when caught inside, cut off, is often to turn even further inward. One does not need to be obsessed with social media, or even caught in a hopeless “echo chamber,” to be affected directly by the way algorithms shape news media, policy, and individual action. Online, sensational tracking headlines and fake news alike proliferate and spread through armies of bots. The play unfolds. A cast of characters emerges: the unmasked suburban libertarian; the anti-vaxxer; the working, undocumented immigrant in farm fields; the essential worker; the urbane, middle-class, digitally savvy millennial; the hero-nurse and the ones clapping at dusk; the zealous moralizer. Each character type, each position, becomes part of the story of the bio-political state. Surveilling others’ crises, others’ grief, and the narratives of pain and loss, death, becomes a form of ritual.

For those with limited mobility and offline access, the Internet surely has “broadened” and made accessible a wide world of information and community. However, it is necessary, at this moment, to shy away from elevating such obvious positives over the facts of how “the Internet” has changed in just ten years. Technology’s promise—particularly the Internet’s—has forever been tied up in the rhetoric of information and access as a portal to individual liberation.8 It is better understood, today, how the ideology of the Internet as liberating, as undeniable good, has often obscured the expansion of state control, tracking, and extraction through it.

The design of algorithmic networks makes any notion of total resistance and divestment from networked systems of capture impossible. Even the most circumspect, curated, and informed engagement with current machine learning-driven platforms plays a part in the ranking, sorting, and categorization of content. The algorithm chooses what news to highlight, whose deaths to cover, what critiques to hide, and whose voices to platform in ways that are inaccessible to the everyday user. Further, algorithmic tracking and extraction, as Safiya Noble maps with exhaustive precision in Algorithms of Oppression, is irrevocably tied to a contemporary person’s life outcomes. Algorithmic systems determine one’s loan-worthiness, employability, trustworthiness, whether one gets into school or not. Many crucial aspects of one’s offline life—from the classroom demographics one’s child encounters to the digital redlining of one’s city—are increasingly shaped by algorithmic decisions.

Admittedly, technology can be intentionally subverted, and the state’s chosen tools redesigned. There are many forms of technologically-driven resistance of this state’s aims. One might know a good number of people who are trying to address the pandemic’s mismanagement through technological hacks and recodings. These insurgent technical spaces help create needed intimate local networks and futures of community-owned technology and ethical computation. But while organizers and self-protecting digital workers might well use encrypted e-mail (and have done so for years), their admirable efforts have to take on the far-reaching effects of algorithmic design and do so beyond the technological space—if true divestment is the goal. Even the best-intended technological solutions to social problems can unwittingly enforce tech-culture’s violent methods of capture and ownership. Contemporary technological activism might also examine its use of the tools of surveillant capture and embrace of technocracy’s underlying mental models.

I have argued for capture as a pervasive, seductive cognitive tendency, a practice that is initially honed through media consumption. Capture here means both the uninterrupted consumption of mediated, violent acts represented in media and video, and other forms of passive surveillance conducted offline. The isolated might eagerly take this work on out of a sense of urgency, self-protection, and survival; capture becomes a matter of life and death. Capture becomes a lifestyle—an undertaking with civic weight and import. Seeking answers in pandemic, within screens, one finds unlimited freedom to surveil, download, and zoom in on the bodies of others with a clarity that comes from a many-tiered remove. This leans towards an unreflective, authoritarian orientation towards streams of videos and images, as well as towards the people represented in them and their lives.

The viewer and feed are co-constitutive. The viewer flows through her feed, moving from inhabiting one surveilling eye to the next, scanning, watching, aggressively framing, moving right along with the feed’s flow of violence. This digital movement with algorithmic spectacle compounds and entrenches one’s political position and affirms it. Though a critical reader of digital media might suggest everyone “reflect” and always make space for critical consumption, to mobilize the mind and spirit to thoughtfully read every video and image in the feed does not seem tenable at scale.

Further, we read images and videos of violence, violently, and do so by technological design. Machine learning-driven visual consumption has its own spectacular time. It depends on capture of bodies and objects to function; its time is collapsed, fueled through sorting, bucketing, separation of protestor, vigilante, worker, hero, scourge. Outrageous narratives, “examples” of individual political positions, become naturalized.

Each character, type, position, becomes part of the story. As spectacular time unfolds through media, historical archetypes echo and compound.9 Based on one’s position, one is disposed to either recognize violence as violence or not at all. It took four months—March 2020 to June of 2020—to transition from crisis fueled by fear of viral infection, to a secondary crisis of “necessary” surveillance, fueled by the state’s fear of protests of police violence, of Black Americans as a political force, of “antifa.”

Sociologist and scholar Zeynep Tufekci has written at length about how security measures and police harassment are amplified in any vacuum of information (2020). As was well covered in New York, police harassed, fined, and arrested black and brown citizens on camera for not socially distancing. Meanwhile, these officers’ cohorts handed out virus “protection” kits of masks and wipes, with care, to sunbathers in Domino and McCarren Parks (Lee 2020). Mimicking extant “stop and frisk” practices meant 35 of 40 arrested people in New York City were black (Southall 2020). Similarly, in many countries, police data for Covid-19 arrests and fines followed predictable, comparable patterns. In Sydney, Australia, for instance, the highest reported social-distancing infringements took place among Indigenous people, despite their making up just .04% of the population, a fact said to reflect systemic bias and over-policing of Indigenous locals.

Algorithmic and digital media economies deputize civilians to be purveyors of violent spectacle. Being a spectator—time on the platform—makes one more vulnerable to disciplinary tracking, targeting, and training. In 2002, Wendy Hui Kyong Chun richly theorized the cyber-flâneur who navigates online space as a kind of “detached observer who remain[s] hidden from the world while at the center,” invested in a fallacious illusion of control (2002, 247). Not much of this specific illusion has changed within today’s algorithmically driven landscape (Figure 1).

Figure 1. Virginia Wagner, Cyber-flâneur (2021). Image courtesy of the artist.


Today’s cyber-flâneur maps seamlessly onto the captor, the surveilling state. But algorithmic surveillance further changes elements of her “walk” through the city and her thoughts on it. She wanders, surfs, but is increasingly guided; she looks away from a street and mirrored walls block off its view after she passes. She is trained into a position and is further nudged in the direction of what she already finds good, what affirms the best reflection of herself. She may see the actions of any one individual walking across her feed in an extreme vacuum, in which they are solely accountable for their actions. For her, it is harder to discern the bigger “reasons” for these mediated actions, including education, political and cultural beliefs, resource distribution over time, a lack of programs, restorative justice, mental health care.

In 2020, the new generation of cyber-flâneur—the algorithmic citizen—might still believe herself in a position to judge, assess, analyze, close-read with objectivity. She might fancy herself a critical reader, equipped with the tools of visual analysis to understand the “real story” of any image. What happens to this critical position when her relationship to an image or video within a legible frame of intentional editing is compromised or re-directed by a ghost hand? The reality of how computational systems today actively shape, crop, highlight, direct and “nudge” our encounters with images makes a critical close-reading very difficult. She is shaped by the algorithm’s edits, choices, and suggestions, and consumes its spectacular time.

In introducing Chun’s essay, Nicholas Mirzoeff notes how Chun’s cyber-flâneur is further a kind of “rubbernecker,” who lightly colonizes with her gaze, takes ownership of the subjects in the image. “For all the futuristic talk associated with the internet,” Mirzoeff writes, “the dominant models of internet use rely on nineteenth-century ideas of the colonizing subject, and skate over the implications of the largely independent ways computers actually exchange data” (2002, 166). Today, each spectator is still “also a spectacle, given that everyone automatically produces traces,” data generated with each click, each site visited, each uploaded photo (Chun 2002, 244). The cyber-flâneur leaves her tracks everywhere online. She is also colonized in the process of viewing. Her relationship to the digital spectacle, even though she might imagine herself as merely gawker, a passive seer, is active. She shapes the model that will learn from her and then direct her future steps. And so through this cycle, her penchant for violent spectatorship is exacerbated and rewarded. In this ecology, any stance of detached distance is a projection, a myth.

The algorithmic spectacle of crisis embeds the tussle of divergent, wildly opposed ideologies into its framing, harnessing more watchers. I read a static news headline, topping an image of a nurse in a mask, standing proudly in an intersection before protestors. I then watch her in a video; she pushes and is pushed by a woman draped in an American flag. As a viewer, I immediately sympathize with the nurse. I identify with this overworked, underpaid woman who looks like me. I cannot find a scrap of critical generosity for the woman in the flag, even as I learn she has gone bankrupt, as her small business closed. In eight seconds, I am making conclusive judgments.

From Seeing with the State → Learning to Look Again

My goal was a quick portrait of how users of technology are being encouraged to adopt a logic of seeing with the state, and through its eyes, offline. Algorithmically guided scanning and capture generate the same in the real world. Increased surveillance represented through streams and feeds accelerates literal capture. The sovereign technical space exacerbates the most authoritarian tendencies that one might harbor: a tendency to surveil, to identify with the state, with police, to identify with the entity with outsized power to see. The inconceivable becomes normalized. Considered, incisive critique of the grounding conditions that allow this distributed authoritarian lens is hard to find.

This seeing with the state is the product of a nameable, immediate crisis, like the pandemic. At the same time, it is an expression of an ongoing, slow training in our collective sympathies with the sacrifice of policing, which we privatize and take on ourselves as duty. Much public response has clamored for a status quo to be restored. A return to normal. Overnight, a new culture of informants emerged, informing on social-distance violators. Apps like Neighbors and Nextdoor are used to deliver notes to the public, and the police, about crowds and gatherings of suspicious people. These rough networks of ambient capture and potential punishment, which enforce pre-existing power relationships (the healthy versus the sick, the protected and socially distanced versus the precariously housed, more at-risk), reveal a relationship to discipline that takes place outside the body, but now depends on digital capture. Some remain untouched, always the center of the story, while other bodies are doomed, subject to disease, death—a story of foretold dispensability.

In theorizing the pandemic, Preciado reminds us of Foucault’s useful framework of biopolitics as one in which “the techniques of biopolitical government spread as a network of power that goes beyond the juridical spheres to become a horizontal, tentacular force, traversing the entire territory of lived experience and penetrating each individual body” (2020).10 How might one understand this tentacular force, shaping the state of exception ushered in by Covid-19? The state of exception allows for individual surveillance in isolation to flourish. If one understands technology and the digital as not just mirroring, but actively framing and magnifying existing structural narratives, there is a deep need, as ever, to square the centrality of technology and smart devices as portals to services and work, with the specific ideological worldviews they espouse.

Critiques of vision and its role as the ultimate conduit of racial surveillance presuppose that one is able to see the surveilling mechanism, the apparatus. However, just as race exceeds the visible to denote the invisible, so too does racialized surveillance function along unseen byways. Preciado, for instance, describes the “subaltern vertical workers, racialized, and feminized bodies” as they are “condemned” to work outside, overpoliced. In this work they are also, perversely, unseen (2020). The work of moving away from algorithmic surveillance and its unseeing starts here. For one, articulate all the different ways in which feminized and racialized bodies are unseen, to then pursue collective imperatives to narrate, honor, and elevate their experience across difference. Learn to navigate the world beyond the dominant modes of vision, to think along unseen vectors of risk, safety, erasure, and power.

Surveillance involves a great deal of unseeing, because seeing involves choices to unsee or not see. Even clear surveillance activity can be ignored, left entirely unseen. The surveilling apparatus may go offline. What the surveilling eye chooses not to see can be both a death sentence and a lost opportunity for justice. Police shut off body cameras before enacting extrajudicial murder. In June of 2020, police brutalized protestors with tear gas and rubber bullets, riot-gear shields, tasers, and batons, with cameras rolling in clear view of the world. The demands of protestors were as much about past and ongoing police violence as about the abuse that happens in prisons, behind closed doors, domestic and institutional. The violence that is seen is the exception; unseen, systemic abuses are now urgently at the fore of collective consciousness. What happens in the absence of footage fills the shared imagination. Activists and community organizers share statistics on missing women, on the missing of Black and immigrant communities. We are asked to imagine all the violence people are subject to that goes beyond the sight of the world. The illegible violence. Without citizens to track, name, film, and complicate the state's seeing apparatus, one should think on what is not filmed and shown to the world.

Activists have helped create widespread awareness of the ways in which counter-surveillance, whether by civilians working individually or in groups, can document police abuse, terror, and police aspirants' vigilante performances. It remains to be seen whether increased counter-surveillance can prevent deaths and harm, or whether it instead does the work of broadcasting, nationally and internationally, the fact of violence that would otherwise be buried, hidden, or lost.

So far, we have moved at a snail's pace towards practical, logistical measures that would help manage social spread and facilitate access to care. In lieu of structural changes, we have a swift activation of corporate and state control of medical data, without firm plans or insight into how this data will be folded into extant forms of algorithmic oppression based on one's personal metadata. All these active technological interventions are part of the tentacular spread: the ambient style of mental capture, encouraged through algorithmic media, and the normalization of tracing, presented as necessary for public health.

The mismanagement of the crisis made the final push, fully swinging open the door to more technocratic mass control. And yet within these spectacular, swift activations of surveillance infrastructure, it is critical to also recognize how surveillance is slow, how it is introduced in tiny waves, over years. This understanding helps us reframe the debate about technological surveillance as one of more than individual choice and civil liberties alone. Being surveilled or not is not a clear choice, one that can be attained through just enough civic engagement or resistance.

As the medical surveillance apparatus activates at scale over the next year, as policing intensifies and—in the future—becomes more privatized, it is necessary to frame each expansion, each new arm of the surveillance state, as a continuation of historical forms of tracking. Marked, unseen "pre-existing conditions" become reasons to be abandoned by the state. The pandemic, as Elvia Wilk describes, is a form of "slow violence, resulting not only from sudden, invasive 'foreign,' nonhuman threats, but also from ongoing, pervasive, systemic power imbalances inside and outside the arbitrary borders we draw around places, people, and concepts." And within this negotiation of power imbalance, surveillance architectures are the ultimate example of a slow violence which "is hard to identify, hard to describe, and hard to resist" (2020). Understanding how we are surveilled requires that we look back at how we have accepted this role over time. The debate over surveillance demands we examine our motives, to interrupt our ownership.

De-normalizing the Trace

Two years ago, living in Detroit's North End, I wrote a small book about the psychological effect of Project Green Light's live-feed cameras on the sidewalks and buildings of Detroit, as they introduced a new type of paranoia about the machine eye reading one badly, intervening without one's knowledge. Indeed, this past year, Project Green Light was, as many speculated, activated for contact tracing to enforce social distancing ("Violating" 2020). Black neighborhoods like the North End are hit the hardest, making the area a hot spot for Covid-19 ("Social Distancing" 2020). The logic of surveillance is never surprising; opening the door to surveillance ensures more surveillance. A slow movement is made from the cusp of possibility (a confusion about what such cameras will be used for) to an active, hunting eye that follows, that notes deviation and punishes. Increased policing, with a layer of specialized cameras, is now justified as essential for keeping the collective safe.

By normalizing surveillance of one another and of communities for health reasons, we have experienced a truly unprecedented turn: a slide into what is acceptable public surveillance, as enforced through consumer technologies. A new chime, a new interface, a new system of tracking.11 A day before the protests in Minneapolis took form, Apple joined with Google to introduce a new system update with technical tools to "help combat the virus and save lives." They assure users that privacy and security are central—with new framework APIs, cryptography, and Bluetooth tracing protocols woven into phone updates (Privacy-Preserving, n.d.). These "exposure notifications" would, apparently, be made on millions of phones to "aid in the fight against the pandemic." They would be made "available to states' public health agencies, and governments," to build a host of virus tracing apps that would help one know whether one has had contact with an infected, positive-testing person (Exposure Notification, n.d.). As of late May, states like North Dakota and South Carolina had already signed up to roll it out statewide (Leswing 2020).

Closing the loop, one sees and sorts the world algorithmically to become part of the sovereign's dream. Machine vision dominates: the cameras, computers, and screens that see without eyes enforce within each of us a form of algorithmic seeing. Algorithms, so often described and understood within the field of technological criticism as "not neutral," meaning man-made, imbued with human values, human-crafted from the dataset up, have their own form of seeing, sorting, "understanding." We are drawn into their capture without end. The apparatus of surveillance is carried within. Preciado suggests we just need to change "the relationship between our bodies and biovigilant machines of biocontrol," our phones, the internet, our computers, by "altering" them, or making for a "re-appropriation of bio-political techniques and their pharmacopornographic devices" (2020). He suggests hacking as intervention into what he terms the "de-collectivization and tele-control" of online work. I appreciate this throwback impulse to an old-school activism, but the path to recoding one's own mind, itself a "bio-vigilant machine of biocontrol" and an extension of the surveillance state, seems less clear. The tools of phones and computers are but one extension: the operating system of supremacy that goes hand in hand with capture is less easily hackable.

At this moment, calls to redesign or redefine surveillance—in some cases embracing it as a potential good, or advocating for more "trained" systems for deeper tracking of health—ignore how the current infrastructure of surveillance is working perfectly, just as designed. Surveillance depends on people in power, of any level of power, identifying with the police. And so one might remember the eagerness one's own community displayed during this crisis, whether that community be privileged or not, supremacist or not, well-intentioned or indifferent or hostile. One might think of how one unwittingly takes on, has long taken on, the promise of policing and surveillance in exchange for civic freedoms. One might hedge against the casually dangerous impulse to embrace tracking and tracing for being inside or outside, and also reserve energy and critique for systems, for governments being wholly unprepared. There isn't enough space in cultural discourse for assessing one's own role in continuing state surveillance, in expressing its logics. This kind of examination can help with naming and identifying the desire for techno-authoritarian oversight wherever it takes root.

Crisis, the state of exception, the protests that followed, all held and hold radical potential for a methodical revision of the history of this moment. If one can take the writing of a history as a site of struggle, and especially so within sites of the digital, then the present and future of surveillance must account for the collective, internalized efforts we take to surveil each other and ourselves, algorithmically and in flesh. The coming months and years could see a transformation of the desire to police into a widespread resistance to capture. In refusing to police each other, we can turn back to look at the state, watch it, track and enumerate its many movements, better sense its approach.

Notes

  1. I draw on Zuboff's (2019) framework and investigation lightly, throughout. Her breakdown of Google as a case study provides keen insight into the methods of obfuscation and opacity that tech companies use to gather the data and information of their users. But we might also note that Zuboff's methods keep her from advocating that the economic system that allows Google to thrive should not exist at all. For Zuboff and other scholars, the ethical tragedy of wholesale abuse and theft of citizen data, which she outlines in detail, does not lead them to cast aspersions on capitalism. I take up here how surveillance has proven historically essential to capitalism's function. At this moment it is critical to frame the Internet's evolution into the ultimate surveillance tool as more than an assemblage of case studies of a few powerful protagonists who shaped platforms.

  2. Grateful to my peer reviewer for bringing me to the first chapter, “Quiet Soundings, The Grammar of Black Futurity,” as a place where Spillers’ essay is powerfully reassessed.

  3. The pandemic seems to be frequently narrated as a suspension of time and a time in which the virus was only seeable when it was historical; it can’t be seen until it registers in the body.

  4. One outcome of the pandemic, mediated in this way, could be that more people are, and will be, inadvertently trained in how models and simulations produce reality, and, further, in interpreting and debating their inputs, their assumptions, their lack of clarity, and their points of necessary revision.

  5. This could be limited to a socioeconomic designation of those who "benefit" from upholding capitalism. This could seem to imply only those with a great measure of financial or social security within that system. But an aspiration to uphold the state and its economic imperatives is not limited to the wealthy and is a shared ideological goal for many, across many classes.

  6. “Pandemic theater” is a term used often by Dr. Zeynep Tufekci, a sociologist and computer programmer, who has published widely and consistently about the pandemic this year. Tufekci is frequently praised for her astute analysis of pandemic spectacle; she invites the public to become savvier at critically reading coded, veiled messaging from public health and government.

  7. Paul Preciado’s (2020) text was one of a few early and strong theoretical analyses of the politics and technological dimensions of this pandemic within the history of past pandemics.

  8. The history of the Internet’s early days—in which the myth of the Internet as a cyber-cowboy’s new frontier bloomed—is narrated beautifully in Fred Turner’s (2006) From Counterculture to Cyberculture. One might also look back at John Perry Barlow’s (1996) infamous “Declaration of the Independence of Cyberspace,” in which he claimed the Internet was “a world that all may enter without privilege or prejudice accorded by race, economic power, military force, or station of birth ... a world where anyone, anywhere may express his or her beliefs, no matter how singular, without fear of being coerced into silence or conformity.” For a clear, concise history of the Internet’s architecture evolving from open to closed over two decades, please see Benkler (2016).

  9. In Chapter 6, "Spectacular Time," Guy Debord described spectacular time in part as the time spent consuming images; further, the "social image of the consumption of time is for its part exclusively dominated by leisure time and vacations: moments portrayed, like all spectacular commodities, at a distance, and as desirable by definition." The social image today of collective consumption of time is arguably of the pursuit of images; time is passed consuming spectacular media (1995, 112).

  10. Preciado walks us through Foucault’s conception of biopolitics as foremost a way “to speak of the relationship that power establishes with the social body in modernity. Foucault described the transition from what he calls a sovereign society, in which sovereignty is defined in terms of commanding the ritualization of death, to a ‘disciplinary society,’ which oversees and maximizes the life of populations as a function of national interest.” Preciado asks us to consider how discipline and punishment are enacted in the anesthetized technological theater of quarantine.

  11. As Wendy Chun masterfully describes, software updates introduce and obfuscate their content to such a degree, through insidious, opaque dark patterns, that users rarely notice or read—or have the space to analyze—what is being introduced. Over time, this relationship to software, in which users are actively incentivized to scroll past, has made the force of the click and swipe, the need to keep one's phone moving, a matter of powerful design, working as intended (2016).

Notes on contributor

Nora Nahid Khan is a writer of criticism. She is on the faculty of Rhode Island School of Design, Digital + Media, teaching critical theory, artistic research, writing for artists and designers, and technological criticism. Her most recent book is Seeing, Naming, Knowing (2019), on the impact of machine vision on criticism.

References

Barlow, John Perry. 1996. "Declaration of the Independence of Cyberspace." The Electronic Frontier Foundation. https://www.eff.org/cyberspace-independence.
Benkler, Yochai. 2016. "Degrees of Freedom, Dimensions of Power." Daedalus 145 (1): 18–32.
Campt, Tina M. 2017. Listening to Images. Durham: Duke University Press.
Chen, Adrian. 2020. Twitter, May 30, 2020. https://twitter.com/adrianchen/status/1266859149612072960.
Chun, Wendy Hui Kyong. 2002. "Othering Space." In The Visual Culture Reader, 2nd ed., edited by Nicholas Mirzoeff, 243–254. New York: Routledge.
Chun, Wendy Hui Kyong. 2016. Updating to Remain the Same. Cambridge, MA: MIT Press.
Debord, Guy. 1995. The Society of the Spectacle. Brooklyn: Zone Books.
"Exposure Notification API launches to support public health agencies." n.d. Google. https://blog.google/inside-google/company-announcements/apple-google-exposure-notification-api-launches/.
Lee, Ron. 2020. "How New Crowd Controls at Some City Parks Worked Out this Weekend." NY1, May 11. https://www.ny1.com/nyc/all-boroughs/news/2020/05/11/new-social-distance-enforcement-at-city-parks.
Leswing, Kif. 2020. "Three States Will Use Apple-Google Contact Tracing Technology for Virus Tracking Apps." CNBC, May 20. https://www.cnbc.com/2020/05/20/three-states-commit-to-apple-google-technology-for-virus-tracking-apps.html.
Mirzoeff, Nicholas. 2002. "Introduction to Part One." In The Visual Culture Reader, 2nd ed., edited by Nicholas Mirzoeff, 161–169. New York: Routledge.
NBC News. 2020. Twitter, May 30, 2020. https://twitter.com/NBCNews/status/1266758240018276352.
Preciado, Paul B. 2020. "Learning from the Virus." Artforum (May/June). https://www.artforum.com/print/202005/paul-b-preciado-82823.
Privacy-Preserving Contact Tracing. n.d. Apple and Google. https://www.apple.com/covid19/contacttracing.
Robbins, Christopher. 2020. "NYPD Makes Arrests for Social Distance Violations as More Officers Call Out Sick." Gothamist, April 3. https://gothamist.com/news/nypd-makes-arrests-social-distance-violations-more-officers-call-out-sick.
Robinson, Kim Stanley. 2020. "The Coronavirus Is Rewriting Our Imaginations." The New Yorker, May 1. https://www.newyorker.com/culture/annals-of-inquiry/the-coronavirus-and-our-future.
"Social Distancing in Black and White Neighborhoods in Detroit." 2020. Brookings Institution, May. https://www.brookings.edu/blog/fixgov/2020/05/19/social-distancing-in-black-and-white-neighborhoods-in-detroit-a-data-driven-look-at-vulnerable-communities/.
Southall, Ashley. 2020. "Scrutiny of Social Distance Policing as 35 of 40 Arrested Are Black." New York Times, May 29. https://www.nytimes.com/2020/05/07/nyregion/nypd-social-distancing-race-coronavirus.html.
Speri, Alice. 2020. "NYPD's Aggressive Policing Risks Spreading the Coronavirus." The Intercept, April 3. https://theintercept.com/2020/04/03/nypd-social-distancing-arrests-coronavirus/.
Spillers, Hortense. 1987. "Mama's Baby, Papa's Maybe: An American Grammar Book." Diacritics 17 (2): 64–81.
Toyama, Kentaro. 2020. "Violating Michigan Social Distancing Orders? Big Brother May Be Watching." Bridge Michigan, May 5. https://www.bridgemi.com/michigan-government/violating-michigan-social-distancing-orders-big-brother-may-be-watching.
Tufekci, Zeynep. 2020. "Keep the Parks Open." The Atlantic, April 7. https://www.theatlantic.com/health/archive/2020/04/closing-parks-ineffective-pandemic-theater/609580/.
Turner, Fred. 2006. From Counterculture to Cyberculture. Chicago: University of Chicago Press.
"Violating Michigan Social Distancing Orders? Big Brother May Be Watching." 2020. Bridge, May 5. https://www.bridgemi.com/michigan-government/violating-michigan-social-distancing-orders-big-brother-may-be-watching.
Wilk, Elvia. 2020. "What's Happening? Or: How to Name a Disaster." Bookforum, May 26. https://www.bookforum.com/print/2702/what-s-happening-24019.
Zuboff, Shoshanna. 2019. The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. New York: PublicAffairs.

Women & Performance