Algorithmic Pollution: Artists Working with Dataveillance and Societies of Control

Lisa Moren

Professor of Visual Art and Graduate Program Director of Intermedia + Digital Art, MFA Program at the University of Maryland, Baltimore County [UMBC]

The idea for Algorithmic Pollution: Artists Working with Dataveillance and Societies of Control came from the Washington Project for the Arts exhibition “CYBER IN SECURITIES” in Washington DC, August 30-September 27, 2013. The exhibition was curated just before the former CIA employee and government contractor Edward Snowden’s controversial leak of US security documents to the Guardian newspaper in June 2013, and opened that September. The leaked documents set off worldwide insecurities about data, including growing questions about how much is out there, the nature of government and corporate relationships to meta-data and private data, and how our lives are shaped by these matters. Such questions are especially unnerving in an era of algorithms and code’s inherent lack of transparency. This essay will look at artists who use types of dataveillance in order to creatively control their own social path.

This essay is also an examination of the effect “dataveillance” has on sorting people demographically [including racially], and of its potential effect on human behavior. This will be accomplished through descriptions of several artists’ projects in which data, surveillance, or dataveillance [an amalgam of the two] infiltrates human bodies and the landscapes where humans interact. Many of these artworks are from the exhibition “CYBER IN SECURITIES.” For instance, the surveillance artist and programmer David Rokeby coined the phrase “algorithmic pollution” to suggest the importance of invisible information as matter that affects public behavior, both physiologically and psychologically. His practice raises the question of how much control programmers are given in a society where dataveillance is the method of control. In order to tackle this question, this essay employs Michel Foucault’s famous theorization of prison inmate control through strategies of discipline, a process that trains bodies and minds to conform to predictable behaviors within enclosed spaces. This theory will be applied to algorithms and data systems of control in order to explain the more nomadic behavior of the contemporary societies of control described by Gilles Deleuze. Under this conceptualization, the primary means of controlling behavior is not the mechanization of enclosed spaces, as in a factory, but the control of open systems and networks, primarily provided by the internet and mobile phones or computers.

Above: Ricarda McDonald and Donna Szoke, “and all watched over by machines of loving grace”, Version 2: January 2012; (Version 1.5: Sept 2004), Interactive video installation, two flat screen monitors, computer, custom software, Kinect sensor, hdmi splitter, hdmi cables, 2 32” monitors, dimensions variable.
Below: Lexie Mountain “BALL HARD” Still of Dennis Williams performing at the opening of Cyber InSecurities, Pepco Edison Place Gallery, Washington Project for the Art’s Experimental Media. August 30, 2013. Image courtesy of Tim Nohe.

Foucault’s interpretation of Jeremy Bentham’s panopticon prison architecture is useful for describing the shift from the mechanics that architectures of power provide to the apparatus that software uses in order to control societies. The panopticon describes strict modes of control in which the relation between controller and controlled, surveyor and surveyed, the all-knowing-and-seeing and the one gazed upon, requires only that the one gazed upon have a hint that they’re being watched at any given time. With such hints, the institutional apparatus meets its objective of disciplined behavior. When the controller has fully intimidated the subject into believing there is no privacy, free will, or hope for transgression or resistance, then the inmate being watched will modify their behavior to conform to the will of their oppressor. [1] However, this requires the subject to believe that a power is always watching them, for “Power should be visible and unverifiable.” [2] The objective of the panopticon is total subjectification of the one being controlled, perhaps especially of his or her behavior. George Orwell’s surveillance novel 1984 describes subjectification of the gaze: “You had to live — did live, from habit that became instinct — in the assumption that every sound you made was overheard and, except in darkness, every moment scrutinized.” [3] Here, the goal of total subjectification is to modify behavior down to the instinctive habits and unconscious behavior of everyday life. In this spirit, Ricarda McDonald and Donna Szoke’s artwork “and all watched over by machines of loving grace” is a humorous interactive video installation constructed by the people viewing its dual-monitor display. Two monitors show a pair of extreme close-up eyes that track visitors as they pass, an Orwellian gaze reminiscent of an old painting whose eyes uncomfortably follow passersby.

Deleuze proposes that Foucault’s notion of discipline across all of life’s institutions doesn’t acknowledge the transitions between the institutions of work, healthcare, play, home, and vacation; for Deleuze, the varying travel between these institutions has become a more significant part of contemporary existence. [4] In order to maintain power, it’s necessary for the newer societies of control that replace Foucault’s disciplinary societies to control spaces once considered free from observation and traceability, such as the car or the highway. The newer societies of control understand that in addition to being a seeing machine, the panopticon’s enclosures are part of a categorizing machine, offering cells that are modular and can be sorted and arranged. [5] Therefore, in order to maintain power, it is necessary to create categories that will transform the “irrationality of the general population” [6] into some kind of cells, data cells or enclosed categories. Data collection institutions from search engines to marketing agencies are able to do this by collecting various types of data that can be ‘diagnosed’ and sorted into less permeable cells, from phone numbers to email subject headings. Once cells of behavior are labeled, or calcified in the form of data cells, control through code and algorithms may be better implemented.

Deleuze describes the shift in cells from a disciplined prison space to a controlled public space:

“The different internments or spaces of enclosure through which the individual passes are independent variables … the different control mechanisms are inseparable variations, forming a system of variable geometry the language of which is numerical [which doesn’t necessarily mean binary]. Enclosures are molds, distinct castings, but controls are a modulation, like a self-deforming cast…” [7]

Deleuze describes a parametric society where the focus of subjectification lies not inside the traditional institutions but inside a central power designed to regulate the more nomadic activity between institutions. What was once an architecture of fixed enclosure is now a system that is open, varied, and modular. Parametrics by nature are easily modulated by the authors of the system. In societies of control “…what is important is no longer either a picture or a number, but a code: the code is a password,” and passwords serve “for gaining access to social locations.” [8] For example, key codes managed by an administrative power may grant or delete access without any change to the hard key or institutional access point, as in the sketch below. In addition to creating or denying access to institutional enclosures, codes are informed by digital bread crumbs, data-trails and data-mining techniques, from “cookies” to bots and webcrawlers mining everything from on-line shopping to Google searches.
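A minimal sketch, in Python, can make Deleuze’s distinction concrete: access as a modulated code rather than a fixed lock. All names and values here are hypothetical; the point is only that granting and revoking access are data operations an administrator can modulate at any moment, with no change to the door itself.

```python
# Access as code: the "lock" is a table an administrator modulates at will.
class AccessController:
    def __init__(self):
        self.codes = {}  # code -> set of locations it may open

    def grant(self, code, location):
        self.codes.setdefault(code, set()).add(location)

    def revoke(self, code, location):
        self.codes.get(code, set()).discard(location)

    def may_enter(self, code, location):
        return location in self.codes.get(code, set())

ctrl = AccessController()
ctrl.grant("4821", "studio")
print(ctrl.may_enter("4821", "studio"))  # True: the code is the password
ctrl.revoke("4821", "studio")            # access deleted, the door unchanged
print(ctrl.may_enter("4821", "studio"))  # False
```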

Anthropomorphic surveillance is a similar tool in forensics, one that penetrates the surface of our bodily image into our biological beings and translates that penetration into code. CCTV lenses, communication technologies and social networking surveillance are compatible with data and can be manipulated, filtered, traced and cross-referenced through sensors, biometrics, chemical profiling and DNA surveillance. Societies of control use these methods, along with the power of algorithms, in order to maintain the broadest power. The meta-data analysis being done by scholar Lev Manovich (and others) reaches back to collection systems similar to the evolving U.S. government’s “mail cover” program designed to spy on domestic communists. Early iterations of this program in the 1950s asserted that it was legal to read the “to” and “from” area of a postal letter because such information was already publicly displayed on an envelope. [9] This program has evolved to include secret courts, the U.S. Patriot Act, and citizen-targeted surveillance originally intended for foreigners and war enemies.

DATA IS EVERYWHERE

Sheldon Brown “Video Wind Chimes” Computer Graphic rendition of installation using four video projectors, electronic controls, aluminum, plastic. Yerba Buena Center for the Arts in San Francisco, CA. 1994. Image courtesy of the artist.

Artists have been using media to explore the concept of information infiltrating our public spaces since the early 1990s. During this period, Sheldon Brown produced “Video Wind Chimes” as part of a public intervention work at Yerba Buena Center for the Arts in San Francisco. The project scans ambient electromagnetic fields and projects whatever information the waves are carrying onto the ground. In 1994 the projected images were mostly ads from live television, a manifestation of both the wind and its data. [10] Brown sought to install “Video Wind Chimes” on a remote part of California’s highest peak, pointing out that even in places where we believe we perceive our experiences with nature as pure, even perhaps sublime, data is everywhere.

The pervasiveness of surveillance has never been more central to our culture and appears to be growing exponentially. Consider the recent military-designed wide-area surveillance live-feed system originally called “Angel Fire” and used in Fallujah. In 2015-16 the system had been secretly capturing an aerial shot of one-third (32 square miles) of Baltimore City every second its surveillance aircraft flew overhead. The program cost the Baltimore City Police Department $2 million per year, and for nine months neither the Mayor nor the City Council was even aware it was happening. [11] The chief engineer of the project calls it “Google Earth with TiVo capabilities.” [12] However, this deployment of military surveillance on an entire U.S. city came only 12 years after the first city-wide surveillance system, called “City Watch,” was installed in Liverpool, UK. At the time, City Watch was larger than any public surveillance system in the world. It was so controversial that the City of Liverpool allowed any citizen to request a copy of its footage within 31 days. The point of such transparent surveillance implementation, later accompanied by a commitment to destroy captured footage after a month, was to ease the public imagination of any dark meaning behind the 242 closed-circuit television cameras strewn across the city streets.

Jill Magid “Evidence Locker” 2004. Released CCTV footage filmed by City Watch controllers and edited by the artist. Video projection Police Log #2887, Audio CD, 23 min. Commissioned for the Liverpool Biennial International, 04. Image courtesy of the artist.

In 2004, the New York-based artist Jill Magid spent a month in Liverpool creating two works that playfully subverted the CCTV footage. “Evidence Locker” was sourced from 12-and-a-half hours of footage the artist obtained of herself performing as a heroine; she named the Police Department as the film’s crew and director. She wore a red raincoat that could easily be spotted and tediously submitted 31 Access Request Forms, each detailing when, where, and how an incident occurred, before her requests for the video were granted. Magid cleverly wrote the reports as letters to a lover, which became a second work, “One Cycle of Memory in the City of L.” [13] Magid modulated the societies of control herself when she took control of the surveillance system the Police authorities were monitoring and told her own devised story. However, once the citizens of Liverpool were accustomed to the idea of 24/7 CCTV cameras, public access to the footage was cut off, prefiguring the rationale of Baltimore’s Police Department in not even informing the Mayor of its new tool. This represents a striking shift in surveillance normalcy in just twelve years.

The Force of Freedom with Dave Young “Telewar”, 2013, Book, 8 ½” x 5 ½”. Image courtesy of Lisa Moren.

The Dutch artist team The Force of Freedom followed another secret program, the U.S. drone program, and self-published their book on the so-called “telewar.” They also created the humorous video “Lexicon of a Drone War,” which cleverly uses the alphabet in order to display and label the semi-secret government jargon surrounding drones.

Chris Csikszentmihalyi “hunter hunter” Steel, motors, 68HC11 microcontroller, sensor and servos. 1992. Image courtesy of the artist.

The users of “Angel Fire” have hopefully not seen artist Chris Csikszentmihalyi’s “hunter hunter” project, an imaginary tool of the future that was actually created in 1992 in San Diego. Csikszentmihalyi was reacting to San Diego being the first city to implement speed cameras with automatic ticketing. “hunter hunter” is a small micro-controlled robot that senses the frequency, or pitch, of a 9mm gunshot. When it detects the sound and its spatial direction, the robot wobbles toward the sound and shoots a 9mm bullet back. The project was installed in a Chicago gallery without incident.

Since 1953, when the structure of DNA was first described, scientists have been on a path to sequence biology and computation into a unified data hybrid, known today through the A, T, G and C of DNA sequencing. [14] Dataveillance infiltrated human bodies when information artist Heather Dewey-Hagborg joined this path, merging her own biometric software with the digitally compatible parts of DNA she collected and extracted, in order to create her DNA surveillance work “Stranger Visions” in 2013.

Heather Dewey-Hagborg “Stranger Visions” 2012-2013, 3D prints, wood, photographs, paper, and sample material, Four face masks, each 7”x10”x8”, four sample boxes, each 3 ¾”x 11 7/8” x 9”. Image courtesy of Lisa Moren.

Dewey-Hagborg collected human remains (hair, nails, gum, cigarette butts, etc.) that had been discarded in public places. After laboriously extracting the DNA at GENSPACE, a DIY biotech lab in Brooklyn, she merged the significant information she learned from the genetic code about each owner’s race, gender, eye color, tendency for obesity and other traits with her biometric software. From this amalgamation she was able to produce a 3D model and render the output on a 3D printer. The project displays four anonymous masks along with the original detritus that produced them and information about its original location. Although the masks always appear to be about 25 years of age, questions about the accuracy of these unknown portraits may be alleviated by the reaction of NPR’s Studio 360 host, Kurt Andersen, when Dewey-Hagborg confronted him with a mask created from his DNA: “It’s like finding a brother of me and saying ‘I can see some of me in there.’” [15] The artist intends the work to be a call-to-action about the potential uses and abuses of genetic surveillance in public places. She suggests that if she can figure out how to create a relatively acceptable portrait from gum found on the streets using publicly available DIY methods, then large-scale DNA surveillance by governments and corporations surely must be on the horizon. [16]
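The step from genotype to portrait can be imagined as a mapping from a few genetic markers onto the parameters of a generic face model. The sketch below is a loose illustration of that idea only; the marker rules and model parameters are hypothetical placeholders and do not reproduce Dewey-Hagborg’s actual pipeline.

```python
# A highly simplified, hypothetical sketch of trait-to-parameter mapping:
# a few genotyped markers nudge parameters of a generic face model, which
# would then drive a parametric 3D head for printing.

base_face = {"eye_color": "brown", "jaw_width": 1.0, "nose_width": 1.0}

trait_rules = {  # invented rules; real phenotyping is probabilistic and messy
    ("HERC2", "GG"): {"eye_color": "blue"},
    ("ANCESTRY_MARKER", "high"): {"nose_width": 1.15},
}

def face_params(genotype):
    face = dict(base_face)
    for (marker, value), adjustments in trait_rules.items():
        if genotype.get(marker) == value:
            face.update(adjustments)
    return face

print(face_params({"HERC2": "GG"}))
# {'eye_color': 'blue', 'jaw_width': 1.0, 'nose_width': 1.0}
```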

Sheldon Brown, Jill Magid, Chris Csikszentmihalyi and Heather Dewey-Hagborg are artists who consider the increasing infiltration of dataveillance into our bodies and our environment, and/or the collection of that data. These artists aren’t using art tools, but the sanctified data-systems and technologies employed by large governments, corporations and their labs. They’re not only taking authorship of societal control methods that feel out of any individual’s control, but, in Brown’s and Dewey-Hagborg’s cases, they’re expressing the human capacity to manipulate these methods outside the goals of industry.

Owen Mundy, Birgit Bachler, Walter Langelaar and Tim Schwarz Commodify.us. 2013. Custom software and archival inkjet prints, printed matter. Four prints: 39”x13”, 77”x13”, 71”x13”, 43”x13”. Image courtesy of Lisa Moren.

Owen Mundy, Birgit Bachler, Walter Langelaar and Tim Schwarz Commodify.us. (detail) 2013. Custom software and archival inkjet prints, printed matter. Image courtesy of Lisa Moren.

While most meta-data and related algorithms appear anonymous, artists will look for a means to put societies of control in check when personal identity is compromised. The artist collaboration of Owen Mundy, Birgit Bachler, Walter Langelaar and Tim Schwarz openly displays the often hidden digital trail of a person’s portrait in their work “Commodify.us.” “Commodify.us” dismisses the legitimacy of ‘mail cover’ activity, which has grossly expanded in the digital age, since “the information associated with communication is often more significant than communication itself.” [17] The work shows how Facebook users can trace their own Facebook ‘mail cover’ in order to uncover their communication trends. The artists highlight how marketers use the legal mechanisms associated with ‘mail cover’ to sell your information to data-collecting marketing companies such as Acxiom Corp., which will aggregate your data with others’ as a marketing service. Acxiom is the largest international data-marketing corporation, holding scads of personal information on over 500 million people worldwide. [18] The artists went further than pointing out how participants can trace available data; they created a system where participants can learn how to earn money from their own data, demographics and social network habits, cleverly perhaps, before the marketing companies do it.

Preemptive Media (Beatriz da Costa, Brook Singer and Jaime Schulte) Swipe. 2003-2013. Custom software, monitor, tablet, receipt printer and website, dimensions variable. Image courtesy of Lisa Moren.

The artist team Preemptive Media (Beatriz da Costa, Brook Singer and Jaime Schulte) also works with the marketing strategies of passwords, codes and personal information in order to subvert their data-mining powers. The collective emphasizes that the drive to mine data in the first place is commercial gain; the fact that corporations allow the government to tap into their ingenuity is a regrettable bonus. In Preemptive Media’s project Swipe (2003-2013) the artists coded their own webcrawlers to capture a live data trail of exhibition visitors. Attendees are requested to scan their driver’s license at the door; early iterations used the magnetic strip while later installations used laser scans. In a few moments the attendee receives a receipt offering admission to the exhibition, an alcoholic beverage, and a textual portrait of instantly mined data that may include their race, gender, property value, income, body fat, and other seemingly private information, delivered in a public setting. Aggregate portrait information about exhibition attendees was summarized and updated live, displayed on an exterior monitor facing the street for passersby to probe the socio-economic portrait and other demographics of the exhibition culture. The irony that the exhibition space, Pepco Edison Place Gallery, is across from the National Portrait Gallery in Washington DC is not lost on this meta-portrait project either. Inside the gallery, the artists make an offer in the spirit of a free credit score: a kiosk allows a deeper penetration and revelation of one’s on-line data portrait. The climax of the game is distributed throughout the network apparatus of the artwork, inside, outside, and in the data moving virtually through the space. The viewer may be gratified at one pleasurable encounter with Preemptive Media’s services while experiencing horror or personal fear at another, more revealing node in the work (such as reading their probable income or body fat). Although creating pleasure mimics the skill of their marketing counterparts (Acxiom), fear, however, is the subversive moment of the closing narrative arc presented by Swipe, and fear is the antithesis of what Acxiom chooses to deliver.
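The mechanism Swipe dramatizes, a handful of identifying fields keyed against an aggregator’s records, can be suggested in a few lines. The sketch below is purely illustrative: the fields, the database and the matching rule are invented, and none of Preemptive Media’s actual software is reproduced here.

```python
# An illustrative join between scanned license fields and a stand-in
# marketing database, producing a "textual portrait" in the spirit of Swipe.

license_scan = {"name": "J. DOE", "zip": "20004", "dob": "1975-03-14"}

marketing_db = {  # hypothetical stand-in for an aggregator like Acxiom
    ("20004", "1975-03-14"): {"est. income": "$72,000",
                              "property value": "$410,000"},
}

def textual_portrait(scan):
    profile = marketing_db.get((scan["zip"], scan["dob"]), {})
    lines = [f"Welcome, {scan['name']}."]
    lines += [f"{field}: {value}" for field, value in profile.items()]
    return "\n".join(lines)

print(textual_portrait(license_scan))
```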

“Commodify.us” and Preemptive Media assert that data, and therefore the society of control, is potentially everywhere. The data is also personal and profitable. They also analyze who is increasing the value of personal data and how that value is being exploited; the marketing technologies pursuing such goals will invest in the algorithms that make the personal exploitation of data more and more robust. These media-activist artists use original code and low-fi, off-the-shelf media to re-center some of the power they feel has been taken from them by the corporations modulating too much control over their lives.

CAN ALGORITHMS ANTICIPATE THOUGHT?

Can we anticipate thought by reading action? With a goal of cutting down on unnecessary clicking, Google has evolved from a search engine that gives simple information “to a knowledge engine, where they can rank content according to intent rather than straight keyword matching.” [19] Using semantic sorting features such as “knowledge graphs” and “Google Instant,” Google attempts to anticipate users’ intentions through an autocomplete feature. [20] This knowledge engine was at play when the artist Julia Kim Smith Googled herself in 2013 in order to produce her one-minute video loop and exhibited still images entitled “Why?” Before Kim Smith finished typing her thought “Why do asian women,” the knowledge engine kicked in, anticipating her possible conclusions: “like black men,” “age well,” “wear masks,” and so on. In addition to the one-minute loop, Kim Smith mounted prints of Google screen captures listing the search results. The results are disturbing partly because there is no human intervention in this part of Google’s algorithm, as there is when the company censors pornographic and violent words. Algorithms are the mathematical rules, perhaps the formulated passcodes, that govern any software: a set of instructions similar to following a recipe (set the oven temperature, pour flour and sugar into a bowl, stir, whip milk for 3 minutes), all in a precise order, with measurements, timing, lists, etc. An algorithm may therefore be simple or complex; it can employ metadata and word searches as part of a common list of parallel procedures. An algorithm is arguably the main dataveillance tool used by societies of control. Under the algorithms applied to Google Instant, a common denominator of people is being tracked while they believe they are anonymously asking a search engine questions; a demographic majority, then, is asking these questions about Asian women. Google’s algorithm uses the majority of past questions to justify anticipating users’ intentions as they type. But if the aggregate data reflects bias, especially when its algorithms are applied to a minority, the presumed objectivity inherent in the “unmanned” algorithm sanctifies the anticipated result as simply true. We’re left with an unnerving question: just because an algorithm said it, does the action of the masses constitute truth?
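A toy version of frequency-based completion makes the point: the system has no opinion of its own; it simply replays whatever the logged majority typed before, bias included. This sketch is illustrative only; Google’s actual ranking is far more elaborate and not public.

```python
# Toy autocomplete: rank logged queries that extend a prefix by frequency.
from collections import Counter

query_log = Counter({  # invented counts standing in for aggregate behavior
    "why do asian women age well": 40,
    "why do asian women wear masks": 25,
    "why is the sky blue": 90,
})

def suggest(prefix, k=3):
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    return [q for q, _ in sorted(matches, key=lambda m: -m[1])[:k]]

print(suggest("why do asian women"))
# The "anticipation" is nothing but majority repetition.
```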

Julia Kim Smith “Why?” 2013. 24 archival digital prints, 4”x4” each. Image courtesy of Lisa Moren.

Julia Kim Smith “Why?” 2013. 24 archival digital prints, 4”x4” each (detail). Image courtesy of Lisa Moren.

Julia Kim Smith “Why?” 2013. Monitor, one-minute quicktime loop. Image courtesy of Lisa Moren.

Algorithms are a benefit when they free up time by doing tedious tasks for us, as in the applications that will scan your email, calendar, or GPS location and inform you that traffic may make you late for an appointment. [21] Societies of control have allowed the one surveyed to become more and more comfortable with the apparently unbiased approach an algorithm offers when it makes decisions on behalf of the “average person.” However, Julia Kim Smith’s “Why?” gives an ordinary yet emotional critique of the growing concerns around algorithms that use the majority opinion, or habits, of the masses as the basis for automatic access to information. “Why?” continued the experiment with Black Men, Gay Men, Christian Women, Muslim Men, etc., where the results more urgently point to the fact that under algorithmic systems of regeneration and self-reference, any minority viewpoint will be disproportionately, even exponentially, silenced.

DO ALGORITHMS SORT PEOPLE?

Over time, auto-formulas, including algorithms, eventually generate the same pattern or command. The result is a kind of predictable moiré pattern, a code that lacks differentiation, plummeting into an infinite mirror-like system. This is what happened when, in the absence of human oversight, algorithms allegedly caused the stock market to fall a thousand points in a few minutes, creating the ‘flash crash’ of spring 2010. [22] The lesson was that algorithms need to be rewritten every couple of weeks, have human intervention, or preferably both. Simply placing a hand or an object in the space between infinity mirrors will break the loop by creating a different pattern, or in this case, allow the stock market to do something other than auto-sell on a massive scale. Moving the mirrors slightly will do the same thing, but if any of these interventions are done in a routine or predictable pattern, the loop becomes determinant again. These systems require complexity. Like the minority voice, creative input or a resistance movement, complexity will keep a loop from becoming infinite, incestuous and ineffective. Algorithms similar to those that caused the flash crash are employed by government agencies, including police departments and the FBI, who have used tactics such as CRUSH, where the primary objective is to anticipate future crimes based on past activities. Because of the profiling inherent in these algorithms, the results were disproportionate arrests within minority communities, producing what David Lyon calls social sorting, a form of second-degree racism. [23] The African-American community calls this phenomenon systemic racism. But Deleuze’s society of control may explain why social sorting works so well: precisely because it can be modulated, or interpreted, by authorities alone, the codes “like a self-deforming cast will continuously change from one moment to the other, or like a sieve whose mesh will transmute from point to point…” [4]
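The degeneracy of a closed loop can be shown schematically: when each step’s rule reads only the output that the rules themselves produced, the system spirals until something outside the loop intervenes. The numbers below are purely illustrative and are not a model of actual market microstructure.

```python
# A schematic self-referential sell loop: a falling price triggers selling,
# which pushes the price down further. Purely illustrative values.
price = 100.0
for step in range(10):
    sell_pressure = max(0.0, 100.0 - price) * 0.5 + 1.0  # reacts to its own output
    price -= sell_pressure
    print(f"step {step}: price = {price:.2f}")
# Anything injected from outside the loop (a trading halt, a human override,
# a rewritten rule) is the "hand between the mirrors" that breaks the pattern.
```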

Referring to social sorting at airports, Zygmunt Bauman states: “The database is an instrument of selection, separation and exclusion. It keeps the globals in the sieve and washes out the locals.” [24] The intention to distinguish tourists from drifters at national borders is well known to the Bangladesh-born, New York City-raised artist Hasan Elahi, who willingly began posting his personal daily activities long before the social media phenomenon. The NSA, DHS, CIA and Executive Office of the President of the United States have all visited the artist’s site over the years. It all began on a return trip from the Netherlands on June 19th, 2002, when the artist was sequestered for questioning at the Detroit airport by DHS officials, presumably as a result of social sorting. “Who were you with… Why were you there?” But then the questions became stranger: “Where were you on September 12th 2001… September 10th… August 30th (and so on)… What is in your storage unit in Florida… are there explosives?” Elahi surprised them by pulling out his PDA and giving detailed answers about exactly what he was doing, the content of his meetings, and the people he was with. After six months of multiple, intense interrogations by the FBI, and nine consecutive polygraphs, the FBI confirmed his innocence and suggested that he inform them whenever he travels abroad in order to avoid further detentions (it’s hard to get off a watch-list database that’s been widely cross-referenced).

Elahi took this advice seriously and dutifully gave flight information in advance. This expanded to include images, and instead of becoming less detailed, his self-scrutiny became more detailed, until it evolved into Elahi’s signature project “Tracking Transience,” a database he now updates from his phone, controlling detailed documentation of his private life, including his 24/7 GPS location with maps, images, what he eats, where he sleeps, etc. This public exposé defies the inherent value system of Federal authorities, for whom such fragmentary and concrete information is their greatest commodity. It’s a commodity that Elahi cleverly renders useless through his act of making it freely available.

Hasan Elahi. Installation at “Cyber InSecurities” at Pepco Edison Place Gallery, 702 8th Street, NW, Washington, DC. Right to left: “Tracking Transience” website: www.trackingtransience.net (2003-present); “undisclosed location” 2013, chromogenic print, 60”x72”; “Hawkeye” 2013, chromogenic print, 60”x72”. Image courtesy of Lisa Moren.

In addition to the reverse surveillance project “Tracking Transience” (available through a tablet or at trackingtransience.net), Elahi exhibited “Hawkeye,” a 2012 photograph of the AT&T building at 611 Folsom Street in San Francisco. Elahi documented the building containing the now infamous National Security Agency (NSA) secret room that intercepts and copies 10% of the US internet traffic coming into the country from Asia for the NSA’s spying activities on US citizens, one year before Edward Snowden leaked the program to the press. The fact that an artist made a project on government spying activities prior to the Snowden revelations demonstrates the media’s inability to capture facts, or the public’s inability to appreciate them. A documentary on the spy program was available, but the press was only able to stir the public imagination on a grand scale by reframing the facts through a leaker’s persona, rather than through the objective investigative reporting that publicly aired in 2007. [25] “Hawkeye” was exhibited as a digital image next to “Undisclosed Location,” another reverse surveillance piece from Elahi’s own investigation browsing Google Maps. Through curiosity and intensive research, he found that an aerial view blocked by Google as an “undisclosed location” was actually former Vice President Dick Cheney’s home. Elahi bluntly reveals the inequities behind social sorting, where one social class is widely photographed and secretly traced, and another class is exempt from the sorting process: not being detained, or not being charged if one is a “good boy having fun” or “a good swimmer,” having another safety net, or holding the prestige and power that comes with having your home erased from Google Maps. [26,27]

Algorithms mimic human systems that cast a wide net defining crime, then sort out the desirables and let them pass through, while leaving behind underclasses and, often, minorities. [28] In addition to such second-degree and systemic racism, social sorting is escalated when combined with the data-mining world of algorithmic errors. The consequences of errors and excessive police violence against citizens are at the core of the artist team Channel TWo (CH2)’s site-specific project “PolyCopRiotNode_DC.” Inspired by narratives reported by sources such as the Associated Press, “PolyCopRiotNode_DC” specifies incidents where police and government officials have coordinated excessive force against unarmed civilians in the privacy of their homes in Washington DC. Examples of excessive force in the Washington DC area include multiple pets shot and killed in their homes, thousands of dollars of damage to private property, and an unarmed man who became paralyzed because a Police Corporal shot him in the back, damaging his spine. To illustrate the geographic view of these reports, the artists created an augmented reality app that uses the built-in GPS and camera of a smartphone or tablet. When the viewer holds up a phone or tablet, “PolyCopRiotNode_DC” generates an animation of a PolyCop layered over the ordinary camera view. The viewer can follow the augmented reality until they arrive where the “home invasion” allegedly occurred; the closer the viewer is physically to the node (street intersection), the more massive the PolyCop becomes.
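The proximity effect the app relies on is a simple relation between GPS distance and render scale. Below is a minimal sketch of that logic; the coordinates and scaling constants are hypothetical, not (CH2)’s actual values.

```python
# Sketch: scale an AR overlay inversely with GPS distance to a stored node.
import math

def meters_between(lat1, lon1, lat2, lon2):
    # Equirectangular approximation, adequate at city-block scale.
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(dx, dy)

def overlay_scale(viewer, node, max_scale=10.0):
    d = meters_between(*viewer, *node)
    return min(max_scale, 100.0 / max(d, 10.0))  # the closer, the more massive

node = (38.8997, -77.0225)  # a hypothetical street intersection
print(overlay_scale((38.9030, -77.0225), node))  # blocks away: a small figure
print(overlay_scale((38.8998, -77.0225), node))  # nearly on the node: looming
```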

Channel Two (CH2): Jessica Westbrook and Adam Trowbridge with Jesus Duran, “PolyCopRiotNode_DC” 2013, Custom software and augmented reality app for smartphones and tablets. Street scene outside Pepco Edison Place Gallery, 702 8th Street, NWW, Washington, DC. September 2013. Image courtesy of Lisa Moren.

This project predates “Pokémon Go,” and does so without a glint of that product’s nostalgia. The digital memory in “PolyCopRiotNode” appears to be less a data collection system than a deployment of actors revolving around data. Although PolyCops may be added to countless known nodes, (CH2) merely anticipated the explosion of citizens controlling the media coverage of excessive police violence, including murders, against innocent citizens. The most notorious examples include Ramsey Orta using his cell phone in 2014 to document the murder of Eric Garner in Staten Island; the murder of twelve-year-old Tamir Rice, captured on surveillance video in Cleveland, OH, later that year; Diamond Reynolds live-streaming the death of Philando Castile on Facebook in Falcon Heights, Minnesota, in 2016; and the multiple bystanders who recorded the death of Alton Sterling at the hands of police in Baton Rouge, LA. These extreme cases of social sorting point to a reverse surveillance by ordinary citizens who have access to the technologies typically used, and modulated, only by the societies of control.

The performance and installation artist Lexie Mountain focuses on the specific aspect of social sorting that inverts the cultural currency of “security.” She points to the irony that certain groups of Americans enjoy the rights and freedoms of security while another population, disproportionately populations of color, lives in terror and insecurity. She cites Eric Holder’s popular statistic that the prison population has risen 800% since 1980. [29]

Lexie Mountain “BALL HARD” Performance still Dennis Williams, Joseph Mitchell and William Hubbard performing at the opening of Cyber InSecurities, Pepco Edison Place Gallery, Washington Project for the Art’s Experimental Media. August 30, 2013. Image courtesy of Tim Nohe.

Her project “Ball Hard” at Cyber InSecurities begins with the ironic insecurity often felt when encountering someone sporting a black t-shirt with white block letters reading “SECURITY.” In the performance, three African-American security guards, Dennis Williams, Joseph Mitchell and William Hubbard (actual security guards from a mall called “Security Mall” in Baltimore), work the opening of Cyber InSecurities incognito. At a pre-determined moment, the performers crack their assignment and collapse to the floor, moaning and forming a ball-like display in a pseudo-verbal expression of their personal insecurities. One of the performers doesn’t feel comfortable falling, and instead blurts out his insecurities; the two other performers then help the third stand up while gallery attendees look on, dismayed. Mountain’s reverse surveillance, where the mostly White crowd at a D.C. opening probably believed these men were actually security guards on duty, flipped the scene into an alarming spectacle that unsorted the social order of the gallery. In naming the project, Mountain refers to Ravens linebacker Terrell Suggs’s remark about “Ball So Hard University” and similar phrases by Jay-Z and Kanye West. [30] Mountain says she was initially “thinking of the homonym ‘bawl hard’ and how access to personal security and stability is limited culturally, legally and institutionally to various social groups.” [31] Similar to the emotional arc of Preemptive Media’s “Swipe,” the emotional arc of “Ball Hard” inverts the “security guards” from invisible guard to part of the esteemed artwork adorning the walls, and the uncomfortable center of social control.

Using the glitch, rhizomatic and sfumato surveillance aesthetic, Mountain created an additional eight-foot-tall glitch portrait of one of the security guards. The print was showcased in the exterior window (next to the demographic portrait of exhibition attendees). The low-fi, data-enhanced artifact, with CCTV bits and matter visible in the image (a phenomenon gone viral after 9/11 and popularized in reality TV), makes the error, the raw, and what some perceive as the real electronics visible. It’s the society of control’s image of itself in all its flaws and honest language. But like the glitch image, the sfumato surveillance footage is a jittery fragment of the original event; the audience is left to imagine that the physical presence between glitch and image holds a mistake that’s not likely to be what it appears.

Lexie Mountain “BALL HARD” permanent c-print of William Hubbard, 44” x 86”. 2011. Image courtesy of the artist.

The glitch lives as an interesting manifestation of a flaw, a digital document showing that the image is actually a concrete object of electronic waves, bits and light projections. It can be romanticized as stylistic low-fi or politicized. For Mountain and (CH2), while the glitch is technically an error, it points to a spatial question, a fleeting moment where technology fails but may bring the viewer closer to what Henri Bergson calls the “mobility of words.” Here the words are the technology; Bergson is distinguishing how the intellect separates our linguistic thoughts from the instinctual concrete objects in our attention. The goal of the “mobility of words” is to string together multiple concrete things in order to build a world of ideas. In 1907 Bergson’s world of ideas referenced the virtual objects of geometry, which further conceptualized his symbolic theory and “extends further and further the knowledge of the external properties of solids.” [32] Like the “mobility of words,” virtual solids become not their concrete original but a reflection of our idea, our fleeting moment defining its “thingness,” solidifying it and melding it with other thingnesses. While this emerging semiotic process appears liberating in both concept and content, Bergson also suggests that if symbolic language were used exclusively “it could only lead… to some other mode of analyzing of the living being, and so to a new discontinuity—although less removed, perhaps, from the real continuity of life.” [33] His “continuity of life” is the organic, unfixed complexity of nature that’s unknowable in its entirety but missing from fleeting descriptions of language, which inevitably codify objects into an over-simplistic word or concept. McLuhan describes Bergson’s notions of the “mobility of words” and “virtual solids” firmly as a technology when he says: “Language does for intelligence what the wheel does for the feet and the body. It enables them to move from thing to thing with greater ease and speed and ever less involvement. Language extends and amplifies man but it also divides his faculties.”

Data cells within societies of control may also be considered a technological platform able to move from ‘thing to thing’ with greater ease and speed but ever less involvement. For instance, fragments of one’s private life may give a fleeting fact without documenting the more complex “continuities of life,” including purpose, intentions and other more challenging determining factors. While the latter factors are less evident in most surveillance systems, they may ultimately play a significant role in naming a person’s innocence or guilt. If we believe that an algorithmic recipe is akin to Bergson’s “mobility of words,” then we may consider his description of what happens when a system exclusively uses those rules as its sole advancing mechanism: “it could only lead, on deeper study, to some other mode of analyzing of the living being, and so to a new discontinuity—although less removed, perhaps, from the real continuity of life. The truth is that this continuity cannot be thought by the intellect while it follows its natural movement…” [34] In other words, the intellect eventually needs a mechanism other than itself. He continues his critique of using language to solve linguistic problems: “We are amazed at the stupidity and especially at the persistence of errors. We may easily find their origin in the natural obstinacy with which we treat the living like the lifeless and think all reality, however fluid, under the form of the sharply defined solid.” [35] Similarly, the lifelessness of the algorithmic loop requires modes other than more algorithms in order to avoid systems that lead to less involvement, the antithesis of deeper analysis and problem solving, especially when dealing with the complexity of dataveillance and the potential authority lost by the individuals being watched, tracked or mined for their data.

CAN ALGORITHMS POLLUTE?

David Rokeby is a surveillance artist and programmer who analyzes data through his algorithmically driven interactive art installations. For 30 years the artist has observed thousands of viewers behaving within his interactive environments, especially “Very Nervous System” 1982-1991 (VNS), the project that translates motion, or human gesture, into sound via a surveillance camera. One may tweak a fine gesture of a moving finger, or perform a full-on choreographed dance, in order to control the music-like sound system as if the participant’s body were an instrument. Rokeby has observed that participants often move through a series of predictable reactions, such that he can anticipate the participant’s next action, if not their thought, by observing their gesture within his system. [36] In his concept of algorithmic pollution, Rokeby looks at how the algorithm itself is responsible for producing a coded behavior in the participant. This is where the gaps between Bergson’s “mobility of words” and the all-knowing “continuity of life” begin to appear. For instance, when Rokeby was initially interacting with VNS, he had the sensation that the system was reacting before he made a gesture. He knew this was impossible, but later asserted that the delay of consciousness is slower than an instant (about 1/10th of a second). If VNS responds in 1/30th of a second, as he says, then Rokeby heard a sound before his mind signaled to him that he had made the gesture. [37] These incompatibilities, glitches, gaps and miscommunications in code may be difficult or impossible to quantify but are real within human perception.
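The general technique behind camera-driven instruments like VNS, frame differencing mapped onto a sound parameter, can be suggested in a few lines. This is a generic textbook sketch, not Rokeby’s code; the thresholds and mapping are invented, and a real camera loop and synthesizer are assumed to surround it.

```python
# Sketch: motion energy from frame differencing drives a pitch.
import numpy as np

def motion_energy(prev_frame, frame):
    # Mean absolute pixel change between consecutive grayscale frames.
    return float(np.mean(np.abs(frame.astype(int) - prev_frame.astype(int))))

def gesture_to_pitch(energy, base_hz=220.0):
    # Map motion energy to a pitch; near-stillness is silence.
    return None if energy < 2.0 else base_hz + 20.0 * energy

# At 30 frames per second the system reacts in ~33 ms, while conscious
# awareness of one's own movement lags roughly 100 ms -- hence Rokeby's
# sensation that the sound arrived before the gesture.
prev = np.zeros((120, 160), np.uint8)
cur = np.full((120, 160), 12, np.uint8)
print(gesture_to_pitch(motion_energy(prev, cur)))  # 220 + 20 * 12 = 460.0 Hz
```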

David Rokeby “Very Nervous System” 1983 - 1991 (original VNS system). Interactive audio installation, custom software, camera, sound system, dimensions variable. Image of the artist performing VNS in the streets of Potsdam, Germany for the Potsdam 1000th anniversary, 1993. Photograph by Lambert Blum, image courtesy of the artist.

Rokeby’s early theory is based on the “loop”, that self-reflective mirror where feedback reflects back on humans and becomes internalized utilizing our perceptions in ways we don’t fully understand. [38] Hence the title “Very Nervous System” which may, in part, explain the same perception of instantaneous feedback that makes people so engaged, and addicted, to video games. McLuhan writes: “…all technologies are extensions of our physical and nervous systems to increase power and speed” [39] and, “Any extension, whether of skin, hand, or foot, affects the whole psychic and social complex.” [40] David knows that unquantifiable perceptions, nervous systems will be filtered out of any software system when he says:

We start to wear the behavior of the system internally and it starts to become part of ourselves and we lose our ability to differentiate ourselves from the system that’s surveying us until we start to wear its effects internally. What happens in these feedback loops is that we start a process of sorting, selecting and filtering; not every part of the feedback loop will connect with us, certain things within this odd relationship are enhanced while others start to disappear. I’m interested in those that are enhanced but especially in the things that disappear. [41]

For three main reasons, he questions, from a critical and cultural point of view, the social power that we have granted the programmer. He asserts that algorithms are the authoritative mechanism that governs our society of control, and that the programmers who write these controls are massively under-examined. First, the best programmers often work for corporations or the government and are asked to resolve the most challenging tasks under tight time constraints, from the conveniences that govern our lives to issues of security and surveillance. Second, media critics, philosophers and social scientists find it challenging to contextualize the larger meaning of the programmer’s work when they haven’t studied the complex depths of its operations, and even industry professionals have trouble analyzing each other’s code. Third, computers are exclusively quantifiable, consistent and persistent. In the process of sorting, selecting and filtering, an algorithm will use quantifiable data and dismiss unquantifiable data. [42] It’s in this final phase that we erroneously trust algorithms to govern our environment and our lives, private and public, our work and shopping; and if these emerging behaviors are being internalized only from what is quantifiable, then we’re creating a type of pollution, or as Rokeby says, an “algorithmic pollution.” This type of pollution, Rokeby asserts, is a health issue that invades not only our relationship with public and private space, but also our nervous system, consciousness and therefore external behavior. [43]

David Rokeby “Giver of Names” 2002-2013, Interactive video installation, custom software, camera, iMac, projector, found objects, and misc. equipment, dimensions variable. Image courtesy of Lisa Moren.

An example of a computer surveillance-type system that incorporates the audience’s perception, as part of what is unquantifiable, into Rokeby’s own algorithm is entitled “Giver of Names.” The “Giver of Names” is a language system that sees an object and names it. It exploits Bergson’s notion of the “mobility of words,” and in a nod to Duchamp, the project invites spectators to place an object on a pedestal where the system analyzes its color, texture, shape, scale and relationship to other objects on the pedestal. Rather than use a dictionary or encyclopedia to ‘teach’ the system, Rokeby considered complex issues similar to Bergson’s “continuity of life” concept and scanned scores of works of classical literature (such as “Moby Dick”), so that a richer inter-textual relationship could be established. For instance, if random objects that include a yellow rubber ducky are placed on the pedestal, “Giver of Names” may respond both audibly and on a monitor:

After another quarter day, across from many more of the shells, a brownish-yellow bath toy, on the left side of the one lake-water green inlet, will burn down all cresson houses, of the fighter, who has taken out this blood. Or Lemons more eyeless than other beady sectors would pardon no optical drops

In a display of live-action algorithmic processes, Rokeby exploits the viewer’s free association between word and object as part of the piece. Everyone has agency. The “Giver of Names” is unlike the structure of Google, which constantly seeks to improve its literal and clunky responses; here, with scores of pages of code, Rokeby poignantly points to the machine’s quaint lack of consciousness. The awkward and paradoxical grammar is what he so cleverly exploits. While exclaiming lyrical phrases, the “Giver of Names’s” own type of “knowledge engine” exposes the prejudicial flaws innate within the algorithms themselves and turns them into a subversive and poetic asset: again, an example of an artist accessing and modulating the society of control’s flaws, but here the flaw is closely linked to the very nature of the algorithmic code on which power depends.
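The general shape of such a system, perceptual features indexing into a literary corpus with the hits strung into a sentence, can be caricatured briefly. The corpus, features and phrasing below are invented stand-ins; Rokeby’s actual system is vastly richer.

```python
# A crude caricature: object features pull words from a literary corpus.
import random

corpus = {  # imagined word lists harvested from scanned classic literature
    "yellow": ["lemons", "sulphur", "amber"],
    "small": ["beady", "trifling", "minnow-like"],
    "smooth": ["lake-water", "ivory", "oiled"],
}

def name_object(features, seed=0):
    random.seed(seed)  # deterministic for the example
    words = [random.choice(corpus[f]) for f in features if f in corpus]
    return "A " + " ".join(words) + " thing, more eyeless than its sectors."

print(name_object(["yellow", "small", "smooth"]))
```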

Prejudicial flaws, simplistic codifying processes, innate bias and social sorting are natural to computers in their sorting and filtering of meta-data, and especially to the complex rule-based instructions of algorithms. Many systems are developed by the agencies with the most resources and then retrofitted for other causes, such as military surveillance designed for enemies in Fallujah being turned on the people of Baltimore. What are the gaps in the code of “Angel Fire” that, when re-directed, will prejudicially disrupt the irony of security Lexie Mountain presents? The same rule-based errors employed in Julia Kim Smith’s “Why?” and (CH2)’s “PolyCopRiotNode,” and demonstrated in the flash crash, inspire artists to point to the fact that data is everywhere, and that meta-data and algorithms are infiltrating complex systems built by highly resourced corporations, government and military agencies working under high expectations, short deadlines and little oversight. Furthermore, these codes don’t always perform with the same goals the public hopes for, especially minority publics. Finally, Rokeby goes one step further by suggesting that these devices and codes are amalgamating with our nervous system and affecting our very psyche, and thereby our very being and behaviors.

IMAGE PROJECTS

Taylor Hokanson “Palimpsest #2” 2013, Steel, aluminum, motors, electronics, and found typewriter roller, Dimensions variable. Image courtesy of Lisa Moren.

More nostalgically, the artist Taylor Hokanson’s kinetic installation “Palimpsest” is reminiscent of the analog collection of memory and data used by spying agencies when wire-tapping was slow and human control was high (and needed a court order). The unique mechanical device slowly scans the roller of a discarded typewriter, and the incidental text is amplified and displayed on a live monitor. Several images exhibit captured texts, fragmentary truths about a previous environment whose meaning we can now only guess at: “each department… needed… plastic back bones… a white… folder frame… 60619.” This still fairly recent style of data-mining stands in stark contrast to meta-data-mining, which exponentially expands its archive-building capabilities and thereby cultural memory itself.

WhiteFeather “Parent Folder”, 2012-2013, Mixed and digital media, text, Dimensions variable

The reality of rebuilding memory from all the social media data and surveillance footage one can consume proves a daunting task in the multi-media installation “Parent Folder” by Canadian artist WhiteFeather. WhiteFeather has been estranged from her father since she was 4 years old, an age when memories move from utter fragmentation to narrative snippets. Her father escaped to a remote Pacific island nearly 35 years ago in order to recreate his familial bond with the land, but on October 23rd, 2012, he allowed her access to his life through a surveillance camera and an on-line archive called “Parent Folder.” The multi-media installation, with a stop-motion video on a monitor and headphones, invites the viewer to rest their head on a pillow and read the log that describes the artist’s daily relationship with the archived surveillance footage and occasional Facebook postings (he doesn’t do email). For instance, on hearing her father and a woman discuss the surveillance camera, WhiteFeather writes: “I wonder how long it will take for them to forget that I am watching.” Living in a stop-motion relationship leaves an addicted voyeur constantly yearning for the murky narrative to become clearer. But will higher resolution or more frames per second bring the daughter closer to her father? In the reverse of the usual arrangement, the daughter surveys the banality of her distant parent, guessing at his thoughts and likely wondering the big questions: Who is he? Where am I? Can WhiteFeather ever anticipate her father’s love by observing his surveilled behavior?

CONCLUSION

The panopticon teaches us that, in order for behavior modification to be successful, the inmate, or institutionalized being, needs to believe they are being constantly surveyed. One method is to establish the impression of a 24/7 relationship between the all-knowing, all-seeing and the one gazed upon; the surveyor and the surveyed; the controller and the controlled. In Deleuze’s society of control this literal or figurative prisoner is now the citizen, not in an architectural cell or office but moving through the world, being “pinged” while connecting and building data cells, as in the “cloud,” where digital data crumbs collect while working, playing, and shopping. Deleuze’s model may be summarized as a parametric society in which the controllers are the programmers, who, if not in charge of directing the parameters, at least define the nuances of the algorithms. These methods are less about the direct relationship between surveyor and surveyed than about the relationship between the controller modulating a program and the citizen being watched, tracked, and mined for data. This transference from the panopticon to the parametric society of control is what Rokeby identifies as the polluting factor of algorithms, affecting our behavior even when we are in spaces that seem in between institutions, such as public environments.

Many artists have responded to massive data mining and the residual scrutiny of their private lives by creating seemingly invisible art in which they have more direct access to the modulating powers of the societies of control, under their own rules. They reject the infiltration and behavior modification inflicted on them, and instead use algorithms, bio-surveillance, and dataveillance to expose and undermine the authorities that presume to have power over our lives, in humorous, poetic, or frightening ways. Without control over our “continuity of life”, with devices, code, and algorithms physically filtering through our bodies and perhaps the nervous system itself, we may “treat the living like the lifeless and think all reality, however fluid, under the form of the sharply defined solid. We are at ease only in the discontinuous, in the immobile, in the dead. The intellect is characterized by a natural inability to comprehend life.”[44] While the NSA, FBI, CIA, Federal Intelligence Surveillance Court, and other government bodies look for truth like a moving target in a fragmented puzzle, these artists purposefully let meaning fluctuate as it moves between matter, media, and networks. Using the tools of power, such as GPS tracking, data-mining, meta-data, glitch and surveillance aesthetics, augmented reality, algorithms, CCTV, drone lexicons, sensors, DNA surveillance, and reverse-surveillance tactics, these artists question the location of the art itself, disassembling how societies of control use algorithms and data cells to label and frame “alternative truths” [apologies for the pun] about the citizens they survey and data-mine.

References

  1. Foucault, Michel. Discipline and Punish: The Birth of the Prison. London: Penguin, 1977. 195-215. Print.
  2. Simon, Bart. “The Return of Panopticism: Supervision, Subjection and the New Surveillance.” Surveillance & Society 3.1 (2005): 3. http://ojs.library.queensu.ca/index.php/surveillance-and-society/issue/view/Open%202005.
  3. Welch, Robert. George Orwell, Nineteen Eighty-Four. London: Longman, 1986. Print. P. 3.
  4. Deleuze, Gilles. “Postscript on the Societies of Control.” October 59 (Winter 1992): 3-7.
  5. Bogard, William. “Discipline and Deterrence: Rethinking Foucault on the Question of Power in Contemporary Society.” The Social Science Journal 28.3 (1991): 327. www.isi-dl.com/downloadfile/123318
  6. Simon, 8.
  7. Deleuze, 4.
  8. Simon, 15.
  9. “Mail Cover.” Wikipedia. https://en.wikipedia.org/wiki/Mail_cover.
  10. Brown, Sheldon. “Video Wind Chimes by Sheldon Brown.” Video Wind Chimes. 1992. http://www.sheldon-brown.net/vwc/index2.html.
  11. Reel, Monte. “Secret Cameras Record Baltimore’s Every Move From Above.” Bloomberg Businessweek. August 23, 2016. https://www.bloomberg.com/features/2016-baltimore-secret-surveillance/.
  12. Ibid.
  13. Magid, Jill. “Evidence Locker.” Jill Magid. Accessed September 04, 2015. http://jillmagid.paas.webslice.eu/projects/evidence-locker-2.
  14. Dyson, George. Turing’s Cathedral: The Origins of the Digital Universe. New York: Pantheon Books, 2012. P. 9.
  15. PRI. “Studio 360: Kurt Andersen’s DNA Mask Revealed.” PRI Public Radio International. February 07, 2013. http://www.youtube.com/watch?v=u1jgkykLBVI. 1:18min
  16. Dewey-Hagborg, Heather. “Stranger Visions.” Lecture, CYBER INSECURITIES PANEL, Pepco Edison Place Gallery, Washington DC, September 2013.
  17. Risen, James, and Eric Lichtblau. “How the U.S. Uses Technology to Mine More Data More Quickly.” The New York Times. June 08, 2013. http://www.nytimes.com/2013/06/09/us/revelations-give-look-at-spy-agencys-wider-reach.html.
  18. Mobertz, Lauren. “This Marketing Firm Just Released Your Personal Data – but Is It Enough?” The Dash Burst. June 06, 2013. https://blog.dashburst.com/acxiom-releases-personal-data-information/.
  19. Morran, Chris. “Google Update Begins Transition From Search Engine To ‘Knowledge Engine.’” Consumerist, May 16, 2012. https://consumerist.com/2012/05/16/google-update-begins-transition-from-search-engine-to-knowledge-engine/.
  20. Ibid.
  21. “Daily Report: Apps That Anticipate Your Needs.” The New York Times Bits Blog, July 30, 2013. http://bits.blogs.nytimes.com/2013/07/30/daily-report-apps-that-anticipate-your-needs/?_r=0.
  22. Hickman, Leo. “How Algorithms Rule the World.” The Guardian. July 01, 2013. https://www.theguardian.com/science/2013/jul/01/how-algorithms-rule-world-nsa.
  23. Lyon, David. Surveillance Studies: An Overview. Cambridge: Polity Press, 2008. Pp 94-118.
  24. Bauman, Zygmunt. Globalization: The Human Consequences. New York: Columbia University Press, 1998.
  25. Frontline. “Spying on the Home Front: The NSA’s Eavesdropping at AT&T.” PBS, May 15, 2007. http://www.pbs.org/wgbh/pages/frontline/homefront/view/.
  26. Dunlap, Tiare. “Ryan Lochte Is a ‘Really Good Guy’ Who Made an ‘Amazingly Stupid Decision,’ Says Olympic Medalist Rowdy Gaines.” PEOPLE.com. 2016. http://www.people.com/people/package/article/0,,20996464_21025620,00.html.
  27. Herhold, Scott. “Herhold: Brock Turner Deserves County Jail, Not State Prison, for Stanford Sex Assault.” East Bay Times. 2016. http://www.eastbaytimes.com/2016/06/01/herhold-brock-turner-deserves-county-jail-not-state-prison-for-stanford-sex-assault/.
  28. Lyon, 94-118.
  29. Holder, Eric, Attorney General. Remarks delivered at the Congressional Black Caucus Foundation Criminal Justice Issues Forum, September 19, 2013.
  30. “Lexie Mountain’s Portfolio.” Baker Artist Portfolios, 2014. http://bakerartist.org/nominations/view/alexandramacchi/105393/.
  31. Ibid.
  32. Bergson, Henri. Creative Evolution. Translated by Arthur Mitchell. New York: Random House, 1944. P. 168.
  33. McLuhan, Marshall. Understanding Media: The Extensions of Man. London: Routledge & Kegan Paul, 1964. P. 79.
  34. Bergson, 162.
  35. Ibid, 165.
  36. Rokeby, David. “Algorithmic Pollution: Artists Working with Data, Surveillance and Landscape.” Panel, College Art Association, Washington DC, February 6, 2016. Moderated by Ingrid Bachmann and Lisa Moren, with panelists Dr. Tiffany Holmes, Dana Dal Bo, David Rokeby, and Jason Hoelscher.
  37. Ibid.
  38. Rokeby, David. “Transforming Mirrors: Subjectivity and Control in Interactive Media.” In Critical Issues in Electronic Media, edited by Simon Penny, 133-58. Albany, NY: SUNY Press, 1995.
  39. McLuhan, 90.
  40. Ibid., P. 4.
  41. David Rokeby. “Algorithmic Pollution.” Lecture, CYBER INSECURITIES PANEL, Pepco Edison Place Gallery, Washington DC, September 2013.
  42. Ibid.
  43. Rokeby, Washington DC.
  44. Bergson, 165.

Biography

Lisa Moren is an artist who creates hybrids of emerging digital and ancient methods as interactive installations, works on paper, and public interventions. Her environmental works and experimental narratives have been exhibited widely, including at the Chelsea Art Museum, Cranbrook Art Museum, Ars Electronica, and the Akademie der Kunste, and she has worked with the Artists Research Network at LaTrobe University, Melbourne, Australia. She is a Fulbright Scholar, a recipient of a National Endowment for the Arts award, a multi-year recipient of Maryland State Arts Council awards, a recipient of a CEC ArtsLink International award, and a Janet and Walter Sondheim semi-finalist. She has been an artist-in-residence at STEIM, Amsterdam; the Banff Centre for the Arts, Canada; and the Imaging Research Center at UMBC. Her writing has appeared in Performance Research, Visible Language, and Inter Arts Actuel, and her books include “Marbleized Oil from the Gulf of Mexico,” “Intermedia,” and, for Issues in Contemporary Theory, the volume accompanying the exhibition she curated, “Command Z: Artists Working with Phenomena and Technology.” Lisa Moren is currently a Saul Zaentz Innovation Fellow for her collaborative augmented reality app with Jaimes Mayhew, Martin Bricelj Baraga, and Neja Tomšič, “NONUMENT 01: The McKeldin Fountain.” For more information see: lisamoren.com.