School of Creative Media, City University of Hong Kong
The project discussed here, TechnoSphere 2.0, uses Augmented Reality (AR) in combination with 3D printing to create blended online/offline worlds where people interact with virtual creatures. TechnoSphere 2.0 is inspired by the ALife artwork TechnoSphere, made by the author in collaboration with computer graphics programmer Gordon Selley. The 1995 web-based version had over 100,000 unique users who made over a million creatures, which was significant for an independent Internet artwork of that time. That version supported complex real-time interaction between creatures within the ALife world, which was housed on a PC. Most users accessed our site via dial-up Internet connections with top speeds ranging from 28.8Kbps to 33.6Kbps. The combination of the number of users and low bandwidth made it impossible for us to display real-time interactions between creatures graphically to online subscribers. The interactions that people had with our ALife creatures (the ‘beasties’) were limited to a 2D web interface through which beasties were named and designed, and then sent intermittent text postcards ‘home’ reporting key events in their lives. Some users set up fan pages to discuss the project, tell stories about the beasties and develop what we might now call small social networks. In 1999 flight simulation expert Mark Hurry joined us to develop a real-time 3D installation for what is now the Moving Image Museum in the UK (see figure 1). The museum version, limited to two simultaneous users, allowed visitors to create creatures on the spot and showed tens of thousands of creatures interacting in real-time in a 3D terrain. Visitors could follow any creature with a virtual camera or fly freely through the landscape and see creatures moving around and interacting with one another.
However, in both versions of the 1990s TechnoSphere, the relationship between people and the creatures that they designed was restricted such that, once created, no creature could be altered or affected further by human participants. The boundary between the online ALife world and the offline human world was clearly defined. By contrast, and partly in response to the frustrations of this limited interaction, our designs for TechnoSphere 2.0 focus on humans interacting with nonhuman ALife virtual beasties and blending their worlds in ways that potentially affect both. We use AR, for example, to visually mix the two worlds.
TechnoSphere 2.0, a collaboration between the author and Mark Hurry, focuses on developing richer, more blended relationships between humans and ALife creatures. Our designs are deliberately situated at the increasingly blurred boundaries, or the continuum, between what has been termed the online/offline binary. AR entangles and blurs the offline and online, or as digital media theorists Jay David Bolter and Maria Engberg have suggested, “the polyaesthetics of AR […] engage multiple senses, and not only the senses of sight, hearing, and touch but proprioception as well. […] AR and MR encourage us to occupy two locations at once–to think of ourselves as both ‘here and there’–situated in our immediate physical and cultural location as well as in the virtual world of the Internet.” To talk of blurred boundaries between online and offline in this paper is to discuss our simultaneous occupation of these two locations.
We do not treat technologies as quiet servants that invisibly extend the boundaries of our unconscious, or our bodies, nor do we suggest that our entanglements with these technologies are calm and seamless. Invisibility and seamlessness, or calmness, were the underlying principles of so-called ubiquitous computing (ubicomp) when Mark Weiser coined the term. In the 1980s Weiser was head of the Xerox PARC computer science laboratory and later became its chief technology officer. His radical vision of ubicomp might be described as a blurring of the online augmented with the nuances of the real world, which Weiser considered wonderful. By contrast, we take a feminist science studies position, drawing on the work of feminist philosopher of physics Karen Barad, to understand blurring through Barad’s idea of diffraction. Diffraction is a pattern of interference where the ripples from two or more forces overlap and mingle. The blur we refer to is a visual indicator of activity: in this case, the ripples from online and offline behaviors and infrastructures meet one another and cause a blurring. This interference pattern is a messy disruption that draws our attention to those new entities that emerge as we humans intra-act with mobile devices, GPS and 3D printing in our particular home and/or urban environments.
The first in our series of mobile Android applications (apps) that entangle virtual creatures with humans is our Creature Create app (see figure 2), based on the online version of TechnoSphere. People use Creature Create to design a creature by building it from a menu of heads, bodies, wheels and eyes. Throughout this process the creature can be rotated in real-time 3D and textured from a series of patterned swatches, then named and saved. Saved creatures can then be loaded into our AR apps or 3D printed. These subsequent apps enable mixed reality environments with the aim of blending the lives, artificial and otherwise, of people and ALife creatures, as well as online and offline spaces. For example, the AR apps use the cameras typically built into mobile devices to show computer-generated graphics of the beastie over the top of a live video image of the human’s environment. In the first AR app, based on sitting at a domestic table, the user prints simple 2D paper markers and places them on the table to locate and attract their beastie. In the second AR app we use GPS to entangle the location of people and nonhuman ALife creatures. People can see their creature running alongside them as they walk around. But where are these people and creatures? How do they move through the city together?
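Returning to Creature Create: a saved creature is, in essence, a small portable record of part choices, a texture swatch and a name, which the AR apps and the 3D-print pipeline can reload. The sketch below illustrates one plausible shape for such a record; the actual TechnoSphere 2.0 file format is not published here, so every field name is an assumption for illustration only.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical sketch of a saved Creature Create design.
# Field names ("head_03" etc.) are illustrative assumptions,
# not the project's documented format.
@dataclass
class Creature:
    name: str
    head: str      # chosen from the parts menu, e.g. "head_03"
    body: str
    wheels: str
    eyes: str
    texture: str   # patterned swatch applied to the 3D model

    def save(self, path: str) -> None:
        # Persist the design so the AR apps (or a 3D-print
        # pipeline) can reload it later.
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path: str) -> "Creature":
        with open(path) as f:
            return cls(**json.load(f))
```

A design serialized this way could travel between apps unchanged, which is one way to keep a single beastie persistent across the tabletop and GPS experiences.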
When thinking about creatures and their tethered humans we resist modernist definitions of media that discuss augmented reality by focusing on so-called medium-specific characteristics, such as GPS. Instead, our approach to using artificial life, AR and 3D printing takes account of ideas from philosophy, science and technology studies (STS), and feminist techno-science studies, where scholars have variously argued that there is no pure science; rather, science is “entangled in societal interests, and can be held as politically and ethically accountable, as the technological practices and interventions to which it may give rise.” We therefore situate our interconnected apps in the specific milieu of Hong Kong, to reflect the culture of that so-called ubicomp environment.
Our design methodology is based on the premise that the context and process of making matter. The approach we take in designing TechnoSphere 2.0 demands that we remain aware of what computer scientist Paul Dourish and cultural anthropologist Genevieve Bell have described as the “central conundrum posed by the fact that Weiser’s vision of the [ubicomp] future is, by this point, not only an old one, but also a very American one.” This prompted us to explore the particular ways we use both mobile devices and urban spaces in Hong Kong, in order to situate our designs locally.
We have found it useful not simply to critique Weiser but to think about his broader vision for ubicomp, in particular his hope that such invisible technology would free users’ attention to be redirected to practices other than interacting with the desktop computer. This vision was underpinned by his belief that “people live through their practices and tacit knowledge so that the most powerful things are those that are effectively invisible in use”. Here, we agree with Weiser that “powerful things” are not necessarily technologies but also forms of knowledge that can be difficult to communicate verbally, that is, knowledge embedded in experience and embodied, like skills and ideas.
The tacit knowledge of Hong Kongers is co-constituted with these particular urban spaces. As Bell and Dourish emphasized, wireless infrastructures are both physical and situated; they are culturally framed by space and they change our spatial understanding and behavior. Therefore, it became essential to engage with the cultural and social spaces of Hong Kong, in order to design apps that took account of how we “encounter the spatiality of the city through the range of services that might be available there, especially when such services are deployed selectively.” Of equal importance is the observation by game designers and academics Katie Salen and Eric Zimmerman that “Games reflect cultural values… the internal structures of game rules – forms of interaction, material forms.” While we do not consider our designs to be games as such, but rather, as new media scholar Olli Tapio Leino differentiates, ‘playful works,’ we draw on Salen and Zimmerman’s ideas to make apps “in dialogue with the larger cultural values of the community for which the game is designed.” Our specific location prompted shifts in our proposals for location-based experiences, informed by close observation of how we use smartphones in Hong Kong. In taking this approach, we were not interested in using the design process to perpetuate a notion of a generalized ‘Chinese user’; instead we aim to understand the specific practices of mobile phone use in Hong Kong, in particular in its urban spaces.
Although forty per cent of Hong Kong is green space, it is densely populated, with 16,469 people per square mile and an average living space of 10 square meters (107 square feet) per Hong Konger. It is not unusual for three generations to live together in a 30 square meter (322 square foot) apartment, resulting in little privacy and less storage space in most domestic environments. Many Hong Kongers would love a pet, but most have no space for one. In many homes a relatively small table is a busy multi-purpose surface, used for homework, eating and playing mah jong. The first TechnoSphere 2.0 AR app allows people to store their virtual beastie (an ALife companion rather than a real life pet) on their phone and then let it out onto the table at home, or in a café or restaurant. 2D paper markers provide points of reference that appear on the mobile device screen and are then overlaid with the AR image of the virtual beastie. These markers can be simple icons or more complex objects assembled from paper patterns.
Our decision to provide patterns so that people can make personalized markers resonated well with Hong Kongers’ love of craft and making. In the first AR app (see figure 4) beasties are seen moving around the tabletop or other flat surface in a shared world that brings together the human’s offline space and the online creature. By adding more paper markers people can provide trees, food and pathways. These more complex worlds are apparently human-defined, as the markers are tied to the ALife engine, with each graphic affording the creature, for example, shelter in the form of trees, caches of food, or pathways to follow. As the creatures interact dynamically with the markers in real-time, the shared world becomes more entangled.
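The tie between paper markers and the ALife engine can be pictured as a simple lookup from recognized marker to affordance, which the engine then applies to a creature's state. The sketch below is a minimal illustration of that idea under assumed names and values; the real engine's vocabulary and rules are not documented here.

```python
# Hypothetical mapping from recognized paper markers to ALife
# affordances. Marker names and affordance values are illustrative
# assumptions, not the project's actual rules.
MARKER_AFFORDANCES = {
    "tree":    {"shelter": True},   # shelter in the form of trees
    "food":    {"energy": 25},      # a cache of food
    "pathway": {"route": True},     # a path for the creature to follow
}

def apply_marker(creature_state: dict, marker: str) -> dict:
    """Update a copy of a creature's state when it reaches a marker."""
    state = dict(creature_state)
    affordance = MARKER_AFFORDANCES.get(marker, {})
    if "energy" in affordance:
        # Food caches top up energy, capped at an assumed maximum of 100.
        state["energy"] = min(100, state.get("energy", 0) + affordance["energy"])
    if affordance.get("shelter"):
        state["sheltered"] = True
    if affordance.get("route"):
        state["following_path"] = True
    return state
```

Because the creature's behavior responds dynamically to whatever markers the person lays out, even this small mapping yields a shared world that neither party fully determines.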
With space at a premium in Hong Kong, very few of us have access to outdoor private space; furthermore, the predominantly hot and humid climate means that we seek the cool of air-conditioned public spaces. Our everyday lived experience and practice-based research reveals the propensity for socializing to take place outside the home, in public spaces of the city like malls and parks, which are often privately owned public spaces (POPS). POPS’ spatiality includes expectations about, and restrictions on, behavior. For example, in most malls there is no eating or drinking allowed outside designated areas, but conversely, in abutting street markets eating out in the open is the norm. POPS enable other behaviors that are constrained in Hong Kong’s compressed domestic spaces, such as friends gathering to stroll and chat, to socialize both online and offline.
Residents of most ages in Hong Kong walk through public space while simultaneously being online. Our slow, deliberate walk, a response to the heat and humidity, differs from the walking practices in cities like New York or London. Whether indoors or out, we interface with our phones, looking at screens as we walk in the narrow and crowded public spaces. These obstructed pathways cause us to meander back and forth to avoid collisions with other pedestrians, often at the last moment. In light of this behavior our second AR app uses GPS to tether human to creature. As humans walk around their domestic space, POPS or a public park, their creature scampers within feet of them.
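The GPS tether can be sketched as a leash constraint: the creature wanders freely until it drifts beyond a set distance from its human, at which point it is pulled back onto the leash circle. The following is a minimal sketch under stated assumptions: the three-metre leash length, the flat-earth distance approximation (fine over a few metres), and the function names are all ours, not the app's documented implementation.

```python
import math

# Sketch of a GPS "tether" between human and beastie. The leash
# length is an assumed parameter for illustration.
LEASH_M = 3.0  # keep the beastie within ~3 metres of its human

def tether(human, creature, leash=LEASH_M):
    """human and creature are (lat, lon) tuples; returns the creature's
    new position, clamped to within `leash` metres of the human."""
    lat_h, lon_h = human
    lat_c, lon_c = creature
    # Metres per degree at this latitude (equirectangular approximation).
    m_lat = 111_320.0
    m_lon = 111_320.0 * math.cos(math.radians(lat_h))
    dx = (lon_c - lon_h) * m_lon
    dy = (lat_c - lat_h) * m_lat
    dist = math.hypot(dx, dy)
    if dist <= leash:
        return creature  # within range: free to scamper
    scale = leash / dist  # clamp back onto the leash circle
    return (lat_h + dy * scale / m_lat, lon_h + dx * scale / m_lon)
```

Re-running this clamp against each GPS fix keeps the beastie scampering alongside its human as they meander through a park or POPS.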
This tethering blends the lives, artificial and otherwise, of people and online creatures. We use AR to blend the offline and online, focusing on the materialization of ubiquitous ALife. This extends the understanding that wireless communication includes physical infrastructure that brings together the online and offline. For example, in TechnoSphere 2.0, responding to the needs of an online creature (its hunger or need for exercise) may change the behavior of the person connected to it. If the online creature is hungry, might the human tethered to it select a particular restaurant to eat in and/or alter their own eating habits to afford the virtual creature better health? Concurrently, the offline activity and location of a human could impact the behavior of the beastie. For example, local air quality or human exercise, transmitted to the ALife engine via third-party apps, has the potential to unpredictably impact the fitness of a creature.
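The two-way coupling described above, in which offline signals feed the ALife engine, can be illustrated with a small update rule. The sketch below is a hypothetical example of how air quality and human exercise might nudge a creature's fitness; the weightings, thresholds and the use of an AQI figure are our assumptions, not the project's actual rules.

```python
# Illustrative sketch: external signals (air quality index, human
# step count) nudging a creature's fitness score on a 0-100 scale.
# All constants here are assumptions for illustration.
def update_fitness(fitness: float, aqi: int, steps: int) -> float:
    # Walking with the creature builds fitness: a full day's
    # walk (~10,000 steps) adds roughly one point.
    fitness += steps / 10_000
    # Poor air quality erodes fitness, but only once the AQI
    # passes an assumed "unhealthy" threshold of 100.
    fitness -= max(0, aqi - 100) / 50
    return max(0.0, min(100.0, fitness))
```

Because the human cannot fully control either input, the creature's health unfolds unpredictably with the shared urban environment, which is precisely the entanglement the project seeks.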
Our idea is that ALife creature and human are co-constituted with their urban environment. Their lives unfold through intra-actions within physical and hidden systems. By using GPS, air quality apps and the behaviors of users, some of these hidden potentials, or affordances, become clearer. The philosopher Gilbert Simondon refers to this unveiling of hidden potentials as moving from the abstract to the concrete. We also refer to the work of philosopher Isabelle Stengers to describe this concretization process as ‘making’. For both Stengers and Simondon, transduction, the process through which entities emerge, is not an emergence into a context, environment or milieu that is then changed by them. Rather, entities emerge with their context, into an environment of individuation, or what Stengers calls the “causality of coupling.” We couple together people and ALife creatures through AR to merge with the context of urban Hong Kong. In conclusion, the outputs of this project, designs and apps, emerge with – and not into – their context.
1. Prophet, Jane. “Sublime Ecologies and Artistic Endeavors: Artificial Life and Interactivity in the Online Project ‘TechnoSphere’.” Leonardo (1996): 339-344.
2. Prophet, Jane. “TechnoSphere: ‘real’ time, ‘artificial’ life.” Leonardo 34, no. 4 (2001): 309-312.
3. Engberg, Maria, and Jay David Bolter. “Cultural expression in augmented and mixed reality.” Convergence: The International Journal of Research into New Media Technologies 20, no. 1 (2014): 3-9.
4. Asberg, Cecilia, and Nina Lykke. “Feminist Technoscience Studies.” European Journal of Women’s Studies 17, no. 4 (2010): 299–305.
5. Bell, Genevieve, and Paul Dourish. “Yesterday’s tomorrows: notes on ubiquitous computing’s dominant vision.” Personal and Ubiquitous Computing 11, no. 2 (2007): 133-143.
6. Weiser, Mark. 1988. http://www.ubiq.com/hypertext/weiser/UbiHome.html. (accessed March 31, 2016).
7. Bell, Genevieve, and Paul Dourish. Divining a Digital Future. Cambridge: MIT Press (2012).
8. Salen, Katie, and Eric Zimmerman. Rules of play: Game design fundamentals. Cambridge: MIT Press (2004).
9. Leino, Olli Tapio. “From interactivity to playability.” Proceedings of the 19th International Symposium of Electronic Art. (2013). http://ses.library.usyd.edu.au/handle/2123/9475 (accessed March 15, 2016).
10. Simondon, Gilbert. “The Genesis of the Individual.” Incorporations. New York: Zone (1992), 296-319.
11. Stengers, Isabelle. Cosmopolitics II. Minneapolis: University of Minnesota Press (2010).
Jane Prophet works across media to make art objects. She recently worked with neuroscientists to explore brain activity and memento mori. Her current works, TechnoSphere 2.0 and Pocket Penjing, made with Mark Hurry, both combine ALife and augmented reality. Pocket Penjing uses weather data to grow ALife bonsai trees. She is Professor in the School of Creative Media, CityU, Hong Kong. The work described in this paper was partially supported by a grant from City University of Hong Kong (Project No. 9380065).