Professor of Digital Creativity
Institute of Creative Technologies
De Montfort University, Leicester, United Kingdom
Professor of Digital Humanities
Institute of Creative Technologies
De Montfort University, Leicester, United Kingdom
This short feature describes one of the world’s first iPad opera/ballets: Secret Garden, an artistic collaboration between Martin Rieser (artist) and Andrew Hugill (composer).
The nature of performance has been profoundly shaped by new technologies over the last few decades, but there now appears to be a fresh surge of energetic exploration around that most fixed and traditional of composite forms: opera. A recently formed Digital Opera forum in the UK created a manifesto that states:
“Digital Opera transforms the way in which operatic works are produced and consumed. Audiences are engaged in active participation, and the roles of the other collaborators in a production are changed. The operatic experience becomes intermedial, and networked, often asynchronous and immersive in a way which does not rely on a proscenium arch – or any traditional theatrical setting – but is closer to the user experience of reading or gaming. The voice is always present, but may be real or simulated and, together with the media themselves, provides new opportunities for the ‘vehicle of narrative’.” 
Digital opera as installation is not an entirely new concept: there are a number of precursors in the works of pioneers such as Pip Greasley, whose Planet of the Seas (1974) formed part of an eco-trilogy predicting an Earth in meltdown, and whose 1992 5K Pursuit Opera for Channel 4 Television used a wired-up velodrome to drive a non-linear drama in which competing track cyclists controlled the plot, music and libretto. There were numerous others, such as Douglas-Scott Goheen and Christopher Dobrian’s Microepiphanies (2000). Since the conception of Secret Garden, however, other digital “operas” have been emerging in online contexts, such as Jaakko Nousiainen’s mobile work Omnivore (2012), where interaction is determined by the time of logon and a single character sings the virtues of food and its associations in a worldwide frame of reference. At the same time, artists such as Craig Vear were producing A Sentimental Journey (2012), based on Sterne’s original novel, combining live streamed music, fictional tweets, and live listener-tweeted responses.
It is in this emerging context that Secret Garden was conceived as an attempt to recreate a contemporary version of the Eden myth in the midst of an urban environment. In its present form, it comprises eleven mounted iPads that are positioned in a circle centrally in gallery space, acting as virtual viewports into the garden. Peering into one of the viewports triggers a view of an idyllic three-dimensional scene in the Secret Garden (notionally Eden) that tells part of the mythical story of The Fall, interpreted through words, music and dance.
The Fall story is common to many of the world’s religions, including Judaism, Christianity, and Islam. The structure of Secret Garden is loosely modeled on the ten paths of the ‘Sephirot’ related to the Jewish telling of the story, in which it is also a symbol of the Tree of Life in the oldest extant version. The original texts comprised eleven poems that tell this classic story in a way at once timeless and contemporary, examining human choices in a ‘fallen’ world. Two animated 3D avatars re-enact the narrative through dance vignettes within 3D-generated environments, each scene distributed to a different one of the eleven iPad viewports.
The viewer’s interactions with the scenes trigger both the music and dance. However, it is not necessary to see the viewports in any particular order, and a partial viewing will also provide a complete experience in itself.
The musical composition is adaptive – featuring vocal settings and digitally treated percussion. The virtual scenography consists of 3D designs based on an idealized garden space, inspired by John Martin’s nineteenth-century mezzotints for Milton’s Paradise Lost.
The Physical Installation
The physical installation is a unique virtual-reality amalgam of poetry, music, and changing real-time 3D panoramic landscapes combined with motion-captured avatars. It plays with narrative and myth, transposed into a modern context, using cutting-edge technologies in both its production and its delivery. Audience movement from viewport to viewport triggers vocal settings of authored verse and perspective changes in the viewed scenes.
An audience engages with the eleven screens one after another, experiencing 3D perspective with parallax vision-shift, made possible through a manipulation of the native face-tracking features of the iPad tablet. The tracking software detects head movements and adjusts the scene’s field of view in real time. Viewers are not required to wear any special headgear or glasses; they simply stand in front of an iPad and trigger its unique scene. The music is tied to each scene and builds in its layering, intensity and spatiality as more audience members interact with the work.
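The head-tracked parallax can be sketched in outline as follows. This is a hypothetical reconstruction, not the installation’s code: the function name, normalisation and scale factor are our own, and a face-detection library is assumed to supply the head position in the camera frame.

```python
# Hypothetical sketch of the parallax effect: the virtual camera is
# offset against the viewer's detected head movement, so the scene
# appears to sit behind the screen like a view through a window.
# A face-detection library is assumed to supply (face_x, face_y).

def camera_offset(face_x, face_y, frame_w, frame_h, depth=1.0):
    """Map a detected face position (pixels) to a virtual-camera offset.

    Returns (dx, dy) in scene units, inverted so that the view shifts
    against the head movement, producing true parallax.
    """
    # Normalise to [-1, 1] with (0, 0) at the centre of the camera frame.
    nx = (face_x / frame_w) * 2.0 - 1.0
    ny = (face_y / frame_h) * 2.0 - 1.0
    # Head moves right -> camera slides left, and vice versa.
    return (-nx * depth, -ny * depth)
```

In a running scene this offset would be applied to the 3D camera each frame, so the field of view follows the viewer’s head continuously.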
Motion capture was carried out on a ‘Vicon’ system using professional dancers and improvised choreography for each scene. The motion mapping was completed in ‘Maya’ using 3D skeletal models, skinned with a map of the visual universe, and transferred into a Unity landscape modeled by Rieser.
Music composition and sound design
The music consists of eleven songs, scored for one or two voices (soprano/tenor) and accompanied by tuned gongs and crotales. The songs, composed by Andrew Hugill, are settings of the original poems by Martin Rieser that provide the basic scenario of Secret Garden. In addition, there is an extended ambient soundscape that comprises spectrally processed gong and crotale sounds. This is played at low level in the room to provide atmosphere. Whereas the songs last several minutes each, the soundscape evolves steadily over several hours.
The composition of the music presented particular and unusual challenges. The writing of “adaptive” music is familiar to composers of music for computer games, but less common in more formal situations. In this instance, the nature of the constraints demanded a highly disciplined approach, one that goes well beyond the normal ‘if…then…’ formulae of adaptive music and must take into account the potential of any given moment to combine with any other given moment. Since each of the eleven songs could be triggered at any moment and in any order, there are 22 potential layers of randomly timed polyphony. It was important that any resulting effect should be pleasing, while at the same time containing enough contrast to retain interest.
A hexatonic gamut provided the basic scale, stretched over several octaves and with some inflections to provide moments of variation. The texts were new poems that presented challenges of their own in terms of word-setting, since the language is simultaneously prosaic and mysterious in nature. An additional self-imposed constraint was the number symbology associated with the ‘Sephirot’. Figure 1 shows a typical composition from the setting of “Glory”. Notice the consistent modality, in this case built on E (the gong is also tuned to E for this song), with an intervallic sequence of 1, 2, 3, 5 semitones extending above that (e.g. F, G, Bb, D, G). This moment fits with any of the moments from the ten other songs, which are all, in their turn, built upon different constructions of the basic mode, making eleven in total. The disposition of the viewing windows and the relationship between sound and image, combined with the enforced limitations of binaural headphones, meant that the music had to work within a fairly narrow band of possibilities, constrained still further by the public setting.
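The combinability constraint can be modelled in outline. The gamut and song subsets below are illustrative placeholders, not the actual score data: the point is only that, if every song draws its pitches from one shared mode, songs triggered at random times in any combination can never clash outside it.

```python
import itertools

# A hypothetical hexatonic gamut built on E (pitch classes, C = 0).
# The real gamut and its octave inflections are not reproduced here.
MODE = frozenset({4, 5, 7, 10, 2, 0})  # E, F, G, Bb, D, C (illustrative)

# Eleven stand-in "songs", each a three-note subset of the shared mode.
SONGS = [frozenset(c) for c in itertools.combinations(sorted(MODE), 3)][:11]

def layers_in_mode(active_songs):
    """True if the superposition of the active songs stays in the mode."""
    sounding = frozenset().union(*active_songs) if active_songs else frozenset()
    return sounding <= MODE

# Every possible combination of triggered songs remains inside the mode.
assert all(layers_in_mode(list(combo))
           for r in range(1, len(SONGS) + 1)
           for combo in itertools.combinations(SONGS, r))
```

The exhaustive check mirrors the compositional discipline described above: any given moment is guaranteed, by construction, to fit with any other.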
As an audience member engages with a scene, they hear through headphones, in the foreground, the song that is native to their current position. Also present within their personal sound environment is the faint, often peripheral indication of other audience members viewing scenes at other viewport locations. This level of the sound design, which acts as an acknowledgment of the ebbing and flowing of user engagement, attempts to locate each binaural sound source spatially according to its actual location in the virtual scenography. Achieving this in practice required the construction of a convincing three-dimensional sound environment and a robust communication framework between the iPad devices. Initially, songs were recorded in an acoustically neutral space in the MTI Research Centre at De Montfort University. They were then binaurally reproduced in the very resonant ‘Gallery 6’ space at the Leicester New Walk Museum and Art Gallery, using a loudspeaker and dummy-head setup. The resultant spatialized songs were then distributed through headphones in the gallery using a combination of Max/MSP and Ableton Live. To implement networking between the iPad locations and the audio engine, a communication routine was established that encoded touch events on the devices as Open Sound Control (OSC) messages over UDP and delivered them to a central sound management system. In practice, the technological setup allows the music to adapt intuitively to the interaction of the users. The sound design aspects are a product of a collaboration between Andrew Hugill and PhD student Lee Scott.
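The touch-event networking can be illustrated with a short sketch. Only the general mechanism, an OSC message carried over UDP to the central sound engine, comes from the text; the address pattern, argument layout, host address and port below are hypothetical.

```python
import socket
import struct

def osc_string(s: str) -> bytes:
    """Encode a string per the OSC spec: ASCII, NUL-terminated,
    padded with NULs to a 4-byte boundary."""
    data = s.encode("ascii")
    pad = 4 - (len(data) % 4)  # always at least one NUL terminator
    return data + b"\x00" * pad

def touch_message(viewport: int, x: float, y: float) -> bytes:
    """Encode a touch on one iPad viewport as an OSC message:
    /garden/touch <viewport:int32> <x:float32> <y:float32>
    (address pattern and arguments are illustrative, not documented)."""
    return (osc_string("/garden/touch")
            + osc_string(",iff")               # type tag string
            + struct.pack(">iff", viewport, x, y))  # big-endian payload

def send_touch(viewport, x, y, host="192.168.0.10", port=9000):
    """Fire the event at a central sound-management system over UDP.
    Host and port are placeholders for the gallery's audio engine."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.sendto(touch_message(viewport, x, y), (host, port))
    finally:
        sock.close()
```

A Max/MSP patch listening with `udpreceive` on the matching port could then route each viewport’s events into the adaptive mix.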
The effect on people who experienced this work was one of increasing fascination, as they were drawn into the story by both the animated world and the sounds. Although the piece deployed leading-edge technologies in a novel non-linear narrative, it is in the end the qualities of the artistic experience that affected people and lived on in their memory. Pushing the technology to its limits also exposed weaknesses: glitches could sometimes develop in real-time display interaction, particularly in the head-tracking. Valuable lessons were learned about the importance of maintaining flow for true immersion. The transfer of such technology into future projects will provide a model by offering a set of techniques and standards for the integration of 3D interactive technologies, music, poetry and drama in a public setting. As far as we know, this is in effect the world’s first virtual opera in the round using iPads and, as such, is groundbreaking in its demands on the audience and in opening out the genre to a new audience through delivery in a public art format.
Secret Garden will be further developed by creating an online mobile phone version for iPhone and Android smartphone platforms, with a mixture of Augmented Reality and GPS location technology. We are also currently conducting research into key questions, such as whether a hybrid form can deliver the same narrative impact as traditional opera or ballet. To this end, we are carrying out careful audience evaluations in a controlled environment. The results will be used to inform a new study in this developing area of virtual performance and hybrid digital narratives.
The mobile version will build on the Empedia Platform developed with Cuttlefish Multimedia as a KTP project with the IOCT. We will port the Layar AR API into the Empedia software, allowing the user to experience a locative trail in which the eleven scenes are linked to eleven key locations in any area. The polyphonic aspects of the original design will be preserved, and it will embody the first Augmented Reality opera designed for mobile technologies. The “Eden” scenes will grow around a user in 360-degree 3D panoramas tied to the selected GPS nodes and will be triggered automatically by location. By aligning locative technologies with the oldest of mythologies, we attempt to build on the notion of technology offering not simply a portal into the past, but one into the deeper layers of collective cultural myths, revivified for a contemporary audience.
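The proposed location triggering can be sketched as follows. The coordinates, trigger radius and function names are placeholders rather than real trail data; the mechanism is simply proximity to a GPS node.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def triggered_scene(user_lat, user_lon, nodes, radius_m=25.0):
    """Return the index of the first scene node within radius_m of the
    user's position, or None. Radius and node list are illustrative."""
    for i, (lat, lon) in enumerate(nodes):
        if haversine_m(user_lat, user_lon, lat, lon) <= radius_m:
            return i
    return None
```

In the envisaged trail, each returned index would start the corresponding 360-degree panorama and its song, so the scenes “grow around” the walker as they reach each node.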
1. Digital Opera: New Means and New Meanings Conference, York University Music Department, Toronto, Canada, May 9, 2011, http://www.york.ac.uk/music/conferences/digital-opera/; International Journal of Performance Arts and Digital Media, Vol. 8, issue 1, April 2012 (accessed November 14, 2013).
2. Pip Greasley, “Subtle Futures,” International Journal of Performance Arts and Digital Media, Volume 8, Issue 1 (April 2012), pp. 125-134.
3. “Microepiphanies: A Digital Opera,” on the University of California Irvine Department of Music website, http://music.arts.uci.edu/dobrian/microepiphanies.htm (accessed August 14, 2013).
4. Jaakko Nousiainen, “Reframing Opera in Mobile Media,” International Journal of Performance Arts and Digital Media, Volume 8, Issue 1 (April 2012), pp. 93-108.
5. “A Sentimental Journey,” on ev2 official website, http://www.ev2.co.uk/vear/asj.html (accessed August 14, 2013).
6. David M. Carr, “The Garden of Eden Story,” in An Introduction to the Old Testament (New York: John Wiley & Sons, 2011).
7. Dan Cohn-Sherbok, The Blackwell Dictionary of Judaica (New York: Wiley Blackwell, 1992).
8. John Milton, Paradise Lost, illus. by John Martin (London: Septimus Prowett, 1827).
9. Empedia official website, www.empedia.info (accessed November 14, 2013).
Martin Rieser is joint Research Professor between the Institute of Creative Technologies and the Faculty of Art and Design at De Montfort University. His track record as a researcher and practitioner in Digital Arts stretches back to the early 1980s. Originally a graduate in English Literature and Philosophy from Bristol University, he subsequently studied printmaking at Atelier 17 in Paris with Stanley William Hayter and then at Goldsmiths, where he also developed an abiding interest in photography. From this hybrid background he moved into computer arts, establishing the first postgraduate course in the discipline in London in 1982. His art practice in internet art and interactive narrative installations has been seen around the world, including at Milia in Cannes, in Paris, at the ICA London, and in Germany, Montreal, Nagoya (Japan) and Melbourne (Australia).
Andrew Hugill studied composition with Roger Marsh at the University of Keele between 1976 and 1980, and in 1983 he founded the ensemble “George W. Welch”. He began lecturing at Leicester Polytechnic in 1986, working alongside Gavin Bryars and Dave Smith, eventually becoming subject leader for the BA Performing Arts: Music. He founded the Music, Technology and Innovation programme at De Montfort University in 1997 and taught Creative and Negotiated Projects, Musicianship and Internet Music. In 2006 he founded the Institute of Creative Technologies and was its Director until 2012. He is also an Associate Researcher at the Université de la Sorbonne, Paris, and a National Teaching Fellow of the Higher Education Academy. In 2006 he was Highly Commended for the Most Imaginative Use of Distance Learning at the Times Higher Education Awards.