Steve Jones: The Carry Principle: Strategies for Mobile Music Practice


Abstract

In design practice, what is referred to as the “Carry Principle” defines the core elements of the mobile experience: small, personal, communicative, multifunctional, battery operated and always connected. These qualities have ensured that for many of us some form of mobile device has become an ever-present part of our lives, a ubiquitous presence in our quotidian tasks of communicating and accessing information, for work or for entertainment. A digital tablet such as an iPad affords an ideal interactive performance system: self-contained and autonomous, a sound-producing device that enables anyone to make music in a wide range of situations. But do mobiles have the potential to liberate us from old working practices, or to re-entrench them? Can mobility transform how we see, hear and experience our surrounding environment?

 

Introduction

Barbara Ballard’s “Carry Principle” defines the core elements of the mobile experience: small, personal, communicative, multifunctional, battery operated and always on (even in a standby state) (Ballard 2007, 231). These qualities have ensured that for many of us some form of mobile device has become indispensable, a ubiquitous presence when it comes to working, playing, communicating or accessing information.

With the shift from static to mobile technologies, the device moves further away from its original function as a communication tool towards that of a relational device: it becomes an enabler of creative production. Mobiles are highly personalised; they connect, receive and transmit our movements, experiences and opinions, miniaturising the “stuff” of our lives, storing our memories, behaviours and habits. All the media objects held on our devices are material composed of digital code – mathematical representations that can be subject to all manner of algorithmic manipulation, whether that be “warped, streamed, accelerated, slowed, supersaturated or attenuated” (Morris 2006, 16).

Lev Manovich’s observations on the convergence of computing and media technologies note that when data can no longer be viewed from a single, fixed viewpoint, we are transformed from passive viewers into active users (Manovich 2001, 27). With the transition from static to mobile technologies, Smartphones, digital tablets and cameras are extending the limits of what the Internet can offer to include the physical spaces we inhabit and the ways we perceive proximity. They are our agents for navigating these hybrid spaces.

Yet this mode of thinking also demands new perspectives on how we think of a musical instrument, defying established cultural practices and social forms. With mobiles acting as shared agencies, is it possible to disseminate ideas that challenge our assumptions about a musical instrument, our use of the word “instrument”, or even our expectations of what music practice is?

Defining a new musical instrument

Mobile devices differ from a computer in that they are designed to be hand-held and to remain active even in a standby state. Thus we usually carry them around with us, and consequently we have rapidly adapted to a screen-based gestural language of tapping, scrolling, pinching and swiping. The proliferation of sound and music applications, particularly for the iOS platform, is beginning to explore the potential for sonic interactions using these multi-touch gestures. A typical Smartphone has a wide range of onboard sensors – camera, microphone, accelerometer, gyroscope, compass and GPS – that suggest new possibilities and techniques for making music not possible with desktop computing. Although they may not yet have the processing power of a laptop, a hand-held device’s embodiment affords an interaction closer to the immersive sense of what Robert Rowe calls the “player paradigm.” For Rowe, this paradigm is observed when playing a musical instrument, which should encourage people “to engage in active music making rather than to simply absorb the music coming at them from… every side” (Rowe 1999, 84).

In their research into musical interactions beyond the traditional piano keyboard, Professors Eduardo Miranda and Marcelo Wanderley assert that an interactive performance system can be considered an instrument by virtue of possessing sensor inputs, signal-processing capabilities and a sound output (Miranda and Wanderley 2006, 26). Atau Tanaka’s collaboration with Adam Parkinson, the pioneering work 4 hands 2 iPhones (2009), was an early exploration of a mobile’s on-board sensors in a musical context. Tanaka described the iPhone as a “self-contained and autonomous sound-producing object that enables musicians to perform in a live situation” (Tanaka 2010, 5), and it is this autonomous nature that allows a mobile to be thought of as a musical instrument. However, the notion is metaphorical rather than a strict definition: the word “instrument” helps us to conceive of its links to an artistic tradition of musical technique, virtuosity and creative practice.
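Miranda and Wanderley’s three-part definition lends itself to a code sketch. The following Python is purely illustrative – the sensor reading, mapping and synthesis functions are invented stand-ins, not any real device’s API – but it shows the chain of sensor input, signal processing and sound output that qualifies a system as an instrument:

```python
# Illustrative sketch of an interactive performance system as defined by
# Miranda and Wanderley: sensor input -> signal processing -> sound output.
# All function names and values are hypothetical stand-ins.

import math

def sensor_input(t):
    """Stand-in for an accelerometer reading: device tilt in the range -1..1."""
    return math.sin(t * 0.5)

def signal_processing(tilt):
    """Map tilt to pitch: centred on 440 Hz, spanning one octave either way."""
    return 440.0 * 2 ** tilt

def sound_output(freq, duration=0.01, sr=44100):
    """Render a short sine burst at the mapped frequency."""
    return [math.sin(2 * math.pi * freq * n / sr)
            for n in range(int(sr * duration))]

# One pass through the chain: read a sensor, map it, produce sound.
samples = sound_output(signal_processing(sensor_input(t=1.0)))
```

In a real app the loop would run continuously, but even this skeleton makes the point: once the three stages are present, the device behaves as an instrument regardless of its outward form.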

Nonetheless, there are still anxieties regarding the aesthetic authenticity and cultural legitimacy of mobile technologies. One of the appeals of Smartphone apps is the capacity they give non-musicians to make musical sounds. Interactions are often designed to be heuristic, enabling users to learn an app’s functions very quickly – an approach that means apps are often dismissed as toys rather than serious music performance systems. Anna Dezeuze notes the same criticism levelled at new media, disparaged as “Nokia art” or “shallow” technology: “Like most art, it is the content and intent of the work that is important, rather than the media used” (Dezeuze 2010, 299). If critics are not familiar with the media themselves, they are unlikely to be aware of their significance.

In the context of live electronic music, could we consider a mobile device to be a new folk instrument? It allows music to be made in any social setting and accommodates all levels of ability and technical knowledge, while allowing networked communities to collaborate and exchange musical ideas (Jones 2013). Obviously, there are problems in linking commercially available electronic devices with the cultural connotations of folk music, yet historically, folk transmission has been the exchange of data between different communities via oral traditions. The troubadour and the medieval jongleur exchanged data in the form of stories and songs, and the technology they used to augment their spoken or singing voices had to be light and portable too – stringed, wind and percussive instruments (Rojek 2011, 177). A guitar allows for a wide range of musical techniques, from basic strumming to highly complex playing styles. Although a fiddle and a classical violin are physically identical, they cannot be considered the same instrument. Same object, different uses. Musical exchange continues via oral traditions, but has become super-extended with broadcast media and the Internet: now encrypted data is the primary medium of exchange. And what do we use to access it?

Nathan Bowen’s PhD dissertation provides a detailed timeline of the use of mobiles for music, tracing how artists are starting to situate performances and sound art outside the concert hall or gallery. Even so, he admits there has been little uptake in defining a mobile as an instrument in a cultural sense: the public has difficulty correlating metaphors and gestures with music made on mobile devices (Bowen 2013, 108). The idea of a mobile instrument is likely to remain defined by its indeterminate quality. What does it look like to play a mobile? How should it be held? How does one become proficient at playing music on a Smartphone?

The extent of this lack of cultural reference was evident during a public performance with Dr Sally Rodgers at the Bradford Street Festival in 2014, under our current working title, Discrete Machines. With two iPads running the Samplr app, two Dirty Electronics cardboard-box loudspeakers, and generic workmen’s helmets fitted with binaural microphones and a GoPro camera for documentation, we moved up and down the street, sampling, manipulating and amplifying the noises of the busy market stalls, funfair, and a music and dance festival.

Fig. 1 Discrete Machines, Bradford Street Festival, June 2014


Fig. 2 Discrete Machines, Bradford Street Festival, June 2014


Although billed by the festival organisers as “Street Theatre”, it felt more like a series of guerrilla sonic interventions, confounding the usual expectations associated with electronic music production. Immersed in a crowd, making strange abstract sound collages provoked looks of surprise, bemusement and even annoyance from the families and afternoon shoppers. The close proximity between the “audience” and us also went against conventional music-listening practice as we moved and projected the sound amongst people with portable loudspeakers. Despite our combined history of working and performing together, the Bradford Festival proved one of the most challenging undertakings we had experienced. Harder still was attempting to define the performance: were we an intervention, a marching band, buskers or a public nuisance? In later correspondence, Rodgers observed that “Mobility [was] the most significant digression from ‘conventional’ electronic music-making practice here.”

What We See and Do/What We Know

Adalaide Morris observed that new technologies are never readily assimilable in terms of current cultural codes. Indeed, it seems we do not have sufficient language to describe our technologically mediated world (Morris 2006, 3). Unsophisticated hybrid terms such as “moving pictures” were used to describe early cinema; the first automobile was a “horseless carriage” – a phrase completely unable to prepare us either for the potential of the car or for the changes it would bring to transport and society in general.

Theories of embodied knowledge, neuroscience and cognition explain that we understand the world around us in two different ways: through sensations located in the lower brain and central nervous system – instinct, emotion and touch – and through “learned knowledge, book knowledge and canonical convictions” (Varela, Thompson and Rosch 1991). There is a gap between these two types of knowledge, between what we know because of what we see and do, and what we know because of what we think (Morris 2006, 1). We are continually catching up as we negotiate the transitions brought about by technology.

Technological change presents fundamental challenges to our understanding of what constitutes a musical instrument; furthermore, changes in musical styles continue to extend the idea of what makes an instrument. Ballard’s Carry Principle dictates that mobile design is unstable and that the constant introduction of additional features, or “feature creep”, is inherently part of the development process. Paul Théberge identified this as the intersection between the consumption and use of technology. Instruments are defined through their use and not by their form: “musical instruments are not ‘completed’ at the stage of design and manufacture, but rather they are made-over by musicians in the process of making music” (Théberge 1997, 159-160). As a musician’s practice becomes more aligned with being a technological consumer, the old divisions between producers and consumers become weaker. Similarly, the Internet is breaking down the distinctions between composers and performers as active participants, and audiences and users as passive ones.

Gertrude Stein commented that we remain “at least several generations behind [our]selves” (Stein 1998, 521), and the same could be said today: what we see and do is conditioned by our techno-environment of networked computers, telecommunications and online commerce, yet how we think is conditioned by yesterday’s world of print media (Morris 2006, 2). Perhaps, in the near future, the debate on whether a mobile can be considered an instrument will seem irrelevant.

Space, Place and Home

What relevance does space or place have in the context of audio mobility? Can mobile practice renegotiate the phenomenon of space into something that feels like a place? As an electronic musician constantly in transit across two similar yet very different geographies and socio-economic territories – France and the UK – I have become dependent on mobile technology to stay connected with my private and working life, to remain socially co-present, even if that presence is virtual. As my research is conducted outside the confines of a music studio, a place designed for “recognisable, manageable, understandable and unproblematic scenario[s]” (De Paula 2013, 12), it is of vital importance to have light, portable and resilient equipment to document and record the shifting, fleeting ambiences of places and experiences.

Travel brings me into contact with what Marc Augé refers to as non-places: “[…] the air, rail and motorway routes, the mobile cabins called ‘means of transport’ (aircraft, trains and road vehicles), the airports and railway stations, hotel chains, leisure parks, large retail outlets” (Augé 1995, 79). For Augé, these ever-expanding non-places are the real spaces of our time. If we accept that mobiles are only “imperfect instruments”, which maintain some sense of security and location amidst a “culture of flow and deterritorialization” (Morley 2003, 453), can they help reclaim impersonal transit space?

As systems and tools are increasingly networked, as we become increasingly connected with media and information technology through our screens – moving us further away from the idea of being individuals – does mobility render notions of place obsolete?

Firstly, it can be difficult to clarify the difference between place and space, as they are often set in opposition to each other. A general perception of place is of a distinct location, somewhere “proper, stable” (Morse 1999, 195), whereas the idea of space is more abstract and undifferentiated; a space can shift to a place when we get to know it better, when we “endow it with value” (Tuan 1977, 6).

Much critical theory regarding mobility was written before the existence of the Smartphone, as theorists began to analyse how mobile phone use was changing our understanding of distance. Do they connect us by overcoming distance, or do they insulate us from the outside world?

Most people will have some experience of overhearing phone conversations in public spaces; we might think of this as the domestication of intimate conversation for a public arena. In From Stabilitas Loci to Mobilitas Loci, Rowan Wilken argued that networked mobility forces a renegotiation that significantly alters our understanding of place (Wilken 2005). David Morley criticises this notion of mobiles transforming distance, claiming they are unable to transcend space but instead establish co-existing parallel networks. Despite all the talk of post-modern nomadology, the reality is often far more limited, banal and localised: for instance, the typical opening of a conversation with the words “Where are you?” or “I’m on the train” (Morley 2003, 440). Wilken points to Derrida’s observations regarding our attempts to overcome a sense of dislocation – brought about by a fear of globalisation and mass communication – by focussing on the local and the home: “The more powerful and violent the technological expropriation, the delocalization, the more powerful […] the return toward home” (Derrida 2002, 79-80). As a result, the domestication and integration of technology through ownership and appropriation is our coping mechanism as we negotiate our way into a technologically mediated future.

The act of close listening is one method of re-hearing what might be thought of as unwanted noise. In his book Electric Sound, Joel Chadabe discusses with the composer Paul Lansky the process of making his piece Night Traffic (1990): “I sometimes use the computer as a camera on the sounds of the world and the sounds of the world then colour the music.” Lansky would make field recordings of passing cars and process them with a computer in his studio, using the chaos of the sounds – noise and Doppler-shift elements – as a filter through which to hear melodies and harmonies. With the sound of the cars acting as an excitation source, “the cars become the music” (Chadabe 1997, 134).
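The underlying idea can be sketched in a few lines of DSP. The following Python is not Lansky’s actual process, only an illustration of noise as an excitation source: feedback comb filters tuned to chord pitches resonate when fed broadband noise, so the “traffic” begins to sound like harmony. The chord and filter parameters are arbitrary choices for the example.

```python
# An illustration (not Lansky's actual process) of noise as an excitation
# source: feedback comb filters tuned to chord pitches resonate when fed
# broadband noise, so the "traffic" starts to sound like harmony.

import numpy as np

def comb_filter(x, freq, sr=44100, feedback=0.98):
    """Feedback comb tuned to freq: y[n] = x[n] + feedback * y[n - sr/freq]."""
    delay = max(1, int(round(sr / freq)))
    y = np.copy(x)
    for n in range(delay, len(y)):
        y[n] += feedback * y[n - delay]
    return y

def tune_noise(noise, chord_hz, sr=44100):
    """Sum combs tuned to each chord pitch, then normalise to -1..1."""
    mix = sum(comb_filter(noise, f, sr) for f in chord_hz)
    return mix / np.max(np.abs(mix))

rng = np.random.default_rng(0)
traffic = rng.standard_normal(22050)          # stand-in for a field recording
chord = [220.0, 277.18, 329.63]               # A major triad: A3, C#4, E4
tuned = tune_noise(traffic, chord)
```

Replacing the synthetic noise with an actual recording of passing cars would give something closer in spirit to Night Traffic: the broadband, Doppler-shifted material excites the tuned resonances and is heard through them.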

To borrow from Lansky, my own research has involved making a series of sound studies using the iPad as a camera to capture the sounds of a space, which then colour and form the music. The difference is that the music is made on a single, hand-held device: the chaos of the sounds is manipulated in situ, in real time, and as a result the melodies and harmonies augment and overlap with the real world.

Sound Studies

Fig. 3: Steve Jones, the colour of sound: ligne 1. Paris, 2013. iPad, Samvada, Zoom H4 recorder


This is an early study, experimenting with processing the sound of a Metro carriage while moving. Although the iOS music app Samvada is intended as a simulation of a sitar, this piece used only its accompanying drone function as an audio comb-filter system. Operating a simple set of slider GUI objects allowed me to shift between the real-world and manipulated signals while modulating across harmonisations based on classical Indian raga tuning.
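Samvada’s internals are not public, so the sketch below models only the interaction just described: one slider stepping the drone through raga-derived tunings, another crossfading between the real-world (dry) and processed (wet) signals. The ratios and function names are illustrative assumptions, not the app’s actual values.

```python
# Models the slider interaction described above; Samvada's internals are
# not public, so the tuning ratios and function names are assumptions.

import math

# Just-intonation ratios loosely based on raga degrees (illustrative only).
RAGA_RATIOS = [1.0, 16/15, 6/5, 4/3, 3/2, 8/5, 9/5]

def drone_frequency(tonic_hz, slider):
    """Map a 0..1 harmonisation slider onto discrete raga degrees."""
    step = min(int(slider * len(RAGA_RATIOS)), len(RAGA_RATIOS) - 1)
    return tonic_hz * RAGA_RATIOS[step]

def crossfade(dry, wet, slider):
    """Equal-power crossfade between real-world and processed signals."""
    g_dry = math.cos(slider * math.pi / 2)
    g_wet = math.sin(slider * math.pi / 2)
    return [g_dry * d + g_wet * w for d, w in zip(dry, wet)]

freq = drone_frequency(220.0, slider=0.5)   # mid-slider picks a middle degree
```

The point of the sketch is the performance mapping: a continuous finger gesture on glass is quantised into discrete harmonic steps, while the crossfade slider lets the player sit anywhere between the carriage’s raw sound and its tuned reflection.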

Journeys on public transport proved to be ideal environments for this kind of compositional work: the frequent stops lend a sense of musical development and recapitulation, and the continually changing ensemble of unknowing performers/contributors added to the sonic texture every few minutes. Moreover, because this was a shared public space, no one associated my gestures with music making, and as a result no one took the slightest notice while I stood or sat holding an iPad and wearing headphones.

Fig. 4: Steve Jones, sound study: non-space, Leicester, 2014. iPad, Turnado, Audiobus, TWRecorder.


 

SoundStudy_nonSpace is another transport piece, one that demonstrates how far mobile sound processing evolved in less than a year. Turnado is one of a new generation of apps that allow up to eight separate processing arrangements to be manipulated simultaneously using sliders, dials and multi-touch gestures. This afforded a more rhythmic and dynamically changing sound world, yet this “feature creep” of increasing complexity requires new playing methods and consequently creates new challenges for the mobile practitioner. Audiobus allows groups of apps to be linked together using input, effect and output slots, enabling the piece to be recorded internally – eliminating the need for a separate audio recorder and giving greater autonomy and physical freedom.
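Audiobus’s slot routing can be pictured as a simple function pipeline. The sketch below is a conceptual model rather than the Audiobus SDK – the slot names and buffers are invented – but it shows why placing a recorder app in the output slot removes the need for an external recorder:

```python
# A conceptual model (not the Audiobus SDK) of input -> effect -> output
# slot routing: each slot is a function on an audio buffer, and the bus
# simply chains them. A recorder app sitting in the output slot captures
# the piece internally.

def input_slot():
    """Stand-in for a source app, e.g. a microphone buffer."""
    return [0.5, -0.25, 0.125, -0.0625]

def gain_effect(buffer, gain=2.0):
    """Stand-in for an effect-slot app such as a processor."""
    return [gain * s for s in buffer]

def recorder_output(buffer, tape):
    """Stand-in for a recorder app in the output slot."""
    tape.extend(buffer)
    return buffer

def run_bus(source, effects, sink):
    """Pass one buffer through the chain of slots."""
    buffer = source()
    for effect in effects:
        buffer = effect(buffer)
    return sink(buffer)

tape = []
out = run_bus(input_slot, [gain_effect], lambda b: recorder_output(b, tape))
```

Because every slot shares the same buffer interface, apps can be swapped or stacked freely, which is what makes the self-contained, hand-held recording setup possible.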

These studies were intended to examine the limitations and practicalities of manipulating sound on a device while travelling, often in uncomfortable situations. Each site introduced its own unpredictable and fast-changing elements, very similar to working with free-improvisation musicians. Re-interpreting the sound of a place is an exercise that draws on past practices of close listening, free improvisation and music composition. Travel becomes an active rather than a passive activity. Instead of being isolated by headphones, the process of listening to both augmented and real-world sounds brings a heightened awareness of the sonic environment. The space informs the sound; it informs the play.

Fig. 5: Steve Jones, sound study: mobile instrument, Île de Ré, 2014. iPad, Galaxy S III


As a coda, this piece hints at directions the research is exploring using video documentation. Building on the previous sound studies, it shows a bicycle ride across the marais of the Île de Ré while playing the sound apps I have described. Although technically unsophisticated – the iPad was placed in a basket and the journey was shot with a Smartphone strapped in front – it was an immersive experience. There was a sense of playing a “mobile” instrument: the passing environment, the sound of wind, wildlife, the road surface and the mechanics of the bike were integral to the overall sound. The piece acts as a link, bridging earlier soundworks with my current studies, which aim to integrate sound, image and text.

Conclusion

This article describes one aspect of my research, examining strategies for portable technologies in site-specific interventions and the development of a nascent practice. My personal interest is not so much in the technology as in the relationships that take shape between a user and their device, between a user and other practitioners, and in the intersection between consuming and creating. This research is practice-based with an autoethnographical aspect – something of a contentious approach with regard to academic qualitative inquiry – but I would argue that critical and reflexive narrative is necessary to furnish what Douglas and Carless describe as “past, present and future history […]. Happening simultaneously and repeatedly (in different contexts for different people)” (Douglas and Carless 2013, 103).

Building on the tradition of the nomadic troubadour, the continuation of electronic music practices such as STEIM’s research into self-supporting instruments, and the importance of what music theorist Kevin Dawe refers to as “the sonic and design result of travel” (Dawe 2010, 189), this article considers the richness and intimacy in the relationship between play and sound, and how that might encourage everyone to engage in active music making rather than simply absorbing it. If hand-held devices afford new ways of seeing, hearing and experiencing our everyday surroundings, can mobile practice transform a space into a place, into somewhere that feels like home?

References

Augé, Marc. 1995. Non-Places: Introduction to an Anthropology of Supermodernity. trans. John Howe. London: Verso.
Ballard, Barbara. 2007. Designing the Mobile User Experience. Chichester, UK: John Wiley & Sons.
Bowen, Nathan. 2013. Mobile Phones, Group Improvisation and Music: Trends in Socialized Music-making. PhD diss., City University of New York.
Chadabe, Joel. 1997. Electric Sound: The Past and Promise of Electronic Music. Upper Saddle River: Prentice-Hall.
Dawe, Kevin. 2010. The New Guitarscape in Critical Theory, Cultural Practice and Musical Performance. Farnham: Ashgate.
Derrida, Jacques and Bernard Stiegler. 2002. Echographies of Television: Filmed Interviews. trans. Jennifer Bajorek. Cambridge, UK: Polity Press.
Dezeuze, Anna. 2010. The ‘Do-It-Yourself’ Artwork: Participation from Fluxus to New Media. Manchester and New York: Manchester University Press.
Douglas, Kitrina and David Carless. 2013. “A History of Autoethnographic Inquiry.” In Handbook of Autoethnography, edited by Stacy Holman Jones, Tony E. Adams and Carolyn Ellis. Walnut Creek, California: Left Coast Press.
Jones, Steve. 2013. “The Mobile Device: A new folk instrument?” Organised Sound 18, 299-305.
Miranda, Eduardo and Marcelo M. Wanderley. 2006. New Digital Musical Instruments: Control and Interaction Beyond the Keyboard. Computer Music and Digital Audio Series 21. Middleton, Wisconsin: A-R Editions.
Manovich, Lev. 2001. The Language of New Media. Cambridge, MA: MIT Press.
Morris, Adalaide. 2006. “New Media Poetics: As We May Think/How to Write.” In New Media Poetics: Contexts, Technotexts, and Theories, edited by Adalaide Morris and Thomas Swiss. Cambridge, MA – London: MIT Press.
Morley, David. 2003. “What’s ‘Home’ Got to Do with It?: Contradictory Dynamics in the Domestication of Technology and the Dislocation of Domesticity.” European Journal of Cultural Studies 6.4, 435-458.
de Paula, Rogério. 2013. “City Spaces and Spaces for Design.” Interactions 20 (4): 12-15.
Rojek, Chris. 2011. Pop Music, Pop Culture. Cambridge and Malden: Polity Press.
Rowe, Robert. 1999. “The Aesthetics of Interactive Music Systems.” Contemporary Music Review 18, Part 3: 83-87.
Stein, Gertrude. 1998. “Composition as Explanation.” In Gertrude Stein: Writings 1903-1932, edited by Catharine R. Stimpson and Harriet Chessman. New York: Library of America.
Tanaka, Atau. 2010. “Mapping Out Instruments, Affordances, and Mobiles.” Proc. New Interfaces for Musical Expression (NIME), 88-93.
Théberge, Paul. 1997. Any Sound You Can Imagine: Making Music/Consuming Technology. Hanover: Wesleyan University Press, University Press of New England.
Tuan, Yi-Fu. 1977. Space and Place: The Perspective of Experience. Minneapolis: University of Minnesota Press.
Varela, Francisco, Evan Thompson, and Eleanor Rosch. 1991. The Embodied Mind: Cognitive Science and Human Experience. Cambridge, MA: MIT Press.

 

Online Resources

Wilken, Rowan. 2005. “From Stabilitas Loci to Mobilitas Loci: Networked Mobility and the Transformation of Place.” The Fibreculture Journal. Issue 6. http://six.fibreculturejournal.org/fcj-036-from-stabilitas-loci-to-mobilitas-loci-networked-mobility-and-the-transformation-of-place/.

http://audiob.us/

http://iotic.com/samvada/

http://samplr.net

http://sugar-bytes.de/content/products/TurnadoIOS/index.php?lang=en

 


Sound artist and musician, with a practice rooted in UK electronic dance music and DJ culture, Steve Jones is currently undertaking doctoral research under the supervision of Dr. John Richards at the Music, Technology and Innovation Centre, De Montfort University, Leicester, UK.

http://steranko.tumblr.com
http://amancalledadam.com
