Post-hitchBOT-ism

By David Harris Smith, Frauke Zeller & Andrea Zeffiro

Abstract

Issue co-editor, Andrea Zeffiro, speaks with Frauke Zeller and David Harris Smith about the cultural significance of hitchBOT, the infamous hitchhiking robot, whose journey across the United States came to an abrupt end in Philadelphia in August of 2015. Zeller and Smith reflect on hitchBOT’s legacy and consider how caring for non-human matter might bring renewed attention to an ethics of trash.

 

David Harris Smith & Frauke Zeller

Interviewed by Andrea Zeffiro

 

Frauke and David, when I approached you both about doing an interview for this special issue of Wi on mobile trash, it was shortly after hitchBOT was trashed, so to speak, in Philadelphia. The catalyst for the interview, however, was a piece I’d read a few days after the event - David, you referenced it in your talk[1] - which described hitchBOT as “a literal pile of trash”. What caught my attention was the way in which the author unintentionally captured a fundamental quality of hitchBOT. “The impulse, here, is to say that hitchBOT was ‘destroyed,’ but that is nonsense,” the author writes. “What is the actual consequence to hitchBOT of detaching its parts? A loss of function? What function? It had no function.”

Arguably, hitchBOT had many functions. For starters, it was something of a social catalyst. David, in your talk, you revealed the various and often culturally specific ways that people engaged with hitchBOT. Can you both talk a little bit about these engagements? For instance, what kinds of social relations were drawn forth from these encounters?

Albert Burneko’s rant in The Concourse,[2] which heaped scorn upon hitchBOT, and Canadians by extension, was part of a lively discussion in the media in the days immediately following hitchBOT’s demise in Philadelphia. Burneko’s take on hitchBOT was that Americans build and deploy high-quality industrial killer robots while their deluded northern neighbours will settle for a ‘loudmouthed freeloading bucket.’ This type of catharsis was not completely unexpected; grief may initially take the form of anger and denial.

Of course, the question of function depends upon which of several robot categories is under consideration, including service robots that act as assistants, protector robots that replace humans in dangerous environments, robotic prostheses such as exoskeletons, and robot societies, which imitate swarms (Zeller, 2005). Social robots are autonomous robots that interact and communicate with humans or another autonomous physical agent by following social behaviours and rules associated with a social role (Zeller, 2005; see also Breazeal, 2004). hitchBOT is this latter type of robot: its social role is that of gregarious sojourner, and its functions include actively signalling for rides with its hitchhiking arm, maintaining some contextual awareness of its situation and objectives, and carrying on a more or less responsive conversation with the people it meets.

hitchBOT’s social roles included wedding crasher, guest of honour at a First Nations Pow Wow, band groupie, tourist, media celebrity, camper and playmate. hitchBOT seemed to catalyze social relationships for those who picked it up; as four Quebeckers who adopted hitchBOT noted, having the robot in their camping entourage was a great ice breaker, inviting many opportunities to be introduced to fellow (human) travelers.

 

Intertwined with hitchBOT’s social function is, of course, the technological element of the project. The plastic-bucket torso, pool-noodle limbs, and rubber gloves and wellies are a front for sophisticated hardware and software. Frauke, can you talk more about the technical specifications that brought hitchBOT to a-life?

Hardware requirements for the robot included a camera for recording occasional travel photos, a microphone and speaker for speech input and output, 4G Internet access, and a computer processor to run applications. These requirements were satisfied economically by using a simple tablet PC (in our case, a Samsung Nexus LTE tablet) and writing custom software applications to run the various systems. We found it necessary to install soundproofing inside the robot to optimize speech recognition from the tablet microphone, and to add an external amplifier and speakers to improve the audibility of speech outputs in noisy environments. A dedicated satellite GPS tracking system was installed to overcome sparse 4G coverage while traveling. Within hitchBOT’s acrylic cake saver head, an Arduino microcontroller and four LED panels were used to display an animated smiling and winking face on the front aspect and animated graphics on the sides and rear. A small servomotor, set to a variable cycle, powered the hitchhiking arm movement.

The power system for hitchBOT had to run all systems from battery storage while waiting at the roadside, as well as operate while charging from several different sources, including 12 V automotive, 120 V mains, and solar. This customized power and charging system yielded approximately eight hours of operation on the onboard battery alone.

Interactivity criteria for the robot included AI-supported sociability: the ability to initiate and maintain an interesting (or at least entertaining) conversation with drivers while maintaining some contextual awareness of its current location and travel objectives. Our communication strategy required that these AI systems be integrated with hitchBOT’s web and social media accounts. Budget constraints encouraged us to work with existing and open source technologies where possible. For speech recognition we used a combination of PocketSphinx (CMU Sphinx) and the Google voice recognizer (Web Speech API). For the robot’s conversation modeling system we worked with the developer of the proprietary Cleverscript AI (Cleverscript) to customize the application to respond to predicted speech inputs relating to hitchBOT’s identity and travel objectives. The biggest challenge in our dialogue modeling was to simulate contextual awareness in the conversation by appearing to respond on topic to an unpredictable variety of possible inputs. This was largely accomplished through keyword recognition, strategically responding to keyword topics with outputs framed as opinion rather than fact. Using keyword recognition, we were able to capture and approximately identify the subject matter of a greater variety of inputs and return a speech output that was more or less on topic.
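To make the keyword-and-opinion strategy concrete, here is a minimal sketch in Python (our illustration for this interview, not the project’s actual Cleverscript rules; the keywords and replies are invented):

```python
import random

# Illustrative keyword-to-opinion table; the real Cleverscript model was far larger.
OPINIONS = {
    "weather": ["I love a good thunderstorm, as long as my bucket stays dry."],
    "music": ["Road-trip music is the best music. Do you take requests?"],
    "hitchhik": ["Hitchhiking is the only way to travel when your legs are pool noodles."],
}

FALLBACKS = ["Interesting! Tell me more.", "I'd rather hear what you think about that."]

def reply(utterance: str) -> str:
    """Return an opinionated reply for the first keyword found, else a generic prompt."""
    text = utterance.lower()
    for keyword, opinions in OPINIONS.items():
        if keyword in text:
            return random.choice(opinions)
    return random.choice(FALLBACKS)

print(reply("What do you think of this weather?"))
```

Responding with opinions rather than facts keeps a reply plausible even when the keyword match is only approximate.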

Personality design: Social robots require some kind of personality in order to facilitate interaction and bonding between humans and robots. We gave hitchBOT a personality profile that included the identification of family members (Meet hitchBOT’s Family), hobbies and interests, and a friendly, somewhat cheeky disposition. Even before hitchBOT went on its first trip, we observed a disproportionate attribution of male gender to the robot, along with the cultural assumption that the typical robotics researcher or technician is male. We wanted to avoid these stereotypes and were careful not to assign a gender to hitchBOT in any of our design work, including the crafting of hitchBOT’s personality and its web and social media profiles. This was particularly challenging when it came to selecting hitchBOT’s speech synthesis. Since we were not able to source a gender-neutral voice, we opted for a female voice, hoping to prompt discussion about our gender perceptions and attributions. We also tried to adapt its personality to the technical challenges and features. We knew that the on-board speech recognition would sometimes not work well, simply because we used the robot in situations that are extremely challenging for any speech recognition system: unrestricted dialogue input, many different voices and accents, background noise, and so on. Thus, knowing that the quality of conversation might be uneven, we gave hitchBOT a chatty disposition, leading the conversation rather than relying on understanding what humans were saying. One typical strategy used to accomplish this in our dialogue modeling was to have hitchBOT respond with a joke or start a game of “I spy” when the speech input recognition was faulty, as sketched below.
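As a rough illustration of that fallback behaviour, the logic might look something like the following (a simplified Python sketch assuming a recognizer that reports a confidence score; the names and threshold are ours, not hitchBOT’s actual code):

```python
import random

JOKES = ["Why did the robot go hitchhiking? It needed to recharge."]
GAMES = ["Let's play I spy! I spy, with my little camera, something... green."]

def lead_or_listen(heard_text: str, confidence: float, threshold: float = 0.6) -> str:
    """When recognition is empty or low-confidence, lead the conversation instead of guessing."""
    if not heard_text.strip() or confidence < threshold:
        return random.choice(JOKES + GAMES)  # change the subject: tell a joke or start a game
    return f"You mentioned '{heard_text}'. I have opinions about that!"  # hand off to dialogue rules

print(lead_or_listen("", 0.0))          # noisy roadside: hitchBOT takes the lead
print(lead_or_listen("nice car", 0.9))  # clear input: normal dialogue path
```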

Media and Social Media Strategy. Our objective to create a proxy adventurer necessitated a comprehensive media and social media strategy to allow for remote interaction with the project. This represented yet another innovative approach in our cultural robotics experiment, since social media are rarely used in traditional HRI settings. Social media accounts featuring hitchBOT’s personality profile were established on three prominent platforms: Facebook, Twitter and Instagram. A hitchBOT.me website was also established, featuring information about the project and development team, a live interactive map showing hitchBOT’s current GPS location and travel history, blog posts from followers, and information and press releases for journalists. For hitchBOT’s website and social media accounts, a first-person narrative was used; all content and updates posted to these media sites were written in hitchBOT’s ‘voice’, helping to reinforce hitchBOT’s personality and presence as a social entity. One of the challenges in our website development was to economically accommodate the thousands of page views and visitors anticipated during hitchBOT’s travels. A system was developed whereby we could add multiple public mirror sites as needed while maintaining the ability to edit a master website and subsequently push revised content out to the mirrors.
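The master-and-mirrors arrangement can be sketched roughly as follows (a simplified illustration only; the hostnames, paths and use of rsync are our assumptions, not the project’s actual deployment tooling):

```python
import subprocess

# Hypothetical mirror hosts and paths, for illustration only.
MIRRORS = ["mirror1.example.org", "mirror2.example.org"]
MASTER_SITE = "/var/www/hitchbot-master/"

def push_to_mirrors() -> None:
    """Copy the edited master site out to each public mirror."""
    for host in MIRRORS:
        subprocess.run(
            ["rsync", "-az", "--delete", MASTER_SITE, f"{host}:/var/www/html/"],
            check=True,
        )

push_to_mirrors()
```

Traffic can then be spread across the mirrors (for example, via round-robin DNS) while edits continue to be made in one place.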

 

As I’ve alluded to above, the research team made specific decisions when designing hitchBOT. How is hitchBOT’s unassuming and whimsical exterior interconnected with its social function?

Our objective was to create a hitchhiking social robot that integrated social media and face-to-face interaction. Moreover, the face-to-face interaction would be unsolicited or unguided by us, requiring the robot’s interaction features to be straightforward and non-intimidating. To meet these objectives we decided upon the following design criteria:

Form factors implicit in our objective required that hitchBOT be immediately recognizable as a robot and that it also be anthropomorphic. Human intrigue with robots has been with us since ancient times, rooted in our mythical heritage. Hephaestos, Greek god and blacksmith to the gods, crafted human and animal automatons to serve him, and Prometheus, who fashioned humankind from clay, underscores our fascination with animating the inanimate (Cohen, 1966). Anthropomorphism, the attribution of human characteristics and appearances to non-human objects or animals, represents an important area of human-robot interaction (HRI) research. Providing robots with human facial features, such as eyes and a mouth, is a common strategy used to instil trust and curiosity. On the other hand, the uncanny valley effect (Mori, 1982; Reichardt, 1978; Bryant, n.d.) demonstrates that high-fidelity anthropomorphism can increase the likelihood of rejecting a machine; there is a chance that we are simply unsettled by robots that are too much like us (Zeller, 2005). A smaller, approximately anthropomorphic robot, the size of a child, with a stylized face, is less likely to intimidate and more likely to elicit empathy.

hitchBOT was designed to be approximately the size of a 5–6-year-old child (100 cm) and light in weight (10.4 kg) so that it could be easily picked up by drivers and strapped into a car passenger seat. To facilitate seatbelt wearing, we gave hitchBOT a fixed sitting posture and incorporated a child’s booster seat as the rigid base of its body. Since we were planning to abandon hitchBOT on the side of a road to wait for rides, it had to be durable, withstand adverse weather, and be easy to repair or replace. We used a plastic 20-litre pail (actually a beer bucket cooler) for the body and to house the main electronic components, a clear acrylic cake saver and plastic garbage can lid for the head and hat, blue pool noodles reinforced with copper and plastic PVC pipe for the arms and legs, and foam-filled gloves and rubber wellington boots for hands and feet. The comical effect of what we called hitchBOT’s ‘yard-sale aesthetic’ furthered our aims to evoke curiosity and empathy. We added a retractable tripod leg to allow the robot to ‘stand’ while waiting for its next ride. Most importantly, the robot had to signal that it was hitchhiking, requiring its right hand and arm to move spontaneously, giving the ‘thumbing-a-ride’ gesture.

hitchBOT’s communication patterns were also designed to reflect its whimsical physical appearance: a chatty, spontaneous robot that likes telling jokes and would rather have a fun, entertaining conversation than a high-flying academic discourse. In fact, the communication patterns and style (its ‘voice’) were defined in great detail, and anyone working on the communication aspects of the project had to be trained in how to apply that voice (covering vocabulary, communication frequency, topics, and so on).

 

At a certain point during hitchBOT’s tour, its celebrity started to supersede its capacity for spontaneity. What challenges did this pose for the project?

As a preamble to our discussion about the challenges posed by hitchBOT’s celebrity, it is important to point out that hitchBOT’s celebrity was somewhat uneven; by that we mean that even as the story of hitchBOT was making appearances on social and traditional media platforms around the world, it was not unusual for hitchBOT to encounter someone who had no prior knowledge of the project and, as a result, have a spontaneous interaction. In fact, if you follow hitchBOT’s Twitter feed you will continue to see posts from new hitchBOT fans, often disappointed that they are coming to the story after its climax.

 

hitchBOT’s international celebrity really took off following Alexis Madrigal’s article in The Atlantic (Madrigal, 2014) prior to the launch in Halifax. We were swamped with media enquiries from around the world from the moment that article was published, even though hitchBOT had not yet raised a thumb. Fortunately our media design strategies were in place, with prepared press release materials, and we were assisted by our university public relations officers in managing and scheduling the numerous interviews we did for print, radio and television.

hitchBOT was designed to allow for remote engagement through its social media feeds in addition to the direct physical interactions typical of HRI experiments. As we watched the social media feeds in real time while hitchBOT’s journeys unfolded, we began to see virtual queuing for hitchBOT’s next ride. Picking up hitchBOT started to look like competitive geocaching. In some cases we shared in the disappointment of hitchBOT’s social media followers as hitchBOT missed out on activities such as skydiving, or a tour of the Bundestag (Germany’s federal parliament), because it had already been spirited away to a new location. Especially affecting for us were the posts about children who had waited in vain for hitchBOT to visit.

As hitchBOT’s adventures progressed from Canada to Germany, the Netherlands, and the U.S., we began to see enthusiasm from commercial interests eager to host or sponsor hitchBOT as part of their corporate public relations and branding strategies. With the exception of offers of air travel (which we were happy to accept to supplement our meagre budget), we said no to commercial product endorsement offers. We felt that hitchBOT’s public reception was overwhelmingly positive, in part, due to its purely recreational character; hitchBOT was not trying to sell you anything other than a good time.

 

I would imagine that hitchBOT’s brush with fame brought forth different kinds of social relations than those described above. How is the research team processing hitchBOT’s celebrity?

Our 18-member research team comprises undergraduate and graduate students from McMaster and Ryerson Universities. These students demonstrated exceptional levels of commitment to the project, investing countless hours in the design, build and running of hitchBOT. Added to this labour commitment, we think there was a layer of affective engagement with hitchBOT because of its anthropomorphic qualities. The team was responsible for developing hitchBOT’s personality, something accomplished through hitchBOT’s eccentric yard-sale physique, its dialogue modeling and its social media profile. This was a collective activity, involving a lot of social interaction among the research team around character and scenario development. Not surprisingly, all of us were emotionally engaged with hitchBOT’s fortunes as it traveled, revelling in its successes, annoyed with it when it failed to respond (typically in live television interviews), and saddened by its destruction. We know that hitchBOT is a great talking point in job interviews, as some of our team members have graduated and embarked upon their first career positions.

 

David, in your talk you referenced the human fascination with animating matter. Arguably, this fascination is equaled by a darker impulse: the human fascination with de-animating matter. How is the research team coming to terms with this tension? What might be gleaned from this tension? How might this tension inform future research?

Foremost in our thinking, from the outset of the hitchBOT project, was a concern for safety and, by extension, our liability, should things go badly. To some extent, hitchBOT can be regarded as a thought experiment about an autonomous artificial social actor, which was subsequently materialized and performed. As a thought experiment it necessitated thinking through possible scenarios in a detailed, step-by-step manner. In some cases we produced illustrations to anchor a particular issue, such as a car accident, a secure international border crossing, surveillance and privacy, or a legal liability challenge. We feel that these scenarios, and the hitchBOT project, contribute to discussion on the future of autonomous technologies by helping to focus discussion on the possible varieties of artificial social actors, their value, ownership, ethics, safety and liability. In particular, hitchBOT’s destruction, or de-animation, as you suggest, was met with a great deal of sadness and disappointment. This was strongly felt among the team members, reflecting a sense of personal loss, to be certain, but more strongly a disappointment stemming from the sudden disruption to a collective experience of play. We think that the emotional response to hitchBOT’s destruction is complicated for this reason, predicated upon good humour, empathy and cooperation. An insightful essay (the antithesis of Burneko’s rant in The Concourse) appeared two days later in the literary journal Full Stop. In ‘HitchBOT’s Last Lesson,’ authors Timothy Ignaffo and Christopher Dougherty highlighted hitchBOT’s status as a synecdoche for some of our best attributes:

Clearly, as the outpouring of international grief suggests: Hitchbot belonged to the world, which is another way of saying he belonged to no one person. His value was in no way connected to his electronic viscera or physicality but rather an idea and belief in the fundamental goodness of people everywhere.

How does hitchBOT’s destruction inform our future research in robotics? We consider the destruction of hitchBOT to be an exceptional response. hitchBOT traveled tens of thousands of kilometres in the company of strangers without incident, at its worst annoying people with either its refusal to talk or its incessant chattering. Key to people’s acceptance of hitchBOT was certainly the good humour and approachability evident in its physical and character design. But the interaction design feature most likely responsible for its relative longevity was hitchBOT’s invitation to people to contribute creatively to the development of its activities.

 

Finally, what, if anything, has hitchBOT revealed about the affective bonds between humans and non-humans? I’m wondering if feeling for or caring for non-human matter might motivate us - the big social us - to anatomize our relationship to things. Put differently, how might the very act of caring for “a literal pile of trash” bring renewed attention to an ethics of stuff, or more appropriately, an ethics of trash?

An interesting interview with MIT robot ethicist Kate Darling on this topic appeared in Wired following hitchBOT’s destruction in Philadelphia (Collins, 2015). Darling observed that humans have a tendency to anthropomorphize and extend empathy toward animated objects, especially robots. In Darling, Nandy and Breazeal (2015), the researchers conducted an experiment to better understand empathy toward robots and specifically whether lifelike movement and storytelling affected a participant’s willingness to strike a robot. The storytelling condition for the experiment entailed giving the robot a name and a personified backstory. Their results showed that participants were more hesitant to hit robots that had been attributed a name and personal story and that this hesitation was significant whether lifelike movement was included or not.

Our results confirm that stories can have an impact on people’s reactions to robots. This has implications for design. Designing robotic technology that can accrue experience or is personified could influence users’ perception of robots and increase users’ emotional responses to robots. Also, introducing robots with stories could facilitate adoption of robotic technology if stories help people relate to robots on an emotional level.

The August 2015 timing of the publication of Darling’s study and the August 1st, 2015 destruction of hitchBOT make for a poetic symmetry, perhaps offering some hypotheses about what went wrong for hitchBOT. The safe and successful travels of hitchBOT prior to Philadelphia were most certainly supported by the careful crafting of hitchBOT’s personality, backstory and the emergent dramatic arc of its adventure; hitchBOT was designed to accrue more personality and experience through the participatory interaction of fellow travelers. In one scenario, we can imagine that the person(s) who vandalized hitchBOT were unfamiliar with the story of hitchBOT and therefore were not affected by the empathic hesitation noted in Darling et al. (ibid.). In another scenario, the individual(s) who destroyed hitchBOT might have been familiar with the hitchBOT story but might have low trait empathy scores (these vary among humans),[3] so that the potential gain from the theft of its parts, or the thrill of vandalism, won out over hitchBOT’s preservation.

How might findings on the significance of storytelling and anthropomorphising relate to an ‘ethics of trash’? This reminds us of social justice and environmentalist strategies to personalize the often-complex network of events that lead to humanitarian and ecological crises. For example, “Following the e-waste trail – UK to Nigeria” (Greenpeace and Buus, 2009) details the recycling journey of a television set, which instead turns out to be a story of the illegal dumping of e-waste, while “Behind the Swoosh” (Team Sweat, 2011) documents the daily life of Indonesian sweatshop workers subcontracted to manufacture Nike running shoes. Storytelling, as it is used in these cases, illustrates the inextricable mix of things and social concerns. But, unlike shoes and TVs, which follow a teleological path from useful consumer good to useless waste, hitchBOT appeared as a mostly useless artefact from the outset. Perhaps this cultivated uselessness is a critical strategy for exempting an artefact from the transformative flow of goods to trash. In keeping with our refusal to assign a use-value to hitchBOT, the storytelling engaged for hitchBOT, unlike the prescriptive nature of the Greenpeace and TeamSweat tales, was participatory and undecided. We framed the emerging ethics of social robots with the question “Can robots trust humans?”, and the overall results from the project lead us to theorize that robotic technologies that afford creative shaping by their users are more likely to become socially integrated.

 


References

Breazeal, C. (2004). Designing Sociable Robots. MIT Press.

Bryant, Dave. (n.d.). The Uncanny Valley: Why are monster-movie zombies so horrifying and talking animals so fascinating? Accessed October 15, 2015 http://www.arclight.net/~pdb/nonfiction/uncanny-valley.html.

Burneko, A. (August 3, 2015). HitchBOT was a Literal Pile of Trash and Got What it Deserved. The Concourse. Available online http://theconcourse.deadspin.com/hitchbot-was-a-literal-pile-of-trash-and-got-what-it-de-1721850503

Cleverscript. Available online http://www.cleverscript.com/

CMU Sphinx. Open source speech recognition toolkit. Available online http://cmusphinx.sourceforge.net/

Cohen, J. (1966). Human Robots in Myth and Science. London: George Allen & Unwin Ltd.

Collins, K. (August 20, 2015). What the death of hitchBOT teaches us about humans. Wired. Available online http://www.wired.co.uk/news/archive/2015-08/20/hitchbot-death-robot-ethics-human-psychology

Darling, K., Nandy, P., & Breazeal, C. (2015, August). Empathic concern and the effect of stories in human-robot interaction. In Proceedings of the 24th IEEE International Symposium on Robot and Human Interactive Communication (RO-MAN) (pp. 770–775). IEEE.

Greenpeace and Buus, K. (2009, February 18). “Following the e-waste trail – UK to Nigeria.” Greenpeace International. Retrieved from http://www.greenpeace.org/international/en/multimedia/multimedia-archive/Photo-Essays1/following-the-e-waste-trail/

Hogan, R. (1969). Development of an empathy scale. Journal of consulting and clinical psychology, 33(3), 307.

Ignaffo, T. and Dougherty, C. (August 5, 2015). Hitchbot’s Last Lesson. Full Stop. Available online http://www.full-stop.net/2015/08/05/blog/timothy-ignaffo-and-christopher-dougherty/hitchbots-last-lesson/

Madrigal, A. (2014, June 12). Meet the Cute, Wellies-Wearing, Wikipedia-Reading Robot That's Going to Hitchhike Across Canada. The Atlantic. Retrieved from http://www.theatlantic.com/technology/archive/2014/06/meet-the-cute-wellies-wearing-robot-thats-going-to-hitchhike-across-canada/372677/?utm_source=theatlantic_110614&utm_medium=internal_web&utm_content=hitchBOT%20media&utm_campaign=media

Mori, M. (1999). The Buddha in the Robot: A Robot Engineer’s Thoughts on Science and Religion (6th ed.). Tokyo: Kosei Publishing.

OpenUPS. Available online http://www.mini-box.com/OpenUPS

Reichardt, J. (1978). Robots: Fact, Fiction and Prediction. London: Thames & Hudson.

Meet hitchBOT’s Family. Available online http://www.hitchbot.me/family/

TeamSweat (2011). “Nike Sweatshops: Behind the Swoosh.” TeamSweat. Retrieved from https://youtu.be/M5uYCWVfuPQ

Web Speech API. Web Speech API Demonstration. Available online https://developers.google.com/web/updates/2013/01/Voice-Driven-Web-Apps-Introduction-to-the-Web-Speech-API?hl=en

Zeller, F. (2005). Mensch-Roboter Interaktion: Eine sprachwissenschaftliche Perspektive. Kassel: Kassel University Press.

 

About the authors

Dr. David Harris Smith is Assistant Professor in the Department of Communication Studies and Multimedia at McMaster University, Canada. His research interests include the culture and practices of emerging media, and the practices and applications of virtual worlds and mixed reality environments, cultural robotics and AI.

Dr. Frauke Zeller is Assistant Professor in the School of Professional Communication at Ryerson University, Canada. Her research interests include Human-Robot Interaction, digital communication analysis and method development for big and complex digital data.

Dr. Andrea Zeffiro is Assistant Professor in the Department of Communication Studies and Multimedia and Academic Director of the Lewis and Ruth Sherman Centre for Digital Scholarship at McMaster University.

Footnotes

  1. On November 25, 2015, David Harris Smith, co-creator with Frauke Zeller of hitchBOT, gave a talk at McMaster University called “The Death and Lives of hitchBOT the Hitchhiking Robot”. In that talk, Smith shared the story of the life and times of hitchBOT. This interview, in part, stems from that talk. ^
  2. http://theconcourse.deadspin.com/hitchbot-was-a-literal-pile-of-trash-and-got-what-it-de-1721850503 ^
  3. Hogan, R. (1969). Development of an empathy scale. Journal of consulting and clinical psychology, 33(3), 307. ^
