This essay analyzes the different materialities of AI and how they script their interactions with humans through gender and sex stereotypes. By examining the German BMW GPS, the ELIZA chatbot, Blabdroid, hitchBOT, and the “Amber Doll Project,” connections can be drawn about how the appearances and behaviors of these robots elicit predictable reactions from people, whether disgust, fear, happiness, sexuality, or nurturance. From these connections we can extrapolate how the way we build objects enforces objectification.
The following is a collection of stories about interactions between robots and people. It is important to keep in mind how the materiality of each robot, including its voice, size, and expression, scripts its narrative and agency. Why are German men so bothered by an artificial female voice giving them directions? What about ELIZA made people feel they could talk to her more freely and openly than to a real person? Was Blabdroid so successful because it was small and voiced by a real 8-year-old boy? Was it the physical features of hitchBOT or the culture of America that led to its destruction? Can a sex doll ever be treated as something that has functions other than sex?
German Drivers and the BMW GPS
In the late 1990s, BMW recalled the female-voiced navigation system in its 5 Series cars in Germany. Female voices in navigation devices were not new; they have been used since World War II, when women’s voices were employed in airplane cockpits because they stood out among those of the male pilots. The reason for the recall was that German drivers were refusing to take directions from a woman-voiced navigation system (“Studying Computers To Learn About Ourselves,” NPR). BMW was flooded with complaints insisting that the voice be changed. BMW tried to clarify to its customers that the voice did not belong to a real woman, that it was only a computer giving them directions, and that, in fact, “all the people who had designed the GPS and the directions were male” (“Studying Computers To Learn About Ourselves,” NPR). The German drivers still insisted that it be changed. This outrage toward the female-voiced navigation system has been regarded as a case of deep-seated cultural stereotypes about women, driving, and being in control.
ELIZA the Psychotherapist
ELIZA was the world’s first chatbot. It emulated a Rogerian psychotherapist, asking users personal questions and discussing their lives with them. Invented in 1966 by MIT computer programmer Joseph Weizenbaum, ELIZA was adored by the people who interacted with her. While most people could recognize that the program was not operated by a real person, ELIZA’s users still felt a sense of attachment to the program and comfort in knowing that the things they talked about with ELIZA wouldn’t be judged or scrutinized (“Studying Computers To Learn About Ourselves,” NPR). ELIZA’s creator, however, was not so fond of her. Weizenbaum was appalled at how easily people wanted to believe ELIZA was real: he wrote of “people who knew very well they were conversing with a machine” who nonetheless “soon forgot that fact, just as theatergoers, in the grip of suspended disbelief, soon forget that the action they are witnessing is not ‘real’” (Computer Power and Human Reason). Weizenbaum considered ELIZA a fraud, and to prove it, he wrote an entire book, Computer Power and Human Reason, in which he “exposed” her.
Blabdroid
In the late noughties, scientist and professor Alexander Reben became fixated on how to build robots to which people would be more willing to tell personal, embarrassing information. He teamed up with artist and filmmaker Brent Hoff to invent the adorable cardboard robot Blabdroid (Sydell, 2018). Blabdroid is a small, simple robot with cameras in its eyes, a button on its side that activates a voice box, and a track that allows it to move. Blabdroid interacts with people by asking them questions, narrated by Hoff’s 8-year-old son, which adds to its childlike, sweet presence (Sydell, 2018). The kinds of questions Blabdroid asks include “who do you love most in the world?”, “what will people remember you for?”, and “if you died tomorrow what would you regret the most?” (Blabdroid.com). People loved Blabdroid; they found its little cardboard body and cheeky cut-out mouth non-threatening and endearing (Sydell, 2018). Because the robot possesses such inherently cute and nonthreatening features, users were more likely to speak openly and honestly to it; people often sound as though they are talking to a child, using simple words and a positive tone. Hoff and Reben are now making a documentary in which Blabdroid wanders around asking different people questions and recording their interactions and responses (Blabdroid.com).
The Hitchhiking Robot
HitchBOT was a quirky “hitchhiking robot” from Canada, built by creators David Harris Smith and Frauke Zeller as an art piece and social experiment. HitchBOT was a gender-neutral robot with a beer bucket for a body and red LED lights in its plastic head that lit up its smiling face (Bayerque, 2016). HitchBOT quickly gained international attention through social media and successfully hitched rides all across Canada and in Europe. Along the way, it kept an updated Instagram and Twitter with photos of the people it met and the places it went. Its creators wrote brief instructions on its back to help the travelers who would guide it through its American bucket list, which included being the fifth face in a photo of Mount Rushmore (hitchbot.me). HitchBOT was also great at conversing with the people who picked it up. Its voice was gender-neutral and upbeat, and with all of Wikipedia at its disposal it could carry on impressively sophisticated conversation (CBCTheNational, “HitchBOT the Hitchhiking Robot”). When asked about hitchBOT coming to America, Smith said, “Americans are saying, oh yeah they’re doing that up in Canada. Canadians are crazy. It will probably work in Canada, it would never work here because here in the States we would put it in a ditch or shoot it” (CBCTheNational, “HitchBOT the Hitchhiking Robot”). Perhaps the Americans were right: in July 2015, at one year old, hitchBOT was found beaten and dismembered in a Philadelphia alleyway (Victor, 2015). There was overwhelming support for hitchBOT on social media, and many people were left wondering why its journey was so successful until it reached the United States.
The “Amber Doll Project”
The “Amber Doll Project,” created by Amber Hawk Swanson, is an ongoing project running from 2006 to the present (2018). Swanson is an artist and professor at the Rhode Island School of Design, and for the project she made a sex doll modeled on her own body. She then created a series of photographs, performance pieces, and videos with the doll, named Amber, that express ideas about agency and objectification, “negotiating power through one’s own participation in a cultural narrative that declares women as objects” (“Amber Doll Project,” 2006). In these videos, we watch as Swanson marries the Amber doll. In one scene the doll lies facedown on a roller rink, and in another cut we watch as four men crudely sexually assault it at a football tailgating party, simulating oral sex with it and poking and prodding it.
What does all of this mean?
In this series of accounts of robots’ interactions with humans, the materiality of the robots scripts the performance of the people. It is an exploration of how a robot’s materiality determines what kind of reaction it will receive. Through all these instances, we can begin to distill the cause-and-effect nature of the materiality of robots and artificial intelligence. For example, if we compare Blabdroid and hitchBOT with ELIZA and the Amber doll, it may be that the features that made Blabdroid and hitchBOT likeable and cute were the ones that reminded people of children (small size, smiling face, adorable voice), while the features that led ELIZA and the Amber doll to be objectified and harassed may have to do with their being represented as women, with no voice or ability to animate themselves through speech. In Anthropomorphic Interactions with a Robot and Robot-Like Agent, Sara Kiesler, Aaron Powers, Susan Fussell, and Cristen Torrey (2006) investigated the effects of physical appearance and voice frequency on the attribution of sociability and competence to a robot (Eyssel and Hegel, p. 2215). They concluded that baby-faced humanoid robots were perceived as more sociable but less competent than mature-faced robots, and that a low voice frequency made participants more willing to take a robot’s advice (Eyssel and Hegel, p. 2216).
In both the case of the ELIZA chatbot and that of the German BMW GPS, there was backlash: with ELIZA, it was her creator who resented her, and with the BMW GPS, it was the German men who felt threatened by a female voice giving them directions. These instances are interesting to extrapolate from because in both, the AI has no physical form. Despite existing only virtually, through code, it still provoked an emotional response from the people interacting with it. It then becomes interesting to consider how the voice of a thing affects its perception. The research on gender-stereotypic responses toward computers by Nass, Moon, and Green (1997) applies to the female voices of non-bodied AI like the German BMW GPS, Alexa, or Siri. They found that a female-voiced computer in a dominant role was perceived more negatively than a male-voiced one, and that praise from the male-voiced computer was taken more seriously than praise from the female-voiced computer (Eyssel and Hegel, p. 2216).
It seems that if objects are not constructed with all of these factors in mind, they lend themselves to objectification. When the female form is represented in AI objects, the more “convincing” the object is, the more it is sexualized and objectified. We can see this happen in the case of the Amber doll. Because it was a molded replica of a woman, it ended up showing how a realistic manifestation of the female body will be interacted with: in almost every recorded encounter with the Amber doll, people are laughing, grabbing, touching, and thrusting sexually at it.
As technology moves forward and robots become more integrated into people’s everyday lives, considering the materiality of the robots we create becomes all the more important in deciding which social constructs we want to break and which we want to cement. Every decision about a robot’s materiality shapes its identity and changes the lens through which it is perceived and interacted with.
“Amber Doll Project.” AMBER HAWK SWANSON. Accessed April 08, 2018. http://amberhawkswanson.com/artwork/2940916.html.
Bayerque, Nicolas. “A Short History of Chatbots and Artificial Intelligence.” VentureBeat. August 15, 2016. Accessed April 08, 2018. https://venturebeat.com/2016/08/15/a-short-history-of-chatbots-and-artificial-intelligence/.
“BlabDroid.” Blabdroid.com. Accessed April 08, 2018. http://blabdroid.com/.
Griggs, Brandon. “Why Computer Voices Are Mostly Female.” CNN. October 21, 2011. Accessed April 08, 2018. https://www.cnn.com/2011/10/21/tech/innovation/female-computer-voices/index.html.
“HitchBOT.” HitchBOT. Accessed April 08, 2018. http://www.hitchbot.me/.
CBCTheNational. “HitchBOT the Hitchhiking Robot.” YouTube. July 28, 2014. Accessed April 08, 2018. https://www.youtube.com/watch?v=4pWNQ3yUTJo.
“Robot’s Cross-country Trek Ends Abruptly in Philly.” Fortune. Accessed April 08, 2018. http://fortune.com/2015/08/03/hitchbots-trip-ends-in-philly/.
“Studying Computers To Learn About Ourselves.” NPR. September 03, 2010. Accessed April 08, 2018. https://www.npr.org/templates/story/story.php?storyId=129629756.
Sydell, Laura. “Sometimes We Feel More Comfortable Talking To A Robot.” NPR. February 24, 2018. Accessed April 08, 2018. https://www.npr.org/sections/alltechconsidered/2018/02/24/583682556/sometimes-we-feel-more-comfortable-talking-to-a-robot.
Victor, Daniel. “Hitchhiking Robot, Safe in Several Countries, Meets Its End in Philadelphia.” The New York Times. August 03, 2015. Accessed April 08, 2018. https://www.nytimes.com/2015/08/04/us/hitchhiking-robot-safe-in-several-countries-meets-its-end-in-philadelphia.html.
Eyssel, Friederike, and Frank Hegel. “(S)he’s Got the Look: Gender Stereotyping of Robots.” Journal of Applied Social Psychology 42, no. 9 (2012): 2213-2230. doi:10.1111/j.1559-1816.2012.00937.x.