Artist and game designer, Digital Utopia editorial board
On an evening stroll along the London wharfs I met a woman. We spent the night together, and the following day we agreed that this was more than a casual fling: we should see each other again whenever I returned from my journeys at sea. A few tours on my boat and a few dates later, I received a note that she was with child. Having earned enough money, I bought a house – a respite for myself, and a container for my lover and offspring. They moved into our humble abode, and there they await my return – always available whenever I arrive from my journeys at sea. My child always listens to my stories, if I spare them the time, and through my stories I shape my child.
Public discourse around robots and artificial intelligence has always been prophetic: anticipating their grand arrival on the world stage as a watershed moment that is yet to come, but will be upon us as soon as the technologies, markets and regulations mature to allow it, and will transform society when it finally does. Whether set in the far future or right around the corner, visions of robots and similarly artificial agents imagine them as our servants or our children, rising up against us in a technological slave rebellion or patricide. Other visions imagine robots as partners, blending into human society or becoming host bodies for a human soul so that it may outlive the flesh.
The popular form of this discourse manifests through speculative fiction, starring robots and AIs like Star Trek’s Data, The Terminator’s Skynet and Terminators, or The Matrix’s Agent Smith; as well as public-facing, spectacular, speculative engineering attempts, such as IBM’s Deep Blue, Boston Dynamics’ Atlas and Spot, and Hanson Robotics’ Sophia.
In recent years, a new vision for AI and robots has been gaining popularity: that of emotional support and companionship for humans. Pinpointing the causes of the trend can be difficult, since, like many other ideas, it has deep roots – going back at least to Greek mythology and the story of Pygmalion and Galatea, if not further. Is it a reaction to the adversarial depictions of robots in 20th-century media? Or to the increasingly adversarial relations we actually experience online with web-based bots? Perhaps it is inspired by advances in “natural language” algorithms, giving robots a familiar voice? Perhaps it is disseminated by researchers and courses in the rising academic discipline of Human-Computer Interaction (HCI)? Or maybe it isn’t about the robots at all, but rather reflects a yielding to internet-age alienation from our fellow humans? Regardless, the excitement and intrigue around the prospect of computers providing emotional services to humans is on the rise.
You can see it in “cutting edge” technological experiments and the media coverage that surrounds them (1), asking questions about how future society will adapt to the presence of emotionally-servile robots, and how companies and researchers can design their robots to be more emotionally appealing and perform more emotional labor for humans. Skeptics are asking us to consider how the advent of emotional service robots may harm humans whose work involves emotional labor, or emotionally-deskill humans (2) who receive such services from AI.
You can also see it in contemporary speculative fiction and science fiction. Best known, perhaps, is the 2013 film Her (directed by Spike Jonze), in which the human protagonist falls in love with an advanced AI operating system, and the 2012 film Robot and Frank (director: Jake Schreier, opening film at the 2012 Utopia International Film Festival), in which an elderly man develops a complex relationship with his emotional-support robot. A 2013 episode of the TV series Black Mirror depicted a woman’s relationship with an AI surrogate of her dead lover; in 2020 the premise was unironically attempted in real life, when the creators of the documentary Meeting You produced an interactive VR stand-in for a mother’s dead child (3).
Emotional Robots Have Been Here For a While
Most participants in this discourse have much to gain from its speculative nature, anticipating the future rather than tackling the past and present. Put simply, there is less friction in the uninhabited frontier, fewer waves in a Blue Ocean, unencumbered by the burden of proof. For the skeptics, it is a way to attract attention and stir emotion through our fear of the unknown. For the proponents, it is a way to attract fame and (a lot of) funding with promises that need not be shadowed by, or take responsibility for, present reality. As any Silicon Valley investor would tell you, the worst thing that can happen to a cutting-edge tech product is actually going to market.
But though it may be profitable to keep emotional robots in the realm of speculative design, it is simply not the true state of affairs. Emotional robots are already part of society, have been part of society for many decades, have become a cultural staple and even cliché – they simply look a little different than what the speculators would have you imagine.
That woman I met on the London wharfs is one of them. In my 9-to-5 I am no seafarer, nor do I own a house in London – but I am, and I do, in the videogame Sunless Sea (developed by Failbetter Games, first released in 2015). The game is filled with fictional characters, all reactive to my actions, all with their own fictional personalities and agendas, all in service of my grand seafaring adventure. My fictional lover’s agenda is, and forever will be, to comfort me whenever I return from sailing the wide and open seas.
Emotional support is already one of the most widespread uses of AI, and has been for many decades, primarily through the function of NPCs: non-player characters, made to inhabit digital spaces and provide human visitors with services that, in human terms, we would consider emotional labor. They act as teachers, companions, personal trainers, therapists, surrogate family, and hospitality workers. We meet them in computer games, on the internet, and in some electronic toys – in any digital domain where we expect to be made to feel good.
Understanding that this robotic function is not upcoming but already well ingrained in society – not waiting for some technological breakthrough but already refined over generations of design and actual use – reveals something deeper about these robots: depicting emotional-support AI as a sort of technology is false. It is instead a sort of fiction, defined not by its liveliness but by its servitude. Robots are defined not by some internal mathematical complexity, or by their degree of closeness to the “General Intelligence” coveted by nerdy engineers, but by their societal function: the stories we tell about them, the roles we cast them in and have them play.
The emblematic computer therapist, ELIZA, was conceived in the mid-1960s and became a cultural myth.
The simple scripts it ran in MIT’s experimental laboratory are exactly that – scripts, more fiction than intuition – and would not pass a contemporary Turing Test to be considered true AI. But that did not stop ELIZA’s creator, Joseph Weizenbaum, from being ushered out of the room when his secretary, as the myth goes, was having an intimate conversation with the digital therapist. ELIZA did not spawn widespread use of robots as explicit therapists, but its scripted-reaction techniques have found their way into many other digital agents, most notably today in the proliferation of chatbots that welcome humans to business webpages, providing very little functional service but adding a friendlier face to the emptiness of digital commercial space.
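For readers curious just how thin the script is, here is a minimal sketch of ELIZA’s core trick – keyword matching plus pronoun reflection. The rules below are made up for illustration; the actual DOCTOR script that ELIZA ran was considerably larger, but not fundamentally different in kind:

```python
import re

# A few illustrative rules in the spirit of ELIZA's DOCTOR script.
# Each pattern captures part of the user's statement; the response
# template reflects the captured fragment back as a question.
RULES = [
    (re.compile(r"\bi need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bi am (.*)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (.*)", re.I), "Tell me more about your {0}."),
]

# Swap first- and second-person words so the reflection reads naturally.
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are"}

def reflect(fragment: str) -> str:
    """Flip pronouns in the captured fragment."""
    return " ".join(REFLECTIONS.get(word.lower(), word) for word in fragment.split())

def respond(statement: str) -> str:
    """Return the first matching scripted reaction, or a stock prompt."""
    for pattern, template in RULES:
        match = pattern.search(statement)
        if match:
            return template.format(reflect(match.group(1)))
    return "Please, go on."  # default when no keyword matches
```

Typing “I need a holiday” yields “Why do you need a holiday?” – no understanding, just a captured substring echoed back inside a template. The entire illusion of a listening therapist rests on a handful of such substitutions.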
Other forms of emotional robots burst into the mainstream with the rise of commercial video-games. While early experiments such as Tennis for Two (William Higinbotham, 1958) and Spacewar! (Steve Russell, 1962) were fashioned in the mold of competitive sports and board games, where two players compete against each other, later commercial products like the digital arcade cabinet, home console, and personal computer had much to gain from allowing a single human customer to enjoy a game by themselves.
Because competition and struggle are fantastic motivators even for a Cartesian single player, digital robots first appeared as adversaries. But since they are now servants of the single human’s enjoyment, rather than partners in it, they could be made unequal: Pac-Man’s ghosts are no match for the player’s agility, Super Mario’s dumb Goombas squish under his boot, and Doom’s endless waves of grunts exist to be slaughtered. All operate according to scripts, and all make us feel powerful, skilled, and in control. Still the most popular form of NPC today, the myriad faces of computer-game adversaries are simply fiction on top of a million variations of Solitaire.
NPCs also provide companionship, standing in for best friends, parents, children and other family members, and of course adventuring cohorts, escorts and comrades. Most famous, perhaps, is the Tamagotchi trend of the mid-1990s. Marketed to children, it became a cultural icon: virtual pets living on tiny keychain computers that needed regular but very easy attention in order to thrive, and could even “die” if not properly taken care of – only to be reborn with the click of a button. This basic function evolved into other big trends. Furby was a reactive robot doll that did not require care and could not die, but ran a script that made it seem to be learning English (it was created in the US and designed for that market) over time spent with its owner. The Sims is a video-game series in which players care for virtual people, building for them a fantastic home and career modeled loosely on the American white-picket-fence bourgeois lifestyle. Sims are liminal robotic beings, halfway between virtual wards and virtual avatars, so that players can feel both care for and identification with them; this, combined with a fantasy of guaranteed, gradual upward class mobility, made The Sims the best-selling video game in history at the time of its release.
In addition to children or wards, many NPCs provide mentorship for personal growth. Adventure games often explain the rules of engagement to the player, cast in the shoes of a young protagonist, through the character of an older veteran. Following in the tradition of tabletop role-playing games like Dungeons and Dragons, video-games also tend to place the solitary human player in an adventuring party made up of robot companions. Usually, none of those companions will be as capable as the player, especially not in terms of growth – their static agency is reflected in a fixed set of abilities, while the dynamic human agent grows in skill and power. They will stick with the player through an adventure that is ultimately made for humans, playing their part in the human’s personal journey.
With the rise of complex online games, where a player sits alone in their physical space and meets both humans and NPCs in similarly-represented digital form, those relationships become more diffuse: I could be cooperating with a group of human and AI companions against a different group of AIs, or humans, or both. Human online playmates might seem more desirable, but truthfully it matters less who actually drives the characters on-screen; it matters more that regardless of being human or machine, they serve similar functions for the player at home.
The 2007 video game Portal exemplifies many of the NPC functions mentioned above, satirizing them while preserving every one. The principal NPC is portrayed as an actual robot: an AI operating system named GLaDOS that manages the laboratory in which the player finds themselves. GLaDOS guides the player through a series of “experiments” that teach them how to play, talking at them like a child or trainee. Every line GLaDOS says is scripted, yet she is reactive to the player’s performance, and her attitude is aimed directly at the player’s ego, compliments and insults alike. As the player progresses through the game’s narrative, GLaDOS is revealed to be not a mentor but an adversary; this revelation has fictional meaning but no functional meaning – the gameplay dynamics remain the same. The change happens exactly when the player has finished learning the game’s mechanics, no longer needs mentorship, and will instead enjoy the feeling of overcoming a rival.

The other NPCs in the game are Sentries and Companion Cubes. The Sentries are robot adversaries that shoot the player on sight, but they are completely incompetent, easy to defeat, and greet the player in a friendly voice as they fire a useless hail of bullets. Finally, the Companion Cube: a large box with the shape of a heart painted on its side. It’s just a box, completely mute, and it does absolutely nothing except carry the assigned role of companion. Players take Companion Cubes along with them, sometimes use them as weights to solve puzzles, and discard them when they wish to move on. Through the blunt metaphor of this gaming in-joke, Portal makes clear that the cube’s companionship is completely fictional, made to put the player’s mind at ease as they traverse the empty, artificial space of the lab.
It takes some reflection for the player to recognize that the other, chattier robots in the game are just the same: all fictional.
So far in this essay I have not attempted to challenge the efficacy of these fictions: emotional-service robots may well relieve some of the loneliness of people in care homes; the Companion Cube certainly made my journey through Portal’s trials feel less lonely. Decades of design knowledge empowered the game’s creators to openly ridicule the phenomenon while still utilizing it, losing none of the actual effect, and a lifetime of playing with NPCs made me receptive to their charms.
I do, however, doubt the degree to which such fictions are effective. As should be clear by now, I care less about the imagined potential of speculative robots and more about the actual effect that actual robots have on contemporary society; and to the best of my understanding, alongside the historical rise of applications offering robot companionship depicted in this essay, we also see the advent of the “loneliness pandemic” (4), considered in parts of the world a public health crisis.
As entire industries learn how to better capitalize on societal atomization, we are bound to see a growing injection of servile personal bots into society, wrapped in louder promises of artificial companionship and care. We will be better prepared for their arrival if we take them for what they are: not a new marvel of design and engineering but rather an old story, one that has been told and retold for decades. We would do well to ask why old stories are being repackaged and sold to us as new and inspiring tech.
Footnotes and Other Stray Thoughts
1 // One example among many: PBS, Jackie Snow, 2019, This time, with feeling: Robots with emotional intelligence are on the way. Are we ready for them?
2 // Slate, Christine Rosen, 2016, Should We Outsource Emotional Labor to Robots?
3 // Reuters, Minwoo Park, 2020, South Korean mother given tearful VR reunion with deceased daughter.
4 // Harvard Magazine, Jacob Sweet, 2021, The Loneliness Pandemic: The psychology and social costs of isolation in everyday life.