Mirror, Mirror: How Technology Re-Produces Challenges Concerning Human Relations
- Mathea Møller
- Ruben Jones
- Hanne Jensen Moe
Keywords: Technology, Humanity, Pessimism
The Anthropocene is defined as the “human-dominated, geological epoch” (Crutzen 2002). The epoch is based on overwhelming global evidence that geologic, atmospheric, biospheric and other earth-system processes are now altered by humans. There are discussions about when the Anthropocene started: after or within the Holocene, about 10,000 years ago (Welcome to the Anthropocene 2019); in the late 18th century, as a result of the industrial revolution (Crutzen 2002); or, as others argue, with the dawn of agriculture some 7,000-8,000 years ago (Ruddiman et al. 2015). As a more recent starting point, some view the year 1945 as the beginning, marking the onset of the ‘Great Acceleration’ of human enterprise (Steffen et al. 2011; Hamilton et al. 2015, 1). Even though the imprint of human activity was already clear, “the rate at which that imprint was growing increased sharply at mid-century” (Steffen et al. 2011, 849). Regardless of the exact starting point of the Anthropocene, scholars agree that the epoch is characterized by the effect human activity has on the environment, and that this marks an exit from the Holocene (ibid., 843).
Most of the factors attributed to the Anthropocene involve technological development, e.g. medicine and industrialization (Finstad 2019). As early as the 19th century, people wondered whether technology was an unleashed “force that was beyond their own capacity to regulate” (Robin et al. 2013, 262). However, there are two sides to this. One can look at technology as a phenomenon of liberation and solutions, or as all-consuming, resulting in decay and problems. This is symbolized in the title “Mirror, Mirror”, where the repetition suggests that technological devices create new problems and form a continuation that is difficult to escape. The anthology series Black Mirror (2011-present) explores society with regard to the unforeseen effects and consequences of new technologies. As discussed in what follows, most episodes of the series have a dark and satirical tone, which problematizes the impact technology may have on humans and on the relationships between them. This article presents three episodes from the series (“Be Right Back”, “Metalhead” and “Nosedive”), described and discussed through close analysis. Common to all three episodes is the sense that technology creates solutions as well as additional problems. Modern societies and technological developments may lead us to question what makes us human. This is further complicated in the episode “Be Right Back” (2013), where a deceased man comes back to his girlfriend as a human-like robot composed of his virtual data.
“Be Right Back” (2013) is one of the best episodes of Black Mirror in relation to the question of what makes us human and the relation between humans and technology, because it never loses sight of its humanity. The technologies we create and come to rely on reveal far more about ourselves than almost anything else. We are a species still getting over the traumatic fears that come along with basic existence. We create email and new telecommunications devices to stay close to one another – to feel like we are part of a pack, huddled together in a cold, uncaring world. The main character Martha begins to spend the majority of her days with a human-like robot composed of the virtual data of her deceased boyfriend Ash. She tells him about memories they cherished, and memories they do not have. She ignores calls from her real sister and loses connection to the real world. One day, after Martha gets her first ultrasound and wants to share the heartbeat with the replica of Ash, she drops and breaks her phone. What follows is a full-blown meltdown. She has lost him, again.
“Be Right Back” is a peculiar episode of television precisely because it understands our fears and holds our hands through them. Towards the end of the episode, Martha leads Ash to a cliff and commands him to jump off. He cheerily agrees to do so before she tells him that is not what the ‘real’ Ash would do. Martha is at her wit’s end. She now realizes this, longing for someone who actually possesses the ability to react emotionally, without logic – not a machine programmed to respond with a series of actions based on an algorithmic equation. The episode stresses how we all exist in a strange continuum of clashing forces like love, death, memory, grief, and fear. Martha did her best to reconcile those forces and deal with her pain, but she failed. And that is fine. Because “to err is human” (Pope 1711), and each failure is a success because of the weight behind the lesson it taught.
Digitally based machines can be programmed to meet individual needs; they handle tasks of a complexity that we once could only have imagined (Möllers 2015). “Nosedive” (2016) is about a technology-based society driven by a system similar to social media. People have devices similar to our smartphones, with profiles that carry social scores. The profiles allow people to post photos and to rate each other based on how they behave and what they post. They also receive social benefits according to their points. In other words, it is a digital class system. However, the social issues become apparent when the concern about getting high ratings suppresses people’s natural behavior.
In “Nosedive”, the new technological resources have been developed to function as the foundation of an orderly society. However, as the episode moves forward, we see how technology can also cause destruction. A system that is supposed to inspire people to be more law-abiding may drive them to act unreasonably and follow natural impulses that, in this society, result in imprisonment – which is what happens to the main character Lacie. At the end of the episode, we see that she is finally able to act according to her nature – screaming, cursing, laughing – finally able to act as a human. Following our natural impulses is also clearly visible in other episodes, such as “Metalhead” (2017), where the characters are on a life-threatening mission to get a teddy bear for a dying child.
In “Metalhead” (2017) the action is set in a world where dog-like robots are superior to humans, and humans live in fear of being killed by them. These robotic dogs have exterminated animals and most of the human population. “Metalhead” paints a frightening picture of what may happen when technology goes too far. The distinction between technology and humanity is difficult to establish in our contemporary moment, when artificial intelligence and other technological systems are under constant development and may have unforeseen negative effects. The dogs in “Metalhead” were inspired by the Boston Dynamics robotic dogs that went viral in 2017 (Yeung 2019). These robotic dogs are now available to the public (ibid.). The futuristic world of “Metalhead” does not seem far out of reach if these robotic dogs (and other robots) keep evolving and gain more advanced artificial intelligence. The future of technology may therefore pose challenges for social relationships, and for how we interact with both technology and other humans.
A study by Melson et al. (2009) shows that even though people recognize robots such as the robotic dog AIBO as technological products, they also give them “many of the attributes of a living dog”. The line between robotic dogs and real dogs may therefore become even harder to draw in the future. However, according to Regan (2004), animals “have beliefs and desires; perception, memory […]; an emotional life together with feelings of pleasure and pain” (Smulewicz-Zucker et al. 2012, 2-3). The belief that animals have emotions limits the possibility that robots will replace pets in the future. The same would be the case for humans. Attempts to reproduce emotional mechanisms in machines or computer programs result in something merely “statistically mimetic or behavioural” (Vallverdú & Trovato 2016, 321), as was the case in the episode “Be Right Back” (2013).
The series Black Mirror (2011-present) is clearly an example of technology pessimism. This view of technology sees humans as inferior and submissive to the technology-based system (Finstad 2019). Technology is all-consuming, humans are oppressed by the system, and the technology has resulted in a moral, cultural and social decline (Robin et al. 2013, 262). In “Nosedive”, the main character at first appears very accepting of the system and subjects herself to its rules of behavior. However, due to natural human responses such as stress, frustration, and impulses, she falls out of the system and ends up feeling socially oppressed. Technology is said to make people more liberated, but instead it has made them more restricted (Robin et al. 2013, 267).
This article suggests that there will always be a human essence that technology will never be able to adopt or replace. Technical innovations may be seen as “double-edged swords that serve to transform relationships among people, as well as between human societies and our natural environments” (Hård & Jamison 2005, xiii). Technological solutions to problems are followed by additional problems in the future. What these episodes suggest is that there is something that differentiates us from technology, something that makes it impossible for technology to “be human” – namely our humanity. Nevertheless, is it likely that we will reach a point where it is impossible to detect the difference between a robot and a human? Time will tell. But the story of the technological achievements of the Anthropocene is also a story of the problems that they bring with them (Möllers 2015).
References
Black Mirror. “Be Right Back.” Season 2, episode 1. Directed by Owen Harris. Written by Charlie Brooker. Netflix, February 11, 2013. Accessed October 23, 2019. https://www.netflix.com/watch/70279173?trackId=200257859
Black Mirror. “Metalhead.” Season 4, episode 5. Directed by David Slade. Written by Charlie Brooker. Netflix, December 20, 2017. Accessed October 26, 2019. https://www.netflix.com/watch/80131570?trackId=13752290&tctx=0%2C4%2C6450736a-3a90-4d5a-8866-b5bf7214ebb9-17671246%2C%2C
Black Mirror. “Nosedive.” Season 3, episode 1. Directed by Joe Wright. Written by Charlie Brooker. Netflix, October 21, 2016. Accessed October 21, 2019. https://www.netflix.com/watch/80104627?trackId=200257859
Crutzen, Paul J. 2002. “Geology of Mankind.” Nature 415 (6867): 23. doi:10.1038/415023a. https://search.proquest.com/docview/204518064?accountid=12870
Finstad, Terje. 2019. “Teknologi og det grønne skiftet.” Menneskets tidsalder? Accessed September 9, 2019. https://ntnu.blackboard.com/webapps/blackboard/execute/content/file?cmd=view&content_id=_771707_1&course_id=_18330_1&framesetWrapped=true
Hamilton, Clive, Christophe Bonneuil and François Gemenne. 2015. “Thinking the Anthropocene.” In The Anthropocene and the Global Environmental Crisis: Rethinking Modernity in a New Epoch, edited by Clive Hamilton, François Gemenne and Christophe Bonneuil, pp. 1-13. London: Routledge.
Hård, Mikael, and Jamison, Andrew. 2005. Hubris and Hybrids: A Cultural History of Technology and Science. London: Routledge. Accessed November 9, 2019. ProQuest Ebook Central.
Melson, Gail F., Peter H. Kahn, Jr., Alan Beck & Batya Friedman. 2009. “Robotic Pets in Human Lives: Implications for the Human-Animal Bond and for Human Relationships with Personified Technologies.” Journal of Social Issues, 65(3): 545-567. Accessed November 2, 2019. https://depts.washington.edu/hints/articles/robotic_pets_2.pdf
Möllers, Nina. 2015. “Milestones of the Anthropocene.” Accessed November 7, 2019. http://www.environmentandsociety.org/exhibitions/anthropocene/milestones-anthropocene
Möllers, Nina. 2015. “Humans and machines.” Accessed November 7, 2019. http://www.environmentandsociety.org/exhibitions/anthropocene/humans-and-machines
Pope, Alexander. 1711. “An Essay on Criticism, Part II.”
Robin, Libby, Sverker Sörlin and Paul Warde. 2013. “Part 6 Technology. Does Technology Create More Problems Than It Solves?” In The Future of Nature: Documents of Global Change, edited by Paul Warde et al., pp. 261-290. New Haven: Yale University Press. Accessed November 7, 2019. ProQuest Ebook Central.
Ruddiman, William F., Erle C. Ellis, Jed O. Kaplan, and Dorian Q. Fuller. 2015. “Defining the epoch we live in.” Science 348 (6230): 38-39. doi:10.1126/science.aaa7297. https://science.sciencemag.org/content/348/6230/38
Smulewicz-Zucker, Gregory R., Drucilla Cornell, Julian H. Franklin, Heather M. Kendrick, Eduardo Mendieta, Andrew Linzey, Paola Cavalieri, et al. 2012. Strangers to Nature : Animal Lives and Human Ethics. Logos: Perspectives on Modern Society and Culture. Lanham: Lexington Books. http://search.ebscohost.com/login.aspx?direct=true&db=nlebk&AN=463635&site=ehost-live.
Steffen, Will, Jacques Grinevald, Paul Crutzen, and John McNeill. 2011. “The Anthropocene: Conceptual and Historical Perspectives.” Philosophical Transactions: Mathematical, Physical and Engineering Sciences 369 (1938): 842-67. http://www.jstor.org/stable/41061703
Vallverdú, Jordi, and Gabriele Trovato. 2016. “Emotional Affordances for Human–Robot Interaction.” Adaptive Behavior 24 (5): 320-34. doi:10.1177/1059712316668238.
Welcome to the Anthropocene. 2019. “Anthropocene.” Accessed November 7, 2019. http://www.anthropocene.info/
Yeung, Jessie. 2019. “Boston Dynamics’ robot dog is now available for select customers.” CNN Business. Accessed November 2, 2019. https://edition.cnn.com/2019/09/25/app-tech-section/robot-dog-sale-intl-hnk-scli/index.html