
aventinus visio Nr. 1 [28.4.2012] 

 

Alexander Wiegel 

AI in Science-Fiction 

A Comparison of Moon (2009) and 2001: A Space Odyssey (1968)

Artificially intelligent characters (AIs) have been a recurring trope in literature and the human imagination for much of recorded history. [1] The genre of science fiction (sci-fi) has frequently tasked itself with asking questions about the moral, scientific, and societal consequences of advanced technologies such as AI. As a genre, sci-fi is traditionally less concerned with narrating a story in a high-technology setting than with using a high-technology setting to narrate the consequences of new technologies. [2]

The purpose of this essay is to examine the different ways in which two sci-fi films – Moon (2009) and 2001: A Space Odyssey (1968) – approach AI in the near future, and what their approaches tell us about the perceived consequences of AI technology in their respective periods. Though Moon and 2001 are thematically very similar, they take very different views of AI. This essay will first define AI, then summarize the depiction of AI in each film, and conclude with what those depictions suggest about societal fears surrounding AI around the time of each film's release. It aims to show that Moon represents a large shift in the perception of AI since 2001, most likely due to the increased prevalence of, dependence on, and familiarity with computerized technology since 1968.

Artificial intelligence refers to an intelligence of any level that has been made or designed by humans rather than having evolved naturally. More specifically, an AI is an entity that replicates the “human thinking processes [by means of] human construction.” Being “artificially intelligent” does not imply that something is alive or sentient, nor that it is not; only that it is capable of complex mental processes. [3] AI is commonly divided into two types: Strong AI and Weak AI. Weak AI uses predetermined rules to accomplish predefined goals, such as navigating a maze. Strong AI possesses cognitive ability similar to that of a human, such as the ability to reason and to evaluate based on past experience. [4] This essay will focus only on Strong AI.

What counts as “cognitive ability similar to that of a human” is not widely agreed upon, but there are some generally accepted characteristics that Strong AI would require. [5] To qualify as a Strong AI, an intelligence should be capable of planning, learning, reasoning (solving problems with incomplete information), and communicating, and it must be able to integrate those skills. Though they are never defined as such, both GERTY and HAL – the AI characters from Moon and 2001 respectively – fit the criteria for Strong AI. They communicate with humans, plan and problem-solve based on incomplete information, and combine those skills to execute complicated actions.

Released in 1968, 2001 is the story of humans attempting to uncover the mystery behind an alien “monolith.” The film begins with a monolith appearing before a group of primates in the prehistoric age, then jumps to the year 2001. Now it is modern humans who have encountered a monolith, and it sends a powerful signal toward Jupiter. A ship is launched to chase the signal to its destination, and the AI HAL controls most of the ship's functions during the expedition. The human pilots, David Bowman and Frank Poole, are mostly along for maintenance, with the rest of the crew in hibernation. However, when HAL misreports a failure of the ship's primary antenna and then blames the fault on human error, Frank and Dave begin to doubt HAL's infallibility. Growing paranoid, HAL kills most of the crew; only Dave escapes alive. Dave then deactivates HAL and is forced to carry out the mission alone, encountering another monolith in Jupiter orbit as the film ends.

Moon deliberately takes after 2001 and was described by its director, Duncan Jones, as a film designed to pay specific homage to 2001, Silent Running (1972), Solaris (1972), and Alien (1979). [6] Sam Bell is the lone employee of Lunar Industries at a moon base that oversees the company's automated helium-3 mining operation. With only the AI GERTY for company, Sam is nearing the end of his three-year contract when he begins having hallucinations and has an accident. He later wakes in the base infirmary to find he has been confined to the base because of another, unexplained accident. Frustrated at “being treated like a kid,” he convinces GERTY to let him leave the base. Outside, he finds the site of the original accident and another Sam Bell. The two Sams discover they are clones being used for slave labor and, with GERTY's help, enact a plan to send one of them to Earth to expose Lunar Industries' practices.

In 2001, HAL is described as a “heuristic” AI – a problem-solving, learning, and discovery-based AI. His method of problem-solving follows a thought process governed by logic and a fixed rule set, which makes him behave much like a chess player who moves based on anticipated counter-moves. His breakdown into murder can be characterized as a struggle with conflicting logic paths that his programming is unable to cope with. When the crew interrogates him about the nature of the mission, he experiences conflict because he has been programmed to lie to them about it, and he fumbles his report on the antenna array, which triggers the film's central conflict. Though HAL is a Strong AI, he is a Strong AI with “no defense against changing and inconsistent goals.” [7]

GERTY's thought process, by contrast, is characterized by his ability to navigate conflicting goals and information smoothly. GERTY can avoid the stress of otherwise conflicting options in ways that HAL cannot because he is able to arrange his rule set into an order of importance. GERTY's longest-standing order has always been to help Sam and keep him safe. When GERTY is presented with two conflicting objectives, such as keeping Sam confined or letting him go outside, he chooses the option that helps Sam the most – in this case letting Sam go outside, because that satisfies the objective GERTY deems more important. In this way GERTY is able to anticipate outcomes and make decisions not just from logic, but also from hindsight and past interaction. [8]
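The film never specifies how GERTY works internally, but the priority ordering described above is a familiar idea in rule-based systems. Purely as an illustration (all names and priority values here are invented, not drawn from the film), it might be sketched like this:

```python
# Illustrative sketch only: Moon gives no implementation details for GERTY.
# The idea: goals carry a rank, and when two directives conflict, the
# highest-ranked goal wins instead of producing a HAL-style deadlock.

from dataclasses import dataclass

@dataclass
class Goal:
    name: str
    priority: int  # higher number = more important

def resolve(goals):
    """Return the goal with the highest priority among conflicting ones."""
    return max(goals, key=lambda g: g.priority)

# GERTY's conflicting standing orders (hypothetical ranking):
keep_sam_confined = Goal("keep Sam confined to the base", 1)
help_sam = Goal("help Sam and keep him safe", 2)

chosen = resolve([keep_sam_confined, help_sam])
print(chosen.name)  # the longest-standing, higher-priority order prevails
```

A rigid rule follower like HAL, on this reading, has no such ranking and simply jams when two orders collide; GERTY's ordering is what lets him bend one rule to honor a more important one.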

The moment of critical importance in Moon comes as the second Sam prepares to leave the lunar base for Earth. GERTY has recorded all of their actions and knows that the inbound recovery team will check his memory banks once they reach the base. When they do, they will discover the Sams' plan and alert Lunar Industries, who will most likely kill the remaining Sam. Because of this, GERTY tells Sam to reboot his memory banks, erasing them and effectively killing the entity known as GERTY. In this moment, GERTY completely subverts the trope set by HAL in 2001 and becomes the self-sacrificing hero of the film, whose sacrifice ensures the success of its protagonists. Both HAL and GERTY “die” at the end of their films, but where HAL is killed in self-defense, GERTY sacrifices himself for the good of others.

2001 and Moon therefore render two very different judgments of AI. In 2001, AI is the cold-hearted killer responsible for the deaths of the expedition's crew. In Moon, AI is the quiet savior without whom the protagonists could not have succeeded, and whose final sacrifice ensures their safety. Moon's villains are instead human: a corporation willing to use slave labor for higher profits.

In the years following the Second World War, public fear of and fascination with computer systems focused largely on huge, room-sized machines, which some journalists called “giant brains.” [9] Computers such as these were generally not well understood by the public, and any advance in computing technology was immediately hyped into the assumption that computers would soon take over human decision-making. This led to movies like WarGames (1983), in which the AI WOPR is given control over the US nuclear arsenal and nearly causes the Third World War because it cannot differentiate between simulation and reality. Movies such as WarGames were largely driven by misunderstanding or fear of emerging computer technology, and 2001's HAL is no different. [10] The film warns that overconfidence in the infallibility of computers could lead to disastrous results.

A massive increase in demand for computer-integrated goods and services over the last thirty years, however, has led to a dramatic shift away from this paranoia. During the 1980s in particular, demand for industrial computerized devices grew far beyond that of previous decades. Computers made complex computation available to amateurs: suddenly the results of sophisticated mathematics were available to the average worker, who could take advantage of that math even without understanding it himself. [11] This affected everything in the modern industrial economy, from production to accounting, and led not just to an increased prevalence of the technology but to an increased understanding of it as well. [12]

The industrial demand for computing technology also increased private demand. Especially in the last ten years, computers have entered every aspect of daily life, from cars to watches. Though true Strong AI is still far off, the Ambient AI devices (another form of Weak AI) envisioned at the turn of the century, such as smartphones, can now be seen in the hands of children as young as ten. [13] Other primitive AI technologies, such as Apple's Siri, are today widely regarded as mostly benevolent and are in fact the subject of many humorous parodies online. [14] Perceptions are beginning to shift: it is the humans behind the technology who should not be readily trusted, not the technology itself, and many consumers are becoming more distrustful of the business practices of large companies than of the technologies those companies employ. [15]

Moon is a product of this shift away from technology paranoia and toward corporate paranoia. In contrast to 2001, the film offers a very optimistic view of the future of AI: it suggests that it is not AI that is likely to run amok when given too much power, but our own fellow man. In Moon, AI is represented as something that should be treated with the respect befitting an actual human being. Moon strongly suggests that realized Strong AI will mean more than just very advanced computers when the second Sam ends the film by saying, “We're not programs, GERTY. We're people.”

Sources and Literature

BOLTER, David (1984), Turing's Man: Western Culture in the Computer Age, UNC Press Books.

DOUGLAS, Edward (2009), Interview with Duncan Jones and Sam Rockwell, in: ComingSoon.net, 23.01.2009, http://www.comingsoon.net/news/sundancenews.php?id=52031 (28.04.2012).

ISTAG (2001), Scenarios for Ambient Intelligence in 2010; Final Report, ftp://ftp.cordis.europa.eu/pub/ist/docs/istagscenarios2010.pdf (28.04.2012).

KLING, Rob (1996), Computerization and Controversy: Value Conflicts and Social Choices, San Diego, CA, Academic Press.

LUCAS, Duncan (2002), “Body, Mind, Soul – The 'Cyborg Effect': Artificial Intelligence in Science Fiction,” in: Open Access Dissertations and Theses, McMaster University.

MASNICK, Mike (2012), “Study Confirms What You Already Knew: Mobile Data Throttling About the Money, Not Stopping Data Hogs,” in: Wireless by techdirt, http://www.techdirt.com/blog/wireless/articles/20120224/10500217867/study-confirms-what-you-already-knew-mobile-data-throttling-about-money-not-stopping-data-hogs.shtml (28.04.2012).

McCARTHY, Erin (2009), Interview with Duncan Jones, in: PopularMechanics.com, 01.10.2009, http://www.popularmechanics.com/technology/gadgets/news/4313243 (28.04.2012).

McCARTHY, John (2007), “Basic Questions,” Stanford University, http://www-formal.stanford.edu/jmc/whatisai/node1.html (28.04.2012).

McCORDUCK, Pamela (2004), Machines Who Think (2nd ed.), Natick, MA, AK Peters Ltd.

MOUNT, John (2010), “Gerty, a character in Duncan Jones' 'Moon',” in: MZLabs, http://mzlabs.com/MZLabsJM/page6/Gerty/Gerty.html (28.04.2012).

ORDWAY III, Frederick, “2001: A Space Odyssey in Retrospect,” http://www.visual-memory.co.uk/amk/doc/0075.html (28.04.2012).

SLOSTAD, Brody (2008), “The Two Types of Artificial Intelligence: A Deeper Look at The Strong and The Weak Future Technology,” http://brody-slostad.suite101.com/the-two-categories-of-artificial-intelligence-a60694 (28.04.2012).

 

This contribution was produced in cooperation with the Chair of Contemporary History at the Historisches Seminar of LMU München.

 

Notes

  • [1]

    McCORDUCK, Pamela (2004), Machines Who Think (2nd ed.), Natick, MA, AK Peters Ltd., p. xviii.

  • [2]

    LUCAS, Duncan (2002), “Body, Mind, Soul – The 'Cyborg Effect': Artificial Intelligence in Science Fiction,” in: Open Access Dissertations and Theses, McMaster University, p. 7.

  • [3]

    Ibid., p. 20. 

  • [4]

    SLOSTAD, Brody (2008), “The Two Types of Artificial Intelligence: A Deeper Look at The Strong and The Weak Future Technology”.  

  • [5]

    LUCAS, Duncan (2002), pp. 15-16. McCARTHY, John (2007), “Basic Questions,” Stanford University.

  • [6]

    DOUGLAS, Edward (2009) in Interview with Duncan Jones and Sam Rockwell, in: ComingSoon.net, 23.01.2009.

  • [7]

    ORDWAY III, Frederick, “2001: A Space Odyssey in Retrospect.” MOUNT, John (2010), “Gerty, a character in Duncan Jones' 'Moon',” in: MZLabs.

  • [8]

    MOUNT, John (2010). McCARTHY, Erin (2009) in Interview with Duncan Jones, in PopularMechanics.com, 01.10.2009.

  • [9]

    KLING, Rob (1996), Computerization and Controversy: Value Conflicts and Social Choices, San Diego, CA, Academic Press, pp. 2, 10-11.

  • [10]

    Ibid., pp. 2, 10-11. MOUNT, John (2010).

  • [11]

    BOLTER, David (1984), Turing's Man: Western Culture in the Computer Age, UNC Press Books, pp. 231-236.

  • [12]

    KLING, Rob (1996), pp. 16-25.

  • [13]

    ISTAG (2001), Scenarios for Ambient Intelligence in 2010; Final Report, pp. 6, 9.

  • [14]

    Such as the College Humor skit: “Siri Argument.” http://www.collegehumor.com/video/6648229/siri-argument.

  • [15]

    MASNICK, Mike (2012), “Study Confirms What You Already Knew: Mobile Data Throttling About the Money, Not Stopping Data Hogs,” in Wireless by techdirt.

Recommended Citation

Wiegel, Alexander: AI in Science-Fiction: A Comparison of Moon (2009) and 2001: A Space Odyssey (1968). aventinus visio Nr. 1 [28.4.2012], in: aventinus, URL: http://www.aventinus-online.de/no_cache/persistent/artikel/9389/

When citing this contribution, please add the date of your most recent visit to this online address in parentheses after the URL.



Created: 27.04.2012

Last modified: 22.08.2013

ISSN 2194-3427