ARTIFICIAL INTELLIGENCE (2001)
PHILOSOPHICAL ISSUES: Artificial intelligence
CHARACTERS: David (artificial boy), Monica (David’s mother), Henry (David’s father), Martin (David’s brother), Gigolo Joe, Professor Hobby (William Hurt, head of Cybertronics)
OTHER FILMS BY DIRECTOR STEVEN SPIELBERG: Close Encounters of the Third Kind (1977), E.T. (1982), Jurassic Park (1993)
SYNOPSIS: The story behind A.I. was originally conceived by Stanley Kubrick, who discussed the project with Steven Spielberg. After Kubrick’s sudden death in 1999, his widow persuaded Spielberg to take over the film. The film is set at a future time when progress in robotics poses a possible threat to the human species. David, a robotic boy, is the first artificial life form capable of experiencing love. As a prototype, he is given to a couple whose real son is in what appears to be an irreversible coma. After a rough start, David and his mother bond. The real son miraculously awakes from the coma, returns to the family, and tricks David into doing dangerous things. The father feels that they must return David to the manufacturer for destruction, but the mother allows David to escape. For the rest of the film David seeks to be reunited with his mother, and, for a time, is joined on his quest by “Gigolo Joe,” a robot designed to be a male prostitute. David becomes frozen in the ocean, and, millennia later – long after the extinction of the human species – robots of the future rescue him and allow him to reunite with his mother for one day that will last in his mind for eternity.
1. The film opens with the narrator stating that, because of dwindling natural resources, human reproduction had been placed under strict control, and only licensed couples could have children. How bad would things need to get before licensing parents would become a necessity?
2. At the beginning of the movie, Professor Hobby states that “to create an artificial being has been the dream of man since the birth of science.” There’s probably an element of truth to this. Why do we have this fascination?
3. One of the scientists at Cybertronics asks, “If a robot could genuinely love a person, what responsibility does that person hold toward that mecha in return?” Professor Hobby responds, “In the beginning, didn’t God create Adam to love him?” What is implied by Professor Hobby’s answer?
4. When David's mother drops him off in the woods, David cries out "I'm sorry for not being real!" Aside from the fact that he's been kicked out of his house, why is he sorry for this?
5. In a documentary on the movie, Steven Spielberg states that the story is set at a time when humans and robots are on the brink of civil war. This is graphically depicted in the “Flesh Fair,” where old or unregistered mechas are rounded up and destroyed gladiator-style before a cheering crowd. One of the mechas explains that the Flesh Fair is an attempt to cut back on the numbers of mechas so that humans can "maintain numerical superiority." From the opposite perspective, one of the humans at the Fair says that mechas should be destroyed since they will take over. Why can't we just live in peace with robots?
6. Consider some of the imagery of the Flesh Fair: motorcycles, cowboy hats, heavy metal music, flannel shirts. What statement does this make about the kind of humans who oppose robots?
7. An announcer at the Flesh Fair states the following: "What about us? We are alive and this is a celebration of life, and this is commitment to a truly human future." How might the brutal destruction of robots be a celebration of life?
8. The owner of the Flesh Fair states that child mechas like David were built to disarm humans by playing on human emotions. Nevertheless, the human spectators feel sympathy for David, particularly because he pleads for his life. What abilities would a robot have to exhibit before we would consider it the equal of humans?
9. Gigolo Joe states that sex robots like him "are the guiltless pleasures of the lonely human being." Would it really be guiltless to have sex with a robot that looked just like a real human? What if the robot had the appearance of a ten-year-old human?
10. Gigolo Joe tells David that his mother does not love him, but only loves what he does for her. Is it plausible to think that a normal human could love a robot as though it were a real human?
11. Gigolo Joe tells David, "They made us too smart, too quick, and too many. We are suffering for the mistakes they made because when the end comes all that will be left is us. That's why they hate us." Is this a good reason to hate robots?
12. David was the first of his kind in mecha design, with the ability to experience love. The box containing the mass-produced Davids states, “at last a love of your own.” It turns out, though, that David has desires, self-motivated reason, and the ability to chase down his dreams. What is the connection between love and these other cognitive abilities?
13. When David meets another David robot, he destroys it in a fit of rage, believing that this new David will compete with him for his mother's love. In retrospect, this outburst justifies the decision of David's parents to return him to Cybertronics for destruction. Is this a critical element of the plot, or just bad (and inconsistent) storytelling?
14. David enters a warehouse containing boxes upon boxes of other Davids. He walks up to a facial mask of one of the Davids and peers through its eye openings. What, if anything, does this symbolize?
15. As Joe gets magnetically pulled up to a police helicopter his final words to David are "I am, I was." The "I am" clause is a reference to Descartes' famous statement "I think, therefore I am," which indicates Joe's assertion of his existence. What does the "I was" clause signify?
16. The Latin phrase "deus ex machina" ("god from the machine"), which originated in ancient Greek drama, refers to an awkward plot device introduced to get the hero out of a tough situation – like the cavalry coming over the mountain at the last minute. Does the ending of this film rely on a deus ex machina?
17. Why does the narrator robot at the beginning and end of the movie have a British accent?
18. In a documentary on the movie, Steven Spielberg states that a key issue raised by the movie is the extent to which we have a moral responsibility to the intelligent robots that we will someday create. What kind of qualities would a robot need before we would recognize our moral responsibility towards it?