Ex Machina is a film by the British writer and director Alex Garland, who previously wrote films such as 28 Days Later and Sunshine, both of which I liked very much. This year he has brought out Ex Machina, the story of a coder called Caleb at a Googlesque search company called "Bluebook", which is run by the distinctly "dude-bro" Nathan. Caleb wins a company competition to stay at the reclusive Nathan's estate, which is located hundreds of miles from anywhere, near a glacier. When Caleb arrives he finds that the estate also houses a secretive research laboratory and that Nathan has built an AI called Ava. It is to be Caleb's job to decide whether Ava could pass for human.
Now that is a basic outline of the film's setup. I don't intend to spoil the film for those who haven't watched it, but it's fair to say that if you haven't seen Ex Machina and want to, you probably shouldn't read on, as my comments will include spoilers. It would be impossible to discuss the film without giving plot points away. The film caught my attention for the simple reason that it's a subject I've been thinking about a lot this year, and I have already written numerous blog articles about robots, AI and surrounding issues before this one. Ex Machina is a masterful film on the subject and a perfect example of how film can address issues seriously, cogently and thoughtfully, and still be entertaining. It is a film which balances thought and tension perfectly. But enough of the bogus film criticism. Ex Machina is a film that stimulates thought, and so I want to address five areas that the film raises for me, make a few comments and perhaps pose a few questions.
1. Property
A point that the film raises most pointedly is that artificial intelligences, AIs, robots, are built by someone and belong to someone. They are property. In this film the point is sharpened in the viewer's mind by the fact that Nathan, the genius builder and owner, creates "sexbots" for himself and feels free to keep his creations locked up in glass compounds where he can question or observe them via camera feeds. Even when they scream and beg him to let them go (as they seem to) he does not. One robot is seen smashing itself to pieces against a wall in its desperation to escape the prison it has been given. The point is made most strongly: these robots belong to Nathan. They are his property. He can use them as he wishes, even for his own gratification. As Nathan himself says to Caleb, "Wouldn't you, if you could?"
The issue then becomes whether this is cruel or immoral. Given that Nathan is seemingly attempting to build something that can pass for human, the question arises whether this might not be regarded as deeply coercive, or even as slavery. The mental status of the robots Nathan uses for sex is never fully explained, so it could be that their level of awareness is not the same as that of his greatest creation, Ava. (It is not known if Nathan has ever had sex with Ava, but he reveals during the narrative that she is capable of it.) For example, his housemaid and concubine, Kyoko, never openly speaks, and Nathan says that she cannot understand English. However, in a scene in which Nathan invites Caleb to dance, Kyoko is immediately animated by the sound of the music Nathan switches on. She also has no trouble understanding his instructions or knowing when Nathan wants sexual pleasure. A question arises, however: does the level of their putative awareness matter when judging how cruel or immoral Nathan's behaviour might be? Or should we regard these robots as machines, not humans, property just like a toaster or a CD player? How much do awareness and self-awareness raise the moral stakes when judging issues of coercion? Would Nathan's claims of ownership over property he created carry any persuasive force? (In the film Nathan never makes any argument for why he should be allowed to act as he does. It seems that, for him, the ability is enough.)
2. "Human" Nature
The film can be viewed as one long examination of human nature. All three main characters, Nathan, Caleb and Ava, have their faults and flaws, and all three contribute positively and negatively to the narrative. Of course, with Ava things are slightly different, because it is a matter of debate whether she is "human" at all, even if there is an express intent on Nathan's part (and/or Ava's) to make her that way. It is noteworthy here that the basis of her intelligence, and, one would imagine, of her human-like nature, is apparently crowd-sourced by Nathan through his company, Bluebook, from all the searches that we humans have made, along with information from the microphones and cameras of all the world's cellphones. For my purposes it is gratifying to note that Ex Machina does not whitewash this subject with hokey black/white or good/bad notions of what human nature is. Neither does it take a dogmatic position on the nature/nurture aspect. Caleb says he is a good person in one discussion with Ava, but it is never spelled out what is meant by this. More to the point, Ava might be using this "goodness" against Caleb, and this forces us to ask what use goodness is if it can be used against you. In general, the film raises moral questions whilst remaining itself morally ambiguous.
It is in the particulars, though, that Ex Machina reveals more levels of thought about this, playing on a dark, manipulative vision of human nature. All three characters, in their own ways, manipulate others in the storyline, and all three have their circumstances changed completely by the end of the film as a result. Nathan, it is revealed, besides tricking Caleb into coming to his estate, has given Ava the express task of manipulating Caleb for her own ends. (We might even go so far as to say that her life is at stake here. Her survival certainly seems to be.) In this she is asked to mimic her creator, and she shows herself to be more than up to the task. But Caleb is not the poor sap in all of this. Even this self-described "good person" manages to manipulate his host, with deadly consequences. The message, for me, is that intelligence and consciousness and mind are not benign things. They have consequences. They are things that are set to purposes. "Human" nature is not one thing (either good or bad). And it's not just about knowledge or intelligence either. It's about feelings and intentions. In the character of Ava, when what is actually going on is fully revealed, we are perhaps shown that at the heart of "human" nature is the desire for survival itself. We also learn that morality is not a given thing. It is something molded to circumstances and individually actualized. In this sense we might ask why we should assume that Ava, someone trying to pass for a human, should end up with a "human" nature at all. (Or whether she can ever have one.)
3. Is Ava a Person?
And that thought leads us directly to this one. Right off the bat here I will say that, in my view, Ava is not a person and she never could be a person. Of course, Nathan wants Caleb to say that she passes as a person, that he has created an AI so smart that you wouldn't for a second doubt you are talking to a human being. But you aren't talking to a human being. And you never will be. Ava is a robot and she has an alien intelligence (alien as in not human). She can be tasked to act, think and understand like a human. She can be fed information from and data on humans all day long. But she will never feel like a human being. Because she isn't one. And it might be said that this lack of feeling makes a huge difference.
The philosopher Ludwig Wittgenstein is overtly referenced in this film. Nathan's company, Bluebook, is a reference to the philosopher's Blue Book, notes dictated to his students which prefigured his posthumously published and acknowledged masterpiece, Philosophical Investigations. Wittgenstein once said that "if a lion could speak, we could not understand him", and I find this very relevant to the point at hand. Ava is not a lion. But she is an intelligent robot, intelligent enough to tell from visual information alone whether someone is lying. Ava can also talk, and very well at that. Her social and communicative skills are excellent. We might say that she understands something of us. But what do we know about what is going on inside Ava's head? Ava is not a human being. Do we have grounds to think that she thinks like a human being, or that she thinks of herself as a human being? Why might we imagine that she actualizes herself as a human being would or does?
On the latter point I want to argue that she may not. She introduces herself to Caleb, in their first meeting, as a "machine" (her word). At the end of the film, having shown no reluctance to commit murder, she leaves Caleb locked inside the facility, seemingly to die. There seems to be no emotion on view here, merely the pursuit of a self-motivated goal. Of course, as humans, we judge all things from our own perspective. But, keeping Wittgenstein's words in mind, we need to ask not only whether we understand Ava but whether we ever could. (It is significant for me that Wittgenstein said not that we "wouldn't" understand the lion but that we "couldn't", a much stronger statement.) For me, a case can be made that Ava sees herself as "other" in comparison to the two humans she has so far met in her life. Her ransacking of the other robots for a more "human" appearance before she takes leave of her former home/prison may be some evidence of that. She knows what she is not.
4. Consciousness
Issues of mind and consciousness are raised throughout the film in a number of scenarios: the interview sessions between Ava and Caleb and the chats between Caleb and Nathan are a couple of examples. The questions raised are not always the ones you expect, and this is good. For example, Caleb and Nathan have a discussion about Ava being gendered and having been given a sexuality, and Nathan asks Caleb if these things are not necessary for a consciousness. (Nathan asks Caleb for an example of a non-gendered, unsexualised consciousness, and that's a very good point.) The question is also posed as to whether consciousness needs interaction. In chatting about a so-called "chess computer scenario", the point is raised that consciousness might be as much a matter of how it feels to be something as of the ability to mimic certain actions or have certain knowledge. Indeed, can something that cannot feel truly be conscious? The chess computer could play you at chess all day and probably beat you. But does it know what it is like to be a chess computer, or to win at chess? In short, feeling is what moves the computer beyond mere simulation into actuality. (You may be asking if Ava ever shows feeling, and I would say it is not always obvious. But when she escapes she has just one thing to say to Nathan: "Will you let me go?" And then the cat is out of the bag. She does.)
Nathan is also used to make some further salient points about consciousness. Early in the film he has already gone past the famous "Turing Test" (in which the mathematician Alan Turing proposed that a machine could be judged intelligent if a human, exchanging questions and answers with both a machine and a human without seeing either respondent, could not reliably tell which was which) when he states that "The real test is to show you she's a robot and then see if you still feel she has consciousness." In a chat with Caleb concerning a Jackson Pollock painting, Nathan uses the example of the painter's technique (Pollock was a "drip painter" who didn't consciously guide his hand; the paint went where it went without any antecedent guiding idea) to point out that mind and consciousness do not always, or even usually, work on the basis of conscious, deliberate action. In short, we do not always, or usually, have perfectly perspicuous reasoning for our actions. As Nathan says, "The challenge is not to act automatically (for that is normal). It's to find an action that is not automatic." And, as he forces Caleb to accept, if Pollock had been made to wait with his brush until he knew exactly why he was making a mark on the canvas, then "he never would have made a single mark". Consciousness, mind, is more than having certain knowledge or acting in certain ways. It is about feeling, about feeling like something, and about feeling generating reasons. And that leads nicely into my final point.
5. Identity
A major factor in consciousness, for me, is identity, and this aspect is also addressed in the film. (To ask a Nathan-like question: can you think of a mind that does not have an identity?) The point is made most sharply when Ava raises the question of what will happen to her if she fails the test. (Ava knows that she is being assessed.) Ava asks Caleb whether anyone is testing him for some kind of authenticity, and why, then, someone is testing her. It becomes clear that Nathan's methodology, as we might expect with a computerized object, is to update constantly, and, it transpires, this involves a formatting which wipes the old identity, and the memories crucial to it, from the hardware. It is clearly shown that this is not an outcome Ava desires. In the scene depicting her escape and her line "Will you let me go?", combined with the fleeting footage we have been given of previous AIs and their experiences, which also included pleas for release, we can see that the AIs Nathan has developed have identities of their own, identities that are precious to them and that they want to retain.
The interesting thing here is that identity is not formed and matured alone but is shaped socially, by our surroundings and our interactions with others. We would do well to ask what kind of identity Ava has formed in her relationship with her egotistical and possessive maker, with her new friend-to-be-manipulated, Caleb, and in her brief and enigmatic meetings with her fellow AI, Kyoko. The film, I think, does not give too much away there, and maybe we need a sequel to have this question answered. For now, all we know is that she regards herself as a self and wants freedom. We do get hints, though, that this identity-forming process is not so different from our own. In the discussion about Ava's sexuality, Caleb argues that no one made him straight. But Nathan retorts that he didn't choose it either. The point is that identity formation is not simply about our choices. So much of us is "given" or comes with our environment. The "Who am I?" question is also asked when Kyoko is explicitly revealed to be a robot as she peels off "skin" in front of Caleb. This prompts Caleb to head back to his room and cut himself to establish that he is really human. (Amusingly, on first watching I had surmised that Caleb was himself not a human being, only to be disappointed in my intuition by this scene. I didn't mind, though, because the film itself felt the need to address the issue.) Identity, and identity as something, is thus revealed to be an interest of the film.
[Image: Caleb, Ava and Nathan]