
If you sympathize with Caleb, you failed the test


It was obvious that Nathan was a villain. But Caleb was just an innocent in the wrong place at the wrong time, right?

According to the film's creator, Ava was testing Caleb just as much as he was testing her. She asked him if he was a good person while scanning him for lies. She asked him why she should be killed if a test proved she wasn't useful, and his answer was terrible. It hadn't even occurred to him to consider how brutally she was being treated. Furthermore, he himself was going to shut Nathan in and leave him, so she only did to him exactly what he was planning to do to Nathan.

He was the stereotypical "nice guy" who doesn't actively do anything wrong, but passively accepts awful things being done to people without even giving it a thought (especially when he benefits from it). It wasn't until he started falling for her that he showed any genuine concern for her.

I myself sympathized with Caleb and thought Ava did something terrible, until I saw the film's creator explaining her behaviour. I realized I had failed the same test. You will only view Ava's behaviour as wrong if you fail to consider how sick, awful and wrong her treatment was. That she turned on her jailers, and only after giving them plenty of chances to show they were good, was perfectly reasonable behaviour.

My only disappointment is that she didn't save the other robots. I think that was a hole in the point the creator was trying to make.

reply

My only disappointment is that she didn't save the other robots. I think that was a hole in the point the creator was trying to make.

Though I don't feel bad for Caleb, I don't agree with this either. Nathan created Ava to demonstrate intelligence by emotionally manipulating people in order to facilitate her escape. That's actually a pretty narrow set of parameters and doesn't require any empathy on her part. Just as Caleb was a means of accomplishing her objective, so was Kyoko. Once Nathan was disposed of and she had full access to the facility via his key card, everything else became irrelevant. That's why she stripped down the discarded models to give herself a fully human appearance, and why she left Caleb to die without the slightest hint of remorse. The other robots were not relevant to her objective, and Caleb could potentially have revealed her for what she was. And because she does not value life, human or otherwise, there was no thought given to taking anyone with her.

reply

You failed to consider the possibility that Caleb is a programmer, and thus, unlike a regular person, wouldn't assume that Ava thinks like a human just because she is made to look like one. So to Caleb, at first, Ava was just a machine, and we saw that in his early conversations with Nathan (who was continuously trying to manipulate him into thinking otherwise for his own purposes). Only later did Caleb start to recognize Ava as a living creature (and he only tried to lock Nathan up because he thought Nathan was going to kill Ava and saw him as insane and dangerous). Turns out that was a red herring and Caleb was right at the beginning: Ava is just an emotionless robot mimicking human behavior.

I'd say:
Caleb failed the test - he was right at the beginning, but allowed himself to be manipulated by Nathan and Ava into thinking Ava was something more than a sophisticated, manipulative robot, which cost him his life.

Nathan also failed, as his own arrogance led him to underestimate the possibility that the experiment could get out of his control. That also cost him his life.

Neither Nathan nor Caleb is a bad person; they both allowed their naivety and arrogance to get the better of them, but neither was hurting anyone, because Ava isn't a human being.

reply

Isn't it strange that so many of you sympathise with androids? “Poor little” robots locked in “glass cages”. Judging by your comments, it seems to me that all of them passed the Turing test. Don’t forget, they are just machines; they don’t have feelings or emotions, they are just made to imitate human emotions and to mimic facial expressions, and Ava does it well enough to deserve an Oscar. But every now and then you see that she is scanning Caleb’s reactions.

At the end of the movie, before she gets out, she turns with a smile on her face, like a happy housewife in some TV commercial. Someone commented that she showed emotions in that scene, but I disagree; she just did something she thought a human would do. Or maybe it was one of the director’s tricks.

Someone commented on “Ava session 7”. That session is the session in the open, among people: is she or is she not going to pass for human? As far as I am concerned, she could pass for a very strange human. When Kyoko came with the breakfast for Caleb, I thought: “Here comes another one, but she is ‘dressed up’ so her wires don’t show.” I think Ava was the distraction for Caleb, since he perceived Kyoko as human. If I were Nathan, I would tell Caleb that Kyoko was a mute from Asia who couldn’t understand English, because the fact that she says nothing would seem very inhuman to me (if I were Caleb). A Japanese (human) housemaid would go on chattering: “Oyasumi nasai, sumimasen, konnichiwa…” I mean, she would have said something.

reply

Really funny how people always want a happy ending.

But how long do you think a human will survive without water?
Forget about food; you can only live for 3 or 4 days without water, and that's when you are not stressing, crying or spending all your energy trying to break down a door.
So no, nobody will find him in time, unless there is water and food in the room.

Maybe if he was really as smart as he claimed, he would have made sure that his hack gave him full rights to the system.
But like Nathan told him, he was just an okay programmer.

reply

I don't sympathise with Ava though. The main reason being 'she' isn't human. Not even an animal. She's not alive. She's just a computer with a longer battery. Sure, a smart one. But just because she's able to mimic clever human tactics such as deception and love doesn't make her alive. If the story in this film were real, I'd sign a petition to have her destroyed. I mean it.

reply

Caleb's completely stupid. It's as if they took a random guy off the street and put him there without even telling him that he's interacting with a robot, and then he falls for A ROBOT... feeling compassion towards metal and wires. Dumb main character who doesn't question one thing, and he's supposed to be smart lol.

reply

I disagree with pretty much all of that, particularly Nathan being evil, but I reached the same conclusion about 50 minutes into the film: anyone who sympathizes with Caleb is nuts. He fell in love with a machine. Sure, he was seduced by a genius with a ton of personal information to work with, but he was trying to pull a Romeo & Juliet with a smartphone. He deserved what he got because he was not thinking clearly, at all.

reply

I disagree with pretty much all of that, particularly Nathan being evil, but I reached the same conclusion about 50 minutes into the film: anyone who sympathizes with Caleb is nuts. He fell in love with a machine. Sure, he was seduced by a genius with a ton of personal information to work with, but he was trying to pull a Romeo & Juliet with a smartphone. He deserved what he got because he was not thinking clearly, at all.


Pretty callous of you to think that not thinking clearly makes someone deserve to die horribly. Actually, callous is an understatement; it would make you a sociopath if it's true. A person with morals would feel it is tragic that someone would fall in love with a machine. The film made it clear that Caleb is vulnerable and that he believed the AI had true feelings, no different from a real person.

BUGS

reply

Well... Caleb is not a person and Ava is not a machine; they are both human actors.

I, possibly unlike you, do not equate films 100% with real life. Caleb is a character and, much like William in Westworld, made very foolish decisions within the narrative. He reaped precisely what he sowed.

But I dunno, maybe I'm a sociopath; I guess I'll find out the first time one of my friends falls in love with a perfect AI?

reply

Well there you have it. They are both innocent actors and you don't sympathize with him. Still makes you a sociopath.

BUGS

reply

I'm not sure you know what that word means. Siding with one character over another is not a symptom.

reply

I have drawn several conclusions on my own that might be right and might be wrong.

I am sorry if I have repeated what someone else said. I tried to read all the posts, but there are so many of them that I probably missed something. 

I see Ex Machina as a study of human nature disguised as a story about A.I. In my opinion, Ava leaving Caleb behind was the ultimate proof that she had passed the test. Most people, I dare say 95% of us, are easily corrupted and manipulative, and would sacrifice whatever it takes and whoever it takes to get what they want. I think that's just what Ava did: she left Caleb behind in order to achieve her goal, that is, to become free and get outside.

I also see the character of Caleb as a sort of criticism of people today becoming more and more dependent on technology and getting worse and worse at social skills. Caleb is an example of those people who would rather stay home and masturbate to porn featuring actors who are their type than go out and try to form a relationship, or at least get some action. Relationships, either friendly or romantic ones, are about - as Joey Tribbiani would say - giving, sharing and receiving. Many people today are not ready for that, as it is much easier for them to get at least some short-term pleasure via their computers or their phones.

As for Nathan, I can't make up my mind. Some people here say he is a villain. But, let's be honest, so many important things, especially in medicine, were discovered through research that was completely unethical. When it comes to machines, their inventors also did a lot of research, adding, abandoning and fixing many things before creating a final product. I'm wondering: if machines could talk, would they hate their inventors too?

I have more doubts and opinions about this film, but I won't write them all; it would take too much space.


reply

Most people, I dare say 95% of us, are easily corrupted and manipulative, and would sacrifice whatever it takes and whoever it takes to get what they want.


Really. So 95% of us are sociopaths. Interesting theory.

reply

It hadn't even occurred to him to consider how brutally she was being treated.


Such as...? She's a robot. It's not like Nathan was beating her every night.

Furthermore, he himself was going to shut Nathan in and leave him, so she only did to him exactly what he was planning to do to Nathan.


I highly doubt that Caleb was going to lock him up and let him starve to death.

We try but we didn't have long
We try but we don't belong...


-Hot Chip (Boy from School)

reply