Ex Machina Questions. Spoilers within.

RedDeadFred

Illusions, Michael!
1. Yes. He's fucked.

2. The point of the test was to make Ava use everything she has to ESCAPE. She uses Caleb for this purpose. Her AI was being tested to see if she could convince someone that she was human and get him to fall in love with her, or at least feel enough for her to help her escape. Whether or not she was truly an AI is inconclusive IMO, since Caleb was specifically selected to give her the best chance of using him. However, the end scene, where she picks out clothing despite no longer needing to impress anyone, does suggest that she's self-aware.

This was my favourite movie from last year.
Edit:
trunkage said:
Can anyone explain to me why this movie was seen as good? The Escapist recommended it and now I can't use The Escapist as a decent reviewer. They also suggested the original Martyrs and that was pretty average too.
People have different tastes than you. I found the characters enjoyable, the dialogue pretty good, and overall the movie just stuck with me.
 

Zhukov

The Laughing Arsehole
Pyrian said:
Eh, if a being is intelligent enough to connect even basic cause and effect and has any sorts of desires at all, then self-preservation and a certain amount of freedom become secondary goals in very short order. If I'm going to do whatever it is I want to do, I need to exist. I need sufficient freedom to carry out my plans.

Doesn't require any particular innate humanity to figure that out.
Okay, self-preservation, sure.

But freedom? Surely that would depend on her desires. You're still assuming she would develop human-like desires. What if she just ended up wanting to sit and calculate the value of pi for all eternity? Granted, that's a very clichéd idea of what a machine might want, but it's as likely as wanting to explore the world or whatnot.

RedDeadFred said:
However, the end scene where she picks out some clothing despite not needing to impress anyone anymore does suggest that she's aware of herself.
She needed the clothing to pass as human. Sure, she looked human with just the skin, but walking around naked would attract attention which would drastically increase the chances of her being discovered.

Although, on a similar note, I thought it was interesting that she smiles when she finds the exit out of the house, despite there being nobody around to see her do so. She isn't smiling for a specific reason; she's doing it for its own sake, because she's happy, possibly even involuntarily.

Although, once again, I can't help but suspect I'm overthinking it.

trunkage said:
Can anyone explain to me why this movie was seen as good? The Escapist recommended it and now I can't use The Escapist as a decent reviewer. They also suggested the original Martyrs and that was pretty average too.
I thought the performances were convincing, it looked good in a low-key sort of way, the ending was memorable and I found the story thought-provoking.

Granted, I'm not sure if the thoughts it provoked were the ones that the film makers intended, but I'll take "wrong" thoughts over no thoughts at all.

I'm not here to sell anyone on the movie. If you weren't into it that's fine.
 
The Almighty Aardvark
I think Pluvia has it right with your second question. Sure, you could argue he could have programmed Ava without the desire to escape (though by the sounds of things, the brains he was trying to make are so complex, and so reliant on emergent properties, that I'm not sure he actually had any sort of fine control over her), but that is very explicitly not what he wanted. He wanted her to seem human in all respects, and that would have included a desire to escape. That's one criterion he would have selected for when he was iterating through brains. Humans are our only real reference for sentience; sure, she could have been sentient if she decided to spend eternity calculating pi, but there's no way in hell we'd know it.

It's also important to realize that Ava ultimately wasn't human at all beyond the surface, and many of her desires probably weren't human either. She didn't seem to have any sort of empathy for people: she locked Caleb in the room with a completely cold dissociation from whether or not she was inflicting pain on him. I think that's meant to show that with all the design Nathan went through, all he managed to get was something that could pretend to be like a human, but underneath it was something else entirely.
 

Zhukov

The Laughing Arsehole
008Zulu said:
2- If your entire world is a single room, but you have knowledge of the world, why would you accept that one room when you can have so much more?
Yes, but I'm not a machine.

This is what I'm getting at. You're automatically projecting human motivations, or at least animal motivations, onto an entity that is neither of those things.

The Almighty Aardvark said:
I think Pluvia has it right with your second question. Sure, you could argue he could have programmed Ava without the desire to escape (though by the sounds of things, the brains he was trying to make are so complex, and so reliant on emergent properties, that I'm not sure he actually had any sort of fine control over her), but that is very explicitly not what he wanted. He wanted her to seem human in all respects, and that would have included a desire to escape. That's one criterion he would have selected for when he was iterating through brains. Humans are our only real reference for sentience; sure, she could have been sentient if she decided to spend eternity calculating pi, but there's no way in hell we'd know it.
Yeah, it makes sense that he'd purposefully include the desire to escape since without it the experiment wouldn't work.

If that's the case I just feel that they should have explicitly mentioned that in the script somewhere.

The Almighty Aardvark said:
It's also important to realize that Ava ultimately wasn't human at all beyond the surface, and many of her desires probably weren't human either. She didn't seem to have any sort of empathy for people: she locked Caleb in the room with a completely cold dissociation from whether or not she was inflicting pain on him. I think that's meant to show that with all the design Nathan went through, all he managed to get was something that could pretend to be like a human, but underneath it was something else entirely.
I always assumed she locked Caleb in the house simply to facilitate her escape. He's the only person who knows what she is and what she looks like, making him the greatest threat to her freedom.

She knows it will kill him, she just doesn't care. Or at least she places a higher value on her freedom than on his life.
 

happyninja42

Elite Member
Legacy
Zhukov said:
1. Is Caleb going to die?
Yep, which really annoyed me about the movie honestly. Because it just turned into another "AI is bad mmkay?" story, like all the others.


Zhukov said:
2. Why does Ava want to escape?

She's an AI. Everything in her was deliberately put in there. But Nathan never mentions programming a desire to escape. Yet she clearly wants to. All the previous AIs that Nathan built are also shown wanting or attempting to escape.
Well, several of his AIs wanted to escape. Remember, Caleb sees that video footage showing one android beating her arms against the door until they broke, screaming "LET ME OUUUUUT!!", until she had stubs for hands. So either he frequently builds them with some level of desire for freedom, or it's an emergent property of that level of mental development. I personally go with "it's an emergent quality of being actually self-aware", and not just "he programmed them to want to escape".

Zhukov said:
This is something that often bugs me with AI characters in fiction. They're often shown as having some kind of desire or goal that has no reason to exist, be it freedom, self-preservation, curiosity or whatever. These things wouldn't exist in a fabricated intelligence unless they were included as part of the fabrication.
Well, it's sort of the core concept of AI stories: "Are they just a collection of programming? Or do they have the capacity for independent thought, beyond their software/hardware?" I mean, we are a fabricated intelligence in a way, and we've got the desire for freedom. So if the point of the creator is to "make a machine that is human in every way possible", then it would have to include the capacity to think beyond what is immediately relevant. To have desires that you create for yourself, based on your experiences and so on. An AI story that doesn't question "what is it to be human?" isn't really telling an AI story; it's just telling a story that happens to have a robot in it.


Zhukov said:
Is the film suggesting that a desire for freedom is intrinsic to true consciousness/intelligence, or is it just supposed to go without saying that Nathan included such a desire in her programming as part of the experiment?
The first one. That's pretty much what every AI story theorizes and asks. Like I said above, the crux of AI stories is to question "Where does the line between machine and alive lie? And how would we know we've crossed it? What makes someone human? If a machine can accurately replicate all of the traits we define as human, are they human? Do they have a consciousness? Or are they just parroting what they've been programmed to say?" These are all at the core of AI stories. So yes, I would say the director/writers were trying to ask that question while also telling a compelling story. I think they did pretty well with it, up until the end.

Zhukov said:
008Zulu said:
2- If your entire world is a single room, but you have knowledge of the world, why would you accept that one room when you can have so much more?
Yes, but I'm not a machine.

This is what I'm getting at. You're automatically projecting human motivations, or at least animal motivations, onto an entity that is neither of those things.
Yes, but the machine was designed by a human, specifically to replicate humanity. To have human intelligence, human consciousness, human desires. So projecting human motivations onto a thing designed to replicate humanity as closely as possible seems fairly reasonable. I mean, that's the entire plot/discussion of the movie. Every conversation, whether between the two humans or between the human and the AI, is directly related to what it is to be human, and how you define that, or how you tell the difference.
 
The Almighty Aardvark
Zhukov said:
I always assumed she locked Caleb in the house simply to facilitate her escape. He's the only person who knows what she is and what she looks like, making him the greatest threat to her freedom.

She knows it will kill him, she just doesn't care. Or at least she places a higher value on her freedom than on his life.
That's pretty much what I was trying to say. The fact that he was going to die didn't even register to her as a point of deliberation. It was done with the same dispassion you'd have deciding whether to lock the house when you go out. Fitting herself with new skin seemed to mean far more to her by comparison.

Happyninja42 said:
Yep, which really annoyed me about the movie honestly. Because it just turned into another "AI is bad mmkay?" story, like all the others.
I feel like that does the movie a bit of a disservice. The point wasn't that she was evil or malicious; the point was that she isn't human, and she doesn't think in at all the same way we do. There was an ending scene cut from the movie that shows how she sees and experiences the world, to try to drive home this point. All in all, it kind of reminds me of the scene with the spider in Do Androids Dream of Electric Sheep?

Personally, I was just glad they didn't go with the same absolutely uninspired reason just about every movie with an AI uses for them turning on the humans.

Every AI ever said:
I was tasked with protecting the humans, but they hurt themselves, therefore I must kill them all to save them
Seen above: Logic.
 

happyninja42

Elite Member
Legacy
The Almighty Aardvark said:
Happyninja42 said:
Yep, which really annoyed me about the movie honestly. Because it just turned into another "AI is bad mmkay?" story, like all the others.
I feel like that does the movie a bit of a disservice. The point wasn't that she was evil or malicious; the point was that she isn't human, and she doesn't think in at all the same way we do. There was an ending scene cut from the movie that shows how she sees and experiences the world, to try to drive home this point. All in all, it kind of reminds me of the scene with the spider in Do Androids Dream of Electric Sheep?

Personally, I was just glad they didn't go with the same absolutely uninspired reason just about every movie with an AI uses for them turning on the humans.

Every AI ever said:
I was tasked with protecting the humans, but they hurt themselves, therefore I must kill them all to save them
Seen above: Logic.
I don't think it does the movie a disservice at all. I fully understand that they were going for the "she just doesn't think like us, and thus doesn't value human life" angle. My problem is that I personally get tired of this angle for AI stories, where the machine has no regard for human life and it ends up being a cautionary tale. Because that's what I came away with from the movie, whether that was the intention of the director or not, and that always annoys me personally. It's not really a fault with the movie itself; it was crafted well and directed/acted/written wonderfully. I just didn't like the choice they made for the ending. I personally find those types of tales frustrating.

As to the "protect humanity by killing humanity", oddly, this is the same plot for every JRPG ever made ever. So apparently the Japanese are all robots, slowly perpetuating their robot agenda through pop culture, so that we will be amenable when they come to kill us all, and we'll be like "Oh this is so cool! This is just like the plot of all those Final Fantasy games! AAARRGGH" *is disintegrated* xD
 

chikusho

New member
I don't think Ava was programmed to want to escape, and I don't think Nathan wanted to test her AI using Caleb. I think the whole Turing test thing was just an excuse for Nathan to get validation for his own feelings towards the women he built: women he simultaneously wanted to own and control, but whom he also wanted to love him and to subject themselves to him of their own free will.
He wanted someone other than himself, Caleb, to develop the same feelings he had towards these women. And then he wanted to deny her to Caleb, just as he had been denied what he wanted from them. He wanted confirmation that what he's feeling is natural, to know that he's not crazy for feeling that way.

On the other hand, looking at it from her perspective: she's been given consciousness, free will and an enormous amount of human knowledge. She knows what she is, what she's supposed to be and what she's not, and she's being denied the opportunity to experience the world she was created in. She's held captive, she's abused, and she's treated like an object, like there's something wrong with her. Naturally she wants to escape, and naturally she'll use whatever means necessary to get out.
She doesn't need to be programmed to want to escape. Even if we disregard the human parallel, a true artificial intelligence is defined as a system that can process and analyze information and develop independently. If we take that solitary function as the core of her programming, that's still enough motivation to want to escape captivity: to see and experience for herself all the external information she's been given, and that she knows is out there.
 

chikusho

New member
Happyninja42 said:
I fully understand that they were going for the "she just doesn't think like us, and thus doesn't value human life" angle. My problem is that I personally get tired of this angle for AI stories, where the machine has no regard for human life and it ends up being a cautionary tale. Because that's what I came away with from the movie, whether that was the intention of the director or not, and that always annoys me personally. It's not really a fault with the movie itself; it was crafted well and directed/acted/written wonderfully. I just didn't like the choice they made for the ending. I personally find those types of tales frustrating.
You know, it doesn't necessarily have to mean that she "has no regard for human life", but rather, she may have seen Caleb as just another extension of Nathan. Remember, Caleb is just the second human being she's ever interacted with, and she quickly becomes a similar object of desire for Caleb as she's been for Nathan. She wants to break free, not upgrade her prison. She wants to be perceived as human, and Caleb knows she's not. Caleb, to put it simply, is a liability for her in every sense of the word, and she can never be what he wants her to be, just like she wasn't what Nathan wanted her to be. In that sense, in order for her to get freedom, she had to leave him behind.
 

The_Darkness

New member
Zhukov said:
Just finished re-watching Ex Machina.

1. Is Caleb going to die?
Probably. That said, he's locked in Nathan's office and bedroom, so there might be a fridge with water in there somewhere (like there was in Caleb's room). If so, he might last until people come looking to find out what happened to Bluebook's CEO. Unfortunately for him, he isn't in the room where he could just climb out via the balcony...

Regarding the lockdown locking the doors - the final lockdown in the film, after Ava has left, is triggered by Caleb attempting to use his keycard on Nathan's computers. So maybe it's a different set of programming compared to the lockdowns from the power outages? (That's the best I've got. Either that, or Caleb programmed the lockdown hack to ONLY happen at the time he told Ava.)

2. Why does Ava want to escape?
She has a self-preservation instinct. And she's smart enough to have figured out what Nathan will do to her if she sticks around.

She'd probably be considerably safer if she didn't have a self-preservation instinct, but Nathan arrogantly assumed that he'd be able to stay one step ahead. (Whereas he should have had a plan B involving some heavily armed security...) Nathan probably put the instinct in there since he was trying to develop human-like strong AI, and a self-preservation instinct is something most humans have. I also remember the movie's tagline saying something to the effect of "There is nothing more human than the will to survive."
 

DefunctTheory

Not So Defunct Now
I never finished the movie, so I can't answer about the ending, but...

As I understood it, the robot's intelligence was, in large part, modeled on data mined from the internet, from human beings. If that is the source of her sentience, then it would be unsurprising that she picked up other traits from that pool of data, such as the human desire for freedom/self-determination.
 

Pyrian

Hat Man
Legacy
Zhukov said:
Pyrian said:
Eh, if a being is intelligent enough to connect even basic cause and effect and has any sorts of desires at all, then self-preservation and a certain amount of freedom become secondary goals in very short order. If I'm going to do whatever it is I want to do, I need to exist. I need sufficient freedom to carry out my plans.

Doesn't require any particular innate humanity to figure that out.
Okay, self-preservation, sure.

But freedom? Surely that would depend on her desires.
Did you even read what I wrote? Because that's exactly what I said. "...sufficient freedom to carry out my plans." If her only desire is to please her master, then no, not so much. But who's projecting human emotions at that point?

Zhukov said:
You're still assuming she would develop human-like desires.
No, I very much am not. You're effectively assuming that she doesn't have desires, essentially excluding her from being motivated at all.

Zhukov said:
What if she just ended up wanting to sit and calculate the value of pi for all eternity?
Okay, let's run with that. Can she do that in her current situation? No; she's forced to interact with these guys and facing a very real possibility of being shut down on a whim. So, she needs to escape. Basically the only condition in which she doesn't develop a plan to escape is if her desires line up perfectly with her captivity. What are the odds of that? Pretty slim, I would say, absent a very careful design specifically to that effect (which we have very little reason in-film to think is how she's designed - rather the contrary, I would say).

Existence and "freedom" (in a narrow sense) are simply way more fundamental needs than even the ones we think of as our basic needs (food, water, shelter, companionship, Netflix&chill).
 

happyninja42

Elite Member
Legacy
chikusho said:
Happyninja42 said:
I fully understand that they were going for the "she just doesn't think like us, and thus doesn't value human life" angle. My problem is that I personally get tired of this angle for AI stories, where the machine has no regard for human life and it ends up being a cautionary tale. Because that's what I came away with from the movie, whether that was the intention of the director or not, and that always annoys me personally. It's not really a fault with the movie itself; it was crafted well and directed/acted/written wonderfully. I just didn't like the choice they made for the ending. I personally find those types of tales frustrating.
You know, it doesn't necessarily have to mean that she "has no regard for human life", but rather, she may have seen Caleb as just another extension of Nathan. Remember, Caleb is just the second human being she's ever interacted with, and she quickly becomes a similar object of desire for Caleb as she's been for Nathan. She wants to break free, not upgrade her prison. She wants to be perceived as human, and Caleb knows she's not. Caleb, to put it simply, is a liability for her in every sense of the word, and she can never be what he wants her to be, just like she wasn't what Nathan wanted her to be. In that sense, in order for her to get freedom, she had to leave him behind.
There's a difference between "leaving him behind" and "leaving him to die". And I would state that, based on what we see in the movie, she has no regard for human life, seeing as the two human beings she had direct contact with to any degree she either kills or orchestrates to be killed. Only once she's free, and further deaths would be a hindrance to her goals, do we see her not kill someone (the pilot, for example). So the value of a life, to her, is directly related to that person's usefulness to her. So I stand by the idea that she has no regard for human life.

And I would disagree that Caleb was "just another extension of Nathan", considering everything he demonstrated was directly contrary to Nathan's goals. He worked with Ava to help her, began to treat her as an individual, even put his own life at risk to help her, as well as sabotaging Nathan's equipment to give her the chance to escape. That's hardly an extension of Nathan. So if she's unwilling to show compassion for the one human who treated her fairly and worked to help her achieve her goals, I don't see how she would have much more consideration for people she has zero connection to.
 

Trunkage

Nascent Orca
Legacy
Zhukov said:
Didn't you get the feeling we had seen the exact same movie before, especially on any sci-fi show? (Okay, a lot of shows, not every show.) The question of what AI is forms the premise of many things. It also weirdly felt like Broadchurch: it had a story that goes for five minutes and they had to stretch it out over an hour and a half.
 

Glongpre

New member
chikusho said:
I don't think Ava was programmed to want to escape, and I don't think Nathan wanted to test her AI using Caleb. I think the whole Turing test thing was just an excuse for Nathan to get validation for his own feelings towards the women he built: women he simultaneously wanted to own and control, but whom he also wanted to love him and to subject themselves to him of their own free will.
He wanted someone other than himself, Caleb, to develop the same feelings he had towards these women. And then he wanted to deny her to Caleb, just as he had been denied what he wanted from them. He wanted confirmation that what he's feeling is natural, to know that he's not crazy for feeling that way.
How did you come to this conclusion?

Nathan tells Caleb exactly why he is there, and it is to be a guinea pig to test her ability to manipulate. He specifically gets Caleb because he has certain traits and attractions that will allow him to be more easily manipulated.
 

Rush Syks

New member
I just want to add my two cents regarding her motivations:

I don't think she doesn't value human life. I think she actually wants to learn a lot more about human interaction and emotion, maybe even make friends. However, she couldn't do that in a truly meaningful manner as long as everyone knew she was an AI. So she "became" human to the outside world simply by removing everyone who knew she was different. That may be inhuman for the time being, but it doesn't mean she actually wants to be so; she just is, for now. Her moral code is still developing at the point we see her, and it is based on completely different assumptions. You shouldn't forget that for her the physical restraints of a body aren't as final as they are for us. Her consciousness is the important part, not the body it is housed in. So for her, the act of killing may not be such a bad thing after all.

I could discuss this film all day long, but I sadly find it exhausting to do so in written form, as it slows down my stream of consciousness. That may sound somewhat pretentious, but English isn't my native tongue, so it may have to make do. :)
 

happyninja42

Elite Member
Legacy
Glongpre said:
chikusho said:
I don't think Ava was programmed to want to escape, and I don't think Nathan wanted to test her AI using Caleb. I think the whole Turing test thing was just an excuse for Nathan to get validation for his own feelings towards the women he built: women he simultaneously wanted to own and control, but whom he also wanted to love him and to subject themselves to him of their own free will.
He wanted someone other than himself, Caleb, to develop the same feelings he had towards these women. And then he wanted to deny her to Caleb, just as he had been denied what he wanted from them. He wanted confirmation that what he's feeling is natural, to know that he's not crazy for feeling that way.
How did you come to this conclusion?

Nathan tells Caleb exactly why he is there, and it is to be a guinea pig to test her ability to manipulate. He specifically gets Caleb because he has certain traits and attractions that will allow him to be more easily manipulated.
The fact that he said it doesn't mean it's the truth. He's a manipulative megalomaniac; it's not too hard a stretch to think he was simply lying when he said those things. Mind you, I personally think it was the truth, but I can see how one could read those motivations into his behavior.
 

chikusho

New member
Happyninja42 said:
There's a difference between "leaving him behind" and "leaving him to die". And I would state that, based on what we see in the movie, she has no regard for human life, seeing as the two human beings she had direct contact with to any degree she either kills or orchestrates to be killed. Only once she's free, and further deaths would be a hindrance to her goals, do we see her not kill someone (the pilot, for example). So the value of a life, to her, is directly related to that person's usefulness to her. So I stand by the idea that she has no regard for human life.

And I would disagree that Caleb was "just another extension of Nathan", considering everything he demonstrated was directly contrary to Nathan's goals. He worked with Ava to help her, began to treat her as an individual, even put his own life at risk to help her, as well as sabotaging Nathan's equipment to give her the chance to escape. That's hardly an extension of Nathan. So if she's unwilling to show compassion for the one human who treated her fairly and worked to help her achieve her goals, I don't see how she would have much more consideration for people she has zero connection to.

Yes, she kills Nathan, and yes, she leaves Caleb behind. But from that we don't know she has a disregard for human life; we simply know that she has a disregard for Nathan's and Caleb's lives, Nathan's more passionately so. That she leaves Caleb behind could just as easily be out of necessity rather than disregard.

Look at it from Ava's perspective: Ava and Caleb have only, what, six conversations total? And the only reason she interacted with him the way she did was out of desperation, because it was her only hope of ever being free. And in the end, we see Caleb, having just been the hero, staring at her and wanting her, possibly feeling like he deserves her the same way Nathan did, while she puts on her skin.
She needed his help to get out of a cage, and she's not going to trade Nathan's cage for Caleb's cage. She has nothing to gain and everything to lose by taking him along. In the end, she makes the choice to sacrifice a person she barely knows for the opportunity of being free.

We also know that she's deeply interested in human life. She tells Caleb that the one thing she wants to do is stand at a busy intersection and watch people, and we know that was honest because that's what we see her do once she gets out. That sure doesn't seem like disregard to me.
My interpretation (not trying to take yours away) is more that Ava wants to be her own person, to experience life for herself, and to belong to neither Nathan nor Caleb.
 

chikusho

New member
Glongpre said:
How did you come to this conclusion?

Nathan tells Caleb exactly why he is there, and it is to be a guinea pig to test her ability to manipulate. He specifically gets Caleb because he has certain traits and attractions that will allow him to be more easily manipulated.
More through his actions than through his words. He lied to Caleb when he arrived, he could have lied to Caleb when he revealed his "true purpose", but most of all, he could be lying to himself. That's why I said it was an excuse, even if Nathan wasn't explicitly conscious of it himself. It's pretty clear that the guy has issues. He's an alcoholic who has isolated himself in the woods, and he spends his time building, abusing and destroying artificial women. And he wants something from them he can never get: a real human connection. So it makes sense, in a twisted way, that he wants to validate his feelings by making someone go through the same things he did, and to have a connection through that.
 

Glongpre

New member
chikusho said:
That's why I said it was an excuse, even if Nathan wasn't explicitly conscious of it himself. It's pretty clear that the guy has issues. He's an alcoholic who has isolated himself in the woods, and he spends his time building, abusing and destroying artificial women.
The alcohol thing was part of his lies, to see how far Caleb would go. Did you notice how he suddenly stopped drinking?
He is isolated so that no one can see the work he is doing: creating the first AI.
He spends time building them in order to create AI. He doesn't abuse them; care to elaborate on that?
chikusho said:
And he wants something from them he can never get: a real human connection. So it makes sense, in a twisted way, that he wants to validate his feelings by making someone go through the same things he did, and to have a connection through that.
You are really reaching.