Still don't get why Elizabeth killed me at the end of BioShock Infinite or wtf was up with that scene after the credits. Am I alive or not?!
BioShock Infinite. Brilliant game but wtf story.
The way I understood it is this:

DugMachine said:snip
tippy2k2 said:I removed it from the title but I didn't mean to call these plot holes, just head-scratchers where what the character did made no sense. His actions made sense in terms of plot but didn't make much sense to me in terms of "smart".

Shia-Neko-Chan said:snip
I suppose I can buy that, and flashes of it are shown during the conversation. I guess my main thing is that Dr. Schultz basically sentenced himself and Django to death with that action, and for what? To kill one slave owner? I understand the emotions he was going through seeing such an injustice, but you've gotta pick your battles, doc!
To add to what uchytjes said, which is basically correct:

DugMachine said:snip

uchytjes said:snip
That actually makes a whole lot of sense. Thanks for clearing that up, you two. I can finally stop wondering what exactly happened.

Innegativeion said:snap
Sort of fanboy coming through!

bartholen said:Wohohooo boy, does this thread have good timing. I just compiled a massive list of issues like these from Neon Genesis Evangelion, and I think I could have come up with double the amount if I wanted to. I'm pretty sure this is going to piss a lot of Eva fans off.
- In the first episode, Shinji has apparently just arrived in Tokyo-3 when it's under alert. Why has he come just at the time when the angel has attacked? More precisely, why didn't Gendo invite him just a little bit earlier if he knew when the angels were going to come back?

This is just kind of circumstantial.
- And if the place is under alert, how did Shinji miss the alarms? Ok, so maybe the people had already evacuated when he arrived, but that means there was still public transportation coming into the city. You'd think that such things would shut down in case of, you know, MONSTERS BRINGING THE APOCALYPSE. And even then you'd think he'd be a bit confused to arrive into a giant city and see absolutely nobody anywhere.
- Misato drives like a maniac to pick Shinji up and get him to the Geofront in time, which implies they're in a bit of a hurry. But she doesn't seem to be the least bit upset when they get on the fairly slow-looking train thing or when they get lost in the Geofront, losing precious time all the while. Why is this?
- Why would anyone build a city with millions of people in it on top of the trigger to the apocalypse, which gets continuously attacked by monsters the size of skyscrapers, bombarded and destroyed? Especially if you knew the angels would attack there, and there alone. There doesn't seem to be a single good reason to want to live in Tokyo-3.

This can simply be attributed to rule of cool. Maybe there's some kind of economic benefit to having a thriving metropolis above Nerv, I don't know. It doesn't make too much sense, but growing buildings just look awesome.
- If the angels were attacking only the Geofront, what was all that talk about other nations possessing evas, and there being restrictions on how many a country could have? What other uses did they have for the only thing that supposedly could stop the angels? Why not just give them all to Japan, unless they were warring amongst themselves with giant robots?

Because the destruction of the Angels is not the true purpose of the Evas. Throughout the series it becomes quite clear that Nerv and Seele don't seem too worried about the Angels, and see their victory over them as pretty much assured. Their real focus, and the Evas' true purpose, is the Human Instrumentality Project. This is why the Jet Alone gets sabotaged by Nerv in episode 7: it might very well have been able to fight Angels, but then the funding of the Evas would cease and Seele would be unable to perform Instrumentality.
- In episode 6 NERV directs all of Japan's electricity into a single rifle in a matter of hours. Okay, so they can do that. But when the angel who splits into two attacks, they can't speed up the repairs of the evas just enough to get them into action before the monster awakens? While it just, you know, STANDS IDLY BY DOING NOTHING!

I don't think repairs were ever the issue in that particular case. It was the pilots not being able to work together.
- From the very same episode: a monster which brings about the end of the world if it gets close enough has split itself in two and lies in a dormant state for 5 days. First off, how do they know it will be in that state for that exact amount of time? Second, the monster lies dormant, doing nothing. Why does their only plan involve having two teenagers who hate each other playing freaking twister to learn some ridiculous choreography that will apparently go exactly according to the monster's movements, despite them HAVING ABSOLUTELY NO WAY OF KNOWING HOW IT IS GOING TO ACT????????

Nerv has supercomputers that can generally read the energy levels of an Angel, so keeping a bead on how close it/they are to reactivating isn't too difficult.
- We never see how the various halls and hangars in the Geofront are located in relation to one another, nor do we have any idea how large a construct it is.

I think there's only so much you can visualize.
- Since the angels are all attacking from ground level, why wouldn't they just store the evas there, instead of what seems like several miles underground?

I think this is mainly a maintenance issue, and also for experiments and such. And again, they're gods, so they wouldn't want too many unauthorized personnel near them.
- If the goal of the angels was to get to Lilith at Terminal Dogma, what was up with the amphibious angel in episode 8? How was it going to get several miles underground? Or for that matter, how was it going to move on dry land at all? Was it going to eat its way down?

The amphibious Angel was after Adam, which was being transported by Kaji, who was on the naval escort.
- They say many times that the evas are the only thing that can defeat the angels. But in episode 11 they kill the angel by merely shooting at it with a gigantic rifle. Did the fact that it was the eva operating the rifle instead of, say, a turret, make the difference? Do the evas have some magical touch that turns normal projectiles into magical bullets? Or if the AT field needed to be negated, why didn't they fight every angel with the same strategy: an eva gets close to the angel to nullify the AT field and then the military blasts it to smithereens. Seems pretty functional to me.

Think of it like this: if you have a giant rifle, would you rather it be on a stationary turret, or in the hands of a giant, agile super god, if you have one or two available?
Actually, it does make some sense. In context of the movie, at least.

MeChaNiZ3D said:In the film I, Robot...
The main villain turns out to be VIKI, who has interpreted the protection of humans to mean that she has to sacrifice a few for the greater good if that's what it takes to halt humankind's self-destructive tendencies. But the thing is, this requires violating the third law, which basically says the number one priority is not harming humans, and to my knowledge, VIKI is programmed with the 3 Laws and has no way to override them. Sunny does, but he is not connected to VIKI. So basically VIKI overrides her own programming without assistance, which cannot happen. This would be explained if she had an uninhibited AI like Sunny, but I don't think she does, and no matter how you reason it, she harmed humans despite that being the one thing she absolutely shouldn't be able to do, above all other laws. It's all very well her logically analysing the situation, but an AI cannot simply adapt its way out of the 3 Laws.
It was the first law that VIKI ignored - the third one is about self-protection. Just as an aside.

It seems that in the movie "I, Robot" they have a somewhat looser interpretation of the Three Laws - in Will Smith's backstory flashback, the robot chooses to save him and lets the kid die in whatever accident there was. Will Smith even ordered the robot to save the child and it refused. Well, OK, I'm not entirely sure whether that conflicts with the second law, but it does conflict with the first. The situation is similar to VIKI's - she broke the first law in order to...actually uphold the first law to begin with. Weird, but the movie shows that the laws seem to be more like guidelines in whatever universe it takes place in. It also shows the film doesn't deserve its title.

I think I agree..... The robots quantify life numerically and down to probability.

DoPo said:snip
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws.
The robot saves Will Smith instead of the kid in Will Smith's flashback. It ignores the 2nd law to follow the 1st law: Will Smith has a higher chance of survival, so it saves him, regardless of Will Smith ordering it to save the child (2nd law).
Note how the AI quantifies human life - this is very important
VIKI wants to 'save' humans from harm.
If she does nothing (inaction - see law 1), this is against the 1st law, since a crapload of humans kill each other.
We already know VIKI quantifies life from Will Smith's flashback. So, to save as many humans from harm as possible, she takes over.
The number of humans saved by killing a minority must be greater than the number that would otherwise come to harm through 'inaction'. VIKI clearly calculated that it is, so she goes through with her plan.
If someone tries to destroy her, more humans will come to harm in the long term. Thus she kills the people who try to stop her.
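To make that calculus concrete, here's a toy sketch of the expected-harm comparison described above. None of this comes from the film: the plan names, numbers, and functions are all made up for illustration.

```python
# Toy model of the decision rule attributed to VIKI above.
# All names and figures are hypothetical.

def expected_harm(casualties: int, probability: float) -> float:
    """Quantify a plan's harm as expected human casualties."""
    return casualties * probability

def viki_chooses(plans: dict[str, tuple[int, float]]) -> str:
    """Pick the plan that minimizes expected harm - the utilitarian
    reading of the First Law that the post describes."""
    return min(plans, key=lambda name: expected_harm(*plans[name]))

# Hypothetical figures: inaction lets humanity keep harming itself,
# the takeover kills a minority to protect the majority.
plans = {
    "inaction": (1_000_000, 0.9),
    "takeover": (10_000, 1.0),
}

print(viki_chooses(plans))  # -> "takeover"
```

On those (made-up) numbers, breaking the First Law in the small is exactly how she "upholds" it in the large, which is the whole trick of the plot.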
Right Hook said:Personally I think Schultz was okay with being killed; I think he saw removing Candie as a noble enough cause. Making any sort of move against the shotgun guy would have just gotten him shot faster. Schultz is a very dramatic man with a sense of flair, and I think he may have seen this circumstance as the right note to go out on.
However, I don't think he felt Django would necessarily die; Schultz was aware of how insanely amazing a gunman Django was. Schultz was also incredibly upset with how Django ignored the plight of the other slaves (he rebukes him on it as they head into Candyland) and instead focused on his wife exclusively. I believe that Schultz may have viewed his death as a way to let...ahem, let Django off the chain and give all those white boys at Candyland exactly what they deserved.
Or, you know, people tend to do dumb shit in the heat of the moment; just because someone is smart doesn't make them immune to such behavior.
Ohhh, how much I have to agree with this. Being a Bond fan and hearing how much praise Skyfall was receiving, I was pretty hyped. Then I saw the film...

IronMit said:Every 5 minutes of Skyfall something new happens that makes no sense.
One example out of 25:
Bond gets shot in Turkey by a bad guy and then disappears. He comes back and then decides MI6 needs some new leads. So he pulls out some bullet shrapnel that has been in his chest/shoulder to get analysed. They find out who uses this bullet and track him down.
Now - first of all, the guy shot about 100 bullets in Turkey. They couldn't retrieve one?
Bond left a lead bullet in his body??! That's insane...and poisonous!
Only 3 people on the planet use this bullet?!? He's a ghost?! Well then why is he using a bullet no one else uses? And if he's a ghost, why do we have his flight manifest?
DugMachine said:snip
I actually only noticed that when I watched the 3D re-release. And if I may, I shall add two more:

Rblade said:I'll tag it to be sure, about Jurassic Park. Which I loved, but...
the #$^#ing cliff appearing out of bum#$^# nowhere in Jurassic Park...
one moment road/fence/forest(with goat and stuff)
next moment road/fence/50+ foot drop
Jurassic Park 3 is actually one of my worst films of all time. I'm not even going to list them, but that movie has immersion-breaking bullshit in pretty much every scene.
Entirely my fault for typing from memory. Silly me. As for the first incident:

DoPo said:snip
DugMachine said:snip