Poll: Mass Effect Morals: Quarians, Geth, Morning War

ezaviel

New member
Mar 26, 2011
55
0
0
Thatkidnooneknows said:
ezaviel said:
Dimitriov said:
Of course they were in the right. Good God people!

If your refrigerator suddenly became sentient you wouldn't let it stop keeping your food cold: that would be ridiculous.

The Geth were quite literally, and in every conceivable sense, PROPERTY. No more.

You might be able to argue that it would be different if the Quarians had intentionally created an AI, but they didn't.

And seriously, they were sentient. So what? What on Earth and beyond does that have to do with anything? They were still just tools that were no longer functioning properly.
"Property" which can operate and think independantly is no longer property, it is now a sentient being.

In the terms of most moral codes, something capable of sentient thought and reasoning is attributed the same rights as a "person".

Continuing to force a sentient machine to operate as a tool would be slavery. "Shutting it down" is murder. Once the Geth became sentient, they morally became "people".
They're not people; you can't hack people.
So, if someone had a bionic arm or eye installed, which had computer parts, do they cease to be a person because they are now "hackable"?

Philosophers have argued about this kind of thing for thousands of years, and generally they conclude that a sentient, thinking, reasoning being is deserving of rights. Whether it is hackable, made of meat, made of jelly, made of gas, etc. is not considered.

To limit your definition of who deserves rights based on whether they look different or are made differently from you is an ethical and moral minefield.

Because the Hanar are not made of meat and bone, do they deserve only the same rights as an ocean jellyfish?
Because the Elcor are not bipedal, does that make them animals?

The type of life form should not influence its level of rights.

If it thinks like a person, reasons like a person, communicates like a person, has emotions, etc. like a person, is it not a person?
 

conflictofinterests

New member
Apr 6, 2010
1,098
0
0
undeadsuitor said:
conflictofinterests said:
The entire situation is really problematic, because as soon as the Geth became truly sentient, both sides had legitimate (or perceived legitimate) concerns of genocide by the other. The only winning move is not to play, and all that. I have a sneaking suspicion that people shouldn't attempt to create things with intelligence/sentience comparable to ours without being prepared to give them comparable rights the moment that intelligence/sentience is achieved.

The thing is, they didn't set out to make true AIs. They made workers with simple programs to handle themselves. The Geth started as controlled machines, given basic coding to perform functions under Quarian supervision. As the Quarians became more and more comfortable with their robotic workers, they added more and more programs, and the Geth became more and more automated. Eventually they got the bright idea to let the Geth "link" with each other, allowing the machines to become smarter the bigger the groups they were in. And as they built more and more Geth, the network grew bigger and the Geth grew smarter.

Then came the "What is my purpose?" questions from a few remote Geth. The Quarian government panicked, rightfully so, as they were already on shaky ground with the Council races, and ordered the shutdown of all Geth programs (as it was only a few acting up; the Geth hadn't become a "race" yet). The Geth retaliated, over-running the Quarians. The Council refused to help, and even stripped them of their embassy for creating AI (which was an issue before the Geth).

The Geth intelligence was a huge mistake, and both races made horrible decisions, which makes the whole "the only move was not to play" phrase much more fitting.
Still, the bold section is valid; they were making things by a method which had the potential (if unintended) consequence of achieving sentience, and instead of attempting to predict and prepare for this outcome, they denied its likelihood and panicked when it came to pass.

That being said, I can't help but draw parallels between a teenage girl and her unwanted pregnancy. Depending on your view, she may or may not have a right to terminate the pregnancy (given that she and the child are healthy and experience no complications) but after birth, it's morally wrong to attempt to kill the child (given that they are not in immediate danger which necessitates the survival of either one or neither).

Admittedly the analogy falls apart when it comes to the capacity of each party to defend itself and to understand the situation, but where the likeness does hold, the Council's treaty against the formation of AIs acted like abstinence-only education for the purposes of what to do when AI eventually came into being, and the Council itself acted the part of prudish school faculty, separating and punishing the Quarians/teen mother for something they/she had never been taught to handle.

TL;DR: Sentient things are brought into being on an hourly basis, but for some reason when they are composed of inorganic materials, they are thought to be incapable of being acculturated and must therefore be feared and annihilated.
 

conflictofinterests

New member
Apr 6, 2010
1,098
0
0
ezaviel said:
Dimitriov said:
ezaviel said:
Dimitriov said:
Of course they were in the right. Good God people!

If your refrigerator suddenly became sentient you wouldn't let it stop keeping your food cold: that would be ridiculous.

The Geth were quite literally, and in every conceivable sense, PROPERTY. No more.

You might be able to argue that it would be different if the Quarians had intentionally created an AI, but they didn't.

And seriously, they were sentient. So what? What on Earth and beyond does that have to do with anything? They were still just tools that were no longer functioning properly.
"Property" which can operate and think independantly is no longer property, it is now a sentient being.

In the terms of most moral codes, something capable of sentient thought and reasoning is attributed the same rights as a "person".

Continuing to force a sentient machine to operate as a tool would be slavery. "Shutting it down" is murder. Once the Geth became sentient, they morally became "people".

As there is no real world analog I am unclear how you can claim that there is a provision for non-human sentience in "most moral codes."

At any rate I disagree in the strongest terms. If your property becomes sentient it does not gain rights, or it would have stolen itself from you.


Affording rights to machinery is stupid.
African Americans were once considered non-human property. They were subsequently given rights because they were finally recognised as being people. How's that for a real world analog?
IN BLACK HISTORY MONTH TOO! AW SICK BURN!!!

But seriously, good point. The sentience of people of African descent was often denied by slave owners and racists, and the argument that they were just as sentient as the next human being, that they were just as much a person as the next human being, was essential to their achieving the rights and liberties afforded citizens.

Sentience changes everything; do you think humans would classify alien races as 'people' were they not sentient enough to communicate (and build spaceships and whatnot)?
 

conflictofinterests

New member
Apr 6, 2010
1,098
0
0
Whateveralot said:
xXxJessicaxXx said:
Whateveralot said:
Being completely rational "creatures", the Geth would eventually become what the reapers are; they feel they are above life. As they know no fear of death, they will strike relentlessly. Due to their collective minds, rapid growth and advancing technology, they would wipe out all creatures in existence. See what they did to the Quarians; despite the Quarians being tech-heads and the creators of the Geth, you'd expect them to be the race most capable of destroying the Geth in one decisive blow. They failed, so now the entire galaxy is at war.

That, plus the Geth would revere the reaper as a god, regardless of their relations with other races. The Geth would then be manipulated to act against the other races. Knowing Sovereign's course of action in Mass Effect (1), Sovereign and the entire Geth army would've jumped straight into an unsuspecting Citadel and unleashed all the reapers.


The Quarians did what they had to, to the best of their capabilities, sacrificing their culture, their life on their home planet, their diplomatic standing and their immune systems.
So EDI will eventually become a reaper? Poor Joker is in for a shock then.
Explain to me where I stated that the Geth would turn into the reapers, or how a single shipbound AI will suddenly become a killing machine without any interference?

Let me state that the Geth are not true, strong AI. They are only strong VIs with a neural network. What controls them is their neural network, which is somehow set to:
It's not though...
Did you play ME2? Have you met Legion? Because if you did, you must have been playing a different ME2 than I played.
 

Dimitriov

The end is nigh.
May 24, 2010
1,215
0
0
ezaviel said:
Dimitriov said:
ezaviel said:
Dimitriov said:
Of course they were in the right. Good God people!

If your refrigerator suddenly became sentient you wouldn't let it stop keeping your food cold: that would be ridiculous.

The Geth were quite literally, and in every conceivable sense, PROPERTY. No more.

You might be able to argue that it would be different if the Quarians had intentionally created an AI, but they didn't.

And seriously, they were sentient. So what? What on Earth and beyond does that have to do with anything? They were still just tools that were no longer functioning properly.
"Property" which can operate and think independantly is no longer property, it is now a sentient being.

In the terms of most moral codes, something capable of sentient thought and reasoning is attributed the same rights as a "person".

Continuing to force a sentient machine to operate as a tool would be slavery. "Shutting it down" is murder. Once the Geth became sentient, they morally became "people".

As there is no real world analog I am unclear how you can claim that there is a provision for non-human sentience in "most moral codes."

At any rate I disagree in the strongest terms. If your property becomes sentient it does not gain rights, or it would have stolen itself from you.


Affording rights to machinery is stupid.
African Americans were once considered non-human property. They were subsequently given rights because they were finally recognised as being people. How's that for a real world analog?
It's a very poor point. Black people are human. Whether someone considered them to be so or not is irrelevant. Geth are not human, whether someone considers them to be equivalent or not is irrelevant.
 

pixiejedi

New member
Jan 8, 2009
471
0
0
The Quarians shouldn't have attacked the Geth, but as Mordin says, considering all possible outcomes, a pre-emptive strike was really the only way for the Quarians to handle the situation without looking bad in the galactic arena.

As easy as it is to look at Legion and see a cool, methodical, artificial being, his whole loyalty mission is about dealing with a large splinter group of Geth who disagree with the rest of the Geth. If the Quarians had chosen diplomacy, there could very easily have been a splinter group who would sneak-attack the Quarians for reaching a different consensus than the rest of the Geth.
 

guitarsniper

New member
Mar 5, 2011
401
0
0
The Quarians would have been perfectly justified in destroying the Geth BEFORE they gained self-awareness; before that, it would have been the equivalent of throwing out the trash or whatever. After, it's basically genocide. The Quarians should try to reason with the Geth, but the Geth should also be willing to make concessions.
 

KingofMadCows

New member
Dec 6, 2010
234
0
0
We really can't apply any human/organic standards to the Geth mainly because we use metaphysical standards to judge the value of humans. For example, we don't mercy kill people who have suffered extensive brain damage or were born with severe mental retardation. We believe that human life has some intrinsic value regardless of whether or not the person is sapient or even self aware because humans have "souls" or "minds" that can have value and can be judged independently of the physical body.

With artificial intelligence, there is a specific point at which it achieves some kind of worth. They don't begin with any rights or intrinsic value. A Geth is not protected from destruction by virtue of the fact it is a Geth in the same way that a human is protected from destruction by virtue of the fact that he or she is a human. A Geth that has not achieved sapience is just a tool that can be destroyed without consequence. However, as soon as it achieves sapience, it gains the rights that humans are automatically born with.

Problems arise precisely because we use objective and testable physical standards to judge the sapience and value of robots/computers but we still use untestable metaphysical standards to judge the value of humans/organics.
 

ezaviel

New member
Mar 26, 2011
55
0
0
Dimitriov said:
ezaviel said:
Dimitriov said:
ezaviel said:
Dimitriov said:
Of course they were in the right. Good God people!

If your refrigerator suddenly became sentient you wouldn't let it stop keeping your food cold: that would be ridiculous.

The Geth were quite literally, and in every conceivable sense, PROPERTY. No more.

You might be able to argue that it would be different if the Quarians had intentionally created an AI, but they didn't.

And seriously, they were sentient. So what? What on Earth and beyond does that have to do with anything? They were still just tools that were no longer functioning properly.
"Property" which can operate and think independantly is no longer property, it is now a sentient being.

In the terms of most moral codes, something capable of sentient thought and reasoning is attributed the same rights as a "person".

Continuing to force a sentient machine to operate as a tool would be slavery. "Shutting it down" is murder. Once the Geth became sentient, they morally became "people".

As there is no real world analog I am unclear how you can claim that there is a provision for non-human sentience in "most moral codes."

At any rate I disagree in the strongest terms. If your property becomes sentient it does not gain rights, or it would have stolen itself from you.


Affording rights to machinery is stupid.
African Americans were once considered non-human property. They were subsequently given rights because they were finally recognised as being people. How's that for a real world analog?
It's a very poor point. Black people are human. Whether someone considered them to be so or not is irrelevant. Geth are not human, whether someone considers them to be equivalent or not is irrelevant.
I disagree, considering your whole post was about what "property" can and cannot do.

Based on your argument, only humans get rights; the Quarians, Elcor, Hanar and all the other races don't. None of these are human. Unless you mean "human" to mean "sentient", at which point the Geth qualify, as they are also sentient. To say "you are made of different material to me, therefore you have no rights" is a bizarre line of logic in and of itself.

The classic philosophers argued that "I think, therefore I am." They did not argue, "I think, therefore I am, so long as I am made of meat."
 

Terminal Blue

Elite Member
Legacy
Feb 18, 2010
3,933
1,804
118
Country
United Kingdom
DTWolfwood said:
They don't "make" the next generation as how they see fit, they can only teach and pray they turn out the way previous generation wants them to be.
That's merely due to an inadequate degree of control of the "mechanisms" of human reproduction. That kind of control will (in all likelihood) be possible within the next few human generations, and then it will become very relevant for all of us to sit down and decide how intelligent we want our children to be, and what behavioural tendencies we want to encourage or eliminate.

Biological processes are not necessarily any more random or complex than mechanical ones. They only appear so through lack of knowledge and control. By the time it becomes possible to build functioning machine intelligence, it will almost certainly be possible to exercise the same degree of control over the creation of living things; and that is not to say complete control. There will always come a point where chaos theory takes over, where the sheer complexity of the system you are dealing with produces an unexpected reaction, whether that system is organic or machine.

This clearly occurred with the Geth. There came a point where the Geth ceased to be simple, predictable machines and became sufficiently complex in their operations to begin demonstrating understanding (by asking questions based on concepts and data they had not been programmed with). At this point, what the Quarians had wanted or had tried to precondition ceased to be relevant.

Tali actually explains in ME1 why the Quarians wanted to shut down the Geth. It isn't because they didn't consider them alive; it's because they knew that intelligent beings would resent their position as slaves. Even the Quarians knew the Geth were more than machines at that point; their choice was based on simple xenophobia and panic.

DTWolfwood said:
It's the same as all those tiny robot experiments that have been done: the more stupid machines you link together, the smarter their actions seem to become. (Swarm Intelligence [http://en.wikipedia.org/wiki/Swarm_intelligence]) They are nonetheless artificial.
That's anthropomorphism: you've assumed that intelligence must resemble (individual) human intelligence in its function, not even just in its output, in order to "count" as intelligence. Why?

At the end of the day it doesn't matter how something thinks. We don't ask whether minor neurological differences between humans make one capable of "real" intelligence and another only capable of unconsciously mimicking intelligence.

The only remotely logical argument against the Geth being intelligent is the "Chinese room" argument, that being able to produce symbolic data doesn't imply understanding. Yet the Geth clearly create and innovate on their own programmed data. They design their own starships and space stations, they conduct scientific research, they rewrite their own programming and evolve their own software to counteract threats, they come to opinions and conclusions which directly contradict their programmed role. They've broken the first axiom of the Chinese room, that programming is purely syntax without understanding and that all machines can do is move symbols around.

The only rational principle in this case is to assume intelligence: it doesn't matter whether the intelligence is "real"; it matters that it is demonstrable.
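As an aside, the "linking makes them smarter" effect described in the quoted swarm-intelligence post can be illustrated with a toy sketch (purely illustrative; the names `unit_estimate` and `linked_estimate` are invented here, not from any real swarm framework): each unit on its own makes a noisy, unreliable guess at some quantity, but pooling guesses across a growing network drives the collective error down.

```python
import random

def unit_estimate(truth, rng, noise=10.0):
    """One simple unit on its own: a noisy, individually unreliable guess."""
    return truth + rng.uniform(-noise, noise)

def linked_estimate(truth, n_units, rng):
    """n linked units pool their guesses; the consensus is their mean."""
    return sum(unit_estimate(truth, rng) for _ in range(n_units)) / n_units

rng = random.Random(42)
for n in (1, 10, 1000):
    error = abs(linked_estimate(100.0, n, rng) - 100.0)
    print(f"{n:5d} linked units -> collective error {error:.2f}")
```

A single unit is typically off by several whole units, while a thousand linked units land within a small fraction of one; no individual unit got any smarter, only the network did, which is the crux of the Geth analogy.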
 

Cette

Member
Legacy
Dec 16, 2011
177
0
1
Country
US
KingofMadCows said:
We really can't apply any human/organic standards to the Geth mainly because we use metaphysical standards to judge the value of humans. For example, we don't mercy kill people who have suffered extensive brain damage or were born with severe mental retardation. We believe that human life has some intrinsic value regardless of whether or not the person is sapient or even self aware because humans have "souls" or "minds" that can have value and can be judged independently of the physical body

Define "WE" in this instance. Because from where I'm standing a dog that could talk and carry a philosophical debate would have more inherent right to life than a person who's born barely able to function. It's harsh I know and not a popular opinion but don't go assuming everyone agrees on something like that.

And quite frankly I'd far rather someone mercy kill me if I were ever severely brain damaged to the point of not even being myself or a full person anymore.

So yeah, given that statement, trying to eradicate another sentient being that you are directly responsible for seems like a pretty shitty thing to do. And if you end up on the wrong end of the reprisal, well, you're the one who made it a fight to the death to begin with.
 

Dimitriov

The end is nigh.
May 24, 2010
1,215
0
0
ezaviel said:
Dimitriov said:
ezaviel said:
Dimitriov said:
ezaviel said:
Dimitriov said:
Of course they were in the right. Good God people!

If your refrigerator suddenly became sentient you wouldn't let it stop keeping your food cold: that would be ridiculous.

The Geth were quite literally, and in every conceivable sense, PROPERTY. No more.

You might be able to argue that it would be different if the Quarians had intentionally created an AI, but they didn't.

And seriously, they were sentient. So what? What on Earth and beyond does that have to do with anything? They were still just tools that were no longer functioning properly.
"Property" which can operate and think independantly is no longer property, it is now a sentient being.

In the terms of most moral codes, something capable of sentient thought and reasoning is attributed the same rights as a "person".

Continuing to force a sentient machine to operate as a tool would be slavery. "Shutting it down" is murder. Once the Geth became sentient, they morally became "people".

As there is no real world analog I am unclear how you can claim that there is a provision for non-human sentience in "most moral codes."

At any rate I disagree in the strongest terms. If your property becomes sentient it does not gain rights, or it would have stolen itself from you.


Affording rights to machinery is stupid.
African Americans were once considered non-human property. They were subsequently given rights because they were finally recognised as being people. How's that for a real world analog?
It's a very poor point. Black people are human. Whether someone considered them to be so or not is irrelevant. Geth are not human, whether someone considers them to be equivalent or not is irrelevant.
I disagree, considering your whole post was about what "property" can and cannot do.

Based on your argument, only humans get rights; the Quarians, Elcor, Hanar and all the other races don't. None of these are human. Unless you mean "human" to mean "sentient", at which point the Geth qualify, as they are also sentient. To say "you are made of different material to me, therefore you have no rights" is a bizarre line of logic in and of itself.

The classic philosophers argued that "I think, therefore I am." They did not argue, "I think, therefore I am, so long as I am made of meat."
Yes, I believe only humans should have rights, at least as far as human law is concerned. Black people were not built by Europeans; they were kidnapped and enslaved... if you can't see the glaring difference there then I really don't know what to say.

And Descartes used "Cogito ergo sum" to try to prove the existence of God... it's really far less applicable than most people like to think.
 

KingofMadCows

New member
Dec 6, 2010
234
0
0
Cette said:
Define "WE" in this instance. Because from where I'm standing a dog that could talk and carry a philosophical debate would have more inherent right to life than a person who's born barely able to function. It's harsh I know and not a popular opinion but don't go assuming everyone agrees on something like that.

And quite frankly I'd far rather someone mercy kill me if I were ever severely brain damaged to the point of not even being myself or a full person anymore.

So yeah, given that statement, trying to eradicate another sentient being that you are directly responsible for seems like a pretty shitty thing to do. And if you end up on the wrong end of the reprisal, well, you're the one who made it a fight to the death to begin with.
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?

Also, a brain damaged or severely retarded human with intelligence equivalent or lower than that of a dog still has more rights than a dog. Most people will have more trouble mercy killing a person in a vegetative state than a normal healthy dog.
 

Loop Stricken

Covered in bees!
Jun 17, 2009
4,723
0
0
KingofMadCows said:
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?
And if a machine has the wherewithal to articulate this line of questioning, it's more than likely earned those rights already on account of actually being self-aware.
 

KingofMadCows

New member
Dec 6, 2010
234
0
0
Loop Stricken said:
KingofMadCows said:
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?
And if a machine has the wherewithal to articulate this line of questioning, it's more than likely earned those rights already on account of actually being self-aware.
OK, so what's your point?

The fact that it has earned those rights isn't the problem. The problem is that it had to prove itself in the first place in order to earn rights that humans naturally have.
 

Dimitriov

The end is nigh.
May 24, 2010
1,215
0
0
KingofMadCows said:
Loop Stricken said:
KingofMadCows said:
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?
And if a machine has the wherewithal to articulate this line of questioning, it's more than likely earned those rights already on account of actually being self-aware.
OK, so what's your point?

The fact that it has earned those rights isn't the problem. The problem is that it had to prove itself in the first place in order to earn rights that humans naturally have.
"Natural rights" is a misnomer. There is no such thing as an actual natural right. In reality there are only two kinds of rights: those that are given and those that are taken.

Since in reality we only know about human laws and rights that's all we have to work from.

...and it in no way benefits us to give rights to machines: in point of fact it's a really stupid and counterproductive idea.
 

Cette

Member
Legacy
Dec 16, 2011
177
0
1
Country
US
KingofMadCows said:
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?

And my point is that I don't see people as inherently having those rights in the way you do. People do kill the severely brain damaged all the time; the families of those people have to make the call on whether or not to keep up life support. And at the end of the day, you can make a hell of a lot better case for humanely ending a person's life when they're brain dead than for killing a perfectly healthy dog.

Just because it's hard


"That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?"


EXACTLY! Who makes the call that we have those rights?

This was my point. Who is the WE that you're referring to that accepts this as the standard? I'm not saying a majority don't, but you can't just assume everyone agrees with a subjective point if you're going to build your argument on it as a base.


Dimitriov said:
"Natural rights" is a misnomer. There is no such thing as an actual natural right. In reality there are only two kinds of rights: those that are given and those that are taken.
Quite so.
 

KingofMadCows

New member
Dec 6, 2010
234
0
0
Dimitriov said:
"Natural rights" is a misnomer. There is no such thing as an actual natural right. In reality there are only two kinds of rights: those that are given and those that are taken.

Since in reality we only know about human laws and rights that's all we have to work from.

...and it in no way benefits us to give rights to machines: in point of fact it's a really stupid and counter productive idea.
Except we're not talking about reality. We're talking about a scenario in which machines can become sapient.

Cette said:
KingofMadCows said:
My point is that a human does not need to meet any standards of intelligence or reason simply to have a right to life. You can't just kill a human who's severely brain damaged. A machine has to meet certain standards of intelligence in order to have that same basic right. That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?

And my point is that I don't see people as inherently having those rights in the way you do. People do kill the severely brain damaged all the time; the families of those people have to make the call on whether or not to keep up life support. And at the end of the day, you can make a hell of a lot better case for humanely ending a person's life when they're brain dead than for killing a perfectly healthy dog.

Just because it's hard
Ending someone's life support is not the same thing as actually killing them. If someone is not able to survive on their own then they're dying by natural causes. It's not the same thing as giving a lethal injection to stray animals or breaking apart a machine, stripping the valuable parts and throwing the rest in a trash compactor.

Also, I never said that this is the way I see people. I said that this is the way that humans have always judged each other, using untestable metaphysical standards involving ideas like "soul" or "mind."

"That's not exactly fair when you look at it from the machine's point of view. Why are all humans born with certain rights while machines must prove themselves to earn those rights?"


EXACTLY! Who makes the call that we have those rights?

This was my point. Who is the WE that you're referring to that accepts this as the standard? I'm not saying a majority don't, but you can't just assume everyone agrees with a subjective point if you're going to build your argument on it as a base.
Except your point is the same as my point.

And by we, I mean human society. You as an individual may have different views, but by modern-day law and culture, humans are born with rights by virtue of the fact that they are human. You can't buy a severely retarded person or someone with brain damage and euthanize them. If you try, you're going to be arrested and charged with murder. You can, however, buy a machine and destroy it any time you want. With the right certifications and under the right conditions, you can do that with animals too. The punishment for destroying a machine that you don't own or killing an animal illegally is much less severe than for killing a person, even if that person is a vegetable.

Assuming Quarian society is anything like human society, they have similar rules. In fact, every race in ME probably has similar rules concerning sapient organic species.
 

Zydrate

New member
Apr 1, 2009
1,914
0
0
I usually opt to rewrite them. Extra Credits brought up a good point, but friendly Geth equals good Geth.

For the poll; yes. They became sentient, and it became a war. They had the right to defend themselves, just as the Geth had the right to start it. But they should both end it.
 

s0p0g

New member
Aug 24, 2009
807
0
0
as the situation quickly developed into a "kill or be killed" problem: yes, the Quarians were in the right

i mean, if i were a mad scientist and gave my toaster consciousness, and it tried to, well, toast me, you bet i'd have it destroyed.

now, whether jumping so quickly to the ultima ratio was right or wrong is a completely different thing