Withall said:
If -ONLY- there is no other option. Genocide should NEVER be an at-hands-ready option. Never. However, if there is -NO OTHER- option.

Well, I suppose that's a bit better, but the idea that you would bring it up like that is still cause for concern.
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?

Or we could just eliminate the rapists and pedos. Preferably in a slow and painful fashion.
mrdude2010 said:
Or we could just eliminate the rapists and pedos. Preferably in a slow and painful fashion.

That would be the best... but if they're attracted to children or power, it doesn't make them a bad person unless they actually rape them. If a person has those feelings but doesn't want to act on them, we should help them get over them instead of just killing them.
Mintaro said:
...many people who had previously been holding back those impulses (or finding safe ways to let them out), would suddenly be able to indulge them with abandon. Which would of course desensitise people to them.

To answer your question in another form: what makes us human is the ability to use our logic to better ourselves. By recognising destructive tendencies in ourselves, and making an active choice not to indulge in them, even using our powers of self-control to abolish them from our minds, we create in their place a person better than before. The ability to control our social evolution is what has made us so powerful to begin with.

If one has violent or 'sick' impulses, one should seek help in controlling them. If one simply cannot control them, then one needs to find a safe way to let them out.

Firstly, how would this be different from the other "safe ways" of venting those undesirable impulses? If a guy is sexually frustrated, he can shout sexist comments at the women who walk by his construction site, or he could go watch porn in his basement. Both are vents, but only one is harmful (a little). I doubt that having a new outlet for sexually confused or frustrated individuals would lead to any sort of "devolution."
Mr.Petey said:
Nothing can be gained from treating any living thing as a sexual object, regardless of artificial creation. Humanity already has the exact image of what it's like to endure that level of suffering, some moreso than others even.

...what's to stop them eventually gaining enough intelligence and perception for them to demand not to be treated this way...?

I object to the kind of argumentation in the first quoted statement. It suggests that all living things are capable of suffering and deserve the humane treatment humans receive. You might be able to make that argument for all mammals, even, but certainly not "all living things." What about trees? We already use them as objects of decoration and profit; why not sexual gratification (don't think about that too hard, please)? The qualities that make something deserve special treatment are things like sentience and the ability to feel hurt or wronged. Your argument is irrelevant because I explicitly stated in the OP that there would be no possibility of suffering, or even understanding in the human sense. As for the second thought: since this is a hypothetical situation, I can postulate that they won't gain intelligence, but logically that doesn't seem to make sense anyway. I'm talking about replacing or removing the parts of their brain and body that would be responsible for sentience. And it's a given that they won't reproduce, so it's not as though they could "evolve" intelligence. Just no.
Shynobee said:
Well, sex topic aside, the part about using them for medical reasons would be awesome.

A major problem today is how surgeons are "rated." If one surgeon does a specific surgery really well, and fairly consistently, everyone will want to see that surgeon for that problem. This creates one overcrowded surgeon, a waiting list for the surgery, and no opportunity for other doctors to get good at it.

But if we had living test bodies, any doctor could practice the surgery, and we could have a plethora of skilled doctors, removing the need for waiting lists and getting rid of the pointless deaths that occur because of them.

I never thought of it like that. See? Good idea.
Miles Tormani said:
What if it was a virtual reality situation in which there is no "doll," but a virtual representation of someone who seems as much like a human being as they can be, but still an AI? I'm sure a lot of the angry types tend to feel a lot less angry after a typical Prototype rampage. Why couldn't the same thing work for the sexually frustrated types?

I dig that too. I'm thinking something like the Matrix, where it just plugs straight into your senses. Though that scenario didn't work out too well for Keanu Reeves, lol.
Kyuubi Fanatic said:
It says more about who we are when we have such things and choose whether or not to still do such actions. Even if the object couldn't feel pain or remember abuse, the act itself is wrong, and we are simply using a replacement because we "can get away with it".

In the end we simply justify it to ourselves and those around us. If it needs justification to not be amoral, it probably still is.

There is a fundamental question about morality that still hasn't been answered here: what constitutes immorality? If it isn't just something that does harm to another (however that is defined), then what?
Miles Tormani said:
Let's look at it from another angle.

What if it was a virtual reality situation in which there is no "doll," but a virtual representation of someone who seems as much like a human being as they can be, but still an AI? I'm sure a lot of the angry types tend to feel a lot less angry after a typical Prototype rampage. Why couldn't the same thing work for the sexually frustrated types?

(EDIT: I'm not going to post my own conclusion on that because I'm not entirely certain myself. Just bringing up a different line of thinking for the OP's scenario.)

Exposing yourself to a particular behaviour in some distanced or reduced form doesn't stop a person from wanting to do it; it usually has the opposite effect. I recently read a study in which half a group played competitive video games and the other half played cooperative puzzle-solving games. They were then asked to work with a partner to answer some questions. The ones who had played the competitive games remained competitive, and vice versa. The point being, they didn't use up their competitive or cooperative urges for the day; the games encouraged them.
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?

This actually seems like a better idea than what the OP said. I completely agree.
Yoshemo said:
That would be the best... but if they're attracted to children or power, it doesn't make them a bad person unless they actually rape them. If a person has those feelings but doesn't want to act on them, we should help them get over them instead of just killing them.

Of course; I was talking about people who acted on those feelings, because once they act on them they can't be rehabilitated.
TheIronDuke said:
Exposing yourself to a particular behaviour in some sort of distanced or reduced form doesn't stop a person from wanting to do something. It usually has the opposite effect. I recently read a study which involved half a group playing competitive video games and the other half playing cooperative puzzle-solving games. The ones playing the competitive games remained competitive, and vice versa. The point being they didn't use up their competitive or cooperative urges for the day; it encouraged them.

This is a serious slippery-slope argument at best. Furthermore, a citation would be appreciated.
Moral issues aside, this is a bad idea, for the same reason that expressing your anger with violence on a pillow, for example, is a bad idea. Venting doesn't mean you use up your violence quota and will no longer want to use violence for the next 24 hours. It just means you're more predisposed to use violence in situations where you're angry: using violence becomes how you handle your violent urges, so it becomes more natural to do. We all have violent urges when we're very angry, but satisfying them isn't the way to stop them; it just means satisfying them becomes the norm for releasing anger. Now, people who hit pillows obviously don't all escalate into axe murderers, but they are more likely to break things when there is no pillow around, or even move up to striking animate objects, pets or people. I'd say the same would apply to people who satisfy their darker urges on human proxies. Like the pillow, they become used to simply satisfying these urges whenever they have them.
Also, in the same way that punching a pillow doesn't come close to fulfilling the desire to flatten the nose of the person inspiring your anger, I'm sure people with pedophilic, rape/domination, or murderous urges would not be nearly satisfied with defiling what would essentially be a more lifelike blow-up doll. So you'd have all these people used to, and possibly even encouraged by the creators of these human proxies in, "venting" their urges on the proxies: accustomed to fulfilling their desires, yet thoroughly unsatisfied with the reaction from something that doesn't care that you're exercising your need for dominance on it, and still wishing they could really punch that mocking nose instead of the uncaring pillow.
Not to mention the people who would never even think about doing this to a real person, but become curious about experimenting on a proxy. People are inherently curious. If they had a chance to do something of this nature while being assured they wouldn't be hurting anyone, hey, why not give it a try? Because those words have never led to trouble.
summerof2010 said:
Your argument is irrelevant because I explicitly stated in the OP that there would be no possibility of suffering, or even understanding in the human sense.

My argument is irrelevant? That's very nice, thank you ^_^
Mintaro said:
If we created creatures, or robots for that matter, with the express purpose of absorbing our violent and cruel tendencies, we would be enabling ourselves to maintain those tendencies. That would stunt our psychological and cultural growth; in essence, we would cease to evolve mentally. In fact, it is more likely that we would devolve, as many people who had previously been holding back those impulses (or finding safe ways to let them out) would suddenly be able to indulge them with abandon. Which would of course desensitise people to them.

To answer your question in another form: what makes us human is the ability to use our logic to better ourselves. By recognising destructive tendencies in ourselves, and making an active choice not to indulge in them, even using our powers of self-control to abolish them from our minds, we create in their place a person better than before. The ability to control our social evolution is what has made us so powerful to begin with.

Whelp, guess it's time to pull out the old 40K lore and put some deep questions on the table.
Mr.Petey said:
My argument is irrelevant? That's very nice, thank you ^_^

I was actually afraid you'd be offended... whatever, nice surprise that you took it well, I guess.
Mr.Petey said:
I still don't feel it is right for a human to take sexual gratification in such degenerate behaviour on an artificial being and exploit it, regardless of whether it can feel or not. It's just unnatural, really, in my opinion.

The only act of sexual behaviour should be with a consenting member of the same species, not just some clone grown without any thought processes aside from pleasure, or whatever is left behind if everything is switched off. It can perhaps detach the human element in that equation from any guilt or emotional attachment, but it really doesn't make it right to create life for our own pleasures. Sorry, I just disagree on my personal principles. Plus I found the last part of the OP a little disturbing regarding skull fucking a baby. That could have been tidied up, but that's just me and I won't go on about it any further!

Yes, the skull fucking a baby thing is really an inherently upsetting image, lol. Try to see around your gut reaction, though, and don't base your justifications on your internal biases.
BlackWidower said:
Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.

What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?

Daffy F said:
This actually seems like a better idea than what the OP said, I completely agree.

...what idea of mine are you talking about specifically? I confus.
summerof2010 said:
What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?

Its organic nature. It might still possess a soul, if souls even exist, and there is no guarantee they would feel no pain, not really.
BlackWidower said:
Its organic nature. It might still possess a soul, if souls even exist, and there is no guarantee they would feel no pain, not really.

With an android, you could easily prevent it from feeling pain by not programming it into them.

Seriously, what do you think is more reliable: removing pain, or not including it in the first place?

Souls don't exist. That's stupid. The "soul" is barely even defined in the first place; there is no evidence supporting its existence, and there are better, empirical explanations for the few, vague things the soul is consistently said to be responsible for. I have absolutely no idea why some people (not you, apparently) insist on the existence of the damn things when they won't even take the time to decide what they are. Speaking of which, even if they did exist, their presence in the analogues would be irrelevant, because no one can seem to pin down what the things are for or how they work. (I don't mean to say you're stupid, but you're certainly not making an intelligent point by bringing this up in this conversation. I hope you're not offended.)
summerof2010 said:
Souls don't exist. That's stupid. The "soul" is barely even defined in the first place; there is no evidence supporting its existence, and there are better, empirical explanations for the few, vague things the soul is consistently said to be responsible for.

Well, then you don't get the AI to act emotionally. That's how you avoid that. They are sex toys.
Now then, on the subject of "guaranteeing" there will be no pain -- again, it's a hypothetical. When I say, "If x, then would y still apply?" it's completely dodging the question to just go, "Well, x is impossible."
I don't think it would be so difficult to determine the processes that cause physical pain and interrupt them, nor do I think it would be that difficult to quantify pain to the point that we can tell whether a person is experiencing it, at least in the case of clones that have no rights or purpose except to be prodded, scanned, and redesigned. Scientists already understand the chemical processes that control pain (painkillers, neurotoxins, etc.), and we even have real-life examples of people who cannot feel pain as a result of genetic defects [http://abcnews.go.com/GMA/OnCall/story?id=1386322]. In a few decades it could be easy; in a century or two, elementary. There are much more complex problems to deal with than pain regulation, like keeping the body from killing itself (see the linked article), which would require some sort of removal of independent action.
As for emotional pain, that would fall under one of those "more complex problems." But you would run into the exact same difficulties designing an AI to act emotionally without "having emotions," despite the tendency of the glib to assume that emotion is limited to things with "life" (spiritual bull, in the same conversation as souls); the problems would just be arrived at in reverse, working from "human" down. These sorts of things are way out there, beyond what we can see in currently developing science, but the point is that "life" is not as special as most think. It's about thought and emotion. What makes something "alive" is nothing more than the action of certain processes that we call biological; the definition of the term includes nothing important to the subject at hand.
You can ignore the spoiler, as it really is a completely separate argument, but I stand by what I say in it.