If it walks like a person, talks like a person, should you rape it?


Withall

New member
Jan 9, 2010
If -ONLY- there is no other option. Genocide should NEVER be an at-the-ready option. Never. However, if there is -NO OTHER- option.
 

jthm

New member
Jun 28, 2008
Wouldn't help, rapists and pedophiles are more attracted to emotional states of distress and vulnerability than they are to children or sexual acts in general. They obsess over control, or misplaced affection or a warped need to be loved. A braindead human wouldn't mean much to them. That said, it would open up exciting new possibilities for us regular old perverts. I'd buy braindead, vat-grown clone girl (though I like my women post pubescent, thanks) to replace my fleshlight, so long as I was certain that she had no brainwaves, sentience or awareness.
 

Miles Tormani

New member
Jul 30, 2008
Withall said:
If -ONLY- there is no other option. Genocide should NEVER be an at-the-ready option. Never. However, if there is -NO OTHER- option.
Well, I suppose that's a bit better, but the idea that you would bring it up like that is still cause for concern.
 

mrdude2010

New member
Aug 6, 2009
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?
or we could just eliminate the rapists and pedos. preferably in a slow and painful fashion.
 

Yoshemo

New member
Jun 23, 2009
mrdude2010 said:
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?
or we could just eliminate the rapists and pedos. preferably in a slow and painful fashion.
That would be the best... but if they're attracted to children or power, it doesn't make them a bad person unless they actually rape em. If a person has those feelings but doesn't want to act on them, we should help them get over them instead of just killing them
 

Unesh52

New member
May 27, 2010
Mintaro said:
...many people who had previously been holding back those impulses (or finding safe ways to let them out), would suddenly be able to indulge them with abandon. Which would of course desensitise people to them.

To answer your question in another form; what makes us human is the ability to use our logic to better ourselves. By recognising destructive tendencies in ourselves, and making an active choice not to indulge in them, even to use our powers of self-control to abolish them from our minds. Creating in their place a person better than before. The ability to control our social evolution. It is what has made us so powerful to begin with.

If one has violent or 'Sick' impulses, one should seek help in controlling them. If one simply cannot control them then one needs to find a safe way to let them out.
Firstly, how would this be different from the other "safe ways" of venting those undesirable impulses? If a guy is sexually frustrated, he can shout sexist comments to the women who walk by his construction site, or he could go watch porn in his basement. Both are vents, but only one is harmful (a little). I doubt that having a new outlet for sexually confused or frustrated individuals would lead to any sort of "devolution."

On the other hand, I acknowledge that certain impulses (such as burning your neighbor's cat) do need to be controlled, and that with proper focus these desires can ostensibly cease to exist. However, there seems to be just as much merit in venting, so to speak. Trying to contain your desires and forcing yourself to reject them, even ones which feel perfectly natural and justified (in your mind), has its own problems. Just look at closet homosexuals (homosexuality is not a dangerous impulse, I'm just trying to illustrate that suppressing one's desires can lead to depression and a loss of self-worth, etc.).

Also, I find it interesting that you suggest these types of actions would be desensitizing, since I was just talking about Jack Thompson and the relationship between video games and my topic in the edit of the OP. Of course, exploding some alien in Halo is not the same as genetically engineering humans for sex slavery (never thought I'd say that sentence), but I think the function of "desensitization" is similar in both. It's just not the same -- doing something in one medium is not the same as doing it "for real," and rational people can make the distinction. Just because it's cool to rape sex bot 3000 while it screams doesn't mean it's ok to abduct your neighbor's daughter. I don't think having these analogues would lead to people committing actual sex crimes due to "desensitization."

Mr.Petey said:
Nothing can be gained from treating any living thing as a sexual object regardless of artificial creation. Humanity already has the exact image of what it's like to endure that level of suffering, some moreso than others even.

...what's to stop them eventually gaining enough intelligence and perception for them to demand not to be treated this way...?
I object to the kind of argumentation in the first quoted statement. It suggests that all living things are capable of suffering and deserve humane treatment such as humans receive. You might be able to make that argument for all mammals even, but certainly not "all living things." What about trees? We already use them as objects of decoration and profit, why not sexual gratification (don't think about that too hard please)? The qualities which cause something to deserve special treatment are such things as sentience and the ability to feel hurt or wronged. Your argument is irrelevant because I explicitly stated in the OP that there would be no possibility of suffering or even understanding, in the human sense. As for the second thought, since this is a hypothetical situation I can postulate that they won't gain intelligence, but logically, that doesn't seem to make sense anyway. I'm talking about replacing or removing the parts of their brain and body that would be responsible for sentience. And it's sort of a given that they won't reproduce, so it's not like they could "evolve" intelligence. Just no.

Shynobee said:
Well, sex topic aside, the part about using them for medical reasons would be awesome.

A major problem today is how surgeons are "rated." If one surgeon does a specific surgery really well, and fairly consistently, everyone will want to see that surgeon for that problem. What this creates is one surgeon who is overbooked, creating a waiting list for the surgery, and also no other doctors able to get good at this surgery.

But if we had living test bodies, any doctor could practice the surgery, and we could have a plethora of skilled doctors, removing the need for waiting lists, and getting rid of the pointless deaths that occur because of them.
I never thought of it like that. See? Good idea.

Miles Tormani said:
What if it was a virtual reality situation in which there is no "doll," but a virtual representation of someone who seems as much like a human being as they can be, but still an AI? I'm sure a lot of the angry types tend to feel a lot less angry after a typical Prototype rampage. Why couldn't the same thing work for the sexually frustrated types?
I dig that too. I'm thinking something like the Matrix, where it just plugs straight into your senses. Though that scenario didn't work out too well for Keanu Reeves, lol.

Kyuubi Fanatic said:
It says more about who we are when we have such things and choose whether or not to still do such actions. Even if the object couldn't feel pain or remember abuse, the act itself is wrong, and we are simply using a replacement because we "can get away with it".

In the end we simply justify it to ourselves and those around us. If it needs justification to not be amoral, it probably still is.
There is a fundamental question about morality that still hasn't been answered here -- what constitutes immorality? If it's not just something that does harm to another (however that's defined), then what?

Man, there are so many good posts, I haven't even been on the second page yet. This is too long already though.

[/long post]
 

TheIronDuke

New member
Nov 19, 2009
Miles Tormani said:
Let's look at it from another angle.

What if it was a virtual reality situation in which there is no "doll," but a virtual representation of someone who seems as much like a human being as they can be, but still an AI? I'm sure a lot of the angry types tend to feel a lot less angry after a typical Prototype rampage. Why couldn't the same thing work for the sexually frustrated types?

(EDIT: I'm not going to post my own conclusion on that because I'm not entirely certain myself. Just bringing up a different line of thinking for the OP's scenario.)
Exposing yourself to a particular behaviour in some sort of distanced or reduced form doesn't stop a person from wanting to do something. It usually has the opposite effect. I recently read a study which involved half a group playing competitive video games and the other half playing cooperative puzzle-solving games or something. They were then asked to participate with a partner to answer some questions. The ones playing the competitive games remained competitive, and vice versa. The point being they didn't use up their competitive or cooperative urges for the day; it encouraged them.

Moral issues aside, this is a bad idea, for the same reason that expressing your anger with violence on a pillow, for example, is a bad idea. It doesn't mean you use up your violence quota, and will no longer want to use violence for the next 24 hours. It just means you're more predisposed to use violence in situations when you're angry. Using violence becomes how you manage your violent urges, so it becomes more natural to do. We all have violent urges when we're very angry, but satisfying them isn't the way to stop them; it just means satisfying them becomes the norm for releasing anger. Now, people who hit pillows obviously don't all eventually escalate into axe murderers, but are more likely to break things when there is no pillow around, or even move up to striking animate objects, pets or people. I'd say the same would apply to people who satisfy their darker urges with human proxies. Like the pillow, they become used to just satisfying these urges when they have them.

Also, in the same way that punching a pillow doesn't come close to fulfilling the desire to flatten the nose of the person inspiring your anger, I'm sure the people with pedophilia or rape/domination or murder urges will not be nearly satisfied with defiling what would essentially be a more lifelike blowup doll. So you'd have all these people used to, possibly even encouraged by the creators of these human proxies, "venting" their urges on the proxies, becoming used to fulfilling their desires and at the same time being thoroughly unsatisfied with the reaction from something that doesn't care that you're exercising your need for dominance on it and still wishing they could really punch that mocking nose instead of the uncaring pillow.

Not to mention the people who would never even think about doing this to a real person, but become curious about experimenting with pillow-punching a proxy. People are inherently curious. If they had a chance to do something of this nature while being assured they wouldn't be hurting anyone, hey, why not give it a try? Because those words have never led to trouble.
 

Daffy F

New member
Apr 17, 2009
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?
This actually seems like a better idea than what the OP said, I completely agree.
 

mrdude2010

New member
Aug 6, 2009
Yoshemo said:
mrdude2010 said:
Yoshemo said:
How about instead of letting rapists and pedos get off, we find what causes the fetish for them and eliminate the problem?
or we could just eliminate the rapists and pedos. preferably in a slow and painful fashion.
That would be the best... but if they're attracted to children or power, it doesn't make them a bad person unless they actually rape em. If a person has those feelings but doesn't want to act on them, we should help them get over them instead of just killing them
Of course, I was talking about people who acted on those feelings, because once they act on them they can't be rehabilitated.
 

Miles Tormani

New member
Jul 30, 2008
TheIronDuke said:
Miles Tormani said:
Let's look at it from another angle.

What if it was a virtual reality situation in which there is no "doll," but a virtual representation of someone who seems as much like a human being as they can be, but still an AI? I'm sure a lot of the angry types tend to feel a lot less angry after a typical Prototype rampage. Why couldn't the same thing work for the sexually frustrated types?

(EDIT: I'm not going to post my own conclusion on that because I'm not entirely certain myself. Just bringing up a different line of thinking for the OP's scenario.)
Exposing yourself to a particular behaviour in some sort of distanced or reduced form doesn't stop a person from wanting to do something. It usually has the opposite effect. I recently read a study which involved half a group playing competitive video games and the other half playing cooperative puzzle-solving games or something. They were then asked to participate with a partner to answer some questions. The ones playing the competitive games remained competitive, and vice versa. The point being they didn't use up their competitive or cooperative urges for the day; it encouraged them.

Moral issues aside, this is a bad idea, for the same reason that expressing your anger with violence on a pillow, for example, is a bad idea. It doesn't mean you use up your violence quota, and will no longer want to use violence for the next 24 hours. It just means you're more predisposed to use violence in situations when you're angry. Using violence becomes how you manage your violent urges, so it becomes more natural to do. We all have violent urges when we're very angry, but satisfying them isn't the way to stop them; it just means satisfying them becomes the norm for releasing anger. Now, people who hit pillows obviously don't all eventually escalate into axe murderers, but are more likely to break things when there is no pillow around, or even move up to striking animate objects, pets or people. I'd say the same would apply to people who satisfy their darker urges with human proxies. Like the pillow, they become used to just satisfying these urges when they have them.

Also, in the same way that punching a pillow doesn't come close to fulfilling the desire to flatten the nose of the person inspiring your anger, I'm sure the people with pedophilia or rape/domination or murder urges will not be nearly satisfied with defiling what would essentially be a more lifelike blowup doll. So you'd have all these people used to, possibly even encouraged by the creators of these human proxies, "venting" their urges on the proxies, becoming used to fulfilling their desires and at the same time being thoroughly unsatisfied with the reaction from something that doesn't care that you're exercising your need for dominance on it and still wishing they could really punch that mocking nose instead of the uncaring pillow.

Not to mention the people who would never even think about doing this to a real person, but become curious about experimenting with pillow-punching a proxy. People are inherently curious. If they had a chance to do something of this nature while being assured they wouldn't be hurting anyone, hey, why not give it a try? Because those words have never led to trouble.
This is a serious slippery slope argument at best. Furthermore, citation would be appreciated.

Think back to my given example: Prototype. It's obviously an avenue for people to "blow off steam," but by the logic you're using here, everyone who played the game, especially those who did so to blow off steam or for "violent tendencies," would be much more likely to find some random passerby and try to rip them in half. Same goes for the typical God of War player. Statistically, how often has this actually happened compared to various other causes for violence? How many of those have even been proven to be true?

Besides, think about it for a second. You're angry, and need to best someone at something. You pick up a first person shooter and start playing online. You do extremely well and start feeling very satisfied with yourself for not only letting off that adrenaline, but performing well despite what's angering you. Is that going to make you want to pick up an actual AK-47 and shoot up a shopping mall? Unlikely. Especially not since the game theoretically provides much more satisfaction by virtue of there being some actual competition. The rational individual would also be able to acknowledge that no one is getting hurt with this behavior, with the exception of pride.

Also, when did I ever justify the "blow up dolls"? My scenario involved a virtual reality situation. EDIT: Before you go and say there is no difference, I can think of one right now: difficulty setting. Make it harder to get into the proper situation and the thrill resurfaces.
 

Mr.Petey

New member
Dec 23, 2009
summerof2010 said:
I object to the kind of argumentation in the first quoted statement. It suggests that all living things are capable of suffering and deserve humane treatment such as humans receive. You might be able to make that argument for all mammals even, but certainly not "all living things." What about trees? We already use them as objects of decoration and profit, why not sexual gratification (don't think about that too hard please)? The qualities which cause something to deserve special treatment are such things as sentience and the ability to feel hurt or wronged. Your argument is irrelevant because I explicitly stated in the OP that there would be no possibility of suffering or even understanding, in the human sense. As for the second thought, since this is a hypothetical situation I can postulate that they won't gain intelligence, but logically, that doesn't seem to make sense anyway. I'm talking about replacing or removing the parts of their brain and body that would be responsible for sentience. And it's sort of a given that they won't reproduce, so it's not like they could "evolve" intelligence. Just no.
My argument is irrelevant? That's very nice, thank you ^_^

I still don't feel it is right for a human to take sexual gratification in such degenerate behaviour on an artificial being and exploit it, regardless of whether it can feel or not. It's just unnatural, really, in my opinion.

The only act of sexual behaviour should be with a consenting member of the same species, not just some clone grown without any thought processes aside from pleasure or whatever is left behind if everything is switched off. It can perhaps detach the human element in that equation from any guilt or emotional attachment but it really doesn't make it right to create life for our own pleasures. Sorry I just disagree on my personal principles. Plus I found the last part of the OP a little disturbing regarding skull fucking a baby. That could have been tidied up but that's just me and I won't go on about it any further :)
 

BlackWidower

New member
Nov 16, 2009
Here's my thoughts on pedophilia. Fantasize all you want about raping kids, buy a sex doll shaped like a 12-year-old, take photos of children on the street and look at them while you wank. I don't care, just don't actually do it. Fantasy is fine, acting on it is something else.

I'm reminded of all those cases of child porn involving illustrated material. If no one is getting hurt, what's the problem?

Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.
 

HellsingerAngel

New member
Jul 6, 2008
Mintaro said:
If we created Creatures, or robots for that matter, with the express purpose of absorbing our violent and cruel tendencies, we would be enabling ourselves to maintain those tendencies. Which would stunt our psychological and cultural growth. In essence we would cease to evolve mentally. In fact it is more likely that we would devolve as many people who had previously been holding back those impulses (or finding safe ways to let them out), would suddenly be able to indulge them with abandon. Which would of course desensitise people to them.

To answer your question in another form; what makes us human is the ability to use our logic to better ourselves. By recognising destructive tendencies in ourselves, and making an active choice not to indulge in them, even to use our powers of self-control to abolish them from our minds. Creating in their place a person better than before. The ability to control our social evolution. It is what has made us so powerful to begin with.
Whelp, guess it's time to pull out the old 40K lore and put some deep questions on the table.

Look at the Eldar. A clearly evolved race of beings that have decided to block out negative emotion in fear of being enveloped by it. In fact, they avoid indulging themselves in anything for fear of being enveloped by it. In this universe, the ultimate consequence is very real: being swallowed by the Warp, but in our reality you could say anyone who indulges themselves gets addicted and might as well be a demon. However, they do recognise that no race is completely void of emotion and completely ruled by logic, and as such have one path that is extremely controversial, but at the same time, very essential: the Path of the Warrior.

Within this Path, the Eldar may indulge themselves in as brutal and as visceral acts of violence as they see fit upon the battlefield. They may relish in the blood of their enemies, find elegance in slaughtering millions and revel in the torturous screams and pleas of their foes. This is not without precaution, however, as the Eldar don what is called their "Mask". The Mask is a barrier between their conscious being and their state of emotional freedom. Though they retain all memories of what they've done, it's like their subconscious is the one that holds them. They cannot recall those memories unless they so choose, which they clearly do not.

Now, here's the question: the cultural growth and stability of this race was solely dependent on the creation of the "Mask" to ensure the safety of releasing these urges. Without it, they would fall into a repressive state and destroy themselves from within. Using it too much, and they are purely animals. Would you not say that these cloned dolls would be a similar "Mask" for the sexual desires of the human race?

Another example I would like to bring up are the Space Orks. This race is considered to be the most socially perfect race within the universe. How could a war-mongering, animal-like race be far superior socially to humans or even the eloquent Eldar? For one, there is no poverty. Their economic system is based on teeth, which are never in short supply as they shed them quickly and gain them from battles. There is no jealousy, only honour. Ork society is based upon kill or be killed. As such, anyone who wants the spoils of glory must earn it by fighting. If you want buddy's Wartruk, you challenge him for it. There is no sadness. Why? Because Orks do whatever they want. They go around and build giant war machines, create statues to Gork (or is it Mork?) out of dung, drive really fast on their red warbikes and drink until a squig eats their face off. Most importantly, Orks only focus on their favorite thing in the universe: making war. When an Ork dies, he's content because he died fighting. When an Ork wins (and they always win) they've proven they're the best! When an Ork runs (because they never lose) they're happy because they've lived to fight another day and can finish their victory tomorrow. In their society, everyone is happy from the moment they're born to the moment they die.

Yet again, is it because they focus on simplicity that they're so happy? Is this really the goal of social satisfaction? The fact that the rule of law is "kill or be killed" certainly makes for a brutal society, something we shun because we believe it degrades us as a species, yet it's considered to hit that plateau of perfection, bringing everyone to the pinnacle of happiness derived from social groupings in this scenario.

Just a few things to think on...
 

NoNameMcgee

New member
Feb 24, 2009
Clones? Hell no.

Robots that seem human? Hell yes. It would allow people to take out their frustrations without harming anyone, and I can definitely see this happening in the future. Robots are robots, they should have no rights anyway and exist only to serve us since they are not actually alive.

However I highly doubt it would stop rapists since rape is usually about dominance.
 

Unesh52

New member
May 27, 2010
Mr.Petey said:
My argument is irrelevant? That's very nice, thank you ^_^
I was actually afraid you'd be offended... whatever, nice surprise that you took it well, I guess :)

Mr.Petey said:
I still don't feel it is right for a human to take sexual gratification in such degenerate behaviour on an artificial being and exploit it, regardless of whether it can feel or not. It's just unnatural, really, in my opinion.

The only act of sexual behaviour should be with a consenting member of the same species, not just some clone grown without any thought processes aside from pleasure or whatever is left behind if everything is switched off. It can perhaps detach the human element in that equation from any guilt or emotional attachment but it really doesn't make it right to create life for our own pleasures. Sorry I just disagree on my personal principles. Plus I found the last part of the OP a little disturbing regarding skull fucking a baby. That could have been tidied up but that's just me and I won't go on about it any further :)
Yes, the skull fucking a baby thing is really an inherently upsetting image, lol. Try to see around your gut reaction though and don't base your justifications on your internal biases.

I think your justifications for your opinion (I think opinions need justification, even if you can't "prove" what they're about) are unfairly rooted in spiritual or else arbitrary definitions of right and wrong. And life. The way I've described them, the analogues are as close to a human as a vacuum cleaner, or, more appropriately, a blow up sex doll. Or maybe it would help to think of them as a humanoid fruit. Think about it -- a woman made of cantaloupe (brings new meaning to "eating someone out" *slaps knee* ...sorry). I mean, yeah, it's still messed up to screw your delicious fruit woman, but you can hardly say it hurts anything. Except maybe your friends' perception of you.

Before I go on, let me say that right and wrong should be indefinite ideas that adjust to the situation they're applied to. Specifically, anything that does more harm than good is bad, and the opposite is good. Of course, you're going to find different perceptions and data depending on the scope of your observation and other factors, but this is not a discussion about morality. Suffice to say that nothing is implicitly bad, it's only inappropriate for the situation, and there will always be reasons for that.

To the best of my knowledge, your argument (which concludes that artificial humans should not be created for sexual exploitation) comes down to three points, that (1) exploiting artificial humans for sex is unnatural, that (2) the only sexual behaviors that should be allowed are ones between consenting members of the same species, and that (3) it is morally wrong to create life just to please ourselves. Let's go in order:

(1) Irrelevant. Lots of very common and benevolent things, behaviors included, are unnatural. And this is true regardless of how you define "natural," which is highly subjective to begin with. Is it "natural" to cook our food? To build tall, "nature"-defying skyscrapers (which, by the way, facilitate our "natural" tendency to over-reproduce, like in Japan, where vertical engineering is compensating for overpopulation. "Natural" overpopulation is spreading the food supply too thin, causing social unrest and famine, which is immoral by the above definition)? Whether something is natural or not says nothing about its value, efficiency, or morality.

(2) This seems simple enough, but taken with a critical enough eye, there is one glaring flaw. The reason consent is usually necessary is that there is potential for harm in the act of sex. Pregnancy, injury/disease, humiliation, and a number of other psychological damages. With non-sentient beings, these don't apply. They don't care about what happens to them, even if they can perceive any change in themselves, whether it's pain or pregnancy. Consent is not an issue in this case.

Also, though it is considerably less important in this discussion because the analogues would presumably be the same species as us, I find the same species thing unnecessarily restrictive. I see no implicit harm in cross species sex. And don't say "it's obviously wrong because other species are so different from us." That's one step away from "No interracial marriages!" The presence of differences does not imply incompatibility. What if some hot aliens landed on earth and one wanted to give you a handy?


Just imagine that... glowing hand on your... er, actually this may have been a bad example.

(3) Remember the definition above. Does it cause anyone who cares any harm? Maybe, as big business has a tendency to do that, but it certainly doesn't hurt them. As mentioned before, they physically can't care. You have an emotional objection to it apparently because it's a misuse of "life," but I don't think something being alive makes it inherently valuable, sacred, or inherently anything really. Except alive. I bring up plants again because they're a perfect example of the disconnect between life and sentience. We raise flowers simply to have them fragrance our homes, but this sort of slavery is not a crime because the flower doesn't think. It doesn't care about purpose or meaning, it just tries to live. Similarly, these analogues would not have "human lives" as we know them, but would be more like "living objects," like plants. So who cares if you kill it or screw it or whatever? It sure doesn't.

Fuckediity fuck fuck fuck. There's some swears. I'm uber tired and I'm going rafting tomorrow! Er, later today. Sleep sleepy sleepy. Goodnight Mr. Petey.
 

Unesh52

New member
May 27, 2010
1,375
0
0
BlackWidower said:
Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.
What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?

Daffy F said:
This actually seems like a better idea than what the OP said, I completely agree.
...what idea of mine are you talking about specifically? I confus.
 

BlackWidower

New member
Nov 16, 2009
783
0
0
summerof2010 said:
BlackWidower said:
Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.
What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?

Daffy F said:
This actually seems like a better idea than what the OP said, I completely agree.
...what idea of mine are you talking about specifically? I confus.
Its organic nature. It might still possess a soul, if souls even exist, and there is no guarantee they would feel no pain, not really.

With an android, you could easily prevent it from feeling pain by not programming it into them.

Seriously, what do you think is more reliable: removing pain, or not including it in the first place?
 

Unesh52

New member
May 27, 2010
1,375
0
0
BlackWidower said:
summerof2010 said:
BlackWidower said:
Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.
What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?
Its organic nature. It might still possess a soul, if souls even exist, and there is no guarantee they would feel no pain, not really.

With an android, you could easily prevent it from feeling pain by not programming it into them.

Seriously, what do you think is more reliable: removing pain, or not including it in the first place?
Souls don't exist. That's stupid. The "soul" is barely even defined in the first place, there is no evidence supporting its existence, and there are better, empirical explanations for the few, vague things the soul is consistently said to be responsible for. I have absolutely no idea why some people (not you, apparently) insist on the existence of the damn things when they won't even take the time to decide what they are. Speaking of which, even if they did exist, their presence in the analogues would be irrelevant, because no one can seem to pin down what the things are for or how they work. (I don't mean to say you're stupid, but you're certainly not making an intelligent point by bringing this up in this conversation. I hope you're not offended.)

Now then, on the subject of "guaranteeing" there will be no pain -- again, it's a hypothetical. When I say, "If x, then would y still apply?" it's completely dodging the question to just go, "Well, x is impossible."

I don't think it would be so difficult to determine the processes that cause physical pain and interrupt them, nor do I think it would be that difficult to quantify pain to the point that we can tell whether a person is experiencing it, at least in the case of clones that have no rights or purpose except to be prodded and scanned and redesigned. Scientists already understand the chemical processes that control pain (painkillers, neurotoxins, etc.), and we even have real-life examples of people who cannot feel pain as a result of genetic defects [http://abcnews.go.com/GMA/OnCall/story?id=1386322]. In a few decades it could be easy; in a century or two, elementary. There are much more complex problems to deal with than pain regulation, like keeping the body from killing itself (see the linked article), which would require some sort of removal of independent action.

As for emotional pain, that would fall under one of those "more complex problems." You would run into the exact same difficulties designing AI to act emotionally without "having emotions," despite the tendency of the glib to assume that emotion is limited to things with "life" (spiritual bull, in the same conversation as souls), though the problems might be arrived at in reverse if you're working from "human" down. These sorts of things are way out there, beyond what we can see in currently developing science, but the point is that "life" is not as special as most think. It's about thought and emotion. What makes something "alive" is nothing more than the action of certain processes that we call biological. The definition of the term does not include anything important to the subject at hand.

You can ignore the spoiler, as it really is a completely separate argument, but I stand by what I say in it.
 

BlackWidower

New member
Nov 16, 2009
783
0
0
summerof2010 said:
BlackWidower said:
summerof2010 said:
BlackWidower said:
Though I do have a problem with your idea. Instead, maybe an android sex-bot fashioned like a 12-year-old. That could work.
What exactly is the difference between a sex robot and a living, breathing body that lacks a conventional mind?
Its organic nature. It might still possess a soul, if souls even exist, and there is no guarantee they would feel no pain, not really.

With an android, you could easily prevent it from feeling pain by not programming it into them.

Seriously, what do you think is more reliable: removing pain, or not including it in the first place?
Souls don't exist. That's stupid. The "soul" is barely even defined in the first place, there is no evidence supporting its existence, and there are better, empirical explanations for the few, vague things the soul is consistently said to be responsible for. I have absolutely no idea why some people (not you, apparently) insist on the existence of the damn things when they won't even take the time to decide what they are. Speaking of which, even if they did exist, their presence in the analogues would be irrelevant, because no one can seem to pin down what the things are for or how they work. (I don't mean to say you're stupid, but you're certainly not making an intelligent point by bringing this up in this conversation. I hope you're not offended.)

Now then, on the subject of "guaranteeing" there will be no pain -- again, it's a hypothetical. When I say, "If x, then would y still apply?" it's completely dodging the question to just go, "Well, x is impossible."

I don't think it would be so difficult to determine the processes that cause physical pain and interrupt them, nor do I think it would be that difficult to quantify pain to the point that we can tell whether a person is experiencing it, at least in the case of clones that have no rights or purpose except to be prodded and scanned and redesigned. Scientists already understand the chemical processes that control pain (painkillers, neurotoxins, etc.), and we even have real-life examples of people who cannot feel pain as a result of genetic defects [http://abcnews.go.com/GMA/OnCall/story?id=1386322]. In a few decades it could be easy; in a century or two, elementary. There are much more complex problems to deal with than pain regulation, like keeping the body from killing itself (see the linked article), which would require some sort of removal of independent action.

As for emotional pain, that would fall under one of those "more complex problems." You would run into the exact same difficulties designing AI to act emotionally without "having emotions," despite the tendency of the glib to assume that emotion is limited to things with "life" (spiritual bull, in the same conversation as souls), though the problems might be arrived at in reverse if you're working from "human" down. These sorts of things are way out there, beyond what we can see in currently developing science, but the point is that "life" is not as special as most think. It's about thought and emotion. What makes something "alive" is nothing more than the action of certain processes that we call biological. The definition of the term does not include anything important to the subject at hand.

You can ignore the spoiler, as it really is a completely separate argument, but I stand by what I say in it.
Well then, you don't get the AI to act emotionally. That's how you avoid that. They are sex toys.

Plus, you say that if we experiment on the clones enough it will be fine, and that this will be easy while working with clones who have no rights. I have a hard time believing everyone will stand by while someone else declares that clones have no rights. I know I'll be one of the many fighting for clone rights.

But reading all you wrote, it seems your underlying thesis is that we shouldn't do it at all, which negates your original argument.