-You made a hypothetical case that, if you spontaneously disintegrated and were replaced by a perfect clone, complete with your memories, then you would somehow still be alive because of some nonsense about candles, willpower, and observers somehow defining whether or not 'you' are still alive.
-I proposed a counter-situation where a clone was created with perfect memories, appearance, etc, but you remained alive, and asked you how exactly this completely separate individual would still be 'you' if the actual you was still alive.
-You ignored the question and talked about how you'd need more toiletries and public transit tickets.
You've so far constantly refused to acknowledge the simple fact that if you, as you are right now, still exist after the clone's creation, it doesn't matter how perfect his creation is. He's a different person the instant his experience branches off from your own, because people are more than their appearance and memories. You are no longer interchangeable the moment that you have separate perspectives. Ergo, it is only your death that could even possibly justify his existence as a perfect substitute for you...assuming that you don't mind dying in the process.
Most of it comes down to this line of yours:
TheUsername0131 said:
I'm sure the traditional definition of dying doesn't entail leaving behind a working duplicate/functioning copy/qualified replacement.
You're operating under the delusion that being spontaneously replaced by someone who can live out your life in your stead doesn't leave you anywhere but dead. You're like the opposite extreme of a sociopath: you think that your perspective is the only one on Earth that doesn't matter.
Somehow I don't think you've ever seen The Prestige [http://en.wikipedia.org/wiki/The_Prestige_%28film%29]. It's a pretty good movie. If you don't mind spoilers, I'll gladly talk about a major plot point that summarizes the problem with your argument.
But if you want to watch it, do so. Don't read this first. Or ever, if you don't want to. It'll spoil a rather big part of an excellent film.
Hugh Jackman's character created a 'magic' trick with a device with a not-so-clear function. It does one of two things: it teleports the user a short distance away and leaves behind a perfect clone of said user, or it creates a perfect clone of the user a short distance away.
Here's Jackman's dying speech, the point being from 1:45 onward [https://www.youtube.com/watch?v=XHKan75x7GI]. Because the clone was perfect in every way, he had absolutely no way of knowing whether or not he was simply the first clone. The first time he used the machine, he killed the one who was 'teleported,' meaning he either killed the original Jackman, or (since he rigged the device to kill the man who was standing on it after activation) it meant that he died every single time he used the device and the clone produced by it simply thought that he'd been teleported. When your memories are perfectly clear, and the other guy's memories appear the same, the only thing you have to cling to is your perspective.
Hell, the same thing was in 'The Sixth Day.' The movie's a good deal older than 'The Prestige,' but I'll spoiler-tag it anyway.
As with the former, don't read before watching unless you want spoilers.
The villain and his henchmen use cloning technology to gain effective immortality. It somehow scans their brains at the time of death and produces a flawless copy in the vat immediately afterward. Unfortunately, it's made abundantly clear in the ending that there's no transfer of consciousness going on. The villain, while dying of a gunshot wound, desperately finishes the cloning process for a new clone of himself. The new clone then steps out and casually starts taking the dying progenitor's clothing. Here's their exchange:
Villain: "You're not even gonna wait until I die?"
Clone!Villain: "Would you?"
You can actually see the moment in the villain's eyes when he realizes that cloning tech doesn't give you immortality. He's about to die, and someone who looks exactly like him is about to take over his life.
Honestly, they're both good movies; though the latter might be 'worse,' it's probably more fun to watch. If you haven't seen them already, please do so. Not for the sake of this argument, mind you. Just because they're worth watching.
Option 4 for me. My brain in an immortal, unaging body? Nice.
I might be willing to go 4.5 for a few mental augments, but only so long as they didn't cause any serious psychological changes. Also, on the body side, while I wouldn't mind an artificial body, I'd want it to generally feel and produce sensory data like a human body.
Or nanites. I could always go for nanites.
Seneschal said:
Wh-... No brain uploading?! What kind of shoestring-budget transhumanism are we aiming for here? You could have a body of synthetic diamond and still operate much like a ye-olde-meat-sack, but once you're a digital entity, that's when the real fun begins.
I'm repulsed by my vessel. To say I have no attachment to any part of it would be a grave understatement. If I were given enough evidence to establish, beyond a reasonable doubt, the effects on my consciousness of any procedure available to escape this vessel, the options would merit consideration.
-You made a hypothetical case that, if your spontaneously disintegrated and were replaced by a perfect clone, complete with your memories, then you would somehow still be alive...
In a manner of speaking. The dissonance behind it makes me feel irresolute over it.
Char-Nobyl said:
That you would somehow still be alive because of some nonsense about candles, willpower, and observers somehow defining whether or not 'you' are still alive.
I'm trying to make the gradual shift from outright delusion to obvious denial. The incoherent alliteration served that purpose.
[You didn't have to abridge my nonsense into a deliberately unflattering, distorted annotation when it simply boiled down to: "I don't think identity is a discrete, immutable thing, to the extent that I disagree with time-honoured notions of identity."]
Char-Nobyl said:
You ignored the question and talked about how you'd need more toiletries and public transit tickets.
Well, since the machine's intended purpose was botched, there are now two people who require such amenities. Which agrees with you: there are two people. Two... different people. I tried to agree that I may be agreeing with you, with hesitation to the point of tangent. They certainly shouldn't share the same toothbrush; what if one of them catches a contagious infection?
Char-Nobyl said:
Ergo, it is only your death that could even possibly justify his existence as a perfect substitute for you...assuming that you don't mind dying in the process.
But consider the obfuscated mechanism in The Prestige. You pointed out the painfully clear fault in that system, one I sought to hide behind a latency of about 2.0013845711889*10^-15 s. Since that value is unattainable, it would be better off as 1.67*10^-4 s.
An example I thought less tiring on one's mental health than simply "rigging the device to kill the man who was standing on it after activation," by making the disincorporation essential to the process rather than an afterthought.
I've come to terms with the... hitch in its workings. I sought to only patch it up with denial. An act that took considerable time staring at a wall and thinking things over. Where once people sought immortality through their works, or heirs.
There was never drivel about others defining legitimacy, only deceiving them. People are very good at deceiving themselves. [The many lies I tell myself to go to sleep at night.]
It must speak volumes about me, the subject of my desperation. Should such an exploit be accomplishable, all that would be left is the distasteful matter of minimising the psychological damage through copious amounts of denial on my part. Lots, and lots of denial.
For myself, I consciously choose to make no distinction between transferring and copying, so long as the method is destructive, within a strict margin that leaves me least uncomfortable thinking about it.
I certainly wouldn't say I'm the same person I was a decade ago. All those cells are long gone, gradually replaced; all those memories diminished, distorted, misremembered. Desperate people reach for whatever lets them go from the general to the particular, however tenuous that leap may be, no matter how insistently instinct says otherwise.
Char-Nobyl said:
- I proposed a counter-situation where a clone was created with perfect memories, appearance, etc, but you remained alive, and asked you how exactly this completely separate individual would still be 'you' if the actual you was still alive.
It's polite to make room for unexpected guests, as opposed to giving in to the instinctual gut response of recoiling in abject horror at the frightening realisation. Because all it takes is a "God, What Have I Done!?" moment for everything to fall apart.
I have doubts if I could hold my composure in such an event.
Char-Nobyl said:
You've so far constantly refused to acknowledge the simple fact that if you, as you are right now, still exist after the clone's creation, it doesn't matter how perfect his creation. He's a different person the instant his experience branches off from your own because people are more than their appearance and memories. You are no longer interchangeable the moment that you have separate perspectives.
This I wholeheartedly agree with. But he and I would be quite opposed to calling him a clone. 'Clone' is loaded with all sorts of connotations, like abomination, aberration, and other words that may not necessarily start with the letter 'a.'
Char-Nobyl said:
Ergo, it is only your death that could even possibly justify his existence as a perfect substitute for you...assuming that you don't mind dying in the process.
A really, really big road bump in continuity! (O_O) Not suspicious at all!
Char-Nobyl said:
You're operating under the delusion that being spontaneously replaced by someone who can live out your life in your stead doesn't leave you anywhere but dead.
So long as the inertia of my thoughts makes it across, I fail to regard the unsettling implications... If it didn't unsettle me, then I'd have to overthink what sort of person I've become if I'm willing to go through with it. I'd still have hesitation over it. Much, much hesitation. *shudders*
Char-Nobyl said:
You're like the opposite extreme of a sociopath: you think that your perspective is the only one on Earth that doesn't matter.
Hearing that from someone else makes me (more than admittedly) uncomfortably self-conscious, and appalled with myself.
Such is the influence of hypotheticals. They uncover uncomfortable truths about the person answering them.
I'll admit circumstances in the last five years have driven me to waver on an unhealthy mechanistic perspective at odds with how I should be feeling... But I prefer to think of myself as the ultimate utilitarian... because that doesn't implicate me poorly at all...
I try putting on an affable demeanour and attending to other people's wellbeing at the cost of my own; when I am in a good mood, I gain satisfaction from ensuring other people are contented. If I could, I'd micromanage the cosmos like a sim-tycoon-strategy game and try to ensure everyone's happiness.
Unfortunately I can only comprehend ensuring the general wellbeing and satisfaction of 82*10^35 Humans. (Blatant Lies, I can only manage a couple dozen people.) Never mind how much of a logistical nightmare that is, and not being in a position to do so.
That's why I picked option #6 in this thread. I'm certainly opposed to becoming malignant. But if the personality changes entail alleviating my anxiety, I'd be all for that.
Char-Nobyl said:
Somehow I don't think you've ever seen The Prestige [http://en.wikipedia.org/wiki/The_Prestige_%28film%29]. It's a pretty good movie. If you don't mind spoilers, I'll gladly talk about a major plot point that summarizes the problem with your argument.
But if you want to watch it, do so. Don't read this first. Or ever, if you don't want to. It'll spoil a rather big part of an excellent film.
I've seen this film and I agree, it is an excellent film. The Prestige and The Illusionist have great reveals at the end. I find your cordial conduct endearing. It has been an absolute pleasure typing with you.
Hugh Jackman's character created a 'magic' trick using a device with a not-so-clear function. It does one of two things: it teleports the user a short distance away and leaves behind a perfect clone of said user, or it creates a perfect clone of the user a short distance away.
Here's Jackman's dying speech, the point being from 1:45 onward [https://www.youtube.com/watch?v=XHKan75x7GI]. Because the clone was perfect in every way, he had absolutely no way of knowing whether or not he was simply the first clone. The first time he used the machine, he killed the one who was 'teleported,' meaning he either killed the original Jackman, or (since he rigged the device to kill the man who was standing on it after activation) it meant that he died every single time he used the device and the clone produced by it simply thought that he'd been teleported. When your memories are perfectly clear, and the other guy's memories appear the same, the only thing you have to cling to is your perspective.
Char-Nobyl said:
Hell, the same thing was in 'The Sixth Day.' The movie's a good deal older than 'The Prestige,' but I'll spoiler-tag it anyway.
As with the former, don't read before watching unless you want spoilers.
The villain and his henchmen use cloning technology to gain effective immortality. It somehow scans their brains at the time of death and produces a flawless copy in the vat immediately afterward. Unfortunately, it's made abundantly clear in the ending that there's no transfer of consciousness going on. The villain, while dying of a gunshot wound, desperately finishes the cloning process for a new clone of himself. The new clone then steps out and casually starts taking the dying progenitor's clothing. Here's their exchange:
Villain: "You're not even gonna wait until I die?"
Clone!Villain: "Would you?"
You can actually see the moment in the villain's eyes when he realizes that cloning tech doesn't give you immortality. He's about to die, and someone who looks exactly like him is about to take over his life.
Char-Nobyl said:
Honestly, they're both good movies, though the latter might be 'worse,' it's probably more fun to watch. If you haven't seen them already, please do so. Not for the sake of this argument, mind you. Just because they're worth watching.
It's scenes like that which keep me up at night. Every night.
*He got an idea. An awful idea. A wonderful, awful idea.*
Bear with me; this isn't meant to prove any point, nor is it meant to be an argument. It's just meant to prod at what I consider the disorienting ambiguity. Just playful banter.
What if I rendered you unconscious, duplicated you, and left the two of you in the same room? Upon waking, would each one be rather insistent that he is the 'real one,' and that the other is most assuredly an impostor, all the while plagued with doubts as to the legitimacy of his claim?
I admit it would be a demonstrable transgression against all things decent and wholesome to put someone in such a situation, but since this is (thankfully) just banter: what would you do then?
When your memories are perfectly clear, and the other guy's memories appear the same, the only thing you have to cling to is your perspective. How would you resolve the situation?
People are more than their appearance and memories? Could you elaborate on that? 'In my gut' I know what you mean, but I need it explained to truly grasp the notion.
How would you resolve that without bringing in a person's subjective perspective?
Is the message I send you different from the one I am typing just because it is cached in my computer's memory and not in yours when you read it? I took many (hesitant) steps to go from there and try to apply it to people. Does it hold up if multiple instances of people aren't hanging around, or do they point out an inherent flaw? How is it that intelligent beings are somehow exempt from this? Are they not a finite-state model that changes as a function of time, an electro-chemical autonomous system?
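That finite-state framing can be sketched in a few lines of code. This is a toy model of my own invention (the `step` function and the 'percept' strings are entirely made up, not anything from this thread): two copies with identical state are indistinguishable only while their inputs are identical, and they diverge the instant their experience branches.

```python
# Toy sketch: model a 'person' as a deterministic state machine whose
# state is simply the tuple of everything experienced so far.

def step(state, percept):
    # Hypothetical transition function: a new experience is appended
    # to the existing state, producing the next state.
    return state + (percept,)

original = ("childhood", "yesterday")
copy = ("childhood", "yesterday")

# At the instant of duplication the two are indistinguishable.
assert original == copy

# The moment their perspectives differ, the states diverge...
original = step(original, "stayed in the lab")
copy = step(copy, "walked out the door")

# ...and they are now, in every testable sense, different machines.
assert original != copy
```

Which is just Char-Nobyl's point in miniature: perfect duplication buys interchangeability only up to the first branching input.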
As far as I'm concerned, the body is just life support, sensors, transportation, and tools equipped by the brain. That brain however is something more complex than anything we'll understand in the near future and more capable than any comparable machine we're likely to make for centuries at least. So, full body conversion but keeping the brain intact. Maybe a few augmentations, like external memory storage and calculator functions and data connections activated by thoughts, but no alterations to how the brain itself works. So... mostly 4, maybe little bit of 5?
I do expect this to change how I think, from lack of hormones and additional tools if nothing else, but I've never personally invested much of my sense of self in anything below the neck.
I understand on a conceptual level that other people invest a lot of their sense of humanity in the body and such, but it's difficult for me to really grasp. Might have something to do with my asexuality? A lot of people seem to put a lot of stock in sexual capability, sexual identity, and it would definitely be harder for many to think of themselves as a 'real man' or 'real woman' if they converted more than outer limbs or a couple of organs to chrome, but with respect to myself I just kinda fail to care about that kind of thing.
This topic is awesome. There are so many implications that result from it, namely:
- If cybernetics are used for medical purposes, will doctors need to learn engineering?
- If cybernetics made humans run on an easily available energy supply, would food become obsolete, and what would happen to those industries?
- If cybernetics increased lifespan, how would we control population growth?
- Would athletics, beauty contests, and competition still exist once people can augment themselves to become superior?
- Would an improvement in brain stimulation through augmentation result in further productivity growth?
- Would the relative unavailability result in a world where only the wealthy are able to obtain it, increasing income disparity?
- Would there be exploitation of mining sites for the raw materials used to produce cybernetics, e.g. iron ore?
and a helluva lot more
OT: Stage 3 would be ideal, so as to keep the humanity in the world, even if it is imperfect.
You'd still need enough nutrients to keep your brain healthy if you did anything less than full personality upload, so food will probably never become entirely obsolete. In fact it seems likely that the easiest way to power partially converted people would be through a food-powered internal generator, so they would be eating more to power their augmentations. The human body is actually almost perfect in terms of efficiency after our millions of years of evolution with limited resources, so one of the easiest ways to upgrade would be to swap out parts for ones that are faster and more powerful at the cost of needing a lot more food (now that anybody who can afford augmentation won't exactly have shortages of food).
- If cybernetics increased lifespan, how would we control population growth?
Problem fixes itself. Anybody with enough of their body converted to get a significant improvement in lifespan would probably not be having children accidentally or impulsively anymore; full conversion cyborgs would either have conception shifted to a voluntary thing or reproduce via artificial womb, meaning that outside rare and specific cases they'd only have children when they planned to and could support them. And as tech advances humanity will spread to other planets and no longer be limited to the 11-12 billion person ceiling.
- Would an improvement in brain stimulation through augmentation result in further productivity growth?
Probably. Just like people started getting more done in less time after they learned to write things down instead of remembering them, or when they got machines to do complex math for them instead of doing it in their heads, or when they gained the ability to google things instead of spending hours at the library or finding a supposed expert nearby. Every advance in information technology has changed the way we think, shifted more mental processes to external devices, and improved productivity to some degree; putting computers in people's brains will just accelerate the process and probably lead to advances we haven't thought of yet. As well as allowing you to watch porn or cat gifs in your head after you finish your serious business.
- Would the relative unavailability result in a world where only the wealthy are able to obtain it, increasing income disparity?
Cybernetics wouldn't take much in the way of rough materials; even if iron were used for human parts you could equip half a metropolis for the same amount of iron as an ordinary bridge. It will likely require a lot of the more exotic stuff... titanium, rare earths, all the parts for the most expensive medical supplies and computer parts. But by the time we advance our tech enough for full body conversions we'll probably also have asteroid mining to bring in the raw materials we need.
I think the biggest problem, by far, is going to be making electronics and the brain directly compatible. Our brains are, well, mind-bogglingly complex and we don't really know exactly how they work.
I'm sure it'll be possible before this century is over.
Mine would be somewhere between 2 or 3 but I'll go ahead and consider it a 2.
My limit would be around replacing something like my left hand (or forearm, if more convenient for the operation) for the sake of convenience, if not for the potential of quicker, more assured motion with my weak hand (I'm right-hand dominant).
But nothing major like an entire arm or organs... I've thought about this subject a lot and wouldn't want my well-being and health being compromised by something like an EMP or some sort of digital virus.
But nothing major like an entire arm or organs... I've thought about this subject a lot and wouldn't want my well-being and health being compromised by something like an EMP or some sort of digital virus.
EM fields are far too common in this day and age for any responsible cybernetics producer to ignore. It would only take a few people having heart attacks from walking too close to an RFID scanner (which are located almost everywhere you go) to drive that particular point home. Add that to the fact that adding a Faraday cage to an electronic device is both easy and cheap, and you should not have much to worry about when it comes to EMPs. Well, at least as long as you aren't buying cheap off brand implants, but then I suspect you'd have bigger concerns.
As for digital viruses, just don't get the heart with a wi-fi connection. Ok, yeah, there is more to it than that, but running implants exclusively from firmware is likely to be the standard configuration. Viruses are not much of a threat under those circumstances. It would be about as likely as a virus being able to infect the remote control for your TV.
But nothing major like an entire arm or organs... I've thought about this subject a lot and wouldn't want my well-being and health being compromised by something like an EMP or some sort of digital virus.
EM fields are far too common in this day and age for any responsible cybernetics producer to ignore. It would only take a few people having heart attacks from walking too close to an RFID scanner (which are located almost everywhere you go) to drive that particular point home. Add that to the fact that adding a Faraday cage to an electronic device is both easy and cheap, and you should not have much to worry about when it comes to EMPs. Well, at least as long as you aren't buying cheap off brand implants, but then I suspect you'd have bigger concerns.
As for digital viruses, just don't get the heart with a wi-fi connection. Ok, yeah, there is more to it than that, but running implants exclusively from firmware is likely to be the standard configuration. Viruses are not much of a threat under those circumstances.
I figured it would be pretty easy to counter-develop the examples I made, but if we ever got to the point where we are fundamentally evolving ourselves as a species, I think some new, unforeseen threats could emerge as well.
So I guess I would wait it out for a while... idk after playing DE:HR I thought a lot about it and set my mark to just an arm augmentation like Prescott has.
No reason to go past the minor stuff right away, but in the end I feel augmentation to extend my lifespan is perfectly acceptable, supposing that a society that has developed that technology has no population-size issue; otherwise it would get crowded really fast with all these people busy not growing old and dying.
I would probably move from 1 to 5 or 6 over the course of a few decades. I am operating under the assumption that if someone were to copy my consciousness and put it in a robotic body it would no longer be me, just a copy. However a gradual process means that my 'original' consciousness carries over to the robot body.
Option 2:
Implants that can fight cancer, HIV and many other diseases would be absolutely incredible.
I don't want anything that makes me or anyone else superhuman when it comes to things like strength or the senses. That's when normal folk start to become obsolete, and that is terrifying.
All of you that want to go above and beyond that, good luck reproducing.
Depends on how dependent I would become. In the Deus Ex world, the anti-rejection drugs are expensive and turn people into corporate slaves.
Conversely, ME2-3 seem to have little in the way of negative repercussions (aside from that whole dying bit...)
Personally, I'd start with minor augments. Then, as I got older, I'd convert over to full-body augments, but keep the nervous system intact. Then, just out of curiosity (and if my brain started to break down from age), I'd go as far as partial mind conversion.
I wouldn't go further than that, but who knows what sort of mindset a transhuman would have. I might not even be myself once the partial mind conversion happened (not a concern I have up to the "Raiden" level).