Captain Blackout said:
1) Neurons do not equate to memory locations in a computer
Hmmm. How sure can we be of that? It's certainly not the basic folder hierarchy we know from computers, but we can at least mark certain regions as important for the creation of memory, such as the hippocampus, as well as for the reliving of memory, such as the auditory, visual and whatever other sensory cortices are required for that specific memory.
I'm not saying we can pinpoint their locations, no way, but we can at least specify certain... checkpoints they have to pass when the person's mind is working on them.
2) I recently read up on how the brain stores memory, and apparently it involves proteins.
Yeah, I heard about that too. I believe it was mostly about the differing expression of receptor proteins such as the NMDA receptor (NMDAr). I'm not sure how much new information there is on the subject, though. Biomedical knowledge advances incredibly fast these days; it's hard to keep up.
Not only that, but the process of remembering isn't the recall of hard data. When we remember things we actually, at least to some degree, re-create the event in our heads. This suggests a far more complicated process, with a far more complicated data system, than a machine's.
*Nods again*
Yes, I've heard of that too. It was also part of the experiment I mentioned a few posts ago, in which they compared fMRI signals from viewing emotionally laden pictures with those from recalling similarly laden memories. Interestingly, apart from the known memory centers, all the necessary sensory cortices activated as well, at an intensity similar to actually viewing the pictures.
So I fully agree with you on this: memory is extremely complicated, and remembering is almost like reliving the situation itself.
However, my point about the basic principles used in neural circuitry still stands. In fact, it becomes even more interesting considering that such a complex memory system is possible on this comparatively simple (and, throughout evolution, barely altered) basis.
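To give a rough idea of what I mean by "simple principles", here's a toy sketch of a Hebbian-style plasticity rule, purely my own illustration: the rule, the neuron counts and the numbers are stand-ins, not anything taken from the studies mentioned above.

```python
# Toy sketch (illustrative only): a Hebbian-style "cells that fire together,
# wire together" update, the kind of simple local rule meant by "basic principles".
import numpy as np

rng = np.random.default_rng(42)

pre = rng.random(4)          # activity of 4 hypothetical presynaptic neurons
post = rng.random(3)         # activity of 3 hypothetical postsynaptic neurons
weights = np.zeros((3, 4))   # synaptic strengths, initially zero

learning_rate = 0.05
for _ in range(50):
    # Strengthen each synapse in proportion to the joint activity
    # of the neurons on either side of it.
    weights += learning_rate * np.outer(post, pre)

print(np.round(weights, 2))  # correlated activity leaves a lasting trace
```

The point of the toy is just that a rule this simple, applied over and over, already produces structured, lasting changes; nothing about the rule itself dictates how complex the resulting memory system can get.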
I agree that we are biomechanical machines. I don't accept that as the complete picture, however. Machines are deterministic, even if they fall into the category of 'quantum determinism' (don't ask, that's an entirely different but just as complicated discussion).
Okay, I won't. Heh.
Oh, I don't dispute that (even though it's heavily influenced by brain chemistry, hormones, drugs, stress and whatnot).
I'd dispute it when looking at a very simple animal, such as an ant being steered through the nest by pheromones like a little mindless robot, but not when looking at humans or similarly complex animals.
However, since my views on AI are very open, to me this is not a conflict.
I see consciousness and free will as part of the evolutionary process. A creature that could think for itself and adapt to new conditions on the spot would obviously have a greater chance of surviving a dangerous situation.
Assume it can be done. Here's a question: We would not need to program qualia to achieve this, but would an AI have qualia anyway?
Hmmm, yes. Again, this comes down to the question already posed: whether slight physical (and, in this case, software-specific rather than experience-determined) peculiarities could create a subjective experience for this AI on their own. I'd say "yes".
Imagine red apart from any particular red thing. Just the color. You have isolated a quale for yourself.
The question is whether that is even possible, considering my mind would constantly try to associate this thought ("red... red... red...") with something it knows from past experience.
How do you stop your mind from associating?
Hard hallucinogens. When you can hear red and see roughness, your qualia will stand out brutally.
Never tried anything beyond weed. Considering some folks cut off their own tongue and/or penis, maybe amputation of a few fingers or a foot is the better way to go.
I'm going to go look for a proof that qualia can't be quantified. Wish me luck (after almost completely helping you make your point, I'm pretty sure I'm going to need it).
Good luck. I'm going to lean back and wait for technology to solve this problem for me.
Nah, not really.