evilthecat said:
I mean, sure, but the follow up question is then "is anything intelligent?"
The cells of the human brain are ultimately made of the same stuff as the rest of the universe. It turns out if you put that stuff together in the right way, it results in what we call consciousness. Sure, we don't understand this process (although the Culture do, which is how they create artificial intelligence), but assuming the same process is impossible with any other configuration of matter strikes me as vastly, vastly more anthropocentric even than assuming intelligence must resemble human intelligence. I mean, if you're going to build a society where humans and machines live together, having human-like intelligence certainly helps.
The Culture series is "soft" science fiction; it has its hard moments here and there (the way the universe works is mostly consistent), but ultimately the Culture serves the same narrative function as the Federation in Star Trek: a moral point of reference against which to compare the real world. Banks never really follows through on some of the implications of the Culture, and one of these areas is the consequences of Mind-level intelligence (the other is the full implications of a society with an entirely fluid concept of sex/gender, but I can accept that blind spot from a cishet writer in the 90s and appreciate the attempt). So sure, they are just big, super-intelligent people, but the idea that people can be made of something other than meat is probably enough for one day. Baby steps.
Well, we can be. I have a colleague working on simulated true touch sensation for myoelectric prostheses. We are on the edge of the distinct possibility of wirelessly transmitting sight through implanted chips wired into the corpus callosum and occipital lobe. It's still bleeding edge, but we're only 100 years from the truly posthuman stage of gestalt sensory perception.
And by the time we master it, we will likely be able to simulate motor intelligence via neuroprosthetic manipulation of the ventral stream. Basically, creating humans with reactions just as fast, through predictive algorithms running on remote processing networks that learn and personalize better than a computer could, and that can actively compartmentalize mental actions in a way computers can't, per that fundamental idea that 'humans always have that one additional action' computers on their own can't perform.
Basically the only thing halting this progress is ethics committees getting in the way.
So technically 'yes', we are made up of the same stuff as everything else in the universe, but that doesn't mean we can't achieve a posthuman state that gives as much utility as the Minds in these books, with the added bonus of transcending the inherent loneliness of the human condition. By definition, transforming the sum of all human experience, consciousness, and universes of universes of thought into something that holistically marries a true totality: the utmost possible reconciling of ourselves and our relations to one another.
And that will be greater than any machine by the very definition of recognizable intelligence, because there can't be anything we could possibly know that isn't immediately shared by all: a continuous, maximal revelation that will end war, poverty, and arbitrary, unwelcome suffering. And that is the end state of humanity. It matters not whether the brain is biological or not. That posthuman state is clearly superior to what you're describing. So why wouldn't this Mind, if benevolent, simply offer that option?
I mean, think about it... sure, in a Sartrean/Camus(ian?) sense existence and freedom are a curse, but then again, whatever stops us from killing ourselves can't simply be outsourced to a machine that constantly tells you you're inadequate. I'd personally go mad and break shit because I was mad, so I broke shit as I was quite mad, so I...
Honestly, we're our own worst enemy, and I kind of agree with Shadowrun's diagnosis of us: we're (still) too autistic to survive corporations just handing us a job, because we can't handle simply thinking collectively (yet) about what's better for *all of us* in the end.
So why wouldn't a Mind just say, "Here's how you become posthuman and can fuck right off..."?
Maybe we should.
Maybe the fact that we can't is symptomatic of a problem with us and the world we live in, not the fact that trust is inherently impossible or a bad thing.
I like to think of it as collective autism. We can't say what we really feel because we're not as good at saying how we feel as we are at knowing it; we're smarter than we can individually communicate, and other people aren't smart enough to truly understand that, or fail to see themselves in the same state, or fail to recognize that communication is inherently going to fail the communicator in a big way.
And that renders them lonely, hurt, and deeply afraid... when maybe you feel exactly as they do, and how much better it would be if you (everyone) told yourself that when you woke up each morning.
So we spend a life trying to (hopefully) be more thoughtful, not simply taking people at their word but rather trying to form deeper connections and personal understanding, recognizing that everyone is autistic. Then we fail, and ultimately we either give up, or grow old and die, or both--but definitely the latter.
Honestly, being human is garbage. Thousands of years of philosophy has already taught us that. We don't need computers reiterating the same.
Why can't I be a hawk with the problem-solving ability to use fire to feed myself, but none of this garbage self-awareness that keeps failing me and making me drink either far too much or not enough to be happy--and the fact that I'll never actually get an answer to this, and even if I did it wouldn't stop me drinking, because I'll argue it sort of helps me be happy with friends--whose drinking I also enable in turn, because deep down I'm a horrible parasite looking for people to be introspectively happy in an otherwise collectively miserable state, rather than just being miserable alone?
Hawks don't feel like this, and can still use fire. That's like win-win.
And no amount of computer can solve this. So honestly it seems like the Mind is a quasi-benevolent being: ultimately it's benevolent simply because it doesn't just kill the annoying humans crawling around its insides, or it's simply so self-aware that it needs a fellow drinking buddy. So ultimately, is it even beneficial, or is it just enabling people to get drunk with it?
If the latter, that would be a truly sapient computer, and ultimately more harmful than good, like every other person like me out there.
One thing I initially found quite jarring is that people in the Culture are often quite openly bitchy and impolite to each other, but beyond being a quirk of how Banks writes dialogue, I also think it's intentional. They live in a society that is completely safe, where any kind of deep interior hatred doesn't really exist. Being told a machine can do something better than you isn't an insult unless there are genuine consequences to not being as good as a machine, and there aren't. Additionally, the context of that exchange is that the composer character is writing a piece which will be performed at a celebration to mark an event that (we later learn) has incredible personal significance to that Mind. It's not a dismissive statement, it's an honest statement between two people who have something approximating friendship, because to be dishonest would be patronizing.
That's different, though. Patronizing simply to discourage is actively harmful. As I was saying before, the moral value of hedonism is other-regarding, which is why it's deeply wrong to simply dismiss a moral theory like utilitarianism as pigs in mud or (merely) egocentric. It's egoistic, but the highest moral principle is sacrifice irrespective of who benefits... giving your life freely to save others, rather than living in a society that demands one must give their life to save others. The highest moral principle is the one held without coercion, as a society that demands sacrifice is one that is less happy.
If the composer is performing a recital to celebrate something the Mind is personally invested in, there is no demanded sacrifice or discouragement of happiness in simply asking how it would like the piece finished.
A truly dystopian society isn't simply one where 'x amount of suffering beyond one's control to ameliorate exists...' You can have a utopian social ideal and still recognize that cancer will be a thing and will hurt people. You could still have a utopian ideal of society regardless of the amount of cancer in it (assuming said society doesn't inflict the cancer; think rather of a hypothetical humanity with twice the propensity for cancer)...
Once again, the truly smart reply to the composer would still be: "No, but it might flow better if you transform the start of the third movement with a staccato violin passage leading into the larger ensemble... maybe kind of like this?" After all, that's how you personally would react if you were a fellow composer, correct? There's a difference between patronising and prosocial, constructive engagement. Moreover, what benefit does the Mind get from humans eschewing any sense of their capacity to engage and thrive?
If the moral principle of the Mind is hedonism, then it should also be able to explain why hedonism is good...
Or maybe I'm allowing my time in education to colour my morality, but I feel like the Mind would at least value epistemological development over simple performance. Otherwise, why bother interacting with humans at all? Moreover, why would it even care about knowledge and skills acquisition at all? It might as well just kill itself, or annex and decouple the parts of its brain that feel boredom so it can just be happy with everything and anything.
I mean, sure... it would jeopardize the crew... but why should it care at all?
I talked about the in-universe criticism of the Culture, and that comes from the protagonist of the first book, who really hates the Culture. He comes from a species which has been genetically altered for espionage and assassination (and which is a clear reference to the Bene Tleilax from Dune). I won't spoil it, but the subtext of that conflict is that the Culture represents a kind of mirror image of the instrumentality which created people like him. Culture humans are genetically engineered, but they're engineered to be able to experience more pleasure or have interesting drug experiences. What he hates about the Culture is their instrumental, purposeful pursuit of purposelessness. He is unable to grasp the possibility of a tool that isn't designed to be used by someone, and that ultimately describes everything in the Culture, but especially the Minds. The Minds are built to feel enjoyment and pleasure and happiness in the way humans do, and to experience a degree of autonomous free will, because they live in a society that values these things even though they are "pointless".
Sorry, I love this series and I could talk about it all day.
I like these Bene Tleilax types. They have the right idea. I'm a strong advocate that work should be educational, and that tools should empower humanity to do more, not simply replace knowledge itself. I'd rather have a surgeon fail me after doing their best than have no surgeon who cares at all, because a machine has pre-empted any capacity for another human to recognize the intricacies of my condition and their relevance to my wellbeing. I wouldn't mind medicine eliminating cancer, but the real danger is people failing to understand why getting rid of cancer was good.
It sounds like a fun collection of books, however. Not sure I'll gel with the philosophy, but it sounds like a fun thought experiment put to paper.
I'll look them up and give them a read.