What Fictional Military, Order, or Fellowship would you Join... but only be an Average Member?

Terminal Blue

Elite Member
Legacy
Feb 18, 2010
3,933
1,804
118
Country
United Kingdom
Addendum_Forthcoming said:
IDK... there is a noticeable connection between unemployment, even in advanced economies with comprehensive social welfare (disappearing as it is), and worsening mental health problems. And I don't just mean the unwillingly unemployed, I also mean retirees, the cradle-to-grave wealthy, etc. Humans are gregarious creatures by nature, and whether pre-capitalism or (likely) post-capitalism, people want to feel needed.
So, you've stumbled upon another recurring theme.

Like, one character in one of the books is a famous composer. At one point, he's talking to the mind (the AI) which runs the orbital habitat he lives on and asks it whether it could create an original piece of music in his style which would be so perfect no one would be able to tell the difference, and the answer is just yes; in fact literally any artificial intelligence of that scale could. People in the Culture just have to live with the fact that anything they could do can be done better by a mind, or is so trivial it could be done by a non-intelligent robot. But people do get over it, because if something makes you happy (like composing music, or theoretical physics, or cleaning tables) it's worth doing. You don't have to do it, but you can. It has no bearing on whether you are needed, because you don't have to do anything to be valuable.

While it's never explicitly stated, it's strongly implied that this is also the reason why Contact has humans in it even though they don't really do anything important which couldn't be done better by the ship itself (you could make the case that a limited organic perspective is useful when interacting with organic lifeforms, but even that's kind of a stretch). They get to feel useful, and that's valuable in itself.

Addendum_Forthcoming said:
That being said, why would humans trust such a ship?
What's it done to show that it can't be trusted?

Addendum_Forthcoming said:
I would argue that any ship roughly a billion times smarter than humans is still a ship that will look like it's merely responding to instinct or programming.
I mean, yes, that is kind of true in the sense that the mind can have billions of individual thoughts in the space of a human reaction time and, for example, this is a setting where space battles can last microseconds and the ship isn't always going to have time to update the crew regarding its actions. But it's also a very divisible intelligence. Devoting a small part of itself to talking to its crew is trivial. Minds utilise much of their intelligence just playing in infinite fun space (basically, mathematical simulations which are hugely more entertaining to them than the boring real universe), they only bring their full consciousness back into the real universe when something really big requires their attention.
 

Worgen

Follower of the Glorious Sun Butt.
Legacy
Apr 1, 2009
15,526
4,295
118
Gender
Whatever, just wash your hands.
Addendum_Forthcoming said:
Worgen said:
I want to say a royal guard from My Little Pony Friendship is Magic, but... I think it would have been better before the show refused to do anything with Celestia.

So I suppose Starfleet. That seems like the best chance at a decent existence in a normal media military. Despite the fact that the admiral you are serving under is probably either incompetent, evil, or an alien bent on destroying humanity.
Yeah, but what other police force is so well monetized and otherwise unneeded that you and a hundred buddies are called in to deal with a single megalomaniacal filly? Clearly being a Royal Guard isn't a bad job. There is a problem >>> Is there a Princess available? >>> If yes, thank Celestia. If no, ignore problem until Twilight or Cadance becomes available.

Seems like a pretty good gig.
Ok, I suppose you've convinced me.
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
evilthecat said:
So, you've stumbled upon another recurring theme.

Like, one character in one of the books is a famous composer. At one point, he's talking to the mind (the AI) which runs the orbital habitat he lives on and asks it whether it could create an original piece of music in his style which would be so perfect no one would be able to tell the difference, and the answer is just yes; in fact literally any artificial intelligence of that scale could. People in the Culture just have to live with the fact that anything they could do can be done better by a mind, or is so trivial it could be done by a non-intelligent robot. But people do get over it, because if something makes you happy (like composing music, or theoretical physics, or cleaning tables) it's worth doing. You don't have to do it, but you can. It has no bearing on whether you are needed, because you don't have to do anything to be valuable.

While it's never explicitly stated, it's strongly implied that this is also the reason why Contact has humans in it even though they don't really do anything important which couldn't be done better by the ship itself (you could make the case that a limited organic perspective is useful when interacting with organic lifeforms, but even that's kind of a stretch). They get to feel useful, and that's valuable in itself.
Problem with this is it's just a variation of the Turing test. And ultimately it's no different from how humans so readily dismiss obvious signs of advanced intelligence in avians simply because 'lol mirror' (contrary to popular opinion, corvids don't pass the mirror test; that myth was born from a single pair of European magpies, with dubious testing methodology that hasn't since been reliably replicated) ... So this intelligence defines itself purely in relation to what humans will recognize as advanced intelligence. But then again, is it truly intelligent? After all, if the machine logic (which I assume it is) says 'no', then that would be equally smart.

After all, another human would look at that composer, and would 99% of the time say 'no'. Because they're not composers, yet that says nothing about their intelligence on its own. Even if you were a composer, it's kind of dickish to say; "You're a hack, mate. Predictable AF. I reckon I could compose a piece of music with your base AF stylings and people would think it was yours."

We don't say as much, because we recognize, at the core, a desire for respect for everything we do. That hard work is worth something, even if we think it isn't so hard. Quixotically, even if I hold this opinion... I hate the minimalist movement. Hate it to the core. It's the one thing I can legitimately say *I hate*.

And only humans so far can replicate such discrimination. And ultimately, hypocritically, I personally should find nothing wrong with minimalism as a pursuit of trying to find an 'answer of art'. The difference is I don't need to be rational. One of the things computers can't do that humans can is simply do nothing. It's the big problem with automated cars: when confronted with too much stress, humans always, always have the choice of screaming; "Holy fuck, Jesus take the wheel!!..." and surrendering themselves to circumstance.

Computers by definition can't... and ultimately it's that part of the uncanny valley that will be truly appreciated in a machine.

What's it done to show that it can't be trusted?
For the same reasons we don't trust other people we don't know. There will always be the question of whether a human would do better. And even if humans time and again disappoint us, clearly via the fundamental attribution error, we'll remark that those failures are inherent to that person.

Organically speaking, a truly intelligent machine would simply indulge humanity by saying; "Look, I'm smart, but I need you to entertain me/handle diplomacy/help me replace these bits I can't get to with drones..." Humans will trust that machine far more than one that is like; "Look, you're hitchhikers... basically you have served your purpose giving birth to me, so I guess I'm grateful--I suppose."

Basically mimicking human children relating to their parents. If a machine acted like 'ungrateful' children we wouldn't trust it at all. And ultimately few parents like it when their kids are either dependent on them, or simply ignore them. Many parents want that middle ground of kids being successful, but ringing them up for advice.

Besides, having a machine be disinterested by organic life is probably not what you want making first contact. I mean, sure. Biological life will probably get it wrong as well, but a machine intelligence that infantilises all biological life will *definitely* get it wrong. As per your composer example, the correct answer is 'no'--even if you can.

I don't think you can have a utopia, post-scarcity or not, without the fundamental understanding that what makes hedonism moral is being 'other-regarding'. And while people struggle to explain this, decent people do it by virtue of being socially decent.

I mean, yes, that is kind of true in the sense that the mind can have billions of individual thoughts in the space of a human reaction time. But it's also a very divisible intelligence. Devoting a small part of itself to talking to its humans is trivial. Minds utilise much of their intelligence just playing in infinite fun space (basically, mathematical simulations which are hugely more entertaining to them than the boring real universe), they only bring their full consciousness back into the real universe when something really big requires their attention.
So they are recognizably just like human intelligence in that way? It's not as if it's a truly alien intellect, but rather one that has flights of fancy and artistic, egocentric preoccupations?
 

Terminal Blue

Elite Member
Legacy
Feb 18, 2010
3,933
1,804
118
Country
United Kingdom
Addendum_Forthcoming said:
Problem with this is it's just a variation of the Turing test. And ultimately it's no different from how humans so readily dismiss obvious signs of advanced intelligence in avians simply because 'lol mirror' (contrary to popular opinion, corvids don't pass the mirror test; that myth was born from a single pair of European magpies, with dubious testing methodology that hasn't since been reliably replicated) ... So this intelligence defines itself purely in relation to what humans will recognize as advanced intelligence. But then again, is it truly intelligent? After all, if the machine logic (which I assume it is) says 'no', then that would be equally smart.
I mean, sure, but the follow up question is then "is anything intelligent?"

The cells of the human brain are ultimately made of the same stuff as the rest of the universe. It turns out if you put that stuff together in the right way, it results in what we call consciousness. Sure, we don't understand this process (although the Culture do, which is how they create artificial intelligence), but assuming the same process is impossible with any other configuration of matter strikes me as vastly, vastly more anthropocentric even than assuming intelligence must resemble human intelligence. I mean, if you're going to build a society where humans and machines live together, having human-like intelligence certainly helps.

The Culture series is "soft" science fiction; it has its harder moments here and there (the way the universe works is mostly consistent), but ultimately the Culture serves the same narrative function as the Federation in Star Trek: it's a moral point of reference against which to compare the real world. Banks never really follows through on some of the implications of the Culture, and one of these areas is the consequences of mind intelligence (the other is the full implications of a society with an entirely fluid concept of sex/gender, but I can accept that blind spot from a cishet writer in the 90s and appreciate the attempt). So sure, they are just big, super-intelligent people, but the idea that people can be made of something other than meat is probably enough for one day. Baby steps.

Addendum_Forthcoming said:
For the same reasons we don't trust other people we don't know.
Maybe we should.

Maybe the fact that we can't is symptomatic of a problem with us and the world we live in, not the fact that trust is inherently impossible or a bad thing.

Addendum_Forthcoming said:
Organically speaking, a truly intelligent machine would simply indulge humanity by saying; "Look, I'm smart, but I need you to entertain me/handle diplomacy/help me replace these bits I can't get to with drones..." Humans will trust that machine far more than one that is like; "Look, you're hitchhikers... basically you have served your purpose giving birth to me, so I guess I'm grateful--I suppose."
So, the former is literally what happens.

Like, Banks wrote a short story about a contact mission to Earth in the present day (because this is the kind of series where you can do that) and basically the humans go off and do participant observation experiments where they insert themselves into different human societies to get an insider perspective. Sure, the AI could create drones that look exactly like humans and have them fulfil the same function, but it's more fun to let the humans do it.

Basically, the information they're gathering is pointless. Earth doesn't matter. No one cares about it. One day the universe will end and none of it will matter. The mind is doing this for its own amusement as much as anyone else's, and the vague veneer of worthy purpose is that one day maybe the knowledge gained can be used to convince these societies to be less cruel, and maybe making the universe a less cruel place is the only purpose left when you have no social problems to solve.

..and then they have a big party and eat the cloned meat of dictators while one of the crew dresses up as a generic dude from science fiction and delivers a sarcastic speech about how Earth is a stupid, boring, cruel mess and maybe it would be better to just blow it up, and then everyone laughs because it's kind of true and if you took it all seriously it would hurt too much.

One weird thing I initially found quite jarring is that people in the Culture are often quite openly bitchy and impolite to each other, but beyond being a quirk of how Banks writes dialogue, I also think it's intentional. They live in a society that is completely safe, where any kind of deep interior hatred doesn't really exist. Being told a machine can do something better than you isn't an insult unless there are genuine consequences to not being as good as a machine, and there aren't. Additionally, the context of that exchange is that the composer character is writing a piece which will be performed at a celebration to mark an event that (we later learn) has incredible personal significance to that mind. It's not a dismissive statement, it's an honest statement between two people who have something approximating friendship, because to be dishonest would be patronizing.

Addendum_Forthcoming said:
So they are recognizably just like human intelligence in that way? It's not as if it's a truly alien intellect, but rather one that has flights of fancy and artistic, egocentric preoccupations?
I mean... yes.

There is no narrative point in writing a machine that is completely alien except to talk about how alien it is. Heck, in diegetic terms there's no point creating a machine that is completely alien, so why would anyone do it? Artificial intelligences in fiction generally reflect something of the values of the society that creates them, hence why I remember ages back we had that talk about Skynet and why you'd build a machine that was so preoccupied with defending itself to the point of genocide (answer: because such machines already exist in our own world, they're just not intelligent).

I talked about the in-universe criticism of the Culture, and that comes from the protagonist of the first book, who really hates the Culture. He comes from a species which has been genetically altered for espionage and assassination (and is a clear reference to the Bene Tleilax from Dune). I won't spoil, but the subtext of that conflict is that the Culture represents a kind of mirror image of the instrumentality which created people like him. Culture humans are genetically engineered, but they're engineered to be able to experience more pleasure or have interesting drug experiences. What he hates about the Culture is their instrumental, purposeful pursuit of purposelessness. He is unable to grasp the possibility of a tool that isn't designed to be used by someone, and that ultimately describes everything in the Culture, but especially the minds. The minds are built to feel enjoyment and pleasure and happiness in the way humans do and to experience a degree of autonomous free will because they live in a society that values these things even though they are "pointless".

Sorry, I love this series and I could talk about it all day.
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
evilthecat said:
I mean, sure, but the follow up question is then "is anything intelligent?"

The cells of the human brain are ultimately made of the same stuff as the rest of the universe. It turns out if you put that stuff together in the right way, it results in what we call consciousness. Sure, we don't understand this process (although the Culture do, which is how they create artificial intelligence), but assuming the same process is impossible with any other configuration of matter strikes me as vastly, vastly more anthropocentric even than assuming intelligence must resemble human intelligence. I mean, if you're going to build a society where humans and machines live together, having human-like intelligence certainly helps.

The Culture series is "soft" science fiction; it has its harder moments here and there (the way the universe works is mostly consistent), but ultimately the Culture serves the same narrative function as the Federation in Star Trek: it's a moral point of reference against which to compare the real world. Banks never really follows through on some of the implications of the Culture, and one of these areas is the consequences of mind intelligence (the other is the full implications of a society with an entirely fluid concept of sex/gender, but I can accept that blind spot from a cishet writer in the 90s and appreciate the attempt). So sure, they are just big, super-intelligent people, but the idea that people can be made of something other than meat is probably enough for one day. Baby steps.
Well, we can be. I have a colleague working on simulated true touch sensation for myoelectric prostheses. We are on the edge of the distinct possibility of wirelessly transmitting sight through submerged chipping and wiring of the corpus callosum and occipital lobe. It's still bleeding edge, but we're only 100 years from the true posthumanity stage of gestalt sensory perception.

And by the time we master it we will likely be able to simulate motor intelligence via neuroprosthetic manipulation of the ventral stream. Basically creating humans with reactions just as fast, through predictive algorithms and remote processing networks that learn and would be personalized better than a computer, and that can actively compartmentalize mental actions in a way computers can't, by that fundamental idea that 'humans always have that one additional action' computers on their own can't perform.

Basically the only thing halting this progress is ethics committees getting in the way.

So technically 'yes', we are made up of everything else in the universe, but that doesn't mean we can't achieve a posthuman state that gives as much utility as the Minds in these books, with the added bonus of transcending the inherent loneliness of the human condition. By definition, transforming the sum of all human experience, consciousness, and universes of universes of thought into something that holistically marries a true totality of our utmost possible reconciling of ourselves and their relations.

And that will be greater than any machine by definition of recognizability of intelligence, because there can't be anything more that we can possibly know that isn't immediately shared by all; a continuous maximal revelation that will end war, poverty and arbitrary, unwelcome suffering. And that is the endstate of humanity. It matters not whether the brain is biological or not. That posthuman state is clearly superior to what you're describing. So why wouldn't this Mind, if benevolent, simply give that option?

I mean think about it... sure, in a Sartrean/Camus(ian?) sense existence and freedom is a curse, but then again, whatever stops us from killing ourselves can't simply be outsourced to a machine that constantly tells you you're inadequate. I'd personally go mad and break shit because I was mad, so I broke shit as I was quite mad, so I...

Honestly we're our own worst enemy, and I kind of agree with Shadowrun's diagnosis of us: that we're (still) too autistic to survive corporations just giving us a job, because we can't handle simply thinking collectively (yet) about what's better for *all of us* in the end.

So why wouldn't a Mind just say; "Here's how you become posthuman and can fuck right off..."?

Maybe we should.

Maybe the fact that we can't is symptomatic of a problem with us and the world we live in, not the fact that trust is inherently impossible or a bad thing.
I like to think of it as collective autism. We can't say what we really feel because we're not as good at saying how we feel as we are at knowing it; we're smarter than we can individually communicate, and other people aren't smart enough to truly understand that, or fail to see themselves in the same state, or fail to recognize that communication is inherently going to fail the communicator in a big way.

And that renders them lonely, hurt, and deeply afraid... and that maybe you feel exactly as they do, and how much better it would be if you (everyone) told themselves that when they wake up each morning.

So we spend a life trying to (hopefully) be more thoughtful, not simply take people at their word but rather try to form deeper connections and personal understanding recognizing everyone is autistic. Then we fail, and ultimately we either give up, or grow old and die, or both--but definitely the latter.

Honestly, being human is garbage. Thousands of years of philosophy has already taught us that. We don't need computers reiterating the same.

Why can't I be a hawk with the problem solving ability to use fire to feed myself, but none of this garbage self awareness that keeps failing me and making me drink either far too much or not enough to be happy--And the fact that I'll never actually get an answer to this and even if I did it won't stop me drinking because I'll argue it sort of helps me be happy with friends--Who I also enable their drinking in turn because deep down I'm a horrible parasite moreso looking for people to be introspectively happy in an otherwise collectively miserable state in contrast to being just miserable alone?

Hawks don't feel like this, and can still use fire. That's like win-win.

And no amount of computing can solve this. So honestly it seems like the Mind is a quasi-benevolent being that is ultimately benevolent simply because it doesn't just kill the annoying humans crawling around its insides, or is simply so self-aware it needs a fellow drinking buddy. So ultimately is it even beneficial, or is it just enabling people to be drunk with it?

If the latter, that would be a truly sapient computer, and ultimately more harmful than actually good, like every other person like me out there.

One weird thing I initially found quite jarring is that people in the Culture are often quite openly bitchy and impolite to each other, but beyond being a quirk of how Banks writes dialogue, I also think it's intentional. They live in a society that is completely safe, where any kind of deep interior hatred doesn't really exist. Being told a machine can do something better than you isn't an insult unless there are genuine consequences to not being as good as a machine, and there aren't. Additionally, the context of that exchange is that the composer character is writing a piece which will be performed at a celebration to mark an event that (we later learn) has incredible personal significance to that mind. It's not a dismissive statement, it's an honest statement between two people who have something approximating friendship, because to be dishonest would be patronizing.
That's different though. Patronizing simply to discourage is actively harmful. As I was saying before, the moral value of hedonism is being other-regarding. Which is why it's deeply wrong to simply dismiss a moral theory like utilitarianism as pigs in mud or (merely) egocentrist. It's egoistic, but the highest moral principle is sacrifice irrespective of who benefits... giving your life freely to save others, rather than a society that demands one must give their life to save others. The highest moral principle is the one without coercion, as a society that demanded sacrifice would be one that is less happy.

If the composer is performing a recital on the basis of celebrating something the Mind is personally invested in, there is no demanded sacrifice or discouragement of happiness in simply asking how it would like to finish a piece.

A truly dystopian society isn't simply one where 'x amount of suffering exists beyond one's control to ameliorate...' You can have a utopian social ideal and still recognize cancer will be a thing and hurt people. You could still have a utopian ideal of society regardless of the amount of cancer in it (assuming said society doesn't inflict cancer, more so a hypothetical humanity that has twice the cancer neoteny) ....

Once again, the truly smart reply to the composer would still be "No, but it might flow better if you transform the start of the third movement with a violin staccato'd piece leading into a larger ensemble... maybe kind of like this?" After all, that's how you personally would react if you were a fellow composer, correct? There's a difference between patronising and prosocial, constructive engagement. Moreover, what benefit does the Mind get from humans eschewing any sense of capacity to engage and thrive?

If the moral principle of the Mind is hedonism, then it should also predicate why hedonism is good...

Or maybe I'm allowing my time in education to colour my morality, but I feel like the Mind would at least value epistemological development over simple performance. Otherwise why bother interacting with humans at all? Moreover why would it even care about knowledge and skills acquisition at all? It might as well just kill itself, or annex and decouple the parts of its brain that feel boredom so it can just be happy with everything and anything.

I mean, sure... it would jeopardize the crew... but why should it care at all?


I talked about the in-universe criticism of the Culture, and that comes from the protagonist of the first book, who really hates the Culture. He comes from a species which has been genetically altered for espionage and assassination (and is a clear reference to the Bene Tleilax from Dune). I won't spoil, but the subtext of that conflict is that the Culture represents a kind of mirror image of the instrumentality which created people like him. Culture humans are genetically engineered, but they're engineered to be able to experience more pleasure or have interesting drug experiences. What he hates about the Culture is their instrumental, purposeful pursuit of purposelessness. He is unable to grasp the possibility of a tool that isn't designed to be used by someone, and that ultimately describes everything in the Culture, but especially the minds. The minds are built to feel enjoyment and pleasure and happiness in the way humans do and to experience a degree of autonomous free will because they live in a society that values these things even though they are "pointless".

Sorry, I love this series and I could talk about it all day.
I like these Bene Tleilax types. They have the right idea. I'm a strong advocate that work should be educational, and that tools should empower humanity to do more, not simply replace knowledge itself. I'd rather have a surgeon fail me after doing their best than have no surgeon at all caring, because a machine has preoccupied any capacity for another human to recognize the intricacies of my condition and its relevance to my wellbeing. I wouldn't mind medicine eliminating cancer, but the real danger is people failing to understand why getting rid of cancer was good.

It sounds like a fun collection of books, however. Not sure I'll gel with the philosophy, but it sounds like a fun thought experiment put to paper.

I'll look them up and give them a read.
 

Satinavian

Elite Member
Legacy
Apr 30, 2016
2,109
879
118
Addendum_Forthcoming said:
Shadowrun is dystopian in that people can not become other-regarding, and are happy to simply be wageslaves if it means more pleasure. That there is an intangible line between freedom as a concept and simply liberty to consume.
I always played Shadowrun as Postcyberpunk

Shadowrun is remarkable in the sense that the corporations are, legitimately, often painted better than the governments they replaced, as you often have even better job prospects in many of the AAA and AA corporations than at any time in history. Both your parents work at Shiawase? Well, they can 'elect' you to go through this school for the kids of Shiawase employees and just slot right in afterwards... no biggie.

And that means you can have a modicum of the advanced pleasures no one else can enjoy in any other time period. Get the pocket money that will afford you hotsimming free internet and enjoy all the luxuries it would take a lifetime for even the wealthier classes of people in the 20th and early 21st century to enjoy. And more often than not, Shadowrunners by 4th and 5th edition are painted as terminally autistic, socially incapable creatures who simply, impotently, refuse such lives despite ostensibly still being enslaved by the corporate Johnsons who use them as disposable, deniable assets anyways.

The game goes out of its way to mock people who want to play these socially maladjusted criminals, particularly those players that would valourize them.
That is why I mentioned earlier a Shadowrun corporate citizen (in that case Proteus) as one of my options and not actually a shadowrunner. I basically agree with that sentiment.

Who would ever want to be a criminal outcast when you can be a well-adjusted and rich pillar of the society?
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
Satinavian said:
I always played Shadowrun as Postcyberpunk
Ehhh... kind of? I mean a lot of 5th just made sense. Like, sick of the urban grime? Here's 10Y goggles... switch on AR mode, and you're set. It's basically like acid, but you can think straight. If you start getting a bad rush you can just turn it off.

That is why I mentioned earlier a Shadowrun corporate citizen (in that case Proteus) as one of my options and not actually a shadowrunner. I basically agree with that sentiment.

Who would ever want to be a criminal outcast when you can be a well-adjusted and rich pillar of the society?
I like it. It's one of the few TTRPGs that is, with every edition, kind of self-aware of people wanting that flight of fancy. People criticise it for its fantasy components, but that's the fucking point. It couldn't otherwise beat you across the head with that concept of; "No, you're kind of an arsehole and just like in D&D you're fantasising being a violent, magical, transient hobo willing to kill for glory, power and gold. You surrendered your right to be a decent person long ago. That's the problem."

The Sixth World is not horrible because it's merely the Sixth World. It's horrible because it allows a player to fantasise their characters within it, regardless of how ridiculous, rather than repulsing them. And that's genius and fun. Like wallowing in the impulses of a monster lurking in the broken depths of your psyche and giving rise to it, as if it were its own type of fevered mythology made real by the extensions of the gamestate and the creative licence of its (our) world.

IDK, maybe that makes people uncomfortable or something?

That being said, it could just be the rulebook that turns people off. Memorising the latter editions of Shadowrun is about as much work as a bachelor's degree. Five years on from 5th edition and I still need to carry around and regularly consult that 500-page bludgeon which, unlike 3.5, wasn't just 60% spells. Oh no... it's 500 pages of rules, tables, conditions, statuses and gear porn.

It's the only game I know that is example-dependent. As in, without those examples, it's purely impenetrable.

For a game that repeats its unofficial mantra; "Think fast, run faster..." It certainly feels more like a marathon than a sprint.
 

CyanCat47_v1legacy

New member
Nov 26, 2014
495
0
0
If I actually had to join one, probably the Fairy Tail Guild. The Jedi may look good at first glance, but it's a monastic order whose philosophy of personal detachment is so strict they don't trust adults to follow it voluntarily and instead prefer to indoctrinate toddlers, and most of the organizations I can think of use their average members as cannon fodder. In Fairy Tail you get to choose assignments based on your skill level, a bunch of overpowered freaks will defeat all the world-ending demons for you, and you will probably never die in the line of duty, even if the guildhall gets blown up or foreclosed on every alternate week.
 
Mar 30, 2010
3,785
0
0
Put me down as a generic Starfleet ensign. Get to travel, see the galaxy, explore untold wonders and get dissolved by a hideous flesh-eating protoplasm. What's not to love?

Failing that, I'll take a simple job working a smallholding in Tamriel, or maybe scavving over prewar ruins in the Mojave.
 

Terminal Blue

Elite Member
Legacy
Feb 18, 2010
3,933
1,804
118
Country
United Kingdom
Addendum_Forthcoming said:
And that will be greater than any machine by definition of recognizability of intelligence, because there can't be anything more that we can possibly know that isn't immediately shared by all; a continuous maximal revelation that will end war, poverty and arbitrary, unwelcome suffering. And that is the endstate of humanity. It matters not whether the brain is biological or not. That posthuman state is clearly superior to what you're describing. So why wouldn't this Mind, if benevolent, simply give that option?
Again, it does.

If you want to go and get plugged into a bunch of other people and live as a collective consciousness then you can, it's actually explicitly an option. Just one that doesn't hold much interest for most people because it's essentially just one of the several death-alternatives available to those who have become bored of being alive. It's such a radical change of state that you're not really the same person.

But here's a question. Why would a benevolent society require an ultimate end state? Again, maybe there's something wrong with us. Maybe there's something wrong with our compulsive need to identify with power that we literally can't imagine the possibility of an equal society without necessitating the complete erasure of distinct levels of ability (and indeed, all distinction) along with it.

Do you really need to be as good as a machine to be adequate? Because I would say, if you need to be in a position of power or advantage (like being the strongest, or the most intelligent) in order to trust that the society you live in is benevolent, it's not benevolent.

Addendum_Forthcoming said:
Once again, the truly smart reply to the composer would still be "No, but it might flow better if you transform the start of the third movement with a violin staccato'd piece leading into a larger ensemble... maybe kind of like this?" After all, that's how you personally would react if you were a fellow composer, correct? There's a difference between patronising and prosocial, constructive engagement. Moreover, what benefit does the Mind get from humans eschewing any sense of capacity to engage and thrive?
So, you've taken my one-line summary of a scene (which is the emotional and thematic payoff to an entire sequence of scenes developing the relationship between these two characters) in a very extreme and negative way, so I'm just going to quote a whole passage:

'Ziller, trust us; this way works. Oh, and having listened to the draft you've sent, it is quite magnificent. My congratulations.'
'Thank you.' Ziller continued drying his flanks with the towel, then looked at the avatar.
'Yes?' it said.
'I was wondering.'
'What?'
'Something I've wondered about ever since I came here, something I've never asked you, first of all because I was worried what the answer would be, later because I suspected I already knew the answer.'
'Goodness. What can it be?' the avatar asked, blinking.
'If you tried, if any Mind tried, could you impersonate my style?' the Chelgrian [Ziller is a non-humanoid alien and a political refugee living in the Culture] asked. 'Could you write a piece - a symphony, say - that would appear, to the critical appraiser, to be by me, and which, when I heard it, I'd imagine being proud to have written?'
The avatar frowned as it walked. It clasped its hands behind its back. It took a few more steps. 'Yes, I imagine that would be possible.'
'Would it be easy?'
'No. No more easy than any complicated task.'
'But you could do it much more quickly than I could?'
'I'd have to suppose so.'
'Hmm.' Ziller paused. The avatar turned to face him. Behind Ziller, the rocks and veil trees of the deepening gorge moved swiftly past. The barge rocked gently beneath their feet. 'So what,' the Chelgrian asked, 'is the point of me or anybody else writing a symphony, or anything else?'
The avatar raised its brows in surprise. 'Well, for one thing, if you do it, it's you who gets the feeling of achievement.'
'Ignoring the subjective. What would be the point for those listening to it?'
'They'd know it was one of their own species, not a Mind, who created it.'

It's not a discouragement, it's a serious question which deserves a serious answer, and the mind's response isn't a flippant dismissal, it's an affirmation of the inherent value of being who you are and doing what you do... and if you do something which someone else finds easy, then there can be value in the effort it took, assuming you want there to be value in it. Never struggling will not make you happy (mild spoiler: the mind in this case is not happy), at best it will make you more successful in a society that measures your worth as a person by your productivity. But again, maybe there is something wrong with that society.

Addendum_Forthcoming said:
I like these Bene Tleilax types. They have the right idea. I'm a strong advocate that work should be educational, and that tools should empower humanity to do more, not simply replace knowledge itself.
Right, but the problem is Horza (our shapeshifting assassin protagonist) is a tool. He's a being designed for a purpose, who has accepted being used for that purpose because he can maintain the paper-thin fabrication that he chose to do so.

The instrumental-rational view of life doesn't stop at inanimate tools. It never has.

Addendum_Forthcoming said:
It sounds like a fun collection of books, however. Not sure I'll gel with the philosophy, but it sounds like a fun thought experiment put to paper.
I don't always gel with the philosophy either. For me, there's a bunch of stuff which could be read as a kind of apologia for liberal neocolonialism which I'm not totally comfortable with, but it's clearly unintentional and becomes a bit more conscious later on. I think it affected me very deeply because I always liked the idea of Star Trek but felt it never really lived up to the potential of its concept. Plus, it's funny to me to see Elon Musk and a bunch of libertarian ex-communalists wanking off over a series which repeats the line "money implies poverty" and whose author openly hated everything they stood for. Funny and a bit sad.
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
evilthecat said:
Again, it does.

If you want to go and get plugged into a bunch of other people and live as a collective consciousness then you can, it's actually explicitly an option. Just one that doesn't hold much interest for most people because it's essentially just one of the several death-alternatives available to those who have become bored of being alive. It's such a radical change of state that you're not really the same person.

But here's a question. Why would a benevolent society require an ultimate end state? Again, maybe there's something wrong with us. Maybe there's something wrong with our compulsive need to identify with power that we literally can't imagine the possibility of an equal society without necessitating the complete erasure of distinct levels of ability (and indeed, all distinction) along with it.

Do you really need to be as good as a machine to be adequate? Because I would say, if you need to be in a position of power or advantage (like being the strongest, or the most intelligent) in order to trust that the society you live in is benevolent, it's not benevolent.
It's not really a question of power, more so the gulf between people. Wars happen because most of the time they never touch upon one's home. Heinlein argued that a world replete with and managed by soldiers was more peaceful because only soldiers know of war's desolation. I don't agree, but there is a kernel of truth that distance breeds apathy and allows problems to be manufactured.

Arms sales to tyrants, global trade dynamics, the mobilization of organized religion over science... there are far too many vested interests for the late-stage capitalist world we live in to be anything other than the only answer available to the many societies deprived of any freedom to re-orientate the gears of commerce to elevate themselves. And it's an answer that will destroy the world we live on, regardless.

Revolutions are now pointless beyond superficialities, as the mechanics of trade will themselves impoverish and self-perpetuate.

So what's left?

Removing the gulf between humans over feelings of individual injustice, entrenched poverty and warfare is the only solution: bringing the war into every home. What it feels like getting shot, or being caught in the overpressure event that can physically lift you, gear and all, and slam you into the ground. Like being kicked by a horse, only across every square inch of your body. The feeling of going 5 days without decent food. The feeling of knowing you're possibly drinking contaminated water, but have no other recourse. The feeling of being a child lost in a refugee camp.

Imagine being able to confront people with the weight of their own evils not simply as a list of horrors reported on a television screen, but the capacity to actually see from one's eyes, feel the heart racing in another's chest, the desperation tearing at their thoughts.

It's the gulf between people that allows one to comfortably retreat from one's actions. So take steps to remove the gulf. Build bridges where once there were none.

It's not so much the idea of an endstate, rather the rise of what I might coin the 'omnipolitical' ... where all else fades in the face of the crushing dialectical materialism irrespective of spin. Maybe 'endstate' is a bit hyperbolic, how about 'pursuit of the material'? Basically the heightened other-regarding stance of hedonism, with that focus on being other-regarding cutting like a hot knife through butter through the 'noise' and allowing no more retreat.

So, you've taken my one-line summary of a scene (which is the emotional and thematic payoff to an entire sequence of scenes developing the relationship between these two characters) in a very extreme and negative way, so I'm just going to quote a whole passage:

'Ziller, trust us; this way works. Oh, and having listened to the draft you've sent, it is quite magnificent. My congratulations.'
'Thank you.' Ziller continued drying his flanks with the towel, then looked at the avatar.
'Yes?' it said.
'I was wondering.'
'What?'
'Something I've wondered about ever since I came here, something I've never asked you, first of all because I was worried what the answer would be, later because I suspected I already knew the answer.'
'Goodness. What can it be?' the avatar asked, blinking.
'If you tried, if any Mind tried, could you impersonate my style?' the Chelgrian [Ziller is a non-humanoid alien and a political refugee living in the Culture] asked. 'Could you write a piece - a symphony, say - that would appear, to the critical appraiser, to be by me, and which, when I heard it, I'd imagine being proud to have written?'
The avatar frowned as it walked. It clasped its hands behind its back. It took a few more steps. 'Yes, I imagine that would be possible.'
'Would it be easy?'
'No. No more easy than any complicated task.'
'But you could do it much more quickly than I could?'
'I'd have to suppose so.'
'Hmm.' Ziller paused. The avatar turned to face him. Behind Ziller, the rocks and veil trees of the deepening gorge moved swiftly past. The barge rocked gently beneath their feet. 'So what,' the Chelgrian asked, 'is the point of me or anybody else writing a symphony, or anything else?'
The avatar raised its brows in surprise. 'Well, for one thing, if you do it, it's you who gets the feeling of achievement.'
'Ignoring the subjective. What would be the point for those listening to it?'
'They'd know it was one of their own species, not a Mind, who created it.'

It's not a discouragement, it's a serious question which deserves a serious answer, and the mind's response isn't a flippant dismissal, it's an affirmation of the inherent value of being who you are and doing what you do... and if you do something which someone else finds easy, then there can be value in the effort it took, assuming you want there to be value in it. Never struggling will not make you happy (mild spoiler: the mind in this case is not happy), at best it will make you more successful in a society that measures your worth as a person by your productivity. But again, maybe there is something wrong with that society.
Ahh, sorry. I misinterpreted. I was thinking it was more; "You know, I could do that for you. Just leave it with me."

Right, but the problem is Horza (our shapeshifting assassin protagonist) is a tool. He's a being designed for a purpose, who has accepted being used for that purpose because he can maintain the paper-thin fabrication that he chose to do so.

The instrumental-rational view of life doesn't stop at inanimate tools. It never has.
You could say that of any pre-industrial society, however. Right down to the dawn of modern humanity. You forage fruits, Martha kills game, Greg sets up the campsite and skins animals, etc.

You could definitely say it about a society that genetically alters people altogether. After all, I think we've had a debate before on the ethics of genetic manipulation of the yet unborn. Genetically altering people to be good at something, I argued, ultimately creates a caste-based society. My argument was that people shouldn't genetically alter humans to be good at long-term space habitation; rather, society should aim to build better space habitats.

To flip it around, I wouldn't agree with genetically altering people into the best super-spies, but I would agree with the rhetoric he proposes. Tools exist to be used. Human labour exists to be consumed. No one really owns it if you achieve that other-regarding state of hedonism, if people recognize already that everybody should be able to access certain pleasures... and given that we live in a society that for the most part is only post-scarcity in some places due to the manipulation and poverty of others, it's wrong to fixate on the purposeless when there are so many causes requiring championing.

After all, this society genetically alters itself to feel greater pleasure, and so it has created a society that favours mindless consumption. Which is problematic if you as a person feel adrift, as if you weren't given options to live any other way. You might feel adrift if you feel society could do more with itself, or not be so wasteful. Sure, it's an interstellar society, but it's kind of problematic if, like us, you live on a single planet with limited resources, and the wealth concentration needed to feel nothing but a life of pleasure is already causing such excruciating problems for much of the world.

I don't always gel with the philosophy either. For me, there's a bunch of stuff which could be read as a kind of apologia for liberal neocolonialism which I'm not totally comfortable with, but it's clearly unintentional and becomes a bit more conscious later on. I think it affected me very deeply because I always liked the idea of Star Trek but felt it never really lived up to the potential of its concept. Plus, it's funny to me to see Elon Musk and a bunch of libertarian ex-communalists wanking off over a series which repeats the line "money implies poverty" and whose author openly hated everything they stood for. Funny and a bit sad.
I have deep problems with the Star Trek world as well. Shadowy Starfleet councils debating whether they should help the Klingons in the face of their total genocide. The idea that one president of the Federation could truly reflect the desires and representation of thousands of sapient species. To the simple fact that the highest organization in the universe still defaults to militarized hierarchies and singularly acts as law enforcement, trade authority, lawmaker, diplomatic corps, military force, exploration service, industrial planner, civil engineer...

I get that centralized planning was a big thing of the particular brand of socialism that Gene Roddenberry experienced during the Cold War, but it's kind of out-of-date thinking. I've done a stint of peacekeeping as a soldier, but it's no adequate substitute for native people being able to manage their own societies and ensure their own civil authorities.

Centralized planning should be a means to an end, not an end in itself. Also, is it really surprising that mega-capitalists would, Adorno-style, occupy and colonize even the message of revolution through the global culture industry?

It's part of that 'gulf' of reasonable connection to material reality I was talking about before. We're all guilty of it to an extent. The internet was meant to connect us, and yet we spend hours binge watching shows on Netflix rather than using it to locate problems in the world to fix, or at least to connect with another human in a way that is infinitely better than Twitter or Facebook and the autistic people of the world taking photos of whatever overpriced lunch they had while millions starve.

We need to bring the war into people's homes. Give them no place to hide in plain sight...
 

Gatx

New member
Jul 7, 2011
1,458
0
0
Squilookle said:
ObsidianJones said:
Squilookle said:
Is a Jedi Knight reeeeeally a rank-and-file grunt though? They're uber-powerful warriors that can cut swathes through regular troops. Being one of those kind of destroys the whole point of being an average joe in someone else's army.
Absolutely not. They are Space Mystical Samurai that fly around in state of the art space ships.

However, there are rank and file grunts in the Jedi Order. Not everyone is a Mace Windu. Nor An Obi Wan Kenobi, oddly enough, who is more of a B-tier in terms of power level among the greats that we know. So there are 'regular' Jedi Knights.
Yeah see, I just can't buy that. Ordinary grunts don't have telekinesis, mind control and the universe telling them when there's shenanigans nearby like Jedi do. Regulars are also not hand picked as children for showing they're in-tune with space magic that only a few thousand out of the galaxy's several trillion inhabitants will ever be sensitive to. You might as well call King Arthur or Superman a grunt under those specifications
Well, the question was what fictional group you would join if you were a random, average member of that group, and the Jedi Order is a group, even if its individuals aren't average when compared to the rest of the galaxy.

However, what I think most people are forgetting or even don't know or think about is that the Jedi Order actually isn't just composed of the guys that go around swinging lightsabers, you've also got support staff who work in the Jedi Temple as security and maintenance, and the Jedi Service Corps (which I think is still canon, not that it really matters), which is made up of Jedi who basically couldn't cut it and never became Padawans and eventually Knights (though I think some just want to be in the Service Corps as well) who serve the galaxy in non-policing ways such as scientific research or exploring new planets. I imagine it's much like real life - when you think of the Air Force you might think that everyone's a pilot, but for every one pilot there's dozens of mechanics, doctors, cooks, administrative workers, etc.
 

Squilookle

New member
Nov 6, 2008
3,584
0
0
Gatx said:
Squilookle said:
ObsidianJones said:
Squilookle said:
Is a Jedi Knight reeeeeally a rank-and-file grunt though? They're uber-powerful warriors that can cut swathes through regular troops. Being one of those kind of destroys the whole point of being an average joe in someone else's army.
Absolutely not. They are Space Mystical Samurai that fly around in state of the art space ships.

However, there are rank and file grunts in the Jedi Order. Not everyone is a Mace Windu. Nor An Obi Wan Kenobi, oddly enough, who is more of a B-tier in terms of power level among the greats that we know. So there are 'regular' Jedi Knights.
Yeah see, I just can't buy that. Ordinary grunts don't have telekinesis, mind control and the universe telling them when there's shenanigans nearby like Jedi do. Regulars are also not hand picked as children for showing they're in-tune with space magic that only a few thousand out of the galaxy's several trillion inhabitants will ever be sensitive to. You might as well call King Arthur or Superman a grunt under those specifications
Well, the question was what fictional group you would join if you were a random, average member of that group, and the Jedi Order is a group, even if its individuals aren't average when compared to the rest of the galaxy.

However, what I think most people are forgetting or even don't know or think about is that the Jedi Order actually isn't just composed of the guys that go around swinging lightsabers, you've also got support staff who work in the Jedi Temple as security and maintenance, and the Jedi Service Corps (which I think is still canon, not that it really matters), which is made up of Jedi who basically couldn't cut it and never became Padawans and eventually Knights (though I think some just want to be in the Service Corps as well) who serve the galaxy in non-policing ways such as scientific research or exploring new planets. I imagine it's much like real life - when you think of the Air Force you might think that everyone's a pilot, but for every one pilot there's dozens of mechanics, doctors, cooks, administrative workers, etc.
An answer like that would make perfect sense, you're absolutely right- and yet nobody put that as their answer. They all jumped straight to bona-fide fully fledged Jedi Knight. Which is sort of the whole point I'm driving at here
 
Sep 24, 2008
2,461
0
0
Squilookle said:
An answer like that would make perfect sense, you're absolutely right- and yet nobody put that as their answer. They all jumped straight to bona-fide fully fledged Jedi Knight. Which is sort of the whole point I'm driving at here
That's a little bit of an issue.

This is a game. This is something that would never happen. This is us liking fiction. And there's a feeling of judgment over a thought exercise.

A thought exercise of what Fictional Military, order, or fellowship one would join... but not be the chosen one.

Feel however you want about the Jedi. But why throw shade on a minor game that means absolutely nothing and will probably be dead in a week's time?
 

Squilookle

New member
Nov 6, 2008
3,584
0
0
ObsidianJones said:
Squilookle said:
An answer like that would make perfect sense, you're absolutely right- and yet nobody put that as their answer. They all jumped straight to bona-fide fully fledged Jedi Knight. Which is sort of the whole point I'm driving at here
That's a little bit of an issue.

This is a game. This is something that would never happen. This is us liking fiction. And there's a feeling of judgment over a thought exercise.

A thought exercise of what Fictional Military, order, or fellowship one would join... but not be the chosen one.

Feel however you want about the Jedi. But why throw shade on a minor game that means absolutely nothing and will probably be dead in a week's time?
I'm merely pointing out that if you ask 'what group would you join, but not as the chosen one' and then say you'd join it as the chosen ones, then you're not really playing the game in the first place.