Poll: will machines ever be more intelligent than people?


emeraldrafael

New member
Jul 17, 2010
8,589
0
0
Redlin5 said:
God I hope not. It's one of my irrational fears that the toaster will start judging me because of all the toast I make...
One can only imagine what our computers will think of us.

OT: I don't think so. If they do, they'll have to be programmed to become so.
 

Flailing Escapist

New member
Apr 13, 2011
1,602
0
0
Redlin5 said:
God I hope not. It's one of my irrational fears that the toaster will start judging me because of all the toast I make...
-"A toaster is just a death ray with a smaller power supply! As soon as I figure out how to tap into the main reactors, I will burn the world!"

OT: If they even do, we'll just have to make super-machines to wipe them out every 50,000 years. It's the only way we will survive. Unless of course the super-machines we make kill all of us instead, but I REALLY doubt that would happen. Oh ho ho ho ho *kneeslap*
 

Astoria

New member
Oct 25, 2010
1,887
0
0
Unless you mean they'll be able to think quicker and about more things at once (in which case they already are), then no. How can we create something smarter than ourselves? How do you even decide what makes something smarter than us?
 

FarleShadow

New member
Oct 31, 2008
432
0
0
SnakeoilSage said:
FarleShadow said:
Of course they are going to get progressively smarter until they are smarter than we are, we're bound by the limits of biology and neurotransmitters. Computer minds would be bound only by the limits of whatever medium they run on (So they'll be crazy-fast).
The most powerful computer right now has less than 1/6th of our brain's memory capacity and performs less than 1/10th of the calculations our brains process every second. I don't have the exact data on me but the math was done recently.

Don't assume that because lifting your arm seems effortless to your consciousness, your brain isn't working through countless calculations to perform the task.

*Bit about coding*

People keep expecting some kind of singularity: that just because a chess program has been programmed with thousands of potential moves to react to human actions, AI will somehow explode out of the Internet. It won't. No amount of information uploaded into a computer or series of computers will give it the capacity to perceive that information as anything but data to be utilized by the humans accessing it. There is no "awareness" examining the information, no computer attempting to process the info for itself.

The fact that we can't even define sentience ourselves proves that it is a state of being that we cannot emulate no matter how advanced our technology is, because our biological brains with millions of years of programming cannot comprehend its meaning yet.
Wow, that was incredibly patronising and insultingly easy to refute.

For one, the comparison between the human mind and a machine breaks down immediately: the fastest computer in the world can perform 260 trillion operations a second, while human minds can only react to a stimulus after 120 milliseconds at best. Our own biological nature limits our maximum speed of thought; machines are not limited in this regard.

Memory falls into a different category: human memories are very efficient at packing things in, but are laughably inaccurate when it comes to decent recall (since we filter input based on our own preconceptions and expectations) and tend to ditch information we don't need very rapidly. Computer memory has none of these benefits or flaws; what is recorded is a near-perfect record.

And yes, while it is true that I do not consciously call on my arm to rise when I need to pick something up, the process isn't as complicated as you make it out to be: spatial awareness, the specific length of my arm, coded instructions to the muscle groups and, finally, engagement of the 'get thing' are all very easy for computers to replicate. Willow Garage's PR2 can perform an identical (if slow) feat with a simple command.
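For a rough sense of the kind of computation involved, here is a minimal sketch of two-link arm inverse kinematics. This is not PR2 code; the link lengths and target point are made-up values for illustration.

```python
# Sketch of the "reach for a thing" calculation: given an arm's two link
# lengths and a target point, work out the joint angles via the law of cosines.
# All numbers here are made up for illustration.
import math

def reach_angles(x, y, l1=0.4, l2=0.3):
    """Return (shoulder, elbow) angles in radians to place the hand at (x, y)."""
    d2 = x * x + y * y
    cos_elbow = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= cos_elbow <= 1.0:
        raise ValueError("target out of reach")
    elbow = math.acos(cos_elbow)  # one of the two possible elbow solutions
    shoulder = math.atan2(y, x) - math.atan2(l2 * math.sin(elbow),
                                             l1 + l2 * math.cos(elbow))
    return shoulder, elbow

print(reach_angles(0.5, 0.2))  # the angles the "muscle groups" would be told to hit
```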

And so, could an AI come about from simply pumping more data into a program? Hardly! You don't get intelligence from a database, nor, I suspect, do you actually know much of what you speak of in this regard. My offer? A large system of genetic algorithms, designed to categorise information and continually improve its accuracy. From that point, I imagine it's only a matter of time before it becomes aware of itself, in a machine-awareness sense. But that's speculation of the wildest sort.
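As a minimal sketch of what such a genetic-algorithm loop might look like (the bit-string "classifiers", the fitness target and the parameters are purely illustrative, not a real design):

```python
# Toy genetic algorithm: a population of candidate "classifiers" (here just
# bit strings) is scored, the most accurate survive, and mutation keeps
# nudging accuracy upward over generations. Everything here is illustrative.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]  # stand-in for "correct" labels

def accuracy(candidate):
    return sum(c == t for c, t in zip(candidate, TARGET)) / len(TARGET)

def mutate(candidate, rate=0.1):
    return [1 - bit if random.random() < rate else bit for bit in candidate]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(50):
    population.sort(key=accuracy, reverse=True)
    survivors = population[:5]  # keep the most accurate candidates
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]

print(accuracy(population[0]))  # creeps toward 1.0 as the generations go by
```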

It is not a fact that we cannot define sentience; it is hard, but still within our understanding given time, and the lack of that definition does not mean that emulating sentience is impossible. It could be that the very first, randomly spawned AI gives us the definition purely in the act of its creation.
 

Waffle_Man

New member
Oct 14, 2010
391
0
0
I'm not going to completely rule out the possibility of future advancement, but computers will have to do a whole lot more than compute with increased speed to be able to process qualitative information.
 

ResonanceSD

Elite Member
Legacy
Dec 14, 2009
4,538
5
43
Well. No. Given that they are programmed by humans, they can't become more intelligent than them. They can think faster, sure, but that's not the same thing.
 

Heronblade

New member
Apr 12, 2011
1,204
0
0
Depending on how you define the term, they already are, by several orders of magnitude.

In terms of adaptive thought comparable to the way we think, yes they certainly can be. As matters stand, the only real impediment to a true artificial intelligence is in terms of the physical technology we have available. Give us a more efficient mode of processing data (using the 32 possible quantum states as opposed to the current binary switches is a popular possibility), or a reasonably available material that is superconductive at or near room temperature, and all that remains is finding a rich sponsor crazy enough to go through with it.

Whether or not we can find a way to teach a growing AI in a manner that doesn't leave it insane, malevolent, incredibly ADHD, or otherwise unusable is another question.
 

NightHawk21

New member
Dec 8, 2010
1,273
0
0
It depends on what you consider to be intelligence. If you want to measure math and computational skills, computers are exceptionally intelligent (more so than any person on earth). If you only want to consider the ability of the machine to learn, you're already seeing the basics of that now in programs that track your searches, preferences and whatnot and adapt. So the answer to that is definitely.
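A minimal sketch of that kind of preference tracking might look like this (the topics and search history are made-up examples):

```python
# Count which topics a user keeps searching for, then rank future suggestions
# by those counts. The "learning" is just frequency counting, but it adapts.
from collections import Counter

search_history = ["toasters", "chess engines", "toasters", "dune", "toasters"]
preferences = Counter(search_history)

def recommend(candidates):
    """Order candidate topics by how often the user has shown interest in them."""
    return sorted(candidates, key=lambda topic: preferences[topic], reverse=True)

print(recommend(["dune", "toasters", "gardening"]))
# ['toasters', 'dune', 'gardening'] -- the program has "adapted" to its user
```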

BTW, this reminded me of Brian Herbert's prequels to Dune. Very good read.
 

Bara_no_Hime

New member
Sep 15, 2010
3,646
0
0
Machines already ARE more intelligent than people.

They just don't have any sense of self, yet, so it doesn't matter.

Edit: Oh, and the ability to learn - already been done. It isn't very complex yet, but that's already changing. Sentience... I guess we'll see soon enough.
 

SnakeoilSage

New member
Sep 20, 2011
1,211
0
0
FarleShadow said:
Everything you just said to prove that computers have superior brain power to humans still relies on one fundamental flaw: they don't think until we tell them to, and then only within the confines of their programming, which we gave them. Even your concept of genetic algorithms depends entirely on its design, and its categorization and increasing accuracy would either be bound by the definitions of its programming or be subject to increasing error due to its inability to adapt beyond its programming. If it didn't confuse itself into some kind of catastrophic failure by basing an organic process on logic-only programming, it wouldn't break beyond the boundaries of what it was built to do, and therefore would never come close to anything resembling adaptive instinct, let alone cognitive thought.

I know it's a crude analogy, but consider chaos theory: the more complex the system, the greater the potential for and extent of the damage that can result from random elements introduced into that system. This is most harmful in computer systems, because of their dependency on their programming and their inability to adapt beyond it, no matter how advanced that system might be.

Do you think computer viruses are things you can define by biological terms? They're "ghosts in the machine," deliberately crafted errors inserted into a system to break that system in controlled ways.

The fact that human beings are capable of the mental processes that we are without suffering that kind of breakdown is just an example of how we can adapt to the information we process, while a computer still cannot even comprehend it.

They're tools. Not slaves, not thinking devices, tools. And as much information as we give them, they will only ever be tools, even as the complexity of the tasks set before them increases, they will never progress beyond "if x, then y."
 

Jetpack Stu

New member
Feb 12, 2012
13
0
0
Learning algorithms already exist; the real issues in creating a smart machine lie in data storage and in using an algorithm to simulate consciousness. Until computer scientists can figure out a theoretical construct for a non-Turing machine, the only kind of artificial intelligence possible will be "dumb AIs" such as Siri.
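For example, a perceptron is one of the oldest learning algorithms: it adjusts its weights from labelled examples instead of being hand-programmed with rules. A minimal sketch, with a purely illustrative dataset (learning the AND function):

```python
# Train a single perceptron on the AND function. The weights are learned from
# the examples, not written in by hand; dataset and parameters are illustrative.
def train_perceptron(samples, epochs=20, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for (x1, x2), label in samples:
            prediction = 1 if w[0] * x1 + w[1] * x2 + b > 0 else 0
            error = label - prediction
            w[0] += lr * error * x1
            w[1] += lr * error * x2
            b += lr * error
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
weights, bias = train_perceptron(data)
print([1 if weights[0] * x1 + weights[1] * x2 + bias > 0 else 0
       for (x1, x2), _ in data])
# [0, 0, 0, 1] -- it has learned AND without being explicitly told the rule
```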
 

Zakarath

New member
Mar 23, 2009
1,244
0
0
Ever is a very, very, long time. I'm sure at some point along that timeline, there will be synthetic/computer AI more intelligent than us. ;)

I'd say that it is a long way off, though.
 

aba1

New member
Mar 18, 2010
3,248
0
0
I predict that machinery will eventually reach a point where most jobs will simply be unnecessary. At that point, society will have to find a way for people to make a living, since most work will be done by machines.
 

Deathmageddon

New member
Nov 1, 2011
432
0
0
The US government sold our surplus crude oil to China, and not a week later, Obama says that there are no "silver bullets" to bring down gas prices. The average electric can opener is already smarter than most people, it seems. Now if we can just rip off Asimov's 3 laws and apply them to literally everything that runs on electricity, we're all set.
 

FalloutJack

Bah weep grah nah neep ninny bom
Nov 20, 2008
15,489
0
0
My current requirement for a computer intelligence test is that if it can spit out something like "Fuck this, I'm going to Vegas." when not even remotely programmed to, then it has a mind of its own. I don't actually think this is going to happen. They're gonna keep pawning off fakes.
 

LilithSlave

New member
Sep 1, 2011
2,462
0
0
That all depends.

We're going to become the machines.

Though in some ways, they're already smarter.
 

Trippy Turtle

Elite Member
May 10, 2010
2,119
2
43
They can't be more intelligent than you make them, really. They can calculate faster, but they will never understand emotion or anything with random outcomes.
 

idarkphoenixi

New member
May 2, 2011
1,492
0
0
It's not such a black and white issue; there are different levels of intelligence. If a robot is able to do complex mathematics and such, does that make it more intelligent? In a way yes, but that's not really what's important.

Humans are able to look at a situation and process it in ways that are impossible to replicate; we think around the obvious logical answer. We are able to emote, convey feelings, sympathise. That is what makes us a higher intelligence.
 

Da Orky Man

Yeah, that's me
Apr 24, 2011
2,107
0
0
Alcamonic said:
Also, give me a talking toaster. "I toast, therefore I am!"

Red Dwarf to the rescue!

Anyway, I have a suspicion we will have at least some form of AI pretty soon. Exponential growth in computer power should eventually lead to computers with processing power similar to a human brain's; then it's just a matter of getting the right code in there.
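As a back-of-the-envelope version of that extrapolation (every figure here is an assumption, not a measurement: a roughly 2012-era 10 petaFLOPS starting point, a much-disputed 1 exaFLOPS brain estimate, and Moore's-law-style doubling every two years):

```python
# How many doublings of computing power until an assumed "brain-equivalent"
# figure is reached? All three numbers below are assumptions for illustration.
import math

current_flops = 1e16          # assumed starting point (~10 petaFLOPS)
brain_flops = 1e18            # assumed, much-disputed brain estimate
doubling_period_years = 2.0   # assumed Moore's-law-style doubling

doublings = math.log2(brain_flops / current_flops)
print(f"{doublings:.1f} doublings ≈ {doublings * doubling_period_years:.0f} years")
# ~6.6 doublings, i.e. roughly 13 years under these assumptions
```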