Brain Power

funkstar said:
Why?

Why not?

Why not?

True, but it doesn't mean that it transcends mechanics.

You don't know that. You assume it. I don't find it to be a reasonable assumption, without an argument.

Funkstar, I thought that you were serious after the first one or two "why's" but by the time I got to the last one I realized you cannot be.

To not understand how the 'whole is greater than the sum of its parts', someone would have to be either very young, very foolish, or simply wanting to argue for the fun of it.
 
Blue_UK said:
So, Light, do you suppose that there is something other than 'just neurons'? Anything supernatural? If we assume there is no supernatural element then there is no reason to assume that logical construction of the brain cannot be duplicated by another medium.

For example, it does not matter to you that the atoms which make up your brain change (in part) from day to day. Why should it matter that nerve impulses are controlled by voltage gated ion channels and not MOSFET switches?

It's not the material that counts, it's what the system does - and you cannot possibly state that 'emotions' and 'feelings' are anything more than that if you disregard the supernatural.

Current AI is (as you would probably agree) a blind, mechanical 'dumb' computer trick that looks as if it's clever. However, that is the so-called top-down approach. To build AI from artificial neurons (...bottom-up) would create something that works in the same way as a brain.

Sorry, but you are taking much too narrow a view. I wouldn't classify love and hate as supernatural - but do you deny that they exist? I would hardly think so.
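
To make the 'bottom-up' idea Blue_UK mentions concrete: a single artificial neuron is typically modelled as a weighted sum of its inputs pushed through a nonlinearity, loosely mirroring how a biological neuron integrates synaptic input and fires past a threshold. A minimal sketch in Python, with all weights and inputs invented purely for illustration:

```python
import math

def artificial_neuron(inputs, weights, bias):
    """Weighted sum of inputs pushed through a sigmoid 'activation'.

    This is the bottom-up building block: the claim under discussion is that
    wiring enough of these together reproduces what neurons do, regardless of
    whether the switching is done by ion channels or transistors.
    """
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # firing level between 0 and 1

# Illustrative values only: three 'synaptic' inputs and their weights.
print(artificial_neuron([0.5, 1.0, 0.2], [0.8, -0.4, 1.5], bias=-0.1))
```

Wiring many such units together, and letting the weights change with experience, is all the bottom-up approach amounts to.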
 
Light said:
Funkstar, I thought that you were serious after the first one or two "why's" but by the time I got to the last one I realized you cannot be.
No, I am genuinely interested in your reasoning, if you have any.
To not understand how the 'whole is greater than the sum of its parts', someone would have to be either very young, very foolish, or simply wanting to argue for the fun of it.
Yours was a vague statement and I gave a vague reply, so I'll let your rude reply pass. But to clarify: I think you hold an irrational belief that our brains are somehow special and essentially supernatural, and that you use this belief as an argument. The "parts" I allude to are the material neurons and their interactions. Macroscale emergent phenomena do not mean that the simpler microscale understanding is insufficient to replicate them. You claim that even with a perfect simulation of neuron interactions in vast numbers, we cannot get the same functionality as a human brain. Back that up.
 
Here's my unrequested opinion on the matter:

I'm totally with Funkstar on this.
Light, do you think that there's a qualitative difference between organic and inorganic media? I tend to think that a computation is equally valid whether it's performed on an abacus, a computer or a brain.

Also, the phrase "the whole really IS greater than the sum of all its parts" has always sounded rather mystical to me. The whole, by definition, is exactly the sum of its parts. If you find that they don't quite add up, it just means that all of the interactions have not yet been considered.
 
Also, the phrase "the whole really IS greater than the sum of all its parts" has always sounded rather mystical to me.
I'd always interpreted it from a values standpoint, i.e. the value of the whole is greater than the sum of the values of the parts.

Interactions could be the key, as you suggested. If interactions are considered parts (they are usually not), then I suppose the whole would be exactly the sum of the parts. But then it wouldn't even make sense to talk about the sum of the parts, would it?

Anyway.

It's pretty clear that a brain is more than a bunch of neurons.
But an AI would just as clearly be more than a bunch of simulated neurons.

If you allow that a suitably connected set of neurons becomes something more (a sentient being), then it seems logical to allow that a suitably connected set of simulated neurons should also become something more (a sentient being).

In both cases, the whole is more than the sum of its parts.
 
Laika said:
Here's my unrequested opinion on the matter:

I'm totally with Funkstar on this.
Light, do you think that there's a qualitative difference between organic and inorganic media? I tend to think that a computation is equally valid whether it's performed on an abacus, a computer or a brain.

Also, the phrase "the whole really IS greater than the sum of all its parts" has always sounded rather mystical to me. The whole, by definition, is exactly the sum of its parts. If you find that they don't quite add up, it just means that all of the interactions have not yet been considered.

Laika, Funkstar, and any others who don't understand the 'sum of the parts' thing: we are NOT talking about simple mathematical addition. Evidently, you've got some way to go before you can understand the principle involved. There's nothing at all mystical or supernatural about it.

And as I've said earlier, there's also nothing mystical or supernatural about all the human emotions like love, hate, fear, ambition, etc. But do you deny that any of them exist?

If you had the time and resources you could build a neuro-net computer the size of the planet Earth. But someone please tell me how you could ever endow it with genuine human emotions? And how could you teach it to have real original thoughts? A simple example would be inventing the wheel with no prior knowledge of it. It takes a human to "connect the dots", if you will, of totally unrelated bits of information. Yes, it would make mathematical computations on the order of tera-tera-teraflops. But that is just numerical calculation and data sorting, which is nowhere near original thought or mimicking human emotions.
 
Pete,

It's pretty clear that a brain is more than a bunch of neurons.
I disagree. I don't think it's at all clear that a brain is more than a bunch of neurons, although a suitably connected set of them certainly does constitute a sentient being.

In my opinion,

A bunch of suitably connected neurons = A sentient being.

a suitably connected set of neurons becomes something more (a sentient being)
This seems like word play.


Light,

You say...
There's nothing at all mystical or supernatural about it.
there's also nothing mystical or supernatural about all the human emotions like love, hate, fear, ambition, etc
...and I agree entirely. It is you who seeks to make a qualitative distinction between such emotions running on biological and artificial hardware, invoking a seemingly mystical separation between the two.
 
Light,

If you had the time and resources you could build a neuro-net computer the size of the planet Earth. But someone please tell me how you could ever endow it with genuine human emotions?
Presumably the same way the brain does it now. Why isn’t this just a matter of reverse engineering? There are only neurons, synapses and hormones. Why do you think we cannot replicate the effects they generate?

And how could you teach it to have real original thoughts?
Again, the same way the brain does it now. A newborn baby has little ability at anything since it has very few synaptic connections. As it flails around, sucks, and feels, it generates stimuli to the brain that force massive numbers of synaptic connections. An artificial brain could possibly take the same path once we understand more of the fundamentals of how the brain operates. I don't see any show-stopping issues here other than a tremendous amount of fundamental research and experimentation. First we need computing power at appropriate levels to help us experiment.
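
To make that concrete, one simple (and admittedly idealised) model of how stimuli "force" connections is Hebbian learning: a connection is strengthened whenever the neurons at both ends are active together. A rough sketch, with the neuron count, learning rate, and stimuli all invented for illustration:

```python
import random

# Hebbian toy model: 'cells that fire together wire together'.
# Ten simulated neurons; synaptic weights start at zero.
N = 10
LEARNING_RATE = 0.1
weights = [[0.0] * N for _ in range(N)]

def random_stimulus():
    """Return 1 for each neuron driven by the current stimulus, else 0."""
    return [random.choice([0, 1]) for _ in range(N)]

for _ in range(1000):                            # repeated 'flailing around'
    activity = random_stimulus()
    for i in range(N):
        for j in range(N):
            if i != j and activity[i] and activity[j]:
                weights[i][j] += LEARNING_RATE   # co-active pair strengthens

# After enough stimulation, frequently co-active pairs end up strongly linked.
print(max(max(row) for row in weights))
```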

A simple example would be inventing the wheel with no prior knowledge of it. It takes a human to "connect the dots", if you will, of totally unrelated bits of information. Yes, it would make mathematical computations on the order of tera-tera-teraflops. But that is just numerical calculation and data sorting, which is nowhere near original thought or mimicking human emotions.
Then clearly we would need to devise different algorithms rather than just use sheer power. The human brain isn't a fast computer. It comprises some 200 billion small and relatively slow microprocessors we call neurons. It is the vast, constantly changing patterns of networks forming its operational mechanism that we have yet to fully comprehend. But that seems to be nothing more than an issue of reverse engineering.
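
As a picture of what "a very large number of small, slow processors" looks like computationally, neuron-level simulations often use something like a leaky integrate-and-fire unit: each neuron accumulates input, leaks a little charge each step, and emits a spike when it crosses a threshold. A minimal sketch, with arbitrary parameters chosen only to show the shape of the computation:

```python
# Leaky integrate-and-fire: each 'neuron' is a trivial processor that
# accumulates input, leaks charge, and spikes past a threshold.
LEAK = 0.9        # fraction of membrane potential kept each step
THRESHOLD = 1.0   # spike when potential exceeds this
RESET = 0.0       # potential after a spike

def step(potentials, inputs):
    """Advance every neuron one time step; return the indices that spiked."""
    spikes = []
    for i, (v, x) in enumerate(zip(potentials, inputs)):
        v = v * LEAK + x
        if v >= THRESHOLD:
            spikes.append(i)
            v = RESET
        potentials[i] = v
    return spikes

# Three neurons receiving constant drive of different strengths.
potentials = [0.0, 0.0, 0.0]
for t in range(20):
    fired = step(potentials, [0.05, 0.2, 0.5])
    if fired:
        print(f"t={t}: neurons {fired} spiked")
```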
 
AB,

First of all, there are no 32 GHz computers now... a great pity. And there will be no 64 GHz in 2006.
Not true. While individual microprocessors are not at that speed, there are chips at that combined speed or very close. What Intel has done is place several microprocessors (cores) on the same chip. We now have 4-core chips, and 8-core chips are imminent. The effect is to have chips operating close to Moore's law, as predicted. But for human brain emulation this is the right direction: the brain doesn't consist of fast microprocessors but of a very large number of small ones that we know as neurons.

We are still on track to have human brain level compute power in a few years.
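
For scale, the usual back-of-the-envelope arithmetic behind claims of "human brain level compute power" looks like this. The neuron count follows the 200 billion figure used in this thread, and the synapse and firing-rate numbers are rough, commonly quoted assumptions rather than measurements:

```python
# Rough, commonly quoted assumptions -- not precise measurements.
NEURONS = 2e11              # ~200 billion neurons (the figure used in this thread)
SYNAPSES_PER_NEURON = 1e3   # order-of-magnitude estimate
MAX_FIRING_RATE_HZ = 1e2    # ~100 spikes per second as a generous upper bound

synaptic_events_per_second = NEURONS * SYNAPSES_PER_NEURON * MAX_FIRING_RATE_HZ
print(f"~{synaptic_events_per_second:.0e} synaptic events per second")  # ~2e16
# If each event costs roughly one arithmetic operation, a machine sustaining
# tens of petaflops is in the same ballpark -- which is why estimates of
# "brain-level" hardware keep landing in the petaflop-to-exaflop range.
```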
 
tab,

How long would it take just to build 200 billion small microprocessors?
I'd still expect we would have fewer but a lot faster. We have a 128-processor (128p) Itanium system in our lab now and we are starting to build a 256p system. That's about 2 billion neurons' worth. I'll need to build a hundred of them.

We don't have any budget for that. But the main problem won't be the compute power but the interconnectivity matrix needed. Constructing a mechanism to dynamically change a highly complex topology rapidly and maintain high bandwidth when needed is going to be a real challenge.

Current SAN technology also needs to go up a notch to help here. That should come as processor speeds increase.
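
On the interconnectivity point: a dense connectivity matrix at that scale is out of the question, so in practice the topology would be held in some sparse structure that can be rewired cheaply. A toy sketch of the idea; the data structure and function names are illustrative choices, not a description of any existing system:

```python
from collections import defaultdict

# Sparse adjacency: only existing synapses are stored, so rewiring is a
# cheap local update rather than touching a huge dense matrix.
synapses = defaultdict(dict)   # synapses[pre][post] = weight

def connect(pre, post, weight):
    synapses[pre][post] = weight

def disconnect(pre, post):
    synapses[pre].pop(post, None)

def rewire(pre, old_post, new_post, weight):
    """Dynamically change topology: move one synapse to a new target."""
    disconnect(pre, old_post)
    connect(pre, new_post, weight)

connect(1, 2, 0.4)
rewire(1, 2, 7, 0.4)
print(dict(synapses[1]))   # {7: 0.4}
```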

So to answer your question of how long - well, right now, if someone would put up the money. And it isn't that much - about $1B at current prices, using current off-the-shelf Intel Itanium processors. Bringing up the 128p system took about a week, so 100 systems in parallel wouldn't take much longer. But the SAN would be the biggest bottleneck.
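
Taking the figures in that post at face value (the mapping of processors to simulated neurons is the poster's own estimate, not an established fact), the arithmetic works out roughly as follows:

```python
# All figures taken from the post above; none are independently verified.
NEURONS_TOTAL = 200e9        # 200 billion neurons, the figure used in this thread
NEURONS_PER_SYSTEM = 2e9     # 256p system ~ 2 billion simulated neurons (per the post)
PROCS_PER_SYSTEM = 256
TOTAL_COST = 1e9             # "about $1B at current prices"

systems_needed = NEURONS_TOTAL / NEURONS_PER_SYSTEM            # ~100 systems
neurons_per_processor = NEURONS_PER_SYSTEM / PROCS_PER_SYSTEM  # ~7.8 million
cost_per_system = TOTAL_COST / systems_needed                  # ~$10 million

print(f"systems needed: {systems_needed:.0f}")
print(f"simulated neurons per processor: {neurons_per_processor:.2e}")
print(f"implied cost per system: ${cost_per_system:,.0f}")
```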
 
He will when he sees an opportunity for profit. For the moment we are using such systems in the BI (Business Intelligence) space, where there is the opportunity for profit. And we are kinda competing with MS anyway.
 
Recreating a human mind in AI form will be only one among many things that will almost certainly be possible. Simulating emotions and creative thinking is certainly going to be a challenge, but once it can be done, an almost infinite realm of new endeavour should open up. Imagine designing new emotions for an artificial mind; new types of experience we can only dream of.
The joy a computer might learn to feel could be like nothing humans are familiar with; the joy of correlation between facts and self-knowledge to a depth we could never be capable of.
Here is a piece of speculation about the sorts of mental landscapes which might emerge after hundreds or thousands of years of AI autoevolution...
Certainly it is science fiction, but it expresses the thought that developing human-level AI is just the start, not the end, of this process.
http://www.orionsarm.com/sophontology/Toposophic_Mental_Abilities.html
 
Light said:
Just hardware, eh? Wow! That's got to go down in the books as the understatement of the century!

Keep in mind that even though I am a scientist by profession (biology, psychology), I still have to say that your expectations for science and technology are unreasonable.

And while you expect great complexity, you still tend toward oversimplifying the problem. While it may someday be possible to precisely mimic the operation of a single neuron and then replicate it as many times as we choose, that still cannot function as a human brain. To use your phrase - THAT would still be just hardware.

It might be well for you to remember the old maxim "the whole is greater than the sum of its parts", and that applies perfectly when speaking of the brain.

A computer is a computer and is just a computer. Regardless of its size, computing power and how much data it can store, it will still be just a computer - a number-cruncher and data analyzer. You cannot program it to develop a consciousness, to feel true emotions and many other human attributes. The brain and its mind are much more complex than any inorganic mechanism. The whole really IS greater than the sum of all its parts.


I would have to agree with Light's take on this argument.


A computer is a computer. I do not believe humans are mere machines; there is a lot more than computing that goes on within us all.

A computer, in my opinion, will never be able to become aware and conscious of itself.

Do you think a robot will ever question why it has been created?

You can put as much software, hardware, and memory into a robot as you want, but that will not result in awareness. Memory doubling will not create a fully functional, free-thinking unit.

You might think I'm a spiritual, non-scientific nut, but I believe we cannot create an equal to a human with tech.
Peace,
 
You question how a robot can ever question why it was created...

How is it that we question this? One thing I believe we should get out of the way: EmptyForceOfChi, do you agree that there is nothing supernatural and nothing mystical about the brain? That it is indeed just a collection of hardware (albeit incredibly complex, organic, neuron-based hardware)? I think the answer to these questions is very important first off.

-AntonK
 
AntonK said:
You question how a robot can ever question why it was created...

How is it that we question this? One thing I believe we should get out of the way: EmptyForceOfChi, do you agree that there is nothing supernatural and nothing mystical about the brain? That it is indeed just a collection of hardware (albeit incredibly complex, organic, neuron-based hardware)? I think the answer to these questions is very important first off.

-AntonK


I believe we are as much spiritual as we are physical. I believe we all emit energy, and are energy.


I am very spiritual but not religious at all. If I were to class myself as anything at all, I would have to say I'm Daoist (Taoist), but I'm not even fully that either. I am a "possibilityist": I believe I know nothing and anything is possible.


Peace,
 
It is interesting that you state that you know nothing and anything is possible, yet you debate a point based upon belief in an unobserved spirituality. Is anything possible, or are you just regurgitating sapien-centric doctrine?

- Kit
 