Ray Kurzweil is probably totally wrong

An interesting NYT article

Some interesting excerpts:

"Our colleague David Linden has compared the evolutionary history of the brain to the task of building a modern car by adding parts to a 1925 Model T that never stops running. As a result, brains differ from computers in many ways, from their highly efficient use of energy to their tremendous adaptability."

"One striking feature of brain tissue is its compactness. In the brain’s wiring, space is at a premium, and is more tightly packed than even the most condensed computer architecture. One cubic centimeter of human brain tissue, which would fill a thimble, contains 50 million neurons; several hundred miles of axons, the wires over which neurons send signals; and close to a trillion (that’s a million million) synapses, the connections between neurons."

"But unlike a computer, connections between neurons can form and break too, a process that continues throughout life and can store even more information because of the potential for creating new paths for activity. Although we’re forced to guess because the neural basis of memory isn’t understood at this level, let’s say that one movable synapse could store one byte (8 bits) of memory. That thimble would then contain 1,000 gigabytes (1 terabyte) of information. A thousand thimblefuls make up a whole brain, giving us a million gigabytes — a petabyte — of information. To put this in perspective, the entire archived contents of the Internet fill just three petabytes.

To address this challenge, Kurzweil invokes Moore’s Law, the principle that for the last four decades, engineers have managed to double the capacity of chips (and hard drives) every year or two. If we imagine that the trend will continue, it’s possible to guess when a single computer the size of a brain could contain a petabyte. That would be about 2025 to 2030, just 15 or 20 years from now.

This projection overlooks the dark, hot underbelly of Moore’s law: power consumption per chip, which has also exploded since 1985. By 2025, the memory of an artificial brain would use nearly a gigawatt of power, the amount currently consumed by all of Washington, D.C. So brute-force escalation of current computer technology would give us an artificial brain that is far too costly to operate.

Compare this with your brain, which uses about 12 watts, an amount that supports not only memory but all your thought processes. This is less than the energy consumed by a typical refrigerator light, and half the typical needs of a laptop computer. Cutting power consumption by half while increasing computing power many times over is a pretty challenging design standard. As smart as we are, in this sense we are all dim bulbs."
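
The article's arithmetic is easy to check if you take its assumptions at face value. A quick back-of-the-envelope version in Python (every number below is the article's own estimate, not a measurement):

```python
import math

# The article's own estimates (guesses, not measurements).
synapses_per_cm3 = 1e12      # "close to a trillion" synapses per cubic centimeter
bytes_per_synapse = 1        # the article's guess: one movable synapse ~ one byte
brain_volume_cm3 = 1000      # "a thousand thimblefuls make up a whole brain"

bytes_per_cm3 = synapses_per_cm3 * bytes_per_synapse
print(bytes_per_cm3 / 1e12, "TB per cubic centimeter")    # ~1 terabyte per thimble

brain_bytes = bytes_per_cm3 * brain_volume_cm3
print(brain_bytes / 1e15, "PB per brain")                 # ~1 petabyte per brain

# Moore's-Law projection: doublings needed to go from a ~1 TB device
# (roughly where consumer storage stood when the article ran) to 1 PB.
doublings = math.log2(1e15 / 1e12)                        # about 10 doublings
print(round(doublings * 1.5), "to", round(doublings * 2.0),
      "years at one doubling every 1.5 to 2 years")

# Power comparison: the article's ~1 GW artificial brain vs. a 12 W real one.
print(f"{1e9 / 12:.1e} times the power of a biological brain")
```

The "15 or 20 years" figure and the roughly eight-orders-of-magnitude power gap both fall straight out of the article's own numbers.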
 
brains differ from computers in many ways, from their highly efficient use of energy to their tremendous adaptability.

What a moronic comment! The comparison should be brain versus information science. That is what computers do!

The brain computes, and so does the computer.

As to whether we will have such computers, or whatever name they end up being called (the way the abacus became the Excel spreadsheet)... that is a no-brainer... at least to people who have the brains to understand the reality!

No, Kurzweil is not wrong...

Sam Wang is a biologist, not an information scientist... he has never designed an integrated circuit or studied information science. It is the same as when dentists try to do brain surgery...

Otherwise, he would be the chief architect of knowledge management for the Department of Defense!

He is also a professor of neuroscience, yet I bet he has never taken a master's-level course in cybernetics. Pathologists fill slots in neuroscience departments and nobody says anything.
 
I do not know if it will be printed, but here is my comment to the NYT:

What we need is an information scientist making this comment, not a pathologist who has no idea about algorithms, decision theory, information science, and control theory.

Otherwise, it is like dentists discussing the art of brain surgery!
 
So you really think we'll see singularities on everyone's desktop in the next 30 years? I guess if Moore's law has held true for this long, it must be true forever!

Besides, Kurzweil is a futurist wanking to his own delusion that he's worth preserving forever. He was wrong last decade, and he's wrong this decade.
 
Moore's Law has already bridged several different paradigms of computer design. There is no reason that computing could not use far less power in the future. Working computers have been made with DNA, which uses essentially no electrical power.
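
There is in fact enormous theoretical headroom on power. A rough sketch using the Landauer limit, the thermodynamic minimum for erasing one bit; the 1 GW machine doing 10^18 bit-operations per second is only an assumed comparison point, not a real system:

```python
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K
T = 300.0               # room temperature, K

# Landauer limit: the minimum energy to erase one bit irreversibly at temperature T.
landauer_j_per_bit = k_B * T * math.log(2)
print(f"{landauer_j_per_bit:.2e} J per bit operation at the thermodynamic floor")

# ASSUMED comparison point: a 1 GW machine doing 1e18 bit-operations per second.
assumed_j_per_bit = 1e9 / 1e18
print(f"{assumed_j_per_bit / landauer_j_per_bit:.0e} times above the Landauer limit")
```

Even granting those assumptions, conventional electronics sits many orders of magnitude above the physical floor, so much lower-power computing is not ruled out by physics.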
 
Moore's Law has already bridged several different paradigms of computer design. There is no reason that computing could not use far less power in the future. Working computers have been made with DNA, which uses essentially no electrical power.

Working computers have also been made out of neurons.

Oh wait!
 
Besides, Kurzweil is a futurist wanking to his own delusion that he's worth preserving forever. He was wrong last decade, and he's wrong this decade.

People who have been in the computer field for the last 30 years know where the technology is going. It has nothing to do with Ray; he just agrees with us.
 
And future artificial computers could be made by simply scanning a human brain and reproducing its components.

Really?
Just like that?
I don't think so.

Simple scanning isn't going to let you grow a brain. The transcription factors (TFs) involved are highly complex, and it takes about 20 years for a brain to fully mature.

That, and you really haven't hit a singularity; you've just grown a brain. That happens to be something some of us have already done!
 
Scanning technology is getting better and better, and we don't need atomic resolution to create a working brain, since the brain itself is tolerant of error. The brain is limited by its need to pass through the birth canal. We could merge two or more brains together. We wouldn't need to make it out of perishable flesh, but could print it in durable silicon. The singularity isn't a single computer; rather, it describes the point at which technology surpasses human intelligence and can start to direct itself. In other words, we can build a computer that designs computers.
 
Scanning technology is getting better and better, and we don't need atomic resolution to create a working brain, since the brain itself is tolerant of error. The brain is limited by its need to pass through the birth canal. We could merge two or more brains together. We wouldn't need to make it out of perishable flesh, but could print it in durable silicon. The singularity isn't a single computer; rather, it describes the point at which technology surpasses human intelligence and can start to direct itself. In other words, we can build a computer that designs computers.

But a trillion silicon connections per cubic centimeter?
Is that even feasible?

Is there any evidence that shows a silicon brain works the same as a fleshy one? There may be (likely are) emergent properties of having everything squishy and wet and tetravalent.
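
For a rough sense of how big that gap is, here is a sketch comparing the article's synapse density with an assumed silicon density; the transistor and stacking figures are illustrative assumptions, not datasheet values:

```python
# From the quoted article: ~1 trillion synapses per cubic centimeter.
synapses_per_cm3 = 1e12

# Illustrative ASSUMPTIONS about silicon, not datasheet values.
transistors_per_cm2 = 1e9    # ~a billion devices per square centimeter of die
layers_per_cm = 10           # ten stacked dies per centimeter of height

devices_per_cm3 = transistors_per_cm2 * layers_per_cm
print(f"{devices_per_cm3:.0e} devices per cubic centimeter under these assumptions")
print(f"the brain is ~{synapses_per_cm3 / devices_per_cm3:.0f}x denser, before counting any wiring")
```

Under even those generous assumptions the brain comes out around two orders of magnitude denser, which is exactly the point of the question.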
 
Maybe, but we aren't limited by the needs of a body. We might not have to provide blood. We might discover a better way to perform the same functions. Maybe we do grow it out of flesh instead of building it, a disembodied head a mile across!
 
The human brain manages data and information. All you have to do is duplicate that data and information management with an automaton. We are not there yet. I am trying to get a certain agency to allocate some funds, since I am the architect of their simple solutions... but due to our economic crisis, no funds are available for new applications.

Maybe 5 years down the road...
 
Is there any evidence that shows a silicon brain works the same as a fleshy one? There may be (likely are) emergent properties of having everything squishy and wet and tetravalent.
No. AI researchers simply assume that the brain computes -- and does nothing but compute. The brain is an analog device. Computers are digital devices, and are thus limited by the fact that almost all numbers are not computable. Digital computers are limited by Church's thesis; that analog devices are so limited is an assumption, not a conclusion.
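
As a toy illustration of the discreteness point only (not of Church's thesis itself): a digital machine can name only a countable set of numbers, so even ordinary decimals are stored as approximations:

```python
import sys

# Neither 0.1, 0.2, nor 0.3 is exactly representable in binary floating point.
print(0.1 + 0.2 == 0.3)          # False
print(f"{0.1 + 0.2:.20f}")       # 0.30000000000000004441
print(f"{0.3:.20f}")             # 0.29999999999999998890

# A 64-bit float carries a fixed, finite number of mantissa bits; between any two
# adjacent representable values lie uncountably many reals it cannot name exactly.
print(sys.float_info.mant_dig, "bits of mantissa")    # 53
```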
 
Not only are artificial "neural" networks becoming real -- so are synthetic synapses. We're not purely analog (DNA, for example), and the "digital revolution" won't stay purely binary. It's natural to intuit that AI and we ourselves will not long remain separate organisms.
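
For what it's worth, an artificial "neural" network in the software sense is a very simple object. Here is a minimal single-neuron sketch learning the logical AND function; it illustrates the idea only and assumes nothing about the synthetic-synapse hardware mentioned above:

```python
# A single perceptron learning the logical AND function.
weights = [0.0, 0.0]
bias = 0.0
rate = 0.1

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]   # AND truth table

def predict(x):
    s = weights[0] * x[0] + weights[1] * x[1] + bias
    return 1 if s > 0 else 0

# Perceptron learning rule: nudge each weight in proportion to the error.
for _ in range(25):
    for x, target in data:
        error = target - predict(x)
        weights[0] += rate * error * x[0]
        weights[1] += rate * error * x[1]
        bias += rate * error

print([predict(x) for x, _ in data])    # [0, 0, 0, 1] once trained
```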

On Singularity generally: How can we deny our trajectory?
 