Best Article I've Read About The Singularity In a While  

Posted by Wayne Bretski

I, Cringely blog author Robert X. Cringely writes that staying alive until the Singularity will be the hardest part. It's a nice overview of an event that polarizes really intelligent people, from Ray Kurzweil (optimist) to Bill Joy (co-founder of Sun Microsystems and pessimist).

For those not hep to meta-technology geek-speak: the event known as the Singularity occurs when a technology passes the Turing Test and is smarter than the human with which it is interacting.

"the Singularity is a phenomenon with both technological and economic components. Moore's Law works the same way. The underlying concept of both is the level of technological development we can reach AT A CERTAIN PRICE. The most powerful supercomputers can cost tens of millions of dollars and it is logical to assume that something on the order of a supercomputer would be the first machine to reach Singularity status. That's fine and would undoubtedly result in the creation of knowledge that would have an impact on all of us simply through the existence of that knowledge and its subsequent use by people and machines who may not have yet grown to Singularity, but what will really change everything is when the price of Singularity drops low enough to apply to the computer on our desks or on our wrists."
The full article is here.
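Cringely's point about "a certain price" can be made concrete with a back-of-the-envelope calculation. This is my own sketch, not from the article, and all the dollar figures and the two-year doubling period are assumptions:

```python
import math

# Illustrative sketch: if price-performance doubles roughly every two
# years (a Moore's-Law-style assumption), how long until
# supercomputer-class compute fits a desktop budget?

def years_until_affordable(current_cost, target_cost, doubling_years=2.0):
    """Years until cost falls to target, halving every doubling_years."""
    if current_cost <= target_cost:
        return 0.0
    return doubling_years * math.log2(current_cost / target_cost)

# Hypothetical figures: a $20M supercomputer vs. a $2,000 desktop budget.
print(round(years_until_affordable(20_000_000, 2_000), 1))  # about 26.6 years
```

Under those assumed numbers, the same capability drops from supercomputer pricing to desktop pricing in roughly a quarter century, which is the "what really changes everything" lag Cringely is describing.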

1 comment

I know how a machine can pass the Turing Test, and I don't particularly care when it does. I don't believe the following: a machine successfully imitating a human => a machine necessarily has anything even close to human intelligence. See Searle's Chinese room example (http://en.wikipedia.org/wiki/Chinese_room).
This leads me to a question: how do we know when a machine actually has greater intelligence than the human with which it is interacting? If we do have a way of knowing this, what is the metric for "intelligence"?
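The commenter's Chinese Room objection can be illustrated with a toy program. This is my own sketch (not Searle's formulation, and the rule book entries are made up): a responder that "converses" purely by symbol lookup, with no understanding of what the symbols mean.

```python
# Hypothetical rule book mapping input symbols to output symbols.
RULE_BOOK = {
    "how are you?": "fine, thanks. and you?",
    "what is 2+2?": "4",
}

def room_reply(message: str) -> str:
    """Return a canned reply; the 'room' never understands the message."""
    return RULE_BOOK.get(message.lower(), "interesting. tell me more.")

print(room_reply("How are you?"))  # a plausible reply, purely mechanical
```

With a large enough rule book such a program might fool an interlocutor, which is exactly why imitation alone is a weak metric for intelligence.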
