Thursday, July 21, 2011

Is the Singularity Near or Far?

In this article, titled "The Singularity is Far" (http://www.kurzweilai.net/the-singularity-is-far-a-neuroscientists-view?utm_source=KurzweilAI+Daily+Newsletter&utm_campaign=a40a06de5c-UA-946742-1&utm_medium=email), David J. Linden challenges many of Kurzweil's timetables for the reverse engineering of the human brain. His primary argument is that although data is growing exponentially, our understanding of that data appears to be growing only linearly.

Lincoln Cannon challenged this article's premises here: http://lincoln.metacannon.net/2011/07/singularity-merits-understanding-but.html

Lincoln's response was well thought out, and he made some excellent points.

Lincoln's primary argument seems to be that we don't need understanding, just simulation and scanning resolution: "In a sense, it would be like riding a bike versus understanding the physics of riding bike; we can do the former without the latter." I essentially agree with him there. It is indeed possible to produce a Singularity-like event through simulation without understanding.

However, when Kurzweil made his predictions, he assumed that functional simulation would require much less computational power than a full simulation would. Essentially, Kurzweil assumed that we would use the power of "understanding" to create algorithms that are more efficient than the brain, and his entire timetable was based upon this assumption. If we instead assume that we will create the Singularity through simulation without understanding, then we must recognize that the computing power needed will be far greater, and this pushes the timetable for the Singularity far back from Kurzweil's predictions.

There are several reasons why I reject Kurzweil's timetables for the Singularity:

When I look at most of the technology trend data, I tend to see exponentials where Kurzweil sees double exponentials. Furthermore, although I do see exponential trends in our data-gathering abilities, like David J. Linden I see linear progress in our understanding of that flood of data. I actually believe that understanding is most likely on an exponential trend too, but that it is in the early, nearly linear stage of an exponential that will take time to "ramp up" to the knee of the curve. It is hard to put numbers on brain "understanding," however, so it is hard to predict when this shift will take place.

Kurzweil's timetable is based on 10^14-10^16 cps to simulate the functionality of the brain, something that would require rewriting the brain's algorithms in a more efficient manner, and that requires understanding. To go the "simulation without understanding" route, you need more like 10^19 cps. That would push many of the dates for Kurzweil's predictions back several years at least.

10^19 cps should arrive in supercomputers by 2022-2025 according to my last projections: https://picasaweb.google.com/jlcarroll/Economy#5620432299391054642. This really isn't in time to meet Kurzweil's deadlines, because he predicts strong AI at about the time when the computation necessary to simulate the brain costs $1000, not when it can be done only on the world's most expensive supercomputers. In other words, he predicts that we will have good strong AI simulations only AFTER we have had clumsy ones running on a supercomputer for a while, after we have had time to study and perfect those clumsy ones, and after they have become cheap enough to work really well and become ubiquitous.
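The supercomputer projection above is a straightforward doubling-time extrapolation. A minimal sketch of that arithmetic, using an illustrative 2011 starting point (roughly the K computer's speed) and an assumed doubling time, neither of which comes from the post's own data:

```python
import math

# Hedged sketch: when does the fastest supercomputer reach 10^19 cps?
# Starting speed and doubling time are illustrative assumptions.
current_year = 2011
current_cps = 8e15          # assumed: ~8 petaflops, top machine circa mid-2011
doubling_time_years = 1.2   # assumed doubling time for top supercomputer speed

target_cps = 1e19
doublings = math.log2(target_cps / current_cps)
year_reached = current_year + doublings * doubling_time_years
print(round(year_reached, 1))  # roughly 2023, inside the post's 2022-2025 window
```

Varying the assumed doubling time between about 1.1 and 1.4 years moves the answer across the 2022-2025 range the post cites.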

Down the "simulation without understanding" road, you need 10^19 cps to hit $1000 instead of only 10^16 cps, and that shouldn't happen until significantly after 2022, when our fastest supercomputers should be able to do it. My last prediction put this landmark at about 2058! And that assumes that the doubling rate of cps/$ doesn't slow after Moore's Law hits the quantum barrier somewhere between 2020 and 2025. If it slows, which it might, then this could take even longer. I will admit that if I am wrong, and things speed up rather than slow down, or if there is a double exponential at work here that I can't find, then this might happen significantly sooner. Nevertheless, even then, it would still happen some time after Kurzweil's deadline if you use a simulation-without-understanding paradigm.
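The price-performance landmark above follows the same kind of extrapolation. A minimal sketch, with an assumed 2011 value for cps per $1000 and an assumed cps/$ doubling time (both illustrative guesses, not the post's figures):

```python
import math

# Hedged sketch: when does $1000 buy 10^19 cps?
# The starting figure and doubling time are assumptions for illustration.
current_year = 2011
cps_per_1000usd = 1e10      # assumed: what ~$1000 of hardware bought circa 2011
doubling_time_years = 1.5   # assumed cps/$ doubling time, pre-quantum-barrier

target = 1e19
doublings = math.log2(target / cps_per_1000usd)
year = current_year + doublings * doubling_time_years
print(round(year))  # mid-2050s, in the neighborhood of the post's ~2058
```

Note how sensitive the answer is: each extra 0.1 year of doubling time adds about three years to the arrival date, which is why a post-Moore's-Law slowdown would push the landmark out so far.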

In other words, I may buy into many of Kurzweil's predictions, but I find that I must also question some of his timetables.
