Monday, August 20, 2007

The Atomic Flying Car

Jumping off from Tim's comment on my We are The Sims? post, let's talk about singularity. I mean Singularity. This is the idea (according to my understanding) that, in the next fifty or a hundred years (or maybe longer, I guess), computers will become smarter than humans, humans will be integrated with or uploaded to computers, and immortality and other weirdness will result.

I'm a bit skeptical about Singularity, at least on the relatively short scale of a hundred years. My argument has to do with the point I made in my previous post: increasing computing power hasn't led to correspondingly more sophisticated software.

Mind you, our software today is more sophisticated than it used to be. There has been some progress, and there are market forces pushing us to make software more sophisticated. Still, I think if you look at what software could do ten or even twenty years ago and compare it to what it does now, the changes have mostly been the kind enabled by hardware. 3D graphics are more sophisticated now, in terms of techniques, but the basic ideas could have been (and were) implemented many years ago, and most of the improvement has come from simply being able to do more number crunching, faster. That's what most of it comes down to in software: more number crunching, faster. The principles behind the number crunching haven't really changed.

Even the internet, which supposedly changed everything, didn't really change the software that much. What the internet changed was how people used the software, and what sort of software they wanted.

So, to me, the idea that computers are really getting smarter is a bit suspect. They're doing more number crunching, faster. Being able to think is a different thing entirely.

"So?" you say, "We'll build thinking machines once we understand how thinking works and we have the computational power to do it." Maybe. But maybe not. Maybe understanding how thinking works is more difficult than we think. AI researchers have been banging their head against this problem for decades, and, frankly, progress is slow.

Maybe it's too difficult for humans to design a machine that thinks like a human. "But," you say, "We'll augment our intelligence with computers, until we're smart enough to design a thinking machine." Ah, but that assumes the kind of augmentation computers can provide will help. I think it will help a little, but I'm not sure it will solve the problem of designing complex systems. Until we have a machine that can actually think, all the thinking that goes into design has to pass through the bottleneck of a few square centimeters of wet, gray meat in the front of some human's head. Expanded external memories, instant recall, and perfect arithmetic skills might not be enough when it comes to actually thinking and designing.

Maybe we will manage to design thinking machines, but they won't be all that much better at thinking than we are. Faster, possibly, and with the same perfect, instant memories and math skills, but perhaps not much better at thinking itself. It's a possibility.

So, for Singularity, I'm skeptical. Maybe it will happen, but I think it might turn out to be the atomic flying car of the 21st century. In 2100, people might make jokes like the "Where's my jetpack?" jokes of today: "Where's my uploaded immortality?"
