Brad DeLong posts a Wikipedia page today showing the cost of computing power over time. I adjusted it to show the cost of about 30 petaflops of computing power, which is roughly the consensus estimate of the power of the human brain. Here it is:
Don’t take this too seriously. It’s an extrapolation based on multiplying the power of the fastest/cheapest chips by several hundred, which makes it a highly theoretical construct. Actually getting 30 petaflops requires building a huge machine with all these chips running in parallel—plus power supplies, air conditioning, backplanes, support circuitry, etc. etc.—which increases the cost dramatically.
Still, as a sort of thought experiment it shows two things:
- The price of commercially-available computing power has come down about 7x in just four years.
- Even when you account for all the infrastructure, we’re really starting to get close to human-power computers at a relatively low price. This assumes that massively parallel computers turn out to be suitable for AI research, but since the brain is basically a massively parallel biological machine, this seems like a reasonable guess.
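If you want to see what that 7x-in-four-years figure implies, a few lines of arithmetic spell it out. This is just an illustrative back-of-the-envelope sketch, assuming the price decline is smooth and exponential (which real chip pricing certainly isn't):

```python
import math

# Assumption: price of computing power fell by a factor of 7 over 4 years,
# and the decline was smooth and exponential.
factor = 7.0
years = 4.0

# Annual improvement factor: the yearly multiple that compounds to 7x in 4 years
annual_factor = factor ** (1 / years)

# Halving time: how long for the price to drop by half at this rate
halving_time = years * math.log(2) / math.log(factor)

print(f"annual improvement: {annual_factor:.2f}x per year")
print(f"price halves every {halving_time:.2f} years")
```

At that pace the cost halves roughly every 17 months, which is in the same ballpark as the classic Moore's law doubling period.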
And don’t forget software! That’s almost certainly the gating item. My guess is that we’ll have affordable¹ human-power computers in another decade or so, but it will still be two or three decades before we get true AI. Needless to say, this assumes that all the AI boffins figure out how to write the code to do it.
¹By affordable, I don’t mean that you’ll be able to buy one for your desk. I mean that it will be reasonably affordable for just about any AI researcher who wants one. This is important, since software research will skyrocket when human-power computers are available to tens of thousands of researchers, not just the one or two hundred with gigantic budgets.