The computer mouse was invented in 1963, demonstrated in 1968, shown off in a lab in 1973, introduced on a personal computer in 1984, and finally widely adopted in the early 90s. That’s three decades:
That might seem like a long time, but as computer scientist Bill Buxton has argued, thirty years is actually a typical amount of time for a breakthrough computing invention to go from the first laboratory prototype to commercial ubiquity.
The first packet-switched network, the ARPANET, was launched in 1969. It took about 30 years, until the turn of the millennium, for Internet access to be widely adopted by American consumers.
….Why does it take so long? In all of these cases, it took a decade or longer for the new techniques to spread and mature inside the research community….Once a computing concept has been refined in the laboratory, it can take another decade to turn it into a viable commercial product.
….This 30-year rule of thumb can help to form an educated guess about when future innovations will reach the mass market. For example, the first car capable of driving itself long distances was created in 2005, and the technology has been maturing in academia and corporate labs over the last eight years. If self-driving technology follows the same trajectory as previous computing innovations, commercial self-driving cars will be introduced sometime in the 2020s, and the technology will become widely adopted in the 2030s.
That’s Tim Lee, and I’d add one more thing: a lot of these inventions depend on computing power. A mouse isn’t very useful without a graphical user interface, and you can’t run a useful GUI on a Z80. You can do it, barely, with a small black-and-white display, on a Motorola 68000. And then, finally, you can do it at reasonable cost with a decent display on the microprocessors of the late 80s and early 90s.
Driverless cars are following the same arc. Software is obviously a huge issue too, but sufficient computing power at a reasonable price is a bare minimum, and we’re still a decade or so away from that.