James Surowiecki writes that the financial sector has grown so spectacularly over the past couple of decades because, compared to the boring '50s and '60s, the demand of modern businesses for capital has also grown spectacularly:
The financial sector’s most important job is channelling money from investors to businesses that need capital for worthwhile investment. But in the postwar era there wasn’t much need for this….Thomas Philippon, an economist at N.Y.U., has shown that most of the increase in the size of the financial sector [during the period 1980-1999] can be accounted for by companies’ need for new capital….Philippon suggests that, given the demands of businesses for capital, a normal financial sector would be about the size it was in 1996.
But this is only part of the story. The need for capital may well have gone up considerably, but the combination of globalization, automation, and greater competition should also have made the finance industry far more efficient at providing it. As Felix Salmon says:
One would hope and expect that between sell-side productivity gains and a rise in the sophistication of the buy side, any increase in America’s financing needs would be met without any rise in the percentage of the economy taken up by the financial sector. That it wasn’t is an indication, on its face, that the financial sector in aggregate signally failed to improve at doing its job over the post-war decades — a failure which was then underlined by the excesses of the current decade and the subsequent global economic meltdown.
Most information technology sectors — and finance is decidedly one of them — have become far more efficient over the past few decades. They may be bigger in absolute terms, but the price per unit of whatever they’re selling — MIPS, bandwidth, gigabytes, etc. — is far lower. In the case of finance, the units they’re selling are dollars of capital. But has the per-unit cost of providing capital gone down substantially since, say, 1980? If not, why not?
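To make that question concrete, here is a minimal sketch of one way to frame "per-unit cost": the financial sector's income divided by the dollars of capital it channels to businesses in a given year. The figures below are hypothetical placeholders, not measurements; the point is only that if finance had genuinely become more efficient, this ratio should have fallen over time rather than held steady.

```python
# Illustrative sketch only: hypothetical numbers chosen to show the arithmetic,
# not actual data on the U.S. financial sector.

def unit_cost_of_intermediation(finance_sector_income: float,
                                capital_intermediated: float) -> float:
    """Return the cost per dollar of capital channelled to businesses."""
    return finance_sector_income / capital_intermediated

# Hypothetical inputs (in billions of dollars) for two comparison years.
cost_1980 = unit_cost_of_intermediation(finance_sector_income=100.0,
                                         capital_intermediated=5_000.0)
cost_2007 = unit_cost_of_intermediation(finance_sector_income=800.0,
                                         capital_intermediated=40_000.0)

# A more efficient sector would show a lower ratio in the later year.
print(f"1980: {cost_1980:.3%} per dollar of capital")
print(f"2007: {cost_2007:.3%} per dollar of capital")
```

In this made-up example the sector is eight times larger in absolute terms, yet the cost per dollar of capital is unchanged, which is exactly the kind of pattern the question above is probing.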