Why zero (multifactor) productivity growth is OK for Oz (very wonkish)

I’m writing a book chapter about productivity, much of which will be a rehash of my 20-year debate with the Productivity Commission over measures of multi-factor productivity (MFP). In the process, I reread this op-ed by Ross Gittins, and the Treasury article on which it is based, by Simon Campbell and Harry Withers. As a result, I had what seemed to me like a Eureka moment. As with all such moments, of course, my insight might turn out to be either wrong or obvious.

Campbell and Withers criticise the whole idea of MFP which, as they note, is a residual: the extra growth left over after changes in labour and capital inputs are taken into account. This residual is sometimes called the Solow residual, after Robert Solow, who first identified it when he tried to estimate models of economic growth in the 1950s. It’s normally explained as technological change and, more precisely (this is important), as “disembodied” technical change. That is, the residual consists of technological change that isn’t embodied in new and more powerful capital equipment, which should be captured in the measure of capital input.
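
In standard growth-accounting notation, the decomposition looks like this (a sketch using the usual constant-shares assumption; the symbols are mine, not Campbell and Withers’):

```latex
% Growth accounting: output growth is decomposed into share-weighted
% input growth plus the Solow residual (MFP growth).
% g_X denotes the growth rate of X; \alpha is capital's share of income.
\[
  g_Y = \alpha\, g_K + (1 - \alpha)\, g_L + g_{\mathit{MFP}}
\]
% The residual is simply what is left over once measured inputs are netted out:
\[
  g_{\mathit{MFP}} = g_Y - \alpha\, g_K - (1 - \alpha)\, g_L
\]
```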

Campbell and Withers point out that labour productivity has risen steadily, largely as a result of capital deepening (more capital per worker), and that this has been feasible because of a steady decline in the relative price of capital goods. They don’t, however, spell out the reasons for the falling price of capital goods. The crucial point is that nearly all technological progress in the last couple of decades has taken the form of cheaper, faster and more powerful information and communications technology (ICT). This appears, at least from the viewpoint of a country like Australia that imports its ICT equipment, as a kind of embodied technological change. So, the capital contribution to increased productivity appears as capital deepening, not as increased capital productivity.
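
A stylised illustration of why falling ICT prices show up as capital deepening (my numbers, and abstracting from depreciation so that investment volume maps directly into capital services):

```latex
% If nominal ICT investment is constant but the ICT price deflator falls
% 10 per cent a year, the volume of capital purchased grows 10 per cent
% a year, and capital per worker deepens with no change in nominal spending.
\[
  K^{vol}_t = \frac{I^{nom}_t}{p_t}, \qquad
  g_{K^{vol}} \approx g_{I^{nom}} - g_{p} = 0 - (-10\%) = 10\% \text{ p.a.}
\]
```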

But wait, there’s more! The standard measure of labour productivity is output per hour worked. But that doesn’t take account of the quality of labour input, which is determined largely by education (also experience, but the average experience of the workforce doesn’t change much over time). If you accept that all productivity growth is explained either by better education or better (embodied) technology, then the Solow residual (that is, the rate of MFP growth) should be zero.
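
A toy growth-accounting calculation makes the point (all numbers are invented for illustration; they are not ABS figures):

```python
# Toy growth-accounting example: once labour input is adjusted for quality
# (education) and capital input captures embodied ICT progress, the Solow
# residual (MFP growth) can be zero even while labour productivity grows.
# All numbers are invented for illustration.

alpha = 0.4           # capital's share of income
g_output = 0.030      # output growth, 3.0% p.a.
g_hours = 0.010       # growth in hours worked, 1.0% p.a.
g_quality = 0.005     # growth in labour quality from education, 0.5% p.a.
g_capital = 0.0525    # growth in capital services (embodied ICT), 5.25% p.a.

# Hours-only labour input: a small positive residual appears.
mfp_unadjusted = g_output - alpha * g_capital - (1 - alpha) * g_hours

# Quality-adjusted labour input: the residual vanishes.
mfp_adjusted = g_output - alpha * g_capital - (1 - alpha) * (g_hours + g_quality)

print(f"MFP growth, hours-only labour input:  {mfp_unadjusted:.4f}")  # ~0.0030
print(f"MFP growth, quality-adjusted labour:  {mfp_adjusted:.4f}")    # ~0.0000
```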

What’s left out here are the putative benefits of microeconomic reform, better management and so on. If these are important, MFP growth should be significantly greater than zero.

The ABS produces quality-adjusted measures of MFP and these are the ones we should be looking at. They are in Table 3 of the most recent report, which shows that MFP growth has been almost exactly zero this century.

The latest document excludes the fabled “productivity surge” of the mid-1990s, but also the productivity slump of the recession years immediately preceding it. Adding those periods into the picture would still leave MFP growth so close to zero as to be negligible.

In summary, at least according to the ABS data, we can forget about microeconomic reform as a source of productivity growth. Changes in productivity are explained entirely by better education and faster computers.

  1. Smith
    October 13th, 2017 at 08:59 | #1

    Have we had better education? We have higher rates of Year 12 completion, and more young people going on to post-school education. But what is the evidence that educational outcomes as a whole have improved?

    As far as computers go, obviously the computers themselves have improved by the usual measures. But, to take one example, just because MS Word and Excel have more features, and are being used on computers with more processing power, doesn’t mean that the person using them is more productive. (This is not a trivial example. The productivity of people sitting at desks is a large part of the whole economy’s productivity, through weight of numbers.)

  2. Troy Prideaux
    October 13th, 2017 at 10:01 | #2

    @Smith
    but there are soooo many social media, online shopping and news-feeds to play with at work 🙂

  3. Ikonoclast
    October 13th, 2017 at 12:26 | #3

    I worked in the Dept. of Social Security, later Centrelink, right through the introduction of desktop computers and the massive advances in mainframe computers. The huge increase in productivity was plain to see, but what was done with that productivity increase was also instructive.

    First, the productivity gains (just a few examples):

    (1) The typing pool was done away with. Everyone now did their own typing. Admittedly, many like me were poor typists, but if one’s output was just the occasional short letter or memo the effect was a productivity increase. Corrections were easier too with a word processor compared to even an old electric typewriter. Then there were document templates, easier printing and eventually emailing.

    (2) Punch cards gave way to online data entry. Online data entry moved from dedicated machines and staff to data entry at every desk (that did related work).

    (3) Data tapes gave way to databases. Mainframes became several orders of magnitude more powerful.

    Parts of this productivity improvement were used:

    (1) for better client service;
    (2) to cut staff numbers relative to client numbers serviced;
    (3) to increase the complexity and targeting of welfare;
    (4) to enable the government to change policy faster so it could look better and enhance (as it saw it) its re-election chances.

    If only ALL the productivity increase had been used for good social purposes. It was not, I am sad to say. Staff numbers were cut too far, and welfare policy and payments were made too complicated for recipients (and often staff!) to understand. And you can be sure federal governments misused and abused point 4 to the hilt.

  4. Smith
    October 13th, 2017 at 13:27 | #4

    @Ikonoclast

    What you’ve described would have happened in workplaces around 30 years ago. No doubt the adoption of personal computers did a lot for productivity. But what about since then?

  5. Tom Davies
    October 13th, 2017 at 15:53 | #5
  6. Ernestine Gross
    October 13th, 2017 at 16:24 | #6

    Solow’s MFP “residual” was challenged on methodological grounds almost 60 years ago. See:

    W.P. Hogan, “Technical Progress and Production Functions”, Review of Economics and Statistics, November 1958, pp. 407–411.

    IMO, Campbell and Withers’ arguments, as represented in this thread, are merely a verbal illustration of the more general critique in Hogan (1958).

  7. Ikonoclast
    October 13th, 2017 at 16:30 | #7

    @Smith

    There have been a lot of advances in computer power (and hence productivity) since the times I write about. For example, today a computer programmer, or rather a fully qualified software engineer, is probably about 1,000 times more productive (my rough guess) than a programmer 20 years ago.

    Where does all this extra productivity come from? It comes from all the powerful software tools, including editing tools, they now have to do their work, and all the libraries of utilities they can import rather than having to write them from scratch. Then there are the utilities for version management and for keeping multiple iterations of backups of earlier versions of software. Then there are intelligent editing tools that guess what you want to write next and are getting better and better. Then there is AI, and there are programs that are beginning to be able to write programs, and learning programs and genetic algorithms and… someone who knows a lot about this could no doubt add even more examples I don’t know.

    When I showed my son (a software engineer) all the stuff I can do on my home PC (being in my second childhood, I re-script computer games with various free utilities) and said, “Wow, these programs and utilities are 100 times more powerful than what I did my work with 20 years ago…”, my son said, “Dad, the stuff we use at work makes what you have look like child’s play. It’s 100 times more powerful and productive again.”

    This whole arena is becoming mind boggling now. But not all of it is being used for truly productive things… for example, how much is being used for the intelligence battle, weapons, network-centric warfare research and so on?

  8. rog
    October 13th, 2017 at 16:52 | #8

    Much or most of what has been discussed in these comments is, IMO, productivity gains from AI. The future is that AI will make huge gains, and by default the shareholders of those companies that embrace AI will reap those gains. The downside is unemployment.

  9. Smith
    October 13th, 2017 at 16:53 | #9

    @Ikonoclast

    Yes, computers are much more powerful than they used to be. But are they used for more productive purposes? For instance, in the old days, before computers, insurance claims were assessed without a computer. These days, I suppose, insurance claims are assessed with the aid of a computer. Has there been an improvement in the productivity of the insurance assessors that can be attributed to them using computers?

    Or, take shops. In the old days, cash registers were not linked to computers. These days they are. Are shop assistants more productive (as measured) as a result?

  10. dedalus
    October 13th, 2017 at 17:34 | #10

    @ikonoclast
    Hardware quicker, yes. But today’s software 1000 times more productive (times 100 more by the son’s reckoning): bullshit.

    At least not according to sites like Stack Overflow, where programmers routinely jam each other with Q&As about language and system bugs, and requests for advice on coping with the latest iteration of language frameworks and browser updates.

    Take the browser. Browsers have bugs in them that have not been attended to for ten years. Many app developers are forced to switch frameworks on a 3-5 year cycle because the framework or language has gone out of fashion/support.

    The real productivity is the way the whole planet drip feeds off the web. On that broad level the productivity gain is anything but stratospheric.

    One reason banks make 20 billion profit per year in 2017 while shedding staff to ATMs is that they leech off the browser. The browser is free software which makes customers their own bank tellers. But here’s the thing: adjusted for inflation, banks’ profits and productivity from the browser are basically the same as six years ago, with the 2011 versions. That’s why their CEOs get the big bucks. They’ve convinced the plebs that their computer systems are more “productive.”

  11. Peter T
    October 13th, 2017 at 20:56 | #11

    It’s not so much the computers themselves as the slow working through of networking – of connecting computer system to computer system. Protocols, standards for bar codes, putting bar codes on items, stock control (point of sale to inventory to ordering to tracking), replacing hard copy with electronic communication (still a work in progress), real-time bank transfers and so on. This has taken decades and has some way to go. The obstacles are partly technical, but more ones of national and international agreement, changes in systems, and associated changes in production and packaging. Getting everyone to agree, then invest and then actually implement according to the standards is a mammoth set of tasks, always ongoing.

  12. Simon Fowler
    October 14th, 2017 at 07:49 | #12

    @dedalus

    Measuring productivity in software development is hard, but although I think three orders of magnitude may be a bit much, programmer productivity for practical purposes really has increased enormously in the last twenty years. It’s not all monotonic improvement, but where you can make use of existing libraries, frameworks, shared infrastructure and cloud services you can get new systems up and running /extraordinarily/ quickly.

    I can create a brand new web service doing something genuinely useful from scratch in a day (including all the hosting and so forth through a cloud provider), which would have taken months (at least) to get working twenty years ago. Twenty years ago it would have required hardware purchases, hosting in a data centre, ISP services, potentially hiring sysadmins, doing the actual development work, bringing it up, and then being stuck with months more work as soon as it scaled past the limits of the hardware and network that I started with. Now I can not only get everything up and running as soon as I have (mostly) working code, I can even scale it out as far as my architecture will allow, on demand, and for relatively trivial cost.

    On top of that I have access to a /very/ wide range of frameworks that mean I only have to write the actual business logic rather than the guts of the service (i.e. I never have to create a socket, accept raw traffic, even parse HTTP requests – I can make use of other people’s code to do all that, and just write the higher level logic that implements what /I/ want to do). That can make the difference between writing thousands of lines of code and writing /hundreds of thousands/ of lines of code. In many/most cases those libraries and frameworks are totally free, removing any requirements for complex and expensive licensing agreements.
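
    To make this concrete, here is a minimal sketch of the framework-based style described above, using Flask as a stand-in (the framework choice, endpoint and business logic are illustrative assumptions, not anything from the comment itself):

```python
# Minimal web-service sketch: the framework (Flask, as a stand-in choice)
# handles sockets, raw traffic and HTTP parsing; the developer writes only
# the higher-level business logic. The endpoint and pricing rule below are
# invented purely for illustration.
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/discount", methods=["GET"])
def discount():
    # The only code that is genuinely "mine": a trivial pricing rule.
    price = float(request.args.get("price", 0))
    qty = int(request.args.get("qty", 1))
    total = price * qty * (0.9 if qty >= 10 else 1.0)  # 10% bulk discount
    return jsonify({"total": round(total, 2)})

if __name__ == "__main__":
    app.run()  # a cloud provider would typically run this behind a WSGI server
```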

    I might not be 1000 times as productive measured by lines of code written (in fact, I’m most definitely not), but I can both drastically reduce the number of lines of code I /have/ to write, and make the lines of code that I /do/ write useful (and potentially profitable) with a fraction of the time and effort.

  13. Ikonoclast
    October 14th, 2017 at 09:57 | #13

    @Smith

    Yes, I covered those issues in my posts. Putting that aside, a portion of new computing power is used for genuinely productive purposes, and it is a productivity multiplier.

  14. Ikonoclast
    October 14th, 2017 at 10:01 | #14

    @dedalus

    My point in that arena is that software engineers are far more productive. The “1,000” was a typo; I meant “100”. If you don’t have some familiarity with the tools and utilities available, you really will have no concept of what is possible now.

  15. Geoff Edwards
    October 14th, 2017 at 11:35 | #15

    Solow himself cautioned that his residual was not a product of theory but an observation of the gap between prevailing theory and reality. I think it is remarkable that mainstream economists have overlooked the disappearance from Solow’s equations of “land”, one of the three classical factors of production alongside labour and capital. Beaudreau and others with a background in engineering have postulated that consumption of energy, as a proxy for “land”, gives a far better fit than any proxy for knowledge or innovation.

    This makes analytical sense, because, traditionally, replacement of labour with machines has amounted to replacement of animal muscle power with power derived from quantifiable external sources.
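
    Schematically, the energy-augmented specification these critics propose looks something like the following (my notation, a sketch of the general idea rather than Beaudreau’s exact model):

```latex
% Energy-augmented production function: energy E stands in for "land"
% alongside capital K and labour L. Notation is illustrative only.
\[
  Y = A\, K^{\alpha} L^{\beta} E^{\gamma}
\]
```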

    Indeed, the Solow-derived notion that an intangible such as “knowledge” or “innovation” can be punched into equations that track the tangible, quantifiable entities of labour or capital seems to me bizarre. Taking Solow’s model as a starting point seems to be counting angels on the head of a pin. It perpetuates the nonsense that the productive economy is somehow separated from the natural resources on which all economic enterprise depends.

  16. Greg McKenzie
    October 14th, 2017 at 14:39 | #16

    Geoff Edwards is right to point out that natural resources support economic enterprise. All economic resources must work towards multi-factor productivity improvements. Without surpluses of certain natural resources, computer advances would be impossible, especially as we approach the frontier of silicon chip technological improvement. But Professor John Quiggin is also right to stress the role played by human capital. Clever countries can enable clever economic enterprises. Perhaps the most underrated economic resource in the multi-factor productivity debate is entrepreneurial ability. You only have to look to Silicon Valley to see the benefits of harnessing all four economic resources onto the productivity ‘coach’. Risk taking is facilitated by venture capitalists, organisation is so much more effective with a highly educated labour force, and social costs are minimised by rewarding environmentally sustainable ventures. It’s a bit like quantum physics: what works at the micro level is different from what works at the macro level. Simply put, micro level actions can improve quality with efficiency, but macro level actions are needed to improve quantity with profit.

  17. derrida derider
    October 16th, 2017 at 18:44 | #17

    Not my field, but isn’t the education bit supposed to be captured by those Romer “endogenous growth” models? That is, it is part of A rather than K in Y=A f(K,L) and so represents capital widening rather than deepening. Or have I misunderstood all this?

  18. John Quiggin
    October 17th, 2017 at 04:16 | #18

    @derrida derider

    Education is typically represented by years of education, and included in labour. This is analogous to “embodied” technological change. Broader effects of the spread of knowledge arise in endogenous growth models and would, in principle, be in the residual. In practice, though, they are highly correlated with years of education.
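
    In the notation of the question above, the treatment looks something like this (a sketch of the convention, not a specific model):

```latex
% Quality-adjusted labour input: hours H are scaled by a human-capital
% index h(s) that rises with average years of education s.
\[
  Y = A\, f\!\left(K,\; h(s)\,H\right)
\]
% Education raises h(s) inside the labour argument. Only knowledge
% spillovers not captured by h(s) would show up in the residual A.
```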

  19. Moz of Yarramulla
    October 17th, 2017 at 13:11 | #19

    The great thing about Solow is that it’s a fairly easy measurement with no obvious bias. The more detailed measurements become exercises in answering “who is asking, about what, and why?”

    @Ikonoclast

    The problem is that as technology advances so do our expectations. As software advances, programmers do more complex things – face recognition and voice control are now common rather than impossible, for example. The productivity gains disappear/are necessary just to keep advancing at the same old exponential rate. Complexity scales exponentially, productivity struggles to keep up.

    If you look back it can be scary. Buy a 10 year old computer game from “Good Old Games” and run it on a modern computer… I do that while waiting for video transcoding to run in the background. But the games are primitive, low-res things that don’t have a social component. When those games were released transcoding was slow and took everything an expensive computer could supply. Now my phone can run half of them in an emulator.

    Programmer productivity is especially hard to measure because it’s a complex multifactor problem all by itself. Today’s news about WiFi being insecure (the WPA2 vulnerability) makes it a bit topical: does producing a usable system quickly count if it’s also insecure? Is that productive or not?

  20. Moz of Yarramulla
    October 17th, 2017 at 13:59 | #20

    Amusing point about productivity in general: http://www.harrowell.org.uk/blog/2017/10/15/demand-determined-productivity/

    There is quite a large set of firms for which productivity is fundamentally demand-driven. … The most common form of this metric is labour productivity, output divided by inputs, per hour worked. So the production cost of that report is just what it cost to employ me for the time I spent working on it. That’s the input. The output is going to be the price Ovum charges for it, multiplied by the number of copies we sell. … It’s therefore obvious that the more copies of the report that go out, the higher productivity will be.

    Viz, my “productivity” is to a large extent outside my control. I make something; that’s a fixed cost. I reproduce copies of it for nanocents to millicents each (marginal cost)… my productivity is determined by what I get paid for all those copies. Or nothing, should my work not sell. But productivity is unknowable in advance, and I might suddenly become extremely productive a considerable time after I die. How best to foster that productivity, I wonder? What actions should I take? What policy settings should the government use (please don’t suggest killing me)?
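
    In the spirit of the quoted example, a toy calculation (all figures invented) shows how the same hours of work yield wildly different measured productivity depending on demand:

```python
# Demand-determined labour productivity, as in the quoted example: the same
# report, produced with the same hours of work, yields very different
# measured productivity depending on how many copies sell.
# All figures are invented for illustration.

hours_worked = 80        # fixed cost: time spent writing the report
price_per_copy = 500.0   # what the publisher charges per copy

for copies_sold in (0, 10, 100, 1000):
    output_value = price_per_copy * copies_sold
    productivity = output_value / hours_worked  # output per hour worked
    print(f"{copies_sold:5d} copies -> ${productivity:,.2f} of output per hour")
```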

  21. Nevil Kingston-Brown
    October 17th, 2017 at 15:39 | #21

    If “capital per worker” is increasing, but “capital” is becoming cheaper, how is the capital per worker measured? Not in dollars presumably, as the cheapening of capital would mean that capital per worker appears to decrease.