Covid and the climate emergency

(Another extract from the climate chapter of my book-in-progress, Economic Consequences of the Pandemic)

The Covid-19 pandemic has accelerated a variety of social and economic trends, some beneficial and some harmful, that were already underway before 2020.

An important example of a beneficial effect has been an acceleration of the decline of carbon-based fuels. Lockdowns early in the pandemic produced a substantial reduction in demand for both electricity and transport. As well as providing a brief glimpse of a world with greatly reduced atmospheric pollution, the lockdown accelerated shifts in the energy mix that were already underway.

Since solar PV and wind plants cost nothing to operate, the reduction in electricity demand fell most severely on carbon-based fuels, particularly coal. As a result, the combined contribution of PV, wind and hydroelectricity to US energy generation surpassed that of coal for the first time in 130 years.

Official projections from the EIA suggest that coal use will return to its gradually declining trend in the wake of the pandemic, exceeding renewables for some years to come. However, the pace at which coal plants are being closed or converted to run on gas has accelerated during the pandemic. Meanwhile, despite weak demand, wind and PV plants are being installed at a record pace, partly because near-zero interest rates make capital investments cheaper.

The reduction in transport usage reduced demand for oil, at one point leading to a startling situation where the price of oil was negative, as unsold oil exceeded the capacity for storage. Although the price has recovered somewhat, it seems unlikely that transport demand will return to its previous trend.

At the same time, there has been continued progress, both technological and political, in the electrification of transport. British Prime Minister Boris Johnson recently announced that the sale of petrol and diesel cars would be prohibited after 2030, an advance on previous commitments. The decline in long-term interest rates also enhances the economic position of electric vehicles, which have higher upfront costs and lower operating and maintenance costs than petrol and diesel vehicles. https://www.prnewswire.com/news-releases/auto-loan-interest-rates-drop-in-may-to-lowest-level-since-2013-according-to-edmunds-301069143.html
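The interest-rate point can be illustrated with a small sketch. All the figures below (vehicle prices, running costs, ownership period) are invented for the illustration, not real data; the point is that the higher the discount rate, the less an EV's lower running costs count against its higher sticker price.

```python
# Hypothetical comparison: an EV with higher upfront cost but lower running
# costs, versus a petrol car, financed at a high and a low interest rate.
# All numbers are invented for the sketch.

def lifetime_cost(upfront, annual_running, years, rate):
    """Upfront cost plus the discounted stream of annual running costs."""
    return upfront + sum(annual_running / (1 + rate) ** t
                         for t in range(1, years + 1))

YEARS = 10  # assumed ownership period

for rate in (0.07, 0.01):
    ev = lifetime_cost(upfront=45_000, annual_running=1_000, years=YEARS, rate=rate)
    petrol = lifetime_cost(upfront=25_000, annual_running=3_800, years=YEARS, rate=rate)
    cheaper = "EV" if ev < petrol else "petrol"
    print(f"rate {rate:.0%}: EV ${ev:,.0f} vs petrol ${petrol:,.0f} -> {cheaper} cheaper")
```

With these invented figures, the ranking flips as the rate falls: at 7 per cent the petrol car comes out slightly ahead, while at 1 per cent the EV is clearly cheaper over its life.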

Not all energy-related developments associated with Covid have been positive. The convenience and cheapness of online taxi platforms like Uber and Lyft have reduced use of public transport in many cities. The pandemic, with its need to avoid crowded spaces like buses and subway cars, has exacerbated this trend. And, while the option of working remotely reduces the need for travel, it has encouraged a more dispersed workforce with less need to commute to the central city locations best served by public transport.

Climate change after the pandemic

Even as the future of US democracy remains in the balance, and as the pandemic still rages, I’m still working on my book The Economic Consequences of the Pandemic. At this stage, it’s hard to get a clear idea of how things will look when and if the pandemic is brought under control. One thing that is certain is that the problem of climate change/global heating will not have gone away. Over the fold, the intro for the chapter I’m writing on this topic. Comments, criticism and compliments all gratefully accepted.

The pandemic disaster has absorbed all of our attention. But the longer-running, and ultimately more dangerous disaster of global heating has continued to wreak its ever-increasing havoc.

The hottest temperature ever reliably recorded (130 F or 54 C) was observed on Sunday, August 16, 2020, at Death Valley. Unsurprisingly, the record temperatures gave rise to hundreds of disastrous fires throughout California. The scale of the fires was described by the New York Times as ‘staggering’, with 1.4 million acres burned by August. But this was not a one-off disaster. Fires in 2017 set a new record for their extent and damage, only to be eclipsed by even worse disasters in 2018. The fires of 2019, which saw much of the electricity grid shut down for days on end and 250,000 acres burned, seemed mild by comparison.

This pattern is not unique to the US. Massive fires have occurred from the Arctic to the Amazon. Over the Southern hemisphere summer of 2019-20, my own home country, Australia, experienced the worst bushfire season on record, with major cities blanketed in toxic smoke for weeks on end. Thirty-four people were killed by the fires themselves, but hundreds more died from the acute effects of the smoke, and many more are likely to die of long-term effects. Humans weren’t alone. Nearly 3 billion animals were killed or displaced, with whole species threatened with extinction.

On the Atlantic coast of the US, the climate drove a different kind of disaster. As has become normal in recent years, the first storms of the North Atlantic hurricane season arrived in May, before the official start of the season on June 1. In August, Hurricane Laura became the strongest on record (by windspeed) to make landfall in Louisiana, tying a record set in 1856. Only the speed with which Laura moved inland prevented catastrophic damage on the scale seen with disasters like Katrina and Sandy. By mid-November, the 2020 season was declared the most active on record. There is now very strong evidence that climate change is causing more severe hurricanes, with heavier associated rainfall and rapid intensification.

As with the pandemic, we had plenty of warning about climate change. The science of global warming has been understood since the 19th century, and evidence that warming is taking place began to mount from the early 1980s. The Intergovernmental Panel on Climate Change was established in 1988, and produced its First Assessment Report in 1990, leading to the adoption of the United Nations Framework Convention on Climate Change (UNFCCC).

The report established that global warming was taking place and that “emissions resulting from human activities are substantially increasing the atmospheric concentrations of the greenhouse gases: CO2, methane, CFCs and nitrous oxide. These increases will enhance the greenhouse effect, resulting on average in an additional warming of the Earth’s surface. The main greenhouse gas, water vapour, will increase in response to global warming and further enhance it.” However, considerable uncertainty remained regarding whether observed global warming was due to natural variability, human activity or some combination of the two.

The Second Assessment Report in 1995 presented stronger evidence that warming was being driven by greenhouse gas emissions. But already there was pressure from some governments to water down the conclusion.

A series of subsequent IPCC Assessment Reports has documented the increase in global temperatures and established, beyond any reasonable doubt, that human activity is primarily responsible. The most recent was the Fifth Assessment Report, released in 2014. The key finding:

Warming of the atmosphere and ocean system is unequivocal. Many of the associated impacts such as sea level change (among other metrics) have occurred since 1950 at rates unprecedented in the historical record. There is a clear human influence on the climate. It is extremely likely [probability greater than 95 per cent] that human influence has been the dominant cause of observed warming since 1950.

Transmission too

In my article arguing that electricity from solar PV (and wind) could soon be too cheap to meter, I didn’t mention transmission networks. That was for space reasons.

The case for public investment is actually stronger for transmission than for generation. Electricity transmission lines have the same cost structure as renewables (low operating costs and long lives), if anything to an even greater degree, meaning that the cost of transmission depends primarily on the need to secure a return to the capital invested.

More than this, the electricity grid as a whole is a complex network in which valuing the services of any individual component is just about impossible. That in turn means that relying on markets to make optimal investment decisions is untenable.

For these reasons, the electricity transmission network should never have been privatised. I’ve been arguing for renationalisation for years.

Amazingly, in the new low interest environment, this idea seems to be gaining traction, at least as regards new investment. Labor has proposed a $20 billion public investment. The government hasn’t gone that far, but is seeking to use its own borrowing capacity to provide low cost finance for transmission investment (a half-baked compromise, but better than nothing).

Too cheap to meter

That’s the headline for my latest piece in Inside Story, looking at the implications of zero interest rates for renewable energy sources like solar and wind. Key para:

Once a solar module has been installed, a zero rate of interest means that the electricity it generates is virtually free. Spread over the lifetime of the module, the cost is around 2c/kWh (assuming $1/watt cost, 2000 operating hours per year and a twenty-five-year lifetime). That cost would be indexed to the rate of inflation, but would probably never exceed 3c/kWh.
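The arithmetic behind that figure can be checked in a few lines, using only the assumptions stated above ($1/watt installed cost, 2000 operating hours per year, a 25-year lifetime, and a zero real interest rate, so no discounting):

```python
# Back-of-envelope check of the 2c/kWh figure, using the stated assumptions:
# $1/watt installed cost, 2000 operating hours per year, 25-year lifetime,
# and a zero real interest rate (so costs are simply spread over output).

capital_cost_per_watt = 1.00   # dollars per watt of capacity
hours_per_year = 2000          # annual operating hours
lifetime_years = 25            # module lifetime

# Lifetime output of one watt of capacity, in kilowatt-hours
lifetime_kwh = hours_per_year * lifetime_years / 1000   # 50 kWh

# With no discounting, cost per kWh is just capital cost over lifetime output
cost_per_kwh = capital_cost_per_watt / lifetime_kwh
print(f"{cost_per_kwh * 100:.0f}c/kWh")   # 2c/kWh
```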

The prospect of electricity this cheap might seem counterintuitive to anyone whose model of investment analysis is based on concepts like “present value” and payback periods. But in the world of zero real interest rates that now appears to be upon us, such concepts are no longer relevant. Governments can, and should, invest in projects whenever the total benefits exceed the costs, regardless of how those benefits are spread over time.
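The present-value point can be made concrete with a small sketch. At a positive discount rate, benefits arriving decades out are heavily discounted; at a zero real rate they count at full value, so only the undiscounted total of benefits and costs matters. The 50-year, $1-a-year benefit stream below is hypothetical.

```python
# How the present value of a long-lived benefit stream depends on the real
# interest rate. The project (a hypothetical $1 of benefit per year for
# 50 years) is invented for the illustration.

def present_value(annual_benefit, years, rate):
    """Sum of discounted annual benefits; at rate=0 this is benefit * years."""
    return sum(annual_benefit / (1 + rate) ** t for t in range(1, years + 1))

for rate in (0.07, 0.03, 0.0):
    pv = present_value(1.0, 50, rate)
    print(f"rate {rate:4.0%}: PV of $1/yr for 50 years = ${pv:.2f}")
```

At a 7 per cent rate, the 50 years of benefits are worth under $14 in present-value terms; at zero, the full $50 counts, which is why long-lived projects look radically better when real rates are at or below zero.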

Some facts, and claims, about the 21st Century Economy

In the process of working on my book-in-progress, The Economic Consequences of the Pandemic, I’ve been trying to integrate a number of facts about the economy of which I’ve been more or less aware for a while, along with claims I want to make, and put them together into a coherent account of the economic system prevailing (in advanced/developed economies) in the 21st century and how it differs from the industrial goods economy of the 20th century.

As a step towards this, I’ve put together a list of factual claims which I think can be established reasonably firmly, along with claims I want to make that will be more contentious. My plan is to put this together into a coherent analysis, including supporting evidence. So, I’m keen to get good supporting links for any of these points (I have quite a bit, but more would be helpful). I also want to be sure I’m not missing contrary evidence, and to adjust the claims if necessary, so please point this out also.

Facts (I think)

  • Most economic activity in the 20th century, including services such as wholesale and retail trade, was fairly directly related to the production and distribution of goods
  • This is no longer true: most economic activity is now related to human services, information services and finance, and these are at most indirectly related to goods production
  • Real interest rates for government debt and high-grade corporate debt have been below zero since the GFC and seem likely to remain there permanently under current conditions
  • Massive issues of government debt during the pandemic crisis haven’t changed this
  • Net private business investment (non-residential) has been declining relative to GDP/national income since at least 2000
  • Service industries are less capital intensive than goods industries
  • Information economy firms (Facebook, Google, etc.) invest very little, even counting R&D
  • Government investment in traditional infrastructure has been falling since 1970s, at most partially offset by private infrastructure
  • Corporate profits high, mostly derived either from financial sector or from “intangible” assets in IT.

My claims

  • Finance sector profits even higher if payments to managerial level in finance sector are treated as part of profit
  • Intangibles = monopoly
  • Revenue and profits in finance and Internet do not arise from sales to final consumers, and bear no obvious relationship to consumer welfare
  • Implies similar regarding wages for market work
  • Incentives don’t work in this kind of economy (if they ever did)
  • Unmet needs for public investment in human services: health, education, aged care, early childhood, social work
  • Capacity to meet these through short term increase in public debt, long term increase in taxation

Inequality and the Pandemic, Part 1: Luck

Here’s an extract from my contingent* book-in-progress, Economic Consequences of the Pandemic commissioned by Yale University Press. Comments and compliments appreciated, as always.

The Covid-19 pandemic has taught us several things about inequality, or rather, it has dramatically reinforced lessons we, as a society, have failed to learn. The first is the importance of luck in determining unequal outcomes.

Some of us will get Covid-19 and die or suffer lifelong health consequences. Others will lose their jobs and businesses. Many, however, will be unaffected or will even find themselves better off. Some of these differences may be traced to individual choices that are sensible or otherwise, such as deciding whether to wear a mask in public places. But mostly they are a matter of being in the wrong (or right) place at the wrong (or right) time.

Moreover, this isn’t specific to the pandemic. From the moment we are born, luck plays a critical role in our life chances. Our families may or may not be in a position to help us succeed, and may or may not hold together through our childhood. A child born into the bottom 20 per cent of the US income distribution has only a 4 per cent chance of ending up in the top 20 per cent. The opposite is true at the other end of the distribution with the striking exception of Black children, especially boys.


These facts have been known to social scientists for decades. Yet until recently, in the face of glaringly unequal outcomes, most Americans comforted themselves with the idea that the United States was a land of opportunity where everyone who worked hard had a fair chance of doing well. This was true a century ago, but now there is more mobility between economic classes in European countries than in the US.

That’s not to say everything in Europe is rosy. Piketty examined the UK and France as well as the US and found growing inequality in all three. It seems likely that other European countries are on the path towards what Piketty calls a patrimonial society, where inherited wealth is the most important determinant of success.


Luck doesn’t end with the lottery of family background. Young people who enter the labour force during a recession will experience permanently reduced life chances compared to those who enter during a boom. And at an individual level, lucky or unlucky breaks of various kinds are much more important than many of us like to believe. Robert Frank provides detailed evidence in Success and Luck: Good Fortune and the Myth of Meritocracy.

The pandemic has reinforced this lesson in the most brutal way possible. As is usual, the poorest members of society have been most exposed both to the risk of death and disease and to economic hardship. But everyone is vulnerable, and it is a matter of chance whether any of us gets infected, and whether the consequences are harmless, severe or fatal. Similarly, exposure to economic damage is largely random, depending on the way in which the pandemic affects different industries and regions.

The randomness of economic success implies that concerns about the incentive effects of high taxes on those at the top of the income distribution are misplaced. If the lucky winners in the economic lottery are discouraged from working (something unlikely to happen on a significant scale until marginal tax rates exceed 70 per cent), there are plenty of unlucky runners-up who can replace them.

  • Contingent because I’m writing on the assumption that Biden wins the US election, and takes office. While a Trump win would be an object lesson in the importance of luck, it would render any commentary on responses to the pandemic pointless as far as the US is concerned and would have drastic consequences for the rest of the world for which I have no analysis to offer at the moment

Sitting next to Nelly*

One of the big questions about the shift to working remotely has been “what about new staff?”. To spell this out, the idea is that, while experienced workers can do everything they need to online, new employees will need personal contact to pick up tacit knowledge and firm culture. It’s inherent in the argument that these terms are difficult to define with any precision – if not, they could be formalised and taught.

This is part of a debate that’s been going on for a couple of centuries, between proposals for formal education in work-related skills and learning on the job, sometimes through apprenticeships and sometimes through “sitting next to Nelly”, that is, picking up the relevant skills by working with people who have already acquired them.

Before 1800, and with the partial exception of ministers of religion, on the job training was the only kind on offer. Since then, starting with lawyers and doctors, formal education has steadily expanded at the expense of on the job training, across a wide range of occupations and in many different countries with radically different labor markets. That includes some economies and industries where lifetime employment by a single firm has been the norm and others where work is largely done on a contract or ‘gig’ basis.

This process has always been contentious. Terms like “credentialism”, “overqualification” and “academic” (used pejoratively) have set the tone of much of the discussion. Nevertheless, there has been little evidence that the trend has been or will be reversed, and no one has managed to find, and sustain, a successful alternative.

The work of hiring, ‘onboarding’, promoting and firing employees has not been exempt from the process. “Human resource management” emerged as a distinct profession in the second half of the 20th century, taking over much of this work from individual managers. HR departments have in turn begun to outsource some of these tasks to specialised firms such as headhunters and ‘separation management advisers’, though onboarding still appears to be done in-house for the most part.

The shift to remote working will provide another test of this process, at least when firms start hiring new staff on a large scale. Some of the concerns expressed about lack of in-person contact will probably prove to be well-founded (though not insuperable). Others, I think, will not. After a few in-person (and ideally one-to-one or small group) meetings to be introduced to new colleagues, most new hires will be able to learn the ropes through email and Zoom.


Firm-specific skills and working from home

One of the central features of the debate about working from home is that it leads to the loss of random, but productive, encounters with colleagues. I’ve responded with the observation that some of my best research ideas have come from largely unplanned encounters on the Internet.

It’s just struck me that there is a conflict here between the interests of workers and those of firms and managers.

A lot of universities (or, more precisely university managers), think of themselves as developing and promoting a corporate brand. In this context, research collaboration within the university (particularly if it is trans-disciplinary) is viewed very positively, while collaboration with other universities is less well-regarded. But for individual academics, the big rewards come from high-profile work within tightly defined fields, which implies a desire for collaboration with other people in the same field who will, in general, be located elsewhere. While intra-university collaboration may be rewarded in internal promotion decisions, the outside opportunities are greatest for people with external collaborators. Those outside options are routinely used as a bargaining chip in negotiations over salary.

This issue isn’t specific to universities. Labor market theory distinguishes between firm-specific skills and general skills (which are of value to any employer). Back in 1964, Gary Becker made the argument that firms would be willing to pay the cost of firm-specific training for their workers, but not for general training which increases their outside opportunities. (This seems entirely convincing to me, although the empirical evidence I found on a quick search is both limited and inconclusive).

What applies to training also applies to serendipitous encounters. Collaboration with co-workers can enhance productivity within the firm, but doesn’t do much for your market value outside. Conversely, if workers enhance their productivity at home by making more use of industry discussion groups, Skype chats with people in other firms who are addressing similar problems, and so on, that enhances their bargaining power relative to their employers.

In this context, it’s striking that the hardest push for a return to the office is coming from the finance sector, led by JP Morgan. Even though textbook finance is all about hard numbers on earnings, risk and so on, the industry actually operates largely on personal contacts, networks and exchanges of favours, particularly information. That’s why it’s concentrated in a handful of global cities, and why so much attention is paid to issues like “poaching” of staff, non-compete clauses and the like. It’s obviously in the interests of employers to build up internal networks and control external interactions.

As with all these issues, my ideas here are provisional and almost certainly wrong in some respects. So, feel free to correct me.