The question of whether technological progress is slowing down has been around for a fair while, and is up for discussion again (hat-tip: Jack Strocchi). In many sectors of the economy, notably transport, the answer is very clearly “Yes”.
On the other hand, Moore’s Law (speed doubling every 18 months) still seems to hold for computer chips and they are playing an increasingly important role in the economy. So although progress in most areas is slower than the historical average, progress in this central area is faster.
In the end, it all comes down to the long-run price elasticity of demand for computation. If this is less than one, total revenue from the sale of computational services will eventually decline relative to national income, and the ultimate situation will be one where computation is effectively free, but no longer an important source of progress. If the elasticity is greater than one, the computation-based share of GDP will rise over time, as previously separate sectors like music, video and so on are computerised.
My reading of the evidence is that the value so far is very close to one, which accounts for some of the ambiguity surrounding this question.
I don’t know the work of the technological forecaster mentioned in the article, but this literature has generally had a bad reputation for relying heavily on mechanical extrapolations. Whether you are a technological optimist or pessimist seems to amount pretty much to an act of religious faith. At best you might want to be on your guard against possibly poor long-term outcomes.
Computational speed is no longer doubling every 18 months. AMD and Intel have pretty much hit a brick wall with heat dissipation problems. But density is continuing to double in line with Moore’s law, which is why performance increases are now coming from putting multiple cores on a single chip, instead of cranking the GHz up.
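As a rough illustration of what continued density doubling implies, here is a small sketch of the compound growth involved (the 24-month doubling period is an assumption for round numbers; quoted figures for Moore’s law range from 18 to 24 months):

```python
# Compound growth implied by Moore's law: transistor density doubling
# every ~24 months (an assumed period; quoted figures vary).
def density_factor(years, doubling_period_years=2.0):
    """Factor by which density grows over the given number of years."""
    return 2 ** (years / doubling_period_years)

# Over a decade that is five doublings, i.e. a 32x increase in density,
# even with clock speeds stuck against the heat-dissipation wall.
print(density_factor(10))  # → 32.0
```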
I am being lazy in not looking this up for myself; but what is “price elasticity” and why is “1” the critical value?
Price elasticity is the log derivative of demand, that is the ratio of the marginal proportional change in quantity demanded to a given marginal proportional change in price.
If the elasticity is greater than (less than) 1, total revenue (price*quantity demanded) increases (decreases) as price falls.
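That revenue relationship can be checked with a minimal numeric sketch, using a constant-elasticity demand curve Q = A·P^(−e) (the functional form and the numbers here are illustrative assumptions, not estimates):

```python
def revenue(price, elasticity, scale=100.0):
    """Revenue P*Q under constant-elasticity demand Q = scale * P**(-elasticity)."""
    quantity = scale * price ** (-elasticity)
    return price * quantity

# Elasticity > 1: halving the price raises revenue.
print(revenue(1.0, 2.0), revenue(0.5, 2.0))  # 100.0 vs 200.0
# Elasticity < 1: halving the price lowers revenue.
print(revenue(1.0, 0.5), revenue(0.5, 0.5))  # 100.0 vs ~70.7
```

At elasticity exactly 1, revenue is unchanged as price falls, which is why 1 is the critical value.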
As I understand it, in addition to the heat problems, chips are very rapidly approaching the quantum limit where the circuitry no longer behaves in our classical-physics way. I remember reading five years ago or so that this was predicted to come into play around 2007, although it’s no doubt been pushed out by various incremental innovations along the way, unforeseen at the time.
Still, there is a minimum feature size that we can’t get below and have our current style of circuits work in the normal way, and we are most likely going to start pressing against it within ten years. Quantum computing or some other breakthrough may change all this, but current electronic circuitry is definitely limited in its development on the current path.
The story in New Scientist is just rubbish.
It uses two proxies for the rate of technological advancement:
– a list of “7200 key innovations listed in a recently published book, The History of Science and Technology (Houghton Mifflin, 2004). ”
– the number of patents granted each year in the US, divided by the population of the US in those years.
The first proxy depends solely on the opinion of the authors of the book, and not on any kind of objective criterion. The second proxy is bizarre. The graph shows the number of patents per head of population in the US being roughly constant at 0.00025/year since 1850. But what has happened to the US population since 1850? It’s grown exponentially, and therefore the absolute number of patents issued has also grown exponentially. Duh.
Data Storage is the new ‘key’ – the cost of storing an e-mail now is less than the cost of deleting it.
Just think about the possibilities…
One of the long-run consequences of technological progress, especially when combined with the development of democratic political institutions and global trading relations, is to render redundant what have historically been seen as male “advantages” over women (size, strength, aggression, etc.). An article in this morning’s Sydney Morning Herald at:
http://www.smh.com.au/news/world/another-giant-leap-/2005/07/05/1120329448366.html?oneclick=true
suggests that in a cutting-edge field of human endeavour, male biological characteristics may well be a serious disadvantage. And when one considers the expenditure of global resources on sustaining the additional bulk and greater metabolic resource throughput of 3 billion male bodies, the feminist joke about the useless bit of flesh at the end of a penis takes on added significance.
So, is there any good reason why men continue to exist?
Troll…
But I’ll take the bait anyway. Norton, if you believe males are redundant and the planet would be better off without them, you should set an example for us all. In your case, you’d probably be right.
On the original question. Seems like price elasticity is not the issue when determining technological progress due to computation. Performance improvement has been so dramatic over the last several decades that the current uses of computation lag well behind the current capabilities.
Or in other words, it is the price of writing and maintaining software that is holding back technological progress, not the cost of computation itself. I believe the next decade will be interesting as software development shifts to India and China. Products that are currently economically infeasible in the West will become feasible, so in addition to having a much larger pool of available engineers, we should also see previously marginal applications developed.
I think there is still plenty of progress (arguably technological) evident in the applications that make use of technology, even if intel etc are slowing down in advancement. The coolest thing I’ve seen recently is earth.google.com, where google have made satellite photos of the whole globe available. Type in Sydney and watch it zoom in on the CBD, then scroll over and find your own house in satellite photo.
And the speed of data carriage is still going up nicely. I’m paying the same in real terms for a 1.5Mbit connection now that I was paying for 28.8kbit dial-up 10 years ago.

In support of the post: I don’t have a computer.
In some areas it has slowed down. In 1990 I was still using MS-DOS and DOS menu as an operating system yet in 1995 it was Windows 95 which was a sort of GUI operating system totally different to MS-DOS.
Yet in 2005 I am still using Windows 2000 and all programs still run the same. I still prefer W2K on my home system and there is nothing that I cannot run (except movie maker) on my W2K system after 5 years.
Mind you, my first real computer was a 386DX40 with a 100MB hard disk that I purchased in 1995. I now have a PDA with a 128MB SD card that is faster and has more storage than my first computer. So hardware has progressed, although the operating system cycles have slowed a little.
Look at fighter aircraft – 18 years separated the Sopwith Camel and the Spitfire. Only 25 years separated the Camel and the Meteor/Me262. The F16 and F15 are now both 25 years old and still in front-line service, albeit with upgrades. The SR-71 is still unsurpassed in speed and range, and it was designed and flown in the sixties. Even a well-flown Skyhawk from the late 50s can still shoot down a modern fighter.
Technology seems to rise rapidly and then plateau as it starts to reach physical limits of materials.
tcfkaa,
I’ve obviously struck a raw nerve. If you can’t see the humour in my post, you have some issues to deal with.
“Technology seems to rise rapidly and then plateau as it starts to reach physical limits of materials.”
Materials can easily be improved. Almost all the examples above hit the limits of human capabilities. You can’t think fast enough to use your CPU cycles, pilots can’t handle the stresses of greater speed and manoeuvrability, faster cars/trains have numerous safety problems…
To go back to JQ. Once we stop being able to utilise the extra technical grunt, price elasticity will fall and things will level out. There are still likely to be great gains available if you can remove the human element from the process. A lot of future technology improvements will probably depend on artificial intelligence to do so.
norton,
which raw nerve would that be? That I am insecure about my own right to exist? Hmmm, I’ll need to speak to my therapist about that.
Anyway, irony is often lost in text, as has been pointed out here before. So, my bad. Perhaps we should give hints: enclose all irony in <irony> tags.
And following this thread, what is the first task we should set the AIs? Building AIs superior to themselves. And then superior to themselves. And so on.
So assuming no fundamental physical limitations, we’ll reach a tipping point, where there’ll be an exponential increase in intelligence. Scary but exciting. Pity I am very unlikely to live to see it.
Sorry about my last post which could have been titled ‘my brain hurts’.
but seriously, can an economic analysis of the rate of technological progress adequately take into account the material and physical limitations on such progress?
Can someone explain how the physical limitations on technological progress are reflected in the long-run price elasticity of computation and its component parts?
Much innovation is spurred by a crisis. Many significant advances were made because of World War Two and associated build up.
I have a very strong hunch that the end of cheap oil will spur significant technological advances (or changes perhaps, rather than advances), with personal transportation changing rapidly. There have been substantial bits of work done in stationary energy, if the solar tower does actually get built outside Mildura that will be a very significant development.
I think the internet and general connectivity has been impressive enough to be legitimately called a revolution. Sure the OS and the fibre optic cable may be pretty much what we’ve had for quite some time now, but the fact that we’re all here typing is still quite extraordinary. I no longer use travel agents or actually visit a bank, I don’t mail anything much at all, these have profound long-term social impacts.
“Almost all the examples above hit the limits of human capabilities. You can’t think fast enough to use your CPU cycles, pilots can’t handle the stresses of greater speed and manoeuvrability, faster cars/trains have numerous safety problems…”
Technologies mature, and it has nothing to do with human capabilities. Air transport hasn’t remained sub-sonic because of human capabilities but because the technology doesn’t make super-sonic transport cost-effective. A new engine type, say the scramjet, may yet change this, but current jet technology is a mature technology and improvements are refinements.
Still, I think the basic thesis of the paper is wrong. As SJ pointed out, the raw number of patents is probably a better measure than patents/head. One invention affects everybody, not just the inventor.
I rather think Pr Q’s summary of the matter leans too heavily on the narrow-minded financialism of economic rationalism:
Surely this ignores the fact that public goods have external economies which can provide utility in consumption, if not profitability in production. From which it follows that costless computing power might still provide the potential for further improvements in consumer well-being, assuming GIGO. Thus costless computing resources – whether provided as zero-price utilities by the state or open-source gratuities by the community – could still be an engine of technological progress. Although it would be difficult to measure how much, given the implicit subsidy from political or communal sources.
Russ – some of what you say is true; however, not all the limitations on technology are human. Titanium melts at a certain temperature, etc. Designers call it pushing the envelope. Speed reduces manoeuvrability and so on in an elastic envelope: when you bulge one area, another area contracts. However, as you correctly state, the limiting factor on turning is the pilot, as 9G is about the limit. There are no such limits on speed, as the X-15 flew to Mach 5, though no one would call it a fighter.
With computer software the problem now is the complexity. A new GUI operating system is hard as more bugs creep in with each step in functionality. Perhaps the limit here is debugging.
I would have thought that you have to look at the institutional aspects more. In areas such as telework or telecommuting, we are a long way from exploiting the technological progress that has already been made – mainly because of the stupidity of managers who feel they have to be looking over someone’s shoulder. There are also areas such as housing where technological progress is being held back by the large amount of old stock we have.
The trend has been for increases in computing power to be used to reduce the cost of creating software.
For example, software written in Java takes about a quarter as long to write as it would take to do the same in C (I’m not entirely sure about the numbers, but it’s around that). This is because it is a higher-level language, which gets the computer to do more of the thinking to save the programmer from having to explicitly say everything he wants the program to do. The cost savings are even greater when you consider that Java software does not have to be re-written for different platforms.
But software written in Java requires far more computer resources to run at acceptable levels of performance than software written in C. Any current desktop computer can comfortably run half a dozen Java programs simultaneously. But running even one Java program on a computer from 1995 would be unworkably slow. However, the same program written in C may work almost equally well on both the 1995 and 2005 computers.
This contrast is even greater with even higher-level languages such as Perl. Writing programs in Perl is a snap, but they run much slower. However, with today’s very fast computers the slowdown is often not even noticeable.
And it does not stop with higher-level languages. Today, applications can consist not of actual programs but of various existing applications that have been customised.
For example: In the past if you wanted to write a program to run a Wikipedia server you would need to write it in C which would require a large dedicated team for maybe a year or so.
Today you can run MySQL, Apache, Linux and PHP on a cheap server. Write a few scripts and voila. A wiki server. I am simplifying a bit but it could definitely be done by one dedicated person.
Massive productivity improvements in software creation, due primarily to increases in computer power.
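The cost trade-off behind that argument can be sketched with purely illustrative numbers (the hourly rates, the 4x development-time ratio, and the 10x runtime penalty are all assumptions, not measurements):

```python
def total_cost(dev_hours, cpu_hours, wage=80.0, cpu_rate=0.10):
    """Total project cost: programmer time plus machine time."""
    return dev_hours * wage + cpu_hours * cpu_rate

# A task taking 400 hours in C vs 100 hours in a higher-level language,
# with the high-level version burning 10x the machine time.
c_cost = total_cost(dev_hours=400, cpu_hours=1_000)
hl_cost = total_cost(dev_hours=100, cpu_hours=10_000)
print(c_cost, hl_cost)  # 32100.0 vs 9000.0
```

With cycles this cheap relative to wages, the high-level language wins easily; on 1995-era hardware prices, the comparison could go the other way.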
Ender – you are right, but then I didn’t claim all technology is limited by human capabilities, merely the examples mentioned. In the case of fighter planes, the pilot has been a limiting factor for the past 30 years. Recent technology improvements have been in automated weaponry systems which is what you’d expect.
I see the real gains in computer software coming from applications that can make use of the almost unbounded amount of data that is online. Google being the fore-runner of some aspects. I find most bugs tend to be problems doing data conversion for the end-user. So, again, AI to handle things without their input is hard, but conceivable.
Steve – cost-effectiveness and human capability to make use of a piece of technology are pretty closely related – as are cost and the technical barriers to be overcome. If you read back through the thread you’ll note that most “technical” limitations are followed by suggestions of possible solutions. However, if you are looking at, for example, why urban transport is going backwards in terms of average speed, you need to look at the human limitations – our inability to drive a car at speed without hitting things, the way we choose to live, etc.
Not quite correct. Once compiled by the just-in-time compiler (JIT), most java programs execute within a factor of 2 of the performance of their well-optimized C counterpart, and sometimes faster than the C code due to the dynamic optimizations the JIT compiler can do that a static C compiler cannot do.
However, java does consume a lot more memory, because you have the whole JVM (Java Virtual Machine) infrastructure to load into memory.
Any computationally bound program will run about 10-100 times faster on the fastest 2005 PC than on the fastest 1995 PC.
SJ, to be fair to New Scientist they note the first problem you mention.
As for the second – unless the population continues to grow exponentially into the indefinite future, we’d better hope that the correlation between population growth and the total number of patents issued is coincidental and won’t continue.
A more serious problem with using patents to measure innovation is that not all patents are equal. Before Thomas Edison, only the most important and obviously commercial inventions were patented. Edison patented everything that wasn’t nailed down – if he could, he would have patented the idea of patenting everything that wasn’t nailed down.
Since Edison’s time, institutions like Bell Labs have continued to patent anything that’s patentable.
In the last couple of decades, we’ve seen the rise of “business process” patents and patents over organic molecules and genes – which would probably tend to inflate the number of patents being issued. (If current patent laws had applied in the 1950s, could Crick et al have patented DNA?)
I’m not convinced by the argument in the paper but I think the problem lies in the fact that innovation is poorly defined (as you noted) and that not all scientific or technical advances generate the same level of practical innovation and the time-lag between discovery and practical adoption is long and variable.
Superconductivity was discovered in 1911 – the first mainstream products using the new high-temperature superconductors are just now coming onto the market.
Sonoluminescence was discovered in 1934, until a few years ago it was considered a curiosity. A few years from now it may be the basis for a new form of energy generation (or not).
The fuel cell was invented almost 60 years before the internal combustion engine.
In a world with a fixed number of products, the role of computation is to minimise the amount of energy and material used to produce those products. However, this is a completely unrealistic model; computation is used in the creation and destruction of products. So the relationship between price and computational cost seems uncomputable itself. It is therefore a contingent measurement. Furthermore, the cost of computing is never zero, as it requires energy and materials. For a fixed set of products, the decreasing cost of computation allows an increased amount of computation to decrease the energy and material costs of production.
I was thinking more about memory requirements when saying that Java would not run on a computer from 1995. The damn thing uses up about 20MB per instance, which would grind pretty much anything from that era to a halt.
It is not in fact the case that increases in computing power have enabled improvements in programming productivity. There have been a great many programming productivity techniques from very early days, only they weren’t favoured by buyers (Forth, TRAC, even Basic on mainframes).
What made rapid development environments desirable to managers was the changing cost trade offs between programmers and hardware. More powerful hardware did not make them possible. The reason the older solutions were not adopted is in part sociological, in that resource-hungry tools were preferred by groups who manage resources.
If you want to see a really convenient RDE, look at Euphoria (that link is from memory – if it doesn’t work, google). Euphoria matches people’s common previous experience, so reducing learning curve, yet provides a far easier framework than Cobol and uses fewer resources than Perl. But socio-economically, the business world at various times preferred one or the other of those approaches instead.
On transport progress, it is entirely possible that exponential speed growth has not stopped in technological terms, only its adoption. Concorde lost out to the US approach of fast subsonic jets, largely for commercial reasons that were themselves part of an environment manufactured to favour the less advanced US aircraft industry. It doesn’t reflect technological rates of progress at all, but again socio-economic adoption processes.
On checking, I found that the link should have been Euphoria.