Wikipedia doubling time
The English language version of Wikipedia had its one-millionth article on 8 March, and has just passed 1.1 million, 50 days later. That gives an implied doubling time of about a year. The doubling time seems to be fairly stable, since the 500 000 mark was reached in March 2005, and 250 000 in April 2004.
A straightforward extrapolation gives a billion articles in 2016. I’ll open this up for comments now, then give my own thoughts (taking advantage of yours, naturally).
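The arithmetic behind the extrapolation can be checked in a few lines. This is just a back-of-the-envelope sketch using the figures quoted above (1 million articles growing to 1.1 million in 50 days):

```python
import math

# Figures from the post: 1.0M articles growing to 1.1M in 50 days.
growth_factor = 1.1e6 / 1.0e6
doubling_days = 50 * math.log(2) / math.log(growth_factor)
print(f"doubling time: {doubling_days:.0f} days")  # roughly a year

# Doublings needed to go from 1.1M articles to 1 billion,
# assuming the exponential trend simply continues.
doublings = math.log2(1e9 / 1.1e6)
years = doublings * doubling_days / 365
print(f"about {years:.1f} years to a billion")  # ~10 years, i.e. around 2016
```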
Update over the fold
Lots of fun in the comments, here and at CT, and now I’ll try to be at least semi-serious. If a trend can’t be sustained, it won’t be. Or, if you prefer, exponential curves eventually become logistic. So, the real point is to work out the constraints on Wikipedia’s growth, and take a guess at what the endpoint of this process will look like.
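The exponential-to-logistic point can be illustrated with a toy curve. In the sketch below, the ceiling of 10 million articles is purely an assumption for illustration, not a prediction from the data:

```python
import math

def logistic(t, n0, r, K):
    """Articles after t years: tracks exponential growth for small t,
    then saturates at the carrying capacity K."""
    return K / (1 + (K / n0 - 1) * math.exp(-r * t))

n0 = 1.1e6       # articles in April 2006
r = math.log(2)  # roughly one doubling per year, per the observed trend
K = 10e6         # hypothetical ceiling of 10 million articles

# Early on, the logistic curve stays close to the pure exponential...
print(round(logistic(1, n0, r, K)), round(n0 * 2 ** 1))
# ...but a decade out it approaches K instead of a billion.
print(round(logistic(10, n0, r, K)), round(n0 * 2 ** 10))
```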
It’s a pretty safe bet that the current phase of rapid growth will take Wikipedia to 10 million articles, if not within the 3.5 years implied by the recent exponential trend. There’s no shortage of topics, plenty of room for growth in the number of contributors, and no obvious problem handling such an expansion within the current software and scheme of social organization. The result would be a general reference system that would be better for that purpose than anything that’s been seen previously. Among other things, such a system would replace Google for many purposes (though not the ones that make most of Google’s money).
Going beyond that to 100 million articles would imply some radical changes. As far as content is concerned, something on this scale would compete across the board with specialist reference works like national dictionaries of biography, the Palgrave Dictionary of Economics and so on. An obvious way to approach this goal would be to subsume a number of existing projects, as was done with the 1911 Britannica, and the gazetteer entries on US towns. But that would certainly require new organizational and licensing arrangements, and probably a more complex architecture than exists at present. More importantly, it’s hard to see something on this scale functioning without a substantial number of full-time paid staff. On this scale, Wikipedia coverage of current events would also be directly competitive with the mainstream media.
Another tenfold increase and Wikipedia would be comparable in size with the Internet as a whole (the visible Internet currently has about 10 billion pages). As Joel Turnipseed pointed out in comments, the obvious analogy is with Borges’ map, which was on the same scale as the country it described. Extrapolations to this point and beyond are fun, but probably best left to science fiction.
My best guess is that sometime around 10 million articles, the growth rate will slow, eventually becoming linear. At this point, either some other project will take off, or Wikipedia itself will transform into something radically different.