Sun 29 April 2018
Some dubious mathematics and lots of guesswork. Maybe the conclusion is within a few orders of magnitude of correct...
I found a paper by Martin Hilbert and Priscila López that helps to answer this question. It was published in the journal "Science". "Science" is gatekeeping access to the content, so you can't actually read it very easily. Fortunately, SciHub to the rescue: The World's Technological Capacity to Store, Communicate, and Compute Information (if the link doesn't work, just try one of the many SciHub mirrors you can find all over the web). Hilariously, "Science" magazine themselves have quite a good article on how great SciHub is.
The part of the paper we are most interested in is figure 5, "World's technological installed capacity to compute information on general-purpose computers, in MIPS":
Assumption: MIPS is a useful way to measure computing power.
(Some interesting asides: they estimate that, in 2007, a good 25% of the entire world's computing power was in the form of videogame consoles. Even more astonishing, in 1986, 41% was pocket calculators! Supercomputers have never been more than about 0.5%. Abundance of a particular type of device is clearly a much more important multiplier than raw power. Of course, it's also likely that the world's most superdupercomputers are kept secret...)
The chart gives us data from 1986 to 2007, but there seems to be a clear trend, so let's extrapolate it back in time a bit (note the scale is logarithmic).
Assumption: Extrapolating this data backwards in time is valid.
The raw data is in "table SA3", which is not included in the paper, but is in the "supporting online material" which is thankfully available direct from "Science". All 254 pages of it.
Eh? Only four data points? What about all the other points shown on the chart in figure 5? Never mind. Onwards.
I used PlanetCalc's function approximation tool to calculate an "exponential regression" to fit the 4 data points, where y is the total computing power in MIPS, and x is the year.
y = e^(0.4868x − 947.4013)
It has a "correlation coefficient" of 0.994, which sounds good.
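As a quick sketch, the fitted curve can be evaluated directly. The helper name `world_mips` is mine, not from PlanetCalc; the constants are the regression coefficients above:

```python
import math

def world_mips(year):
    """Estimated total world computing power in MIPS for a given year,
    using the exponential regression fitted above."""
    return math.exp(0.4868 * year - 947.4013)

# With this slope, world computing power doubles roughly every
# ln(2) / 0.4868 ~= 1.4 years.
print(f"{world_mips(1986):.3g} MIPS in 1986")
print(f"{world_mips(2007):.3g} MIPS in 2007")
```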
Let's try a quick sanity check. The first computer capable of at least 1 MIPS was the IBM 7030, first produced in 1961. Our exponential function places world computing power in 1961 at 834 MIPS. According to the History Learning Site, there were 250 computers in total in 1955, and 20,000 in 1965. This might put the number in 1961 at around 3,500 computers (again by exponential regression). If they averaged about 0.24 MIPS each (834 MIPS divided by 3,500 computers), then we're pretty close. That sounds believable. At a minimum, 834 MIPS doesn't look like an unrealistic over- or under-estimate.
Assumption: The exponential extrapolation I've done is accurate enough to be useful.
The exponential regression estimate of world computing power gives us this chart:
For a representative figure, let's look at renting a CPU on DigitalOcean. The most cost-effective way to rent CPU time on DigitalOcean is with a $5/mo instance. You get 1 CPU at a total cost of $5/mo + VAT = ~£4.35/mo. According to /proc/cpuinfo the CPU is an "Intel(R) Xeon(R) CPU E5-2630L v2 @ 2.40GHz". (I checked a handful of DigitalOcean machines I have, and they all listed the same CPU).
/proc/cpuinfo tells us that this CPU offers 4800 "BogoMIPS". Unfortunately BogoMIPS (as the name suggests) are not comparable with actual MIPS.
I found an article by Techspot which suggests an "Intel Xeon E5-2630 v4" provided around 37,000 MIPS in their 7zip benchmark. I don't know how close a v2 CPU is to a v4, and I don't know if Techspot's MIPS are comparable to the ones used by Hilbert and Lopez, but we'll just have to assume that they are close enough.
Assumption: A DigitalOcean CPU gives us about 37,000 MIPS. This will be wrong in the following ways: the 7zip benchmark is not the same type of MIPS; each droplet has access to a shared CPU, not the entire thing.
Assumption: You can actually rent tens to hundreds of thousands of CPUs on DigitalOcean without them putting their prices up.
At £4.35/mo for 37,000 MIPS we get a cost of 12p per month per 1000 MIPS. We already have an estimate of how many MIPS were available in the entire world for any given year, so let's multiply that by 12p per month per 1000 MIPS to get an estimate of how much it would cost today to rent those MIPS.
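Putting the pieces together, a rough cost function might look like this. The names `world_mips` and `cost_per_month` are mine; the constants are the regression coefficients and the 12p-per-1000-MIPS figure from above:

```python
import math

RENT_GBP_PER_1000_MIPS = 0.12  # ~12p per month per 1000 MIPS

def world_mips(year):
    """Fitted exponential estimate of total world computing power in MIPS."""
    return math.exp(0.4868 * year - 947.4013)

def cost_per_month(year):
    """Estimated monthly cost (GBP) to rent that year's total world MIPS."""
    return world_mips(year) / 1000 * RENT_GBP_PER_1000_MIPS

for y in (1970, 1975, 1979, 2020):
    print(y, f"£{cost_per_month(y):,.0f}/mo")
```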
Again note that the y axis is a log scale. This makes it easier to differentiate the smaller values, but the true nature of the data is much more striking with a linear scale:
It's practically free to rent an equivalent to all of the computing power that existed in the entire world at any time before 1960, and in fact you can't even rent a single CPU slow enough to be equivalent to the world's computing power for any time before 1968. That's really quite remarkable. By way of comparison, if you wanted to rent an equivalent of all of the internal combustion engine power that existed before 1960, it would probably cost the same, within 1 order of magnitude, as what it cost at the time.
Renting historical computing power starts to sting, at £150/mo, if you want to rent all of the power that existed in 1975, and by 1979 you're spending over £1,000/mo.
If you want to rent all of the computing power that will exist in 2020, that will set you back about £500 billion/mo, although it should of course be noted that that much computing power won't physically exist in the world until 2020 arrives, and even then it likely won't all be in the form of DigitalOcean droplets.
In answer to the title: it might cost around £13/mo today to rent an equivalent of all of the computing power that existed in the entire world in 1970. So there we have it.
Estimates of the entire world's computing power, extrapolated and interpolated, with cost calculated at 12p per month per 1000 MIPS.
| Year | Computing power (MIPS) | Cost to rent in 2018 (£/mo) |
|------|------------------------|-----------------------------|