The biggest computer on Earth

(This blog post first appeared on ComparetheCloud.net.)

Think the Internet is big? It's beyond big. There are now at least two billion PCs and a billion cellphones, plus 4-5 billion servers, consoles, tablets, smart TVs and embedded devices, all joined together by enough cable to reach the Moon and back 1,000 times over. There are more devices on the Internet than there are people on Earth.

All that hardware could fill every football stadium in Britain to the brim – and it uses more electricity than the whole of Japan. The Net has become a vast, multi-trillion-dollar planetary machine; the most expensive and complex thing that humans have ever made. And everything about it is growing.

Over 1.5m smartphones are registered daily – five times the human birth rate. CPUs and GPUs are finding their way into everything from fridges to teddy bears, and Moore's Law continues to hold: computing power keeps doubling every year or two, making today's laptops quicker than yesterday's supercomputers.

The sum total of the Internet's home computing power is almost beyond imagination: hundreds of exaflops. It's also over 1000x greater than all the supercomputers on Earth combined. And we're wasting almost all of it.
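A quick back-of-the-envelope check shows how plausible that figure is. The device counts and per-device speeds below are illustrative assumptions for 2013-era hardware, not measured data:

```python
# Back-of-envelope estimate of the Internet's aggregate computing power.
# Every figure here is an illustrative assumption, not a measurement.

devices = {
    # name: (device count, sustained gigaflops per device)
    "PCs":            (2e9, 50),
    "phones/tablets": (5e9, 5),
    "consoles/other": (1e9, 100),
}

total_gflops = sum(count * gflops for count, gflops in devices.values())
exaflops = total_gflops / 1e9  # 1 exaflop = 1e9 gigaflops

print(f"Aggregate: ~{exaflops:.0f} exaflops")
# The combined TOP500 supercomputers of 2013 managed roughly 0.25 exaflops:
print(f"vs. all supercomputers combined: ~{exaflops / 0.25:.0f}x")
```

With those assumptions the total comes out around 225 exaflops – roughly 900 times the supercomputers – so "hundreds of exaflops" and "over 1000x" are the right order of magnitude.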

Sure, all those billions of PCs, tablets and phones are 'being used'. A third of all PCs are never switched off. But even though we use these devices constantly, at any given time the average CPU load across the Internet is less than 2%. Unless it is encoding video or playing the latest 3D game, the typical PC is almost completely idle. People don't type at GHz speeds or view holiday photos at 60 fps.
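Anyone can verify this on their own machine. A minimal sketch using the third-party psutil library (an assumption: it is not part of Python's standard library and must be installed first):

```python
# Sample average CPU load over a minute to see how idle a typical PC is.
# Requires the third-party psutil package: pip install psutil
import psutil

# One system-wide reading per second for 60 seconds.
samples = [psutil.cpu_percent(interval=1) for _ in range(60)]
print(f"Average CPU load: {sum(samples) / len(samples):.1f}%")
```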

As processors keep getting faster and more numerous, the ratio of idle to used computing continues to increase. Almost everyone has more than they need – almost, that is, because some people can never have enough: physicists, biologists, engineers, climatologists, astronomers, chemists, geologists – pretty much anyone doing fundamental research.

Science and industry spend billions on ever-faster supercomputers for this reason: they have become indispensable to our modern way of life. Cars, planes, medicines, oil wells, solar cells – even the shape of Pringles crisps – were all designed by supercomputer. They are the most useful and productive research tools ever made.

But they don't last. The IBM Roadrunner, the world's fastest computer from 2008 to 2009, has just been decommissioned because it is already obsolete. It cost over $100m, as will its successor. Its owner, Los Alamos National Laboratory, can use the same floor space and energy far more efficiently with new hardware. Like all supercomputers, though, Roadrunner's limited shelf life was unavoidable. Computers do not age gracefully.

Contrast this with the billions of idle CPUs on the Internet, which are continually replaced by their owners. Broken devices are repaired, old ones are upgraded and more are constantly added. The Internet is unique among machines: effectively self-healing, self-improving and permanently switched on. Parts of it may turn on and off every second but, considered as a single system, the Net has 100% uptime – and it always will.

Using the Internet as a computing platform is nothing new. UC Berkeley's BOINC software has enabled dozens of science projects to harness over $2Bn-worth of donated CPU time from over 6m idle PCs over the last decade. The concept of volunteer computing is technically proven; the only issue is persuading device owners to allow it. Considering that the Internet is wasting over $500m in unused computing per day, it is certainly an endeavour worth pursuing.
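That $500m/day figure is easy to sanity-check against commodity cloud pricing. The core count, idle fraction and hourly rate below are illustrative assumptions, chosen conservatively:

```python
# Rough value of the Internet's idle CPU time at commodity rates.
# All three inputs are illustrative assumptions, not measurements.
idle_cores       = 3e9     # billions of devices x a few cores each
idle_fraction    = 0.98    # ~2% average load, per the figure above
price_per_core_h = 0.008   # $/core-hour, well under 2013 cloud list prices

wasted_per_day = idle_cores * idle_fraction * 24 * price_per_core_h
print(f"Idle compute value: ~${wasted_per_day / 1e6:.0f}m per day")
```

Even with these deliberately modest numbers, the answer lands around $560m per day.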

It is true that many HPC and cloud applications are not suitable for heterogeneous WAN computing. Low latency is out of the question (even with the new hollow-core optical fibre that carries light at 99.7% of c) and highly secret data is unlikely ever to leave its owner's building without military-grade homomorphic encryption. But there are millions of tasks that are perfectly suited: Monte Carlo simulations, parameter sweeps, climate modelling, rendering – anything that doesn't need InfiniBand, shouldn't be queuing for HPC time and will happily use public resources to gain 10x-100x more computing for the same budget.
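A toy illustration of why such tasks distribute so well: each work unit below is self-contained, needs no communication with its peers and returns only a few bytes, so latency and bandwidth barely matter. A local process pool stands in for the millions of volunteer devices:

```python
# Embarrassingly parallel Monte Carlo: estimate pi from independent work units.
# Each unit could run on a different volunteer device; here a local process
# pool stands in for the grid.
import random
from multiprocessing import Pool

def work_unit(args):
    """One self-contained task: count random points inside the unit circle."""
    seed, n = args
    rng = random.Random(seed)  # per-unit seed keeps results reproducible
    return sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0 for _ in range(n))

if __name__ == "__main__":
    units, n = 100, 100_000  # 100 work units of 100k samples each
    with Pool() as pool:
        hits = pool.map(work_unit, [(seed, n) for seed in range(units)])
    print(f"pi ~= {4 * sum(hits) / (units * n):.4f}")
```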

Using the Internet to compute also permits a whole new class of tasks which might prove to be the most interesting – and the most valuable – of all: those which can only be served by a global grid of millions of CPUs working together. 

For some applications, $100m supercomputers will always be just too small. 

Mark McA
17 Aug 2013 02:36 AM

Hi Anil,

Sorry for such a long delay, didn't see your comment. It's a good question. We are in discussions with Dell right now, as it happens. Will ask!

Cheers,

Mark 

Anil Singh
23 Apr 2013 02:43 PM

And what about the unsold PCs all over the world from giants like HP, Dell, Sony and others? What do they do with the PCs and parts they can't sell? Do they refurbish them themselves?