
First there is the gigabyte, then the terabyte. After that come the petabyte, exabyte, zettabyte and finally the yottabyte.





The Utah Data Centre, which will collect data on US citizens

Sunday, April 8, 2012
By Geraint Jones



DEEP in the Utah desert, an imposing behemoth is gradually taking shape. Ten thousand workers are charged with building the vast edifice. When completed next year, it will house the most powerful data storage capacity the world has known – covering 93,000 square metres, consuming £25 million of electricity every year and requiring the town of Bluffdale to expand its boundaries.

The amount of data it will be capable of storing is mind-boggling – the US government is talking in terms of yottabytes, which is a septillion bytes, or one followed by 24 zeroes – about the same as some estimates of the number of stars in the universe or 500 quintillion pages of text (see graphic).
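The arithmetic behind these figures is easy to check. A short Python sketch follows; the figure of roughly 2,000 characters per page is my own inference from the article's numbers, not something the article states:

```python
# Decimal byte units (SI scale, powers of 1000), as used in the article.
UNITS = ["byte", "kilobyte", "megabyte", "gigabyte", "terabyte",
         "petabyte", "exabyte", "zettabyte", "yottabyte"]

def unit_in_bytes(name: str) -> int:
    """Size of the named unit in bytes on the decimal (powers-of-1000) scale."""
    return 1000 ** UNITS.index(name)

yottabyte = unit_in_bytes("yottabyte")
assert yottabyte == 10 ** 24        # a septillion bytes: 1 followed by 24 zeroes

# "500 quintillion pages of text" implies about 2,000 bytes per page:
pages = 500 * 10 ** 18              # 500 quintillion pages
print(yottabyte // pages)           # 2000 bytes (characters) per page
```

So the two descriptions of a yottabyte in the paragraph above are consistent with one another, assuming a page of around 2,000 characters.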

The blandly named Utah Data Centre will cost in total £1.25 billion to construct and will be the nerve centre of a network of surveillance centres that are the manifestation of the Pentagon’s ambition to become capable of “total information awareness”.

Welcome to one of the major and growing challenges of our obsession with and reliance on computers, emails and other forms of electronic communication.

In 2011, more than two billion people were connected to the internet. By 2015, it is expected there will be 2.7 billion users. Global internet traffic is set to quadruple between 2010 and 2015. This year alone, mankind will create 1.2 trillion gigabytes of data, equivalent to 75 billion 16GB iPods.
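The iPod comparison checks out with simple division:

```python
# The article's equivalence: 1.2 trillion gigabytes of new data in a
# single year, expressed as a pile of 16 GB iPods.
data_gb = 1.2 * 10 ** 12      # 1.2 trillion gigabytes
ipod_gb = 16                  # capacity of one iPod, in gigabytes
ipods = data_gb / ipod_gb
print(f"{ipods:.0f}")         # 75000000000 -> 75 billion iPods
```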

Britain could follow suit with its snooping tactics

To meet this huge rise in demand, the number of data storage centres is increasing rapidly. Currently there are more than half a million of them around the world covering an area three times the size of Richmond Park in London – and thousands more will be required in the near future as computer traffic continues to multiply at an unprecedented rate.

Google is investing £500 million on three centres in Asia, which will add 40 per cent to its storage capacity around the world.

All of which will create a particular problem in the densely populated UK, where we don’t have the vastness of the Utah desert.

Where will the growing number of data centres that will become necessary be built? It is a problem highlighted by the row over the Government’s proposed “Snooping Bill”.

The British plans, though nothing like as ambitious, would monitor mobile phone and Skype calls, emails, texts and website visits of everyone in the UK. Downing Street said the move would cover data such as the times, dates, location and destination of calls and messages, but not what was actually said.

Though funded by the taxpayer, the plans would place a huge burden on internet service providers. They would have to store vast amounts of additional data, significantly increasing the storage capacity required.

It could even be, if the Government gets its way, that a UK Data Centre becomes the solution to the problem of collating all the information our growing dependency on the computer and electronic communication creates.

Yet that idea would bring with it other issues. Apart from the ethical ones of governments spying on their own people, there is the practical one of where to put it, as any proposal would have to negotiate more planning hurdles than a nuclear power station.

The “Snooping Bill” proposes to require ISPs to keep records of every UK email, phone call, text and website visit.

No such browsing data is currently kept by internet service providers but in future ISPs would be required to save data for two years.

In addition, they would have to offer GCHQ access to communications if the spy agency believed a crime was in progress, allowing them, for example, to trace the exact location of a mobile phone owned by a terror suspect.

When similar plans to monitor internet and email use were outlined by the previous Labour government, the estimated cost of running the system over 10 years was about £2 billion.

Trefor Davies, of the Internet Service Providers’ Association and chief technology officer at business internet provider Timico, warns that the task of fulfilling the Government’s requirement will be expensive, difficult and one the industry is totally unprepared for. He said: “We do not keep anything at the moment because it is considered private data. The amount of effort involved is not inconsiderable.”

He also warned that it was virtually impossible to stop people using the internet anonymously and that snooping on their activities could drive users underground.

One idea being tested in the US is that of floating data centres, housed inside ships. Old container vessels could be given a new lease of life and the costs of storage and cooling the equipment would be lower and more environmentally friendly. But there are security risks and the danger that the vessel might sink.

Other ideas centre on developing the “cloud” principle, a form of outsourcing data storage to specialist providers.

Under this system, experts argue, computing becomes a service rather than a product: shared resources, software and information are delivered to computers and other devices as a utility, like the electricity grid, over a network – most likely the internet.

By 2015, it is expected that 34 per cent of data centre traffic will be associated with cloud-based applications compared with 11 per cent in 2010.

Such ideas will help ease the pressure on storage space but experts doubt they will be enough to accommodate the exponential rise in internet and computer use especially once emerging markets in Asia and the Indian sub-continent fulfil their potential.

Eric Schmidt, Google’s former CEO, once estimated that all human knowledge created from the dawn of man to 2003 totalled five exabytes, or five billion gigabytes, of storage space – a tiny fraction of the capacity being prepared at Bluffdale. Despite having thousands of years of ingenuity at their fingertips and the supercomputers to programme it, no one seems close to solving the problem of what to do with all the resultant virtual clutter.
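Schmidt's figure puts that scale gap in perspective. The ratio below is my own calculation from the article's numbers:

```python
# Schmidt's estimate of all human knowledge to 2003, versus the
# yottabyte-scale capacity discussed for Bluffdale.
exabyte = 10 ** 18
yottabyte = 10 ** 24
human_knowledge = 5 * exabyte                      # five exabytes

assert human_knowledge == 5 * 10 ** 9 * 10 ** 9    # i.e. five billion gigabytes
print(yottabyte // human_knowledge)                # 200000: a yottabyte would hold it 200,000 times over
```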


