
Scientists gear up to tackle 15 million gigabytes of data

By IANS

London: The four huge detectors of the new Large Hadron Collider near Geneva, when fully operational, are expected to generate up to a staggering 15 million gigabytes (15 petabytes) of data every year.

Andreas Hirstius, manager of CERN Openlab and the CERN School of Computing, explained how computer scientists have met the challenge of handling this unprecedented volume of data.

When CERN staff first considered, in the mid-1990s, how they might deal with the large volume of data that the vast circular contraption would produce when its two beams of protons collided, a single gigabyte of disk space still cost a few hundred dollars and CERN’s total external connectivity was equivalent to just one of today’s broadband connections.

It quickly became clear that computing power at CERN, even taking Moore’s Law into account, would be significantly less than that required to analyse Large Hadron Collider (LHC) data.

The solution, which emerged during the 1990s, was to turn to “high-throughput computing”, where the focus is not on shifting data from A to B as quickly as possible, but rather on shifting as much data as possible between those two points.

High-throughput computing is ideal for particle physics because the data produced in the millions of proton-proton collisions are all independent of one another, and can therefore be handled independently, according to an Institute of Physics (IOP) release.

So, rather than using a massive all-in-one mainframe supercomputer to analyse the results, the data can be sent to separate computers, all connected via a network.
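To see why this works, consider that each collision event can be analysed with no reference to any other. The short Python sketch below illustrates the principle only; the event format and the analysis function are invented for this example, and the real Grid software is far more elaborate.

```python
# Illustrative sketch only -- not CERN software. It shows why independent
# collision events suit high-throughput computing: each one can be analysed
# in isolation, so a pool of workers needs no coordination between events.
import random
from multiprocessing import Pool

def analyse_event(event_id: int) -> float:
    """Stand-in analysis: derive one statistic from one simulated event."""
    rng = random.Random(event_id)          # independent, reproducible per event
    hits = [rng.gauss(0.0, 1.0) for _ in range(1000)]
    return sum(h * h for h in hits)        # a toy "energy" for this event

if __name__ == "__main__":
    event_ids = range(10_000)              # events are independent of one another
    with Pool() as pool:                   # one worker per CPU core by default
        results = pool.map(analyse_event, event_ids, chunksize=100)
    print(f"processed {len(results)} events, mean statistic "
          f"{sum(results) / len(results):.1f}")
```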

Enter the LHC Grid. The Grid, which was officially inaugurated last month, is a tiered structure centred on CERN (Tier-0), which is connected by superfast fibre links to 11 Tier-1 centres at places like the Rutherford Appleton Lab (RAL) in Britain and Fermilab in the US.

Every second, more than one CD’s worth of data (about 700 MB) can be sent down these fibres to each of the Tier-1 centres.
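A quick back-of-envelope calculation, sketched below in Python, puts those figures in perspective; it assumes, purely for the sake of the arithmetic, that all 11 links run at the full 700 MB per second simultaneously.

```python
# Back-of-envelope check of the article's figures (assumptions flagged inline).
MB = 10**6                       # decimal megabytes
GB = 10**9

rate_per_link = 700 * MB         # ~one CD per second to each Tier-1 centre
links = 11                       # number of Tier-1 centres
aggregate = rate_per_link * links
print(f"aggregate outbound rate: {aggregate * 8 / 10**9:.1f} Gbit/s")
# -> 61.6 Gbit/s, if all 11 links ran flat out at once (an assumption)

annual_data = 15_000_000 * GB    # 15 million gigabytes a year, per the article
seconds = annual_data / rate_per_link
print(f"one link alone would need {seconds / 86_400:.0f} days "
      f"to move a year's data")
# -> roughly 248 days, which is why the data is fanned out in parallel
```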

Tier-1 centres then feed down to another 250 regional Tier-2 centres, which are in turn accessed by individual researchers through university computer clusters, desktops and laptops (Tier-3).
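Schematically, the tiers form a simple fan-out tree. In the Python sketch below, every site name other than CERN, RAL and Fermilab is a placeholder; only the tier counts come from the article.

```python
# Schematic of the Grid's tiered fan-out. Site names other than CERN, RAL and
# Fermilab are placeholders; the Tier-1/Tier-2 counts come from the article.
grid = {
    "Tier-0": ["CERN"],                       # source of the raw detector data
    "Tier-1": ["RAL", "Fermilab"] + [f"tier1-site-{i}" for i in range(9)],
    "Tier-2": [f"regional-centre-{i}" for i in range(250)],
    "Tier-3": ["university clusters, desktops and laptops"],
}

for tier, sites in grid.items():
    print(f"{tier}: {len(sites)} site(s) listed here")
```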

Hirstius wrote: “The LHC challenge presented to CERN’s computer scientists was as big as the challenges to its engineers and physicists. The computer scientists managed to develop a computing infrastructure that can handle huge amounts of data, thereby fulfilling all of the physicists’ requirements and in some cases even going beyond them.”

These findings will appear in November’s Physics World.