The Large Hadron Collider (LHC) at CERN near Geneva is the largest scientific instrument on the planet. Once it is running at full capacity, it will produce roughly 15 petabytes (15 million gigabytes) of data annually, which thousands of scientists around the world will access and analyze. The first phase of the computing grid built to handle that data actually went online on 29 September 2003.
The world's largest computing grid is set to tackle the biggest-ever data challenge from the world's most powerful particle accelerator, the LHC. Three weeks after the first particle beams were injected into the LHC, the Worldwide LHC Computing Grid combines the power of more than 140 computer centers from 33 countries to analyze and manage more than 15 million gigabytes of LHC data every year. The mission of the Worldwide LHC Computing Grid (LCG) project is to build and maintain a data storage and analysis infrastructure for the entire high-energy physics community that will use the LHC.
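To put the 15-petabyte figure in perspective, here is a rough back-of-the-envelope conversion to an average sustained rate (a sketch only; it assumes the volume is spread evenly over a calendar year, which real data taking is not):

# Rough scale check: 15 petabytes per year expressed as an average transfer rate.
PETABYTE = 10**15                      # bytes (decimal definition)
SECONDS_PER_YEAR = 365 * 24 * 3600

annual_volume = 15 * PETABYTE
average_rate = annual_volume / SECONDS_PER_YEAR    # bytes per second
print(f"about {average_rate / 10**6:.0f} MB/s averaged over the year")   # ~476 MB/s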
To refresh your knowledge of CERN and the LHC: CERN is the European Laboratory for Particle Physics, one of the world's most prestigious centres for fundamental research. The laboratory built the Large Hadron Collider, the most ambitious scientific undertaking the world has yet seen. The LHC collides tiny fragments of matter head on to unravel the fundamental laws of nature, and it will be used to answer some of the most fundamental questions of science by some 7,000 scientists from universities and laboratories all around the world.
"Particle physics projects such as the LHC have been a driving force for the development of worldwide computing grids," said Ed Seidel, director of the NSF Office of Cyber infrastructure. "The benefits from these grids are now being reaped in areas as diverse as mathematical modeling and drug discovery."
"Open Science Grid members have put an incredible amount of time and effort in developing a nationwide (US) computing system that is already at work supporting America's 1,200 LHC physicists and their colleagues from other sciences," said OSG executive director Ruth Pordes from DOE's Fermi National Accelerator Lab.
The data from the LHC experiments will be distributed around the globe according to a four-tiered model. A primary backup will be recorded on tape at CERN, the 'Tier-0' centre of the LCG. After initial processing, this data will be distributed over dedicated optical fiber networks to 11 major 'Tier-1' centres in Europe, North America and Asia: large computer centres with sufficient storage capacity and round-the-clock support for the grid, including those at DOE's Brookhaven National Laboratory in New York and Fermi National Accelerator Laboratory in Illinois. The Tier-1 centres make data available to more than 140 'Tier-2' centres around the world, including 12 in the US, each consisting of one or several collaborating computing facilities that can store sufficient data and provide adequate computing power for specific analysis tasks. Individual scientists will access these facilities through Tier-3 computing resources, which can consist of local clusters in a university department or even individual PCs, and which may be allocated to the LCG on a regular basis.
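As a rough illustration of this tiered fan-out, here is a toy sketch in Python (not the actual LCG middleware; only the tier structure and the named laboratories come from the article, everything else is a placeholder):

from dataclasses import dataclass, field

@dataclass
class Centre:
    name: str
    tier: int                                    # 0 = CERN, then 1, 2, ...
    children: list = field(default_factory=list)

def distribute(dataset: str, centre: Centre) -> None:
    # Record the dataset at this centre, then fan it out to the tier below.
    print(f"Tier-{centre.tier} {centre.name}: received {dataset}")
    for child in centre.children:
        distribute(dataset, child)

tier2 = [Centre("University of Nebraska-Lincoln", 2)]           # one of the 140+ Tier-2 sites
tier1 = [Centre("Brookhaven National Laboratory", 1, tier2),    # two of the 11 Tier-1 centres
         Centre("Fermi National Accelerator Laboratory", 1)]
tier0 = Centre("CERN", 0, tier1)                                # Tier-0: primary tape backup

distribute("example raw dataset", tier0)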
"Our ability to manage data at this scale is the product of several years of intense testing," said Ian Bird, leader of the Worldwide LHC Computing Grid project.
"Today's result demonstrates the excellent and successful collaboration we have enjoyed with countries all over the world. Without these international partnerships, such an achievement would be impossible," he said.
"When the LHC starts running at full speed, it will produce enough data to fill about six CDs per second," said Michael Ernst, director of Brookhaven National Laboratory's Tier-1 Computing Centre.
"As the first point of contact for LHC data in the US, the computing centres at Brookhaven and Fermilab are responsible for storing and distributing a great amount of this data for use by scientists around the country. We've spent years ramping up to this point, and now, we're excited to help uncover some of the numerous secrets nature is still hiding from us," informed Ernst.
Physicists in the US and around the world will sift through the LHC data torrent in search of tiny signals that will lead to discoveries about the nature of the physical universe. Through their distributed computing infrastructures, these physicists also help other scientific researchers increase their use of computing and storage for broader discovery.
"Grid computing allows university research groups at home and abroad to fully participate in the LHC project while fostering positive collaboration across different scientific departments on many campuses," said Ken Bloom from the University of Nebraska-Lincoln, manager for seven Tier-2 sites in the US.