What is the universe made of?
By Brent Gregston, August 2007
Ordinary matter only makes up some four to five percent - what about the rest?
The way to find out is to travel back in time, say scientists at CERN (European Organisation for Nuclear Research), by recreating what happened in the universe a fraction of a second after the Big Bang 13.7 billion years ago.
To do this, CERN has constructed a particle accelerator known as the Large Hadron Collider (LHC), which will probe deeper into matter than ever before. This 27-kilometer circular tunnel beneath the French-Swiss border near Geneva is the world's largest and most complex scientific instrument. It is also the planet's coldest refrigerator, equipped to operate at a temperature of minus 271°C - even colder than outer space.
By colliding two proton beams at 99.9 percent of the speed of light, the CERN researchers believe they can recreate conditions that have not existed since the Big Bang, giving us a much greater understanding of the universe.
Once the LHC is in operation - that's expected to happen later this year - CERN will account for one percent of all the information generated on the planet: 15 million gigabytes (15 petabytes) of data each year.
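To put that figure in perspective, 15 petabytes a year works out to a sustained data rate of roughly half a gigabyte every second, year-round:

```python
PETABYTE = 10**15  # bytes (SI definition)

data_per_year = 15 * PETABYTE       # CERN's projected annual output
seconds_per_year = 365 * 24 * 3600  # about 31.5 million seconds

rate = data_per_year / seconds_per_year
print(f"{rate / 10**6:.0f} MB/s sustained")  # roughly 476 MB/s
```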
To cope with this sea of data, CERN is pioneering Grid computing: a software environment based on open standards and protocols that makes it possible to share disparate, loosely coupled IT resources across organizations and geographies.
HP was the first commercial company to bring its technology to bear on CERN's LHC Computing Grid (LCG) - a Grid of epic proportions. HP Labs and HP's University Relations Program are collaborating with CERN openlab on software and hardware developments for the Grid.
Grid is based on some of the same ideas as the World Wide Web, invented at CERN in the early 1990s, but it goes much further, sharing not just information but computing power and storage as well. This means that scientists can log on to the Grid via their PCs and have calculations performed by machines all across the planet.
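The pattern the article describes - split a dataset into pieces, farm the pieces out, collect the results - can be sketched in miniature. Real Grid middleware dispatches jobs to machines at other institutions; this toy stand-in uses local processes instead, and the `analyse_chunk` function is purely illustrative:

```python
from concurrent.futures import ProcessPoolExecutor

def analyse_chunk(events):
    """Pretend physics analysis: count events above an energy threshold."""
    return sum(1 for e in events if e > 100.0)

if __name__ == "__main__":
    # Split the dataset into chunks and farm them out, the way Grid
    # middleware splits physics data across computing sites.
    chunks = [[90.0, 120.0, 150.0], [80.0, 110.0], [200.0, 50.0]]
    with ProcessPoolExecutor() as pool:
        counts = list(pool.map(analyse_chunk, chunks))
    print(sum(counts))  # prints 4: total events over threshold
```

The essential point is the same at any scale: each chunk is processed independently, so the work can land on whatever machine happens to be free.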
The promise of Grid computing makes it one of the most talked-about ideas in enterprise IT. HP has been working in this area for several years: developing, applying and adapting the technologies and open standards in Grid computing, virtualization and other related technologies to the enterprise arena.
HP researchers are creating innovative ways of deploying,
configuring and allocating Grid hardware and software. At CERN, researchers are experimenting with next-generation middleware for Grid computing, including two HP Labs technologies: Tycoon and SmartFrog.
Tycoon, an open source project launched in HP Labs Palo Alto (US), provides a market-based system for managing compute resources within the Grid. It provides a step toward Grid democratization because it creates a dynamic, seamless market for computing resources that automatically adapts to changes in demand.
At CERN, a Tycoon prototype is being used to demonstrate its benefits in a large-scale Grid environment, targeting high-energy physics users, an advanced research community with a high demand for resources.
For resource providers, Tycoon eliminates idle CPU time and avoids waste. In addition, providers can earn money to spend on other servers.
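The market idea at the core of Tycoon can be sketched in a few lines (the real system is far richer, and the function name here is illustrative): each user places a money bid on a host, and receives a CPU share proportional to their bid, so allocations adapt automatically as demand changes.

```python
def proportional_shares(bids):
    """Split a host's CPU among bidders in proportion to their bids -
    the proportional-share idea behind Tycoon's resource market.

    bids: dict mapping user name -> money bid on this host.
    Returns: dict mapping user name -> fraction of CPU received.
    """
    total = sum(bids.values())
    if total == 0:
        return {user: 0.0 for user in bids}
    return {user: bid / total for user, bid in bids.items()}

# When a new bidder arrives, every share adjusts without any
# manual intervention by the resource provider.
print(proportional_shares({"alice": 3.0, "bob": 1.0}))
# alice gets 75% of the CPU, bob gets 25%
```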
"This provides our researchers with extremely useful data on how Tycoon works in a real environment, as well as offering a window into how very large-scale computational resources are allocated in real time," says Bernardo Huberman, director of the Information Dynamics Lab, which developed Tycoon.
Created by a team at HP Labs Bristol (UK), SmartFrog - short for Smart Framework for Object Groups - is an open source tool that can be used to describe and automate how resources on a Grid are configured, run and managed.
Building on SmartFrog, CERN openlab researchers are developing SmartDomains, a management system for Xen virtual machines that allows the automated creation of groups of virtual machines running on multiple physical systems.
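SmartFrog descriptions are declarative: you state what the deployment should look like, and the framework makes it so. As a rough illustration in Python (not SmartFrog's own notation - the host names and field names below are hypothetical), a SmartDomains-style tool might turn one group description into per-host VM placements:

```python
# Illustrative only: SmartFrog has its own description language, and
# SmartDomains drives Xen directly. This sketch just shows the idea of
# expanding one declarative group description into concrete placements.
group = {
    "name": "batch-workers",
    "vm_count": 4,
    "memory_mb": 512,
    "hosts": ["node01", "node02"],  # physical machines (hypothetical names)
}

def place_vms(group):
    """Round-robin the requested VMs across the available physical hosts."""
    placements = []
    for i in range(group["vm_count"]):
        host = group["hosts"][i % len(group["hosts"])]
        placements.append((f'{group["name"]}-vm{i}', host))
    return placements

for vm, host in place_vms(group):
    print(f"create {vm} ({group['memory_mb']} MB) on {host}")
```

Changing `vm_count` or the host list in the description is all it takes to reshape the whole group, which is the appeal of the declarative approach for a Grid of this size.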
Besides offering CERN a tool for managing its large Grid infrastructure, the project offers HP Labs the opportunity to learn from the use of its technologies in a leading-edge computing context, says Peter Toft, who managed the SmartFrog team. In addition, CERN researchers contribute to SmartFrog.
HP's affiliation with the LHC project goes back to 2002, when the company joined CERN openlab, which is dedicated to developing data-intensive Grid technologies for the worldwide community of scientists working on the LHC.
But the company's involvement is not purely philanthropic. For one thing, HP's ProCurve Switches connect the thousands of PCs at the heart of the LHC Computing Grid.
In addition, HP believes that what it learns as part of the project will ultimately benefit customers.
"The LCG has the potential to redefine the technical limits of grid computing," said Arnaud Pierson, Program Manager for HP University Relations in EMEA. "We believe that participating in CERN's project is a great demonstration of how HP's collaborations can ultimately bring value to our customers."