The testing of CERN's Large Hadron Collider has been widely reported over the last few days and weeks in the media. You can find more details here.
Wednesday, 10 September 2008
In simple terms, the project aims to re-create the conditions of the early universe just after the Big Bang, in an attempt to detect elusive subatomic particles that should help to confirm the Standard Model.
There has been a lot of speculation that the testing could lead to the creation of a black hole and the implosion of the earth and the universe. In fact, my son came home from school yesterday and told me the rumour in the playground was that Sweden was intending to launch two nuclear missiles at each other, leading to the destruction of the earth!! Obviously, I had to bring his astrophysics knowledge up to scratch. I guess I shouldn't be too harsh on him - he's only 10.
Anyway, the interesting storage angle is how much data the project will produce each year. According to this rather helpful PDF, the project will produce data at a rate of 700MB/s, or 15 petabytes (PB) per year (presumably the collider will not run 24 hours a day). That's a lot of data, especially when you consider the project will take 2-3 years to produce enough data to detect the elusive Higgs boson!
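A quick sanity check on those two figures: running 700MB/s flat out for a whole year gives rather more than 15PB, which suggests the collider only takes data for part of the year. A back-of-the-envelope sketch (assuming decimal units, i.e. 1MB = 10^6 bytes and 1PB = 10^15 bytes):

```python
# Sanity-check the quoted figures: 700 MB/s sustained vs 15 PB/year.
# Decimal units assumed: 1 MB = 10**6 bytes, 1 PB = 10**15 bytes.
RATE_BYTES_PER_SEC = 700 * 10**6
SECONDS_PER_YEAR = 365 * 24 * 3600

# Data volume if the collider ran flat out all year.
continuous_pb = RATE_BYTES_PER_SEC * SECONDS_PER_YEAR / 10**15

# Fraction of the year the beam would need to produce data
# to arrive at the quoted 15 PB.
duty_cycle = 15 / continuous_pb

print(f"Continuous running: {continuous_pb:.1f} PB/year")
print(f"Implied duty cycle: {duty_cycle:.0%}")
```

So continuous running would generate around 22PB a year, and the 15PB figure implies the machine is producing data roughly two-thirds of the time.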
The collected data will be analysed by the LHC Computing Grid (see here for details). Managing this data, which will be distributed across four tiers, will be an incredible job, mainly because of the complexity of keeping concurrent access to the data and all the analysis results consistent.
Let's hope it works!!