CERN, the world's largest particle physics laboratory, recently crossed a milestone of 200 petabytes of data permanently archived in its Data Centre. Over a billion particle collisions per second in the detectors of the 27-km-long Large Hadron Collider (LHC) generate data at a rate of one million GB per second. However, only the most "interesting" events (100-200 out of some 600 million) are archived. The filtered LHC data are then aggregated in the CERN Data Centre (DC), where initial data reconstruction is performed and where a copy is archived to long-term tape storage.
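To put that filtering step in perspective, here is a back-of-envelope sketch in Python using only the round figures quoted above. Per-event sizes and exact trigger rates vary by experiment, so the proportional scaling at the end is an assumption for illustration, not how the experiments actually size their output.

```python
# Back-of-envelope arithmetic for the event filtering described above.
# The numbers are only the round figures from the text; real event sizes
# and trigger rates vary by experiment.

collisions_per_second = 600e6      # collisions produced per second
kept_per_second = 200              # "interesting" events kept (upper figure)
raw_rate_gb_per_s = 1e6            # ~one million GB/s coming off the detectors

reduction_factor = collisions_per_second / kept_per_second
kept_fraction = kept_per_second / collisions_per_second

# If the data volume shrank strictly in proportion to the event count
# (a simplification), this would be the surviving data rate.
surviving_rate_gb_per_s = raw_rate_gb_per_s * kept_fraction

print(f"Reduction factor: ~{reduction_factor:,.0f}x")          # ~3,000,000x
print(f"Fraction of events kept: {kept_fraction:.1e}")          # ~3.3e-07
print(f"Naive surviving rate: ~{surviving_rate_gb_per_s:.2f} GB/s")
```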
Even after the drastic data reduction performed by the experiments, the CERN DC processes on average one petabyte of data per day. This is how the milestone of 200 petabytes of data permanently archived in its tape libraries was reached on 29 June. The four big LHC experiments have produced unprecedented volumes of data in the last two years, due in large part to the outstanding performance and availability of the LHC itself. Indeed, in 2016, expectations were initially for around 5 million seconds of data taking, while the final total was around 7.5 million seconds, a very welcome 50% increase. 2017 is following a similar trend. Furthermore, as the luminosity is higher than in 2016, many collisions overlap and the events are more complex, requiring increasingly sophisticated reconstruction and analysis.
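A quick sanity check of those figures, again using only the round numbers from the text; the "one petabyte per day" is an average, so the implied throughput below assumes the load were spread evenly over the day.

```python
# Quick check of the figures above, using only the round numbers quoted.

expected_seconds_2016 = 5.0e6      # data-taking seconds initially expected
delivered_seconds_2016 = 7.5e6     # data-taking seconds actually delivered

increase = delivered_seconds_2016 / expected_seconds_2016 - 1
print(f"Extra data taking in 2016: {increase:.0%}")        # 50%

# Average throughput implied by processing ~1 PB per day in the CERN DC,
# assuming an even spread over 86,400 seconds.
petabytes_per_day = 1.0
gb_per_second = petabytes_per_day * 1e6 / 86_400           # 1 PB = 1e6 GB
print(f"Implied average rate: ~{gb_per_second:.1f} GB/s")  # ~11.6 GB/s
```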
This has a strong impact on computing requirements. Consequently, records are being broken in many aspects of data acquisition, data rates and data volumes, with exceptional levels of use for computing and storage resources. To meet these challenges, the computing infrastructure as a whole, and notably the storage systems, went through major upgrades and consolidation during the two years of Long Shutdown 1. These upgrades enabled the data centre to cope with the 73 petabytes of data received in 2016 (49 of which were LHC data) and with the flow of data delivered so far in 2017.
These upgrades also allowed the CERN Advanced Storage system (CASTOR) to pass the challenging milestone of 200 petabytes of permanently archived data.
Source & Credit: http://ift.tt/2u7ZRlO