Title: Extending the Fermi-LAT data processing pipeline to the grid

The Data Handling Pipeline ("Pipeline") has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been used to fully automate the production of data quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses also place substantial loads on the pipeline and computing resources; unlike Level 1, these can run continuously for weeks or months at a time. The pipeline is also heavily used for production Monte Carlo tasks.
Author affiliations:
  1. Stockholm Univ., Stockholm (Sweden); The Oskar Klein Centre for Cosmoparticle Physics, Stockholm (Sweden)
  2. Univ. Montpellier 2, Montpellier (France)
  3. SLAC National Accelerator Lab., Menlo Park, CA (United States)
  4. Centre de Physique des Particules de Marseille, Marseille (France)
Journal ID: ISSN 1742-6588; arXiv:1212.4115
Resource Type: Journal Article
Journal Name: Journal of Physics. Conference Series; Journal Volume: 396; Journal Issue: 3
Publisher: IOP Publishing
Research Org: SLAC National Accelerator Lab., Menlo Park, CA (United States)
Sponsoring Org: USDOE Office of Science (SC)
Country of Publication: United States
Keywords: astrophysics; computing; Experiment-HEP; ASTRO; HEPEX