U.S. Department of Energy
Office of Scientific and Technical Information

CMS data and workflow management system

Journal Article
CMS expects to manage many tens of petabytes of data distributed over several computing centers around the world. The CMS distributed computing and analysis model is designed to serve, process, and archive the large number of events that will be generated when the CMS detector starts taking data. The underlying concepts and the overall architecture of the CMS data and workflow management system will be presented. In addition, the experience of using the system for MC production, initial detector commissioning activities, and data analysis will be summarized.
Research Organization:
Fermi National Accelerator Laboratory (FNAL), Batavia, IL (United States)
Sponsoring Organization:
USDOE Office of Science (SC), High Energy Physics (HEP) (SC-25)
Contributing Organization:
CMS
DOE Contract Number:
AC02-07CH11359
OSTI ID:
1831871
Report Number(s):
FERMILAB-CONF-08-734-CMS; oai:inspirehep.net:1299490
Country of Publication:
United States
Language:
English

Similar Records

The Archive Solution for Distributed Workflow Management Agents of the CMS Experiment at LHC
Journal Article · 2018 · Computing and Software for Big Science · OSTI ID: 1437402

CMS distributed computing workflow experience
Conference · 2010 · J. Phys. Conf. Ser. · OSTI ID: 1433875

CMS Data Processing Workflows during an Extended Cosmic Ray Run
Journal Article · 2009 · OSTI ID: 969516
