OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Scaling the Earth System Grid to 100Gbps Networks

Abstract

The SC11 demonstration, titled Scaling the Earth System Grid to 100Gbps Networks, showed how underlying network infrastructure can be used to move climate data over a 100Gbps network. Climate change research is one of the critical data-intensive sciences, and the amount of data it produces is continuously growing. Climate simulation data is geographically distributed around the world, and it needs to be accessed from many sources for fast and efficient analysis and inter-comparison of simulations. We used a 100Gbps link connecting the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory (LBNL) with Argonne National Laboratory (ANL) and Oak Ridge National Laboratory (ORNL). In the demo, the Coupled Model Intercomparison Project phase 3 (CMIP-3) dataset, produced for the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4), was staged from NERSC into the memory of computing nodes at ANL and ORNL over the 100Gbps network for analysis and visualization. In general, climate simulation data consists of a mix of relatively small and large files, with an irregular file size distribution in each dataset. In this demo, we addressed data management challenges arising from high-bandwidth networks, the usability of existing protocols and middleware tools, and how applications can adapt to and benefit from next-generation networks.
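
The irregular mix of many small files and a few large ones mentioned in the abstract is what makes it hard to keep a 100Gbps link saturated with conventional file-by-file transfers. As a rough illustration only (not the actual tool used in the demo), the Python sketch below aggregates files into fixed-size blocks before streaming them, one common way to decouple transfer throughput from the file size distribution; the names BLOCK_SIZE, iter_blocks, and send_dataset are hypothetical, and real middleware would add per-file framing, checksums, and parallel streams.

    import socket

    BLOCK_SIZE = 4 * 1024 * 1024  # hypothetical 4 MiB block size

    def iter_blocks(paths, block_size=BLOCK_SIZE):
        """Pack files of irregular sizes into fixed-size blocks so the sender
        always writes uniform chunks. This sketch drops file boundaries; a
        real tool would prepend per-file metadata so the receiver can
        reassemble the original files."""
        buf = bytearray()
        for path in paths:
            with open(path, "rb") as f:
                while chunk := f.read(block_size):
                    buf.extend(chunk)
                    while len(buf) >= block_size:
                        yield bytes(buf[:block_size])
                        del buf[:block_size]
        if buf:
            yield bytes(buf)  # trailing partial block

    def send_dataset(paths, host, port):
        """Stream all files as a sequence of fixed-size blocks over one TCP
        connection; a production tool would use several parallel streams."""
        with socket.create_connection((host, port)) as sock:
            for block in iter_blocks(paths):
                sock.sendall(block)

Because every write is the same size, throughput no longer depends on whether a dataset happens to contain thousands of tiny files or a handful of large ones.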

Authors:
Balman, Mehmet [1]; Sim, Alex [1]
  1. Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
Publication Date:
March 2012
Research Org.:
Ernest Orlando Lawrence Berkeley National Laboratory, Berkeley, CA (United States)
Sponsoring Org.:
USDOE Office of Science (SC)
OSTI Identifier:
1212108
Report Number(s):
LBNL-5794E
DOE Contract Number:
AC02-05CH11231
Resource Type:
Technical Report
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING; SC11 100Gbps demo

Citation Formats

Balman, Mehmet, and Sim, Alex. Scaling the Earth System Grid to 100Gbps Networks. United States: N. p., 2012. Web. doi:10.2172/1212108.
Balman, Mehmet, & Sim, Alex. Scaling the Earth System Grid to 100Gbps Networks. United States. doi:10.2172/1212108.
Balman, Mehmet, and Sim, Alex. 2012. "Scaling the Earth System Grid to 100Gbps Networks". United States. doi:10.2172/1212108. https://www.osti.gov/servlets/purl/1212108.
@techreport{osti_1212108,
title = {Scaling the Earth System Grid to 100Gbps Networks},
author = {Balman, Mehmet and Sim, Alex},
institution = {Lawrence Berkeley National Laboratory (LBNL), Berkeley, CA (United States)},
number = {LBNL-5794E},
doi = {10.2172/1212108},
url = {https://www.osti.gov/servlets/purl/1212108},
place = {United States},
year = {2012},
month = {3}
}

Similar Records:
  • This report, which summarizes work carried out by the ESG-CET during the period April 1, 2007 through September 30, 2007, includes discussion of overall progress, period goals, highlights, collaborations, and presentations. To learn more about our project, please visit the Earth System Grid website. In addition, this report will be forwarded to the DOE SciDAC project management, the Office of Biological and Environmental Research (OBER) project management, national and international stakeholders (e.g., the Community Climate System Model (CCSM), the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5), the Climate Science Computational End Station (CCES), etc.), and collaborators. The ESG-CET executive committee consists of David Bernholdt, ORNL; Ian Foster, ANL; Don Middleton, NCAR; and Dean Williams, LLNL. The ESG-CET team is a collective of researchers and scientists with diverse domain knowledge, whose home institutions include seven laboratories (ANL, LANL, LBNL, LLNL, NCAR, ORNL, PMEL) and one university (ISI/USC); all work in close collaboration with the project's stakeholders and domain researchers and scientists. During this semi-annual reporting period, the ESG-CET increased its efforts on completing requirement documents, framework design, and component prototyping. As we strove to complete and expand the overall ESG-CET architectural plans and use-case scenarios to fit our constituency's scope of use, we continued to provide production-level services to the community. These services continued for IPCC AR4, CCES, and CCSM, and were extended to include Cloud Feedback Model Intercomparison Project (CFMIP) data.
  • Drawing to a close after five years of funding from DOE's ASCR and BER program offices, the SciDAC-2 project called the Earth System Grid (ESG) Center for Enabling Technologies has successfully established a new capability for serving data from distributed centers. The system enables users to access, analyze, and visualize data using a globally federated collection of networks, computers, and software. The ESG software, now known as the Earth System Grid Federation (ESGF), has attracted a broad developer base and has been widely adopted, so that it is now being used to serve the most comprehensive multi-model climate data sets in the world. The system is used to support international climate model intercomparison activities as well as high-profile U.S. DOE, NOAA, NASA, and NSF projects. It currently provides more than 25,000 users access to more than half a petabyte of climate data (from models and from observations) and has enabled over 1,000 scientific publications.
  • The climate and weather data science community gathered December 3–5, 2013, at Lawrence Livermore National Laboratory, in Livermore, California, for the third annual Earth System Grid Federation (ESGF) and Ultra-scale Visualization Climate Data Analysis Tools (UV-CDAT) Face-to-Face (F2F) Meeting, which was hosted by the Department of Energy, the National Aeronautics and Space Administration, the National Oceanic and Atmospheric Administration, the European Infrastructure for the European Network of Earth System Modelling, and the Australian Department of Education. Both ESGF and UV-CDAT are global collaborations designed to develop a new generation of open-source software infrastructure that provides distributed access to, and analysis of, observed and simulated data from the climate and weather communities. The tools and infrastructure developed under these international multi-agency collaborations are critical to understanding extreme weather conditions and long-term climate change, while the F2F meetings help to build a stronger climate and weather data science community and stronger federated software infrastructure. The 2013 F2F meeting determined requirements for existing and impending national and international community projects; enhancements needed for data distribution, analysis, and visualization infrastructure; and standards and resources needed for better collaborations.
  • This report summarizes work carried out by the ESG-CET during the period April 1, 2009 through September 30, 2009. It includes discussion of highlights, overall progress, period goals, collaborations, papers, and presentations. To learn more about our project, and to find previous reports, please visit the Earth System Grid Center for Enabling Technologies (ESG-CET) website. This report will be forwarded to the DOE SciDAC program management, the Office of Biological and Environmental Research (OBER) program management, and national and international collaborators and stakeholders (e.g., the Community Climate System Model (CCSM), the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5), the Climate Science Computational End Station (CCES), the SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science, the North American Regional Climate Change Assessment Program (NARCCAP), and other wide-ranging climate model evaluation activities). During this semi-annual reporting period, the ESG-CET team continued its efforts to complete software components needed for the ESG Gateway and Data Node. These components include Data Versioning, Data Replication, DataMover-Lite (DML) and Bulk Data Mover (BDM), Metrics, Product Services, and Security, all joining together to form ESG-CET's first beta release. The launch of the beta release is scheduled for late October, with the installation of ESG Gateways at NCAR and LLNL/PCMDI. Using the developed ESG Data Publisher, the ESG II CMIP3 (IPCC AR4) data holdings (approximately 35 TB) will be among the first datasets to be published into the new ESG enterprise system; NCAR's ESG II data holdings (approximately 200 TB) will also be published into the new system. This period also saw the testing of the ESG Data Node at various collaboration sites, including the British Atmospheric Data Center (BADC), the Max Planck Institute for Meteorology, the University of Tokyo Center for Climate System Research, and the Australian National University; in total, 14 national and international sites installed an ESG Data Node for testing. During this period, we also continued to provide production-level services to the community, providing researchers worldwide with access to CMIP3 (IPCC AR4), CCES, CCSM, Parallel Climate Model (PCM), Parallel Ocean Program (POP), Cloud Feedback Model Intercomparison Project (CFMIP), and NARCCAP data.
  • This report summarizes work carried out by the ESG-CET during the period October 1, 2009 through March 31, 2010. It includes discussion of highlights, overall progress, period goals, collaborations, papers, and presentations. To learn more about our project, and to find previous reports, please visit the Earth System Grid Center for Enabling Technologies (ESG-CET) website. This report will be forwarded to the DOE SciDAC program management, the Office of Biological and Environmental Research (OBER) program management, and national and international collaborators and stakeholders (e.g., the Community Climate System Model (CCSM), the Intergovernmental Panel on Climate Change (IPCC) 5th Assessment Report (AR5), the Climate Science Computational End Station (CCES), the SciDAC II: A Scalable and Extensible Earth System Model for Climate Change Science, the North American Regional Climate Change Assessment Program (NARCCAP), and other wide-ranging climate model evaluation activities).