Yellowstone (supercomputer)

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.

Yellowstone is the inaugural computing resource at the NCAR-Wyoming Supercomputing Center (NWSC) in Cheyenne, Wyoming. It was installed, tested, and readied for production in the summer of 2012. Yellowstone is a highly capable petascale system designed for conducting breakthrough scientific research in the interdisciplinary field of Earth system science. Scientists use this computer and its associated resources to model and analyze complex processes in the atmosphere, oceans, ice caps, and throughout the Earth system, accelerating scientific research in climate change, severe weather, geomagnetic storms, carbon sequestration, aviation safety, wildfires, and many other topics. Funded by the National Science Foundation and the State and University of Wyoming, and operated by the National Center for Atmospheric Research, Yellowstone’s purpose is to improve the predictive power of Earth system science simulation to benefit decision-making and planning for society.

System description

Yellowstone is a 1.5-petaflops IBM iDataPlex cluster computer with 4,518 dual-socket compute nodes containing 9,036 2.6 GHz Intel Xeon E5-2670 8-core processors (72,288 cores in total) and an aggregate memory of 144.6 terabytes. The nodes interconnect in a full fat tree network via a Mellanox FDR InfiniBand switching fabric. System software includes the Red Hat Enterprise Linux for Scientific Computing operating system, the LSF batch subsystem and resource manager, and the IBM General Parallel File System (GPFS).
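
These headline figures follow directly from the per-node numbers: 4,518 nodes × 2 sockets gives 9,036 processors, and 9,036 processors × 8 cores gives 72,288 cores. The minimal Python sketch below reproduces those totals; the peak-performance estimate additionally assumes 8 double-precision floating-point operations per core per clock cycle (a property of this processor generation's AVX units that is not stated in the article).

    # Rough consistency check of the published Yellowstone figures.
    nodes = 4518                     # dual-socket IBM iDataPlex compute nodes
    sockets_per_node = 2
    cores_per_processor = 8
    clock_hz = 2.6e9                 # 2.6 GHz Intel Xeon E5-2670

    processors = nodes * sockets_per_node        # expected: 9,036
    cores = processors * cores_per_processor     # expected: 72,288

    # Assumption (not from the article): 8 double-precision FLOPs
    # per core per cycle for this processor generation.
    flops_per_core_per_cycle = 8
    peak_pflops = cores * clock_hz * flops_per_core_per_cycle / 1e15

    memory_per_node_gb = 144.6e3 / nodes         # aggregate 144.6 TB

    print(processors, cores, round(peak_pflops, 2), round(memory_per_node_gb))
    # 9036 72288 1.5 32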

Yellowstone is integrated with many other high-performance computing resources in the NWSC (see Figure 1).

Figure 1. Diagram of the Yellowstone supercomputer and the other resources in the NWSC facility in Cheyenne, Wyoming. It shows the foundational architecture for the supercomputing equipment deployed in the NWSC facility, which was built in 2011, and the integration and relative sizes of systems for computing, data analysis and visualization, online and archived data, data management, external interfaces, and networks. These high-performance computing resources are structured to share a common, high-speed, central file system that manages all data for all the computing and support systems necessary to complete scientific workflows. This design reduces costs by eliminating the need to move or maintain multiple copies of data.

The central feature of this supercomputing architecture is its shared file system that streamlines science workflows by providing computation, analysis, and visualization work spaces common to all resources. This common data storage pool, called the GLobally Accessible Data Environment (GLADE), initially provides 11 petabytes of online disk capacity shared by the supercomputer, two data analysis and visualization (DAV) cluster computers (Geyser and Caldera), data servers for both local and remote users, and a data archive with the capacity to store over 100 petabytes of research data. High-speed networks connect this Yellowstone environment to science gateways, data transfer services, remote visualization resources, XSEDE sites, and partner sites around the world.

This integration of computing resources, file systems, data storage, and broadband networks allows scientists to simulate future geophysical scenarios at high resolution, then analyze and visualize them on one computing complex. This improves scientific productivity by avoiding the delays associated with moving large quantities of data between separate systems. Further, this reduces the volume of data that needs to be transferred to researchers at their home institutions. The Yellowstone environment at NWSC will make more than 600 million processor-hours available each year to researchers in the Earth system sciences.
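
That figure is consistent with the machine's size: 72,288 cores running for the 8,760 hours in a year supply roughly 633 million core-hours. A minimal sketch of the arithmetic, assuming near-full, year-round allocation (an assumption not made explicit in the article):

    # Rough upper bound on the annual core-hours Yellowstone can supply.
    cores = 72288                      # total cores in the system
    hours_per_year = 365 * 24          # 8,760 hours
    core_hours = cores * hours_per_year
    print(f"{core_hours:,}")           # 633,242,880 -- above the quoted 600 million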


References

1. "Yellowstone", NCAR Computational and Information Systems Laboratory (CISL) website: Resources. Retrieved 2012-06-12.
2. "NCAR-Wyoming Supercomputing Center Fact Sheet", University Corporation for Atmospheric Research (UCAR) website. Retrieved 2012-06-12.
3. "NCAR Advances Weather Research Capabilities With IBM Supercomputing Technology", IBM News Release, 8 November 2011.
4. "NCAR Selects IBM for Key Components of New Supercomputing Center", NCAR/UCAR AtmosNews, 7 November 2011.
5. "Yellowstone, NWSC science impact", NCAR Computational and Information Systems Laboratory (CISL) website: Resources. Retrieved 2012-06-12.
6. "The NCAR-Wyoming Supercomputing Center Science Justification", Proposal to the National Science Foundation by the National Center for Atmospheric Research and the University Corporation for Atmospheric Research in partnership with the University and State of Wyoming, 4 September 2009.
7. "System overview, Yellowstone: High-performance computing resource", NCAR Computational and Information Systems Laboratory (CISL) website: Resources. Retrieved 2012-06-12.
8. "Yellowstone Software", NCAR Computational and Information Systems Laboratory (CISL) website: Resources. Retrieved 2012-06-12.
9. "Red Hat Enterprise Linux for Scientific Computing", Red Hat Products website. Retrieved 2012-06-12.
10. "Data-intensive computing architecture", FY2011 CISL Annual Report. Note: The diagram shown has been updated with new information that was not available for the October 2011 report cited here.
11. "New data service speeds the progress of research", FY2010 NCAR Annual Report.
12. "NCAR's Globally Accessible Data Environment", FY2011 CISL Annual Report. Note: This October 2011 report describes GLADE at NCAR's Mesa Lab Computing Facility in Boulder, Colorado. The design of GLADE at NWSC in Cheyenne, Wyoming is identical at this level of description.
13. "Science gateway services", FY2011 CISL Annual Report.
14. "NCAR to Install 1.6 Petaflop IBM Supercomputer", HPCwire, 7 November 2011.
15. "NCAR's next supercomputer: Yellowstone", News@Unidata, 22 November 2011.
