Computing Overview

The HPC² provides substantial high performance computing resources for use by its member centers, including a 5.5 PetaFLOPS cluster with 72,000 Intel Xeon Gold 6148 (Skylake) processor cores, 345 terabytes of RAM, and a Mellanox HDR100 InfiniBand interconnect; a 593 TeraFLOPS cluster with 4,800 Intel Xeon E5-2680v2 (Ivy Bridge) processor cores and 28,800 Intel Xeon Phi cores, 72 terabytes of main RAM, 4 terabytes of Xeon Phi memory, and a Mellanox FDR InfiniBand interconnect; and a 138 TeraFLOPS cluster with 12,800 Intel Xeon E5-2680 (Sandy Bridge) processor cores, 26 terabytes of RAM, and a Mellanox FDR InfiniBand interconnect. Data storage capabilities include 20 petabytes of high performance RAID-enabled disk systems, including a large parallel file system and a near-line storage/archival system.
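As a rough illustration, the 5.5 PetaFLOPS figure for the Skylake cluster is consistent with the standard theoretical-peak calculation of cores × clock rate × floating-point operations per cycle. The sketch below is not an official HPC² calculation; it assumes the Xeon Gold 6148's published 2.4 GHz base clock and 32 double-precision FLOPs per core per cycle (two AVX-512 FMA units per core):

```python
# Hypothetical back-of-the-envelope peak estimate for the Skylake cluster.
cores = 72_000              # total Xeon Gold 6148 cores, from the text above
clock_hz = 2.4e9            # assumed: Xeon Gold 6148 base frequency (2.4 GHz)
flops_per_cycle = 32        # assumed: 2 AVX-512 FMA units x 8 doubles x 2 ops (mul+add)

peak_flops = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak_flops / 1e15:.2f} PetaFLOPS")
# -> Theoretical peak: 5.53 PetaFLOPS, matching the ~5.5 PetaFLOPS figure
```

Sustained performance on real workloads is typically well below this theoretical peak, which is why such figures are usually quoted as an upper bound.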

The HPC² networking infrastructure backbone consists of a 40-Gigabit Ethernet network interconnecting the organization's primary computing and storage systems, as well as an extensive number of high performance edge switches providing connectivity to the organization's more than 800 high-end desktops and laptops. This network infrastructure supports full redundancy at the core and allows for aggregated connections to support high-bandwidth activities. Each of the three HPC² research facilities obtains wide area (external) network connectivity to the commodity Internet and Internet2 through geographically diverse paths to the Mississippi Optical Network (MISSION), a regional optical network supporting research activities within the state. These two MISSION network connections provide high-availability, fault-tolerant communication channels to the Internet2 connector site in Jackson, Mississippi, which supports a potential capacity of more than 8 terabits per second. The HPCC and CAVS buildings are connected via dual 100 Gigabit/sec circuits; the STC facility is connected via dual 10 Gigabit/sec circuits.