HPC² researchers use a variety of server systems to meet their research requirements. General-purpose computational systems include numerous x86-based servers, each with up to 24 cores and 512 GB of RAM. Additionally, HPC² researchers use nearly 800 desktops and laptops running Windows or Linux.
Disk storage is provided by high-performance RAID-enabled disk systems, including Sun/Oracle and Data Direct Networks SAN storage systems. A total of nearly 12 Petabytes of disk space is available to HPC² researchers. In addition to the online disk space, a Sun StorageTek SL8500 tape library provides up to 9 Petabytes of near-line storage. A Lustre-based high-performance parallel filesystem with approximately 3 Terabytes of storage is also available on the core computational systems.
The HPC²'s high-performance network backbone consists of a mixture of 10 Gigabit Ethernet (10 Gb/s) and 40 Gigabit Ethernet (40 Gb/s) circuits built on Juniper and Extreme Networks routers and switches. This backbone supports full redundancy at the core and allows for aggregated connections to support high-bandwidth activities. All computational systems are connected via either a 10 Gb/s or 40 Gb/s connection; all primary desktop systems are connected via high-performance edge switches at 1 Gigabit Ethernet.
The HPC² research facilities on the main MSU campus, at the NASA Stennis Space Center (SSC), and at the USACE Engineer Research and Development Center (ERDC) are interconnected via the Mississippi Optical Network (MissiON), a regional optical network supporting research activities within the state. MissiON provides highly available, fault-tolerant communication channels via geographically diverse paths to the end research locations as well as to the Internet2 point-of-presence in Jackson, MS. The High Performance Computing Center (HPCC) and Center for Advanced Vehicular Systems (CAVS) buildings, located on the primary MSU campus, are connected to MissiON via dual 100 Gigabit/sec circuits; the Science and Technology Center building at SSC and the Institute for Software Engineering Research (ISER) facilities at ERDC are connected to MissiON via dual 10 Gigabit/sec circuits.