Great Lakes: Next Generation HPC at U-M

Get to know your latest High-Performance Computing (HPC) resource—Great Lakes.
by Mark Champe, App Programmer/Analyst Intermediate

Great Lakes is a next-generation High-Performance Computing (HPC) platform for the U-M community that will provide advantages over our current system, Flux, primarily in the areas of storage and networking. Great Lakes is built around the Intel Skylake CPU architecture and includes standard, large-memory, visualization, and GPU-accelerated nodes. A full roll-out is anticipated for July 2019, with early-adopter testing beginning in June 2019. The Great Lakes timeline provides up-to-date milestone dates and statuses as we transition to the new system.

What will change for you?

New Job Script Format:

Great Lakes will use the resource manager and scheduler called Slurm, as opposed to Torque and Moab on Flux. This will be the most significant difference between the two clusters and will require some work on your part to transition from Flux to Great Lakes. To make this easier, there is a guide for migrating from Torque to Slurm and several HPC workshops that cover the new system and changes.
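To give a feel for the change, here is a minimal sketch of the same job expressed in the old Torque/PBS directive style and in the new Slurm style. The account and partition names shown are hypothetical placeholders; consult the migration guide for the exact options to use on Great Lakes.

```shell
#### Torque/Moab (Flux) -- old style ####
#PBS -N my_job
#PBS -A example_flux            # hypothetical Flux account
#PBS -l nodes=1:ppn=4,mem=8gb
#PBS -l walltime=01:00:00
#PBS -q flux

#### Slurm (Great Lakes) -- new style ####
#SBATCH --job-name=my_job
#SBATCH --account=example0      # hypothetical Great Lakes account
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=4
#SBATCH --mem=8g
#SBATCH --time=01:00:00
```

The command-line tools change as well: jobs are submitted with `sbatch` instead of `qsub`, and queried with `squeue` instead of `qstat`.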
Simplified accounting structure:

Unlike Flux where you need an account for each resource, on Great Lakes you can use the same account and simply request the resources you need—from GPUs to large memory.
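For example, the same account can be reused across resource types; only the partition and resource flags on the submission change. The account and partition names below are hypothetical placeholders.

```shell
# Standard compute (hypothetical account name "example0")
sbatch --account=example0 --partition=standard job.sh

# Large-memory node
sbatch --account=example0 --partition=largemem --mem=500g job.sh

# GPU node: request one GPU
sbatch --account=example0 --partition=gpu --gres=gpu:1 job.sh
```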
Updated Hardware, Networking and Performance:

  • 2 PB scratch storage providing approximately 80 GB/s performance (compared to 8 GB/s on Flux)

  • New networking with improved InfiniBand architecture and 100 Gb/s to each node

  • Approximately 13,000 Intel Skylake Gold cores providing AVX-512 capability, with over 1.5 TFLOPS of performance per node

  • Compute nodes with significantly faster I/O via SSD-accelerated storage

  • Large-memory nodes with 1.5 TB of memory per node

  • GPU nodes with NVIDIA Volta V100 GPUs (2 GPUs per node)

  • Visualization nodes with NVIDIA Tesla P40 GPUs

If you have any questions about Great Lakes or you’re wondering how you might leverage HPC in your own research or teaching, please let us know by reaching out to [email protected]!

Release Date:
05/22/2019

TECHNOLOGY SERVICES

G155 Angell Hall, 435 South State St, Ann Arbor, MI 48109-1003
734.615.0100
[email protected] 