SABER

Shared Analytics & Big-data Enterprise Resource

To meet the 21st century’s big-data challenges and to advance data science research, the Shared Analytics & Big-data Enterprise Resource (SABER) was added to the ACER HPC resource offerings in May 2017. In addition to traditional HPC workloads, this NSF-funded cluster provides access to a variety of big-data and analytics software packages.

SABER is UIC’s first fee-for-service resource: researchers can either buy service time or purchase dedicated nodes for exclusive use. With over one petabyte of research storage, SABER allows researchers to run computation and analysis on massive datasets from heterogeneous sources.

Cluster Configuration

With over 75 nodes and one petabyte of research storage, SABER allows researchers to run computation and analysis on massive data sets from heterogeneous sources and to calculate results at run time. A fast interconnect supports high-speed inter-node communication; because the scratch storage is intended only for run-time use, results should be transferred off it as soon as possible.

The SABER cluster consists of traditional high-performance computing nodes:

  • 76 x HPC Nodes
    • Intel Xeon E5-2650 @ 2.20GHz
    • 24 x Cores/Node with 30 MB cache
    • 128GB RAM
    • 1TB Local Storage
  • Operating system on each node: CentOS 7.3
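As a quick sanity check, the per-node figures above imply the following aggregate capacity (simple arithmetic from the listed specs, not an official figure):

```python
# Aggregate capacity implied by the per-node specs above.
NODES = 76
CORES_PER_NODE = 24
RAM_GB_PER_NODE = 128
LOCAL_STORAGE_TB_PER_NODE = 1

total_cores = NODES * CORES_PER_NODE                 # 1824 cores
total_ram_gb = NODES * RAM_GB_PER_NODE               # 9728 GB (~9.5 TB)
total_local_tb = NODES * LOCAL_STORAGE_TB_PER_NODE   # 76 TB local disk

print(total_cores, total_ram_gb, total_local_tb)  # 1824 9728 76
```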

SABER uses two file storage systems: NFS and Lustre.

  1. NFS storage: A high-capacity system with 524 TB (roughly 0.5 PB) of raw persistent storage. All user home directories and group shares reside here.
  2. Lustre storage: A very fast 760 TB scratch file system in a RAID 6 configuration, communicating with the nodes over QDR InfiniBand. It should be used only at run time; results should be transferred off scratch as soon as possible.
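The recommended workflow above amounts to a stage-out step: copy results from fast Lustre scratch back to persistent NFS storage once a job finishes. A minimal sketch of that pattern, using temporary directories to stand in for the real scratch and home paths (which vary by site and user):

```python
import shutil
import tempfile
from pathlib import Path


def stage_out(scratch_dir: Path, home_dir: Path) -> list:
    """Copy result files from fast scratch to persistent storage."""
    home_dir.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in scratch_dir.glob("*.out"):
        dest = home_dir / f.name
        shutil.copy2(f, dest)   # copy2 preserves timestamps/metadata
        copied.append(dest)
    return copied


# Demonstration: temporary directories stand in for the Lustre scratch
# and NFS home areas -- the actual mount points are site-specific.
with tempfile.TemporaryDirectory() as tmp:
    scratch = Path(tmp) / "scratch"
    home = Path(tmp) / "home"
    scratch.mkdir()
    (scratch / "results.out").write_text("42\n")
    staged = stage_out(scratch, home)
    print([p.name for p in staged])  # ['results.out']
```

In practice the same copy is often done with `rsync` or `cp` at the end of a batch script; the key point is that scratch is transient and results must land on the NFS side to persist.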

Interconnect: SABER’s high-speed internal network uses Mellanox FDR InfiniBand, which supports inter-node communication at 56 Gb/s. The fabric combines EDR spine switches and FDR leaf switches with a 2:1 blocking factor.
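A rough back-of-the-envelope on the figures above: an FDR link at 56 Gb/s carries about 7 GB/s, and a 2:1 blocking factor means the aggregate uplink bandwidth from each leaf switch toward the spine is half the aggregate downlink bandwidth to its nodes. The leaf port count below is purely illustrative, not a SABER specification:

```python
FDR_GBPS = 56                 # FDR InfiniBand link rate, Gb/s
fdr_gbytes = FDR_GBPS / 8     # ~7 GB/s per link

# With a 2:1 blocking factor, uplink capacity from a leaf switch is
# half its total downlink capacity. Example with a hypothetical leaf
# that has 24 node-facing FDR ports:
downlink_gbps = 24 * FDR_GBPS     # aggregate bandwidth to nodes
uplink_gbps = downlink_gbps / 2   # aggregate bandwidth toward spine

print(fdr_gbytes, downlink_gbps, uplink_gbps)  # 7.0 1344 672.0
```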