HPC Cluster

Getting Started

Note: Only faculty members may initiate the creation of new accounts. Students and staff must access the cluster through course enrollment or as a member of an established lab.

CWRU HPC Clusters

HPC Access via Graphical Interface or Terminal

Graphical Access

To log into the cluster with a graphical interface, use OnDemand, X2Go, or Terminal+X11, as discussed on the Graphical Access page. Log in using your CaseID and SSO password (sometimes with Duo authentication as well).

Terminal Access

You can access the cluster using a variety of terminal tools available for your operating system, including from a mobile phone.
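As a minimal sketch, a terminal session typically starts with SSH; the hostname below is a placeholder assumption, so substitute the login node address provided for your cluster:

```shell
# Log in with SSH (hostname is illustrative, not a confirmed address)
ssh CaseID@hpc.case.edu

# Copy an input file to your cluster home directory with scp
scp input.dat CaseID@hpc.case.edu:~/
```

The same commands work from any OpenSSH-compatible client, including mobile terminal apps.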

Job Scheduling

Once you are logged into the cluster, you can acclimate yourself to the cluster environment, choose the application you will be using, and start running jobs. We recommend submitting batch jobs rather than running interactive jobs, which are better suited to debugging. More information is available on the Job Scheduling page.
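As a sketch of a batch submission, assuming the scheduler is Slurm (consistent with the -C feature syntax used elsewhere on this page), a minimal job script might look like this; the resource values and module name are illustrative:

```shell
#!/bin/bash
#SBATCH --job-name=my_job       # name shown in the queue
#SBATCH --time=01:00:00         # wall-clock limit (hh:mm:ss)
#SBATCH --ntasks=1              # number of tasks
#SBATCH --cpus-per-task=4       # CPU cores for the task
#SBATCH --mem=8G                # memory request

# Load the software the job needs (module name is an example)
module load Python

# Run the computation
python my_script.py
```

You would submit the script with `sbatch my_job.sh` and monitor it with `squeue -u CaseID`.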

Software

We have installed many software packages and applications on the cluster, so you can often run your computations with the installed software. One common way to add an application to your job is through the Module System. If you cannot find the software you need, you may have to install it yourself; see our guide to Installing Software.
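As a quick sketch of the Module System in practice, these are the standard environment-modules commands (the package name is illustrative):

```shell
module avail          # list software installed on the cluster
module load Python    # add an application to your environment
module list           # show currently loaded modules
module purge          # unload all modules for a clean environment
```

Loading a module adjusts your PATH and related environment variables so the chosen application is available to your shell and your jobs.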

Hardware

Our cluster consists of more than 250 compute nodes, including more than 60 GPU nodes, with a total of more than 7000 processors. Because the cluster is heterogeneous, it helps to understand which nodes are available (Resource View) and how to request each node type for your jobs. In particular, if your job runs on a GPU node, specifying a node feature (for example: -C gpu2v100) gives you the exact GPU card you need. Conversely, omit the node feature option if you want the job to run on any available GPU node.
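To illustrate the trade-off above, assuming a Slurm scheduler, the two request styles differ only in whether a node feature constraint is given (gpu2v100 is the example feature from this page; the --gres GPU-count syntax is an assumption):

```shell
# Pin the job to a specific GPU type via a node feature
sbatch -C gpu2v100 --gres=gpu:1 my_job.sh

# Omit the feature to let the job run on any available GPU node
sbatch --gres=gpu:1 my_job.sh
```

The constrained form can wait longer in the queue for a matching node, while the unconstrained form starts sooner but on whichever GPU type is free.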