Welcome to Unity!

The Unity cluster is a collaborative, multi-institutional high-performance computing cluster located at the Massachusetts Green High Performance Computing Center (MGHPCC). The cluster is under active development and primarily supports research activities. Partnering institutions currently include UMass Amherst, UMass Dartmouth, and the University of Rhode Island.


Monthly service node maintenance:

On the first Tuesday of every month, from 6 am to 7 am Eastern, the Unity login nodes, Open OnDemand server, and web portal may be temporarily inaccessible for maintenance and upgrades. This will not affect running batch jobs, but interactive jobs may be interrupted. 


New users:

View our Fall 2023 Onboarding Workshop recording here.

Information on how to get started with the Unity cluster is available here.

More information on the Unity Transition is available here.

Unity documentation is available here.


Unity User Community Slack:

To join the Unity Slack community, please sign up with your school email here. If you’re unable to register with your school email, please contact hpc@umass.edu with your preferred email address and we’ll send you a direct invite. 


Once you’ve signed up, you can access the community here.


Something not working for you? 

Send email to hpc@umass.edu with as much detail as you can provide to open a support ticket.


Need additional help? 

We offer office hours every week on Tuesdays from 2:30 to 4 PM on Zoom. Be sure to check the cluster notes page for up-to-date information on any canceled or delayed office hours.


Need expert help using the cluster and optimizing your code? 

We encourage you to schedule an appointment with one of our experienced HPC facilitators. Send an email to hpc@umass.edu and request an HPC facilitator consultation.


Cluster Notices


GPU Enforcement and Interactive Job Time Limits 11-07-2023

To ensure optimal and appropriate use of Unity resources, after November 7, 2023, jobs submitted to the GPU partitions (gpu, gpu-long, gpu-preempt) must request at least one GPU. This will help us ensure that the GPU partitions are used only for jobs that require GPUs. In addition, we are trialing an 8-hour limit on interactive job sessions. If you are used to long-running interactive sessions and would like help converting your workflow to batch (non-interactive) jobs, please email our facilitation team at hpc@umass.edu.
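For example, a minimal batch script that satisfies the new requirement might look like the following; the time limit, file name, and workload are illustrative placeholders:

$ cat gpu_job.sh
#!/bin/bash
#SBATCH -p gpu        # GPU partition; gpu-long and gpu-preempt have the same requirement
#SBATCH --gpus=1      # request at least one GPU (required after November 7, 2023)
#SBATCH -t 02:00:00   # illustrative 2-hour time limit

nvidia-smi            # placeholder workload: prints the GPU(s) assigned to the job

$ sbatch gpu_job.sh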

Unity Restored 10-15-2023

As of 8 pm on Sunday, October 15, 2023, Unity and all Unity services are back online and available to users! Our fantastic systems administrators were able to revive Unity in record time after we got the all-clear from MGHPCC this morning. Any jobs that were running or pending during the power outage have not resumed, so you will need to resubmit your jobs. To our knowledge, there was no data loss. If you encounter anything amiss on Unity, please report it to hpc@umass.edu. This morning, MGHPCC conducted an investigation into why the backup generators did not sustain the server room after the data center lost power. They identified and fixed the issue, so we expect a return to normal stability.

login1 Reboot 09-27-2023

Today, 9/27/23, we need to reboot login1 at 3 pm Eastern. Access to Unity should not be interrupted, but if you are connected to login1 at the reboot time you will be disconnected. login2 will not be affected. Any screen or tmux sessions associated with login1 will end upon reboot.

Unity Storage Interruption Resolved 08-22-2023

Services involving our VAST storage have been restored as of 8/22/23.

Unity Storage Interruption 08-21-2023

We are currently (8/21/23) experiencing problems with our high-performance VAST storage, the storage system that underlies the /home and /work directories. This issue is preventing core Unity functions, such as logging in and accessing the Open OnDemand portal. We are working with our storage vendor to resolve the problem and assessing the impact on in-progress jobs.

NESE Outage Resolved 07-25-2023

The /project and /nese storage issues are resolved.

NESE Outage 7/24 07-24-2023

As of 11 am on 7/24/23, we are experiencing difficulties connecting to the Northeast Storage Exchange (NESE), the storage system that houses most /project and /nese directories. We are working to resolve the issue.

Gypsum Restored 07-21-2023

As of 1:30 pm on 7/21, the Gypsum nodes and storage are operational again. Thank you for your patience.

Gypsum Outage 7/20 07-20-2023

At 8:10 pm on 2023-07-20, the Gypsum nodes and storage on Unity experienced a networking failure. The majority of nodes and all other storage remain available. We are working to resolve the issue and will send a notification when Gypsum is available again.

Login and Portal Maintenance 6/30 06-23-2023

On Friday, June 30, 2023, from 7 am to 9 am EDT, the Unity login nodes, Unity portal, and Open OnDemand portal will be inaccessible for maintenance. No running batch jobs will be affected, but interactive jobs may be disconnected. Open OnDemand jobs are considered "batch" jobs and can be resumed after maintenance, but you will experience some interruption.

Open OnDemand Server Upgrade 06-13-2023

To resolve some sluggishness and instability in our Open OnDemand server, https://ood.unity.rc.umass.edu will be offline for a server upgrade from 7 am to 9 am on 6/14/23.

Unity Outage 04-29-2023

At 1:30 pm EDT on April 29, 2023, there was an unplanned outage of the servers that host our Open OnDemand portal and login nodes. The outage lasted until 5:00 pm. We apologize for any inconvenience the outage caused. No running batch jobs appear to have been impacted. 

Lmod Colorization 04-03-2023

The output from `module avail` is now colorized!
You can configure this with your shell environment:

$ export LMOD_AVAIL_VERSION_COLOR=no    # disable colorization
$ export LMOD_AVAIL_VERSION_COLOR=red   # change the color

Valid colors are: black, red, green, yellow, blue, magenta, cyan, white, none.
To make this setting persistent, you can paste one of these commands into your ~/.bashrc or ~/.zshrc file, depending on your login shell.
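For example, assuming bash is your login shell, you could append the setting and apply it to your current session like this:

$ echo 'export LMOD_AVAIL_VERSION_COLOR=red' >> ~/.bashrc
$ source ~/.bashrc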

Home Directory Quotas Increased to 50 GB 09-15-2022

Home directory quotas have been increased from 10 GB to 50 GB.

Individual /work directories are being phased out and are no longer being created for new accounts.

Users should instead use the shared PI directory /work/pi_[piusername].
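For example, with a hypothetical PI username of jsmith, you could move working data into the shared directory and check your home directory usage against the new quota like this:

$ mv ~/large_dataset /work/pi_jsmith/   # jsmith is a placeholder for your PI's username
$ du -sh ~                              # compare total home usage against the 50 GB quota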

New Node Package Changes 08-12-2022

Any new nodes provisioned on Unity as of Friday, August 5 will have a different set of packages, mainly concerning development headers and shared libraries. Existing nodes are not yet affected, although the entire cluster will eventually reflect this change.

Our module system, Spack, offers a simple method for installing shared libraries and development headers as modules. Previously, we installed these as system APT packages; they will now be handled through Spack. Users may therefore need to update their scripts to load additional modules for packages that no longer exist as system packages.

For example, the package "libnetcdff-dev" was previously installed system-wide and did not require a module load. It is now provided by the module "netcdf-fortran", which does require a module load on newer nodes. Please let us know if you are looking for a library that is not available as a module.
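As a concrete sketch, a workflow that previously compiled directly against the system NetCDF Fortran library would now load the module first (we assume the netcdf-fortran module puts nf-config on your PATH; the source file name is a placeholder):

$ module load netcdf-fortran   # replaces the old libnetcdff-dev system package
$ gfortran $(nf-config --fflags) my_model.f90 $(nf-config --flibs) -o my_model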