Usage policy


To be fair to all users of the cluster, please be aware of these resource limits and usage expectations.

HPC Storage (short term)

The Storrs HPC cluster has a number of local high-performance storage options available for use during job execution and for short-term storage of job results. None of the cluster storage options listed below is permanent, and they should not be used for long-term archival of data. See the next section for permanent data storage options that offer greater resiliency.

Name       | Path              | Size         | Performance | Persistence | Backed up? | Purpose
Scratch    | /scratch/scratch2 | 438GB shared | Fastest     | No, 2 weeks | No         | Fast parallel storage for use during computation
Node-local | /work             | 100GB        | Fast        | No, 5 days  | No         | Fast storage local to each compute node, globally accessible from /misc/cnXX
Home       | ~                 | 2GB          | Slow        | Yes         | Yes        | Personal storage, available on every node
Group      | /shared           | By request   | Slow        | Yes         | Yes        | Short-term group storage for collaborative work
  • Deletion of directories inside /scratch/scratch2 is based on modification time. You will receive three warning emails before deletion.
  • If you run ls on /home, /shared, or /misc/cnXX, those directories may not appear. They are mounted on demand by autofs and only become visible once you access a file beneath them or cd into them.
  • You can recover files yourself from the backed-up directories using snapshots for up to 2 weeks. Beyond 2 weeks, contact us and we may be able to help.
  • You can check your home directory quota yourself (see the example below).
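
For instance, a quick check from a login node might look like this (a sketch assuming the standard Linux quota tool is available; cn01 is a placeholder node name):

$ quota -s        # report home directory usage and limits in human-readable units
$ cd /misc/cn01   # autofs mounts the node-local /work storage on first access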

Permanent Data Storage (long term)

The university offers multiple options for long-term permanent data storage. Once data is no longer needed for computation, transfer it to one of these locations. Transfers to permanent storage should be run from the login node (login.storrs.hpc.uconn.edu); a sketch follows the table below.

Name                            | Path                            | Size         | Performance | Resiliency                                                                               | Purpose
UITS Research Storage           | Use smbclient to transfer files | By request   | Moderate    | Data is replicated between two datacenters on the Storrs campus                         | Best for long-term storage requiring good performance, such as data accessed frequently for post-analysis
Archival cloud storage          | /archive                        | 1.5PB shared | Low         | Data is distributed across three datacenters between the Storrs and Farmington campuses | Best for permanent archival of data without frequent access
Departmental/individual storage | Use smbclient to transfer files | -            | -           | -                                                                                        | Some departments and individual researchers have their own local network storage, accessible with smbclient
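
A hedged sketch of a transfer session from the login node (the SMB server and share names are placeholders, myproject is a hypothetical directory, and /archive is assumed to be mounted on the login node as the table suggests):

$ cp -r results/ /archive/myproject/            # copy to the archival storage path
$ smbclient //smb.example.uconn.edu/research -U netid
smb: \> put results.tar.gz                      # upload an archive to the SMB share
smb: \> quit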

Scheduled Jobs

Jobs submitted through the SLURM scheduler are subject to the following per-QoS limits (a submission sketch follows the table):

Job property     | Standard QoS limit | Longrun QoS limit | Haswell384 QoS limit
Run time (hours) | 36                 | 72                | 18
Cores / CPUs     | 48                 | -                 | 384
Jobs             | 8                  | -                 | -
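
A minimal submission sketch against these limits (the lowercase QoS name standard is an assumption, and myjob.sh and my_program are placeholders):

$ cat myjob.sh
#!/bin/bash
#SBATCH --qos=standard       # QoS from the table above
#SBATCH --time=36:00:00      # stay within the 36-hour Standard run time limit
#SBATCH --ntasks=48          # stay within the 48-core Standard limit
srun ./my_program            # placeholder for your actual program
$ sbatch myjob.sh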

Unscheduled programs

Run time (minutes) | CPU limit | Memory limit
20                 | 5%        | 5%
  • We strongly discourage running programs on the login nodes outside of the SLURM scheduler.
  • Any program running for longer than 20 minutes that is not in the approved list below will be throttled to 5% CPU usage. If you need longer interactive use, request a compute node through SLURM instead (see the example after this list).
  • Programs allowed on the login node are:
    • bzip
    • cp
    • du
    • emacs
    • fort
    • gcc
    • gfortran
    • gunzip
    • gzip
    • icc
    • mv
    • sftp
    • smbclient
    • ssh
    • tar
    • vim
    • wget
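
If your task needs more than 20 minutes of CPU, a sketch of requesting an interactive shell on a compute node through SLURM (the time and task values are illustrative):

$ srun --ntasks=1 --time=01:00:00 --pty bash   # one-hour interactive shell on a compute node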

Shared Read-Only Datasets

Users who need shared read-only datasets can email the administrators (hpc@uconn.edu) to request them. For example, bioinformatics researchers often need reference datasets for different organisms. These reference datasets are usually very large, so the only place users can keep them is /scratch, and it is inconvenient to touch the files every two weeks to prevent deletion. If you have such a dataset, we can store it for you; contact us to confirm that it meets the requirements.

Shared datasets live under /scratch/scratch2/shareddata/. Data under this directory is stored permanently. Currently, the genome directory holds four reference datasets: hg19, hg38, mm9, and mm10.

To shorten the path, you can create a symbolic link to a dataset in your home directory. For example:

$ cd
$ ln -s /scratch/scratch2/shareddata/genome ./genome
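
The dataset is then reachable through the shorter path, for example (assuming the hg38 directory listed above):

$ ls ~/genome/hg38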