Cyberinfrastructure & Advanced Research Computing (CIARC) Team Services 

We are a team of research computing experts here to help you become a more effective and efficient researcher.  Maybe you inherited your job script and never looked back, or you play it safe when requesting memory.  We can review your job script with you and look for opportunities to request resources more efficiently; a sketch of a typical script is shown below.  If you need to submit a large batch of jobs and are unsure how best to orchestrate the workflow, or you want to get started with parallel computing in your research, we can consult with you and your research team to work out your needs.  Perhaps you are applying for a grant and are considering adding computational simulations; we can help with your proof of concept and provide guidance for your budget.
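
As a quick illustration, here is a minimal sketch of the kind of Slurm batch script we review with researchers. The job name, module, and executable are placeholders, and the right resource values depend entirely on your workload:

#!/bin/bash
#SBATCH --job-name=example_job
#SBATCH --nodes=1
#SBATCH --ntasks=4                # number of cores/tasks the job actually needs
#SBATCH --mem=8G                  # request only the memory the job will use
#SBATCH --time=02:00:00           # a realistic wall-time limit helps the scheduler
#SBATCH --output=slurm-%j.out     # Slurm output file named by job ID

module load example/1.0           # placeholder module
srun ./my_program                 # placeholder executable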

Check out our newly improved TDX portal, where you can now schedule a meeting with a CIARC team member for live support.

Checking HPC Usage 

If you would like to check your usage for a given month, we have provided some convenient commands on Wendian. 

To check usage as a user, use the command getUtilizationByUser: 

janedoe@wendian002:[~]: getUtilizationByUser
janedoe -- Cluster/Account/User Utilization 2023-04-01T00:00:00 - 2023-04-12T11:59:59 (993600 secs) 
"Account","User","Amount","Used" 
"hpcgroup","janedoe - Jane Doe",$1.23,0 
 

To check usage as a PI for all your users, use the command getUtilizationByPI: 

pi@wendian002:[~]: getUtilizationByPI 
pi -- Cluster/Account/User Utilization 2023-04-01T00:00:00 - 2023-04-12T11:59:59 (993600 secs) 
"Account"|"User"|"Amount" 
"hpcgroup","janedoe - Jane Doe",$1.23,0 
"hpcgroup","johnsmith - John Smith",$1000.00,0 
 

Checking Job Efficiency 

If you are interested in checking how efficient your job was after it has finished running, we have a tool installed called reportseff that lets you quickly check the percent utilization of the CPU and memory you requested. You can check jobs by a given job ID, or run the tool inside any job directory that contains Slurm output files.

Please refer to the GitHub page for more information: https://github.com/troycomi/reportseff 
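
For example, basic usage looks something like this; the job ID and directory name below are just placeholders:

janedoe@wendian002:[~]: reportseff 1234567
janedoe@wendian002:[~/my_job_dir]: reportseff

The first command summarizes a single job by its job ID; the second, run with no arguments inside a job directory, summarizes every Slurm output file it finds there.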

For more details, visit our rates website, and be sure to check out our blog for the most up-to-date information.