Paraview Guide

From Storrs HPC Wiki

Post-Processing Remotely with the ParaView Server on Storrs HPC

Because the pvserver process is not allowed to run on the login nodes, the ParaView pvserver command must be launched on one of the cluster's compute nodes.

The version of ParaView on the client (your desktop) must match exactly the module version that is loaded on the compute node on the cluster.

Here are the steps to launch the ParaView pvserver on a compute node of the Storrs HPC cluster and connect the ParaView client on your local PC to it.

1. Connect to HPC: ssh <netId>@login.storrs.hpc.uconn.edu

2. Launch an interactive job to reserve a compute node: fisbatch -N 1 -n 36 -p generalsky

Note the compute node on which your job lands (e.g., cn341).
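If the node name has already scrolled away, Slurm can report it. A hedged sketch (run on the login node, where squeue exists; the guard makes it a no-op elsewhere):

```shell
# List job id and assigned node(s) for your running jobs.
# %i = job id, %N = node list; $USER is your NetID on the cluster.
if command -v squeue >/dev/null; then
    squeue -u "$USER" -h -o "%i %N" --states=RUNNING
fi
```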

3. On the compute node, load the desired version of ParaView:

module load paraview/<pv-version> 

To view which ParaView modules are available, enter the following command: module avail paraview
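As a sketch (5.8.0 is an assumed example; pick a version that `module avail paraview` actually lists):

```shell
# 5.8.0 is an assumed example; the version must exactly match your client.
module load paraview/5.8.0
# Sanity check: the server should report the same version as your desktop.
pvserver --version
```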

4. Launch the ParaView server from the compute node:

  • If using v5.8.0 or later: pvserver --force-offscreen-rendering
  • If using a version earlier than v5.8.0: pvserver --use-offscreen-rendering

The application should report that the server is awaiting connections on a given port (the default port is 11111).
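The flag choice above can be scripted. The version parsing below is an illustrative sketch (not a ParaView feature), with 5.8.0 as an assumed version:

```shell
# Pick the offscreen-rendering flag from the ParaView version string;
# the flag was renamed in 5.8.0.
offscreen_flag() {
    major=${1%%.*}
    rest=${1#*.}
    minor=${rest%%.*}
    if [ "$major" -gt 5 ] || { [ "$major" -eq 5 ] && [ "$minor" -ge 8 ]; }; then
        echo "--force-offscreen-rendering"
    else
        echo "--use-offscreen-rendering"
    fi
}

PV_VERSION=5.8.0   # assumed example; use the module version you loaded
echo "pvserver $(offscreen_flag "$PV_VERSION")"   # drop the echo to launch
```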

5. Create an SSH tunnel from the client machine (your desktop) to the compute node on which the server is running. On Unix-based systems, issue (assuming default port 11111):

ssh -L 11111:<cn-number>:11111 <netId>@login.storrs.hpc.uconn.edu 

You will be prompted for your password and then the tunnel will be open.
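As a sketch, with the example node cn341 from step 2 and the default port (both assumptions; substitute your own node), the tunnel command can be assembled from variables. The `-N` option keeps the session from opening a remote shell, so it only carries the tunnel:

```shell
NODE=cn341   # assumed example; use the node your job landed on
PORT=11111   # pvserver's default port
# -L localport:remote-host:remote-port, relayed through the login node.
echo ssh -N -L "${PORT}:${NODE}:${PORT}" "<netId>@login.storrs.hpc.uconn.edu"
```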

6. Launch ParaView on the client machine.

Click the Connect button and choose Add Server.
Set the Host field to localhost and the Port field to 11111.
Click Connect and you will be able to use ParaView remotely.

7. The connection is now established, and the local ParaView client should be communicating with the pvserver instance running on the compute node.
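A quick way to confirm the tunnel is carrying traffic, before (or instead of) opening the GUI, is to probe the forwarded port from the client machine. This bash sketch uses the /dev/tcp pseudo-device (a bash feature, not a separate tool); localhost:11111 assumes the default tunnel from step 5:

```shell
#!/bin/bash
# Succeed if something accepts a TCP connection on host:port.
port_open() {
    (exec 3<>"/dev/tcp/$1/$2") 2>/dev/null
}

if port_open localhost 11111; then
    echo "tunnel is up: pvserver reachable on localhost:11111"
else
    echo "nothing listening on localhost:11111 (is the tunnel open?)"
fi
```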