From Storrs HPC Wiki
Revision as of 02:12, 18 August 2015

Running X11 GUI Apps

It is possible to run X11 GUI apps on the Hornet cluster. In general this is not recommended, and submitting batch jobs is strongly preferable. However, X11 is sometimes necessary, and there are several ways to use it.
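For reference, the batch route recommended above looks like this. This is a minimal sketch, not a site-provided script; the partition name is taken from the sinfo output shown later on this page, and the final line is a placeholder for your own program:

```shell
#!/bin/bash
#SBATCH --partition=SandyBridge   # choose an idle partition from sinfo
#SBATCH --cpus-per-task=4         # cores, like fisbatch -c4
#SBATCH --time=01:00:00           # wall-clock limit
# Replace the line below with your actual (non-GUI) command
hostname
```

Submit it with "sbatch job.sh"; standard output is written to slurm-&lt;jobid&gt;.out in the submission directory.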

Connecting Using NX

The head node runs an NX server, so it is possible to connect with an NX client. NX allows you to disconnect and reconnect without killing your apps. Note that this is beta software.

You can install an NX client (see installation instructions below) and then connect to hornet-login1.engr.uconn.edu. In your session definition, be sure to specify a GNOME session.

NX client for Linux

The recommended NX client for Linux is qtnx. On Debian or Ubuntu, install it with:

sudo apt-get install qtnx

On Fedora:

yum install qtnx

Connecting Using NX from OSX

The recommended NX client for OSX is OpenNX: http://opennx.net/download.html

Connecting Using NX from Windows

The recommended NX client for Windows is OpenNX: http://opennx.net/download.html Please download version 0.16.0.725. Configure OpenNX as shown below, and make sure the settings highlighted in red match the screenshot.

(Screenshot: OpenNXConfig.png)

Start an X-enabled interactive SLURM job

First of all, we DO NOT RECOMMEND running interactive jobs: GUI responsiveness over the network will be poor.

If you need to run a job with an interactive GUI, submit it to the SLURM scheduler through an X-enabled interactive shell:

# Make sure you are on cn65; if not, ssh there with X forwarding
cn01$ ssh -X cn65
# list the partitions to see which one is free
cn65$ sinfo
PARTITION   AVAIL  TIMELIMIT  NODES  STATE NODELIST
IvyBridge*     up   infinite     12  alloc cn[105-116]
IvyBridge*     up   infinite     17   idle cn[117-133]
SandyBridge    up   infinite      6  alloc cn[72-73,81-84]
SandyBridge    up   infinite     28   idle cn[66-71,74-75,85-104]
debug       inact   infinite      2   idle cn[103-104]
Westmere       up   infinite     10   resv cn[52-61]
Westmere       up   infinite      5   idle cn[36,42,50-51,62]
# You can use "-p" to select a partition
# You can use "-c" to request more or fewer cores for your job
cn65$ fisbatch -pSandyBridge -c4
FISBATCH -- the maximum time for the interactive screen is limited to 6 hours. You can add QoS to overwrite it.
FISBATCH -- waiting for JOBID 11522 to start on cluster=cluster and partition=SandyBridge
FISBATCH -- Connecting to head node (cn74)
cn74$

You can then do your work on the allocated node (cn74 in this example). When you finish, please do not forget to release the session so that other users can use the resources. To do so:

cn74$ exit
 
[screen is terminating]
Connection to cn74 closed.
FISBATCH -- exiting job

If 6 hours is not enough, you can add a QoS to the command:

$ fisbatch -pSandyBridge -c4 --qos=standard  # or --qos=longrun

"Connection rejected because of wrong authentication" error and solution

Make sure that ~/.Xauthority is owned by you

Run the following command to check ownership:

ls -l ~/.Xauthority

If the owner is wrong, run chown and chmod to fix the ownership and permissions:

chown <username>:<username> ~/.Xauthority
chmod 0600 ~/.Xauthority

Replace <username>:<username> with your actual username.
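To confirm the fix took effect, you can print the permission bits and owner directly. This is a sketch using GNU stat, as found on the Linux cluster nodes (BSD/macOS stat uses different flags):

```shell
# Expect output like "600 <your-username>"
if [ -e ~/.Xauthority ]; then
    stat -c '%a %U' ~/.Xauthority
fi
```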

Make sure X11 client forwarding is enabled (Linux/OS X)

Make sure your local /etc/ssh/ssh_config contains the following lines:

Host *
ForwardAgent yes
ForwardX11 yes
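Enabling forwarding for every host ("Host *") is broader than strictly necessary. If you prefer, a narrower entry scoped to the cluster also works; this is a sketch, so adjust the hostname if yours differs:

```
Host hornet-login1.engr.uconn.edu
    ForwardX11 yes
```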

Finally, log in to the remote server from your Mac OS X or Linux desktop with X11 forwarding enabled:

ssh -X <username>@hornet-login1.engr.uconn.edu
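Once logged in, you can check whether forwarding is actually active: for a forwarded connection, sshd sets the DISPLAY variable on the remote host (to something like localhost:10.0). A quick check:

```shell
# Run on the remote host after "ssh -X";
# an empty DISPLAY means X11 forwarding is not active
if [ -n "$DISPLAY" ]; then
    echo "X11 forwarding active: $DISPLAY"
else
    echo "DISPLAY is not set -- X11 forwarding failed"
fi
```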

Still cannot get X11 working

Try removing the .Xauthority file on Hornet and logging in again:

rm ~/.Xauthority
exit
ssh -X <username>@hornet-login1.engr.uconn.edu