
How do I use Mathematica in a managed high-performance cluster?

Background information

Larger computing clusters consist of many nodes, each of which contains many CPU cores. Special software such as Mathematica is often available on the cluster. Users request these resources by logging into a head node and submitting batch jobs to the cluster manager; a job executes when the requested resources become available. Two common cluster managers are TORQUE and Slurm.

Parallelization in Mathematica uses the hub-and-spoke model where a controlling kernel manages a number of subordinate kernels (subkernels). In a cluster environment, the client runs the controlling kernel and the hosts provide subkernels. The cluster manager determines which node acts as the client and which nodes act as hosts.
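As a minimal illustration of this model (shown here with local subkernels, so it behaves the same on a desktop or on a single cluster node), the controlling kernel launches subkernels and asks each one to report the machine it is running on:

(*launch four subkernels from the controlling kernel*)
LaunchKernels[4];

(*each subkernel reports the name of the machine it runs on*)
ParallelEvaluate[$MachineName]

(*shut the subkernels down when finished*)
CloseKernels[];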

The benefits of running Mathematica on a cluster are twofold: even a single node usually offers more CPU cores than a desktop computer, and each individual core is typically faster than a desktop core.

Running a remote Mathematica front end

Running a remote front end requires the user to remain in control of the job’s resources; this is known as an interactive session.

Though the cluster usually offers a benefit in CPU speed over a desktop computer, a front end run in an interactive session responds more slowly than a front end run locally. This is because the front end runs on the cluster while its interface is forwarded over the network to the remote user’s computer.

An interactive session is not intended for CPU-intensive calculations; rather, it is used to test and diagnose code. It is typical to request only a single node’s resources.

  1. Log in to the head node via SSH with X-windows forwarding enabled.
  2. Launch an interactive session, requesting all of the resources on a single node.
  3. Start a Mathematica session.

From a Mathematica notebook, calling LaunchKernels[], or using any other parallel functionality, now launches subkernels running on the cluster.
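For example, the following (a minimal sketch) launches the default number of subkernels for the node, confirms how many were started and runs a small computation on them:

(*launch the default set of subkernels on the interactive node*)
LaunchKernels[];

(*confirm how many subkernels were started*)
Length[Kernels[]]

(*a small parallel computation distributed over the subkernels*)
ParallelTable[PrimeQ[2^i - 1], {i, 64}]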

Running a remote Mathematica script

It is assumed that

  • you are familiar with launching remote subkernels
  • the cluster is a Unix environment
  • the cluster uses a cloned file system such that Mathematica is run by the same executable on all nodes

If any of these assumptions does not hold, the following will need to be modified, but the general outline remains the same:

  1. query the system to find the names of the nodes that are assigned to the job, and how many cores per node are available
  2. manually launch remote kernels

For example, on a TORQUE-managed cluster, a Mathematica or Wolfram Language script would contain:

(*get association of resources, name of local host and remove local host 
from available resources*)
hosts = Counts[ReadList[Environment["PBS_NODEFILE"], "String"]];
local = First[StringSplit[Environment["HOSTNAME"],"."]];
hosts[local]--; (*reserve one core on the local node for the controlling kernel*)

(*launch subkernels and connect them to the controlling Wolfram Kernel*)
Needs["SubKernels`RemoteKernels`"];
Map[If[hosts[#] > 0, LaunchKernels[RemoteMachine[#, hosts[#]]]]&, Keys[hosts]];
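Slurm does not provide a PBS_NODEFILE, but the same outline applies. Below is a minimal sketch for a Slurm-managed cluster; it assumes that scontrol is on the PATH, that the job environment sets SLURM_JOB_NODELIST and SLURM_CPUS_ON_NODE, and that every node in the allocation provides the same number of cores:

(*expand the node list, read the cores per node and identify the local host*)
Needs["SubKernels`RemoteKernels`"];
nodes = ReadList["!scontrol show hostnames " <> Environment["SLURM_JOB_NODELIST"], "String"];
coresPerNode = ToExpression[Environment["SLURM_CPUS_ON_NODE"]];
local = First[StringSplit[Environment["HOSTNAME"], "."]];

(*launch subkernels on every node except the one running the controlling kernel*)
Map[If[# =!= local, LaunchKernels[RemoteMachine[#, coresPerNode]]] &, nodes];

(*use the remaining cores on the local node as local subkernels*)
LaunchKernels[coresPerNode - 1];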

At this point, parallel functions can use the whole set of available resources. When the parallel code is complete, it is good practice to close the subkernels.

CloseKernels[];