NGSolve

Application Software

For more on using NGSolve, visit ngsolve.org/docu/latest/i-tutorials/index.html

Getting Ready: Python Virtual Environments

To use MPI from Python, you need mpi4py installed in a Python virtual environment. If you already know how to do this, skip ahead to Sample Sbatch Script; otherwise, follow these steps to set up Python with MPI.

Load a Python module and build a virtual environment (venv).

$ module load Python/gcc/3.6.4/gcc-6.3.0

$ python3 -m venv mpiVirtualEnv

Turn on the venv.

$ source ./mpiVirtualEnv/bin/activate

(mpiVirtualEnv) $

Install mpi4py and Jupyter into this environment with Pip.

(mpiVirtualEnv) $ pip3 install mpi4py

(mpiVirtualEnv) $ pip3 install jupyter

Turn off the venv. (In sbatch scripts this step is unnecessary: the venv is deactivated automatically when the script terminates.)

(mpiVirtualEnv) $ deactivate

$

Congratulations! You now have a Python 3.6.4 virtual environment with mpi4py installed.
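As an optional sanity check, you can reactivate the venv and confirm that mpi4py imports cleanly (this assumes the Python module loaded earlier is still loaded):

$ source ./mpiVirtualEnv/bin/activate

(mpiVirtualEnv) $ python3 -c "from mpi4py import MPI; print(MPI.Get_version())"

(mpiVirtualEnv) $ deactivate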

Sample Sbatch Script

The NGSolve i-tutorials linked above are meant to be run in Jupyter notebooks (which is why Jupyter was installed into the virtual environment); the example below shows how to run an NGSolve script as a batch job instead.

Below are the files helloNgsolve.py and submitHelloNgsolve.sh, which perform a simple rank count-off. Submit the job to the Slurm scheduler on Coeus with sbatch submitHelloNgsolve.sh. When importing NGSolve into a Python script, note that you want to use NGSolve's MPI capabilities, which are obtained from MPI_Init().

helloNgsolve.py

from mpi4py import MPI

from ngsolve import *

comm = MPI_Init()

print("Hello from rank ", comm.rank, ' of ', comm.size)


submitHelloNgsolve.sh

#!/bin/bash

#SBATCH --job-name helloNgsolve # job name

#SBATCH --time 00:01:00 # walltime

#SBATCH --partition medium # partition type

#SBATCH --ntasks 4 # number of processor cores (i.e. tasks)

#SBATCH --nodes 2 # number of nodes

#SBATCH --error err.txt

#SBATCH --output out.txt


module load General/NGSolve/6.2.1806/openmpi-2.0/gcc-6.3.0


source ./mpiVirtualEnv/bin/activate

mpiexec -np 4 python3 helloNgsolve.py

# -np 4 launches 4 MPI processes, matching --ntasks above.

# Each process runs the script once; with 4 cores allocated, the copies run in parallel.
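Once the job completes, its output is written to out.txt and any errors to err.txt, as set by the #SBATCH directives above. A session should look roughly like the following (the job ID and the ordering of the ranks will vary):

$ sbatch submitHelloNgsolve.sh

Submitted batch job <jobid>

$ cat out.txt

Hello from rank  0  of  4

Hello from rank  2  of  4

Hello from rank  1  of  4

Hello from rank  3  of  4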