Commit 31fe172c authored by Jens Jørgen Mortensen

Fix docs

parent 4274ce1d
Pipeline #90898350 failed (25 minutes and 5 seconds)
@@ -12,9 +12,7 @@ Parallel runs
 Running jobs in parallel
 ========================
-Parallel calculations are done with MPI and a special
-:program:`gpaw-python` python-interpreter.
+Parallel calculations are done with MPI.
 The parallelization can be done over the **k**-points, bands, spin in
 spin-polarized calculations, and using real-space domain
 decomposition. The code will try to make a sensible domain
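For illustration (an editorial sketch, not part of this commit), the distribution can also be influenced from the script itself through GPAW's ``parallel`` keyword::

    from gpaw import GPAW

    # Ask for 4-way domain decomposition and 2-way band parallelization;
    # remaining processes are used for k-point parallelization.
    calc = GPAW(mode='fd',
                parallel={'domain': 4, 'band': 2})

The requested split must of course be compatible with the total number of MPI processes the job was started with.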
@@ -26,47 +24,44 @@ Before starting a parallel calculation, it might be useful to check how the
 parallelization corresponding to the given number of processes would be done
 with the ``--gpaw dry-run=N`` command line option::

-    $ python3 script.py --gpaw dry-run=8
+    $ gpaw python --dry-run=8 script.py

 The output will also contain the "Calculator" RAM memory estimate per process.
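For context, a minimal ``script.py`` to check this way could look as follows (an illustrative sketch using the standard ASE/GPAW API; the cutoff and k-point grid are arbitrary)::

    # bulk silicon in a plane-wave basis
    from ase.build import bulk
    from gpaw import GPAW, PW

    atoms = bulk('Si')
    atoms.calc = GPAW(mode=PW(300),    # 300 eV plane-wave cutoff
                      kpts=(4, 4, 4),  # k-points to parallelize over
                      txt='si.txt')    # parallelization info ends up here
    atoms.get_potential_energy()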
-In order to start a parallel calculation, you need to know the
-command for starting parallel processes. This command might also
-contain the number of processors to use and a file containing the names
-of the computing nodes. Some examples::
+In order to run GPAW in parallel, use one of these two commands::

+    $ gpaw -P <cores> python script.py
+    $ mpiexec -n <cores> python3 script.py

+The first way is recommended: it makes sure that imports are done in an
+efficient way.
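As a small sanity check (an editorial addition, not from this commit), a script can inspect the MPI world it was started in through ``gpaw.mpi``::

    from gpaw.mpi import world

    # every rank executes the whole script; print from rank 0 only
    if world.rank == 0:
        print(f'running on {world.size} cores')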
+Submitting a job to a queuing system
+====================================

+You can write a shell script that contains this line::

+    mpiexec gpaw python script.py

-    mpirun -np 4 gpaw-python script.py
-    poe "gpaw-python script.py" -procs 8

+and then submit it with ``sbatch``, ``qsub`` or some other command.
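For example, a minimal SLURM job script could look like this (the resource flags are site-specific assumptions, not part of this commit)::

    #!/bin/bash
    #SBATCH --job-name=gpaw-job
    #SBATCH --ntasks=8
    #SBATCH --time=01:00:00
    mpiexec gpaw python script.py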
+Alternatives:
-Simple submit tool
-==================

+* If you are on a SLURM system: use the :ref:`sbatch <cli>` sub-command
+  of the ``gpaw`` command-line tool::

-Instead of writing a file with the line "mpirun ... gpaw-python script.py" and
-then submitting it to a queueing system, it is simpler to automate this::

+      $ gpaw sbatch -- [sbatch options] script.py [script options]
-    #!/usr/bin/env python3
-    from sys import argv
-    import os
-    options = ' '.join(argv[1:-1])
-    job = argv[-1]
-    dir = os.getcwd()
-    f = open('script.sh', 'w')
-    f.write("""\
-    NP=`wc -l < $PBS_NODEFILE`
-    cd %s
-    mpirun -np $NP -machinefile $PBS_NODEFILE gpaw-python %s
-    """ % (dir, job))
-    f.close()
-    os.system('qsub ' + options + ' script.sh')
+* Use MyQueue_::

-Now you can do::

+      $ mq submit "script.py [script options]" -R <resources>

-    $ qsub.py -l nodes=20 -m abe job.py

+* Write your own *submit* script (a sketch follows below); see this
+  example: :git:`doc/platforms/gbar/qsub.py`.
+  You will have to modify the script so that it works with your queueing
+  system.

+.. _MyQueue: https://myqueue.readthedocs.io/
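A minimal sketch of such a custom submit script for a PBS system, modernized from the one removed above (the ``job.sh`` file name and the pass-through option handling are illustrative assumptions)::

    #!/usr/bin/env python3
    # write a job file and hand it to qsub
    import os
    import subprocess
    import sys

    options = sys.argv[1:-1]  # passed through to qsub, e.g. -l nodes=2
    script = sys.argv[-1]     # the GPAW Python script to run

    with open('job.sh', 'w') as fd:
        fd.write(f"""\
    cd {os.getcwd()}
    NP=$(wc -l < $PBS_NODEFILE)
    mpiexec -n $NP gpaw python {script}
    """)
    subprocess.run(['qsub', *options, 'job.sh'], check=True)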
 Alternative submit tool
@@ -91,7 +86,6 @@ variable meaning
 =============== ===================================
 HOSTNAME        name used to assign host type
 PYTHONPATH      path for Python
-GPAW_PYTHON     where to find gpaw-python
 GPAW_SETUP_PATH where to find the setups
 GPAW_MAIL       where to send emails about the jobs
 =============== ===================================
...
@@ -21,7 +21,7 @@ def main():
         help='Command-line argument for Python script.')
     args = parser.parse_args()
     arguments = ' '.join(args.argument)
-    cmd = f'gpaw-python {args.script} {arguments}'
+    cmd = f'gpaw python {args.script} {arguments}'
     nodes, rest = divmod(args.processes, maxcores)
     if nodes > 0:
         ppn = maxcores
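The ``divmod`` call above splits the requested process count into full nodes plus a remainder. A quick illustration (``maxcores = 8`` is an assumed per-node core count)::

    maxcores = 8                        # assumed cores per node
    nodes, rest = divmod(10, maxcores)  # 10 processes requested
    assert (nodes, rest) == (1, 2)      # one full node + 2 extra cores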
...