This module provides the quantum chemistry package VASP (Vienna Ab initio
Simulation Package) version 5.4.4.3.16052018 (see also 'http://www.vasp.at').
Access to the VASP module is restricted. If you do not hold a VASP group
license, you must not use this module. If you do hold a license but the
vasp commands are not working, please contact us (see contact below).
Please cite VASP in your publications as described in the VASP documentation.
Local documentation:
Commands 'vasp -help', 'vasp-pot -help' (available after loading this module)
/opt/bwhpc/common/chem/vasp/5.4.4.3.16052018/doc/vasp.pdf
Online documentation (for the latest version):
https://www.vasp.at/index.php/documentation
https://cms.mpi.univie.ac.at/vasp/vasp/vasp.html (old doc)
https://cms.mpi.univie.ac.at/wiki/index.php/The_VASP_Manual (new doc)
Primary input files (must exist in current directory before calling 'vasp'):
'INCAR', 'POSCAR', 'KPOINTS' and 'POTCAR'
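For example, a minimal pre-flight check (a sketch of our own, not a module
command) verifies that all four files are present before starting:
# Sketch: abort if a required VASP input file is missing in the current dir.
for f in INCAR POSCAR KPOINTS POTCAR; do
  [ -f "$f" ] || { echo "Missing input file: $f" >&2; exit 1; }
done
vasp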
Commands:
vasp - call a specific version of vasp (for details see 'vasp -help')
vasp-pot - create POTCAR files (for details see 'vasp-pot -help')
How to reduce disk I/O? Please add the following flags to your INCAR files:
LWAVE = .FALSE. ; LCHARG = .FALSE. ; LVTOT = .FALSE.
Only keep the wave function, potential, or charge files if they are really required.
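For example, the flags can be appended to an existing INCAR from the shell
(a sketch; adapt to your own workflow):
# Sketch: append the I/O-reducing flags to an existing INCAR file.
printf '%s\n' 'LWAVE = .FALSE.' 'LCHARG = .FALSE.' 'LVTOT = .FALSE.' >> INCAR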
Where to store scratch files? Where should VASP run?
* NEVER EVER calculate in your home directory.
* SINGLE-NODE JOBS: Use "${TMPDIR}" (or "${TMP}") as BASE-DIRECTORY,
so running jobs do not depend on any global file system.
* MULTI-NODE JOBS: Use a Lustre-based work space as BASE-DIRECTORY, so all
MPI processes have access to the same common directory across nodes.
If the job is submitted from within a work space directory, one can
use "${MOAB_SUBMITDIR}" in queueing system jobs.
* For each job create a job-specific SUB-DIRECTORY, for example
"${USER}.${MOAB_JOBID}.$(date +%y%m%d_%H%M%S)"
below the BASE-DIRECTORY, so multiple jobs/users do not interfere
(see the sketch after this list).
* Please read your cluster's local storage documentation
about where the global and local scratch spaces are located.
* Please carefully read the comments in the cluster-specific queueing
system example mentioned below.
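For illustration, a minimal sketch of the scratch-directory steps for a
single-node job (it assumes "${MOAB_JOBID}" is set by the queueing system;
for multi-node jobs replace the BASE-DIRECTORY with a Lustre-based work space):
# Sketch: run VASP in a job-specific scratch sub-directory.
BASE_DIR="${TMPDIR}"    # single-node job; see the hints above
SCRATCH_DIR="${BASE_DIR}/${USER}.${MOAB_JOBID}.$(date +%y%m%d_%H%M%S)"
mkdir -p "${SCRATCH_DIR}"
cd "${SCRATCH_DIR}" || exit 1
# ... copy INCAR, POSCAR, KPOINTS and POTCAR here, then:
vasp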
Interactive serial example (creates a lot of files in the current dir):
module load chem/vasp/5.4.4.3.16052018
tar -zxvf ${VASP_EXA_DIR}/bench.Hg.tar.gz
vasp
Interactive parallel example (creates a lot of files in the current dir):
module load chem/vasp/5.4.4.3.16052018
tar -zxvf ${VASP_EXA_DIR}/bench.Hg.tar.gz
vasp -n 4
Queueing system example:
${VASP_EXA_DIR}/bwunicluster-vasp-example.moab
Please carefully read the comments in the queueing system example script.
Parallelization is controlled via several flags in the 'INCAR' file:
'IALGO=48' (RMM-DIIS) usually scales better than 'IALGO=38' (DAV).
For large systems 'LREAL=Auto' (real space projectors) is very important.
'LPLANE=.TRUE.' reduces the communication between the workers.
Please test 'NSIM=4' or 'NSIM=8' (number of bands optimized simultaneously).
Please test 'NCORE=4' or 'NCORE=8' (number of cores working on one orbital).
Specify either 'NPAR' or 'NCORE': NPAR = (total_number_of_cores_of_job) / NCORE
Use 'KPAR=2' or higher (defaults to 1) when there are several k-points.
Use 'vasp -s gam' for gamma-point-only calculation (factor 2 faster).
'total_number_of_cores_of_job' should be a multiple of 'KPAR * NCORE'.
For details see 'https://cms.mpi.univie.ac.at/wiki/index.php/NCORE' and
'https://cms.mpi.univie.ac.at/vasp/vasp/Parallelisation_NPAR_NCORE_LPLANE_KPAR_tag.html'.
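Putting these hints together, a hypothetical INCAR parallelization section
for a 64-core job with several k-points could look like this (the values are
the examples from above and must be tested for your own system):
IALGO  = 48        ! RMM-DIIS, usually scales better than IALGO = 38 (DAV)
LREAL  = Auto      ! real-space projectors, important for large systems
LPLANE = .TRUE.    ! reduces communication between the workers
NSIM   = 4         ! test 4 or 8
NCORE  = 8         ! test 4 or 8; implies NPAR = 64 / 8 = 8
KPAR   = 2         ! several k-points; 64 is a multiple of KPAR * NCORE = 16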
WE CANNOT GUARANTEE THE FUNCTIONALITY AND ACCURACY OF ANY OF THE
COMPILED BINARIES. IT IS THE RESPONSIBILITY OF THE SCIENTIST TO
VERIFY THAT THE RESULTS OF THE CALCULATIONS ARE SOUND AND CORRECT.
BEFORE USING VASP, PLEASE READ VERY CAREFULLY (available after loading this module):
vasp -help
vasp-pot -help
less ${VASP_EXA_DIR}/bwunicluster-vasp-example.moab
The environment variables and commands are available after loading this module.
In case of problems, please contact 'bwunicluster-hotline (at) lists.kit.edu'.