bwUniCluster : chem/vasp/5.3.3.4

Category/Name/Version:   chem/vasp/5.3.3.4
Cluster:                 bwUniCluster
Message/Overview:
Conflicts:               conflict chem/vasp
Prereqs (dependencies):
License:                 commercial-VASP-research-group-license
URL:
What-is:                 chem/vasp/5.3.3.4 : Quantum chemistry package VASP (Vienna Ab initio Simulation Package) version 5.3.3.4 (restricted to groups with a VASP license)
Help text:
This module provides the quantum chemistry package VASP (Vienna Ab initio
Simulation Package) version 5.3.3.4 (see also 'http://www.vasp.at').

The VASP module is not available for everyone. If you do not possess a VASP
group license, you should not use this module. But if you own a license and
the vasp commands are not working, please contact us (see contact details below).

Please cite VASP in your publications according to VASP documentation.

Online documentation (for latest version):
  http://www.vasp.at/index.php/documentation
  http://cms.mpi.univie.ac.at/vasp/vasp/vasp.html

Local documentation:
  /opt/bwhpc/common/chem/vasp/5.3.3.4/doc/vasp.pdf
  Commands 'vasp -help', 'vasp-pot -help'

Primary input files (must exist in current directory before calling 'vasp'):
  'INCAR', 'POSCAR', 'KPOINTS' and 'POTCAR'
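
As a rough orientation (not part of this module's documentation), a minimal
'INCAR' for a first test could look like the sketch below; all tags are
standard VASP tags, but the values are generic assumptions and must be
adapted to your system:
  SYSTEM = test calculation             ! free-form title
  ENCUT  = 400                          ! plane-wave cutoff in eV (check ENMAX in POTCAR)
  ISMEAR = 0 ; SIGMA = 0.05             ! Gaussian smearing
  EDIFF  = 1E-5                         ! electronic convergence criterion
  LWAVE  = .FALSE. ; LCHARG = .FALSE.   ! reduce disk IO, see below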

Commands:
  vasp     - call specific version of vasp (for details see 'vasp -help')
  vasp-pot - create POTCAR files (for details see 'vasp-pot -help')
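
If you do not use the local 'vasp-pot' wrapper, the generic VASP convention
is to concatenate one POTCAR per element, in the same order in which the
elements appear in 'POSCAR'. The potential directory below is a hypothetical
placeholder; on this cluster the wrapper above is the supported way:
  POTDIR=/path/to/your/licensed/potpaw_PBE              # hypothetical location of the PAW potentials
  cat ${POTDIR}/Ti/POTCAR ${POTDIR}/O/POTCAR > POTCAR    # element order must match POSCAR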

How to reduce disk IO? Please add the following flags to your INCAR files:
  LWAVE  = .FALSE. ; LCHARG = .FALSE. ; LVTOT  = .FALSE.

Where to store scratch files? Where should VASP run?
  In case of multi-node jobs, all MPI worker processes must run in one
  common directory. As long as you do not store the wave function on
  disk ('LWAVE = .FALSE.'), vasp does not cause much disk IO. Therefore you may
  run the entire vasp calculation in a global scratch area (e.g. a work space
  if available). Please read your cluster's local storage documentation
  about where the global scratch space is located. You may also have a look
  into the queueing system example script mentioned further below.
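
As a sketch, assuming the bwHPC workspace tools ('ws_allocate', 'ws_find')
are available on your cluster (check the local storage documentation), an
interactive run in a global scratch workspace could look like this:
  module load chem/vasp/5.3.3.4
  WSDIR=$(ws_allocate vasp_run 30)              # workspace 'vasp_run', lifetime 30 days
  cd ${WSDIR}
  tar -zxvf ${VASP_EXA_DIR}/bench.Hg.tar.gz     # place your input files here
  vasp -n 4                                     # all IO happens in the workspace, not in $HOME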

Interactive serial example (creates a lot of files in the current dir):
  module load chem/vasp/5.3.3.4
  tar -zxvf ${VASP_EXA_DIR}/bench.Hg.tar.gz
  vasp

Interactive parallel example (creates a lot of files in the current dir):
  module load chem/vasp/5.3.3.4
  tar -zxvf ${VASP_EXA_DIR}/bench.Hg.tar.gz
  vasp -n 4

Queueing system example:
  ${VASP_EXA_DIR}/bwunicluster-vasp-example.moab
Please carefully read the comments in the queueing system example script.
The script includes instructions on how to submit it. Furthermore, the same
directory ('${VASP_EXA_DIR}') contains several archives with example input files for VASP.
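
For orientation only, a multi-node MOAB job script usually follows the
pattern sketched below; the node counts, walltime and MOAB directives are
assumptions, and the provided example script and your cluster's batch
documentation are authoritative:
  #!/bin/bash
  #MSUB -l nodes=2:ppn=16            # 2 nodes x 16 cores = 32 MPI processes (adapt!)
  #MSUB -l walltime=12:00:00
  #MSUB -N vasp-test
  module load chem/vasp/5.3.3.4
  cd ${MOAB_SUBMITDIR}               # or cd into a global scratch workspace
  vasp -n 32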

Parallelization is controlled via several flags in the 'INCAR' file (a combined example sketch follows this list):
  'IALGO=48' (RMM-DIIS) usually scales better than 'IALGO=38' (DAV).
  For large systems 'LREAL=Auto' (real space projectors) is very important.
  'LPLANE=.TRUE.' reduces the communication between the workers.
  Please test 'NSIM=4' or 'NSIM=8' (nb. of bands optimized simultaneously).
  Please test 'NCORE=4' or 'NCORE=8' (nb. of cores working on one orbital).
  Either specify 'NPAR' or 'NCORE': NPAR = (total_nb_of_cores_of_job) / NCORE
  Use 'KPAR=2' or higher (defaults to 1) when there are several k-points.
  Use 'vasp -s gamma' for gamma-point-only calculations (roughly a factor of 2 faster).
  The 'total_nb_of_cores_of_job' should be a multiple of 'KPAR * NCORE'.
  For details see 'http://cms.mpi.univie.ac.at/wiki/index.php/NCORE' and
  'http://cms.mpi.univie.ac.at/vasp/vasp/Parallelisation_NPAR_NCORE_LPLANE_KPAR_tag.html'.
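
Putting these hints together, an INCAR fragment for a hypothetical job with
4 nodes x 16 cores = 64 cores in total could look like the sketch below; the
values are starting points for your own benchmarks, not recommendations of
this module:
  IALGO  = 48          ! RMM-DIIS, usually scales better than IALGO=38
  LREAL  = Auto        ! real-space projectors, important for large systems
  LPLANE = .TRUE.      ! reduces communication between the workers
  NSIM   = 4           ! test 4 or 8
  NCORE  = 8           ! 8 cores work on one orbital
  KPAR   = 2           ! only useful with several k-points
  ! 64 cores is a multiple of KPAR * NCORE = 2 * 8 = 16, as required above.
  ! With NCORE set, do not additionally set NPAR (here NPAR would be 64 / 8 = 8).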

The environment variables and commands are available after loading this module.
In case of problems, please contact 'kiz.software-support (at) uni-ulm.de'.
Support:             bwHPC Support-Portal
Installation date:   05.10.2017
Removal date:
Best-Practice-Wiki: