Chemical Physics of Solids


Name of the cluster:

MARS

Institution:

Max Planck Institute for Chemical Physics of Solids

Login nodes:

All nodes in the cluster are login nodes:

  • mars[1-5,8-9].opt.rzg.mpg.de
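For example, to log in to one of these nodes via ssh (the user name below is only a placeholder for your MPCDF account):

  # any of the nodes listed above can be used in the same way
  $ ssh your_mpcdf_user@mars1.opt.rzg.mpg.de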

Hardware Configuration:

Login/Compute nodes:

  • mars1: Intel(R) Xeon(R) CPU E7-8867 v4 @ 2.40GHz; 4 sockets, 18 cores per socket, 2 threads per core (144 logical CPUs); 1.5 TB RAM
  • mars2: Intel(R) Xeon(R) Gold 6148 CPU @ 2.40GHz; 4 sockets, 20 cores per socket, 2 threads per core (160 logical CPUs); 1.5 TB RAM
  • mars3: Intel(R) Xeon(R) Gold 6248 CPU @ 2.50GHz; 4 sockets, 20 cores per socket, 2 threads per core (160 logical CPUs); 1.5 TB RAM
  • mars4: Intel(R) Xeon(R) Platinum 8268 CPU @ 2.90GHz; 4 sockets, 24 cores per socket, 1 thread per core (96 logical CPUs); 3.0 TB RAM
  • mars5: Intel(R) Xeon(R) Platinum 8268 CPU @ 2.90GHz; 4 sockets, 24 cores per socket, 1 thread per core (96 logical CPUs); 3.0 TB RAM
  • mars8: Intel(R) Xeon(R) CPU E7-4890 v2 @ 2.80GHz; 4 sockets, 15 cores per socket, 1 thread per core (60 logical CPUs); 1.5 TB RAM
  • mars9: Intel(R) Xeon(R) CPU E7-8867 v3 @ 2.50GHz; 4 sockets, 16 cores per socket, 1 thread per core (64 logical CPUs); 1.5 TB RAM

The node interconnect is based on 1 Gb/s Ethernet.
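The per-node CPU and memory figures listed above can be checked directly on a node with standard Linux tools, for example:

  # report logical CPUs, threads per core, cores per socket and sockets
  $ lscpu | grep -E '^CPU\(s\)|Thread|Core|Socket'
  # report the installed RAM
  $ free -h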

Filesystems:

  • Home directories are stored on AFS.

  • /batch3: shared GPFS filesystem (44 TB); no quotas are enforced. NO BACKUPS!
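Since no quotas are enforced on /batch3, it is worth checking the available space before writing large data sets; the per-user directory below is only an assumed layout, not a site convention:

  # show the free space on the shared GPFS filesystem
  $ df -h /batch3
  # create a personal working directory (naming scheme is just an example)
  $ mkdir -p /batch3/$USER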

Compilers and Libraries:

The “module” subsystem is implemented on MARS. Please use ‘module avail’ to see all available modules; a short usage example is given after the list below.

  • Intel compilers (-> ‘module load intel’): icc, icpc, ifort

  • GNU compilers (-> ‘module load gcc’): gcc, g++, gfortran

  • Intel MKL (‘module load mkl’): $MKL_HOME defined; libraries found in $MKL_HOME/lib/intel64

  • Intel MPI 2017.4 (‘module load impi’): mpicc, mpigcc, mpiicc, mpiifort, mpiexec, …
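A minimal build-and-run sketch using the modules listed above (the source file name and the number of MPI ranks are placeholders; the link line assumes the MKL single dynamic library interface):

  # load compiler, MKL and MPI environments
  $ module load intel mkl impi
  # compile an MPI Fortran code and link it against MKL (file name is an example)
  $ mpiifort -O2 -o my_prog my_prog.f90 -L$MKL_HOME/lib/intel64 -lmkl_rt
  # start 16 MPI ranks on the local node
  $ mpiexec -n 16 ./my_prog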

Batch system based on Slurm:

  • only for administrative system monitoring

Useful tips:

We recommend submitting jobs from the /ptmp filesystem instead of using the $HOME directories on AFS.

OpenMP codes require the environment variable OMP_NUM_THREADS to be set.
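For example, to run an OpenMP code interactively from /ptmp (the directory layout and the executable name are placeholders):

  # work on /ptmp rather than in the AFS home directory
  $ cd /ptmp/$USER
  # set the number of OpenMP threads explicitly, e.g. 16 threads
  $ export OMP_NUM_THREADS=16
  $ ./my_openmp_prog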

Support:

For support please create a trouble ticket at the MPCDF helpdesk.