# Vera

## Queue
The current availability of resources in the queue on the login node is shown below (link):

Queue information is only accessible from within SUNET networks (a VPN is necessary if you are outside).
## Hardware
The Vera cluster contains several hardware models, and this page is updated to reflect the current state.
- Intel(R) Xeon(R) Gold 6338 and Platinum 8358 (code-named "Icelake") CPUs.
- AMD EPYC 9354 Zen4 (code-named "Genoa") CPUs.
All nodes have dual CPU sockets and fast Ethernet and InfiniBand networks. Some nodes have A40, A100, or H100 NVIDIA GPUs.
The main vera partition has:
#nodes | CPU | #cores | RAM (GB) | TMPDIR (GB) | GPUs | Nodes |
---|---|---|---|---|---|---|
83 | Zen4 | 64 | 768 | 845 | | vera-r01-[16-24],vera-r[02-04]-[01-24],vera-r05-[04,06] |
1 | Zen4 | 64 | 1536 | 845 | | vera-r05-05 |
2 | Zen4 | 64 | 1536 | 845 | 4xH100 | vera-r05-[01-02] |
4 | Icelake | 64 | 512 | 837 | 4xA40 | vera-r07-[01-04] |
3 | Icelake | 64 | 512 | 837 | 4xA100 | vera-r07-[05-06],vera-r08-01 |
5 | Icelake | 64 | 512 | 837 | (expandable) | vera-r08-[02-06] |
52 | Icelake | 64 | 512 | 852 | | vera-r08-[07-14],vera-r09-[01-20],vera-r10-[01-24] |
6 | Icelake | 64 | 1024 | 407 | | vera-r07-[07-15] |
Login nodes are AMD Zen4 machines with 1536GB of RAM and are equipped with NVIDIA L40s for remote graphics and 100G Ethernet.
Several local research groups have also purchased private partitions with additional nodes. You can get specific node information from Slurm with:
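For example, the standard Slurm tools `sinfo` and `scontrol` report per-partition and per-node details (the node name below is just an example taken from the table above):

```shell
# Summarize partitions: name, node count, CPUs and memory per node, GRES (GPUs), node list
sinfo --format="%P %D %c %m %G %N"

# Show full details (CPUs, RealMemory, Gres, State, ...) for a single node;
# the node name is illustrative -- substitute any node from the tables on this page
scontrol show node vera-r05-01
```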
The Icelake expansion has 25G Ethernet network for filesystem access and 100 Gbps Infiniband high-speed/low-latency network for parallel computations.
The Zen4 expansion has 25G Ethernet network for filesystem access and 200 Gbps Infiniband high-speed/low-latency network for parallel computations.
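To use one of the GPUs listed above, you request it through Slurm. A minimal batch-script sketch, assuming the GPU types are exposed in Slurm under the names from the tables (the partition name and type strings are site-configuration details, so check them against `sinfo` output):

```shell
#!/bin/bash
#SBATCH --partition=vera        # main partition described on this page
#SBATCH --gpus-per-node=A40:1   # request one A40; type name assumed to match the site's GRES config
#SBATCH --ntasks=1
#SBATCH --time=01:00:00

# Print the GPU(s) actually allocated to the job
nvidia-smi
```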
## GPU cost on Vera
Jobs "cost" based on the number of physical cores they allocate, plus an additional cost for each allocated GPU:
Type | VRAM | Additional cost (core-equivalents) | FP16 TFLOP/s | FP32 TFLOP/s | FP64 TFLOP/s |
---|---|---|---|---|---|
A40 | 45GiB | 16 | 37.4 | 37.4 | 0.58 |
A100 | 40GiB | 48 | 77.9 | 19.5 | 9.7 |
H100 | 94GiB | 160 | 248 | 62 | 30 |
- Performance numbers are theoretical and real-world performance may differ greatly. The performance of the newer Ampere and Hopper GPUs depends on using "Tensor Cores" and reduced precision for best results.
#GPUs | GPU type | Compute capability | CPU |
---|---|---|---|
16 | A40 | 8.6 | Icelake |
12 | A100 | 8.0 | Icelake |
8 | H100 | 9.0 | Zen4 |
- Example: a job using a full node with 4 A40 GPUs for 10 hours costs (64 + 16*4) * 10 = 1280 core-hours.
- Note: 16, 32, and 64 bit floating point performance differs greatly between these specialized GPUs. Pick the one most efficient for your application.
- Additional running cost is based on the price compared to a CPU node.
- You don't pay any extra for selecting a node with more memory, but you are typically competing for less available hardware.
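The cost arithmetic above can be sketched as a small calculation; the values below reproduce the A40 example (64 cores, 4 GPUs at 16 core-equivalents each, 10 hours), with per-GPU costs taken from the table:

```shell
# Job cost in core-hours = (allocated cores + GPU count * per-GPU additional cost) * hours
cores=64
gpus=4
gpu_cost=16   # additional cost per A40 (48 for A100, 160 for H100)
hours=10
echo $(( (cores + gpus * gpu_cost) * hours ))   # prints 1280
```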
## Support
If you need support (trouble logging in, how to run your software, etc.), please first:
- Contact the PI of your project and see if they can help
- Talk with your fellow students/colleagues
- Contact C3SE support
## Vera allocations

### Chalmers local allocations
All departments at Chalmers have the right to allocations on Vera. For departments that do not yet have any allocations, speak to your prefect, who has been sent information.
Many departments have already had their projects allocated, and manage their members themselves via their selected Principal Investigators (PIs) in SUPR. Researchers should speak to their respective supervisor for access.
Below is a list of PIs with current allocations (C3SE20YY-1-XX projects):
- Architecture and Civil Engineering (ACE)
    - Holger Wallbaum C3SE 2025/1-8
- Computer Science and Engineering (CSE)
    - Miquel Pericas C3SE 2025/1-14
- Electrical Engineering (E2)
    - Thomas Rylander C3SE 2025/1-17
- Physics
    - Henrik Grönbeck C3SE 2025/1-7
- Industrial and Material Science (IMS)
    - Martin Fagerström C3SE 2025/1-4
- Chemistry and Chemical Engineering
    - Ronnie Andersson C3SE 2025/1-9
    - Ergang Wang C3SE 2025/1-18
    - Itai Panas C3SE 2025/1-3
    - Martin Rahm C3SE 2025/1-2
    - Alexander Giovannitti C3SE 2025/1-5
- Life Sciences
    - Jens Nielsen (has not yet applied for resources)
    - Aleksej Zelezniak (has not yet applied for resources)
    - Annika Polster C3SE 2025/1-16
    - Johan Bengtsson-Palme C3SE 2025/1-11
    - Eduard Kerkhoven (has not yet applied for resources)
    - Thomas Svensson (has not yet applied for resources)
    - ChemBio common - Johan Bengtsson-Palme C3SE 2025/1-13
    - SysBio common - Johan Bengtsson-Palme C3SE 2025/1-12
- Mathematical Sciences
    - Tobias Gebäck C3SE 2025/1-15
- Mechanics and Maritime Sciences (M2)
    - Lars Davidson & Rickard Bensow C3SE 2025/1-10
- Microtechnology and Nanoscience (MC2)
    - Elsebeth Schröder C3SE 2025/1-6
- Space, Earth and Environment (SEE)
    - Wouter Vlemmings C3SE 2025/1-1
- Technology Management and Economics
    - No PI assigned yet
- Communication and Learning in Science
    - No PI assigned yet
### GU and others
The following people have bought time on Vera:
- Department of Physics
    - Mats Granath C3SE 408/25-1
- Department of Chemistry & Molecular Biology
    - Richard Neutze C3SE 408/22-1
### Student allocations
Chalmers funds student projects that need HPC clusters.
The e-Commons Stud round is only available for teachers involved in the MATS, KFM programs, and Tracks courses at Chalmers.
To create a new proposal for e-Commons Stud see https://supr.naiss.se/round/stud-e-commons/
Important
Reminder: this round in SUPR is for teaching allocations only. Students who require access to HPC resources should contact their course teachers, who can create the project. PIs and supervisors who have not yet applied for resources linked to their existing allocations are kindly encouraged to do so via SUPR, so that students can benefit from the available computational resources. Students who do not see an active project under their supervisor should ask their supervisor to submit an application in SUPR.
Note
For CSE students and researchers seeking access to the Minerva cluster, please contact Matti Karppa at karppa@chalmers.se.
The public webpage for the Minerva cluster, intended for students and other users, is available here.