Lichtenberg II Cluster Darmstadt

University

Technische Universität Darmstadt

Info

Like its predecessor, Lichtenberg II is a high-performance computer in the medium performance class (Tier 2). It has a heterogeneous architecture including large-memory nodes, accelerator systems with dedicated NVIDIA GPGPUs, and several DGX A100 systems to support modern AI research.

Add. Info

HPC consultation hour and cluster introduction:

  • Video consultation for all questions about using the cluster and about user and project applications: every Wednesday, 10 – 11 am, as a Zoom meeting:
    Meeting ID: 665 7163 2291
    Passcode: 113329
  • Monthly introduction to the use of the cluster: usually every 2nd Tuesday of the month

HPC Support

HKHLR Team Technische Universität Darmstadt

Downloads

Quick Reference Card Lichtenberg Darmstadt
Cluster Access

The cluster is open to scientists from all universities and public research institutions in Germany. The computing resources are granted via a science-led allocation procedure.

Typical Node Parameters

MPI section
  Cores (sockets x cores/socket): 2x48
  Memory: 384 GB
  FLOPS/Core (DP, theor. peak): 52 GFLOPS
  CPU Type: Intel Xeon Platinum 9242
  Bandwidth: 10 GB/s
  Memory Bandwidth (triad): 400 GB/s

MEM section
  Cores (sockets x cores/socket): 2x48
  Memory: 1536 GB
  FLOPS/Core (DP, theor. peak): 52 GFLOPS
  CPU Type: Intel Xeon Platinum 9242
  Bandwidth: 10 GB/s
  Memory Bandwidth (triad): 400 GB/s

ACC section GPU
  Cores (sockets x cores/socket): 4x24
  Memory: 384 GB
  CPU Type: Intel Xeon Platinum 8260
  Bandwidth: 10 GB/s
  Memory Bandwidth (triad): 400 GB/s
  Accelerators: 4x NVIDIA Tesla V100

ACC section DGX A100
  Cores (sockets x cores/socket): 4x24
  Memory: 384 GB
  CPU Type: Intel Xeon Platinum 8260
  Bandwidth: 10 GB/s
  Memory Bandwidth (triad): 400 GB/s

Node Allocation: Shared and exclusive
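
A minimal Slurm batch script for a job on two full MPI-section nodes might look like the sketch below. The partition, account, and module names are placeholders and assumptions, not taken from the cluster documentation; the Quick Reference Card linked above lists the actual values.

#!/bin/bash
#SBATCH --job-name=mpi_example
#SBATCH --nodes=2                   # two MPI-section nodes
#SBATCH --ntasks-per-node=96        # 2 sockets x 48 cores per node
#SBATCH --exclusive                 # nodes can also be allocated shared
#SBATCH --time=24:00:00             # 24 h walltime (see job constraints below)
#SBATCH --partition=mpi_partition   # placeholder: substitute the real partition name
#SBATCH --account=my_project        # placeholder: substitute the granted project account

module purge
module load gcc openmpi             # module names are assumptions; check "module avail"

srun ./my_mpi_application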

Global Cluster Parameters

Processors (CPU, DP, peak): approx. 4.5 PFLOPS
Accelerators (GPU, DP, peak): 424 TFLOPS
Computing cores (CPU): 62,592
Job Manager: Slurm Workload Manager
Other Job Constraints: runtime 24 h, max. 7 d
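
Under the job constraints above, a job can request a walltime of up to 7 days. A hedged sketch for a single-GPU job on an ACC-section node follows; again, the partition, account, and module names are placeholders and not taken from the cluster documentation.

#!/bin/bash
#SBATCH --job-name=gpu_example
#SBATCH --nodes=1
#SBATCH --ntasks=1
#SBATCH --cpus-per-task=24          # one of the four 24-core sockets
#SBATCH --gres=gpu:1                # one of the node's four V100 GPUs
#SBATCH --time=7-00:00:00           # maximum allowed walltime of 7 days
#SBATCH --partition=acc_partition   # placeholder: substitute the real partition name
#SBATCH --account=my_project        # placeholder: substitute the granted project account

module purge
module load cuda                    # module name is an assumption; check "module avail"

srun ./my_gpu_application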
