HPC Newsletter - 2026/04

Welcome to the April 2026 HPC newsletter.

This is a bit of a “light” month due to Easter, but a few software packages have been added or updated since last month's newsletter, along with some minor system changes aimed at improving the stability of your compute jobs and reducing a small number of error states.

We're also starting to log performance data from all of our Slurm partitions so that we can analyse where resources should be allocated. As you will know, Comet operates with a free/paid resource split; the initial allocation of resources to each side of that split (at the time the system was commissioned) was a best guess, since we had no previous data. Now that real work is being undertaken on Comet, we can log and analyse where the resource bottlenecks are, and this information will be used going forward to give everyone the best possible service with the available hardware.
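As a rough illustration of the kind of data involved, Slurm's accounting tools can report cluster utilisation over a date range. The sketch below uses the standard sreport command; the date range matches the summary figures in this newsletter, but how we actually collect the data is not specified here, so treat this purely as a hedged example.

```shell
# Hedged sketch: query Slurm's accounting database for cluster utilisation
# over the reporting window. "sreport cluster utilization" is a standard
# Slurm command; this is NOT necessarily how our logging pipeline works.
if command -v sreport >/dev/null 2>&1; then
  REPORT=$(sreport cluster utilization start=2026-04-01 end=2026-04-20 -t hours)
else
  REPORT="sreport-not-available"   # e.g. when run on a machine without Slurm
fi
echo "$REPORT" | head -n 5
```

Per-partition breakdowns need job-level accounting (e.g. sacct with a partition filter) rather than the cluster-wide summary shown here.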

HPC Summary for April 2026

  • Registered projects: 541
  • Active projects: 176
  • Registered users: 983
  • Driving tests: 960 taken, 394 passes
  • CPU time (April 1st–20th): 1,008,143 hours of compute

Software Changes

New software

  • Freesurfer is in the process of being installed as an application environment
  • SAIGE application container added:
    • https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software:saige
  • ldsc application container added:
    • https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software:ldsc
  • vLLM LLM inference engine container added:
    • https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software:vllm

Changed software

  • CASTEP updated to 26.01
    • https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software:castep
  • Bioapps container - Tophat 1 + 2 added to the container environment
    • https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software:bioapps

Website & Documentation

We have expanded the Reports section of the website, and you can now publicly browse regularly updated data on staff / student use of the HPC, free / paid partition demand, a heatmap of node utilisation and detailed node performance metrics. Most reports allow viewing Comet data by day, month and year to identify trends and performance characteristics.

You can access the expanded report section at: https://hpc.researchcomputing.ncl.ac.uk/reports/

  • Staff / Student use of HPC facilities:
    • https://hpc.researchcomputing.ncl.ac.uk/reports/people
  • Free / Paid partition use and demand backlog:
    • https://hpc.researchcomputing.ncl.ac.uk/reports/partitions/free/2026/4/9
    • https://hpc.researchcomputing.ncl.ac.uk/reports/partitions/paid/2026/4/9
  • Comet HPC compute node utilisation heatmap:
    • https://hpc.researchcomputing.ncl.ac.uk/reports/nodemap/2026/4
  • Comet HPC node performance statistics:
    • https://hpc.researchcomputing.ncl.ac.uk/reports/nodes

We have also added help pages for more than 200 of the software packages which were requested for install on Comet. This has identified a small number of software modules which are missing from the list which was intended to be installed by our vendor, and we'll be following up on these in due course: https://hpc.researchcomputing.ncl.ac.uk/dokuwiki/doku.php?id=advanced:software_list

Previously the HPC website hosted details of the advanced computing equipment request process and panel under the Policies & Procedures section of the wiki. Those pages have now been removed; NUIT will handle all advanced computing equipment requests via the normal channels (i.e. https://nuservice.ncl.ac.uk) in the first instance.

System Changes

All compute nodes are being updated with a symlink from the /tmp directory to the /scratch filesystem in order to prevent “Out of disk space” issues related to temporary files created during the runtime of your compute jobs.
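Even with the symlink in place, you can direct temporary files to scratch explicitly in your job scripts by setting TMPDIR, which most tools respect. The sketch below is a hypothetical pattern, not an official Comet script: the scratch base path and directory layout are assumptions, and the fallback values let it run outside Slurm.

```shell
# Hypothetical sketch: create a per-job temporary directory on the scratch
# filesystem and point TMPDIR at it. SCRATCH_BASE here is an assumption --
# on Comet the symlinked target is /scratch, per the newsletter.
SCRATCH_BASE="${SCRATCH_BASE:-/tmp/scratch-demo}"
JOB_ID="${SLURM_JOB_ID:-interactive}"   # set by Slurm inside a batch job
export TMPDIR="${SCRATCH_BASE}/${USER:-nouser}/job-${JOB_ID}"
mkdir -p "$TMPDIR"
echo "Temporary files will go to: $TMPDIR"
```

Cleaning up the directory at the end of the job (e.g. in a trap or epilogue step) keeps scratch usage under control.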

All GPU nodes (gpu001, gpu002, gpu003, gpu004, hgpu001) have been updated to a consistent, stable Nvidia driver (version 580.126.20, with CUDA 13 support) and now run identical software versions. This should reduce the possibility of errors caused by variations in the Nvidia driver between nodes.
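If you want to confirm which driver a job actually ran against (useful when diagnosing past failures), you can log it at job start. The query flags below are standard nvidia-smi options; the fallback branch is only there so the sketch runs on machines without a GPU.

```shell
# Sketch: record the Nvidia driver version visible to this job, so logs
# show whether a failure coincided with a driver mismatch.
if command -v nvidia-smi >/dev/null 2>&1; then
  DRIVER=$(nvidia-smi --query-gpu=driver_version --format=csv,noheader | head -n 1)
else
  DRIVER="no-nvidia-driver"   # fallback for non-GPU machines
fi
echo "Nvidia driver: $DRIVER"
```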

Several compute nodes are moving from paid partitions to free partitions, increasing the number of CPU cores available to unfunded projects by 1,536. This should somewhat reduce the backlog and job waiting times for users of free projects. We will continue to analyse the performance data to monitor demand and wait times after this change has been made.

Community Events

On Thursday and Friday last week (16–17 April 2026) we held our first ever miniHPC hackathon at Newcastle University, as part of our Head of Training Jannetta Steyn's CarpentriesOffline activities. The event was sponsored by the Software Sustainability Institute and the Society of Research Software Engineering; as a bonus, Filamentive donated filament for 3D-printed goodies, and the National Innovation Centre for Data sponsored the venue.

Attendees came from all over the UK as well as Newcastle University to work on a variety of Raspberry Pi-based HPCs (and one RockPi HPC), including the DRIFT "Visible HPC" brought by Jeremy Cohen from Imperial.

Our MS Teams HPC and Code Community is growing! Please post your experiences and questions in the HPC channel, or just browse the conversations to see if others have encountered similar issues.

Don't forget our monthly inductions for experienced HPC users: a 2-hour online session with your HPC support team, and a chance to meet us and find out about Comet's facilities and how they differ from other HPCs you may have used.

For new users and those wondering if HPC will be right for them, we have our 'Introduction to HPC' full-day, in-person workshops, held once a month until the Summer break. Our one-day Unix Shell workshop is a prerequisite for the HPC workshop, so please book this first, or ensure you have independently worked through the Unix Shell lesson materials. PGR students can gain credits by attending the full-day Unix Shell and Intro to HPC workshops.

Book at https://pretix.eu/ncl/ or browse all our learning materials and upcoming workshops. (Do join the waiting list if your workshop is full: it helps us gauge interest, you will get priority booking for later workshops if you miss out, and places often open up due to late cancellations.)

Training Workshops next month

  • Mon 11 May 2026: Induction to Comet for experienced HPC Users
  • Thu 14 May 2026: Software Carpentries: Programming with Python
  • Tue 19 May 2026: Software Carpentries: Introduction to HPC

In The Pipeline

We are looking at further optimisation of the compute resource distribution across the free and paid partitions to increase our utilisation figures further. Expect to see further development in late April/early May.

