Written by Chris Kelly, Vice President and General Manager, Compute and Networking, Dell EMC Australia and New Zealand
It’s not every day that you can truthfully claim to be continuing the work of Albert Einstein. With Swinburne University of Technology’s selection of Dell EMC to build the supercomputer for its ARC Centre of Excellence for Gravitational Wave Discovery (OzGrav), it’s something I intend to drop into conversation at the merest hint of an opportunity.
OzGrav, funded by the government through the Australian Research Council and hosted at the University, is at the forefront of a relatively new field of physics, gravitational wave astronomy.
Gravitational waves are ripples in space-time created by accelerating massive objects, such as neutron stars or colliding black holes. They were first predicted by Einstein in 1916, as a consequence of his theory of General Relativity. He theorised that space-time is dynamic, warped by the massive objects within it, such as planets and stars. Ask an astrophysicist to explain this to you, and they will usually start talking about what happens to the shape of a trampoline (space) when you place a bowling ball (massive object) on it.
Small Needle, Big Haystack
However, unlike the dramatic bowling-ball-on-trampoline warping of space-time caused by planets, gravitational waves are very feeble, which is why they remained in the realm of theoretical physics for a century. There simply wasn't technology capable of detecting them.
All that changed in 2015, when the Laser Interferometer Gravitational-wave Observatory (LIGO) was finally able to detect ripples created by the collision of two black holes, which merged into a single larger black hole. LIGO has since detected two more black hole mergers, with the most recent one in June. This collision happened three billion light years away and, despite the resulting black hole being 50 times more massive than the sun, the ripples only moved the LIGO detector by one attometer (0.000000000000000001m, if you're keeping count).
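To get a feel for just how tiny that signal is, here is a quick back-of-envelope calculation. It assumes LIGO's well-known 4 km arm length (a figure not stated in this article) and the one-attometer displacement mentioned above:

```python
# Back-of-envelope: how small is one attometer relative to LIGO's arms?
arm_length_m = 4_000       # LIGO arm length: 4 km (assumption, not from the article)
displacement_m = 1e-18     # one attometer, per the article

# Fractional strain: the change in arm length divided by the arm length itself
strain = displacement_m / arm_length_m
print(f"fractional strain ~ {strain:.1e}")
```

That works out to a fractional change of a few parts in 10^22, which is why detecting gravitational waves took a century of instrument development.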
The confirmation of gravitational waves opens up a new frontier in understanding how our universe works, and as a university world-renowned for astronomy and physics, Swinburne University was keen to lead the charge. But when you're talking about a place as big as space and an effect that small, you need some serious computational grunt for the task.
A Solution As Malleable as Space-Time
Which is why Swinburne turned to Dell EMC for its new high-performance computing solution. The $4 million OzStar supercomputer will spend around a third of its time sifting through reams of data, using gravitational waves to search for new relativistic objects in space.
The supercomputer is a modular solution featuring the new Dell EMC 14th Generation PowerEdge Servers, Dell EMC H-Series Networking Fabric and Dell EMC HPC Storage with the Intel Enterprise Edition for Lustre file system. With 115 PowerEdge R740 compute nodes, each with Intel Xeon Scalable processors and NVIDIA Tesla GPUs, it will deliver performance exceeding one petaFLOP.
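For readers curious how a cluster of that size clears the petaFLOP mark, here is a rough sketch of the usual peak-performance arithmetic. The node count (115) comes from this article; the per-node CPU and GPU throughput figures and the GPU count are illustrative assumptions, not published OzStar specifications:

```python
# Theoretical peak for a GPU-accelerated cluster: each node contributes
# its CPU throughput plus the combined throughput of its GPUs.
def peak_tflops(nodes, cpu_tflops_per_node, gpus_per_node, gpu_tflops_each):
    return nodes * (cpu_tflops_per_node + gpus_per_node * gpu_tflops_each)

# 115 nodes per the article; per-node figures below are assumed for illustration.
total = peak_tflops(nodes=115, cpu_tflops_per_node=1.5,
                    gpus_per_node=2, gpu_tflops_each=4.7)
print(f"~{total / 1000:.2f} PFLOPS")  # comfortably over one petaFLOP
```

Most of the headline number comes from the GPUs, which is typical of modern HPC designs and why accelerator-equipped nodes dominate the top of supercomputing rankings.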
It's a long way from an earlier Swinburne University and Dell EMC collaboration. When the university installed its original supercomputer 15 years ago, Dell EMC's first in Australia, the system, built on Dell PowerEdge 1950 nodes, was only the second machine in Australia to exceed one teraflop of performance.
Scaling-Out to Infinity and Beyond
Given that only part of its processing time will be dedicated to the hunt for gravitational waves, the OzStar is an ideal solution for Swinburne. It will be deployed across a range of disciplines, including molecular dynamics, nanophotonics (the use of light on a nanoscale), advanced chemistry and atomic optics. Consequently, flexibility is king.
Most high-performance computing systems are modular enough to add nodes as needed, accommodating a massive project or a host of smaller ones. But traditionally, adding nodes also introduced bottlenecks.
By using a fast networking and server architecture, we minimise this performance hit. The ability to easily scale out or up also means the system fits within the confines of the university's existing infrastructure, but lets the university build up when needed. Such flexibility also keeps upfront costs down, because investment can happen on an "as-needed" basis.
Swinburne has a long-held commitment to minimising the environmental impact of its supercomputer systems. The original one was dubbed The Green Machine and its 2011 successor Green II. OzStar continues that tradition. Through careful attention to heating and cooling, and a very high performance-per-watt ratio, we can help Swinburne cut emissions and reduce its carbon footprint.
The Search for All Kinds of Answers
The OzStar is expected to be up and running by the end of the month, at which point it can start on its mission to calculate theoretical models and crunch data that will increase our understanding of the extreme physics of black holes and warped space-time. It’s exciting to think it will be making advances in a field of study that didn’t really exist two years ago.
It highlights the role that high-performance computing is carving out as emerging technologies, such as artificial intelligence, deep analytics and machine learning, demand processing power that can't be achieved with commodity cloud or enterprise infrastructure solutions. Systems such as OzStar aren't just following in Einstein's footsteps in understanding gravitational waves; they will let us reach for the stars in a range of fields, from understanding diseases to improve treatments, to manufacturing better and safer products. A whole universe is waiting for us, and that's something to brag about.