High Performance and Cloud Computing at SMART Infrastructure Facility

By Dr Matthew Berryman

What are High Performance Computing and Cloud Computing?

High performance computing (HPC) connects many smaller computers (compute nodes) via very high-speed links into one machine that, using special software, we can treat almost as a single very large computer. Cloud computing takes a more disaggregated approach: although the computers are still networked together, we treat each as an individual machine and distribute the software across them, for example running some as database servers, some as web application servers, and others as data-processing nodes.
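The core idea of mapping one big computation onto many workers can be sketched in miniature with Python's standard library: split a job into independent chunks and farm each chunk out to a separate process, much as a cluster scheduler maps work onto compute nodes. (This is a toy analogue on a single machine, not how a real HPC scheduler such as PBS or Slurm is programmed.)

```python
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    """Sum the integers in [start, stop) -- a stand-in for a compute-heavy task."""
    start, stop = bounds
    return sum(range(start, stop))

def distributed_sum(n, workers=4):
    """Split [0, n) into chunks and sum each chunk in a separate process,
    mimicking how independent work is mapped onto compute nodes."""
    step = n // workers
    chunks = [(i * step, (i + 1) * step if i < workers - 1 else n)
              for i in range(workers)]
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

if __name__ == "__main__":
    print(distributed_sum(1_000_000))
```

Because each chunk is independent, adding more workers (or, on a real cluster, more nodes) shortens the wall-clock time without changing the answer.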

In addition to general-purpose processors known as CPUs (central processing units), some HPC and cloud computing facilities now include other types of processors, such as the Intel Xeon Phi family of co-processors and GPUs (graphics processing units repurposed for raw computing rather than graphics), both of which can accelerate scientific computing, big data, and machine learning applications.
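What makes GPUs useful for science is data parallelism: applying the same operation to millions of elements at once. A minimal sketch of that style, written here with NumPy on the CPU as a stand-in (with a GPU array library such as CuPy, essentially the same code would run on the GPU; that substitution is an assumption about the reader's setup, not something shown here):

```python
import numpy as np

# Data-parallel style: one vectorised expression updates a million
# elements at once -- the pattern GPUs accelerate. No explicit Python
# loop over elements is needed.
x = np.linspace(0.0, 1.0, 1_000_000)
y = 3.0 * x**2 + 2.0 * x + 1.0  # evaluated element-wise across the whole array

print(y.shape)
```

The same expression works whether the array holds a thousand elements or a billion; only the hardware underneath changes.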

Dr Berryman recently visited the National Computational Infrastructure in Canberra, which is home to the Southern Hemisphere’s most highly integrated supercomputer and file systems, Australia’s highest-performance research cloud, and one of the nation’s largest data catalogues.

Why do we need them?

Simulation, Modelling, Analysis, Research and Teaching (SMART) can all benefit from high performance and cloud computing in different ways. Simulating and modelling complex urban systems can require large amounts of computing power to capture the way citizens interact with urban infrastructure. Some of SMART’s models, for example Shaping the Sydney of Tomorrow, feature hundreds of thousands of simulated citizens at an individual level, and for this we have used both CPU and GPU computing on high performance and cloud computing architectures. Our analyses can be quite data-intensive, requiring large computers (HPC and/or cloud, depending on the exact data and compute requirements). We then deliver our analyses over web platforms, which are well suited to cloud hosting because we can automatically and dynamically scale the web applications to cope with varying web traffic, as we do in the petajakarta.org project.
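To give a feel for what "individual-level" simulation means, here is a toy sketch (illustrative only, not SMART's actual model): every simulated citizen carries its own state, and each time step updates all of them, so the computational cost grows directly with the number of citizens.

```python
import random

def step(agents, influence=0.1):
    """Advance every agent one tick: each agent drifts toward the population
    mean position plus some individual noise -- a toy stand-in for citizens
    responding to shared infrastructure."""
    mean = sum(agents) / len(agents)
    return [a + influence * (mean - a) + random.uniform(-0.5, 0.5)
            for a in agents]

def simulate(n_agents=100_000, ticks=10, seed=42):
    """Run a population of agents forward for a number of ticks."""
    random.seed(seed)
    agents = [random.uniform(0.0, 100.0) for _ in range(n_agents)]
    for _ in range(ticks):
        agents = step(agents)
    return agents

if __name__ == "__main__":
    final = simulate(n_agents=10_000, ticks=5)
    print(len(final))
```

Because each agent's update in a tick is independent of the others (given the shared mean), steps like this parallelise well across CPU cores or GPU threads, which is why models with hundreds of thousands of agents become feasible on HPC and cloud hardware.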
