IT Architecture at SMART

An explainer, by SMART’s IT Architect Matthew Berryman. 

Modelling and analysis of large systems of infrastructure systems brings a number of challenges, in particular around the volume of data involved and the complexity (and thus the computing resources required) of the models. Our IT infrastructure needs to support the diverse range of models we use to represent integrated infrastructure in the real world.

In the Shaping of the Sydney of Tomorrow project, which is typical of the sorts of models built at SMART, we need to run different scenarios with different random seeds, all at the same time, to keep our results timely. We use a central database server to store input data, and in particular configuration data for the different scenarios (including any random number seeds for the replicates), intermediate data that we want to keep a record of, and model outputs.
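As a concrete illustration, the scenario configuration could be laid out along the following lines. This is a minimal sketch only: it uses SQLite as a stand-in for the central database server, and the table and column names (scenarios, seed, params, status) are assumptions made for the example rather than our actual schema.

```python
import sqlite3

# Stand-in for the central database server; in practice this would be a
# connection to a shared server rather than a local file.
con = sqlite3.connect("scenarios.db")

# One row per scenario replicate: the scenario parameters plus the random
# seed that makes that replicate reproducible.
con.execute("""
    CREATE TABLE IF NOT EXISTS scenarios (
        id     INTEGER PRIMARY KEY,
        name   TEXT NOT NULL,        -- scenario label, e.g. 'high-growth'
        seed   INTEGER NOT NULL,     -- random number seed for this replicate
        params TEXT NOT NULL,        -- scenario configuration as a JSON blob
        status TEXT NOT NULL DEFAULT 'pending'  -- pending / running / done
    )
""")

# Register, say, ten replicates of one scenario, each with its own seed.
for replicate in range(10):
    con.execute(
        "INSERT INTO scenarios (name, seed, params) VALUES (?, ?, ?)",
        ("high-growth", 1000 + replicate, '{"population_growth": 0.02}'),
    )
con.commit()
```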

Having a centralised database server lets us quickly spin up multiple instances of the model from the same virtual machine image; each instance in turn loads a different scenario, as managed through the input database. The central database then captures all of the output data (allowing each VM instance to be small), ready for collection by our dashboard data visualisation and analytics system.
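The worker loop on each VM instance might then look something like the sketch below. It reuses the hypothetical schema above; the claim-then-run pattern (atomically marking one pending scenario as running before executing it) is an assumption about how such a worker could coordinate, not a description of our exact implementation.

```python
import json
import random
import sqlite3

def claim_next_scenario(con):
    """Atomically claim one pending scenario so no two instances run the same one."""
    con.execute("BEGIN IMMEDIATE")  # take a write lock before selecting
    row = con.execute(
        "SELECT id, name, seed, params FROM scenarios "
        "WHERE status = 'pending' LIMIT 1"
    ).fetchone()
    if row is not None:
        con.execute("UPDATE scenarios SET status = 'running' WHERE id = ?", (row[0],))
    con.commit()
    return row

def run_model(seed, params):
    """Placeholder for the actual simulation model."""
    rng = random.Random(seed)  # seeded so the replicate is reproducible
    return {"indicator": rng.random() * params.get("population_growth", 0.0)}

con = sqlite3.connect("scenarios.db")
con.execute("""
    CREATE TABLE IF NOT EXISTS outputs (
        scenario_id INTEGER REFERENCES scenarios(id),
        results     TEXT
    )
""")

# Claim a scenario, run it, write the outputs back; repeat until none are left.
while (scenario := claim_next_scenario(con)) is not None:
    sid, name, seed, params = scenario
    results = run_model(seed, json.loads(params))
    con.execute(
        "INSERT INTO outputs (scenario_id, results) VALUES (?, ?)",
        (sid, json.dumps(results)),
    )
    con.execute("UPDATE scenarios SET status = 'done' WHERE id = ?", (sid,))
    con.commit()
```

With this pattern the VM image itself stays identical for every instance; which scenario a given instance runs is decided entirely by the row it claims from the input database.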

SMART Infrastructure Facility has recently installed a large VPS (virtual private server) with 512 cores, 2 TB of RAM (4 GB per core), and 30 TB of hard disk space to support scalability of scenario runs across our range of models. We will also have access to an upgraded HPC facility, with 1408 cores, 5.5 TB of RAM, and 130 TB of disk space, for running very large scale models, such as an expanded version of the Sydney model covering all of Sydney rather than just one region at a time.
