I had a brief meeting with my supervisor, Ruth Falconer, this week so that she could point me in the direction of relevant resources. It was a very productive session and I came away with a book and a master’s dissertation.
The book was ‘Large-Scale Computing Techniques for Complex System Simulations’, a collection of essays covering a broad range of topics around simulation. Of particular interest was an essay analysing the performance of a distributed cloud framework at different scales. This essay challenged my assumption that the application I would be creating would be a real-time simulation. I initially made this assumption because I am used to working in a games environment where everything runs in real time; however, whilst reading this essay I realised that a simulation may need to run faster than real time. It also introduced me to the concept of stochasticity: the inherent randomness in a process. Because a stochastic simulation produces different results on each run, a thorough statistical analysis of many runs should be performed, so it would be ideal for the simulation to run much faster than real time, allowing more data to be gathered in a shorter space of time.
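To make the idea concrete for myself, here is a rough sketch of what batching up stochastic runs might look like (the `run_simulation` function below is just a placeholder random walk I invented, not a real model): each run is seeded differently, executed as fast as the hardware allows rather than in real time, and the outcomes are then summarised statistically.

```python
import random
import statistics

def run_simulation(seed: int, steps: int = 1000) -> float:
    """Hypothetical stand-in for one stochastic simulation run.

    Here it is just a seeded random walk; a real model would advance
    its agents as fast as possible rather than in real time.
    """
    rng = random.Random(seed)
    position = 0.0
    for _ in range(steps):
        position += rng.uniform(-1.0, 1.0)
    return position

# Repeat the simulation many times with different seeds and
# summarise the spread of outcomes statistically.
results = [run_simulation(seed) for seed in range(100)]
print(f"mean outcome: {statistics.mean(results):.3f}")
print(f"standard deviation: {statistics.stdev(results):.3f}")
```

In practice each run would be a full pass of the agent-based model, and the statistics would cover whatever output measures the analysis actually cares about.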
The master’s dissertation was also interesting, covering a GPGPU agent-based simulation. It took a simple model based on the electrostatic forces between cells, which an earlier study had created, and showed that the simulation ran significantly faster on a GPU than on a CPU. This was helpful in giving me an idea of the scope of the simulation model I would be creating.