Scalability of large neural network simulations via activity tracking with time asynchrony and procedural connectivity
We present a new algorithm, based on a random model, for efficiently simulating large brain neuronal networks. The model parameters (mean firing rate, number of neurons, synaptic connection probability, and postsynaptic duration) are easy to calibrate on real experimental data. Under the time-asynchrony assumption, both the computational and memory complexities are proved to be theoretically linear in the number of neurons. These results are experimentally validated by sequential simulations of millions of neurons and billions of synapses in a few minutes on a single-processor desktop computer.
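The procedural-connectivity idea named in the title can be illustrated with a minimal sketch: rather than storing an O(n²) connectivity matrix, each neuron's outgoing synapses are regenerated on demand from a deterministic random draw, so memory stays linear in the number of neurons. This is a hedged illustration of the general technique, not the paper's implementation; the function name `targets` and the use of NumPy are our own choices.

```python
import numpy as np

def targets(pre: int, n_neurons: int, p: float, seed: int = 0) -> np.ndarray:
    """Procedurally regenerate the postsynaptic targets of neuron `pre`.

    Seeding the generator with (seed, pre) makes the draw deterministic:
    the same neuron always yields the same set of targets, so no
    connectivity needs to be stored between calls.
    """
    rng = np.random.default_rng((seed, pre))
    # One Bernoulli(p) draw per potential postsynaptic neuron.
    return np.flatnonzero(rng.random(n_neurons) < p)

# The same presynaptic neuron always regenerates identical targets:
t1 = targets(42, 1000, 0.05)
t2 = targets(42, 1000, 0.05)
assert np.array_equal(t1, t2)
```

Only the per-neuron state (index and shared seed) is kept in memory; the synapse list is recomputed whenever a neuron spikes, trading a small amount of computation for a connectivity storage cost that no longer grows with the number of synapses.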