Taming Chaos in the Brain!

Much of the brain’s computational power emerges from the dynamics of recurrently connected networks of neurons. While computationally powerful, such networks are prone to chaos: very small amounts of noise can dramatically alter their behavior. Laje and Buonomano show that chaos can be tamed in recurrent neural networks by appropriately tuning the weights of the synapses within the recurrent network. The result is a system that exhibits a “dynamic attractor.” For example, the pattern of activity produced by the network can be used to generate arbitrary outputs, such as writing the word “chaos”; once chaos has been “tamed,” if the activity is perturbed and kicked off its current trajectory, it can find its way back and complete the pattern, much as humans can when we produce complex motor patterns.