Bachelor of Arts
Mathematics, Neuroscience, Computational neuroscience, Chaos, Learning, Lyapunov, Dynamics, Dynamical systems, Kolmogorov-Sinai entropy, Hopfield network
We study a family of discrete-time recurrent neural network models in which the synaptic connectivity changes slowly with respect to the neuronal dynamics. The fast (neuronal) dynamics of these models display a wealth of behaviors, ranging from simple convergence and oscillation to chaos; adding slow (synaptic) dynamics that mimic the biological mechanisms of learning and memory induces complex multiscale dynamics that render rigorous analysis quite difficult. Nevertheless, we prove a general result on the interplay of these two dynamical timescales, demarcating a regime of parameter space within which a broad class of learning rules induces a gradual dampening of chaotic neuronal behavior.
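The two-timescale structure described in the abstract can be illustrated with a minimal sketch: a discrete-time rate network whose state is updated at every step, coupled to a synaptic matrix that drifts slowly under a Hebbian-style rule with decay. This is a generic toy model, not the thesis's specific construction; the tanh nonlinearity, the gain g, the learning rate eps, and the particular learning rule are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 50       # number of neurons (illustrative)
g = 2.5      # synaptic gain; large gains favor chaotic fast dynamics
eps = 1e-3   # learning rate; eps << 1 makes synapses slow relative to neurons

# Random initial connectivity, scaled so the gain g controls the spectrum
W = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)

for t in range(5000):
    # Fast (neuronal) timescale: one discrete-time recurrent update per step
    x = np.tanh(W @ x)
    # Slow (synaptic) timescale: a Hebbian-style rule with decay, nudging W
    # toward the outer product of the current activity at rate eps
    W += eps * (np.outer(x, x) - W)
```

Because eps is small, W is nearly frozen on the timescale of the neuronal updates, which is the separation of timescales the abstract refers to; the decay term in the learning rule slowly contracts the connectivity, the kind of effect through which learning can dampen chaotic activity.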
Banks, Jess M., "Chaos and Learning in Discrete-Time Neural Networks" (2015). Honors Papers. 251.