Degree Year

2015

Document Type

Thesis - Open Access

Degree Name

Bachelor of Arts

Department

Mathematics

Advisor(s)

Jim Walsh

Keywords

Mathematics, Neuroscience, Computational neuroscience, Chaos, Learning, Lyapunov, Dynamics, Dynamical systems, Kolmogorov-Sinai entropy, Hopfield network

Abstract

We study a family of discrete-time recurrent neural network models in which the synaptic connectivity changes slowly relative to the neuronal dynamics. The fast (neuronal) dynamics of these models display a wealth of behaviors, ranging from simple convergence and oscillation to chaos. The addition of slow (synaptic) dynamics, which mimic the biological mechanisms of learning and memory, induces complex multiscale dynamics that render rigorous analysis quite difficult. Nevertheless, we prove a general result on the interplay of these two dynamical timescales, demarcating a regime of parameter space within which a broad class of learning rules induces a gradual dampening of chaotic neuronal behavior.
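To illustrate the kind of two-timescale system the abstract describes, the following is a minimal sketch in Python, assuming a Hopfield-style fast map x -> tanh(g W x) interleaved with a Hebbian-style slow synaptic update applied at a small rate eps. The particular gain g, the learning rule, and the Lyapunov-exponent estimator are illustrative assumptions, not the specific model or proof technique of the thesis.

import numpy as np

rng = np.random.default_rng(0)

N = 100      # number of neurons (assumed size)
g = 2.5      # gain, chosen large enough that the fast map is chaotic (assumed)
eps = 1e-3   # small rate separating slow synaptic from fast neuronal updates (assumed)
T = 20000    # number of fast time steps

# Random connectivity scaled by 1/sqrt(N); the initial state is random in [-1, 1]^N.
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
x = rng.uniform(-1.0, 1.0, size=N)

def fast_step(x, W):
    """One step of the fast (neuronal) dynamics."""
    return np.tanh(g * W @ x)

def slow_step(W, post, pre):
    """One step of the slow (synaptic) dynamics: a Hebbian rule with decay,
    applied at the small rate eps (an illustrative choice of learning rule)."""
    return W + eps * (np.outer(post, pre) - W)

# Track the largest Lyapunov exponent of the fast dynamics along the trajectory
# by evolving a tiny perturbation and renormalizing it at every step.
delta = 1e-8
v = rng.normal(size=N)
v *= delta / np.linalg.norm(v)
log_growth = 0.0

for t in range(T):
    pre = x
    x_pert = fast_step(x + v, W)     # perturbed neuronal state
    x = fast_step(x, W)              # reference neuronal state
    W = slow_step(W, x, pre)         # slow synaptic update

    d = max(np.linalg.norm(x_pert - x), 1e-300)  # floor avoids log(0) if orbits coincide
    log_growth += np.log(d / delta)
    v = (x_pert - x) * (delta / d)   # renormalize the perturbation

    if (t + 1) % 5000 == 0:
        print(f"step {t + 1}: running Lyapunov estimate {log_growth / (t + 1):.4f}")

In a sketch like this, shrinking eps is what separates the synaptic timescale from the neuronal one, and the running Lyapunov estimate gives one rough way to watch whether chaos persists or gradually dampens as the weights adapt.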
