Using Information Theory to Understand Neural Representation in the Auditory Cortex

Presenter: Hannah (Qiaochu) Cui — Mathematics

Faculty Mentor(s): James Murray, Christian Schmid

Session: (In-Person) Poster Presentation

Neurons in the brain face the challenge of representing sensory stimuli in a way that accurately encodes stimulus features while minimizing the effects of noise. This thesis uses mutual information, a concept from information theory that quantifies how much knowing one variable reduces uncertainty about another, to better understand neural coding in the auditory cortex. Previous work has applied mutual-information maximization to characterize neural response patterns in the visual cortex, but comparatively few such results exist for the auditory system. We perform numerical optimization in Python to maximize the information that a population of neurons carries about an auditory stimulus within the framework of information theory. We first find the optimal widths and locations of the tuning curves that characterize neural responses to a one-dimensional stimulus (sound frequency), then extend the optimization algorithm to two-dimensional stimuli (sound frequency and intensity). When we test the algorithm on natural sound data, our computations show that in the two-dimensional case, stimulus information is optimally represented by multiple populations of neurons that respond in qualitatively different ways to auditory stimulus features, rather than by a homogeneous population with uniform response properties. Our findings provide a method for better understanding neural representation in the auditory cortex.
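
For illustration, here is a minimal sketch of what such an optimization could look like in the one-dimensional case. It assumes Gaussian tuning curves, independent Gaussian response noise, and the Fisher-information approximation to mutual information (maximizing the stimulus-averaged log of the population Fisher information J(s), following Brunel and Nadal, 1998). The stimulus prior, parameter values, and objective below are illustrative assumptions, not the thesis's actual implementation.

```python
import numpy as np
from scipy.optimize import minimize

# One-dimensional stimulus grid (e.g., log sound frequency), with a
# non-uniform prior p(s) standing in for natural-sound statistics.
# The prior shape here is a hypothetical choice for illustration.
s = np.linspace(0.0, 1.0, 500)
p_s = np.exp(-((s - 0.4) ** 2) / (2 * 0.15 ** 2))
p_s /= np.trapz(p_s, s)

N = 10        # number of neurons (assumed)
SIGMA = 0.1   # response noise std, assumed Gaussian
R_MAX = 1.0   # peak firing rate, arbitrary units

def tuning_slopes(params):
    """Slopes f_i'(s) of Gaussian tuning curves with centers c, widths w."""
    c, log_w = params[:N], params[N:]
    w = np.exp(log_w)                       # optimize log-width so w > 0
    diff = s[None, :] - c[:, None]          # shape (N, n_grid)
    f = R_MAX * np.exp(-diff ** 2 / (2 * w[:, None] ** 2))
    return -f * diff / w[:, None] ** 2      # df_i/ds

def neg_info(params):
    """Negative Fisher-information lower bound on I(S; R)."""
    fprime = tuning_slopes(params)
    J = np.sum(fprime ** 2, axis=0) / SIGMA ** 2 + 1e-9  # Fisher info J(s)
    # I(S;R) ~ H(S) - (1/2) E_s[log(2*pi*e / J(s))]; H(S) and the 2*pi*e
    # term are constant in the parameters, so we maximize E_s[log J(s)].
    return -0.5 * np.trapz(p_s * np.log(J), s)

# Initialize with evenly spaced centers and equal widths, then optimize.
x0 = np.concatenate([np.linspace(0.1, 0.9, N), np.full(N, np.log(0.1))])
res = minimize(neg_info, x0, method="L-BFGS-B")
c_opt, w_opt = res.x[:N], np.exp(res.x[N:])
print("optimal centers:", np.round(np.sort(c_opt), 3))
print("optimal widths: ", np.round(w_opt[np.argsort(c_opt)], 3))
```

Under this objective, tuning-curve centers crowd into high-probability stimulus regions and widths shrink there, the usual efficient-coding intuition; the two-dimensional case proceeds analogously, with a center and width per stimulus dimension (frequency and intensity) for each neuron.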