Neural and Behavioral Dynamics of a Social Interaction
Mala Murthy, Ph.D., Professor, Princeton Neuroscience Institute, Princeton University, Princeton, New Jersey

Abstract:
Social interactions require continually adjusting behavior in response to sensory feedback. For example, when having a conversation, sensory cues from our partner (e.g., sounds or facial expressions) affect our speech patterns in real time. Our speech signals, in turn, are the sensory cues that modify our partner’s actions. What are the underlying computations and neural mechanisms that govern these interactions? To address these questions, my lab focuses on the acoustic communication system of Drosophila. To our advantage, the fly nervous system is relatively simple, with a wealth of neural circuit tools to interrogate it. Importantly, Drosophila acoustic behaviors are highly quantifiable and robust. During courtship, males produce time-varying songs via wing vibration, while females arbitrate mating decisions. We have discovered that male song patterns are continually sculpted by internal state dynamics and interactions with the female, over timescales ranging from tens of milliseconds to minutes. On the listener side, we have found that courtship song representations are widespread throughout the brain, but that subsets of neurons are critical for extracting complex song features and driving responsive behaviors. I will discuss our work to map the circuits and computations underlying both song production and perception, and explain how our focus on natural acoustic signals provides a powerful, quantitative handle for studying the basic building blocks of communication.

Location: The Neuroscience Research Building (NRB) 1st Floor Auditorium

Date: Tuesday, February 25, 2020

Time: 12:00pm

Host: Jeff Donlea