A core goal of neuroscience is to understand how the brain adaptively orchestrates movements to execute complex behaviors. Quantifying behavioral dynamics, however, has historically been prohibitively laborious or technically intractable, particularly for the unconstrained and naturalistic behaviors that the brain evolved to produce. Driven by advances in computer vision and deep learning, new methods are being developed to overcome these limitations and enable precise, automated quantification of behavior from conventional video across species and experimental settings. In this talk we will: introduce the problem of pose tracking for behavioral quantification; show how deep learning can be employed to achieve markerless motion capture; and highlight examples of how our work on making this technology accessible through open-source tools like SLEAP (sleap.ai) is enabling studies across domains and application areas, ranging from social and motor neuroscience in flies, rodents, and primates, to ecology, digital humanities, and even plant biology in efforts to tackle climate change. We will conclude with preliminary results from a large-scale home-cage phenotyping project designed to detect prodromal markers of disease.
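To give a concrete sense of the workflow the talk will cover, the sketch below shows how markerless pose inference might look using SLEAP's high-level Python API. The video filename and model directories are placeholders, and a trained model is assumed to already exist; this is a minimal illustration, not the specific pipeline presented in the talk.

```python
# Minimal sketch of markerless pose inference with SLEAP's Python API.
# Paths are placeholders; a trained top-down model pair is assumed.
import sleap

# Load a conventional video recording (hypothetical filename).
video = sleap.load_video("mouse_homecage.mp4")

# Load a trained model (centroid + centered-instance stages for
# multi-animal top-down inference; placeholder directories).
predictor = sleap.load_model([
    "models/centroid",
    "models/centered_instance",
])

# Run inference; returns a Labels object with per-frame pose predictions.
labels = predictor.predict(video)

# Export poses as a numeric array of shape (frames, tracks, nodes, xy)
# for downstream behavioral analysis, and save the predictions.
poses = labels.numpy()
labels.save("predictions.slp")
```

The resulting pose trajectories are the raw material for downstream behavioral quantification, for example unsupervised behavioral segmentation with tools such as Keypoint-MoSeq, as noted in the learning objectives below.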
Learning objectives:
1. Summarize deep learning technologies for markerless motion capture in animals.
2. Review open-source software tools that can be used for behavioral analysis (SLEAP and Keypoint-MoSeq).
3. Identify how behavioral quantification technologies can enable the discovery of behavioral biomarkers of neurodegenerative disease progression.