Eric Yttri, CMU
Abstract:
Interacting with the world around you requires a complex set of decisions that select from a wide array of behaviors. However, the rich diversity of these behaviors has all but precluded them from study – with the field relying upon studying a handful of simple, arbitrary, and over-trained actions. To open the door to understanding the naturalistic behaviors we produce in our daily lives, our lab developed a suite of machine learning tools (A-SOiD/B-SOiD) to identify and quantify the breadth of behavior in species ranging from flies to humans. In one of several use-cases, we monitored mouse behavior continuously over the course of many days, yielding ~1 million action bouts. We simultaneously recorded from neurons across motor cortex and the basal ganglia, an area involved in decision-making and Parkinson's. Surprisingly, each of the 27 algorithm-discovered behaviors exhibited a robust neural signature. Although these neural hallmarks of spontaneous behavior were present across brain areas, we observed vast differences in the sparsity and dimensionality of representation as it passed through these areas. Finally, using a simple brain-computer interface, we were able to robustly predict the behavior being performed at any given time, demonstrating the strength of the neurobehavioral relationships. These results, including the differing levels of complexity underlying the representation of behavioral features, shed new light on how the coordinated, synergistic translation of information between areas orchestrates complex behaviors.
Bio:
Eric Yttri (pronounced "it-tree") is the Eberly Family Associate Professor of Biological Sciences. He seeks to understand the multi-scale brain interactions underlying movement decisions. His lab creates a partnership between technique development and informed experimentation to establish a blueprint for the circuit mechanisms of movement. After receiving his PhD working with monkeys at Washington University in St. Louis, he moved to HHMI's Janelia Research Campus to begin his exploration of cortical-subcortical dynamics in mice. Since joining the faculty at Carnegie Mellon University, he has earned several awards and was named chair of the Allen Institute Next Generation Leaders Council. He is also an active proponent of diversity and equity initiatives in his lab and across the field.
Summary:
Representative problems:
Diagnose & characterize the symptoms of Parkinson's at home throughout the day
Medical decisions by army medics in the heat of battle
Understanding sign language
Identification of pathological behavior that requires brain stimulation
Current understanding of how the brain controls movement comes from very simple, arbitrary tasks; we need to generalize to more flexible, natural behavior
Common objective: identify motion from video
Computer vision is very helpful with position detection
However, behavior is much more complex
Depends on angle, speed, joint rotation, etc. (see feature sketch below)
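A minimal sketch of what such pose-derived features can look like, given keypoints from any off-the-shelf pose estimator; the function name, array shapes, and feature choices are illustrative assumptions, not the actual B-SOiD feature set:

```python
import numpy as np

def pose_features(xy, fps=60):
    """Toy spatiotemporal features from pose keypoints.

    xy: array of shape (frames, joints, 2) from a pose estimator.
    Returns per-frame joint speeds plus one example joint angle.
    (Illustrative only; not the published B-SOiD feature set.)
    """
    # Per-joint speed: frame-to-frame displacement times frame rate
    disp = np.diff(xy, axis=0)                   # (frames-1, joints, 2)
    speed = np.linalg.norm(disp, axis=2) * fps   # pixels per second

    # Example joint angle at joint 1, formed by joints 0-1-2
    v1 = xy[1:, 0] - xy[1:, 1]
    v2 = xy[1:, 2] - xy[1:, 1]
    cos = np.sum(v1 * v2, axis=1) / (
        np.linalg.norm(v1, axis=1) * np.linalg.norm(v2, axis=1) + 1e-9)
    angle = np.arccos(np.clip(cos, -1.0, 1.0))   # radians

    return np.hstack([speed, angle[:, None]])
```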
B-SOiD: package that adds behavior interpretability to pose estimation using spatiotemporal statistics
Application for pose estimation
Unsupervised: discovers patterns
Humans label behavioral groups with meaningful labels
Used UMAP, but t-SNE and others also work
Uses a 2D camera to collect data, then clusters pose feature vectors
<5ms detection latency
Robust to missing data
Generalizes across individuals (see clustering sketch below)
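A minimal sketch of the embed-then-cluster step, in the spirit of B-SOiD, using the umap-learn and hdbscan packages; the parameter values are illustrative assumptions, not the package's defaults:

```python
import umap
import hdbscan

def discover_behaviors(features):
    """features: (frames, n_features) array of pose statistics.

    Embeds feature vectors with UMAP, then finds behavior
    clusters with HDBSCAN (a sketch, not the B-SOiD API).
    """
    embedding = umap.UMAP(n_neighbors=30, min_dist=0.0,
                          n_components=2).fit_transform(features)
    # HDBSCAN assigns -1 to "noise" frames that fit no cluster
    labels = hdbscan.HDBSCAN(min_cluster_size=100).fit_predict(embedding)
    return embedding, labels
```

Humans then attach meaningful names to the resulting cluster labels, as noted above.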
Application to sign language detection
Sign language uses fingers to spell out words
B-SOiD can identify different letters and decode a simplified subset of the language
Application: discovery of chronic pain
Images of mice in pain
Can separate different types of pain based on how the animal holds its foot
Expanding to OCD, stroke, Parkinson’s
Application: action analysis
Looking at a rat's hand motions as it reaches for a sugar pellet
Can predict, from the hand motion alone, whether the reaching action will succeed
Also, can model the gesture-to-gesture transition process and how it changes during training (see transition sketch below)
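A minimal sketch of one way to model those transitions, assuming a first-order Markov chain over the discovered behavior labels (the Markov framing is my assumption, not necessarily the lab's exact model):

```python
import numpy as np

def transition_matrix(labels, n_states):
    """First-order Markov estimate: P[i, j] = P(next=j | current=i).

    labels: 1-D sequence of integer behavior labels over time.
    """
    counts = np.zeros((n_states, n_states))
    for a, b in zip(labels[:-1], labels[1:]):
        counts[a, b] += 1
    row_sums = counts.sum(axis=1, keepdims=True)
    return counts / np.maximum(row_sums, 1)  # avoid divide-by-zero

# Comparing matrices from early vs. late training sessions would show
# how the transition structure changes with learning.
```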
Unsupervised vs Supervised learning
Unsupervised: easy but limited in complexity of discovered patterns
Supervised: needs significantly more data but can capture more complexity
Goal: identify user-defined behaviors
A-SOiD: active learning for targeted behavioral extraction
Humans classify ~1% of the dataset
Algorithm:
Labels the rest of the data, with confidence scores
Asks for additional human labels on low-confidence data points (see sketch after this list)
Automatically adjusts for data imbalance (some behaviors are much more common than others)
Integrated with unsupervised detection (B-SOiD) to find new behaviors outside the label set
Was able to detect major mouse behaviors (attack, mount, investigation) from just 800 labeled frames
Detected social interactions between pairs of dogs
Detected a range of human behaviors (individual and multi-person), such as checking the time, pushing another person, and shaking hands
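A minimal sketch of the confidence-based query loop described above, using scikit-learn; this is not A-SOiD's actual code, and the confidence threshold and class weighting are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def active_learning_round(X, y_partial, threshold=0.8):
    """One query round: train on labeled frames, flag low-confidence ones.

    X: (frames, features); y_partial: labels, with -1 where unlabeled.
    Returns indices of frames the human should label next.
    """
    labeled = y_partial != -1
    # class_weight="balanced" counters behavior-frequency imbalance
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced")
    clf.fit(X[labeled], y_partial[labeled])

    proba = clf.predict_proba(X[~labeled])
    confidence = proba.max(axis=1)
    unlabeled_idx = np.where(~labeled)[0]
    return unlabeled_idx[confidence < threshold]  # query these frames
```

Repeating this round until few low-confidence frames remain is what lets ~1% of hand labels cover the full dataset.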
Relating behavior to the brain
How does the brain orchestrate behavior?
Data:
Recordings from neural electrodes spanning all layers of motor cortex + other reward and movement-control centers
Video from 4 cameras at 60fps
Neural activity + behavior + 3D kinematics (3TB/day)
Pattern of neural activation is closely related to the phenomenological behavior labels
Currently analyzing it for actionable patterns
Application for medical treatment: control robot arms using thoughts
Currently slow and task-specific
Using a simple random forest classifier on the number of spikes from each neuron, can detect the ongoing behavior
60% accuracy with this simple, low-latency algorithm (see decoder sketch below)
Can predict upcoming motions ~600ms before a behavior is executed
Potential application: can turn on deep brain stimulation for Parkinson's patients only when needed
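A minimal sketch of a spike-count behavior decoder of the kind described, using scikit-learn; the bin structure, names, and the closed-loop trigger are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

def fit_behavior_decoder(spike_counts, behavior_labels):
    """spike_counts: (time_bins, n_neurons) spikes per neuron per bin.
    behavior_labels: behavior cluster active in each time bin.
    """
    # shuffle=False keeps train/test split contiguous in time,
    # avoiding leakage between adjacent bins
    X_tr, X_te, y_tr, y_te = train_test_split(
        spike_counts, behavior_labels, test_size=0.2, shuffle=False)
    clf = RandomForestClassifier(n_estimators=100)
    clf.fit(X_tr, y_tr)
    print("held-out accuracy:", clf.score(X_te, y_te))
    return clf

# Closed-loop idea from the talk (hypothetical device calls):
# run the decoder on streaming bins and enable stimulation only
# when a pathological state is decoded.
# for counts in stream:
#     if clf.predict(counts[None, :])[0] == PATHOLOGICAL_STATE:
#         trigger_stimulation()
```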