A team from the University of Michigan has developed a new software tool to help researchers across the life sciences more efficiently analyze animal behaviors.
The open-source software, LabGym, capitalizes on artificial intelligence to identify, categorize and count defined behaviors across various animal model systems.
Scientists need to measure animal behaviors for a variety of reasons, from understanding all the ways a particular drug may affect an organism to mapping how circuits in the brain communicate to produce a particular behavior.
Researchers in the lab of U-M faculty member Bing Ye, for example, analyze movements and behaviors in Drosophila melanogaster, or fruit flies, as a model to study the development and functions of the nervous system. Because fruit flies and humans share many genes, these studies often offer insights into human health and disease.
“Behavior is a function of the brain. So analyzing animal behavior provides essential information about how the brain works and how it changes in response to disease,” said Yujia Hu, a neuroscientist in Ye’s lab at the U-M Life Sciences Institute and lead author of a Feb. 24 Cell Reports Methods study describing the new software.
But identifying and counting animal behaviors manually is time-consuming, and the results depend heavily on the judgment of the researcher doing the analysis. And while a few software programs exist to automatically quantify animal behaviors, they come with challenges of their own.
“Many of these behavior analysis programs are based on pre-set definitions of a behavior. If a Drosophila larva rolls 360 degrees, for example, some programs will count a roll. But why isn’t 270 degrees also a roll? Many programs don’t necessarily have the flexibility to count that, without the user knowing how to recode the program.”
Bing Ye, Professor, Cell and Developmental Biology, University of Michigan
Thinking more like a scientist
To overcome these challenges, Hu and his colleagues decided to design a new program that more closely replicates the human cognition process, one that “thinks” more like a scientist would, and is more user-friendly for biologists who may not have expertise in coding. Using LabGym, researchers can input examples of the behavior they want to analyze and teach the software what it should count. The program then uses deep learning to improve its ability to recognize and quantify the behavior.
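The workflow described above, in which a user supplies labeled examples and a deep network learns to recognize them, can be sketched in a few lines. The snippet below illustrates that general idea only, not LabGym’s own code: the behavior names, data handling and tiny network are placeholder assumptions, and each example is reduced to a single small grayscale image for simplicity.

# Illustrative sketch of the "teach by example" idea described above.
# Not LabGym's implementation: behavior names, data shapes and the tiny
# network are placeholder assumptions for demonstration only.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def build_behavior_classifier(n_behaviors, image_size=64):
    """A small CNN mapping one grayscale example image to a behavior label."""
    return tf.keras.Sequential([
        layers.Input(shape=(image_size, image_size, 1)),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(n_behaviors, activation="softmax"),
    ])

# Hypothetical user-labeled examples: one array of example images per behavior.
# examples = {"roll": roll_images, "crawl": crawl_images}  # supplied by the user
# names = sorted(examples)
# x = np.concatenate([examples[n] for n in names])[..., None] / 255.0
# y = np.concatenate([np.full(len(examples[n]), i) for i, n in enumerate(names)])
# model = build_behavior_classifier(n_behaviors=len(names))
# model.compile(optimizer="adam",
#               loss="sparse_categorical_crossentropy", metrics=["accuracy"])
# model.fit(x, y, epochs=10)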
One new development in LabGym that helps it apply this more flexible cognition is the use of both video data and a so-called “pattern image” to improve the program’s reliability. Scientists use videos of animals to analyze their behavior, but videos involve time series data that can be challenging for AI programs to analyze.
To help the program identify behaviors more easily, Hu created a still image that shows the pattern of the animal’s movement by merging outlines of the animal’s position at different timepoints. The team found that combining the video data with the pattern images increased the program’s accuracy in recognizing behavior types.
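As a rough sketch of how such a pattern image could be constructed, the snippet below outlines the animal in each frame and overlays those outlines into one still image, with brightness encoding time. It illustrates the concept only, not LabGym’s implementation, and it assumes a single dark animal filmed against a light, static background; the file names are hypothetical.

# Minimal sketch of the pattern-image idea: merge the animal's outline from
# successive frames into one still image, with brightness encoding time.
# Conceptual illustration only, not LabGym's code; assumes a single dark
# animal on a light, static background.
import cv2
import numpy as np

def pattern_image(frames):
    """Overlay per-frame outlines; later timepoints are drawn brighter."""
    h, w = frames[0].shape[:2]
    pattern = np.zeros((h, w), dtype=np.uint8)
    for t, frame in enumerate(frames):
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Separate the (dark) animal from the (light) background.
        _, mask = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        intensity = int(255 * (t + 1) / len(frames))  # dim early, bright late
        cv2.drawContours(pattern, contours, -1, intensity, thickness=1)
    return pattern

# Usage with a hypothetical short clip:
# cap = cv2.VideoCapture("larva_roll_clip.avi")
# frames = []
# while True:
#     ok, frame = cap.read()
#     if not ok:
#         break
#     frames.append(frame)
# cv2.imwrite("pattern.png", pattern_image(frames))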
LabGym is also designed to overlook irrelevant background information and consider both the animal’s overall movement and the changes in position over space and time, much as a human researcher would. The program can also track multiple animals simultaneously.
Species flexibility improves utility
Another key feature of LabGym is its species flexibility, Ye said. While it was designed using Drosophila, it is not restricted to any one species.
“That’s actually rare,” he said. “It’s written for biologists, so they can adapt it to the species and the behavior they want to study without needing any programming skills or high-powered computing.”
After hearing a presentation about the program’s early development, U-M pharmacologist Carrie Ferrario offered to help Ye and his team test and refine the program in the rodent model system she works with.
Ferrario, an associate professor of pharmacology and adjunct associate professor of psychology, studies the neural mechanisms that contribute to addiction and obesity, using rats as a model system. To complete the necessary observation of drug-induced behaviors in the animals, she and her lab members have had to rely largely on hand-scoring, which is subjective and extremely time-consuming.
“I’ve been trying to solve this problem since graduate school, and the technology just wasn’t there, in terms of artificial intelligence, deep learning and computation,” Ferrario said. “This program solved an existing problem for me, but it also has really broad utility. I see the potential for it to be useful in almost limitless conditions to analyze animal behavior.”
The team next plans to further refine the program to improve its performance under even more complex conditions, such as observing animals in nature.
Journal reference:
Hu, Y., et al. (2023) LabGym: Quantification of user-defined animal behaviors using learning-based holistic assessment. Cell Reports Methods. doi.org/10.1016/j.crmeth.2023.100415.