January 3, 2022 | by: admin

Tags: Artificial, categorizes, Intelligence, interactions, mice, Spectrum, system

Categories: autism

Artificial intelligence system categorizes interactions between mice | Spectrum


New software uses artificial intelligence to automatically identify and quantify certain types of social behavior from videos of mice interacting in a cage, even when the animals have cables implanted to monitor their brain activity.

The tool, called the Mouse Action Recognition System (MARS), could speed research into how genetic mutations associated with autism, or drug treatments, affect mouse behavior, says co-lead investigator Ann Kennedy, assistant professor of neuroscience at Northwestern University in Chicago, Illinois. It could also standardize how different laboratories characterize behaviors, because different researchers may identify the same behavior differently, she says.

MARS is part of a recent effort to develop software that can automatically analyze videos of mouse behavior. In most labs, researchers annotate behaviors by hand, which Kennedy says can take four to five hours for every hour of video. MARS can analyze an hour of video in just two to three hours, and it runs in the background, freeing researchers to do other tasks.

The software processes footage from an overhead monochrome video camera fitted with a lens that detects infrared wavelengths. (The experimental setup is lit only with red light, since mice are active at night.) It tracks seven key points on each rodent’s body to calculate the relative posture of the two mice. From this information, the software can determine whether the two animals are investigating, attacking or mounting each other, as long as the mice have different coat colors. The researchers described the system in eLife in November.
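To make the pose-to-behavior step concrete, here is a minimal sketch in Python of how relative-posture features could be derived from two animals’ tracked key points. The key-point names, the feature choices and the relative_posture_features function are illustrative assumptions, not MARS’s actual feature set, which is defined in the project’s own code.

```python
import numpy as np

# Hypothetical key-point order; the real MARS key points and features
# are defined in its GitHub repository.
KEYPOINTS = ["nose", "left_ear", "right_ear", "neck",
             "left_hip", "right_hip", "tail_base"]

def relative_posture_features(resident, intruder):
    """Compute a few illustrative pose features for one video frame.

    resident, intruder: (7, 2) arrays of (x, y) key-point coordinates,
    one row per key point in KEYPOINTS order.
    """
    res_nose = resident[KEYPOINTS.index("nose")]
    int_nose = intruder[KEYPOINTS.index("nose")]
    int_tail = intruder[KEYPOINTS.index("tail_base")]

    # Distance from the resident's nose to the intruder's nose and tail base:
    # a rough proxy for face- versus anogenital-directed investigation.
    nose_to_nose = np.linalg.norm(res_nose - int_nose)
    nose_to_tail = np.linalg.norm(res_nose - int_tail)

    # Alignment of the two body axes (neck to tail base), which helps
    # distinguish, say, mounting from head-to-head investigation.
    def body_axis(kp):
        return kp[KEYPOINTS.index("neck")] - kp[KEYPOINTS.index("tail_base")]

    a, b = body_axis(resident), body_axis(intruder)
    cos_angle = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

    return np.array([nose_to_nose, nose_to_tail, cos_angle])
```

Features like these, computed frame by frame, are what a downstream behavior classifier would consume.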

The researchers trained the software on nearly seven hours of video, including about four hours of mice subjected to a standard test in which a single resident mouse in a cage is exposed to an unfamiliar intruder mouse. In some recordings, at least one of the mice had a fiber-optic cable or miniature endoscope attached to its skull, devices often used to monitor or record the activity of neurons.

The team used the crowdsourcing service Amazon Mechanical Turk to recruit people to manually mark the animals’ body parts in each frame of the videos. Using these key points, the software learned to read the rodents’ poses. The researchers then ran MARS on an additional seven hours of unannotated footage and let the software infer the animals’ poses on its own.

Finally, the researchers fed MARS the same footage, more than 14 hours in total, this time with individual behaviors coded by one of the team’s researchers. MARS learned to translate the animals’ poses into specific interactions: mount, attack or examine.
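This final step amounts to supervised learning: per-frame pose features go in, and human-assigned behavior labels come out. The sketch below uses a generic scikit-learn gradient-boosting classifier on placeholder data to illustrate the idea; it is not MARS’s actual model, feature set or training code.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

# Placeholder data: per-frame pose features (inter-animal distances, angles,
# speeds, etc.) and one human-annotated behavior label per frame.
rng = np.random.default_rng(0)
features = rng.normal(size=(5000, 12))          # 5,000 frames, 12 features
labels = rng.choice(["other", "investigate", "attack", "mount"], size=5000)

X_train, X_test, y_train, y_test = train_test_split(
    features, labels, test_size=0.2, random_state=0)

# A generic gradient-boosted classifier stands in for MARS's actual model.
clf = GradientBoostingClassifier().fit(X_train, y_train)
print(classification_report(y_test, clf.predict(X_test)))
```

With real annotated footage in place of the placeholder arrays, the held-out report is the kind of accuracy comparison described in the next paragraph.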

Tested on approximately two hours of video footage of interactions between resident and intruder mice, MARS was as accurate as team members at identifying key points on the mice and at detecting attacks and investigations. It was only about 3 percentage points worse at identifying mounting.

The researchers also applied the software to 45 hours of video recordings of interactions involving mice with mutations in genes associated with autism: CHD8, CUL3 and NLGN3. The software confirmed previous results; for example, CHD8 mice showed more aggression than controls.

The software also found that BTBR mice, an asocial inbred strain that lacks a corpus callosum, spent less time examining intruder mice than control mice did, consistent with previous results. MARS was also able to identify which part of the intruder mouse the resident mouse was interacting with: BTBR mice spent less time inspecting the intruder’s face and genitals than controls did. The BTBR mice may be missing pheromone cues, Kennedy says.

The software includes a user interface called BENTO, which lets researchers synchronize MARS-processed video with other types of data captured during the rodents’ interactions, such as neural activity and audio. This feature revealed that a subset of 28 neurons in the hypothalamus of a male mouse became active only during the first moments in which the mouse mounted a female intruder. The BENTO interface enables researchers to spot significant moments in mouse behavior that they might otherwise miss, Kennedy says.
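The kind of analysis this alignment enables can be sketched as follows: given frame-aligned neural traces and behavior annotations, look for neurons whose activity rises just after a behavior begins. The function name, window size and threshold here are illustrative assumptions, not the authors’ analysis code.

```python
import numpy as np

def neurons_active_at_onset(neural, behavior, window=30, threshold=2.0):
    """Flag neurons whose activity rises just after behavior onsets.

    neural:   (n_neurons, n_frames) array of activity traces, frame-aligned
              with the behavior annotations (the alignment BENTO-style tools provide).
    behavior: (n_frames,) boolean array, True on frames labeled with the behavior.
    window:   number of frames after each onset to average over.
    """
    # Frames where the behavior switches from absent to present.
    onsets = np.flatnonzero(np.diff(behavior.astype(int)) == 1) + 1

    # z-score each neuron's trace so responses are comparable across neurons.
    z = (neural - neural.mean(axis=1, keepdims=True)) / \
        (neural.std(axis=1, keepdims=True) + 1e-9)

    # Average z-scored activity in the window following each onset.
    onset_response = np.mean(
        [z[:, t:t + window].mean(axis=1) for t in onsets if t + window <= z.shape[1]],
        axis=0)
    return np.flatnonzero(onset_response > threshold)
```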

The team found that human annotators identify behaviors differently, which could cause problems when comparing results across laboratories, Kennedy says. Using software like MARS creates a standard so that “you can make a real apples-to-apples comparison between different groups,” she says.

The software is available for download on GitHub. Labs can run the trained software as-is or retrain it on new key points and behaviors. Kennedy and her colleagues next plan to explore software that uses “unsupervised learning” to create its own classifications from raw data, rather than being trained on human annotations.
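As a rough illustration of what such an unsupervised approach could look like, the sketch below clusters per-frame pose features without any human labels, so each cluster becomes a candidate behavior category for a researcher to inspect. The clustering method and feature set are assumptions for illustration only; the article does not describe the team’s planned technique.

```python
import numpy as np
from sklearn.cluster import KMeans

# Illustration only: unsupervised grouping of per-frame pose features,
# with no human-annotated behavior labels involved.
rng = np.random.default_rng(1)
pose_features = rng.normal(size=(10000, 12))   # placeholder per-frame features

clusters = KMeans(n_clusters=8, n_init=10, random_state=0).fit_predict(pose_features)

# Each cluster is a candidate behavior "motif" to be reviewed against the video.
print(np.bincount(clusters))
```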

Cite this article: https://doi.org/10.53053/NNFZ8503
