
Table 1 Available tracking, pose-estimation, and classification programs for rodent behavioral analysis

From: Beyond the three-chamber test: toward a multimodal and objective assessment of social behavior in rodents

Columns: Name | Function | Number of subjects | General description and relevant features | Measured variables | Citation

Name: UMA tracker
Function: Tracking
Number of subjects: Multiple subjects (without identity preservation)
Description:
- An image-based tracking algorithm
- Supports multiple image-processing algorithms, letting the user choose the most suitable one
- Offers manual correction of tracking and trajectory-swapping errors
- Requires high-contrast arenas
Measured variables: Trajectory, interaction times in regions of interest (ROIs)
Citation: [132]

Name: Rodent arena tracker (RAT)
Function: Tracking
Number of subjects: Individual subjects
Description:
- Machine-vision tracking device that is inexpensive, has low battery demand, and does not require a tethered computer
- Requires a high-contrast arena
- Real-time online image processing
- Can be synchronized with other devices for pellet dispensing, optogenetic stimulation, etc.
Measured variables: Trajectory and speed
Citation: [133]

Name: MouseMove
Function: Tracking
Number of subjects: Individual subjects
Description:
- Centroid-based tracking software; as such, it provides no orientation-dependent information
- Requires high-contrast circular arenas
- Restricted to a video resolution of 320 × 240 [151]
- Batch-processing option [129]
Measured variables: Trajectory, distance traveled, speed, turning, and curvature in the entire arena or within a ROI
Citation: [134]
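The centroid-based tracking described in the entry above reduces, at its core, to thresholding a grayscale frame and averaging the coordinates of the animal's pixels. The sketch below illustrates that idea only; the toy frame and threshold value are assumptions, not taken from any of the cited tools.

```python
# Minimal sketch of centroid-based tracking on a high-contrast frame.
# The frame here is a toy grayscale image (list of pixel rows); in a
# real tracker, frames would come from a video decoder.

def centroid(frame, threshold=128):
    """Return the (row, col) centroid of all pixels darker than threshold.

    Assumes a dark animal on a bright background (high-contrast arena).
    Returns None when no pixel passes the threshold.
    """
    count = rsum = csum = 0
    for r, row in enumerate(frame):
        for c, value in enumerate(row):
            if value < threshold:  # pixel belongs to the animal
                count += 1
                rsum += r
                csum += c
    if count == 0:
        return None
    return (rsum / count, csum / count)

# Toy 4x4 frame: a 2x2 dark blob in the lower-right corner.
frame = [
    [255, 255, 255, 255],
    [255, 255, 255, 255],
    [255, 255,  10,  10],
    [255, 255,  10,  10],
]
print(centroid(frame))  # (2.5, 2.5)
```

Because only the blob's center of mass is kept per frame, orientation and posture are lost, which is exactly the limitation noted above.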

Name: Mouse tracking
Function: Tracking
Number of subjects: Individual subjects
Description:
- Neural-network-based tracker for long periods (days) in multiple, complex, and dynamic environments
- Option to train a new network suited to the user's needs with minimal training data (minimum of 2500 annotated images)
- Indifferent to coat color or animal size
Measured variables: Distance traveled, speed
Citation: [126]

Name: Automated rodent tracker (ART)
Function: Tracking
Number of subjects: Individual subjects
Description:
- A rule-based system for tracking a rodent's nose and body points with minimal user intervention
- Detects the orientation and head direction of subjects
- Requires high-contrast arenas
- Option for batch processing of multiple videos
Measured variables: Frequency of certain behaviors (exploration, moving forward, turning, interacting with a ROI), locomotion variables (speed, distance), and body-size estimation
Citation: [135]

Name: Janelia Automatic Animal Behavior Annotator (JAABA)
Function: Behavior classification
Number of subjects: Single or multiple subjects
Description:
- Machine-learning algorithm for neural-network-based behavioral classification from animal trajectories
- Users annotate a small set of video frames to create classifiers that detect behaviors of interest in screen-scale data sets
- Can operate on the output of multiple tracking systems (e.g., Ctrax, MoTr)
Citation: [127]

Name: MoTr
Function: Tracking
Number of subjects: Multiple subjects
Description:
- Software for long-duration tracking (days) in a home-cage environment, with identity preservation
- Identity is assigned by marking subjects with distinct bleach patterns that the tracking software detects and learns
- Detects the position and orientation of the animal from previous frames by applying an expectation–maximization algorithm
- Suitable for quantifying social behaviors with long-timescale dynamics (dominance, aggression, courtship)
Measured variables: Preferred location, preferred associates, following rate and duration, speed
Citation: [142]

Name: DeepLabCut
Function: Pose estimation
Description:
- Based on transfer learning of deep neural networks, requiring minimal training data [193] for classifier creation
- Can detect the pose, orientation, and posture changes of body parts of multiple freely interacting mice
- Option to retrain the network, fine-tuning it to a specific need/task by providing labeled data on annotated body-part locations
Measured variables: Trajectories, distance traveled, and location of annotated points
Citation: [149]

Name: MiceProfiler
Function: Tracking
Number of subjects: Two subjects
Description:
- Requires no specific tagging of subject animals; a physics engine maintains identity even after occlusions and with hidden body parts
- Can detect the orientation of the mouse's head (oral-oral, oral-genital, side-by-side interactions)
- Limited by its need for supervision and correction by an expert [143]
Measured variables:
- Frequency, duration, type (follow, escape, investigation), and temporal evolution of predetermined behavioral events
- Identity of the animal initiating an action (follower/leader) and the response of the other conspecific
Citation: [125]

Name: 3D-video-based computerized analysis of social and sexual interactions in rats
Function: Tracking
Number of subjects: Two subjects
Description:
- Detects behavioral events involving vertical posture changes (rearing, mounting): four depth cameras positioned at different viewpoints extract a 3D image of two freely moving rats
- The merged image is fitted to a "skeleton" model by a physics-based algorithm that estimates the locations of four body parts (head, neck, trunk, hip) and identifies their spatio-temporal patterns
- May need manual correction of identity swaps and dislocated body parts
Measured variables: Frequency, latency, and duration of dynamic behavioral events such as rearing, head-head/hip contact, approach, follow, and mount
Citation: [144]

Name: RFID-assisted SocialScan
Function: Tracking
Number of subjects: Multiple subjects
Description:
- A system for long-term tracking (days) in ethologically relevant environments, with identity preservation
- Identity preservation is obtained through radio-frequency identification: each subject is implanted with an RFID chip that transmits a unique radio frequency, detected by RFID antennas beneath the arena and synchronized with the video frames for identity assignment
- Option for the user to adjust or add detection parameters
- Animal identity is preserved even when out of frame, enabling the attachment of other components to the arena (nests and enrichments)
Measured variables: Specific social events (identified by built-in rules), such as approach, contact, follow, and leave, and locomotor activity within ROIs
Citation: [141]
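The RFID-to-video synchronization described above can be sketched as nearest-timestamp assignment: each RFID read is attributed to the video frame whose timestamp is closest. The frame rate, tag names, and read times below are illustrative assumptions, not values from any cited system.

```python
# Hedged sketch of RFID-to-video synchronization: assign each
# (timestamp, tag) RFID read to the index of the nearest video frame.
import bisect

def assign_reads_to_frames(frame_times, reads):
    """Map each (time, tag) RFID read to its nearest frame index.

    frame_times must be sorted ascending; returns {frame_index: [tags]}.
    """
    assigned = {}
    for t, tag in reads:
        i = bisect.bisect_left(frame_times, t)
        # Compare the neighbors on both sides of the insertion point.
        if i == 0:
            nearest = 0
        elif i == len(frame_times):
            nearest = len(frame_times) - 1
        else:
            nearest = i if frame_times[i] - t < t - frame_times[i - 1] else i - 1
        assigned.setdefault(nearest, []).append(tag)
    return assigned

# 30 fps video: frames at 0.000, 0.033, 0.067, 0.100, 0.133 s.
frame_times = [i / 30 for i in range(5)]
reads = [(0.031, "mouse_A"), (0.070, "mouse_B")]
print(assign_reads_to_frames(frame_times, reads))
```

In a real system, clock drift between the RFID reader and the camera would also have to be corrected before this matching step.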

Name: Autotyping
Function: Tracking
Number of subjects: Individual subjects
Description:
- A toolbox for locating subjects and measuring time spent in ROIs across multiple behavioral tasks, including open field, fear conditioning, elevated zero maze, Y/T-maze, spatial/novel object recognition, and the three-chamber task
- Requires high-contrast arenas
Measured variables: Interaction time/time spent in a given location, number of exploratory bouts, approach angle during interaction bouts, distance traveled
Citation: [137]

Name: ToxTrac and ToxId
Function: Tracking
Number of subjects: Multiple subjects
Description:
- Open-source software for image-based tracking, with high processing speed (over 25 frames per second), integrated distortion correction and camera calibration, and identity preservation of multiple subjects
- Can be used in multiple arenas
- The ToxId algorithm, implemented in ToxTrac, preserves the identities of multiple untagged animals by linking trajectory segments using their intensity and Hu moments, with no training, complex configuration steps, or access to past and future frames needed
Measured variables: Average speed, acceleration, distance traveled, time spent near/in a ROI
Citation: [140, 151]
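ToxId's segment linking boils down to matching appearance features of trajectory segments against known identities. The sketch below shows greedy nearest-feature matching with a stand-in 2-D feature vector; the features, names, and values are illustrative assumptions, not ToxId's actual intensity and Hu-moment descriptors.

```python
# Illustrative sketch of identity linking across trajectory segments:
# each new segment is assigned to the known identity with the closest
# appearance feature vector (squared Euclidean distance).

def link_segments(known, candidates):
    """Greedily match each candidate segment to the closest identity.

    known:      {identity: feature_vector}
    candidates: {segment_id: feature_vector}
    Returns {segment_id: identity}.
    """
    def dist2(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    links = {}
    for seg, feat in candidates.items():
        links[seg] = min(known, key=lambda ident: dist2(known[ident], feat))
    return links

# Hypothetical 2-D features (e.g., normalized mean intensity + one
# shape descriptor) for two known animals and two unlabeled segments.
known = {"animal_1": (0.80, 0.12), "animal_2": (0.35, 0.40)}
candidates = {"seg_a": (0.78, 0.15), "seg_b": (0.30, 0.45)}
print(link_segments(known, candidates))  # {'seg_a': 'animal_1', 'seg_b': 'animal_2'}
```

Because matching uses only per-segment appearance statistics, no training phase and no look-ahead over future frames are required, which is the property the table entry highlights.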

Name: Mouse action recognition system (MARS)
Function: Pose estimation and behavior classification
Number of subjects: Two subjects
Description:
- Automated, deep-learning-based pipeline and software tools for training and evaluating novel pose-estimation and behavior-classification models, and for joint visualization of neural and behavioral data
- Subjects need to have different coat colors (one black, one white)
- Includes three pre-trained supervised classifiers for detecting attack, mounting, and close-investigation events
- Option to train MARS pose and behavior models to create user-specific classifiers from manually annotated videos (minimum of 1500 annotated frames needed for training)
- Suitable for head-mounted animals with implants
- Includes an open-source interface, the Behavior Ensemble and Neural Trajectory Observatory (BENTO), for synchronous display, navigation, and analysis of behavior annotations, audio recordings, and recorded neural activity
Measured variables:
- Frequency of and time spent in specific behavioral events (attack, mounting, close investigation)
- Can detect orientation-sensitive behaviors such as face-/anogenital-directed sniffing
Citation: [128]

Name: TrackRodent
Function: Tracking
Number of subjects: Up to two subjects (without identity preservation)
Description:
- Suitable for rats and mice
- Requires high-contrast arenas
- Suitable for implanted animals
- Offers multiple tracking algorithms by species (mouse/rat), coat color, head- or body-based tracking, and head implantation
Measured variables: Total investigation time of a ROI, investigation bouts and their distribution, transitions between defined regions, investigation over time, and intervals between investigation bouts
Citation: [75]

Name: idtracker.ai
Function: Tracking
Number of subjects: Multiple subjects
Description:
- An algorithm and software for extracting the trajectories of freely moving, unmarked animals in high-contrast arenas
- Based on two convolutional networks: one detects events of animals touching or colliding, and one assigns identity to each detected animal by classification analysis
- Can detect multiple subjects of various species in different environments, but requires large training data to adapt to new animals and experimental settings
- Its ability to maintain the identities of multiple mice over long recordings, given their deformable geometric shape, is yet to be established [143]
- High computational demands [131]
Measured variables: Trajectories of detected animals
Citation: [148]

Name: Simple Behavioral Analysis (SimBA)
Function: Behavior classification
Number of subjects: Two subjects
Description:
- Open-source software that uses pose estimation to create supervised machine-learning predictive classifiers of rodent social behavior
- Requires different coat colors
- Uses labeling interfaces and pose-estimation data from DeepLabCut and DeepPoseKit to annotate body parts on subject animals
Measured variables: Durations and frequencies of classified behaviors
Citation: [150]

Name: Video-RFID tracking system
Function: Tracking
Number of subjects: Multiple subjects
Description:
- A system for automated location tracking within a semi-naturalistic setting over long periods (from minutes to several weeks)
- Identity preservation is obtained through radio-frequency identification: each subject is implanted with an RFID chip that transmits a unique radio frequency, detected by RFID antennas and synchronized with the video frames for identity assignment
- Integrates MiceProfiler algorithms to improve identity detection and define mouse body orientation for more sensitive behavioral characterization
Measured variables: Locomotion (distance traveled and time spent running, walking, hiding, sleeping, and staying still) and the number of social events (including avoiding, being avoided, chasing, being chased), which can later be used to quantify social dominance
Citation: [147]

Name: Live Mouse Tracker (LMT)
Function: Tracking and behavior classification
Number of subjects: Multiple subjects
Description:
- Real-time tracking software combining computer vision, machine learning, and RFID identification to track and classify behavior in a home-like environment for up to several days
- Tracking works with any coat color, wired animals, and enriched environments
- Identities of subject mice are assigned using RFID and machine-learning algorithms, and the user can monitor tracking quality live during the experiment
- The setup includes a depth camera that enables extraction of the animals' orientation (head-tail) and detection of head parts such as ears and nose
- Behavioral tracking can be synchronized with USV recording through the LMT USV Toolbox, enabling investigation of spontaneously emitted USVs in home-like environments; audio is recorded only when a power threshold is crossed, and a machine-learning-based classifier then filters out noise-only recordings; extracted USVs are correlated with behavioral events detected by LMT, but the emitter's identity cannot be assigned
Measured variables: Based on changes in shape geometry, detects and classifies up to 35 behavioral events related to individual behavior, social dyadic events, dynamic events (escape and follow), and subgroup configuration
Citation: [143]