What's new in EthoVision XT 12

EthoVision XT version 12 is all about better detection, better tracking, and better data. As technology advances, so do the opportunities our software has to offer. For example, we can now offer an algorithm specifically for multiple body points tracking of fish, and new parameters for interval analysis.

Free interval selection

For many users, this new feature might be a game-changer: free interval selection allows you to define your own custom analysis intervals. Let’s look at some examples to get a better idea of what this means.

Object recognition

In a (novel) object test, the main goal is to measure how much interest the animal shows in a familiar and/or a novel object. EthoVision XT could already give you the time and frequency of ‘nose point in object zone’, but maybe you are interested in how long it took, over the entire trial, for the animal to show a cumulative total of 10 seconds of interest in one or more objects. The new free interval selection makes this very easy to analyze.
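
To give an idea of the kind of calculation this makes easy, here is a minimal sketch in Python (this is not EthoVision code): it works on hypothetical per-frame ‘nose in zone’ samples, as you might export them from a trial, and finds the moment cumulative interest reaches a chosen threshold.

```python
# Minimal sketch, not EthoVision code: find the trial time at which the animal
# has accumulated a chosen amount of "interest" (e.g. 10 s of nose point in an
# object zone). Sample rate and in_zone data are hypothetical examples.

def time_to_cumulative_interest(in_zone, sample_rate_hz, threshold_s):
    """Elapsed time (s) at which cumulative in-zone time reaches threshold_s,
    or None if the threshold is never reached during the trial."""
    frame_duration = 1.0 / sample_rate_hz
    cumulative = 0.0
    for frame_index, nose_in_zone in enumerate(in_zone):
        if nose_in_zone:
            cumulative += frame_duration
            if cumulative >= threshold_s:
                return (frame_index + 1) * frame_duration
    return None

# Example: 25 samples per second, threshold of 10 s cumulative interest.
in_zone = ([True] * 3 + [False]) * 500   # made-up per-frame data
print(time_to_cumulative_interest(in_zone, sample_rate_hz=25, threshold_s=10.0))
```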

Conditioning and learning

Say you want to teach your mouse to jump on top of the shelter to get a reward from the feeder. With free interval selection, you can easily measure how long it took for the animal to move from the top of the shelter to the feeding zone, and compare this duration across several trials. The trend of the interval duration can be taken as a measure of learning.
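
A minimal sketch of this kind of latency calculation, again on hypothetical exported data rather than through EthoVision itself, could look like this:

```python
# Minimal sketch, not EthoVision code: latency from first being on top of the
# shelter to first entering the feeding zone, per trial. Zone labels and data
# are hypothetical examples.

def latency_between_zones(zone_per_frame, sample_rate_hz, start_zone, end_zone):
    """Seconds from the first sample in start_zone to the next sample in
    end_zone, or None if that sequence never occurs in the trial."""
    start_frame = None
    for frame_index, zone in enumerate(zone_per_frame):
        if start_frame is None and zone == start_zone:
            start_frame = frame_index
        elif start_frame is not None and zone == end_zone:
            return (frame_index - start_frame) / sample_rate_hz
    return None

# A shrinking latency across trials suggests the animal is learning the task.
trials = [
    ["shelter_top"] * 100 + ["open_area"] * 300 + ["feeder"] * 50,
    ["shelter_top"] * 100 + ["open_area"] * 120 + ["feeder"] * 50,
]
print([latency_between_zones(t, 25, "shelter_top", "feeder") for t in trials])
# -> [16.0, 8.8]
```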

Choice tasks

In operant conditioning tasks or radial arm mazes, questions like “How much time did the animal need to get 3 correct lever presses?” or “How long did it take the animal to get to a total of 5 correct arm entries?” are now easily answered using free interval selection. With this method you can also determine how long it took for your animal to learn that it has to stay in the correct arm for, let’s say, 20 seconds consecutively. 
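
For the first of these questions, the underlying calculation is simply the time of the Nth correct event. A small illustrative sketch, with made-up event timestamps and not EthoVision code, is shown below.

```python
# Minimal sketch, not EthoVision code: how long did the animal need to reach
# its Nth correct response? The event timestamps below are made-up examples.

def time_to_nth_correct(correct_event_times_s, n):
    """Time (s from trial start) of the nth correct event, or None if the
    animal never reached n correct responses in this trial."""
    ordered = sorted(correct_event_times_s)
    return ordered[n - 1] if len(ordered) >= n else None

correct_lever_presses = [4.2, 9.8, 15.1, 31.0]        # seconds from trial start
print(time_to_nth_correct(correct_lever_presses, 3))  # -> 15.1
```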

Activity monitoring

Free interval selection also allows you to easily measure how many sleep bouts there are in a 48-hour period, and if there are any trends in the durations over that period. 
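
The calculation behind a bout analysis like this is essentially a run-length count over the per-sample sleep state. Here is a minimal, illustrative sketch on made-up data (not EthoVision code):

```python
# Minimal sketch, not EthoVision code: count bouts (uninterrupted runs) of a
# behaviour such as sleeping, and collect their durations so trends over a
# 48-hour recording can be examined. The data below are made-up examples.
from itertools import groupby

def bout_durations(asleep_per_sample, sample_rate_hz):
    """Durations (s) of every uninterrupted run of True samples."""
    return [
        sum(1 for _ in run) / sample_rate_hz
        for is_asleep, run in groupby(asleep_per_sample)
        if is_asleep
    ]

# Hypothetical example at 1 sample per second: two sleep bouts, 5 s and 3 s.
asleep = [False] * 10 + [True] * 5 + [False] * 20 + [True] * 3 + [False] * 5
durations = bout_durations(asleep, sample_rate_hz=1)
print(len(durations), durations)   # -> 2 [5.0, 3.0]
```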

If you would like to find out whether EthoVision XT can answer your specific research question, please contact us and ask for a demo or trial version!

Multiple body points tracking in fish  

Multiple body points tracking offers many advantages over center point tracking, as it provides much more detail on the behavior and orientation of the animals. It has been available for rats and mice for quite some time, and now we have developed this technology for adult zebrafish as well.

This means that fish can now be tracked based on a specific top-view fish-shape model in EthoVision XT, allowing accurate detection of nose, tail, and center of gravity points. This adds both detail and accuracy to your scientific data. 

Tracking other animals

There is also good news for those of you who study animals other than rats, mice, or zebrafish, because the detection of those animals has been improved as well. There is a new shape-based model that more accurately detects insects, farm animals, and other species. Better detection leads to better tracking and thus better data. This new technology especially shines in multiple-subject scenarios where individual identities do not matter!

Body contact

In social interaction studies, it is important to know how much contact there is between the animals. Traditionally, this is measured by analyzing the proximity of certain body points: for example, how much time did animal A spend with its nose in close proximity to the nose of animal B, or how frequently did the nose of animal C come in contact with the tail of animal D?

Now EthoVision XT offers an additional measure of contact: the body contact parameter. This method looks at contact between the bodies of the animals, independently of specific body points. When body contours touch during video tracking, EthoVision XT stores this information for analysis. In specific scenarios, this measure offers an alternative, and potentially more sensitive, assessment of social interaction than body point proximity alone.
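
To illustrate the idea (this is not EthoVision’s implementation), the geometric test behind such a measure can be sketched as a per-frame overlap check between two body contours; the contours and the use of the shapely library below are purely illustrative.

```python
# Minimal sketch, not EthoVision's implementation: the idea behind a body
# contact measure is to test, per video frame, whether the detected body
# contours of two animals touch or overlap, rather than measuring distances
# between individual body points. The contours below are made-up outlines,
# and shapely is used only to illustrate the geometric test.
from shapely.geometry import Polygon

def bodies_in_contact(contour_a, contour_b):
    """True if the two body contours touch or overlap in this frame."""
    return Polygon(contour_a).intersects(Polygon(contour_b))

# Two simplified "animal" outlines as (x, y) pixel coordinates.
animal_a = [(0, 0), (4, 0), (4, 2), (0, 2)]
animal_b = [(3, 1), (7, 1), (7, 3), (3, 3)]   # overlaps animal_a
print(bodies_in_contact(animal_a, animal_b))  # -> True
```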