New in EthoVision XT 13.0

We are on the verge of releasing a new version of EthoVision XT! Version 14 is coming up soon, so keep an eye on this web page. 

New in the previous version of EthoVision XT:

Get the full picture from one software package

Video tracking takes the guesswork out of behavioral observations. EthoVision XT tracks activity, movement, path shape, angular velocity, and even automatically recognizes some behaviors such as grooming and jumping.

  • Automatically track any animal in any arena
  • Integrate physiology and control of equipment such as doors and stimuli
  • Build a platform to exactly suit your research needs

All the behaviors

But there is more to behavior: an increasing number of studies require annotation of specific behavioral events that simply cannot be scored automatically.

That is why the latest version of EthoVision XT offers a Manual Event Recorder (MER). This dramatically improved feature extends what was already built in: previously, you could compile a list of behaviors and annotate them only during the acquisition (tracking) phase. Now you can also:

  • Score live and offline
  • Edit scored events
  • Score at reduced playback speed (frame-by-frame accuracy)
  • Score point events (instances) in addition to behaviors with a duration
  • Use scored behaviors for trial and hardware control

Score live or offline and edit events

Previously, you could score behaviors only during the acquisition phase, while tracking live. Added functionality now also allows manual scoring offline from the video recordings. Edit manually scored events and change the timestamps simply by clicking and dragging the scored behavior across the timeline!

Scoring offline enhances scoring accuracy, particularly because you can slow down playback speed.

Point events or behaviors with a duration

Scored behaviors can either have a duration (start-stop behavior) or be point events. The latter is useful for behaviors that occur quickly (such as a bite or kick), or when you only need to know the frequency of occurrence.
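EthoVision XT's internal data model is not public; purely as an illustration, the distinction between point events and duration behaviors can be sketched like this (all names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record for a manually scored event: a point event has no
# stop time; a duration (start-stop) behavior has both.
@dataclass
class ScoredEvent:
    behavior: str
    start: float                   # seconds into the trial
    stop: Optional[float] = None   # None marks a point event

def summarize(events):
    """Return per-behavior (frequency, total duration); point events add 0 s."""
    summary = {}
    for e in events:
        freq, dur = summary.get(e.behavior, (0, 0.0))
        extra = (e.stop - e.start) if e.stop is not None else 0.0
        summary[e.behavior] = (freq + 1, dur + extra)
    return summary

events = [
    ScoredEvent("grooming", 2.0, 8.5),   # duration behavior: 6.5 s
    ScoredEvent("bite", 3.2),            # point event: only frequency matters
    ScoredEvent("bite", 7.9),
]
print(summarize(events))
```

For point events only the count accumulates, which matches the case where frequency of occurrence is all you need.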

Integrated control of equipment based on animal behavior

Trial and hardware control allows you to program and control external hardware devices (e.g., the opening of a door, or a light or sound cue) based on the behavior of the subject. For example, when the animal is in a specific zone, or has been inactive for more than 20 seconds, EthoVision XT triggers the hardware device.

This external control can now also be based on manually scored events, when you are scoring live. With this functionality, hardware triggering can occur immediately upon annotating the behavior.
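As an illustration of the kind of rule involved (not EthoVision XT's actual implementation; the names and thresholds are hypothetical), such a trigger condition can be sketched as:

```python
def should_trigger(in_zone, seconds_inactive, manually_scored=None,
                   target_behavior="rearing", inactivity_threshold=20.0):
    """Hypothetical trial-control rule: fire the hardware trigger when the
    animal is in the target zone, has been inactive longer than the
    threshold, or the target behavior was just scored live."""
    if in_zone:
        return True
    if seconds_inactive > inactivity_threshold:
        return True
    if manually_scored == target_behavior:
        return True
    return False

print(should_trigger(False, 25.0))            # inactive > 20 s -> True
print(should_trigger(False, 5.0, "rearing"))  # live-scored behavior -> True
```

The last branch corresponds to the new capability: a manually scored event can fire the trigger immediately during live scoring.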

Build your own parameters

Our customers often request the ability to define their own parameters within EthoVision XT. This would indeed be very useful, for example in novel object testing.

This test often involves using the “proximity” parameter, defined as proximity of the nose point to the objects. But maybe that parameter is not sufficient for object exploration; other parameters, such as animal movement and head direction, can be just as important.


EthoVision XT 13.0 provides a solution: you can now define a multi-condition variable. Continuing with the previous example, this means that you effectively create a new, customized parameter by combining “heading to zone”, “subject is moving”, and “nose-point is in object zone”.

In addition to using existing parameters, you can also combine manually or automatically scored behaviors, hardware events, and external data to create your own variable. This allows for an effective way to automatically measure behavior in more detail.
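Conceptually, a multi-condition variable is a logical combination of per-sample conditions. A minimal sketch, assuming hypothetical frame data (this is not EthoVision XT code):

```python
# Illustrative only: combine several per-frame parameters into one
# "object exploration" condition, analogous to a multi-condition variable.
def object_exploration(frame):
    return (frame["heading_to_zone"]
            and frame["subject_is_moving"]
            and frame["nose_in_object_zone"])

frames = [
    {"heading_to_zone": True, "subject_is_moving": True,  "nose_in_object_zone": True},
    {"heading_to_zone": True, "subject_is_moving": False, "nose_in_object_zone": True},
]
print([object_exploration(f) for f in frames])  # [True, False]
```

Only frames where all three conditions hold count as object exploration, which is exactly what combining the three parameters achieves.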

Ask for custom parameters

Do you feel like you are missing a parameter? If the variables you are interested in are based on the x,y coordinates of the animal, we can build it for you and make it available in your EthoVision XT software.

Compare behavior before and after an event

In stimulus-response scenarios, you are likely interested in comparing the subject's behavior before and after the stimulus was presented. The previously introduced Free interval analysis feature has been updated: you can now select a timeframe before a certain event as the starting point of your interval selection. This makes it very easy to see how the behavior of the animal changed in reaction to the stimulus.
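The underlying comparison amounts to aggregating a measure over equal windows on either side of the event. A minimal sketch, with hypothetical names and sample data:

```python
def mean(xs):
    return sum(xs) / len(xs)

def before_after(samples, stimulus_time, window=10.0):
    """Compare the mean of a measure in equal windows before and after a
    stimulus. `samples` is a list of (time, value) pairs."""
    before = [v for t, v in samples if stimulus_time - window <= t < stimulus_time]
    after = [v for t, v in samples if stimulus_time <= t < stimulus_time + window]
    return mean(before), mean(after)

# e.g. velocity samples (time in s, value in cm/s), stimulus at t = 30 s
samples = [(25, 2.0), (28, 2.4), (31, 8.0), (35, 6.0)]
print(before_after(samples, 30.0))  # mean velocity jumps after the stimulus
```

A clear difference between the two window means is the kind of before/after change this analysis makes visible.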