New in EthoVision XT 17.5
EthoVision XT 17.5 brings important integrations and improvements. With this new version, you can easily import and analyze data from the Live Mouse Tracker system, and connecting up to 4 Ugo Basile fear conditioning chambers is now plug-and-play over a single USB connection.
New upgrades, integrations and improvements
We are constantly working on making our solutions for measuring behavior more user-friendly, without compromising on quality. EthoVision XT is the gold standard in animal video tracking and analysis, and with version 17.5 we've bridged another gap by adding valuable integrations aimed at advancing your research.
Live Mouse Tracker Importer and Analysis
Live Mouse Tracker (LMT), an open-source system developed by the Human Genetics and Cognitive Functions unit of Institut Pasteur, performs real-time behavioral analysis of groups of mice in a dedicated (DIY) hardware setup. Through the use of RFID, computer vision, and machine learning, LMT is able to track multiple mice, extract their individual and social behaviors, and assign these parameters to individual animals.
Data extraction and analysis from LMT can be quite complex (using a convoluted Python framework). This is why in EthoVision XT 17.5 we have implemented easy data importing and analysis of social interaction data from the LMT hardware system.
This means that LMT data can be imported and analyzed directly in EthoVision XT, which brings all of the functionality and power of EthoVision XT to LMT users. EthoVision XT offers easy organization of LMT trials, which in turn enables data filtering, nesting and analysis. Furthermore, LMT data can be visualized in EthoVision’s graphs and integrated data plots.
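For context on why direct import is convenient: LMT stores its tracking results in an SQLite database, so extracting measures by hand means writing your own queries. The sketch below illustrates this under a deliberately simplified, hypothetical schema (a single EVENT table with name, animal ID, and start/end frame columns; the real LMT schema is richer) and assumes LMT's nominal 30 fps frame rate.

```python
import sqlite3

# Build a tiny in-memory stand-in for an LMT results database.
# Table and column names here are simplified assumptions for illustration.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE EVENT (name TEXT, idanimalA INTEGER, "
    "startframe INTEGER, endframe INTEGER)"
)
conn.executemany(
    "INSERT INTO EVENT VALUES (?, ?, ?, ?)",
    [("Contact", 1, 10, 40), ("Contact", 2, 55, 80), ("Rear isolated", 1, 90, 120)],
)

FPS = 30.0  # assumed nominal LMT recording rate

def event_durations(conn, event_name):
    """Total duration (in seconds) of a named behavioral event, per animal."""
    rows = conn.execute(
        "SELECT idanimalA, SUM(endframe - startframe) FROM EVENT "
        "WHERE name = ? GROUP BY idanimalA",
        (event_name,),
    )
    return {animal: frames / FPS for animal, frames in rows}

print(event_durations(conn, "Contact"))  # per-animal contact time in seconds
```

With the importer in EthoVision XT 17.5, this kind of manual querying is no longer necessary.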
Deep learning-based body points detection
EthoVision XT 16 introduced a new deep learning technology to detect rodent body points. Deep learning is a kind of machine learning that can find structure in unstructured data, such as video images. EthoVision XT 17 (and 17.5) extends this technology, enabling deep learning tracking in up to 4 arenas simultaneously.
‘Deep’ means that the trained neural networks are layered, so the network learns to represent data at various levels: low-level data can simply be colors, while high-level data is more abstract, such as ‘a tail’. Read more about EthoVision XT detection methods here.
The result is much more accurate recognition of the animal, and thus much more stable detection of the nose point, center point, and tail base of mice. This new technology will take any experiment that relies on multiple body point tracking to a higher level. Think of novel object recognition, nose poke/hole board tests, social preference testing, or 3-chambered setups.
Deep learning technology in EthoVision XT works on mice and rats with an even fur color, as well as hooded (Long-Evans) rats.
Using deep learning requires substantial computational power: you need a Graphics Processing Unit (GPU) that is able to sustain those computations.
If you want to find out more about the benefits of using deep learning to track your rodents, make sure to check out our webinar, presented by our resident behavioral experts. It dives into tracking situations in which the deep learning technique really shines, tricky-to-track conditions, and the power of improved automation to advance your research.
New plug-and-play support for the Ugo Basile fear conditioning set-up
The Ugo Basile fear conditioning chamber is a state-of-the-art apparatus designed for studying fear and anxiety-related behaviors in rodents. A key integrative feature of this chamber is its ease of use in combination with EthoVision XT. In version 17.5, this has become even more user-friendly: up to 4 chambers can be connected plug-and-play over USB, and protocols can be set up quickly and easily.
Features like this highlight the integrative capabilities of EthoVision XT with other devices and systems, making measuring behavior with this powerful piece of software easier, more robust, and time-saving.
"EthoVision XT is for dummies."
Prof. Roberto Rimondini is a neuropsychopharmacologist at the Department of Medical and Clinical Sciences DIMEC, Alma Mater Studiorum Bologna (University of Bologna, Italy). He has been using EthoVision XT video tracking software since its DOS version! He finds it so easy to use, it’s ‘dummy proof’!
Watch the video to learn more about why he chooses Noldus' tools.
Other improvements
- Automatic adjustment of deep learning settings: the Automated Setup in Detection Settings now also works for deep learning-based nose-tail point detection.
- New functions for data extraction and analysis:
  - GetBehaviorEvent, to extract events that were coded manually
  - GetViewDirection, to extract the value of the Head direction
  - Point, to define a new point of interest with 2D coordinates
  - GetPointPoi, to extract the coordinates of a point of interest or the center of a zone
- Basler camera driver: EthoVision XT 17.5 includes the newest driver software for Basler USB 3.0 and GigE cameras (Basler pylon version 7.1.0).
- Video recording and playback using the GPU (graphics card): lowers the CPU load in demanding situations.
- Time window in the Nest function: can now be longer than 24 hours.
- Group charts: the Apply to all button now works.
- Group charts: the Reset to default button now resets the values and slider position to defaults.
- In the Action box for the DanioVision White light, the Fade duration of the light can be set in ms, s, min, or hours.