TY - JOUR
T1 - Combining Unity with machine vision to create low latency, flexible and simple virtual realities
AU - Ogawa, Yuri
AU - Aoukar, Raymond
AU - Leibbrandt, Richard
AU - Manger, Jake S
AU - Bagheri, Zahra M
AU - Turnbull, Luke
AU - Johnston, Chris
AU - Kaushik, Pavan K
AU - Mitchell, Jaxon
AU - Hemmi, Jan M
AU - Nordström, Karin
PY - 2025/1
Y1 - 2025/1
N2 - In recent years, virtual reality arenas have become increasingly popular for quantifying visual behaviours. By using the actions of a constrained animal to control the visual scenery, the animal perceives that it is moving through a virtual world. Importantly, as the animal is constrained in space, behavioural quantification is facilitated. Furthermore, using computer-generated visual scenery allows for identification of visual triggers of behaviour. We created a novel virtual reality arena combining machine vision with the gaming engine Unity. For tethered flight, we enhanced an existing multi-modal virtual reality arena, MultiMoVR, but tracked wing movements using DeepLabCut-live (DLC-live). For tethered walking animals, we used FicTrac to track the motion of a trackball. In both cases, real-time tracking was interfaced with Unity to control the location and rotation of the tethered animal's avatar in the virtual world. We developed a user-friendly Unity Editor interface, CAVE, to simplify experimental design and data storage without the need for coding. We show that both the DLC-live-Unity and the FicTrac-Unity configurations close the feedback loop effectively and quickly. We show that closed-loop feedback reduces behavioural artefacts exhibited by walking crabs in open-loop scenarios, and that flying Eristalis tenax hoverflies navigate towards virtual flowers in closed loop. We show examples of how the CAVE interface can enable experimental sequencing control including use of avatar proximity to virtual objects of interest. Our results show that combining Unity with machine vision tools provides an easy and flexible virtual reality environment that can be readily adjusted to new experiments and species. This can be implemented programmatically in Unity, or by using our new tool CAVE, which allows users to design new experiments without additional programming. We provide resources for replicating experiments and our interface CAVE via GitHub, together with user manuals and instruction videos, for sharing with the wider scientific community.
AB - In recent years, virtual reality arenas have become increasingly popular for quantifying visual behaviours. By using the actions of a constrained animal to control the visual scenery, the animal perceives that it is moving through a virtual world. Importantly, as the animal is constrained in space, behavioural quantification is facilitated. Furthermore, using computer-generated visual scenery allows for identification of visual triggers of behaviour. We created a novel virtual reality arena combining machine vision with the gaming engine Unity. For tethered flight, we enhanced an existing multi-modal virtual reality arena, MultiMoVR, but tracked wing movements using DeepLabCut-live (DLC-live). For tethered walking animals, we used FicTrac to track the motion of a trackball. In both cases, real-time tracking was interfaced with Unity to control the location and rotation of the tethered animal's avatar in the virtual world. We developed a user-friendly Unity Editor interface, CAVE, to simplify experimental design and data storage without the need for coding. We show that both the DLC-live-Unity and the FicTrac-Unity configurations close the feedback loop effectively and quickly. We show that closed-loop feedback reduces behavioural artefacts exhibited by walking crabs in open-loop scenarios, and that flying Eristalis tenax hoverflies navigate towards virtual flowers in closed loop. We show examples of how the CAVE interface can enable experimental sequencing control including use of avatar proximity to virtual objects of interest. Our results show that combining Unity with machine vision tools provides an easy and flexible virtual reality environment that can be readily adjusted to new experiments and species. This can be implemented programmatically in Unity, or by using our new tool CAVE, which allows users to design new experiments without additional programming. We provide resources for replicating experiments and our interface CAVE via GitHub, together with user manuals and instruction videos, for sharing with the wider scientific community.
KW - arthropod vision
KW - closed loop
KW - gain
KW - motion vision
KW - naturalistic stimuli
KW - navigation
KW - open loop
UR - http://www.scopus.com/inward/record.url?scp=85210093303&partnerID=8YFLogxK
UR - http://purl.org/au-research/grants/ARC/DP180100491
UR - http://purl.org/au-research/grants/ARC/DP200102642
UR - http://purl.org/au-research/grants/ARC/DP210100740
UR - http://purl.org/au-research/grants/ARC/DP230100006
UR - http://purl.org/au-research/grants/ARC/FT180100289
U2 - 10.1111/2041-210X.14449
DO - 10.1111/2041-210X.14449
M3 - Article
AN - SCOPUS:85210093303
SN - 2041-210X
VL - 16
SP - 126
EP - 144
JO - Methods in Ecology and Evolution
JF - Methods in Ecology and Evolution
IS - 1
ER -