Toolset for Run-time Dataset Collection of Deep-scene Information

Gustav Aaro, Daniel Roos and Niklas Carlsson


Paper: Gustav Aaro, Daniel Roos and Niklas Carlsson. "Toolset for Run-time Dataset Collection of Deep-scene Information", Proc. IEEE MASCOTS workshop, Nice, France, Nov. 2020. (pdf)

Abstract: Virtual reality (VR) provides many exciting new application opportunities, but also presents new challenges. In contrast to 360-degree videos, which only allow a user to select their viewing direction, fully immersive VR also lets users move around and interact with objects in the virtual world. To deliver such services most effectively, it is therefore important to understand how users move around in relation to such objects. In this paper, we present a methodology and software tool for generating run-time datasets that capture a user's interactions with such 3D environments, evaluate and compare different object identification methods that we implement within the tool, and use datasets collected with the tool to demonstrate example uses. The tool was developed in Unity and easily integrates with existing Unity applications through periodic calls that extract information about the environment using different ray-casting methods. The software tool and example datasets are made available with this paper.
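To make the integration idea concrete, the sketch below shows one way such a periodic ray-casting sampler could look as a Unity script. This is a minimal illustration of the general technique described in the abstract, not the tool's actual implementation: the class name GazeLogger, the output file gaze_log.csv, and the sampling parameters are our own illustrative assumptions.

```csharp
using System.IO;
using UnityEngine;

// Minimal sketch (illustrative, not the paper's actual tool): a MonoBehaviour
// that periodically casts a ray along the camera's forward (gaze) direction
// and records which scene object it hits.
public class GazeLogger : MonoBehaviour
{
    public float sampleInterval = 0.1f;   // seconds between samples (assumed value)
    public float maxDistance = 100f;      // maximum ray length (assumed value)
    private StreamWriter writer;

    void Start()
    {
        writer = new StreamWriter("gaze_log.csv");   // hypothetical output file
        writer.WriteLine("time,object,distance");
        // Invoke Sample() at a fixed interval, starting immediately.
        InvokeRepeating(nameof(Sample), 0f, sampleInterval);
    }

    void Sample()
    {
        Transform cam = Camera.main.transform;
        // Cast a single ray from the camera along its viewing direction.
        if (Physics.Raycast(cam.position, cam.forward, out RaycastHit hit, maxDistance))
        {
            writer.WriteLine($"{Time.time},{hit.collider.name},{hit.distance}");
        }
        else
        {
            writer.WriteLine($"{Time.time},none,");
        }
    }

    void OnDestroy()
    {
        writer?.Close();
    }
}
```

Attaching such a component to the scene's camera rig is all the integration an existing Unity application would need; richer variants could cast multiple rays per sample (e.g., a grid across the view frustum) to approximate the different object identification methods compared in the paper.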

Software and datasets

To help others build upon our work, we make our code and example datasets available below.

Note: If you use or build on our data files, code, or ideas in your research, please include a reference to our paper (pdf).