Multimodal cross-system virtual reality (VR) ball throwing dataset for VR biometrics.
Journal:
Data in Brief
Published Date:
Jun 26, 2025
Abstract
In this paper we present a multimodal cross-system dataset for virtual reality (VR) biometrics. The dataset consists of 41 right-handed participants performing a ball-throwing task in a Unity-based VR environment. Data is collected from the participants using the Meta Quest, HTC Vive, and HTC Vive Cosmos VR systems, covering both lighthouse-based and camera-based tracking. To our knowledge, this is the only multi-VR-system dataset for VR biometrics, as well as the only one that includes external video of the user performing the VR task. During each session participants provide 10 trials by lifting the ball from the virtual pedestal and throwing it at the target. Participants provide data across 6 distinct sessions separated by at least 1 day, with data collected from each VR system for 2 sessions. From each VR system, our dataset records the positions and orientations of the headset and of both the left and right hand controllers, as well as the trigger position of the dominant-hand controller. The VR system data is provided as NumPy files. In addition to the data gathered using the VR systems, we collect data using an externally mounted GoPro Hero 7 camera. The camera is mounted perpendicular to the participant such that the whole interaction is visible in the image frame. The GoPro data is collected at 60 frames per second and manually cropped to include only the full view of the participant. The cropped videos are synchronized to the VR motions and provided as MPEG-4 (MP4) videos. From the GoPro videos, we generate COCO body keypoints using the MMPose and OpenPose toolboxes and provide them as JSON files. The VR headset and hand controllers capture the movement of the participant's head and hands, while the external GoPro video and associated body keypoints capture body parts on the participant's dominant side that are not tracked by the VR system. Our dataset also provides participant demographics in the form of Self-Identified Gender, Age, Height (in), Weight (lb), Writing Hand, Throwing Hand, Throwing Sport Experience, Type of Throwing Sport, and VR Experience. Finally, we provide capture-time summary data that records the temporal difference, in days, between successive sessions for each participant. The provided dataset enables research in cross-system VR biometrics with and without external video and body keypoints. It also enables researchers to extend beyond the headset and hand controllers by using 2D motion trajectories from the body keypoints. Using the demographic information, researchers can study the impact of various demographic characteristics on VR biometrics. The temporal data can support an understanding of short-term movement evolution captured using virtual and real-world sensing techniques. In addition, the dataset enables development of cross-domain motion trajectory prediction from 2D to 3D or vice versa.
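As an illustration of how the released files might be consumed, the sketch below loads one trial's NumPy motion recording and its matching COCO keypoint JSON. The file names, directory layout, array shape, and JSON structure shown here are assumptions for illustration only; the abstract does not specify the exact schema, so consult the dataset documentation for the actual layout.

```python
# Minimal sketch of reading one trial's VR motion data and its COCO keypoints.
# Paths and data layout are hypothetical examples, not the dataset's documented schema.
import json
import numpy as np

# Hypothetical paths for one participant / session / trial.
vr_path = "P01/session1/quest/trial01.npy"              # VR system recording (NumPy)
kp_path = "P01/session1/quest/trial01_keypoints.json"   # MMPose/OpenPose output (JSON)

# VR data: assumed shape (num_frames, num_channels), where channels hold the
# headset and left/right controller positions and orientations plus the
# dominant-hand trigger value.
vr_frames = np.load(vr_path)
print("VR frames:", vr_frames.shape)

# Keypoints: assumed to follow the COCO 17-keypoint convention, stored as
# (x, y, confidence) triplets per video frame.
with open(kp_path) as f:
    keypoints = json.load(f)

# Example: extract the right-wrist (COCO index 10) trajectory across frames,
# assuming a list of per-frame entries with a "keypoints" field.
right_wrist = [frame["keypoints"][10] for frame in keypoints if "keypoints" in frame]
print("Right-wrist samples:", len(right_wrist))
```

Because the cropped GoPro videos are synchronized to the VR motions, a per-frame alignment of the 2D keypoint trajectories with the 3D headset and controller trajectories can be built along these lines once the actual file schema is known.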