Characterizing VR Headset Performance with Cobots
I interned at Ingenuity Labs (Queen’s University) in the summer of 2024 under the supervision of Dr. Matt Pan (MITHRIL), working on human-robot interaction and virtual reality with Ph.D. candidate Eric Godden. We spent the summer testing the hand-tracking accuracy of virtual reality headsets. We mounted an anthropomorphically realistic, open-source 3D hand model on a 7-DOF robot arm (KUKA LBR iiwa R7) to mimic a human hand, and moved the arm through various motions to evaluate the hand-tracking performance of the Meta Quest Pro and Meta Quest 3. Our paper was published at IEEE VR 2025 in France. It is available here.
Our system for tracking the ground-truth position of the hand relies on a motion-capture system that tracks the positions of reflective markers relative to multiple cameras, providing sub-millimeter accuracy. We embedded markers in the 3D-printed hand and attached markers to the VR headset. Since we knew the poses of both the hand and the headset, we could compute the position of the hand relative to the headset in a ground-truth frame. This transformation (hand relative to headset, via mocap) was compared against the VR headset’s internal estimate of where the hand was, giving us a measure of the accuracy of the headset’s hand tracking.
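The relative-pose step above boils down to composing two rigid-body transforms. Here is a minimal sketch in Python (the frame names and example poses are hypothetical, not from our actual pipeline):

```python
import numpy as np

def make_transform(R, t):
    """Build a 4x4 homogeneous transform from a rotation matrix and translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def invert_transform(T):
    """Invert a rigid-body transform analytically (cheaper than np.linalg.inv)."""
    R, t = T[:3, :3], T[:3, 3]
    Tinv = np.eye(4)
    Tinv[:3, :3] = R.T
    Tinv[:3, 3] = -R.T @ t
    return Tinv

# The mocap system reports both rigid bodies in its own world frame
# (illustrative values only):
T_world_headset = make_transform(np.eye(3), np.array([0.0, 1.6, 0.0]))
T_world_hand = make_transform(np.eye(3), np.array([0.3, 1.2, 0.5]))

# Ground-truth hand pose expressed in the headset frame:
T_headset_hand = invert_transform(T_world_headset) @ T_world_hand
print(T_headset_hand[:3, 3])  # hand position relative to the headset
```

It is this headset-frame hand pose that can be compared directly against the headset’s own hand-tracking output.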
We tested the hand tracking in many different positions. To keep testing consistent across trials for the two headsets we studied, we used a highly repeatable robot that could return to the same positions again and again. This way, the trials for each headset were identical, eliminating some uncertainty in the testing conditions.
Our results show that the Quest Pro has slightly better hand-tracking performance. The heat map below shows how accurate the two headsets were over a volume, with blue squares representing lower Euclidean error and red squares representing higher Euclidean error. There is more nuance to the results, though; if you are interested in exactly what this means, the paper explains the results in detail.
We also learned that Meta uses predictive algorithms to “guess” where the hand is going to be based on its current trajectory. The figure below on the right shows that tracking is slow to react at the start of a motion, but as the system gathers more data on the hand’s trajectory, it begins to predict where the hand is going and catches up to it.
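Meta’s actual prediction algorithm is proprietary, but the lagging-then-catching-up behavior can be reproduced with something as simple as a constant-velocity extrapolator, sketched below (all names and parameters are illustrative assumptions):

```python
import numpy as np

def predict_position(history, lookahead_s, dt):
    """Extrapolate the hand position from the last two samples."""
    if len(history) < 2:
        # Not enough trajectory data yet: no prediction is possible,
        # so the estimate lags the real hand at the start of a motion.
        return history[-1]
    velocity = (history[-1] - history[-2]) / dt
    return history[-1] + velocity * lookahead_s

dt = 1.0 / 90.0  # a typical headset tracking rate
samples = [np.array([0.0, 0.0, 0.0])]
# Hand moving at 1 m/s along x:
for i in range(1, 5):
    samples.append(np.array([i * dt, 0.0, 0.0]))

# With trajectory history available, the predictor leads the last
# observed sample by one lookahead interval:
print(predict_position(samples, lookahead_s=dt, dt=dt))
```

With only one sample, the predictor just returns the last observation (it lags); once a trajectory is established, it extrapolates ahead and the estimate catches up to the moving hand, which matches the behavior we observed in the figure.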