Hand Tracking UX Lab

Design affordances for near-field interactions without breaking immersion or comfort.

¥52,000 · 4 weeks · self-paced · intermediate

Overview

Blend OpenXR hand poses with spatial UI rules: pinch thresholds, error states, and fallback to controllers. You prototype three micro-interactions and test them with classmates.
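As a flavor of the kind of threshold work the lab asks for, here is a minimal sketch of pinch detection with hysteresis: separate engage and release distances create a dead zone so the pinch state doesn't flicker near the boundary. The class name and the 20 mm / 35 mm values are illustrative assumptions, not course-mandated numbers.

```python
class PinchDetector:
    """Pinch state with hysteresis between engage and release thresholds.

    The gap between engage_mm and release_mm is the dead zone: once a
    pinch starts, small tracking jitter can't immediately cancel it.
    Threshold values here are placeholders for the ones you tune in the lab.
    """

    def __init__(self, engage_mm: float = 20.0, release_mm: float = 35.0):
        self.engage_mm = engage_mm    # tip distance below this starts a pinch
        self.release_mm = release_mm  # tip distance above this ends a pinch
        self.pinching = False

    def update(self, tip_distance_mm: float) -> bool:
        """Feed the thumb-index tip distance each frame; returns pinch state."""
        if self.pinching:
            if tip_distance_mm > self.release_mm:
                self.pinching = False
        elif tip_distance_mm < self.engage_mm:
            self.pinching = True
        return self.pinching
```

A distance of 25 mm, for example, keeps an active pinch alive but will not start a new one, which is exactly the kind of behavior playtesters notice.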

What is included

  • Pose-driven UI hit targets with dead zones
  • Gesture fallbacks when tracking drops
  • Spatial audio cues for successful grabs
  • Accessibility notes for seated vs standing play
  • Peer review rubric focused on clarity, not polish
  • Figma → engine handoff worksheet
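The gesture-fallback item above can be sketched as a small input router: hold the last confident hand pose for a short grace window, then hand input over to controllers. The confidence cutoff, the 0.5 s window, and all names are illustrative assumptions, not part of any runtime API.

```python
GRACE_S = 0.5          # how long to tolerate lost tracking (assumed value)
CONFIDENCE_MIN = 0.6   # below this, the hand pose is treated as unreliable


class InputRouter:
    """Routes input between hands and controllers based on tracking quality."""

    def __init__(self):
        self.mode = "hands"
        self.last_confident_t = 0.0

    def update(self, t: float, hand_confidence: float) -> str:
        """Call each frame with a timestamp and tracking confidence in [0, 1]."""
        if hand_confidence >= CONFIDENCE_MIN:
            self.last_confident_t = t     # tracking is good: stay on hands
            self.mode = "hands"
        elif t - self.last_confident_t > GRACE_S:
            self.mode = "controllers"     # tracking lost too long: fall back
        return self.mode
```

The grace window is the design decision to test with classmates: too short and the UI thrashes on brief occlusions, too long and users wave a dead hand at the interface.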

Outcomes

  • Ship a three-interaction sandbox with documented thresholds
  • Capture a short usability note sheet from two testers
  • Refine one interaction after mentor critique

Instructor

Ravi Menon

Frontend mentor specializing in spatial UI and interaction graphs.

FAQ

Quest 2/3 paths are primary; Pico references are provided as read-only notes.

Experience notes

Hand Tracking UX Lab made us write pinch tolerances in ms and mm — weirdly specific and exactly what our playtesters needed.

— Leo · Osaka indie trio · Google
