Hand tracking, eye tracking, haptics, mixed reality, and spatial anchors. Built on OpenXR 1.1 for every major headset.
Full 26-joint skeleton tracking per hand with sub-millimeter precision. Build natural interactions without controllers.
Gaze-driven interaction with precision targeting, dwell activation, and analytics. Optimize rendering with foveated data.
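Dwell activation means a gaze target "clicks" after the eyes rest on it for a set time. A hypothetical per-frame state machine for that pattern (the struct, field names, and threshold are ours, not the Raku API):

```c
#include <stdbool.h>

typedef struct {
    int   target;     /* id of the element under the gaze, -1 = none */
    float elapsed_s;  /* time the gaze has stayed on that element    */
    float dwell_s;    /* activation threshold, e.g. 0.8 s            */
} DwellState;

/* Call once per frame; returns true only on the frame the dwell fires. */
bool dwell_update(DwellState *s, int gazed_target, float dt_s) {
    if (gazed_target != s->target) {   /* gaze moved: restart the timer */
        s->target = gazed_target;
        s->elapsed_s = 0.0f;
        return false;
    }
    if (s->target < 0) return false;   /* nothing under the gaze */
    float before = s->elapsed_s;
    s->elapsed_s += dt_s;
    /* fire exactly once, on the frame the threshold is crossed */
    return before < s->dwell_s && s->elapsed_s >= s->dwell_s;
}
```

Resetting the timer whenever the gaze shifts keeps brief glances from triggering accidental activations.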
Rich tactile feedback across controllers and hands. From simple vibrations to texture simulation and adaptive triggers.
Blend virtual content with the real world. Passthrough, scene understanding, plane detection, and portal cutouts.
Persistent world-locked content. Place objects that stay where you put them, across sessions and across users.
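The usual way content stays world-locked is to store it as an offset from an anchor, so when the runtime relocates the anchor each session, the content follows. A tiny sketch of that math (names are illustrative, not the Raku API):

```c
typedef struct { float x, y, z; } Point3;

/* Content is authored as a local offset from its anchor; the world position
   is recomputed whenever the runtime updates the anchor's located pose. */
Point3 world_position(Point3 anchor_world, Point3 local_offset) {
    return (Point3){ anchor_world.x + local_offset.x,
                     anchor_world.y + local_offset.y,
                     anchor_world.z + local_offset.z };
}
```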
Built on OpenXR 1.1, Raku abstracts platform differences so your XR code runs everywhere. Write once, deploy to every major headset.
The runtime queries device capabilities at startup, so hand tracking, eye tracking, and passthrough degrade gracefully on hardware that lacks them.
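In OpenXR terms, this kind of check mirrors `xrGetSystemProperties` with extension structs such as `XrSystemHandTrackingPropertiesEXT` chained on `next`. A self-contained sketch of the degradation pattern itself — the capability table and chooser functions are hypothetical, not the Raku API:

```c
#include <stdbool.h>

/* Filled once at startup from the runtime's reported system properties. */
typedef struct {
    bool hand_tracking;
    bool eye_tracking;
    bool passthrough;
} DeviceCaps;

typedef enum { INPUT_HANDS, INPUT_CONTROLLERS } InputMode;
typedef enum { RENDER_MIXED_REALITY, RENDER_VR_ONLY } RenderMode;

/* Pick the richest input mode the hardware supports, falling back otherwise. */
InputMode choose_input_mode(const DeviceCaps *caps) {
    return caps->hand_tracking ? INPUT_HANDS : INPUT_CONTROLLERS;
}

/* Without passthrough, mixed-reality content falls back to a fully virtual scene. */
RenderMode choose_render_mode(const DeviceCaps *caps) {
    return caps->passthrough ? RENDER_MIXED_REALITY : RENDER_VR_ONLY;
}
```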
One input system handles controllers, hands, eyes, and voice across every platform. No platform-specific code paths needed.
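This is the shape of OpenXR's action system: gameplay code binds to a logical action, and per-device interaction profiles map that action to concrete input paths. A sketch of such a binding table — the `Binding` struct and `resolve_binding` are our illustration, though the profile and input paths shown are real OpenXR paths:

```c
#include <stddef.h>
#include <string.h>

typedef struct {
    const char *action;   /* logical name used by gameplay code          */
    const char *profile;  /* interaction profile the binding applies to  */
    const char *path;     /* concrete device input path                  */
} Binding;

/* One logical "select" action, bound per device class. */
static const Binding kBindings[] = {
    { "select", "/interaction_profiles/khr/simple_controller",
                "/user/hand/right/input/select/click" },
    { "select", "/interaction_profiles/ext/hand_interaction",
                "/user/hand/right/input/pinch_ext/value" },
};

/* Resolve an action's input path under the active profile, or NULL. */
const char *resolve_binding(const char *action, const char *active_profile) {
    for (size_t i = 0; i < sizeof kBindings / sizeof kBindings[0]; ++i)
        if (strcmp(kBindings[i].action, action) == 0 &&
            strcmp(kBindings[i].profile, active_profile) == 0)
            return kBindings[i].path;
    return NULL;
}
```

Because the app only ever reads "select", swapping a controller trigger for a hand pinch needs no gameplay changes.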
Automatic foveated rendering, resolution scaling, and refresh rate selection tuned for each headset's capabilities.