BEGINNER TUTORIAL

Getting Started with XR Development in Raku

Set up hand tracking, eye tracking, haptic feedback, and passthrough mode. Build your first hand-tracked interaction from scratch.


SETUP

Prerequisites and Setup

Before you begin building XR experiences with Raku, make sure your development environment is ready.

What you need:

  • Raku runtime — Download the latest release (19 MB)
  • An XR headset — Any OpenXR 1.1 compatible device (Meta Quest 3, Quest Pro, Pico 4, HTC Vive XR Elite, etc.)
  • C compiler — MSVC, GCC, or Clang for building your game
  • OpenXR runtime — Installed on your target device (ships with most headsets)

Initialize the engine and the XR subsystem:

```c
// Initialize the Raku engine
raku_init();

// Initialize XR in immersive VR mode
// Supported modes: "vr", "mr", "ar", "overlay"
raku_xr_init("vr");

// Create a renderer window (used as mirror display on desktop)
raku_renderer_create_window(1280, 720, "My XR App", true);

// Create a camera (the XR runtime overrides pose each frame)
raku_renderer_create_camera("main", 90.0f, 0.01f, 1000.0f);
```
Note: raku_xr_init() is a privileged call. The engine will prompt the user for confirmation before activating XR hardware. This is a safety measure defined in the API manifest.
XR INPUT

Enabling Hand Tracking and Configuring Gestures

Raku supports articulated hand tracking through OpenXR. Once enabled, you get access to 26 joint positions per hand, plus built-in gesture recognition.

```c
// Enable hand tracking (requires OpenXR hand tracking extension)
raku_xr_hand_tracking_enable(true);

// Register built-in gesture callbacks
raku_xr_hand_gesture_register("pinch", on_pinch);
raku_xr_hand_gesture_register("grab", on_grab);
raku_xr_hand_gesture_register("point", on_point);
raku_xr_hand_gesture_register("open_palm", on_open_palm);

// Configure gesture sensitivity (0.0 = very sensitive, 1.0 = firm)
raku_xr_hand_gesture_set_threshold("pinch", 0.7f);
raku_xr_hand_gesture_set_threshold("grab", 0.8f);
```

You can also query individual joint positions for custom gesture logic:

```c
// Get the position of the right index finger tip
float pos[3];
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_INDEX_TIP, pos);

// Get the distance between thumb and index tip (pinch detection)
float thumb[3], index_tip[3];
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_THUMB_TIP, thumb);
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_INDEX_TIP, index_tip);

float dist = raku_math_distance_3d(thumb, index_tip);
if (dist < 0.02f) {
    // Custom pinch detected!
    raku_log(0, "Custom pinch detected");
}
```
XR INPUT

Setting Up Eye Tracking with Permissions

Eye tracking requires explicit user permission on most headsets. Raku handles the permission flow and gives you gaze direction and fixation data.

```c
// Request eye tracking permission (shows OS permission dialog)
bool granted = raku_xr_eye_tracking_request_permission();
if (granted) {
    // Enable eye tracking
    raku_xr_eye_tracking_enable(true);
    // Set gaze smoothing (0.0 = raw, 1.0 = heavily smoothed)
    raku_xr_eye_tracking_set_smoothing(0.3f);
} else {
    raku_log(1, "Eye tracking permission denied - using head gaze fallback");
}
```
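One plausible reading of the smoothing parameter is an exponential moving average between the previous smoothed sample and the raw sample, where 0.0 passes raw data through and 1.0 freezes on the old value. The engine's actual filter is not documented here; this is a standalone sketch under that assumption:

```c
/* Hypothetical interpretation of the gaze smoothing factor as an
 * exponential moving average. `smoothed` is updated in place. */
void smooth_gaze(float smoothed[3], const float raw[3], float smoothing) {
    for (int i = 0; i < 3; i++)
        smoothed[i] = smoothing * smoothed[i] + (1.0f - smoothing) * raw[i];
}
```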

Read gaze data in your update loop:

```c
// In your per-frame update callback
void on_update(float dt) {
    float gaze_origin[3], gaze_dir[3];
    raku_xr_eye_tracking_get_gaze(gaze_origin, gaze_dir);

    // Raycast from gaze to find what the user is looking at
    RakuRayHit hit;
    if (raku_physics_raycast(gaze_origin, gaze_dir, 100.0f, &hit)) {
        // Highlight the object the user is looking at
        raku_scene_set_outline(hit.tag, true);
    }

    // Check if user is fixating (staring at one spot)
    bool fixating = raku_xr_eye_tracking_is_fixating();
    if (fixating) {
        // Trigger dwell-based selection after 1.5 seconds
        raku_xr_eye_tracking_dwell_select(1.5f, on_dwell_select);
    }
}
```
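If you ever need to implement dwell selection yourself rather than rely on raku_xr_eye_tracking_dwell_select, the underlying logic is just a timer that accumulates fixation time each frame, fires once when it crosses the threshold, and resets when the gaze moves. A standalone sketch with illustrative names, not engine API:

```c
/* Minimal dwell-selection timer. */
typedef struct {
    float elapsed;  /* seconds of continuous fixation */
    int   fired;    /* whether we already fired for this fixation */
} DwellTimer;

/* Returns 1 on the single frame the dwell threshold is crossed. */
int dwell_update(DwellTimer *t, int fixating, float dt, float threshold) {
    if (!fixating) {      /* gaze moved: reset the timer */
        t->elapsed = 0.0f;
        t->fired = 0;
        return 0;
    }
    t->elapsed += dt;
    if (!t->fired && t->elapsed >= threshold) {
        t->fired = 1;     /* fire exactly once per fixation */
        return 1;
    }
    return 0;
}
```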
Privacy: Eye tracking data never leaves the device. Raku processes all gaze data on-device and does not transmit it to any server. See the privacy policy for details.
XR FEEDBACK

Adding Haptic Feedback to Interactions

Haptic feedback makes XR interactions feel tangible. Raku supports controller vibration, hand haptics, and custom haptic patterns.

```c
// Simple vibration on the right controller (intensity 0.0-1.0, duration in seconds)
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 0.5f, 0.1f);

// Different intensities for different interactions
raku_xr_haptic_pulse(RAKU_HAND_LEFT, 0.2f, 0.05f);   // Light tap
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 0.8f, 0.3f);   // Strong buzz
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 1.0f, 0.5f);   // Impact
```

Create custom haptic patterns for richer feedback:

```c
// Define a haptic pattern: pairs of (intensity, duration)
RakuHapticSegment pattern[] = {
    { 0.3f, 0.05f },  // light
    { 0.0f, 0.03f },  // pause
    { 0.6f, 0.05f },  // medium
    { 0.0f, 0.03f },  // pause
    { 1.0f, 0.1f },   // strong
};

// Play the pattern on the right hand
raku_xr_haptic_play_pattern(RAKU_HAND_RIGHT, pattern, 5);

// Register a named haptic effect for reuse
raku_xr_haptic_register_effect("grab_object", pattern, 5);

// Later, play by name
raku_xr_haptic_play_effect(RAKU_HAND_RIGHT, "grab_object");
```
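A pattern's total play time is just the sum of its segment durations, pauses included, which is handy if you want to schedule something after the effect ends. A standalone helper (illustrative, not part of the engine API), assuming the (intensity, duration) segment layout shown above:

```c
/* Mirrors the tutorial's segment layout: (intensity, duration). */
typedef struct { float intensity; float duration; } RakuHapticSegment;

/* Total play time of a pattern in seconds, pauses included. */
float pattern_duration(const RakuHapticSegment *segs, int count) {
    float total = 0.0f;
    for (int i = 0; i < count; i++)
        total += segs[i].duration;
    return total;
}
```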
MIXED REALITY

Enabling Passthrough / Mixed Reality Mode

Passthrough lets users see the real world through their headset cameras, with virtual content overlaid on top. This is the foundation of mixed reality in Raku.

```c
// Switch to mixed reality mode (enables passthrough)
raku_xr_set_mode("mr");

// Configure passthrough blend (0.0 = fully virtual, 1.0 = fully real)
raku_xr_passthrough_set_opacity(1.0f);

// Set the background to passthrough (transparent clear color)
float clear_color[] = { 0.0f, 0.0f, 0.0f, 0.0f };
raku_renderer_set_clear_color(clear_color);
```
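Conceptually, the opacity value drives a per-pixel blend between the rendered frame and the camera feed; the compositor does this on-device. A purely illustrative single-channel sketch of that mix (not what the engine exposes, just the arithmetic behind the 0.0-to-1.0 scale):

```c
/* Linear blend of one color channel: 0.0 keeps the virtual value,
 * 1.0 keeps the real (camera) value. */
float blend_channel(float virtual_c, float real_c, float opacity) {
    return (1.0f - opacity) * virtual_c + opacity * real_c;
}
```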

You can also create a passthrough window or portal rather than full passthrough:

```c
// Create a circular passthrough portal at a world position
raku_xr_passthrough_create_portal(
    0.0f, 1.5f, -2.0f,  // position (x, y, z)
    1.0f,               // radius in meters
    "portal_1"          // tag for later reference
);

// Toggle passthrough on/off at runtime
raku_xr_passthrough_set_enabled(true);
```
Compatibility: Passthrough requires a headset with camera passthrough capability (Meta Quest 3, Quest Pro, Pico 4, etc.). On devices without passthrough, raku_xr_set_mode("mr") falls back to VR mode gracefully.
STEP-BY-STEP

Your First Hand-Tracked Interaction

Let's put it all together and build a complete hand-tracked interaction: picking up a virtual cube with your hand, getting haptic feedback, and placing it on a surface.

Step 1: Initialize everything

```c
raku_init();
raku_xr_init("vr");
raku_renderer_create_window(1280, 720, "Hand Tracking Demo", true);
raku_renderer_create_camera("main", 90.0f, 0.01f, 100.0f);

// Set up physics for grabbable objects
float gravity[] = { 0.0f, -9.81f, 0.0f };
raku_physics_create_world(gravity);

// Enable hand tracking
raku_xr_hand_tracking_enable(true);
```

Step 2: Create a grabbable cube

```c
// Create a scene and add a cube
raku_scene_create("main_scene");
raku_scene_add_model("cube.glb", 0.0f, 1.0f, -0.5f);

// Make it a rigid body so it can be grabbed and dropped
raku_physics_add_rigidbody("cube.glb", 0.5f);
raku_physics_add_collider("cube.glb", "box");

// Add a floor so the cube doesn't fall forever
raku_scene_add_model("floor.glb", 0.0f, 0.0f, 0.0f);
raku_physics_add_static_collider("floor.glb", "box");
```

Step 3: Register the grab gesture

```c
bool is_grabbing = false;
const char* grabbed_object = NULL;

void on_grab(RakuHandEvent* event) {
    if (event->phase == RAKU_GESTURE_BEGIN) {
        // Raycast downward from the palm to find an object within reach
        float palm_pos[3];
        raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_PALM, palm_pos);
        RakuRayHit hit;
        float down[] = { 0, -1, 0 };
        if (raku_physics_raycast(palm_pos, down, 0.15f, &hit)) {
            grabbed_object = hit.tag;
            is_grabbing = true;
            // Make object kinematic while held
            raku_physics_set_kinematic(grabbed_object, true);
            // Haptic feedback on grab
            raku_xr_haptic_pulse(event->hand, 0.6f, 0.1f);
        }
    } else if (event->phase == RAKU_GESTURE_END && is_grabbing) {
        // Release the object and hand it back to the physics simulation
        raku_physics_set_kinematic(grabbed_object, false);
        // Light haptic tap on release
        raku_xr_haptic_pulse(event->hand, 0.2f, 0.05f);
        is_grabbing = false;
        grabbed_object = NULL;
    }
}

raku_xr_hand_gesture_register("grab", on_grab);
```

Step 4: Update grabbed object position each frame

```c
void on_update(float dt) {
    if (is_grabbing && grabbed_object) {
        // Move the object to follow the hand
        float palm_pos[3];
        raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_PALM, palm_pos);
        raku_physics_set_position(grabbed_object, palm_pos[0], palm_pos[1], palm_pos[2]);
    }
}
```

Step 5: Start the game loop

```c
// Register the update callback and start
raku_scene_set_update_callback(on_update);
raku_scene_start_loop();

// Cleanup on exit
raku_xr_shutdown();
raku_shutdown();
```
Next steps: Now that you have basic hand tracking working, explore the Mixed Reality tutorial to place objects on real surfaces, or the Physics tutorial to add more realistic object behavior.

Ready for More?

Explore the full XR API reference or check out more tutorials on physics, AI, and spatial audio.