SETUP
Prerequisites and Setup
Before you begin building XR experiences with Raku, make sure your development environment is ready.
What you need:
- Raku Runtime — Download the latest release (19 MB)
- An XR headset — Any OpenXR 1.1 compatible device (Meta Quest 3, Quest Pro, Pico 4, HTC Vive XR Elite, etc.)
- C compiler — MSVC, GCC, or Clang for building your game
- OpenXR runtime — Installed on your target device (ships with most headsets)
Initialize the engine and the XR subsystem:
raku_init();                                                  /* core engine */
raku_xr_init("vr");                                           /* bring up the XR subsystem in VR mode */
raku_renderer_create_window(1280, 720, "My XR App", true);    /* desktop mirror window */
raku_renderer_create_camera("main", 90.0f, 0.01f, 1000.0f);   /* FOV, near plane, far plane */
Note: raku_xr_init() is a privileged call. The engine will prompt the user for confirmation before activating XR hardware. This is a safety measure defined in the API manifest.
XR INPUT
Enabling Hand Tracking and Configuring Gestures
Raku supports articulated hand tracking through OpenXR. Once enabled, you get access to 26 joint positions per hand, plus built-in gesture recognition.
raku_xr_hand_tracking_enable(true);

/* Built-in gestures invoke your callbacks as they are detected */
raku_xr_hand_gesture_register("pinch", on_pinch);
raku_xr_hand_gesture_register("grab", on_grab);
raku_xr_hand_gesture_register("point", on_point);
raku_xr_hand_gesture_register("open_palm", on_open_palm);

/* Higher thresholds require a more deliberate gesture before the callback fires */
raku_xr_hand_gesture_set_threshold("pinch", 0.7f);
raku_xr_hand_gesture_set_threshold("grab", 0.8f);
You can also query individual joint positions for custom gesture logic:
float pos[3];
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_INDEX_TIP, pos);

For example, a hand-rolled pinch check that compares the thumb and index fingertip positions:

float thumb[3], index_tip[3];
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_THUMB_TIP, thumb);
raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_INDEX_TIP, index_tip);

float dist = raku_math_distance_3d(thumb, index_tip);
if (dist < 0.02f) {   /* fingertips within 2 cm */
    raku_log(0, "Custom pinch detected");
}
XR INPUT
Setting Up Eye Tracking with Permissions
Eye tracking requires explicit user permission on most headsets. Raku handles the permission flow and gives you gaze direction and fixation data.
bool granted = raku_xr_eye_tracking_request_permission();
if (granted) {
    raku_xr_eye_tracking_enable(true);
    raku_xr_eye_tracking_set_smoothing(0.3f);   /* smooth out gaze jitter */
} else {
    raku_log(1, "Eye tracking permission denied - using head gaze fallback");
}
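The smoothing value likely acts as a per-frame blend factor. As a mental model (an assumption, not the engine's documented behavior), it behaves like exponential smoothing of the gaze vector:

```c
/* Exponential smoothing: each frame, blend `alpha` of the new sample
   into the running state. Lower alpha = smoother but laggier gaze. */
static void smooth_vec3(float state[3], const float sample[3], float alpha) {
    for (int i = 0; i < 3; i++) {
        state[i] += alpha * (sample[i] - state[i]);
    }
}
```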
Read gaze data in your update loop:
void on_update(float dt) {
    float gaze_origin[3], gaze_dir[3];
    raku_xr_eye_tracking_get_gaze(gaze_origin, gaze_dir);

    /* Highlight whatever the user is looking at */
    RakuRayHit hit;
    if (raku_physics_raycast(gaze_origin, gaze_dir, 100.0f, &hit)) {
        raku_scene_set_outline(hit.tag, true);
    }

    /* Trigger a selection after 1.5 s of steady fixation */
    bool fixating = raku_xr_eye_tracking_is_fixating();
    if (fixating) {
        raku_xr_eye_tracking_dwell_select(1.5f, on_dwell_select);
    }
}
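Dwell selection boils down to accumulating fixation time and firing once a threshold is crossed. raku_xr_eye_tracking_dwell_select presumably maintains similar state internally; this standalone sketch shows the same logic and is an illustration, not the engine's implementation:

```c
typedef struct {
    float elapsed;   /* seconds spent fixating so far */
    int   fired;     /* nonzero once the selection has triggered */
} DwellTimer;

/* Returns 1 on the frame the dwell threshold is crossed; 0 otherwise.
   Looking away resets the timer so the next fixation starts fresh. */
static int dwell_update(DwellTimer *t, int fixating, float dt, float threshold) {
    if (!fixating) {
        t->elapsed = 0.0f;
        t->fired = 0;
        return 0;
    }
    t->elapsed += dt;
    if (!t->fired && t->elapsed >= threshold) {
        t->fired = 1;
        return 1;
    }
    return 0;
}
```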
Privacy: Eye tracking data never leaves the device. Raku processes all gaze data on-device and does not transmit it to any server. See the privacy policy for details.
XR FEEDBACK
Adding Haptic Feedback to Interactions
Haptic feedback makes XR interactions feel tangible. Raku supports controller vibration, hand haptics, and custom haptic patterns.
/* Arguments: hand, amplitude (0.0-1.0), duration in seconds */
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 0.5f, 0.1f);    /* medium tap */
raku_xr_haptic_pulse(RAKU_HAND_LEFT, 0.2f, 0.05f);    /* subtle tick */
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 0.8f, 0.3f);    /* strong buzz */
raku_xr_haptic_pulse(RAKU_HAND_RIGHT, 1.0f, 0.5f);    /* maximum rumble */
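A common pattern is to scale pulse amplitude with the physical event that caused it, such as impact speed. A standalone sketch of that mapping (the 3 m/s ceiling is an arbitrary illustrative choice, not an engine constant):

```c
/* Map an impact speed (m/s) to a haptic amplitude in [0, 1].
   Speeds at or above 3 m/s clamp to full strength. */
static float impact_amplitude(float speed) {
    float a = speed / 3.0f;
    if (a < 0.0f) a = 0.0f;
    if (a > 1.0f) a = 1.0f;
    return a;
}
```

You would then pass the result as the amplitude argument of raku_xr_haptic_pulse on collision events.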
Create custom haptic patterns for richer feedback:
/* Each segment is { amplitude, duration }; zero-amplitude segments are gaps */
RakuHapticSegment pattern[] = {
    { 0.3f, 0.05f },
    { 0.0f, 0.03f },
    { 0.6f, 0.05f },
    { 0.0f, 0.03f },
    { 1.0f, 0.1f },
};

/* Play the pattern once, or register it as a named effect for reuse */
raku_xr_haptic_play_pattern(RAKU_HAND_RIGHT, pattern, 5);
raku_xr_haptic_register_effect("grab_object", pattern, 5);
raku_xr_haptic_play_effect(RAKU_HAND_RIGHT, "grab_object");
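Patterns can also be generated programmatically, for example a rising ramp. The { amplitude, duration } field layout below is inferred from the literal above, not taken from engine headers:

```c
typedef struct {
    float amplitude;   /* 0.0 - 1.0 */
    float duration;    /* seconds */
} RakuHapticSegment;

/* Fill `out` with `count` segments ramping linearly up to full strength */
static void build_ramp(RakuHapticSegment *out, int count, float seg_duration) {
    for (int i = 0; i < count; i++) {
        out[i].amplitude = (float)(i + 1) / (float)count;
        out[i].duration = seg_duration;
    }
}
```

The resulting array can be handed to raku_xr_haptic_play_pattern just like a hand-written one.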
MIXED REALITY
Enabling Passthrough / Mixed Reality Mode
Passthrough lets users see the real world through their headset cameras, with virtual content overlaid on top. This is the foundation of mixed reality in Raku.
raku_xr_set_mode("mr");                  /* switch from VR to mixed reality */
raku_xr_passthrough_set_opacity(1.0f);   /* camera feed at full visibility */

/* A fully transparent clear color lets passthrough show through the scene */
float clear_color[] = { 0.0f, 0.0f, 0.0f, 0.0f };
raku_renderer_set_clear_color(clear_color);
You can also create a passthrough window or portal rather than full passthrough:
raku_xr_passthrough_create_portal(
    0.0f, 1.5f, -2.0f,   /* position: 1.5 m up, 2 m in front of the origin */
    1.0f,                /* portal size */
    "portal_1"           /* identifier for later reference */
);
raku_xr_passthrough_set_enabled(true);
Compatibility: Passthrough requires a headset with camera passthrough capability (Meta Quest 3, Quest Pro, Pico 4, etc.). On devices without passthrough, raku_xr_set_mode("mr") falls back to VR mode gracefully.
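If you prefer to branch explicitly rather than rely on the automatic fallback, the decision reduces to a one-liner. The has_passthrough flag here is a hypothetical input; how you detect the capability is device specific and not shown:

```c
/* Pick the mode string for raku_xr_set_mode() based on device capability */
static const char *choose_xr_mode(int has_passthrough) {
    return has_passthrough ? "mr" : "vr";
}
```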
STEP-BY-STEP
Your First Hand-Tracked Interaction
Let's put it all together and build a complete hand-tracked interaction: picking up a virtual cube with your hand, getting haptic feedback, and placing it on a surface.
Step 1: Initialize everything
raku_init();
raku_xr_init("vr");
raku_renderer_create_window(1280, 720, "Hand Tracking Demo", true);
raku_renderer_create_camera("main", 90.0f, 0.01f, 100.0f);

/* Physics world with standard Earth gravity */
float gravity[] = { 0.0f, -9.81f, 0.0f };
raku_physics_create_world(gravity);

raku_xr_hand_tracking_enable(true);
Step 2: Create a grabbable cube
raku_scene_create("main_scene");

/* A dynamic cube at waist height, half a meter in front of the user */
raku_scene_add_model("cube.glb", 0.0f, 1.0f, -0.5f);
raku_physics_add_rigidbody("cube.glb", 0.5f);
raku_physics_add_collider("cube.glb", "box");

/* A static floor so the cube has somewhere to land */
raku_scene_add_model("floor.glb", 0.0f, 0.0f, 0.0f);
raku_physics_add_static_collider("floor.glb", "box");
Step 3: Register the grab gesture
static bool is_grabbing = false;
static const char* grabbed_object = NULL;

void on_grab(RakuHandEvent* event) {
    if (event->phase == RAKU_GESTURE_BEGIN) {
        /* Cast a short ray down from the palm to find a grabbable object */
        float palm_pos[3];
        raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_PALM, palm_pos);

        RakuRayHit hit;
        float down[] = { 0.0f, -1.0f, 0.0f };
        if (raku_physics_raycast(palm_pos, down, 0.15f, &hit)) {
            grabbed_object = hit.tag;
            is_grabbing = true;
            raku_physics_set_kinematic(grabbed_object, true);   /* let the hand drive it */
            raku_xr_haptic_pulse(event->hand, 0.6f, 0.1f);      /* confirm the grab */
        }
    } else if (event->phase == RAKU_GESTURE_END && is_grabbing) {
        raku_physics_set_kinematic(grabbed_object, false);      /* hand it back to physics */
        raku_xr_haptic_pulse(event->hand, 0.2f, 0.05f);         /* soft release tick */
        is_grabbing = false;
        grabbed_object = NULL;
    }
}

raku_xr_hand_gesture_register("grab", on_grab);
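The downward ray-cast above is one way to detect what's under the hand; another common approach is a plain proximity test against nearby objects. A self-contained sketch (the object positions are illustrative stand-ins for a real scene query, which the engine would supply):

```c
/* Return the index of the closest object within `range` meters of the
   palm, or -1 if none qualifies. Compares squared distances to avoid
   a square root per object. */
static int nearest_within(const float palm[3], float positions[][3],
                          int count, float range) {
    int best = -1;
    float best_d2 = range * range;
    for (int i = 0; i < count; i++) {
        float dx = positions[i][0] - palm[0];
        float dy = positions[i][1] - palm[1];
        float dz = positions[i][2] - palm[2];
        float d2 = dx * dx + dy * dy + dz * dz;
        if (d2 <= best_d2) {
            best = i;
            best_d2 = d2;
        }
    }
    return best;
}
```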
Step 4: Update grabbed object position each frame
void on_update(float dt) {
    if (is_grabbing && grabbed_object) {
        /* Keep the cube attached to the palm. The right hand is assumed
           here; store event->hand in on_grab to support either hand. */
        float palm_pos[3];
        raku_xr_hand_get_joint_position(RAKU_HAND_RIGHT, RAKU_JOINT_PALM, palm_pos);
        raku_physics_set_position(grabbed_object, palm_pos[0], palm_pos[1], palm_pos[2]);
    }
}
Step 5: Start the game loop
raku_scene_set_update_callback(on_update);
raku_scene_start_loop();   /* blocks until the app exits */

/* Clean up after the loop returns */
raku_xr_shutdown();
raku_shutdown();
Next steps: Now that you have basic hand tracking working, explore the Mixed Reality tutorial to place objects on real surfaces, or the Physics tutorial to add more realistic object behavior.