CONCEPTS
Understanding MR vs VR vs Passthrough
Raku supports three XR modes through OpenXR, each serving different use cases:
- VR (Virtual Reality) — Fully immersive, opaque background. The user sees only your virtual world. Use for games, simulations, and cinematic experiences.
- MR (Mixed Reality) — Camera passthrough with virtual objects overlaid on the real world. Uses scene understanding to anchor content to real surfaces.
- Passthrough — Camera feed displayed as the background, but without scene understanding. Lighter weight than full MR. Good for HUDs and simple overlays.
raku_xr_init("vr");                      /* start fully immersive VR */
raku_xr_init("mr");                      /* or start in mixed reality */
raku_xr_set_mode("mr");                  /* switch modes at runtime */
const char* mode = raku_xr_get_mode();   /* query the active mode */
raku_log(0, "Current XR mode: %s", mode);
SPATIAL AWARENESS
Scanning the Room with Scene Understanding
Scene understanding uses the headset's depth sensors to build a 3D model of the real world. This gives you planes (floors, walls, tables), meshes, and semantic labels.
raku_xr_scene_understanding_start();
raku_xr_scene_understanding_set_detect_planes(true);    /* floors, walls, tables */
raku_xr_scene_understanding_set_detect_meshes(true);    /* full room geometry */
raku_xr_scene_understanding_set_detect_semantic(true);  /* labels like "table", "wall" */
raku_xr_scene_understanding_set_callback(on_scene_ready);  /* fires when the scan is ready */
Query detected surfaces and objects:
void on_scene_ready() {
    /* Enumerate every detected plane. */
    RakuScenePlane planes[64];
    int count = raku_xr_scene_get_planes(planes, 64);
    for (int i = 0; i < count; i++) {
        raku_log(0, "Plane %d: type=%s, size=%.1fx%.1fm, pos=(%.1f, %.1f, %.1f)",
                 i, planes[i].semantic_label,
                 planes[i].width, planes[i].height,
                 planes[i].position[0], planes[i].position[1], planes[i].position[2]);
    }

    /* Look up a specific surface by its semantic label. */
    RakuScenePlane table;
    if (raku_xr_scene_find_plane_by_label("table", &table)) {
        raku_log(0, "Found a table at (%.1f, %.1f, %.1f)",
                 table.position[0], table.position[1], table.position[2]);
    }

    /* Import the scanned room mesh and register it as a static physics collider,
       so virtual objects can collide with real walls and furniture. */
    raku_xr_scene_get_mesh("room_mesh");
    raku_physics_add_static_collider("room_mesh", "mesh");
}
PLACEMENT
Placing Virtual Objects on Real Surfaces
Once you have detected planes, you can place virtual objects on them. Use hit-testing to find the exact position on a surface where the user is pointing or looking.
/* Cast a ray from the right controller and place a model where it hits. */
float ray_origin[3], ray_dir[3];
raku_xr_get_controller_ray(RAKU_HAND_RIGHT, ray_origin, ray_dir);

RakuSceneHit hit;
if (raku_xr_scene_hit_test(ray_origin, ray_dir, &hit)) {
    raku_scene_add_model("vase.glb",
                         hit.position[0], hit.position[1], hit.position[2]);
    /* Align the model with the surface it landed on. */
    raku_scene_set_up_vector("vase.glb",
                             hit.normal[0], hit.normal[1], hit.normal[2]);
    raku_log(0, "Placed vase on %s", hit.plane_label);
}
Constrain placement to specific surface types:
/* Horizontal surfaces only (floors, tables) — good for standing objects. */
if (raku_xr_scene_hit_test_filtered(ray_origin, ray_dir, "horizontal", &hit)) {
    raku_scene_add_model("pet.glb",
                         hit.position[0], hit.position[1], hit.position[2]);
}

/* Vertical surfaces only (walls) — good for hanging objects. */
if (raku_xr_scene_hit_test_filtered(ray_origin, ray_dir, "vertical", &hit)) {
    raku_scene_add_model("painting.glb",
                         hit.position[0], hit.position[1], hit.position[2]);
}
/* Snap an arbitrary point to the nearest detected surface within 0.5 m. */
float pos[] = { 0.0f, 1.0f, -1.0f };
RakuSceneHit snap;
if (raku_xr_scene_snap_to_surface(pos, 0.5f, &snap)) {
    /* snap.position now holds the closest point on a detected surface. */
}
EFFECTS
Creating Passthrough Portals
Passthrough portals are windows into the real world placed inside a virtual environment (or vice versa). They create a dramatic "looking through a window" effect.
raku_xr_init("vr");

/* Circular portal: position (x, y, z), size, and a name for later control. */
raku_xr_passthrough_create_portal(
    0.0f, 1.5f, -3.0f,
    0.8f,
    "wall_portal"
);

/* Rectangular portal: position, then width and height in meters. */
raku_xr_passthrough_create_rect_portal(
    3.0f, 1.5f, -3.0f,
    1.2f, 2.0f,
    "door_portal"
);
Animate and control portals:
bool opening = false;        /* set to true when the portal should open */
float portal_open = 0.0f;    /* 0 = closed, 1 = fully open */

void on_update(float dt) {
    if (opening) {
        portal_open += dt * 0.5f;   /* fully open in 2 seconds */
        if (portal_open > 1.0f) portal_open = 1.0f;
        raku_xr_passthrough_set_portal_scale("wall_portal", portal_open);
    }
}
raku_xr_passthrough_set_portal_edge_softness("wall_portal", 0.05f);  /* feather the rim */
raku_xr_passthrough_set_brightness(1.1f);   /* brighten the camera feed slightly */
raku_xr_passthrough_set_contrast(1.2f);
raku_xr_passthrough_set_saturation(0.8f);   /* mute the colors slightly */
Creative use: In VR, place portals on dungeon walls to let players peek at the real world. In MR, use "inverse portals" to create virtual windows into fantasy worlds on your real walls.
PERSISTENCE
Using Spatial Anchors for Persistent Placement
Spatial anchors lock virtual objects to a physical location so they stay in place across sessions. When the user restarts the app, anchored objects reappear exactly where they were left.
/* Create an anchor at a world position, attach a model, and persist it. */
handle anchor = raku_anchor_create(1.2f, 0.75f, -0.8f);
raku_anchor_attach(anchor, "pet.glb");
raku_anchor_save(anchor, "my_pet_anchor");
Restore anchors when the app launches again:
handle anchor = raku_anchor_load("my_pet_anchor");
if (anchor != RAKU_INVALID_HANDLE) {
    float pos[3];
    raku_anchor_get_position(anchor, pos);
    raku_log(0, "Pet anchor restored at (%.2f, %.2f, %.2f)",
             pos[0], pos[1], pos[2]);
    raku_scene_add_model("pet.glb", pos[0], pos[1], pos[2]);
    raku_anchor_attach(anchor, "pet.glb");
} else {
    raku_log(1, "No saved anchor found, place pet manually");
}
/* Enumerate and clean up saved anchors. */
char anchor_names[32][64];
int count = raku_anchor_list_saved(anchor_names, 32);
for (int i = 0; i < count; i++) {
    raku_log(0, "Saved anchor: %s", anchor_names[i]);
}
raku_anchor_delete_saved("my_pet_anchor");
MULTIPLAYER MR
Cloud Anchors for Shared MR Experiences
Cloud anchors allow multiple headsets to see the same virtual objects in the same physical locations. User A places an object, and User B sees it in the exact same spot in the real room.
handle anchor = raku_anchor_create(1.2f, 0.75f, -0.8f);
raku_anchor_cloud_share(anchor, on_cloud_share_complete);

void on_cloud_share_complete(const char* cloud_id, bool success) {
    if (success) {
        /* Send cloud_id to the other devices, e.g. over the network. */
        raku_log(0, "Anchor shared! Cloud ID: %s", cloud_id);
    }
}
Resolve a cloud anchor on another device:
raku_anchor_cloud_resolve("cloud-anchor-abc123", on_cloud_resolve);

void on_cloud_resolve(handle anchor, bool success) {
    if (success) {
        float pos[3];
        raku_anchor_get_position(anchor, pos);
        raku_scene_add_model("shared_object.glb",
                             pos[0], pos[1], pos[2]);
        raku_anchor_attach(anchor, "shared_object.glb");
        raku_log(0, "Shared anchor resolved at (%.2f, %.2f, %.2f)",
                 pos[0], pos[1], pos[2]);
    } else {
        raku_log(1, "Failed to resolve cloud anchor - ensure both devices see the same area");
    }
}
Requirements: Cloud anchors require network access and the Raku Network subsystem. Both devices must be able to see some overlapping physical features (walls, furniture) for the spatial alignment to work.
COMPLETE EXAMPLE
Complete Example: Virtual Pet on Your Real Desk
A mixed reality app where a virtual pet sits on your real desk. The pet responds to hand gestures, is anchored persistently, and can be shared with others via cloud anchors.
/* Core setup: MR mode with a fully transparent clear color so the
   passthrough feed shows through behind virtual content. */
raku_init();
raku_xr_init("mr");
raku_renderer_create_window(1280, 720, "MR Pet", true);
raku_renderer_create_camera("main", 90.0f, 0.01f, 100.0f);
float clear[] = { 0, 0, 0, 0 };
raku_renderer_set_clear_color(clear);

float gravity[] = { 0, -9.81f, 0 };
raku_physics_create_world(gravity);

raku_audio_init(16, true);
raku_audio_load("sounds/purr.wav", "purr");
raku_audio_load("sounds/meow.wav", "meow");
raku_audio_load("sounds/happy.wav", "happy");

raku_xr_hand_tracking_enable(true);
raku_scene_create("pet_world");

/* Scan the room for labeled surfaces. */
raku_xr_scene_understanding_start();
raku_xr_scene_understanding_set_detect_planes(true);
raku_xr_scene_understanding_set_detect_semantic(true);

/* Restore the pet from a previous session, if an anchor was saved. */
bool pet_placed = false;
handle pet_anchor = raku_anchor_load("pet_desk_anchor");
if (pet_anchor != RAKU_INVALID_HANDLE) {
    float pos[3];
    raku_anchor_get_position(pet_anchor, pos);
    raku_scene_add_model("cat.glb", pos[0], pos[1], pos[2]);
    raku_anchor_attach(pet_anchor, "cat.glb");
    raku_animation_load("cat.glb", "anims/idle.glb", "idle");
    raku_animation_play("idle", true);
    pet_placed = true;
    raku_log(0, "Pet restored from saved anchor");
}
/* Pinch over a horizontal surface to place the pet for the first time. */
void on_pinch(RakuHandEvent* event) {
    if (pet_placed) return;
    if (event->phase != RAKU_GESTURE_BEGIN) return;

    /* Cast a ray straight down from the pinching fingertip. */
    float origin[3], dir[3];
    raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_INDEX_TIP, origin);
    dir[0] = 0; dir[1] = -1; dir[2] = 0;

    RakuSceneHit hit;
    if (raku_xr_scene_hit_test_filtered(origin, dir, "horizontal", &hit)) {
        raku_scene_add_model("cat.glb",
                             hit.position[0], hit.position[1], hit.position[2]);

        /* Anchor and persist so the pet survives app restarts. */
        pet_anchor = raku_anchor_create(
            hit.position[0], hit.position[1], hit.position[2]);
        raku_anchor_attach(pet_anchor, "cat.glb");
        raku_anchor_save(pet_anchor, "pet_desk_anchor");

        raku_animation_load("cat.glb", "anims/idle.glb", "idle");
        raku_animation_play("idle", true);
        raku_audio_play_spatial("meow",
                                hit.position[0], hit.position[1], hit.position[2]);
        pet_placed = true;
        raku_xr_haptic_pulse(event->hand, 0.4f, 0.1f);
    }
}
raku_xr_hand_gesture_register("pinch", on_pinch);
/* Open your palm near the pet to pet it. */
void on_open_palm(RakuHandEvent* event) {
    if (!pet_placed) return;

    float hand_pos[3], pet_pos[3];
    raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_PALM, hand_pos);
    raku_anchor_get_position(pet_anchor, pet_pos);

    float dist = raku_math_distance_3d(hand_pos, pet_pos);
    if (dist < 0.3f) {   /* within 30 cm of the pet */
        raku_animation_load("cat.glb", "anims/happy.glb", "happy");
        raku_animation_play("happy", false);
        raku_audio_play_spatial("purr", pet_pos[0], pet_pos[1], pet_pos[2]);
        raku_xr_haptic_pulse(event->hand, 0.15f, 0.3f);
    }
}
raku_xr_hand_gesture_register("open_palm", on_open_palm);
raku_scene_start_loop();
raku_xr_shutdown();
raku_shutdown();
Next steps: Add the AI system to give your pet autonomous behavior, or use spatial audio to make the purring sound come from the pet's exact position.