ADVANCED TUTORIAL

Building Mixed Reality Experiences

Scene understanding, surface placement, passthrough portals, spatial anchors, and cloud-shared MR. Build a virtual pet on your real desk.


CONCEPTS

Understanding MR vs VR vs Passthrough

Raku supports three XR modes through OpenXR, each serving different use cases:

  • VR (Virtual Reality) — Fully immersive, opaque background. The user sees only your virtual world. Use for games, simulations, and cinematic experiences.
  • MR (Mixed Reality) — Camera passthrough with virtual objects overlaid on the real world. Uses scene understanding to anchor content to real surfaces.
  • Passthrough — Camera feed displayed as the background, but without scene understanding. Lighter weight than full MR. Good for HUDs and simple overlays.
// Initialize in different modes
raku_xr_init("vr");   // Full VR, opaque background
raku_xr_init("mr");   // Mixed reality with scene understanding

// Switch modes at runtime
raku_xr_set_mode("mr");

// Check current mode
const char* mode = raku_xr_get_mode();
raku_log(0, "Current XR mode: %s", mode);
SPATIAL AWARENESS

Scanning the Room with Scene Understanding

Scene understanding uses the headset's depth sensors to build a 3D model of the real world. This gives you planes (floors, walls, tables), meshes, and semantic labels.

// Start scene understanding (requires MR mode)
raku_xr_scene_understanding_start();

// Configure what to detect
raku_xr_scene_understanding_set_detect_planes(true);
raku_xr_scene_understanding_set_detect_meshes(true);
raku_xr_scene_understanding_set_detect_semantic(true);

// Get notified when the initial scan completes
raku_xr_scene_understanding_set_callback(on_scene_ready);

Query detected surfaces and objects:

void on_scene_ready() {
    // Get all detected planes
    RakuScenePlane planes[64];
    int count = raku_xr_scene_get_planes(planes, 64);
    for (int i = 0; i < count; i++) {
        raku_log(0, "Plane %d: type=%s, size=%.1fx%.1fm, pos=(%.1f, %.1f, %.1f)",
                 i,
                 planes[i].semantic_label,  // "floor", "wall", "table", "ceiling"
                 planes[i].width, planes[i].height,
                 planes[i].position[0], planes[i].position[1], planes[i].position[2]);
    }

    // Find a specific surface type
    RakuScenePlane table;
    if (raku_xr_scene_find_plane_by_label("table", &table)) {
        raku_log(0, "Found a table at (%.1f, %.1f, %.1f)",
                 table.position[0], table.position[1], table.position[2]);
    }

    // Get the room mesh for physics collision
    raku_xr_scene_get_mesh("room_mesh");
    raku_physics_add_static_collider("room_mesh", "mesh");
}
PLACEMENT

Placing Virtual Objects on Real Surfaces

Once you have detected planes, you can place virtual objects on them. Use hit-testing to find the exact position on a surface where the user is pointing or looking.

// Hit-test against real-world surfaces using a ray
float ray_origin[3], ray_dir[3];
raku_xr_get_controller_ray(RAKU_HAND_RIGHT, ray_origin, ray_dir);

RakuSceneHit hit;
if (raku_xr_scene_hit_test(ray_origin, ray_dir, &hit)) {
    // Place a virtual object at the hit point
    raku_scene_add_model("vase.glb", hit.position[0], hit.position[1], hit.position[2]);

    // Align the object with the surface normal
    raku_scene_set_up_vector("vase.glb", hit.normal[0], hit.normal[1], hit.normal[2]);

    raku_log(0, "Placed vase on %s", hit.plane_label);
}

Constrain placement to specific surface types:

// Only place on horizontal surfaces (tables, floors)
if (raku_xr_scene_hit_test_filtered(ray_origin, ray_dir, "horizontal", &hit)) {
    raku_scene_add_model("pet.glb", hit.position[0], hit.position[1], hit.position[2]);
}

// Only place on walls (vertical surfaces)
if (raku_xr_scene_hit_test_filtered(ray_origin, ray_dir, "vertical", &hit)) {
    raku_scene_add_model("painting.glb", hit.position[0], hit.position[1], hit.position[2]);
}

// Snap to the nearest detected surface within a radius
float pos[] = { 0.0f, 1.0f, -1.0f };
RakuSceneHit snap;
if (raku_xr_scene_snap_to_surface(pos, 0.5f, &snap)) {
    // Object snapped to the nearest surface within 0.5m
}
EFFECTS

Creating Passthrough Portals

Passthrough portals are windows into the real world placed inside a virtual environment (or vice versa). They create a dramatic "looking through a window" effect.

// In VR mode, create a portal that shows the real world
raku_xr_init("vr");

// Create a circular portal on a wall
raku_xr_passthrough_create_portal(
    0.0f, 1.5f, -3.0f,   // world position
    0.8f,                // radius (meters)
    "wall_portal"        // tag
);

// Create a rectangular portal (width, height)
raku_xr_passthrough_create_rect_portal(
    3.0f, 1.5f, -3.0f,   // position
    1.2f, 2.0f,          // width, height
    "door_portal"        // tag
);

Animate and control portals:

// Animate portal opening (0 = closed, 1 = fully open)
float portal_open = 0.0f;

void on_update(float dt) {
    if (opening) {
        portal_open += dt * 0.5f;  // open over 2 seconds
        if (portal_open > 1.0f) portal_open = 1.0f;
        raku_xr_passthrough_set_portal_scale("wall_portal", portal_open);
    }
}

// Set portal edge softness (feathering)
raku_xr_passthrough_set_portal_edge_softness("wall_portal", 0.05f);

// Apply color correction to the passthrough feed
raku_xr_passthrough_set_brightness(1.1f);
raku_xr_passthrough_set_contrast(1.2f);
raku_xr_passthrough_set_saturation(0.8f);  // desaturate for stylized look
Creative use: In VR, place portals on dungeon walls to let players peek at the real world. In MR, use "inverse portals" to create virtual windows into fantasy worlds on your real walls.
PERSISTENCE

Using Spatial Anchors for Persistent Placement

Spatial anchors lock virtual objects to a physical location so they stay in place across sessions. When the user restarts the app, anchored objects reappear exactly where they were left.

// Create a spatial anchor at a world position
handle anchor = raku_anchor_create(1.2f, 0.75f, -0.8f);

// Attach a virtual object to the anchor
raku_anchor_attach(anchor, "pet.glb");

// Save the anchor to persistent storage (survives app restarts)
raku_anchor_save(anchor, "my_pet_anchor");

Restore anchors when the app launches again:

// On app start, try to restore saved anchors
handle anchor = raku_anchor_load("my_pet_anchor");
if (anchor != RAKU_INVALID_HANDLE) {
    // Anchor restored! Get its current position
    float pos[3];
    raku_anchor_get_position(anchor, pos);
    raku_log(0, "Pet anchor restored at (%.2f, %.2f, %.2f)", pos[0], pos[1], pos[2]);

    // Re-attach the pet model
    raku_scene_add_model("pet.glb", pos[0], pos[1], pos[2]);
    raku_anchor_attach(anchor, "pet.glb");
} else {
    raku_log(1, "No saved anchor found, place pet manually");
}

// List all saved anchors
char anchor_names[32][64];
int count = raku_anchor_list_saved(anchor_names, 32);
for (int i = 0; i < count; i++) {
    raku_log(0, "Saved anchor: %s", anchor_names[i]);
}

// Delete a saved anchor
raku_anchor_delete_saved("my_pet_anchor");
MULTIPLAYER MR

Cloud Anchors for Shared MR Experiences

Cloud anchors allow multiple headsets to see the same virtual objects in the same physical locations. User A places an object, and User B sees it in the exact same spot in the real room.

// Host: share an anchor to the cloud
handle anchor = raku_anchor_create(1.2f, 0.75f, -0.8f);
raku_anchor_cloud_share(anchor, on_cloud_share_complete);

void on_cloud_share_complete(const char* cloud_id, bool success) {
    if (success) {
        raku_log(0, "Anchor shared! Cloud ID: %s", cloud_id);
        // Send cloud_id to other users via networking
    }
}

Resolve a cloud anchor on another device:

// Guest: resolve a cloud anchor by ID
raku_anchor_cloud_resolve("cloud-anchor-abc123", on_cloud_resolve);

void on_cloud_resolve(handle anchor, bool success) {
    if (success) {
        float pos[3];
        raku_anchor_get_position(anchor, pos);

        // Place the same virtual object at the shared location
        raku_scene_add_model("shared_object.glb", pos[0], pos[1], pos[2]);
        raku_anchor_attach(anchor, "shared_object.glb");
        raku_log(0, "Shared anchor resolved at (%.2f, %.2f, %.2f)", pos[0], pos[1], pos[2]);
    } else {
        raku_log(1, "Failed to resolve cloud anchor - ensure both devices see the same area");
    }
}
Requirements: Cloud anchors require network access and the Raku Network subsystem. Both devices must be able to see some overlapping physical features (walls, furniture) for the spatial alignment to work.
COMPLETE EXAMPLE

Complete Example: Virtual Pet on Your Real Desk

A mixed reality app where a virtual pet sits on your real desk. The pet responds to hand gestures, is anchored persistently, and can be shared with others via cloud anchors.

// === Virtual Pet on Your Desk ===
raku_init();
raku_xr_init("mr");  // Mixed reality mode
raku_renderer_create_window(1280, 720, "MR Pet", true);
raku_renderer_create_camera("main", 90.0f, 0.01f, 100.0f);

// Transparent background for MR
float clear[] = { 0, 0, 0, 0 };
raku_renderer_set_clear_color(clear);

// Physics (for pet walking on surfaces)
float gravity[] = { 0, -9.81f, 0 };
raku_physics_create_world(gravity);

// Audio (for pet sounds)
raku_audio_init(16, true);  // HRTF enabled for spatial audio
raku_audio_load("sounds/purr.wav", "purr");
raku_audio_load("sounds/meow.wav", "meow");
raku_audio_load("sounds/happy.wav", "happy");

// Hand tracking
raku_xr_hand_tracking_enable(true);

// Scene setup
raku_scene_create("pet_world");

// --- Step 1: Scan the room ---
raku_xr_scene_understanding_start();
raku_xr_scene_understanding_set_detect_planes(true);
raku_xr_scene_understanding_set_detect_semantic(true);

bool pet_placed = false;
handle pet_anchor = RAKU_INVALID_HANDLE;

// --- Step 2: Try to restore a saved anchor ---
pet_anchor = raku_anchor_load("pet_desk_anchor");
if (pet_anchor != RAKU_INVALID_HANDLE) {
    float pos[3];
    raku_anchor_get_position(pet_anchor, pos);
    raku_scene_add_model("cat.glb", pos[0], pos[1], pos[2]);
    raku_anchor_attach(pet_anchor, "cat.glb");
    raku_animation_load("cat.glb", "anims/idle.glb", "idle");
    raku_animation_play("idle", true);
    pet_placed = true;
    raku_log(0, "Pet restored from saved anchor");
}

// --- Step 3: Place the pet with a pinch gesture if not restored ---
void on_pinch(RakuHandEvent* event) {
    if (pet_placed) return;
    if (event->phase != RAKU_GESTURE_BEGIN) return;

    float origin[3], dir[3];
    raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_INDEX_TIP, origin);
    dir[0] = 0; dir[1] = -1; dir[2] = 0;  // point down

    RakuSceneHit hit;
    if (raku_xr_scene_hit_test_filtered(origin, dir, "horizontal", &hit)) {
        // Place the pet on the surface
        raku_scene_add_model("cat.glb", hit.position[0], hit.position[1], hit.position[2]);

        // Create and save a persistent anchor
        pet_anchor = raku_anchor_create(hit.position[0], hit.position[1], hit.position[2]);
        raku_anchor_attach(pet_anchor, "cat.glb");
        raku_anchor_save(pet_anchor, "pet_desk_anchor");

        // Start idle animation and greet the user
        raku_animation_load("cat.glb", "anims/idle.glb", "idle");
        raku_animation_play("idle", true);
        raku_audio_play_spatial("meow", hit.position[0], hit.position[1], hit.position[2]);

        pet_placed = true;
        raku_xr_haptic_pulse(event->hand, 0.4f, 0.1f);
    }
}
raku_xr_hand_gesture_register("pinch", on_pinch);

// --- Step 4: Pet reacts to being petted (open palm near it) ---
void on_open_palm(RakuHandEvent* event) {
    if (!pet_placed) return;

    float hand_pos[3], pet_pos[3];
    raku_xr_hand_get_joint_position(event->hand, RAKU_JOINT_PALM, hand_pos);
    raku_anchor_get_position(pet_anchor, pet_pos);

    float dist = raku_math_distance_3d(hand_pos, pet_pos);
    if (dist < 0.3f) {
        // Pet is being petted!
        raku_animation_load("cat.glb", "anims/happy.glb", "happy");
        raku_animation_play("happy", false);
        raku_audio_play_spatial("purr", pet_pos[0], pet_pos[1], pet_pos[2]);
        raku_xr_haptic_pulse(event->hand, 0.15f, 0.3f);
    }
}
raku_xr_hand_gesture_register("open_palm", on_open_palm);

// --- Step 5: Run ---
raku_scene_start_loop();
raku_xr_shutdown();
raku_shutdown();
Next steps: Add the AI system to give your pet autonomous behavior, or build on the spatial audio so the purring and meowing follow the pet wherever it wanders.

Ready for More?

Explore the full XR API (151 functions) in the documentation, or continue with another tutorial.