The RakuAI platform exposes 9,500+ REST endpoints across 17 subsystem routers — every capability in the engine is available from any client language, on any target. Browse a router to see the developer documentation for that subsystem.
Each router groups related capabilities. Endpoints are consistent across targets — call the same API from a browser, a native game client, or an XR headset.
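Because every capability sits behind the same REST surface, a thin client helper can target any router. A minimal TypeScript sketch — the `/v1/{router}/{capability}` path scheme and bearer-token auth here are assumptions for illustration; the real paths and auth flow are documented in each router's reference:

```typescript
// Hypothetical endpoint layout: /v1/<router>/<capability>.
// Actual paths are documented per router; this only sketches the pattern.
type RouterName = "physics" | "audio" | "xr"; // illustrative subset of the 17 routers

function endpoint(base: string, router: RouterName, capability: string): string {
  return `${base}/v1/${router}/${encodeURIComponent(capability)}`;
}

async function call<T>(
  base: string,
  apiKey: string,
  router: RouterName,
  capability: string,
  body: unknown,
): Promise<T> {
  const res = await fetch(endpoint(base, router, capability), {
    method: "POST",
    headers: {
      "Authorization": `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(body),
  });
  if (!res.ok) throw new Error(`${router}/${capability} failed: ${res.status}`);
  return res.json() as Promise<T>;
}
```

The same helper runs unchanged in a browser, a Node-based game client, or an XR runtime with a `fetch` implementation — which is the cross-target consistency the routers promise.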
Rigid bodies, cloth, vehicles, ragdolls, fluid simulation. Collision, joints, constraints, and queries for every shape you need.
Node graph, prefabs, CSG, GridMap, terrain. Streaming, spatial queries, and scene composition primitives.
PBR materials, lights, shadows, particles, post-processing. LOD, culling, and frame-accurate capture.
Navmesh, behavior trees, NPC dialogue, utility AI, GOAP. Crowd simulation and sensory systems for believable agents.
Spatial 3D audio, HRTF, propagation, adaptive music. Bus mixing, effects chains, and procedural sound design.
Skeletal animation, blend trees, inverse kinematics, root motion. State machines and retargeting for every skeleton.
Sessions, state sync, WebRTC, matchmaking. Lobby management, NAT traversal, and delta compression.
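Delta compression is easiest to picture with a toy example: instead of resending the full state every tick, send only the fields that changed since the last snapshot the peer holds. A sketch for a flat numeric state (the names here are illustrative, not RakuAI's wire format; real engines also diff against the last *acknowledged* snapshot and encode deletions):

```typescript
// Toy delta compression for a flat numeric game state:
// send only the fields that differ from the previous snapshot.
type State = Record<string, number>;

function makeDelta(prev: State, next: State): State {
  const delta: State = {};
  for (const key of Object.keys(next)) {
    if (prev[key] !== next[key]) delta[key] = next[key];
  }
  return delta;
}

function applyDelta(prev: State, delta: State): State {
  // Receiver rebuilds the full state by overlaying the delta.
  return { ...prev, ...delta };
}

const tick1 = { x: 10, y: 5, hp: 100 };
const tick2 = { x: 12, y: 5, hp: 100 };

const delta = makeDelta(tick1, tick2);    // only { x: 12 } goes on the wire
const rebuilt = applyDelta(tick1, delta); // receiver recovers tick2
```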
Action maps, gamepad, XR controllers, gestures. Unified input layer with haptic feedback across every platform.
Canvas, widgets, layouts, accessibility. 2D and 3D widgets with theming, localization, and screen reader support.
Mesh import, texture compression, hot reload. Content pipeline for audio, textures, and 3D formats with streaming.
Gizmos, inspector, profiler, debug draw. Remote tooling endpoints that power live editing and diagnostics.
Embedded Lua VM, event bus, quests, dialogue trees. Visual scripting hooks and hot reload for rapid iteration.
On-device LLM inference for NPC dialogue, intent detection, and emotion modeling — runs without a network connection.
OpenXR, hand and eye tracking, spatial anchors. Foveated rendering and session management for every supported headset.
Speech-to-text, text-to-speech, VAD, Opus codec. Real-time voice chat with noise suppression and echo cancellation.
Runtime lifecycle, logging, analytics. The foundational layer every other subsystem is built on.
Boolean geometry operations: union, subtract, intersect. Real-time mesh generation with UV preservation.
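The set semantics behind union, subtract, and intersect are easiest to see on signed distance functions, where the classic CSG combinators are one-liners: `min` for union, `max` for intersection, and `max` against the negation for subtraction. This is an illustration of the operations themselves, not the RakuAI mesh API:

```typescript
// Signed distance functions: negative inside the shape, positive outside.
type SDF = (x: number, y: number) => number;

const circle = (cx: number, cy: number, r: number): SDF =>
  (x, y) => Math.hypot(x - cx, y - cy) - r;

// Classic CSG combinators on SDFs:
const union     = (a: SDF, b: SDF): SDF => (x, y) => Math.min(a(x, y), b(x, y));
const intersect = (a: SDF, b: SDF): SDF => (x, y) => Math.max(a(x, y), b(x, y));
const subtract  = (a: SDF, b: SDF): SDF => (x, y) => Math.max(a(x, y), -b(x, y));

// Two overlapping unit circles offset along the x axis:
const A = circle(-0.5, 0, 1);
const B = circle(0.5, 0, 1);

// The origin lies inside both circles, so it is inside the union and the
// intersection, but outside "A minus B".
console.log(union(A, B)(0, 0) < 0);     // true
console.log(intersect(A, B)(0, 0) < 0); // true
console.log(subtract(A, B)(0, 0) < 0);  // false
```

Mesh-level CSG adds the hard parts — intersection curves, retriangulation, and the UV preservation the router advertises — but the boolean logic it implements is exactly this.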
Request an API key to try the production endpoints, or run the browser engine to explore the surface offline.