Kiru Mehari
Spatial Index
3DGS · World Models · Spatial Computing


04 experiments across the full spatial pipeline —
capture · reconstruction · synthesis · navigation

scene_04 · DURÉE
● live
Video as Spatial Object
Three.js · WebCodecs · GLSL · Gemini AI → Frame Stack

All frames of a video are rendered simultaneously as stacked semi-transparent planes, treating time as a navigable spatial axis. Orbit the full timeline as a physical object. Gemini AI search lets you query frames in natural language.
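The frame stack could be placed with an instanced vertex shader along these lines — a minimal sketch; the attribute and uniform names (aFrameIndex, uFrameCount, uStackDepth) are assumptions, not the project's actual code:

```glsl
// Per-instance attribute: which frame of the video this plane holds.
attribute float aFrameIndex;
uniform float uFrameCount;   // total frames in the stack
uniform float uStackDepth;   // world-space length of the time axis
varying vec2 vUv;

void main() {
  vUv = uv;
  // Map frame index to a Z offset: time becomes a spatial axis.
  float z = (aFrameIndex / uFrameCount) * uStackDepth;
  vec3 displaced = position + vec3(0.0, 0.0, -z);
  gl_Position = projectionMatrix * modelViewMatrix * vec4(displaced, 1.0);
}
```

With a Three.js ShaderMaterial, `position`, `uv`, `projectionMatrix`, and `modelViewMatrix` are supplied automatically; semi-transparency would come from the fragment shader's alpha.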

Three.js · WebCodecs · GLSL · Gemini
scene_03 · DEPTHSHIFT
● live
Depth Reprojection Hologram
Three.js · WebXR · GLSL Vertex Shader → Particle Field

A custom GLSL vertex shader reads a source video and its depth map, extruding a dense particle grid along Z per pixel to create an orbitable 3D point cloud. A WebXR companion anchors the same depth mesh to real-world surfaces via plane detection and hand tracking on Quest 3.
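The per-pixel extrusion could look like this in a point-cloud vertex shader — a sketch under assumed names (uDepth, uDisplacement, uPointSize), not the project's actual shader:

```glsl
uniform sampler2D uDepth;     // grayscale depth map (assumed: brighter = nearer)
uniform float uDisplacement;  // maximum Z extrusion in world units
uniform float uPointSize;
varying vec2 vUv;

void main() {
  vUv = uv;
  // Each vertex of the dense grid samples its own depth value...
  float depth = texture2D(uDepth, uv).r;
  // ...and is pushed along Z proportionally, turning the flat grid into relief.
  vec3 displaced = position + vec3(0.0, 0.0, depth * uDisplacement);
  vec4 mvPosition = modelViewMatrix * vec4(displaced, 1.0);
  // Attenuate point size with distance so nearer particles render larger.
  gl_PointSize = uPointSize * (300.0 / -mvPosition.z);
  gl_Position = projectionMatrix * mvPosition;
}
```

A matching fragment shader would sample the source video at `vUv` to color each particle.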

Three.js · WebXR · GLSL · Particles
scene_02 · MARBLE
● live
Spatial World Generation
World Labs · Marble · Multi-Image → World Model

Reference photos of David Chipperfield's brutalist SSENSE Montreal flagship, reconstructed into a navigable 3D world via World Labs' Marble model. Outputs .spz files at resolutions from 100k to full, with collider meshes, explorable in-browser.

World Labs · Marble · 3DGS · .spz
scene_01 · SHARP
● live
Monocular View Synthesis
Apple ML · SHARP · Single Image → 3DGS

A single photograph enters a feedforward neural network and exits as a full 3D Gaussian Splat in under a second. No multi-view capture. No photogrammetry pipeline. Pure inference.

Python · Gradio · 3DGS · .ply