Spatial AR Bridge:
From Physical Design to Spatial Interaction (NUI)
This project originates from a physical design created in 2014 by Dr. Neslihan Tepehan and Dr. Melike Mühür. My journey began with the challenge of preserving the tactile essence and strategic depth of this physical object while envisioning its potential in the digital realm. It served as the fundamental blueprint for everything that followed.


Digital Evolution: Web-Based Adaptation
The first step was architecting a fully playable, web-based digital version. By translating physical constraints into 3D logic, I developed a core engine with over 4,500 lines of code, ensuring cross-platform accessibility and robust state management. This phase was about building a stable digital "home" for the original design, proving that strategic depth could thrive within a browser environment.
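A state layer like the one described here typically centralizes the game state behind a small store that the 3D renderer and UI both subscribe to. The sketch below is illustrative only; the names and state shape are hypothetical, not the project's actual engine code:

```javascript
// Minimal centralized game-state store for a browser engine:
// one state snapshot plus subscriber notifications, so every
// consumer (renderer, UI, network) sees the same state after a move.
function createGameStore(initialState) {
  let state = initialState;
  const listeners = new Set();
  return {
    getState: () => state,
    // Returns an unsubscribe function.
    subscribe(fn) {
      listeners.add(fn);
      return () => listeners.delete(fn);
    },
    // Reducers return a fresh state object instead of mutating,
    // which keeps change detection and undo/replay simple.
    dispatch(reducer) {
      state = reducer(state);
      listeners.forEach((fn) => fn(state));
    },
  };
}

// Hypothetical usage: advancing the turn after a move.
const store = createGameStore({ turn: "white", moves: 0 });
store.dispatch((s) => ({ ...s, turn: "black", moves: s.moves + 1 }));
```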
Transcending the Screen (Spatial UX)
The ultimate evolution is the Spatial AR Bridge. Here, I’ve architected an ecosystem that transforms the physical world from a mere background into an active component of the experience. Using AI-Assisted Iterative Development (Gemini & Claude), I integrated a Natural User Interface (NUI) that bypasses traditional controllers entirely: natural hand gestures provide full six-degrees-of-freedom (6DOF) control.
By mapping interactions like 'Pinch-to-Select' and 'Wrist-Roll' based on Spatial Affordance principles, I’ve minimized cognitive load to create a low-latency, high-feedback architecture. I am currently refining the "Gestural IA", utilizing hand orientation and depth data to answer a critical question: How should products be positioned in the future of Spatial Computing?
Real-time interaction testing: Mapping physical movement to digital 3D space.
Gestural IA & Interaction Mapping
- Pinch-to-Select: High-precision object pickup using hand-depth and landmark data.
- Wrist-Roll (Rotation): Mapping wrist orientation to 3D axes to replace traditional WASDQE keys.
- Spatial Affordance: Designing interactions that feel "physically correct" to the user's hand.
- AI-Assisted Prototyping: Leveraging Gemini & Claude for rapid iteration of complex spatial coordinate systems.
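The first two gestures above can be sketched directly on MediaPipe Hands output, which delivers 21 normalized landmarks per hand (index 0 = wrist, 4 = thumb tip, 5 = index-finger MCP, 8 = index-finger tip). The threshold value and function names below are illustrative assumptions, not the project's actual code:

```javascript
// Pinch detection: thumb tip (4) close to index tip (8) in 3D.
const PINCH_THRESHOLD = 0.05; // normalized units; tuned empirically

function distance3D(a, b) {
  const dx = a.x - b.x;
  const dy = a.y - b.y;
  const dz = (a.z ?? 0) - (b.z ?? 0);
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

function isPinching(landmarks) {
  return distance3D(landmarks[4], landmarks[8]) < PINCH_THRESHOLD;
}

// Wrist roll: angle of the wrist (0) -> index MCP (5) vector in the
// image plane, later mapped onto a 3D rotation axis in the engine.
function wristRoll(landmarks) {
  const dx = landmarks[5].x - landmarks[0].x;
  const dy = landmarks[5].y - landmarks[0].y;
  return Math.atan2(dy, dx); // radians
}
```

In practice both signals are smoothed over a few frames before they drive the engine, so landmark jitter does not cause spurious pickups or rotation flicker.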

Under the Hood
To bridge the gap between 2D camera frames and the 3D game engine, I implemented a custom coordinate transformation layer. Its core logic translates real-time hand displacement (dX/dY) into precise 6DOF orbital rotation. By utilizing MediaPipe Hands for landmark detection and mapping those points to a stable 3D interaction zone, I ensured that physical gestures are converted into digital commands with minimal added latency—providing the high-fidelity feedback necessary for a professional-grade Spatial UX.
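A minimal sketch of the displacement-to-rotation mapping described above, assuming per-frame hand positions in normalized camera coordinates; the `SENSITIVITY` constant, function names, and camera shape are hypothetical, not the project's actual implementation:

```javascript
// Map per-frame hand displacement (dX/dY, normalized 0..1 camera
// coordinates) onto the orbital yaw/pitch of the 3D scene camera.
const SENSITIVITY = Math.PI; // radians per full-frame sweep; illustrative

function clamp(v, lo, hi) {
  return Math.min(hi, Math.max(lo, v));
}

let prevHand = null; // previous frame's hand position {x, y}

function handToOrbit(hand, camera) {
  if (prevHand) {
    const dX = hand.x - prevHand.x;
    const dY = hand.y - prevHand.y;
    // Horizontal displacement orbits around the vertical axis (yaw);
    // vertical displacement tilts the orbit (pitch), clamped so the
    // camera never flips over the poles.
    camera.yaw += dX * SENSITIVITY;
    camera.pitch = clamp(
      camera.pitch + dY * SENSITIVITY,
      -Math.PI / 2,
      Math.PI / 2
    );
  }
  prevHand = { x: hand.x, y: hand.y };
}
```

Keying the rotation to frame-to-frame displacement rather than absolute hand position means the user can re-center their hand at any time, much like lifting a mouse off the desk.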

Iterative Design Cycle: Recognition vs. Recall
Currently, I am navigating the trade-off between immediate usability and long-term immersion. While spatial buttons offer a familiar safety net, pure gestural control represents the future of buttonless AR. My current iterations focus on finding the 'Minimum Viable Interaction' (MVI) for this specific UX.