FrAImed

MIT Reality Hack 2026 - Winning Project

FrAImed is an AI-powered spatial photography assistant developed during MIT Reality Hack 2026, one of the world’s leading immersive technology hackathons.

Built for Snap Spectacles, FrAImed transforms how people capture real-world moments by guiding users toward better photo composition directly within augmented reality, before the photo is taken. Instead of fixing photos afterward through editing, FrAImed helps users capture meaningful moments beautifully in real time.

Date:

Jan 2026

Timeline:

48hrs

Role:

Design Engineer

Context

MIT Reality Hack brings together designers, engineers, researchers, and storytellers to prototype the future of spatial computing.
Coming from a background in immersive storytelling, AR interaction design, and Snap ecosystem development, I joined a team exploring a fundamental question:

What happens when cameras stop being passive recording tools and become intelligent creative collaborators?


While smartphone cameras have become technically excellent, users still struggle with composition, framing, and perspective, especially in spontaneous social situations.


The opportunity was clear:
Spatial computing can teach visual literacy in the moment of capture.

The Problem

Most bad photos are not caused by hardware limitations, but by missing compositional guidance.

Solution

FrAImed reimagines photography as a guided spatial experience, coaching users toward professional composition while they remain fully present.

Spatial Anchoring

Geo-located AR pins indicate ideal shooting positions and camera angles within physical space. Users move naturally through the environment until alignment is achieved.
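The alignment check behind each pin can be sketched as simple pose math: is the user standing close enough to the pin, and facing the intended subject? This is a hypothetical TypeScript illustration, not the actual Lens Studio code; the `ShotPin` type, thresholds, and function names are assumptions.

```typescript
// Hypothetical sketch of the per-pin alignment check.
// Types, names, and thresholds are illustrative assumptions.

type Vec3 = { x: number; y: number; z: number };

interface ShotPin {
  position: Vec3;   // ideal standing position (world space)
  lookTarget: Vec3; // point the camera should face
}

function sub(a: Vec3, b: Vec3): Vec3 {
  return { x: a.x - b.x, y: a.y - b.y, z: a.z - b.z };
}

function length(v: Vec3): number {
  return Math.sqrt(v.x * v.x + v.y * v.y + v.z * v.z);
}

function normalize(v: Vec3): Vec3 {
  const l = length(v);
  return { x: v.x / l, y: v.y / l, z: v.z / l };
}

function dot(a: Vec3, b: Vec3): number {
  return a.x * b.x + a.y * b.y + a.z * b.z;
}

// A user is "aligned" when they stand near the pin and face the target
// within a small angular tolerance.
function isAligned(
  userPos: Vec3,
  userForward: Vec3,
  pin: ShotPin,
  maxDistMeters = 0.5,
  maxAngleDeg = 10
): boolean {
  const distOk = length(sub(pin.position, userPos)) <= maxDistMeters;
  const toTarget = normalize(sub(pin.lookTarget, userPos));
  const cos = Math.min(1, Math.max(-1, dot(normalize(userForward), toTarget)));
  const angleDeg = (Math.acos(cos) * 180) / Math.PI;
  return distOk && angleDeg <= maxAngleDeg;
}
```

In practice the distance and angle tolerances would be tuned on-device so the pin feels forgiving rather than pixel-perfect.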

AI Composition Guidance

Gemini 2.0 Flash analyzes live camera frames and returns real-time compositional feedback.

The system provides live guidance on:

  • framing

  • subject placement

  • perspective balance
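As a rough illustration of how model output could become a user-facing nudge, here is a hypothetical rule-of-thirds helper. The `FrameAnalysis` shape is an assumed response format for a vision model like Gemini 2.0 Flash, not its actual API.

```typescript
// Illustrative sketch: turning a frame analysis into a directional nudge.
// The analysis shape is an assumption, not Gemini's real response schema.

interface FrameAnalysis {
  subjectCenterX: number; // 0 (left edge) .. 1 (right edge)
  subjectCenterY: number; // 0 (top) .. 1 (bottom)
}

// Suggest a pan toward the nearest rule-of-thirds vertical line.
function thirdsHint(a: FrameAnalysis, tolerance = 0.06): string {
  const targets = [1 / 3, 2 / 3];
  const nearest = targets.reduce((best, t) =>
    Math.abs(t - a.subjectCenterX) < Math.abs(best - a.subjectCenterX) ? t : best
  );
  const delta = nearest - a.subjectCenterX;
  if (Math.abs(delta) <= tolerance) return "hold";
  // Panning the camera right shifts the subject left in frame, and vice versa.
  return delta > 0 ? "pan left" : "pan right";
}
```

A real system would fold in more signals (horizon tilt, headroom, leading lines), but the core idea is the same: compare where the subject is to where it should be, and emit one small correction at a time.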

Capture Moment

Once aligned, users simply snap the photo through Spectacles.

No interruption.
No editing workflow.
Just presence.

My Role

During the hackathon, I contributed across concept, interaction, and experience layers:

  • Experience & interaction design for spatial guidance

  • User flow definition for Spectacles interaction

  • AR UX logic and feedback systems

  • Rapid prototyping decisions

  • Visual storytelling & presentation strategy

  • Cross-disciplinary collaboration between AI + spatial engineering

Technical Architecture

Pipeline flow:
Camera → AI Analysis → Direction Feedback → Capture → Cloud Storage
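The flow above can be sketched as one tick of a loop: analyze the current frame, surface feedback, and only capture and upload once the composition is aligned. The handler interface and names below are placeholders, not the real Spectacles or cloud integration.

```typescript
// Minimal sketch of the pipeline as one loop tick.
// Handlers are placeholder assumptions standing in for real integrations.

type Frame = { pixels: string }; // stand-in for real image data
type Feedback = { hint: string; aligned: boolean };

interface PipelineHandlers {
  analyze(frame: Frame): Feedback; // e.g. a Gemini 2.0 Flash call
  showFeedback(fb: Feedback): void; // AR overlay update
  capture(frame: Frame): string; // returns a photo id
  upload(photoId: string): void; // cloud storage
}

// Returns the photo id once captured, or null while still guiding.
function runPipelineTick(frame: Frame, h: PipelineHandlers): string | null {
  const fb = h.analyze(frame);
  h.showFeedback(fb);
  if (!fb.aligned) return null;
  const id = h.capture(frame);
  h.upload(id);
  return id;
}
```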

Outcome

Within just 48 hours, FrAImed evolved from an initial concept into a working spatial photography prototype on Snap Spectacles, combining AI analysis with geo-anchored AR guidance. The project was honored at MIT Reality Hack 2026 with Half League’s sponsored award for Best Use of AR for Urban Portals and Spatial Overlays, recognizing our exploration of how augmented reality can meaningfully augment urban spaces rather than simply overlay information onto them.

Reflection

FrAImed represents a shift I have been exploring throughout my work:

Moving from screen-based interaction toward environment-based interaction.

Rather than designing interfaces people look at, we designed intelligence embedded into the world itself.

The project reinforced a key insight:
The future of cameras is not better sensors, but spatial understanding.


© Selected Works / Fynn Langnau

Your brand deserves the best. Let me help you achieve it with a human-centered mindset.

Available for work

Let's Work Together.
