Ally SenseScape

Multisensory Indoor Navigation using AI Assistant for People who are Blind or have Low Vision (PBLV)

Master's Thesis (2025)
Author(s)

P. Prasun (TU Delft - Industrial Design Engineering)

Contributor(s)

S.U. Boess – Mentor (TU Delft - Human Technology Relations)

A.I. Keller – Graduation committee member (TU Delft - Society, Culture and Critique)

Ferkan Metin – Graduation committee member (Envision Technologies B.V.)

Faculty
Industrial Design Engineering
More Info
Publication Year
2025
Language
English
Graduation Date
29-07-2025
Awarding Institution
Delft University of Technology
Programme
Design for Interaction
Sponsors
Envision Technologies B.V.
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

This graduation project explores how a virtual personal assistant powered by AI can enhance independence and productivity for people who are blind or have low vision (PBLV) in educational and professional settings. While existing assistive tools provide some support, tasks like navigating dynamic indoor environments or using touchscreen devices still pose barriers.

The project was conducted in collaboration with Envision Technologies B.V., whose personal AI assistant 'Ally' formed the basis for exploring new use cases. Using a mixed-method, user-centered approach—including interviews, co-creation sessions, shadowing, AI model testing, and iterative prototyping—the research identified 19 common challenges. Two key directions emerged: touchscreen interaction and navigation to non-fixed indoor locations. Based on user input and feasibility assessments, the focus shifted to navigation.

The final concept, Ally SenseScape, enables hands-free, voice-activated navigation using Envision Glasses, floor plans, photos, compass input, and sensory cues such as sound or texture. The design was tested with one PBLV user and two blindfolded participants. Feedback showed that while the concept is promising, especially for hands-free, step-based navigation, further refinement is needed in pacing, orientation support, and fallback options.

This project offers a direction for extending Ally's capabilities into real environments through a scalable, co-created approach.

Files

Z._Interaction_Video.mp4
(mp4 | 247 Mb)
License info not available