This graduation project explores how a virtual personal assistant powered by AI can enhance independence and productivity for people who are blind or have low vision (PBLV) in educational and professional settings. While existing assistive tools provide some support, tasks like navigating dynamic indoor environments or using touchscreen devices still pose barriers.
The project was conducted in collaboration with Envision Technologies BV, whose personal AI assistant ‘Ally’ formed the basis for exploring new use cases. Using a mixed-methods, user-centered approach—including interviews, co-creation sessions, shadowing, AI model testing, and iterative prototyping—the research identified 19 common challenges. Two key directions emerged: touchscreen interaction and navigation to non-fixed indoor locations. Based on user input and technical feasibility, the focus shifted to navigation.
The final concept, Ally SenseScape, enables hands-free, voice-activated navigation using Envision Glasses, floor plans, photos, compass input, and sensory cues such as sound or texture. The design was tested with one PBLV user and two blindfolded participants. Feedback showed that while the concept is promising, especially for hands-free, step-based navigation, further refinement is needed in pacing, orientation support, and fallback options.
This project offers a direction for extending Ally’s capabilities into real environments through a scalable, co-created approach.