FURI | Summer 2024

Generative AI-Aided Navigation for the Visually Impaired and Blind


This project explores state-of-the-art vision language models (VLMs) to provide new navigation solutions for the visually impaired and blind. The vision and language features of these models have proven capable of identifying objects and reasoning from image references. Leveraging these capabilities, VLMs can support conversational assistance: users can ask about their current environment at any moment and receive immediate guidance, increasing their independence to navigate safely. This opens new opportunities for assistive technology by integrating VLMs into wearable or extendable devices for daily use alongside walking sticks.
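As a rough illustration of the interaction pattern described above, the sketch below models one turn of the assistive conversation: a camera frame plus a spoken question goes to a VLM, and the answer comes back as text for speech output. All names here (`Frame`, `query_vlm`, `navigation_assistant`) are hypothetical placeholders, not the project's actual implementation; in particular, `query_vlm` is a stub standing in for a real vision language model call.

```python
from dataclasses import dataclass


@dataclass
class Frame:
    """A single camera frame; a real system would hold pixel data here."""
    description: str  # illustrative stand-in for the image content


def query_vlm(frame: Frame, question: str) -> str:
    """Hypothetical stand-in for a vision language model query.

    A real implementation would send the image and the user's question
    to a VLM (e.g., an image question-answering model) and return its answer.
    """
    return f"Current view ({frame.description}): answering '{question}'"


def navigation_assistant(frame: Frame, question: str) -> str:
    """One conversational turn: camera frame + user question -> guidance text."""
    answer = query_vlm(frame, question)
    # On a wearable device, this text would be passed to text-to-speech.
    return answer


if __name__ == "__main__":
    frame = Frame(description="crosswalk ahead, pedestrian signal is red")
    print(navigation_assistant(frame, "Is it safe to cross?"))
```

The key design point the abstract implies is that each answer is grounded in the most recent frame, so the user can query their surroundings at any moment rather than on a fixed schedule.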

Student researcher

Kelly Raines

Computer science

Hometown: Tempe, Arizona, United States

Graduation date: Fall 2024