AIris
AIris: an AI-powered wearable assistive device for the visually impaired
A pivotal point of my undergraduate studies was the development of AIris, an AI-powered wearable device designed to assist the visually impaired by translating the visual environment into auditory feedback and answering the user’s questions.
Collaborating with organizations for the visually impaired, I engaged directly with individuals to understand their daily challenges, and we co-designed AIris to address these needs. The device consists of a camera mounted on a pair of 3D-printed glasses, a microprocessor, and earphones. The user can ask a question, like “How much money am I holding?”, and an NLP module selects the appropriate ML model to process the camera feed and formulate the answer, like “You are holding 30 euros”, as sketched below.
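To make the routing idea concrete, here is a minimal Python sketch of how a spoken question could be mapped to a model that processes the camera feed before the answer is read back. The intent labels, keyword rules, and handler functions are illustrative assumptions, not the actual AIris code.

```python
# Minimal sketch of question-to-model routing, under assumed intents/handlers.
from typing import Callable, Dict


def recognize_currency(frame) -> str:
    # Placeholder for a currency-recognition model applied to the camera frame.
    return "You are holding 30 euros."


def describe_scene(frame) -> str:
    # Placeholder for a general scene-description model.
    return "You are facing a quiet street with a crosswalk ahead."


# Map each assumed intent to the model that should handle it.
HANDLERS: Dict[str, Callable[[object], str]] = {
    "currency": recognize_currency,
    "scene": describe_scene,
}


def route_question(question: str, frame) -> str:
    """Pick a handler from the user's question and return a spoken answer."""
    text = question.lower()
    intent = "currency" if ("money" in text or "euro" in text) else "scene"
    return HANDLERS[intent](frame)


if __name__ == "__main__":
    # A real frame would come from the glasses-mounted camera; None stands in here.
    print(route_question("How much money am I holding?", None))
```

In the actual device the keyword rules would be replaced by the NLP module, and each handler would wrap a trained vision model rather than a canned string.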
Although it started as an undergraduate course project, the device went on to win numerous national and international awards, including first place in the National Innovation Robotics Contest, and secured funding to supply prototypes to those in need.
This work was also published at the IEEE RAS/EMBS 10th International Conference on Biomedical Robotics and Biomechatronics (BioRob 2024) - available here.