A 19-year-old engineering student from Bengaluru has developed Perceivia, AI-powered smart glasses that help visually impaired individuals navigate and understand their surroundings using voice and vibration feedback, earning national recognition at a prestigious innovation competition.
Glimpse:
Using advanced AI (including Google’s Gemini 2.0 Flash model), Perceivia detects objects, recognizes faces, and describes environments in real time. The device translates visual information into audio and tactile cues, enabling safer, more independent movement and spatial awareness for users.
Perceivia, an AI-enabled wearable device, was developed by Tushar Shaw, a 19-year-old second-year engineering student from Bengaluru. The glasses are designed to assist people with visual impairments by analysing the environment in real time and providing intuitive feedback that enhances spatial awareness and mobility. The system uses machine vision and multimodal AI processing to interpret images and scene information, then delivers alerts through speech and vibration cues that create a dynamic “sensory map.”
Shaw built the prototype by leveraging Google’s Gemini 2.0 Flash model, which enables fast and accurate interpretation of visual inputs. The AI can detect objects, estimate distances, recognize human faces and voices, and describe surroundings, all processed locally, so users receive near-instant guidance. Despite having limited prior experience in hardware design, Shaw refined the system through mentorship and community feedback, particularly from visually impaired volunteers whose real-world testing helped improve usability.
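The article does not describe Perceivia’s internals, but the idea of turning vision-model detections into a “sensory map” of speech and vibration cues can be sketched roughly as below. Everything here is an assumption for illustration: the `Detection` structure, the distance thresholds, and the cue mapping are hypothetical, not the actual Perceivia design.

```python
from dataclasses import dataclass

@dataclass
class Detection:
    label: str         # object class reported by the vision model (assumed)
    distance_m: float  # estimated distance in metres (assumed)

def to_cues(detections):
    """Translate detections into speech and vibration cues.

    Closer obstacles get stronger vibration; each detection also yields
    a short spoken description. Thresholds are purely illustrative.
    """
    cues = []
    # Announce nearest obstacles first
    for d in sorted(detections, key=lambda d: d.distance_m):
        if d.distance_m < 1.0:
            vibration = "strong"
        elif d.distance_m < 3.0:
            vibration = "medium"
        else:
            vibration = "none"
        speech = f"{d.label} about {d.distance_m:.0f} metres ahead"
        cues.append({"speech": speech, "vibration": vibration})
    return cues

# Example: two detections from a hypothetical vision stage
cues = to_cues([Detection("chair", 2.2), Detection("doorway", 0.8)])
print(cues)
```

In a real device, the detection list would come from the vision model and the cues would drive a text-to-speech engine and haptic motors rather than being printed.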
Perceivia’s impact was recognized when it won a national award at Samsung’s “Solve for Tomorrow 2025” innovation challenge, one of India’s most prominent youth tech competitions focused on real-world problem solving. As a national winner, Shaw received support, including incubation backing, to further develop and scale the technology toward broader adoption. His long-term vision includes making Perceivia more affordable and enhancing it with indoor navigation features, so it can benefit millions beyond early prototype users.
This breakthrough reflects a growing trend where AI and wearable technologies are being applied to accessibility challenges, helping close the gap between disability and independence and offering a powerful example of how student innovation can drive real social impact.
“I wanted to build something that restores not just convenience, but independence for people who face barriers every single day.”
By
HB Team
