Bengaluru Hospital Introduces AI Glasses for the Visually Impaired
The AI glasses can read printed and handwritten text in 18 Indian languages and several foreign languages, translate content, identify colours, and detect signboards in public spaces such as metro stations.
Narayana Nethralaya, a Bengaluru eye hospital, has introduced AI glasses aimed at improving mobility, access to information, and everyday independence for people who are blind or have severe low vision.
The Smart Vision Glasses Ultra, developed by SHG Technologies and clinically validated by Narayana Nethralaya, have been unveiled as a wearable assistive device that blends advanced artificial intelligence with a familiar eyewear design.
According to Dr Mohammed Danish, Optometrist at the Buds to Blossoms Department of Narayana Nethralaya, the AI glasses have been shaped by years of clinical engagement with visually impaired patients.
“This product is powered by Artificial Intelligence. It can identify objects and obstacles, recognise faces, read text and help with navigation,” he said, adding, “Unlike general smart glasses available in the market, this device is designed specifically for people with low vision or complete blindness.”
The Buds to Blossoms clinic has worked for over a decade with children and adults who have complex visual and neurological conditions, often using multidisciplinary therapies to support education and independent living.
“That long clinical experience gave the foundation for this kind of innovation. We understand what patients actually need in their daily lives, and that feedback has shaped the development of devices like the Smart Vision Glasses Ultra,” Dr Danish said.
The AI glasses use machine vision, AI algorithms, and a built-in LiDAR sensor to scan surroundings and provide real-time audio feedback through a private Bluetooth speaker.
Features include object and obstacle detection, facial recognition, currency identification, and navigation assistance. The device can read printed and handwritten text in 18 Indian languages as well as several foreign languages, translate content, identify colours, and detect signboards in public spaces such as metro stations.
An emergency calling function sends the user’s location, last spoken words, and a front-view image to a designated contact when activated.
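The article describes this sense-and-speak flow only at a high level. Purely as an illustration of how such a loop might be structured, the sketch below shows a simplified scan-describe-speak cycle in Python; every function name and threshold in it is a hypothetical placeholder, not the actual Smart Vision Glasses Ultra software.

```python
# Illustrative only: a simplified "scan surroundings -> describe aloud" loop,
# NOT the Smart Vision Glasses Ultra firmware. All names and thresholds are
# hypothetical placeholders standing in for on-device components.

import time

def capture_frame():
    """Placeholder for grabbing a frame from the glasses' camera."""
    return "frame-bytes"

def detect_objects(frame):
    """Placeholder for an on-device vision model (objects, faces, text)."""
    return [{"label": "signboard", "text": "Platform 2"}, {"label": "person"}]

def read_lidar_distance():
    """Placeholder for the built-in LiDAR range reading, in metres."""
    return 1.2

def speak(message):
    """Placeholder for audio feedback over the private Bluetooth speaker."""
    print(f"[audio] {message}")

OBSTACLE_WARNING_METRES = 1.5  # hypothetical safety threshold

def assist_loop(cycles=3):
    for _ in range(cycles):
        frame = capture_frame()
        distance = read_lidar_distance()

        # Warn about nearby obstacles first, then describe what the camera sees.
        if distance < OBSTACLE_WARNING_METRES:
            speak(f"Obstacle about {distance:.1f} metres ahead")

        for item in detect_objects(frame):
            if "text" in item:
                speak(f"{item['label']} reads: {item['text']}")
            else:
                speak(f"{item['label']} detected")

        time.sleep(1)  # pace the cues so the audio stays intelligible

if __name__ == "__main__":
    assist_loop()
```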
Priced at around INR 46,000, the AI glasses are positioned as a more affordable alternative to imported assistive devices, which often cost several times more.
Dr Danish noted that affordability remains a challenge for many patients, particularly from rural or low-income backgrounds, and said the hospital helps guide eligible users toward government schemes or free device support where possible.
Users have reported greater confidence in daily activities. “Earlier, I had to depend on others to identify currency notes or read signboards. Now the glasses tell me what is in front of me. I feel more confident when I step out alone,” said Gurumurthy, a beneficiary of the device.
As AI glasses continue to evolve, developers are working on improving battery life, camera quality, and design to ensure comfort and wider acceptance.