Exclusive: Iksha Labs Rolls Out Hindi AI Voice Agents for Healthcare in NCR, J&K, & Punjab

Iksha Labs plans to support regional languages across India, with Kannada, Marathi, and Telugu expected in the next 4-5 months.

Iksha Labs is betting big on Bharat with a new wave of Hindi-speaking AI voice agents tailored for hospitals and diagnostic centers. Deployed across NCR, J&K, and Punjab, these AI agents automate everything from call-based bookings to report sharing, with no human intervention needed.

Founded in 2019, the Gurugram-headquartered AI solutions company builds intelligent agents and workflows for businesses and healthcare providers.

In this exclusive conversation, Hitesh Ganjoo, founder and CEO of Iksha Labs, unpacks the vision, what it takes to build India-grade agentic workflows, and why vernacular voice is the future of AI in healthcare.

Let’s start with the big reveal. You’re officially launching deep‑vertical Hindi AI agents for healthcare in select states. What makes this launch different from what’s already out there?

We’re not just shipping “voice with AI.” We’re shipping Hindi-first digital co-workers. At one Delhi hospital pilot, over 100,000 minutes of patient calls per month are now being handled by our Hindi AI agents, calls that would otherwise choke a human call center. Patients say things like, “पहली बार रिपोर्ट समझ आई है बिना बार-बार पूछे” (“For the first time I understood my report without having to ask again”).

What exactly have you built & how is your stack different?

A true voice agent must feel human‑smooth. Ours responds in < 1 s, understands cultural context, and plugs into CRMs or hospital software to finish real tasks, not just talk. We fine‑tune every model on live use‑cases instead of endless prompt‑tuning experiments.

Our stack isn’t a chatbot with a microphone. It’s a near-human digital worker:

● Sub-second responses (< 1 s) even on patchy 4G.

● Handles Hindi-English code-switching mid-sentence.

● Executes tasks in hospital software directly.

For example, in a Delhi diagnostic chain, when a patient said, “कल MRI करवाना है पर सुबह जल्दी चाहिए” (“I need an MRI tomorrow, but early in the morning”), the agent didn’t just reply: it opened the hospital system, checked slots, and confirmed a 7:30 AM booking.
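
For readers who want a concrete picture of that listen-understand-act loop, here is a minimal sketch in Python. Every name in it (BookingIntent, HospitalAPI, handle) is an illustrative stand-in under assumed behavior, not Iksha Labs' actual code or interfaces.

```python
# A minimal sketch of the "listen -> understand -> act -> confirm" loop
# described above. All names are hypothetical, invented for illustration.
from dataclasses import dataclass
from datetime import time

@dataclass
class BookingIntent:
    procedure: str   # e.g. "MRI"
    day: str         # e.g. "tomorrow"
    earliest: time   # the caller's "early morning" preference

class HospitalAPI:
    """Stand-in for the hospital information system the agent plugs into."""
    def available_slots(self, procedure: str, day: str) -> list[time]:
        return [time(7, 30), time(9, 0), time(11, 15)]  # canned demo data

    def book(self, procedure: str, day: str, slot: time) -> str:
        return f"Confirmed: {procedure} {day} at {slot.strftime('%H:%M')}"

def handle(intent: BookingIntent, his: HospitalAPI) -> str:
    # Act, don't just answer: query real slots, then commit a booking.
    slots = [s for s in his.available_slots(intent.procedure, intent.day)
             if s >= intent.earliest]
    if not slots:
        return "No matching slot; the agent would offer alternatives."
    return his.book(intent.procedure, intent.day, min(slots))

# "I need an MRI tomorrow, but early in the morning"
print(handle(BookingIntent("MRI", "tomorrow", time(6, 0)), HospitalAPI()))
# -> Confirmed: MRI tomorrow at 07:30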

Why healthcare & finance as your first focus sectors?

Because the volume is crushing: financial and healthcare inclusion are national priorities, but skilled staff are scarce.

At a 250-bed hospital in Haryana, 70%+ inbound calls are in Hindi, yet the staff pool is small and often English-trained. Patients were holding for 15 minutes just to ask about fasting before a test. With our agents, the average response time dropped to under 10 seconds.

In finance, one insurer saved 2,000 human agent hours a month by letting our Hindi agent handle premium reminders and claim FAQs. Our agents act as digital co‑workers, booking doctor visits or answering policy questions in local languages, so businesses scale without chasing more headcount.

Can you share one real use‑case where your voice agents moved the needle?

Yes. At a fast-growing IVF chain, our Hindi agents now handle calls across 10+ clinics, answering questions like “किस दिन इंजेक्शन लगाना है?” (“On which day do I take the injection?”) or “क्या अल्ट्रासाउंड से पहले खाना खा सकते हैं?” (“Can I eat before the ultrasound?”). Before, nurses spent hours repeating these answers.

Now, nurses focus on care, and patients get instant clarity. Early numbers show ~30% uplift in conversions for follow-up cycles.

From a patient’s POV, what kind of questions or requests can your AI voice agents actually handle today, start to end?

Our voice agents handle everything from basic FAQs to complex protocol queries, cutting call‑center load and lifting conversion rates.

– Book or reschedule appointments.

– Hear Hindi summaries of lab reports (“आपका शुगर लेवल थोड़ा बढ़ा हुआ है”: “Your sugar level is slightly elevated”).

– Confirm insurance cover.

– Get medicine reminders on WhatsApp.

In one hospital, patients even asked the agent to “मुझे घर के पास वाली शाखा का पता बता दो” (“Tell me the address of the branch near my home”). It did, in Hindi, with a map link.
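
One way to picture how a single agent covers that whole menu end to end is an intent router. The sketch below is a deliberately naive illustration: the keyword rules and handler names are hypothetical, and a production system would classify intents with a trained model rather than substring checks.

```python
# Hypothetical intent routing for the patient requests listed above.
# Keyword matching is an oversimplification used only to show the shape.
ROUTES = {
    "book": "appointment_flow",
    "reschedule": "appointment_flow",
    "report": "report_summary_flow",      # Hindi lab-report summaries
    "insurance": "insurance_check_flow",
    "reminder": "whatsapp_reminder_flow",
    "address": "branch_locator_flow",     # "branch near my home" + map link
}

def route(transcript: str) -> str:
    text = transcript.lower()
    for keyword, handler in ROUTES.items():
        if keyword in text:
            return handler
    return "human_escalation"  # anything outside the supported flows

print(route("Can you book an MRI for tomorrow?"))  # -> appointment_flow
```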

Can you explain the core issue of bias or irrelevance in global models when applied to Indian users?

Global models shine in Western contexts but stumble in India because they break on Indian reality:

● Cultural bias: They mis‑handle Indian place‑names, medicines, and mixed‑language sentences.

● Voice quality gap: Truly natural Hindi TTS is still evolving; you still need custom pronunciation, a neutral accent, and smooth Hindi‑English code‑switching.

  • A global model misread “Gurugram” as “Guru-gram” in Hindi speech.
  • It mispronounced “Telmisartan” so badly that patients thought it was a different drug.
  • When a caller said, “Doctor ne bola bypass करना पड़ेगा, पर sugar control करना होगा” (“The doctor said a bypass will be needed, but the sugar has to be controlled”), most global agents froze.

Our India-first stack gets this right because it’s trained on millions of real Hindi minutes, not just Western data.
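
To make the pronunciation problem concrete, here is a toy sketch of the kind of hand-checked lexicon override that fixes failures like “Gurugram” and “Telmisartan” above. The entries and the fallback guesser are invented for illustration; real TTS front ends use proper phoneme alphabets and learned grapheme-to-phoneme models.

```python
# A toy pronunciation lexicon. Entries override the generic fallback so
# place names and drug names are never left to a Western-trained guess.
LEXICON = {
    "gurugram": "gu-ru-graam",        # not "guru" + "gram"
    "telmisartan": "tel-mi-sar-tan",  # drug names need exact syllables
}

def fallback_g2p(word: str) -> str:
    """Naive grapheme-to-phoneme guess; real systems learn this from data."""
    return "-".join(word.lower())

def pronounce(sentence: str) -> list[str]:
    # Hand-checked lexicon entries win over the fallback, word by word.
    return [LEXICON.get(w.lower(), fallback_g2p(w)) for w in sentence.split()]

print(pronounce("Telmisartan lena hai Gurugram mein"))
# -> ['tel-mi-sar-tan', 'l-e-n-a', 'h-a-i', 'gu-ru-graam', 'm-e-i-n']
```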

What’s the rollout plan?

To start, we are focused on Delhi and North Indian states like J&K, Punjab, and Haryana. Within the next 6-12 months we plan to expand to regional languages across the rest of India; we are already working on several local languages and hope to cover Kannada, Marathi, and Telugu in the next 4-5 months.

  • Phase 1: Delhi NCR, Haryana, J&K, UP - regions where Hindi dominates inbound hospital calls.
  • Scale: In Delhi alone, a mid-size hospital crosses 100,000 patient call minutes/month. That’s our baseline.
  • Next 4–5 months: Add Marathi, Kannada, Telugu.
  • Within 12 months: Pan-India regional language coverage.

You say “India‑first tech travels, world‑first tech breaks here.” How so?

India is a demanding market, but what works here is likely to work elsewhere with little or no customization. A culturally diverse land like no other, with regional languages, deep cultural nuances, and a frugal operating environment, it is terrific ground for innovating and testing product-market fit.

A global player coming to India, on the other hand, has to retrofit an existing solution, and in many cases that is neither a frugal nor a fast approach to building for India.

If your AI survives here, like ours handling 100k+ Hindi minutes per hospital per month, it will scale globally with minimal changes. But world-first products often break here: they assume fluent English, high bandwidth, and single-language calls.

What are the non-obvious technical challenges in building contextual, action-taking AI voice agents for India?

Even though the volume of data is huge, getting access to high-quality, relevant data is often a painful process. So we ran controlled pilots after generating synthetic data and using it to augment real data. This often dictates the final quality of the models and the overall system: garbage in, garbage out is the rule in AI.

Serving ultra-low-latency voice agents (< 1 s response time) is as much about infrastructure and deployment as it is about research and development. We built custom deployment pipelines to recover every millisecond lost to network hops, cross-region data requests, and the like. We also found that cutting-edge models often serve users in the US and Europe better because of data-center proximity, which can be an important consideration in crafting the final solution.
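
Here is a sketch of the per-stage latency accounting that a sub-second target forces. The stage names and millisecond budgets are illustrative assumptions, not Iksha Labs' figures; the point is that ASR, the language model, the action step, and TTS must jointly fit under roughly one second, so every hop gets measured.

```python
# Illustrative latency-budget instrumentation for a voice pipeline.
import time

BUDGET_MS = {"asr": 250, "nlu_llm": 350, "action": 150, "tts": 200}  # ~950 ms

def timed(stage: str, fn, *args):
    start = time.perf_counter()
    result = fn(*args)
    elapsed = (time.perf_counter() - start) * 1000
    if elapsed > BUDGET_MS[stage]:
        print(f"[latency] {stage}: {elapsed:.0f} ms over its "
              f"{BUDGET_MS[stage]} ms budget")
    return result

# Usage: wrap every pipeline stage so a slow network hop or a
# cross-region model call shows up immediately in the logs.
transcript = timed("asr", lambda audio: "kal MRI karwana hai", b"\x00\x01")
print(transcript)
```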

  • Data quality: 80% of the open Hindi transcripts we reviewed were irrelevant. We had to generate synthetic data alongside real hospital conversations to train useful models (see the sketch after this list).
  • Latency obsession: A hospital CFO told us bluntly, “अगर 3 सेकंड चुप रहा तो caller disconnect कर देगा” (“If it pauses even 3 seconds, the caller will disconnect”). That forced us to build custom low-latency pipelines in India.
  • Accent range: A patient in Noida says “एमआरआई” (“MRI”) differently from a caller in Jaipur. Our stack normalizes both, trained on tens of thousands of such samples.
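
As referenced in the data-quality point, here is a toy sketch of template-based synthetic data generation. The templates, slot values, and accent-variant spellings are invented examples, not Iksha Labs' corpus or method; they only show how synthetic variants can widen coverage of regional phrasings.

```python
# Toy template-based synthetic data generation for Hindi-English calls.
import random

TEMPLATES = [
    "{greeting}, kal {test} karwana hai",
    "kya {test} se pehle khana kha sakte hain?",
]
SLOTS = {
    "greeting": ["namaste", "hello ji"],
    # The same test heard differently across regions (the accent range):
    "test": ["MRI", "em-aar-aai", "ultrasound", "alt-raa-saund"],
}

def synth_corpus(n: int, seed: int = 0) -> list[str]:
    rng = random.Random(seed)
    return [
        rng.choice(TEMPLATES).format(
            **{k: rng.choice(v) for k, v in SLOTS.items()}
        )
        for _ in range(n)
    ]

for line in synth_corpus(4):
    print(line)
```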

Looking ahead, if we’re talking again a year from now, what do you hope has changed significantly about how Indians interact with AI?

AI, as of today, has a significant journey to travel before it becomes invisible yet omnipresent in everyone's lives. We believe that in one year, AI agents will be as common as Wi-Fi connections and will support us at almost every touchpoint.

– A patient in Agra calls a hospital and speaks in Hindi without realizing it’s AI.

– An SME in Gurgaon has an AI co-worker calling vendors in Hindi, managing renewals.

– A hospital in Lucknow runs 150,000+ monthly minutes entirely via AI, freeing doctors and staff for actual care.

We hope to see AI being a widely accepted assistant (and a digital coworker) that boosts productivity and helps teams and businesses (especially MSMEs) across the board.

One feature still flying under the radar?

We are building an RL (reinforcement learning) agent that can clone the style of a human agent or worker with very little effort and time.

At one Delhi hospital, we recorded 30 minutes of a star nurse explaining post-surgery care. The agent cloned her reassuring tone, “घबराइए मत, ये दर्द सामान्य है” (“Don’t worry, this pain is normal”), and patients responded as if speaking to the nurse herself.

The idea is digital co-workers that are not just co-workers but almost twins: agents that can predict the next steps exactly as a human would. That’s the future: digital co-workers as true twins.

Stay tuned for more such updates on Digital Health News
