Sarvam launches Indus chat app on top of 105B model
India's AI market shifts from importing chatbots to building language-first distribution; data sovereignty is sold as a latency feature while deletion and identity frictions remain
Image Credits: Jagmeet Singh / TechCrunch
India’s AI market is no longer a passive consumption zone for Silicon Valley chatbots. Bengaluru-based Sarvam AI has launched “Indus,” a chat app for web and mobile that fronts the company’s newly announced 105B-parameter large language model, according to TechCrunch.
The launch is a small product release with big strategic implications: “local” AI is becoming a performance feature, not a patriotic slogan. Sarvam’s pitch is that Indian users want low latency, speech-first interaction, and competence across India’s linguistic sprawl—needs that generic English-centric models often satisfy only after a long chain of translation, prompt engineering, and failure modes.
Indus supports typed and spoken queries and can respond in text and audio, TechCrunch reports. Access is currently gated: the app appears limited to India, and Sarvam warns users they may hit a waitlist as it expands compute capacity. That is the unglamorous reality behind “AI sovereignty”: you can’t fine-tune your way around a GPU shortage.
Sarvam is also leaning into distribution rather than just model training. The company announced partnerships with HMD to bring AI features to Nokia feature phones and with Bosch for automotive applications, per TechCrunch. Competing with OpenAI and Google in India takes more than a model; it takes being preinstalled, bundled, and integrated into the devices that actually dominate the market.
Technically, a 105B model is a statement: large enough to be taken seriously, small enough to be plausibly served at scale with aggressive optimization. But Sarvam’s own product constraints hint at the trade-offs. Users can’t disable the app’s “reasoning” mode, which can slow responses, and they can’t delete chat history without deleting the account, TechCrunch notes. The first is a compute and UX problem; the second is a governance problem. Nothing says “trust us” like making deletion impossible unless you nuke your identity.
The competitive backdrop is brutal. OpenAI CEO Sam Altman recently said ChatGPT has more than 100 million weekly active users in India, while Anthropic says India is its second-largest market by usage share, TechCrunch reports. That scale advantage means global players can amortize training and inference across the planet. Sarvam’s counter is specialization: Indian languages, Indian domains, and the ability to ship features where bandwidth and hardware are constrained.
For readers, the interesting part isn’t whether New Delhi gets a homegrown ChatGPT. It’s that “data sovereignty” is quietly becoming a product requirement that aligns state ambition with corporate incentives. If models are localized, hosted domestically, and authenticated via phone numbers, the same infrastructure that reduces latency also reduces friction for surveillance and regulatory capture. The market will build what users pay for—and governments will regulate what markets are forced to build.
Sarvam says it has raised $41 million since its 2023 founding. In India’s AI arms race, that’s a down payment, not a war chest. But in a mobile-first country, distribution is leverage—and Indus is Sarvam’s bid to turn language and latency into a moat before the global incumbents finish localizing their own.