
ET Explainer: What OpenAI's local data residency means for India
ChatGPT-maker OpenAI earlier this month enabled local data residency in key Asian countries including India, its second-largest market, as well as Japan, Singapore, and South Korea. The move is aimed at organisations that want to use its ChatGPT Enterprise, ChatGPT Edu, and OpenAI API (application programming interface) offerings but must also meet data localisation requirements. ET explains what this means for Indian businesses and whether data sovereignty is on the horizon.
What does OpenAI's data residency policy mean for India?
The feature allows 'data at rest', such as prompts, uploaded files, and chat interactions, to be stored within India. The models themselves, however, still run on servers outside the country, so processing enterprise information at inference time (run time) still requires data to travel to servers outside India.
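For developers, the distinction is easiest to see at the API level. The sketch below is a minimal illustration using the official OpenAI Python SDK; it assumes, for illustration only, that residency is a project-level setting rather than a per-request parameter, and the API key and prompt are placeholders. The call itself looks unchanged: stored prompts and outputs can sit in India, but the completion is still computed on OpenAI's servers abroad.

# Minimal sketch with the official OpenAI Python SDK (pip install openai).
# Assumption for illustration: data residency is enabled on the project that
# owns this placeholder API key, so prompts and outputs are stored in India,
# while the model inference itself still runs on OpenAI's servers abroad.
from openai import OpenAI

client = OpenAI(api_key="sk-...")  # placeholder key for a hypothetical India-residency project

response = client.chat.completions.create(
    model="gpt-4o",  # example model name
    messages=[{"role": "user", "content": "Summarise this loan agreement clause."}],
)
print(response.choices[0].message.content)

In this framing, nothing in the client code changes; what moves to India is only the stored data, not the compute.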
Data localisation requirements have until now prevented OpenAI from gaining market share in India, as BFSI (banking, financial services and insurance) customers opted to host open models such as Meta's Llama and DeepSeek on-premise.
According to Aadya Misra, Partner at Spice Route Legal, 'OpenAI's residency option could allow financial institutions to deploy AI for use cases like payment processing while remaining compliant with existing requirements that require payment data to be stored locally.'
She explained that the Reserve Bank of India does permit transient cross-border processing under certain conditions, 'so if implemented thoughtfully, concerns about data in motion could also be addressed. This move could shift reliance on self-hosted open-source models to enterprise-grade and centrally managed AI solutions.'
Does this spell data sovereignty for India?
At best, the move can be seen as a first step towards compliance enablement, one that could help companies bolster contracts with responsible data-handling clauses. It falls short of complete data sovereignty, however, experts said.
'The architecture stores data 'at rest' locally, but not necessarily 'in transit' or during model inference. That data may still leave the country, exposing enterprises to regulatory scrutiny,' said Leslie Joseph, principal analyst at Forrester.
Joseph noted that OpenAI has not announced local hosting of its GPT models or inference engines in India. 'There's no evidence of compute or model weights residing in-country. This is partial localisation at best, not sovereign AI,' Joseph added.
He explained that although OpenAI has added AES-256 encryption for data at rest and TLS 1.2+ for data in transit, enterprises handling PII (personally identifiable information) will continue to face regulatory and data-exposure concerns without full model localisation, including inference compute.
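To make the transit-side guarantee concrete, here is a generic sketch (not OpenAI-specific code) that uses Python's standard library to refuse anything older than TLS 1.2 on a client connection and prints the protocol version actually negotiated with api.openai.com.

# Generic illustration of a "TLS 1.2+" client policy: connections that cannot
# negotiate at least TLS 1.2 are rejected during the handshake.
import ssl
import http.client

context = ssl.create_default_context()
context.minimum_version = ssl.TLSVersion.TLSv1_2  # disallow TLS 1.0/1.1

conn = http.client.HTTPSConnection("api.openai.com", context=context)
conn.request("HEAD", "/v1/models")  # unauthenticated; only the handshake matters here
resp = conn.getresponse()
print("HTTP status:", resp.status)                   # likely 401 without an API key
print("Negotiated protocol:", conn.sock.version())   # e.g. 'TLSv1.3'
conn.close()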
'...There is no explicit indication that the underlying GPT models, including their inference engines, tokens, or trained weights, will themselves be hosted in India,' said Ankit Sahni, Partner at Ajay Sahni & Associates.
What impact could the move have?
Speculation remains that OpenAI may eventually bring full-stack model hosting to India, given its enterprise ambitions and steady competition from cost-effective open-weight models. For now, experts say companies must treat this as a 'compliance-forward gesture.'
It could also mean opportunities for Indian data centre players.
Although the company is likely to house the local storage in the data centres of Microsoft, its long-time exclusive partner, sources told ET that OpenAI is also fielding proposals from other colocation data centre operators in India.
'Given OpenAI's shift to a for-profit structure and changing dynamics with Microsoft, we are actively seizing this opportunity to commit to a long-term relationship with them,' a senior executive at a leading data centre company told ET.
Annapurna Roy contributed to this story.