Google announces general availability for Veo 3 and Veo 3 Fast on Vertex AI

The Hindu | 30-07-2025
Google has made its AI video generation model, Veo 3, generally available on Vertex AI. The company said Veo 3 Fast, a faster and more economical version of Veo 3, is also generally available on Vertex AI.
The models' image-to-video capability, which generates video clips from a still image guided by a text prompt, will be available in public preview on Vertex AI in August.
Google said the generated videos are watermarked with SynthID. Veo 3 and Veo 3 Fast are also among the tools covered by Google's indemnity for generative AI services.
In a blog post, the company said that since Veo 3's preview launch on Vertex AI in June, enterprise clients have generated over 6 million videos, signalling high demand.
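For teams that want to try the image-to-video flow programmatically, the sketch below shows roughly how a Veo request could look through the google-genai Python SDK in Vertex AI mode. The model ID, project, bucket and image path are placeholders, and exact parameter names can differ between SDK releases, so treat this as a minimal sketch rather than Google's canonical integration.

```python
import time
from google import genai
from google.genai import types

# Vertex AI mode: assumes a Google Cloud project with the Vertex AI API enabled.
client = genai.Client(vertexai=True, project="your-gcp-project", location="us-central1")

# Kick off an image-to-video request: a still image plus a text prompt.
# The model ID and Cloud Storage paths below are placeholders, not confirmed values.
operation = client.models.generate_videos(
    model="veo-3.0-generate-001",
    prompt="Slow dolly-in on the storefront as neon signs flicker to life at dusk",
    image=types.Image(gcs_uri="gs://your-bucket/storefront.png", mime_type="image/png"),
    config=types.GenerateVideosConfig(
        number_of_videos=1,
        duration_seconds=8,
        output_gcs_uri="gs://your-bucket/veo-output/",
    ),
)

# Video generation is a long-running operation, so poll until it finishes.
while not operation.done:
    time.sleep(20)
    operation = client.operations.get(operation)

# Each result is written to Cloud Storage; the clips carry SynthID watermarks.
for video in operation.response.generated_videos:
    print(video.video.uri)
```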

Related Articles

Google, schmoogle: When to ditch web search for deep research

Mint | 2 hours ago

Searching for the perfect electric car could have taken hours. Instead, I opened ChatGPT, clicked the deep research button and walked away from my computer. By the time I'd made coffee, ChatGPT delivered an impressive 6,800-word report.

This year, ChatGPT and other popular AI chatbots introduced advanced research modes. When activated, the AI goes beyond basic chat, taking more time, examining more sources and composing a more thorough response. In short: It's just more. Now free users can access this feature, with limits. Recent upgrades, such as OpenAI's latest GPT-5 model, have made research even more powerful.

For the past few months, I've experimented with deep research for complicated questions involving big purchases and international trip planning. Could a robot-generated report help me make tough decisions? Or would I end up with 6,000-plus words of AI nonsense? The bots answered questions I didn't think to ask. Though they occasionally led me astray, I realized my days of long Google quests were likely over. This is what I learned about what to deep research, which bots work best and how to avoid common pitfalls.

Deep research is best for queries with multiple factors to weigh. (If you're just getting started, hop to my AI beginner's guide first, then come back.) For my EV journey, I first sought advice from my colleagues Joanna and Dan. But I needed to dig deeper for my specific criteria, including a roomy back row for a car seat, a length shorter than my current SUV and a battery range that covers a round trip to visit my parents.

I fed my many needs into several chatbots. When I hit enter, the AI showed me their 'thinking'. First, they made a plan. Then, they launched searches. Lots of searches. In deep research mode, AI repeats this cycle—search then synthesize—multiple times until satisfied. Occasionally, though, the bot can get stuck in its own rabbit hole and you need to start over.

Results varied. Perplexity delivered the quickest results, but hallucinated an all-wheel-drive model that doesn't exist. Copilot and Gemini provided helpful tables. ChatGPT took more time because it asked clarifying questions first—a clever way to narrow the scope and personalize the report. Claude analyzed the most sources: 386. Deep research can take 30 minutes to complete. Turn on notifications so the app can let you know when your research is ready.

My go-to bot is typically Claude for its strong privacy defaults. But for research, comparing results across multiple services proved most useful. Models that appeared on every list became our top contenders. Now I'm about to test drive a Kia Niro, and potentially spend tens of thousands based on a robot's recommendation. Basic chat missed the mark, proposing two models that are too big for parallel parking on city streets.

Other successful deep research queries included a family-friendly San Francisco trip itinerary, a comparison of popular 529 savings plans, a detailed summary of scientific consensus on intermittent fasting and a guide to improving my distance swimming. On ChatGPT and Claude, you can add your Google Calendar and other accounts as sources, and ask the AI to, for example, plan activities around your schedule. Deep research isn't always a final answer, but it can help you get there.

Ready for AI to do your research? Switch on the 'deep research' or 'research' toggle next to the AI chat box. ChatGPT offers five deep research queries a month to free users, while Perplexity's free plan includes five daily. Copilot, Gemini and Grok limit free access, but don't share specifics. Paid plans increase limits and offer access to more advanced models. Claude's research mode requires a subscription.

Here are tips for the best results:

  • Be specific. Give the AI context (your situation and your goal), requirements (must-haves) and your desired output (a report, bullets or a timeline). Chatbots can't read your mind…yet.
  • Enable notifications. Deep research takes time. Turn on notifications so the app can ping you when your response is ready.
  • Verify citations. AI can still make mistakes, so don't copy its work. Before making big decisions, click on citations to check source credibility and attribution.
  • Summarize the output. Reports can be long. Ask for a scannable summary or table, then dive into the full text for details.
  • Understand limitations. The information is only as good as its sources. These chatbots largely use publicly available web content. They can't access paywalled stuff, so think of it as a launchpad for further investigation.

Whatever the imperfections of deep research, it easily beats hours and days stuck in a Google-search black hole. I have a new research partner, and it never needs a coffee break.

News Corp, owner of Dow Jones Newswires and The Wall Street Journal, has a content-licensing partnership with OpenAI. Last year, the Journal's parent company, Dow Jones, sued Perplexity for copyright infringement.
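To make the column's 'search then synthesize' cycle concrete, here is a toy sketch of that loop. Every helper function and the stopping rule below are hypothetical stand-ins, not any vendor's actual deep-research API; real products run a loop like this inside their own agent stacks.

```python
# Toy sketch of the plan -> search -> synthesize loop described above.
# All helpers are hypothetical stand-ins, not any vendor's actual API.

def plan_queries(question: str, notes: list[str]) -> list[str]:
    # A real agent would have the model draft follow-up searches from what it has learned so far.
    return [f"{question} (follow-up {len(notes) + 1})"]

def web_search(query: str) -> list[str]:
    # Stand-in for a search-tool call returning snippets from candidate sources.
    return [f"snippet for: {query}"]

def synthesize(question: str, snippets: list[str]) -> str:
    # Stand-in for the model condensing new snippets into a working note.
    return f"note on '{question}' from {len(snippets)} snippet(s)"

def is_satisfied(notes: list[str]) -> bool:
    # A real agent asks the model whether the evidence is sufficient; here we simply cap effort.
    return len(notes) >= 3

def deep_research(question: str, max_rounds: int = 10) -> str:
    notes: list[str] = []
    for _ in range(max_rounds):
        for query in plan_queries(question, notes):
            notes.append(synthesize(question, web_search(query)))
        if is_satisfied(notes):
            break
    # The final step composes the long-form report from the accumulated notes.
    return "\n".join(notes)

print(deep_research("Which compact EV fits a car seat and a 300 km round trip?"))
```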

Last-minute Pixel 10 leak says no SIM tray, launch in 10 days at Made by Google event

India Today | 4 hours ago

Rumours are swirling once again in Pixel land, and this time they could spell a rather big change for Google's upcoming flagship series. The Pixel 10 family, expected to launch at the Made by Google event in just 10 days, might just wave goodbye to a little metal component that's been a part of every smartphone since the early days: the humble SIM card tray.

According to a fresh post by well-known tipster Evan Blass on X (formerly Twitter), Google is allegedly plotting to strip the physical SIM slot from three of its next-gen handsets: the Pixel 10, Pixel 10 Pro, and Pixel 10 Pro XL. Instead, these devices would rely entirely on eSIM technology, specifically two active eSIM slots, to keep you connected. In other words, if this leak is accurate, you'll never have to poke your phone with that little pin again, though you might also lose the ability to quickly swap SIMs on the fly.

Blass claims the Pixel 10 Pro Fold, the foldable expected to launch alongside the rest of the line-up, will still hang on to its physical SIM tray. It's almost as if Google has decided foldable owners deserve that extra bit of old-school flexibility.

But there's another wrinkle: Blass responded to one curious follower asking whether this SIM tray removal would be a global decision or a region-specific one. His answer? The eSIM-only move could be limited to the United States. If that's the case, buyers in other markets might still get their beloved SIM slot, though, as always, that remains to be seen.

And while the leak is intriguing, it's far from bulletproof. For starters, Blass is usually a deadpan, detail-first type of tipster, but this post began with an oddly vague 'tipster suggests' intro, unusual for someone with his track record. Then there's the matter of other Pixel 10 leaks we've already seen. Early CAD-based renders showed a perfectly ordinary SIM slot across all models. Even real-life prototype images that surfaced online featured a SIM tray in the frame. That doesn't exactly scream 'eSIM revolution'.

It's also worth pointing out that the images we've seen in recent weeks haven't shown the top edge of the phone in great detail, which is where the SIM slot would usually be visible. That leaves just enough mystery for speculation to thrive. And thrive it has: the replies under Blass's post quickly filled with strong opinions.

Some Pixel fans see this potential change as inevitable. Apple has already gone eSIM-only with its US iPhone 14 and 15 models, and the writing could be on the wall for physical SIMs in certain regions. eSIMs can make devices more waterproof, free up internal space for other components, and simplify network switching for those comfortable doing everything digitally.

Others, however, are less enthusiastic. Travellers, in particular, often prefer a physical SIM slot because it makes buying and popping in a local SIM card quick and painless. With eSIMs, the process can be more fiddly – and if your phone breaks, transferring an eSIM to another device isn't always as straightforward as swapping a physical card.

If Google does follow through with this change, it could find itself walking a tightrope between innovation and alienating a chunk of its audience. In the US, the transition to eSIMs has been slow but steady, with major carriers embracing the tech. Still, there are plenty of people who simply like the reassurance of having a physical card they can hold in their hand.

Of course, there's every chance this rumour turns out to be a false alarm, or at least a premature one. Google might be testing eSIM-only models internally, or exploring a regional rollout, without committing to ditching the SIM tray everywhere. After all, smartphone manufacturers often trial multiple hardware configurations before settling on the final production design.

Until we see more concrete leaks, or hear from Google itself, it's wise to keep the salt shaker handy. But if you're a die-hard SIM card loyalist living in the US, you might want to brace yourself for the possibility that your next Pixel could be missing that little slot you've been using for years. On the bright side, you'll no longer have to rummage through drawers looking for that fiddly SIM ejector tool.

Why Big Tech is focusing on Indian languages

Mint | 4 hours ago

On Thursday, OpenAI chief executive Sam Altman unveiled GPT-5 with native support for 12 Indian languages. Last year, Google expanded its AI model Gemini's native support for nine Indian languages. With artificial intelligence startups Anthropic and Perplexity also focusing on Indian languages, the regional-language internet is fast emerging as a huge AI battleground. Mint explains why.

Why are languages important for AI firms?

Foundational AI models are trained on massive troves of data and produce responses in plain text. To do this, AI firms rely on publicly available information to train their models, and most information on the internet is in English. As a result, the world's top AI models are all natively based on data that's available in English. This leads to various biases in the way AI models understand user queries, which makes wider language access a fundamental necessity for AI companies.

Why are Indian languages important for AI firms?

Hindi, as per global language repository Ethnologue, is the world's third-most spoken language, after English and Mandarin. Cumulatively, 10 Indian languages are spoken by 1.7 billion people, or 21% of the world's population—ahead of English (with 1.5 billion speakers), and varying versions of Chinese (1.4 billion). This makes India the world's single-largest region for tech companies to tap into. Beyond the numbers, experts underline that each language has its own nuance, regional dialects, biases, and complications. Indian languages, owing to their scale, are crucial resources for AI models that cater to the world.

Are all global firms targeting India?

Yes. Last week, Sam Altman said OpenAI's latest model, GPT-5, natively supports 12 Indian languages. Last year, Google announced native support for nine Indian languages. Meta, too, said last year that its Llama family of AI models would support eight Indian languages. Anthropic's Claude supports Hindi and Bangla. Perplexity, another prominent Silicon Valley startup, supports inputs and outputs in Hindi.

How important is India in terms of business potential?

This is difficult to assess. India is one of the world's largest user bases for any AI firm. However, diverse consumer behaviour makes it difficult to monetize this market. As a result, India's contribution to the net revenue of global tech firms has only ranged between 1% and 4%. AI-first companies, however, are of the opinion that they can incrementally add to the way global companies have generated revenue from India, as most AI tools and platforms need enterprise-grade subscriptions to leverage AI. With a vast base of users, most tech firms expect India to become a major monetization hub.

Can AI see the replication of India's DPI push?

India, through the government's backing, is keen to build foundational models trained natively on Indian languages. Startups and industry veterans state that in the long run, an AI model trained on most Indian languages can be used as a template for other non-English AI models around the world. This, in the long run, could be akin to India's push to offer digital public infrastructure (DPI) to the world—which it did in digital payments via the unified payments interface (UPI). While other nations are also building their own sovereign AI models, India believes it can gain soft power by offering AI models to the global south.
