Zoom expands agentic AI features across Zoom Workplace
Zoom has expanded agentic AI features across Zoom Workplace with a Custom AI Companion add-on, a Voice Recorder, Tasks and Custom Avatars, as well as new features for Zoom Meetings, Zoom Team Chat, Zoom Revenue Accelerator and other products.
'Last month, we announced that AI Companion now includes a set of agentic skills so it can understand, plan, and get things done with minimal input from the user. We're further delivering on that promise by launching Custom AI Companion, agentic AI, and many other features. We continue to push the boundaries of innovation to bring our customers the very best AI-first solutions that drive productivity and collaboration,' said Jeff Smith, head of Product for Workplace AI, Meetings and Spaces at the company.
AI Companion helps users stay organised and handles routine work such as taking notes during Zoom Meetings. It can automatically generate recommended tasks from meeting summaries and assign them to the right person, or be prompted to surface action items from Zoom Team Chat, Zoom Mail and Zoom Docs and turn them into trackable tasks. It also recommends next steps, offers insights on each action, and tracks progress in a task management tab.
Zoom has also launched Custom AI Companion as a paid add-on, letting organisations tailor AI agents and skills to their needs. With it, users can connect beyond Zoom to third-party AI agents: Zoom AI Studio will support both Anthropic's Model Context Protocol (MCP) and Google's Agent2Agent (A2A) protocol, so external AI agents can be integrated with AI Companion.
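Zoom hasn't published integration details, but MCP is an open standard with a public Python SDK, which gives a feel for the kind of external agent that could plug in. The sketch below is a minimal MCP server exposing a single tool; the server name, tool and data are invented for illustration and are not Zoom's API.

```python
# Minimal MCP server sketch (pip install mcp). The "open_tasks" tool and its
# stubbed data are hypothetical; how Zoom AI Studio actually wires external
# MCP servers into AI Companion is not described in the announcement.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("task-source")  # arbitrary server name

@mcp.tool()
def open_tasks(assignee: str) -> list[str]:
    """Return open task titles for an assignee (stub standing in for a real backend)."""
    fake_db = {"alice": ["Draft Q3 plan", "Review meeting notes"]}
    return fake_db.get(assignee.lower(), [])

if __name__ == "__main__":
    mcp.run(transport="stdio")  # serve over stdio so a local MCP client can connect
```

Any MCP-capable client can then discover and call the `open_tasks` tool, which is the interoperability the protocol is designed to provide.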
The AI Companion Voice Recorder can record, transcribe and summarise conversations, capturing action items even when users aren't in a Zoom Meeting or on a Zoom Phone call. The feature arrives on mobile later this month, with Zoom Rooms support to follow.
Zoom is also using small language models optimised for specific AI skills to power translation in Zoom Team Chat, covering German, Spanish, Italian, French, Portuguese, Brazilian Portuguese, Simplified Chinese and Traditional Chinese. Users can also mention people who aren't in a chat and share their contact info over Zoom Team Chat without adding them to the conversation.
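Zoom hasn't named its models, but the approach of using a compact, single-task translation model can be illustrated with an open one. The sketch below uses Hugging Face's MarianMT English-to-German model purely as a stand-in; the model choice and pipeline are assumptions, not Zoom's stack.

```python
# Sketch of translation with a small task-specific model
# (pip install transformers torch). Helsinki-NLP's MarianMT models are compact
# and tuned for one language pair each -- a stand-in for the kind of
# specialised small language model described above.
from transformers import pipeline

translator = pipeline("translation_en_to_de", model="Helsinki-NLP/opus-mt-en-de")
result = translator("The meeting summary is ready for review.")
print(result[0]["translation_text"])  # prints a German rendering of the sentence
```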
Organisations can connect AI Companion chats to a variety of enterprise data sources, such as project management tools, cloud storage, email accounts and customer databases, through Amazon Q Business and Glean, helping users quickly pull relevant context during a Zoom meeting.
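The announcement doesn't describe the wiring, but Amazon Q Business does expose a public ChatSync API through boto3, which shows the kind of call such a connector makes over already-indexed enterprise data. The application ID and question below are placeholders, and the snippet assumes an existing Q Business application.

```python
# Sketch of querying Amazon Q Business over connected enterprise data
# (pip install boto3; assumes AWS credentials and a Q Business application
# with data sources already indexed). All IDs here are placeholders.
import boto3

client = boto3.client("qbusiness", region_name="us-east-1")
response = client.chat_sync(
    applicationId="your-q-business-app-id",  # placeholder
    userMessage="What did the customer ask for in the last support ticket?",
)
print(response["systemMessage"])  # answer grounded in the connected sources
```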
AI Companion also helps users create a custom avatar for Zoom Clips, so they can quickly record a clip and share it with team members who need to catch up on something they missed. The feature will be available in May, both with the Custom AI Companion add-on and as a separate SKU.
There's also a new app developed with Salesforce that provides AI summaries of Zoom Phone calls directly in the app. Another tool available now, Workvivo AI, helps users write better updates, search for answers more easily and build more engaging surveys.
Zoom Revenue Accelerator has also been upgraded with AI insights and automation that analyse sales conversations and surface customer insights. A new AI agent can apply admin-configured sales methodologies such as BANT, SPICED and MEDDICC on behalf of sales reps, reducing time spent on manual data entry and keeping CRM entries accurate.
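Zoom hasn't described the agent's internals. As a rough illustration of what "applying a methodology" to a conversation means, the sketch below maps transcript lines onto BANT fields with a naive keyword matcher; a production agent would use an LLM for extraction, and everything here is invented for the example.

```python
# Illustrative-only sketch: sort a sales-call transcript into BANT fields
# (Budget, Authority, Need, Timeline) before pushing them to a CRM.
# A real agent would use an LLM; keyword matching is just a stand-in.
from dataclasses import dataclass, field

@dataclass
class BANTRecord:
    budget: list[str] = field(default_factory=list)
    authority: list[str] = field(default_factory=list)
    need: list[str] = field(default_factory=list)
    timeline: list[str] = field(default_factory=list)

KEYWORDS = {
    "budget": ("budget", "cost", "price"),
    "authority": ("decision", "approve", "sign-off"),
    "need": ("need", "problem", "pain"),
    "timeline": ("quarter", "deadline", "by the end"),
}

def extract_bant(transcript: list[str]) -> BANTRecord:
    record = BANTRecord()
    for line in transcript:
        lowered = line.lower()
        for bucket, words in KEYWORDS.items():
            if any(w in lowered for w in words):
                getattr(record, bucket).append(line)
    return record

calls = ["We only have budget approved for Q3.", "Dana has final sign-off."]
print(extract_bant(calls))  # sentences sorted into BANT buckets, ready for CRM fields
```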
Zoom has also introduced Zoom Workplace for Frontline, a mobile solution for frontline workers in retail, healthcare, manufacturing, hospitality and other industries, along with Zoom Workplace for Clinicians, which provides AI-generated notes for healthcare customers.