Google is testing a vibe-coding app called Opal

TechCrunch · 4 days ago
AI-powered coding tools have become so popular over the past few months that almost every major tech company is either using one or making its own. Makers of these so-called 'vibe-coding' tools are a hot commodity at the moment, with startups like Lovable and Cursor fending off buyers and investors keen to tap the trend.
Google's now become the latest to hop on this bandwagon: the company is testing a vibe-coding tool called Opal, available to users in the U.S. through Google Labs, which the company uses as a base to experiment with new tech.
Opal lets you create mini web apps using text prompts, or you can remix existing apps available in a gallery. All users have to do is type in a description of the app they want to make, and the tool will then use different Google models to build it.
Once the app is ready, you can navigate into an editor panel to see the visual workflow of input, output, and generation steps. You can click on each workflow step to look at the prompt that dictates the process, and edit it if you need to. You can also manually add steps from Opal's toolbar.
Opal also lets users publish their new app on the web and share the link with others to test out using their own Google accounts.
Google's AI Studio already lets developers build apps using prompts, but Opal's visual workflow indicates the company likely wants to target a wider audience.
The company joins a long list of competitors, including Canva, Figma, and Replit, that are making tools to encourage non-technical people to create prototypes of apps without having to do any coding.

Related Articles

Google is bringing image and PDF uploads to AI Mode

Engadget · 27 minutes ago
Google is updating AI Mode on desktop this week with the ability to process images, so you can ask it detailed questions about the pictures like you already can on mobile. In the coming weeks, the company is also adding support for PDF uploads on desktop in the US, which could help you digest lengthy course or work materials. You can ask AI Mode to summarize the documents for you and ask follow-up questions that it will then answer by cross-referencing the materials you uploaded with information available on the web. Google says AI Mode's responses will also include links to its references that you can visit in order to dig deeper. AI Mode will support additional file types for upload, including ones straight from your Google Drive, in the coming months as well.

In addition to PDF upload support, Google is also rolling out a new Canvas feature that you can access if you're enrolled in the AI Mode Labs experiment in the US. You can use Canvas to consolidate all relevant information about a specific topic or for a specific purpose in a side panel that updates as you ask AI Mode more follow-up questions. If you're traveling, for instance, you can ask AI Mode to make you an itinerary and click the Create Canvas button. You'll be able to keep refining the itinerary with more questions, and you can always leave it alone for a while and come back to it later.

AI Mode's Search Live is also getting video input on mobile this week, a feature Google announced at I/O 2025, after voice input arrived in June. To access video input, you'll have to open Lens in the Google app and tap the Live icon before asking questions about what the camera sees. When Google revealed the feature during its annual developers' event, it said you could point the camera at a math problem, for example, and ask Search to help you solve it or to explain a concept you're having trouble understanding.

Finally, with Lens in Chrome, you'll be able to ask AI Mode what's on your desktop screen. The company will roll out an "Ask Google about this page" dropdown option in the address bar "soon." When you click on it, AI Mode will create an overview with key information on what's being shown on your screen, whether it's a web page or a PDF.

Google's AI Mode is quietly turning search into a productivity tool

Digital Trends · 27 minutes ago

Google Search has been one of the primary gateways to information on the internet, but it's about to evolve into something more. With the latest set of features being added to AI Mode, Search will no longer be just a tool for finding links or information, but an assistant that can help you organize, understand, and act on that information. Instead of just answering questions, AI Mode is being transformed into something closer to a helpful workspace. It will soon guide you through complex documents, explain visuals, and even help with multi-step tasks. With these new tools, Google is slowly changing how search works by helping you do more than just find information.

What's new in AI Mode

A handful of new capabilities are being introduced that expand what AI Mode can do, particularly on desktop browsers, where users often juggle multiple tabs, files, and formats during more complex workflows.

PDF and image uploads for context-aware queries

AI Mode on desktop will now support uploading PDFs and images, allowing users to ask questions about the content in those files and receive web-informed, AI-generated responses. Imagine having a research report or technical manual in front of you. Instead of searching for terms manually, you can now upload the file, highlight a section, and ask, 'Can you explain this further?' The AI will analyze the document and return contextually relevant explanations, along with links for deeper reading. Support for additional file types, including those from Google Drive, is expected in future updates, expanding this capability to more kinds of content.

Canvas for task planning and organization

Another interesting addition is Canvas, which allows users to create and refine plans in a persistent, editable side panel. It's a tool designed for tasks that span multiple sessions, like project planning, research outlines, trip itineraries, and more. The system lets you iterate in real time: ask AI Mode to draft a plan, make changes through follow-up prompts, and organize the results visually in the side panel. Users will also be able to upload their own files, like meeting notes, to help personalize the output. Canvas essentially helps you stay organized and make progress across sessions, documents, and devices.

Search Live: AI conversations with visual input

Perhaps the most technically ambitious update is Search Live, which integrates Google's camera-based Lens tool with AI Mode to deliver real-time, conversational help based on what your camera sees. Whether it's a diagram, a schematic, or a physical object, you can point your phone's camera at it and start a conversation. The AI interprets the visual data, offers insights, and even lets you refine questions, creating a kind of live tutoring or troubleshooting session powered by AI and the web. This feature is based on Google's Project Astra work and is being rolled out on mobile in the U.S. for users enrolled in the Labs experiment.

AI Mode in Chrome: Smarter browsing, fewer tabs

For desktop users, AI Mode is getting more closely integrated into Chrome. Soon, you will be able to click 'Ask Google about this page' from the address bar, which will launch Lens and the AI assistant to help you understand whatever is on your screen, whether it is a complex chart, a technical section, or a difficult diagram. You can even ask follow-up questions directly in the side panel, making it easier to explore a topic without switching tabs or starting a new search. This could change how people interact with web content and with Google itself.

A more useful, less interruptive AI?

There's no shortage of AI tools that promise to boost productivity. But where many require a full switch of platforms or behavior, AI Mode is being embedded into existing habits, including Search, Chrome, and Lens. Rather than pitching itself as a digital co-pilot or assistant with a personality, Google is trying to make AI Mode feel more like a context-aware layer for everyday digital tasks: upload a file, ask a question, build a plan, check in later, all within the browser or the search bar. Google says that it is gradually rolling out these features to AI Mode, with some already available in early access for users who have joined the AI Mode Labs program.

Google Search just got a major upgrade — you can now ask images, PDFs and more with AI Mode

Tom's Guide · 27 minutes ago

Google is rolling out a new wave of AI features to Search just in time for the back-to-school season. The updates expand what users can do in AI Mode, Google's experimental interface that blends generative AI with traditional search. The latest features include support for asking questions about PDFs and images, a new planning workspace called Canvas, and real-time help with video input through Search Live. While many of these tools are still in preview through Google's AI Labs, they show how the company is layering more interactivity into everyday search experiences. Now users can ask about PDFs, images and, soon, their own files.

Arguably one of the most useful upgrades is the ability to ask questions about uploaded PDFs, a feature rolling out to desktop over the next few weeks. This builds on existing image-based search in AI Mode, which is already available in the Google app on Android and iOS and is now expanding to desktop browsers as well. For example, students can upload lecture slides or scanned handouts, then ask detailed questions about the content. AI Mode cross-references the file with web sources to generate a summary or explanation, complete with links to dig deeper. Support for additional file types, including documents stored in Google Drive, is expected in the months ahead.

Google is also launching Canvas, a side panel that helps you organize plans or content from your AI searches. For example, a student could ask AI Mode to help create a study guide for an upcoming test. Now, users will see an option to 'Create Canvas,' which opens a dynamic space where the outline or plan begins to take shape. Canvas is designed to be flexible, meaning you can refine it over multiple sessions, ask follow-up questions, and soon even upload your own files (like class notes or a syllabus) to tailor the results. Of course, this feature isn't just for students; it can be useful for building travel itineraries, comparing shopping lists, or brainstorming creative projects. Canvas is rolling out first to users enrolled in Google's AI Mode Labs experiment in the U.S., and will appear in desktop browsers.

Another new feature, Search Live, brings a real-time, video-enhanced layer to Search. Using your phone camera and Google Lens, you can point at an object or diagram and start a conversation with AI Mode, asking questions as you move your camera or change the angle. The tool, powered in part by Google's Project Astra, is designed to feel like having an expert explain something as you look at it. Search Live is rolling out now on mobile to AI Labs users in the U.S.; it integrates directly with the existing Lens tool in the Google app.

Finally, Google is adding tighter integration between AI Mode and Chrome. Starting soon, when you click the Chrome address bar, you'll see a new 'Ask Google about this page' option. This lets you use AI Mode to get an overview of what you're reading, such as an explanation of a diagram in a math problem, or follow up with deeper questions using the 'Dive deeper' button. The feature builds on Lens in Chrome, and the AI summary appears in the side panel, helping users better understand complex topics without leaving the page. If you're in the U.S. and enrolled in Google's Search Labs, you can activate AI Mode by selecting the new button on the Google homepage (on desktop) or through the Google app (on mobile).

Many of these features, like Canvas and Search Live, will appear gradually over the coming weeks. Google says more file types, better customization, and smarter follow-up interactions are planned in the months ahead. For students, professionals and casual users alike, AI Mode is becoming a more capable and more personalized tool for learning and exploration.
