This tiny prompt change makes ChatGPT way more useful — here's how


Tom's Guide | 08-05-2025

We've all been there: you open ChatGPT, Claude, Gemini — whichever AI chatbot you're using — and type in a quick query, expecting something smart, helpful, maybe even insightful. But what you get back is a disappointment. The response is bland, generic, or sometimes completely off the mark. Before you blame the AI, consider this: the real issue might be how you're prompting the chatbot. Those short, vague queries might work fine for pulling up web pages, but they fall flat with a language model that is trained to deliver answers differently.
The truth is, most of us are still using AI the way we use Google. That's understandable, since Google has been around far longer. But until we shift that mindset, we're going to keep getting lackluster results from tools that are actually capable of so much more. Here's how to get the results you want with one simple change.
For decades, we've trained ourselves to think in keywords. We've Googled "best travel backpacks," "how to ask for a raise," and "easy pasta recipes." These shorthand phrases work fine for a search engine that pulls from indexed pages, but modern AI tools don't search the web the way Google does.
AI chatbots generate answers based on context, language patterns, and predictive reasoning.
When you give AI a search-style query, it gives you a search-style answer, which is often too vague, too surface-level, or too robotic to be truly helpful.
The major chatbots of 2025 — ChatGPT-4o, Claude 3.7 Sonnet, Gemini 2.0, and Perplexity — aren't just pulling from a database. They're trained to respond like humans, and they thrive when you give them real human input, including tone, context, and intent.
Consider a side-by-side example using the following prompts:
Google-style prompt: "Best productivity hacks 2025"
Conversation-style prompt: "I'm juggling a full-time job and two kids and feel like I'm constantly behind. What are three realistic productivity tips I can try this week?"
The difference is clear. The second version is far more natural and gives the AI something to work with. As a result, the answers are often more specific, personalized, and usable.
If you remember nothing else, remember this: Start your prompt with 'I.'
It seems simple, but it's game-changing. Try prompts like this:
'I'm struggling with time management — can you help me structure my day?'
'I'm prepping for a job interview in marketing. What questions should I be ready for?'
'I need help brainstorming ideas for my mom's birthday gift — she loves gardening and dogs.'
Adding a human voice (yours) activates the AI's conversational strengths. You're no longer searching. You're collaborating with an AI assistant that's ready to help.
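If you script against a chatbot API rather than typing into a chat window, the same principle applies to programmatic prompts. Below is a minimal, hypothetical Python sketch — the function name and template are illustrative, not from any library or API — that wraps a terse keyword query in the first-person framing the article recommends:

```python
def conversationalize(keyword_query: str, situation: str,
                      ask: str = "What would you suggest?") -> str:
    """Expand a search-style query into a first-person, context-rich prompt.

    situation: the personal context you'd text to a friend.
    ask: the concrete request you want the chatbot to answer.
    (Illustrative helper; the template wording is an assumption.)
    """
    topic = keyword_query.strip().rstrip("?.").lower()
    return f"I'm {situation}, and I'm looking into {topic}. {ask}"

# The article's "Best productivity hacks 2025" query, rephrased:
prompt = conversationalize(
    "Best productivity hacks 2025",
    "juggling a full-time job and two kids",
    ask="What are three realistic tips I can try this week?",
)
print(prompt)
```

The resulting prompt starts with "I'm", states the situation, and ends with a concrete ask — exactly the structure the side-by-side comparison above rewards.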
With the release of models like ChatGPT-4o and Gemini 2.5 Pro, users now have access to voice, vision, memory and multimodal tools.
These models are more powerful than ever, but power doesn't mean much if you're still feeding them flat, lifeless prompts.
You'll get the most out of your queries if you think of your prompts as texts to a good friend. The better you phrase your message, the more helpful (and human) the response will be. If you've avoided getting personal because you're nervous about the chatbot training on your data, there are ways to opt out so you can get personal without some of the risks.
You don't need to overthink it. You just need to start with 'I.'
Chatbots aren't mind readers, but they are excellent conversationalists when you treat them like one. As chatbots integrate more into search, keep this in mind. Next time you open your favorite AI chatbot and enter a query, try ditching the keyword search and talking like a human instead.


Related Articles

Google Issues Emergency Update For All 3 Billion Chrome Users

Forbes

9 minutes ago



Update all browsers now. Google has suddenly released an emergency Chrome update, warning that a vulnerability discovered by its Threat Analysis Group has been used in attacks. Such is the severity of the risk that Google also confirmed that, ahead of this update, the issue 'was mitigated on 2025-05-28 by a configuration change' pushed out to all platforms.

Google says it 'is aware that an exploit for CVE-2025-5419 exists in the wild,' and that full access to details on the vulnerability will 'be kept restricted until a majority of users are updated with a fix. We will also retain restrictions if the bug exists in a third party library that other projects similarly depend on, but haven't yet fixed.'

CVE-2025-5419 is an out-of-bounds read and write in V8, the type of dangerous memory flaw typically found and fixed in the world's most popular browser. While it's only marked as high-severity, the fact that attacks are underway means applying the fix is critical. There is already a U.S. government mandate for federal staff to update Chrome by Thursday or stop using the browser, after a separate attack warning, and there has been another high-severity update since then, with two separate fixes. It is inevitable that this latest warning and update will also prompt CISA to issue a 21-day update mandate.

There is a second fix included in this emergency update — CVE-2025-5068 is another memory issue, a 'use after free in Blink,' that was disclosed by an external researcher. NIST warns that CVE-2025-5419 'allows a remote attacker to potentially exploit heap corruption via a crafted HTML page,' and that it applies across Chromium, suggesting other browsers will also issue emergency patches.

As usual, you should see a flag on your browser indicating that the update has downloaded. You need to restart Chrome to ensure it takes full effect. All your normal tabs will then reopen — unless you elect not to do that.
But your Incognito tabs will not reopen, so make sure you save any work or copy down any URLs you want to revisit.

Android XR: Everything you need to know

Tom's Guide

An hour ago



Android XR is Google's new AI-powered platform for a new wave of mixed reality headsets and smart glasses. The platform has had a slow rollout so far, but we expect the first devices this year. There aren't any devices powered by Android XR for sale yet, though Samsung is slated to be the first manufacturer out of the gate with its Project Moohan headset at some point in 2025. A new pair of smart glasses is also on the way from Xreal, called Project Aura.

Smart glasses and headsets are going to be a significant part of Google's future product lineup. Google will also fully integrate Gemini, its homegrown artificial intelligence, into this family of immersive devices. The Apple Vision Pro and visionOS should soon face stiff competition. So what is Android XR, and how will it shape the next generation of AI-powered headsets and smart glasses? Here's what you need to know about Google's extended reality platform and when devices will be available.

Android XR is Google's new operating system for extended reality devices. It's intended for use with virtual reality (VR) and mixed reality (MR) headsets, as well as smart glasses. Android XR is part of the Android platform, which extends beyond smartphones to tablets, wearables, car dashboards and TVs. Android XR enables developers and device makers to utilize tools such as ARCore, Android Studio, Jetpack Compose, Unity and OpenXR to create specialized apps, games and other experiences within a development environment similar to the rest of the ecosystem. Google is collaborating on the framework with key manufacturing players Samsung and Qualcomm.

Google has developed versions of its suite of apps for use on the XR platform. They include favorites like Google Photos, Google Maps, Chrome, and YouTube. That's just the start of the Google-led experiences that will be available at launch.
Extended reality is an umbrella term for an immersive experience combining physical and digital components. The physical component is something you wear on your head or face, while the digital part refers to something like the heads-up display on a pair of smart glasses. Android XR is not Google Glass, despite Glass being the predecessor. While it is an evolution of the initial platform launched in 2013, Android XR is an extension of the broader Android platform. Its existence should help expand Android's reach beyond phones, tablets, cars and TVs. Android XR shares many similarities with Apple's visionOS on the Vision Pro, as well as Meta's extended reality offerings. Meta calls its software Horizon OS, which powers the Quest 3 and Quest 3S headsets.

Android XR offers two main experiences out of the gate. The first is in the form of a visor-like headset that goes over the head. Samsung's Project Moohan is an example of that. The device uses outward-facing cameras and sensors to map the environment and projects it inward, allowing you to walk around. The headset then projects a desktop-like environment that spans the length of the headset. Place your hand in view, and Android XR will recognize it as input. Pinch and grab the various translucent windows or layer them on top of one another. You can even click out of them like on the desktop. Or, use Gemini to summon a fully immersive video experience using spatial audio.

Android XR on a pair of smart glasses is a different experience. The demonstration at Google I/O 2025 showed a pair of thick, wire-frame glasses with discreet buttons on either side and a touchpad. Once the smart glasses are on, a heads-up display (HUD) is visible, positioned off to the side. Unlike Android XR on the headset, there is no desktop or main home screen. There's also no physical input or need to extend your hands out front to control anything. Instead, menu screens and information pop in as needed and only hover when active.
For instance, while in navigation mode, Google Maps will display arrows pointing in the direction to walk or ride. Android XR also lets you summon Gemini Live, particularly on smart glasses. Most interaction is conducted through voice commands. You can ask Gemini contextual questions and prompt it to provide information on what you're looking at.

Samsung is the first name you'll likely associate with Android XR, since it was the first to be mentioned alongside Google's announcement that it would essentially 're-enter' the extended reality realm. Right now, we're waiting on Samsung's Project Moohan to make its debut (and reveal its actual name). Samsung teased in a recent earnings call that it would 'explore new products such as XR' in the second half of 2025. Xreal is another major player in the Android XR space. The company announced its Project Aura headset would be the second device launched with Android XR and that it would reveal more details at Augmented World Expo (AWE) in June 2025, with a potential product launch later this year. That would put it around the same timeline as Samsung's headset. Other device makers like Lynx and Sony have also been mentioned as partners in the Android XR push. Qualcomm makes the Snapdragon XR2+ Gen 2 silicon, built especially for this product category.

For smart glasses, Google is working on an in-house pair. Although there's a reference device, there's nothing available for consumers quite yet. Eyewear brands like Gentle Monster and Warby Parker have been tapped by Google to develop stylish smart glasses with Android XR, though there's no timeline available there. The first Android XR devices should be available in the second half of 2025. Based on what Samsung and Xreal have mentioned in earnings calls and press releases, they should be among the first to roll out Android XR-based products. The overall cost of Android XR headsets and smart glasses has yet to be determined.
Samsung and Xreal will be the companies to set the standard pricing for the headset and glasses, respectively. Any Android XR smart glasses would likely have to be priced on par with the Ray-Ban Metas, which start at $300.

Snap, the company behind the social media app Snapchat, has had its own foray into the smart glasses space with Spectacles. The company is still refining its entry into the extended reality space, and it's unclear if it would attempt to use Android XR in its product lineup. Meta has established its presence in the headset space with the Quest 3 and Quest 3S, but we're still awaiting the release of Orion AI Glasses. These are Meta's next-generation smart glasses. Like the Ray-Ban Metas, they're designed after a pair of Ray-Ban wireframes. They also have a built-in camera and open-ear audio, but the main feature is the heads-up display you can interact with, similar to the Quest headsets. Meta hasn't revealed when the Orion AI Glasses will be available.

The Apple Vision Pro is the first generation of the company's foray into extended reality. With a starting price of $3,500, it's a pricey way to enter Apple's spatial computing ecosystem. The device boasts a similar eye- and hand-tracking interface to Project Moohan. The Vision Pro also works within Apple's ecosystem of devices and can tether to the MacBook. But one of its biggest caveats is its high price, and it's also quite heavy to wear. There have been reports that Apple is working on a more affordable version of the Vision Pro and a lighter second-generation model for 2026. The company also has its sights set on Apple glasses.

A data-center stock is up more than 50% today after sealing a lucrative AI partnership

Yahoo

An hour ago



Shares of Applied Digital (APLD) surged as much as 54% on Monday. The data-center operator announced lease deals with Nvidia-backed AI firm CoreWeave. The 15-year agreements are expected to generate $7 billion of revenue for Applied Digital.

The move: Applied Digital Corporation stock surged as much as 54% on Monday to an intraday high of $10.54. It closed 48% higher, at $10.14.

Why: Shares of the AI data center operator soared on the announcement of two 15-year lease deals with CoreWeave that will generate $7 billion in revenue for Applied Digital. Under the terms of the deal, CoreWeave, a cloud services firm backed by Nvidia, will receive 250 megawatts of data center capacity from an Applied Digital campus in North Dakota, with the option for CoreWeave to access another 150 megawatts. "We believe these leases solidify Applied Digital's position as an emerging provider of infrastructure critical to the next generation of artificial intelligence and high-performance computing," said Wes Cummins, Chairman and CEO of Applied Digital.

What it means: The deal is a massive win for Applied Digital, which is in the process of converting itself into a data center real estate investment trust. Data centers are seeing massive demand from the so-called AI hyperscalers, like Meta and Microsoft, as they pursue their ambitions in the booming space. A note from Needham, cited by Bloomberg, said the deal could also pave the way for other enterprise AI customers to turn to Applied Digital for their data center needs. The note also said OpenAI could be the end customer of the lease agreement, given the ChatGPT creator's $4 billion deal with CoreWeave last month.

Read the original article on Business Insider
