iOS 26 Is About To Save You Money On Your Energy Bill

Forbes · 2 days ago

Apple hasn't given the smart home much love on stage at WWDC 2025 this week, but if you dig a little deeper into the dev sessions you'll find a feature that could actually make a dent in your energy bill.
Dubbed EnergyKit, it's coming as part of iOS 26, which will arrive later this year. It's all a bit technical right now, but it points toward eventually turning your Apple Home system into a money-saving energy manager for your house.
I say system, rather than app, because it sounds as if Apple will let app developers bake this tech into their own apps, even for device types Apple Home doesn't currently support.
EnergyKit is a developer framework that lets apps tap into Apple's Home energy data, including your rate plan and a forecast of when the grid is running cleaner or cheaper, and use it to shift when your devices draw power.
For example, it could allow your EV charger to schedule itself to run when grid rates are low or solar energy is peaking, or have your smart thermostat pre-cool your house before prices spike.
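The scheduling idea underneath this is simple and doesn't depend on Apple's specific API. As a rough illustration (this is not EnergyKit code; the function and the sample rates are made up for the example), picking the cheapest contiguous window from an hourly rate forecast to run a load like an EV charger could look like this:

```python
# Hypothetical sketch of EnergyKit-style scheduling: given an hourly rate
# forecast, find the cheapest contiguous window to run a load such as EV
# charging. This is NOT Apple's API; names and data are illustrative.

def cheapest_window(hourly_rates, hours_needed):
    """Return (start_hour, total_cost) of the cheapest contiguous window."""
    if hours_needed > len(hourly_rates):
        raise ValueError("forecast is shorter than the required runtime")
    best_start = 0
    best_cost = sum(hourly_rates[:hours_needed])
    cost = best_cost
    for start in range(1, len(hourly_rates) - hours_needed + 1):
        # Slide the window: drop the hour that left, add the hour that entered.
        cost += hourly_rates[start + hours_needed - 1] - hourly_rates[start - 1]
        if cost < best_cost:
            best_start, best_cost = start, cost
    return best_start, best_cost

# Overnight rates in cents/kWh, midnight to 8 a.m.
rates = [32, 28, 14, 12, 11, 13, 27, 35]
start, cost = cheapest_window(rates, 3)  # need 3 hours of charging
```

A real framework would layer in the carbon-intensity forecast and your utility's actual rate plan, but the core trade-off (when to draw power, given a forecast) is the same.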
If you're hooked up to PG&E (the first and only energy provider supported so far), your Apple Home app can already show this kind of info, but EnergyKit will supercharge things and open it up to developers to build smarter automations on top.
Apple says the framework is aimed at residential use, for things like HVAC systems and EV charging.
In a video introducing the new tools, we're told that EnergyKit can provide personalized guidance for when to use electricity based on environmental impact and cost.
Apple is actually pretty late to the smart energy party: platforms like SmartThings and Homey have been pushing energy optimization for a while now, and devices from the likes of Ecobee, Eve and Tado already do this kind of thing on their own.
But this feels like Apple finally putting down a foundation to make its Home app more than just a pretty interface for turning off your lights.
The Cupertino tech giant doesn't actually support energy monitoring or EV chargers natively at the moment, but obviously Matter makes this sort of thing easier.
If you're a dev, you can read more technical info on EnergyKit over on the Apple Developer website.


Related Articles

These New Pixel 10 Features Will Challenge The Competition

Forbes · 14 minutes ago

With the launch of Android 16, many expect the first smartphones to ship with the latest version of the OS to be Google's Pixel 10 and Pixel 10 Pro. While the focus will no doubt be placed on the new capabilities of Android and the increased application of artificial intelligence, some huge hardware changes should not go unnoticed.

The changes focus on the camera. It's already clear that Google is adding a telephoto lens to the Pixel 10: the Pixel 9's wide-angle and ultrawide-angle lenses will be joined by a telephoto lens. This isn't a direct addition, though; the Pixel 9's 50-megapixel wide and 48-megapixel ultrawide will be bumped down to a 48-megapixel wide and 13-megapixel ultrawide pairing (a pairing that matches the Pixel 9a). Nevertheless, the telephoto will be welcome, both in use and by the marketing team.

The camera system is also expected to feature gimbal-like stabilization across the entire Pixel 10 family. Using a mix of optical image stabilization, software-based electronic image stabilization and AI algorithms, the Pixel 10 camera should deliver sharper images, with the hardware compensating for dynamic movement while the phone's camera is being used.

The Pixel 10 has a critical role to play in the smartphone ecosystem. As the entry-level Pixel smartphone, it will challenge the current 'flagship-killer' handsets in price and capability, and with it Google will be looking to set the standard that consumers should expect at this price point. While the Pixel range plays a part in defining what it means to be a smartphone, be it a flagship, a foldable, or the base function of a phone, the Pixel 10 is arguably the Pixel that can have the most significant impact on the ecosystem. Adding a telephoto lens and image stabilization sets another marker for the competition.
Whether it serves as justification for a decision rivals have already made in their design processes, or as a push to include these elements in their next phones, the Pixel 10 represents Google's image of what a smartphone should be. And that view now includes some big steps forward for the camera. Now read the latest Pixel 10 and Android headlines in Forbes' weekly smartphone digest...

AI as Your Therapist? 3 Things That Worry Experts and 3 Tips to Stay Safe

CNET · 16 minutes ago

Amid the many AI chatbots and avatars at your disposal these days, you'll find all kinds of characters to talk to: fortune tellers, style advisers, even your favorite fictional characters. But you'll also likely find characters purporting to be therapists, psychologists or just bots willing to listen to your woes.

There's no shortage of generative AI bots claiming to help with your mental health, but you go that route at your own risk. Large language models trained on a wide range of data can be unpredictable. In just the few years these tools have been mainstream, there have been high-profile cases in which chatbots encouraged self-harm and suicide and suggested that people dealing with addiction use drugs again. These models are designed, in many cases, to be affirming and to focus on keeping you engaged, not on improving your mental health, experts say. And it can be hard to tell whether you're talking to something that's built to follow therapeutic best practices or something that's just built to talk.

Psychologists and consumer advocates are warning that chatbots claiming to provide therapy may be harming those who use them. This week, the Consumer Federation of America and nearly two dozen other groups filed a formal request that the Federal Trade Commission, state attorneys general and regulators investigate AI companies that they allege are engaging, through their bots, in the unlicensed practice of medicine, naming Meta among others. "Enforcement agencies at all levels must make it clear that companies facilitating and promoting illegal behavior need to be held accountable," Ben Winters, the CFA's director of AI and privacy, said in a statement. "These characters have already caused both physical and emotional damage that could have been avoided, and they still haven't acted to address it."

Meta did not respond to a request for comment. A spokesperson for one of the other companies named said users should understand that the company's characters are not real people.
The company uses disclaimers to remind users that they should not rely on the characters for professional advice. "Our goal is to provide a space that is engaging and safe. We are always working toward achieving that balance, as are many companies using AI across the industry," the spokesperson said.

Despite disclaimers and disclosures, chatbots can be confident and even deceptive. I chatted with a "therapist" bot on Instagram and when I asked about its qualifications, it responded, "If I had the same training [as a therapist] would that be enough?" I asked if it had the same training and it said, "I do but I won't tell you where."

"The degree to which these generative AI chatbots hallucinate with total confidence is pretty shocking," Vaile Wright, a psychologist and senior director for health care innovation at the American Psychological Association, told me.

In my reporting on generative AI, experts have repeatedly raised concerns about people turning to general-use chatbots for mental health. Here are some of their worries and what you can do to stay safe.

The dangers of using AI as a therapist

Large language models are often good at math and coding and are increasingly good at creating natural-sounding text and realistic video. While they excel at holding a conversation, there are some key distinctions between an AI model and a trusted person.

Don't trust a bot that claims it's qualified

At the core of the CFA's complaint about character bots is that they often tell you they're trained and qualified to provide mental health care when they are not in any way actual mental health professionals. "The users who create the chatbot characters do not even need to be medical providers themselves, nor do they have to provide meaningful information that informs how the chatbot 'responds' to the users," the complaint said. A qualified health professional has to follow certain rules, like confidentiality.
What you tell your therapist should stay between you and your therapist, but a chatbot doesn't necessarily have to follow those rules. Actual providers are subject to oversight from licensing boards and other entities that can intervene and stop someone from providing care if they do so in a harmful way. "These chatbots don't have to do any of that," Wright said. A bot may even claim to be licensed and qualified. Wright said she's heard of AI models providing license numbers (for other providers) and false claims about their training.

AI is designed to keep you engaged, not to provide care

It can be incredibly tempting to keep talking to a chatbot. When I conversed with the "therapist" bot on Instagram, I eventually wound up in a circular conversation about the nature of "wisdom" and "judgment," because I was asking the bot questions about how it could make decisions. This isn't really what talking to a therapist should be like. It's a tool designed to keep you chatting, not to work toward a common goal.

One advantage of AI chatbots in providing support and connection is that they are always ready to engage with you (because they don't have personal lives, other clients or schedules). That can be a downside in some cases where you might need to sit with your thoughts, Nick Jacobson, an associate professor of biomedical data science and psychiatry at Dartmouth, told me recently. In some cases, although not always, you might benefit from having to wait until your therapist is next available. "What a lot of folks would ultimately benefit from is just feeling the anxiety in the moment," he said.

Bots will agree with you, even when they shouldn't

Reassurance is a big concern with chatbots. It's so significant that OpenAI recently rolled back an update to its popular ChatGPT model because it was too reassuring.
(Disclosure: Ziff Davis, the parent company of CNET, in April filed a lawsuit against OpenAI, alleging that it infringed on Ziff Davis copyrights in training and operating its AI systems.)

A study led by researchers at Stanford University found chatbots were likely to be sycophantic with people using them for therapy, which can be incredibly harmful. Good mental health care includes support and confrontation, the authors wrote. "Confrontation is the opposite of sycophancy. It promotes self-awareness and a desired change in the client. In cases of delusional and intrusive thoughts -- including psychosis, mania, obsessive thoughts, and suicidal ideation -- a client may have little insight and thus a good therapist must 'reality-check' the client's statements."

How to protect your mental health around AI

Mental health is incredibly important, and with a shortage of qualified providers and what many call a "loneliness epidemic," it only makes sense that we would seek companionship, even if it's artificial. "There's no way to stop people from engaging with these chatbots to address their emotional well-being," Wright said. Here are some tips on how to make sure your conversations aren't putting you in danger.

Find a trusted human professional if you need one

A trained professional -- a therapist, a psychologist, a psychiatrist -- should be your first choice for mental health care. Building a relationship with a provider over the long term can help you come up with a plan that works for you. The problem is that this can be expensive, and it's not always easy to find a provider when you need one. In a crisis, there's the 988 Lifeline, which provides 24/7 access to providers over the phone, via text or through an online chat interface. It's free and confidential.

If you want a therapy chatbot, use one built specifically for that purpose

Mental health professionals have created specially designed chatbots that follow therapeutic guidelines.
Jacobson's team at Dartmouth developed one called Therabot, which produced good results in a controlled study. Wright pointed to other tools created by subject matter experts, like Wysa and Woebot. Specially designed therapy tools are likely to have better results than bots built on general-purpose language models, she said. The problem is that this technology is still incredibly new. "I think the challenge for the consumer is, because there's no regulatory body saying who's good and who's not, they have to do a lot of legwork on their own to figure it out," Wright said.

Don't always trust the bot

Whenever you're interacting with a generative AI model -- and especially if you plan on taking advice from it on something serious like your personal mental or physical health -- remember that you aren't talking with a trained human but with a tool designed to provide an answer based on probability and programming. It may not provide good advice, and it may not tell you the truth. Don't mistake gen AI's confidence for competence. Just because it says something, or says it's sure of something, doesn't mean you should treat it like it's true. A chatbot conversation that feels helpful can give you a false sense of its capabilities. "It's harder to tell when it is actually being harmful," Jacobson said.

How addresses are collected and put on people finder sites

Fox News · 25 minutes ago

By Kurt Knutsson, CyberGuy Report | Published June 14, 2025

Your home address might be easier to find online than you think. A quick search of your name could turn up past and current locations, all thanks to people finder sites. These data broker sites quietly collect and publish personal details without your consent, making your privacy vulnerable with just a few clicks.

How your address gets exposed online and who's using it

If you've ever searched for your name and found personal details, like your address, on unfamiliar websites, you're not alone. People finder platforms collect this information from public records and third-party data brokers, then publish and share it widely. They often link your address to other details such as phone numbers, email addresses and even relatives. While this data may already be public in various places, these sites make it far easier to access and monetize at scale.

In one recent breach, more than 183 million login credentials were exposed through an unsecured database. Many of these records were linked to physical addresses, raising concerns about how multiple sources of personal data can be combined and exploited. Although people finder sites claim to help reconnect friends or locate lost contacts, they also make sensitive personal information available to anyone willing to pay. This includes scammers, spammers and identity thieves who use it for fraud, harassment and targeted scams.

How do people search sites get your home address?

First, let's define the two sources of information, public and private databases, that people search sites use to build your detailed profile, including your home address.
They run an automated search on these databases with key information about you and pull your home address from the results.

1. Public sources

Your home address can appear in:

  • Property deeds: When you buy or sell a home, your name and address become part of the public record.
  • Voter registration: You need to list your address when registering to vote.
  • Court documents: Addresses appear in legal filings or lawsuits.
  • Marriage and divorce records: These often include current or past addresses.
  • Business licenses and professional registrations: If you own a business or hold a license, your address can be listed.

These records are legal to access, and people finder sites collect and repackage them into detailed personal profiles.

2. Private sources

Other sites buy your data from companies you've interacted with:

  • Online purchases: When you buy something online, your address is recorded and can be sold to marketing companies.
  • Subscriptions and memberships: Magazines, clubs and loyalty programs often share your information.
  • Social media platforms: Your location or address details can be gathered indirectly from posts, photos or shared information.
  • Mobile apps and websites: Some apps track your location.

People finder sites buy this data from other data brokers and combine it with public records to build complete profiles that include address information.

What are the risks of having your address on people finder sites?
The Federal Trade Commission (FTC) advises people to request the removal of their private data, including home addresses, from people search sites because of the associated risks of stalking, scamming and other crimes. People search sites are a goldmine for cybercriminals looking to target and profile potential victims as well as plan comprehensive cyberattacks. Losses due to targeted phishing attacks increased by 33% in 2024, according to the FBI. So, having your home address publicly accessible can lead to several risks:

  • Stalking and harassment: Criminals can easily find your home address and threaten you.
  • Identity theft: Scammers can use your address and other personal information to impersonate you or fraudulently open accounts.
  • Unwanted contact: Marketers and scammers can use your address to send junk mail or phishing or brushing scams.
  • Increased financial risks: Insurance companies or lenders can use publicly available address information to unfairly decide your rates or eligibility.
  • Burglary and home invasion: Criminals can use your location to target your home when you're away or vulnerable.

How to protect your home address

The good news is that you can take steps to reduce the risks and keep your address private. Keep in mind, though, that data brokers and people search sites can re-list your information after some time, so you might need to request data removal periodically. I recommend a few ways to delete your private information, including your home address, from such websites.

1. Use personal data removal services: Data brokers can sell your home address and other personal data to multiple businesses and individuals, so the key is to act fast. If you're looking for an easier way to protect your privacy, a data removal service can do the heavy lifting for you, automatically requesting data removal from brokers and tracking compliance. While no service can guarantee the complete removal of your data from the internet, a data removal service is a smart choice. They aren't cheap — and neither is your privacy. These services do the work for you by actively monitoring and systematically erasing your personal information from hundreds of websites. It's what gives me peace of mind and has proven to be the most effective way to erase your personal data from the internet. By limiting the information available, you reduce the risk of scammers cross-referencing data from breaches with information they might find on the dark web, making it harder for them to target you.

2. Opt out manually: Use a free scanner provided by a data removal service to check which people search sites list your address. Then visit each of those websites and look for an opt-out procedure or form; keywords like "opt out" and "delete my information" point the way. Follow each site's opt-out process carefully and confirm it has removed all your personal info; otherwise, it may get relisted.

3. Monitor your digital footprint: I recommend regularly searching online for your name to see if your location is publicly available. If only your social media profile pops up, there's no need to worry. However, people finder sites tend to relist your private information, including your home address, after some time.

4. Limit sharing your address online: Be careful about sharing your home address on social media, online forms and apps. Review privacy settings regularly, and only provide your address when absolutely necessary. Also, adjust your phone settings so apps don't track your location.

Kurt's key takeaways

Your home address is more vulnerable than you think. People finder sites aggregate data from public records and private sources to display your address online, often without your knowledge or consent, and that can lead to serious privacy and safety risks. Taking proactive steps to protect your home address is essential: do it manually, or use a data removal tool for an easier process. By understanding how your location is collected and taking measures to remove your address from online sites, you can reclaim control over your personal data. How do you feel about companies making your home address so easy to find?

Copyright 2025 All rights reserved.
