Latest news with #Reolink
Yahoo · 14 hours ago · Health
This AI system found a potential cure for blindness
If you purchase an independently reviewed product or service through a link on our website, BGR may receive an affiliate commission.

People developing frontier AI models often speculate about how amazing the world will be once AGI and AI superintelligence are available. OpenAI CEO Sam Altman's recent essay is the latest example. Once superhuman intelligence is reached, the thinking goes, AI will make scientific discoveries that are currently beyond our reach, significantly improving our lives and well-being. (There's also the possibility of AI taking over the world, of course, but we don't like to entertain that too much.) Once the age of AI superintelligence arrives, we might see cancer eradicated, or cures for diabetes and other illnesses that impact hundreds of millions of people. One can hope that AI will lead us there, at least.

But we don't have to wait for one AI firm or another to proclaim that AGI is here to see meaningful, AI-driven developments in medicine. (AGI, by the way, is the intermediate step we need to reach before superintelligence.) Researchers from FutureHouse designed an AI system called Robin to automate scientific discovery. It's a scientist's companion built from several AI models that assist with the various stages of a discovery.

Robin helped the researchers find a potential treatment for dry age-related macular degeneration (dAMD), a condition that can lead to irreversible blindness in the elderly. As many as 200 million people suffer from dAMD. The system helped the scientists arrive at a candidate treatment by repurposing ripasudil, a drug already used to treat glaucoma.

Robin isn't just one large language model.
It has three components, each built from separate LLM agents that handle specific aspects of the research process:

- Crow, Falcon, and Owl perform literature search and synthesis
- Phoenix handles chemical synthesis design
- Finch does complex data analysis

The way Robin works is actually brilliant. First, Crow analyzed the relevant literature on dAMD (around 550 studies) and proposed a hypothesis for the study: enhancing retinal pigment epithelium (RPE) phagocytosis could be a potential treatment for dAMD. The RPE layer of cells eats debris shed by the photoreceptors, and RPE failure leads to dAMD. Falcon then looked for candidate molecules that might achieve that goal, and found 10 candidates that the scientists tested in the lab. Finally, Finch analyzed the data from the tests that humans ran in the lab, and found that a Rho-kinase (ROCK) inhibitor called Y-27632 increased RPE phagocytosis in cell cultures.

The research didn't stop there. The scientists had Robin look at the data and propose a new round of testing based on the previous findings. This time, the AI co-scientist system proposed a genetic experiment: an RNA-sequencing test to see whether the ROCK inhibitor found in the previous stage induced gene expression changes that would push the RPE layer to eat more of the debris that builds up. That experiment ended with Finch discovering that Y-27632 upregulated ABCA1, a gene that acts as a pump for lipids (cholesterol) in RPE cells. The ROCK inhibitor practically told cells to eject more fat.

Robin then proposed a second set of drug candidates based on the first round of results. The list included ripasudil, a drug that's already approved to treat glaucoma. Ripasudil raised phagocytosis 7.5-fold, meaning it might significantly increase the rate at which the RPE clears debris. Preventing that accumulation could prevent blindness.
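The hypothesize-test-analyze loop described above can be sketched in a few lines of Python. This is a hypothetical illustration only: the agent names (crow, falcon, finch) come from the article, but the function signatures, data, and assay numbers are stand-ins, not FutureHouse's actual Robin API.

```python
# Hypothetical sketch of Robin's agent loop, inferred from the article.
# Each "agent" is a stub standing in for an LLM-backed component.

def crow(literature):
    """Literature agent: synthesize prior studies into a testable hypothesis."""
    return "Enhancing RPE phagocytosis may treat dAMD"

def falcon(hypothesis, n_candidates=10):
    """Candidate agent: propose molecules that might satisfy the hypothesis."""
    return [f"candidate-{i}" for i in range(1, n_candidates + 1)]

def finch(assay_results):
    """Analysis agent: rank wet-lab readouts and return the strongest hit."""
    return max(assay_results, key=assay_results.get)

# Humans run the wet-lab step; here we fake the assay readout.
hypothesis = crow(["study-1", "study-2"])
candidates = falcon(hypothesis)
assay = {c: 1.0 for c in candidates}
assay["candidate-3"] = 7.5  # pretend one candidate boosts phagocytosis 7.5x
top_hit = finch(assay)
print(top_hit)  # candidate-3
```

In the real system, each stub would be an LLM call, and the loop would repeat: the analysis feeds a new hypothesis and a second round of candidates, which is how the article says ripasudil surfaced.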
The entire process took only two and a half months, letting the researchers reach a potential treatment far faster than traditional methods that don't involve AI models. The scientists picked the dAMD project to put Robin to the test, but they could have chosen any disease that impacts a large number of people.

That's not to say ripasudil will become the standard dAMD treatment. But doctors may well consider the therapy once additional tests confirm the AI's hypothesis and proposed solution, and only after successful human trials would the drug be used for dAMD. What's great about this process is that Robin surfaced a drug that's already approved for use rather than a new molecule that might need years of additional testing.

The Robin experiment isn't the first to use AI to repurpose existing drugs for other conditions. But this system goes beyond having an AI read massive numbers of studies to find existing drugs that might treat illnesses other than the ones they were made for. Here, one of Robin's LLM agents does the reading and suggests hypotheses, while the others propose molecules and analyze the data from lab experiments. Humans are needed only to run the experiments, verify the AI models' output, and tweak algorithms and prompts. I wouldn't be surprised if similar research leads to brand-new therapies for all sorts of conditions in the future.

Also important is the researchers' decision to release Robin as an open-source project. Others can use it to run similar experiments or build their own AI co-scientist systems. You'll find the full study at this link.
Yahoo · 14 hours ago
Apple releases iOS 18.6 beta 1 as iOS 26 development ramps up
A week after Apple introduced iOS 26 during the WWDC keynote, the company is now seeding iOS 18.6 beta 1 to developers. At this moment, it's unclear what's new in this update. It's likely one of the lightest iOS 18 updates; after all, the iOS 26 beta is already available, and Apple aims to release that big update this fall.

Historically, x.6 versions of iOS don't bring many improvements, but new features do sometimes land. With iOS 17.6, Apple introduced the following:

- Apple News+ got Live Activity support for the Home Screen and Lock Screen, so users could follow games and other stories
- The Messages app gained a setting that lets users filter unknown senders if they're international senders

That said, Apple could prepare a few new tweaks for iOS 18.6 during this beta cycle. With the previous software update, iOS 18.5, Apple added these changes:

- A fix for the Contact Photos issue, giving Mail users a new menu to toggle the feature with a simple tick (as should always have been the case)
- A new Pride wallpaper celebrating the 'strength and beauty of LGBTQ+ communities around the world'
- A notification for parents when the Screen Time passcode is used on a child's device
- Buy with iPhone support when purchasing content within the Apple TV app on a third-party device
- A fix for an issue where the Apple Vision Pro app could display a black screen
- Support for carrier-provided satellite features on all iPhone 13 models

With iOS 18.5 already light on features, it's only natural that iOS 18.6 will follow suit during the beta cycle.
In retrospect, iOS 18 brought more cosmetic changes than breakthrough improvements, as Apple Intelligence didn't have the expected impact. BGR will let you know if we find anything new in iOS 18.6 beta 1.
Yahoo · 19 hours ago
19 new iOS 26 features Apple didn't have time to show us at WWDC 2025
Apple's WWDC 2025 keynote is over, and I already know I'll need to watch it again to get a better idea of what's coming to all of Apple's operating systems later this year. Apple announced tons of exciting features on Monday, starting with the brand-new Liquid Glass design arriving in its next operating system updates. The rumors were true: Apple's operating systems are now tied to the year. 'Our releases for the fall that will power us through the coming year, 2026, will be version 26,' Craig Federighi said on stage.

Instead of iOS 19, we're getting iOS 26 this year, and the first iOS 26 beta is already available to download on the iPhone 11 and newer. The Liquid Glass design can be previewed right now in the iOS 26 developer beta, but some of the other interesting iOS 26 features Apple demoed on stage won't be available until future beta releases roll out. That's not all: Apple also quietly revealed even more iOS 26 features that it didn't have time to highlight during the keynote.

There are two types of people in the world: those who wake up when the alarm goes off, and those who hit snooze. I'm in the latter category, so I know how much extra sleep a snooze gives me: 9 minutes. In iOS 26, I'll be able to set that to any number I want.

Speaking of sleep, rumors prior to WWDC said that iOS 26 would allow AirPods to pause media playback automatically when you fall asleep. That rumor has been confirmed.

There are also two types of people when it comes to iPhone Messages: those who delete conversations and those who don't. I'm in the latter group. I might need some information from those old texts, but finding it isn't always easy.
In iOS 26, Apple Intelligence will bring natural language search to the Messages app, which will make my life easier. And when I do find what I need, I often struggle to copy text from the chat bubbles. iOS 26 will apparently fix that, letting me select just the portions I need.

I'm not quite sure what adaptive power means, but rumors suggested the iPhone would use AI to intelligently adapt power use to improve battery life. As a future iPhone 17 Air owner, I hope that's what this iOS 26 feature does. Speaking of battery life, iOS 26 will finally tell us how long it'll take to charge an iPhone.

Another useful thing the iPhone could tell us concerns the camera: iOS 26 will detect dirt on the lens and warn us to give it a wipe.

I'm also routinely looking for images in the Photos app, including pictures taken around specific events. I'm hoping this iOS 26 feature will make it easier to mark events in the Photos app.

Apple also listed an exciting feature for managing credit cards in the Wallet app. You'll be able to manage autofill cards in the app, which is great news for serial shoppers.

Apple Maps will get new incident report types in iOS 26, the kind of feature that has made Waze such a popular mapping alternative in recent years.

I've never been a fan of journaling on the iPhone or iPad, but I've thought about using the app to jot down thoughts about my marathon training sessions and the races I go to. Support for multiple journals might help me with that. It might also help me journal other things, like travel-related info, in a separate journal.

Notes is my go-to writing app, as it is for many iPhone owners. Among them are those who want to export Notes to Markdown, and iOS 26 lets them do just that.

iPhone users with Action buttons on their devices (iPhone 15 Pro and later) will soon be able to create reminders by pressing the Action button.
iOS 26 will be able to enhance dialogue in the Podcasts app, a feature I have wanted for quite some time. I sometimes listen to podcasts while running, but I hate having to max out the volume to hear people speaking. This feature should bring clarity to dialogue. I also want to speed up podcasts while I run, and iOS 26 will grant me that wish.

The new iPhone operating system also has new safety features Apple didn't talk about. For example, the Settings app will feature a Block list, which is handy if you need to deal with spammers, stalkers, or anyone else in your life who might need blocking. iOS 26 will also run Safety Checks while blocking a contact, though it's unclear how that feature works.

A new Focus feature will let you silence a SIM card, which is great news if you have one phone number for work and another for your personal life. Just silence the work line completely when you get home. You'll probably be able to automate the feature, too.

Finally, the Passwords app will show you a history of your passwords, something other password apps already offer. That way, you'll be able to keep track of the passwords you've used for apps and websites, and avoid reusing any of them.

Those are the highlights, but we'll continue to cover iOS 26 in the weeks and months ahead.
Yahoo · 19 hours ago
This tiny iOS 26 tweak makes a big difference for iPhone users
One of the best iOS features ever introduced to the iPhone is the ability to automatically fill in OTP codes. This handy function was previously limited to codes arriving via Messages and Mail. With iOS 26, Apple is finally extending it to third-party apps, so it won't matter where your OTP code comes from; iOS 26 will suggest it to you right away.

Interestingly enough, that improvement took almost seven years to arrive. Security Code AutoFill was introduced back in 2018 with iOS 12, and ever since, social media has been full of users saying, 'Whoever invented this feature deserves a raise.' Well, whoever improved how iOS 26 handles OTP codes definitely deserves a big bonus.

While Apple isn't highlighting this tweak yet, several developers have posted about the improvement on social media. What makes the new OTP feature in iOS 26 so useful is that users won't have to do anything once the update is available, aside from updating their devices. Once iOS 26 is installed on your iPhone, it won't matter whether your 2FA code comes through WhatsApp or a third-party mail app. The system will find it and suggest it without requiring you to leave Safari or whatever app you're using. Apple will even highlight where the OTP is coming from, so you can be sure you're adding the right code to the right website.

In addition to this feature, Apple is continuing to enhance user privacy by making its AI models available for developers to use in their apps. This means customers will be able to benefit from Apple's AI-powered features within third-party apps.
BGR has a full list of hidden features coming to iOS 26 that Apple didn't discuss during the WWDC keynote.
Yahoo · 4 days ago · Business
How much energy and water does ChatGPT consume?
Artificial intelligence has been the hottest topic in tech since late 2022, when ChatGPT went viral. The AI race started almost immediately, with every big tech company in the US and elsewhere working on AI systems of their own. We quickly learned that software like ChatGPT requires massive resources: data centers packed with thousands of expensive GPUs specialized for training and running AI chatbots. The larger the data centers, the more energy the world would need to set aside for AI projects.

Some people worried about the impact AI infrastructure would have on the world. It wasn't just the electricity powering the chats, but also the water used to cool some of these data centers. Two and a half years after ChatGPT went viral, we finally have a figure for how much energy and water a ChatGPT query consumes. It comes from Sam Altman's latest blog post, titled The Gentle Singularity, which teases what the world could look like in the next five to ten years thanks to superintelligence:

'People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.'

It's unclear where Altman's figure comes from. If accurate, I'll probably consume about a teaspoon of water with AI queries every day. But the figure is also misleading, considering that OpenAI has hundreds of millions of monthly users.
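Altman's figures are easy to sanity-check. This short sketch converts his per-query numbers (which come from the post) into everyday units; the oven wattage and teaspoon volume are my own assumed values, not from the post:

```python
# Back-of-the-envelope check of the per-query figures from Altman's post.
query_wh = 0.34        # watt-hours per ChatGPT query (Altman's figure)
query_gal = 0.000085   # gallons of water per query (Altman's figure)

teaspoon_ml = 4.93     # US teaspoon in milliliters (assumed conversion)
gal_ml = 3785.41       # US gallon in milliliters (assumed conversion)

water_ml = query_gal * gal_ml
print(round(water_ml, 2))             # 0.32 mL per query
print(round(teaspoon_ml / water_ml))  # ~15 queries per teaspoon

oven_w = 1200          # assume a ~1.2 kW oven element
seconds = query_wh / oven_w * 3600
print(round(seconds, 2))              # 1.02 s of oven use per query
```

The numbers hang together: about a third of a milliliter of water per query, so roughly 15 queries add up to a teaspoon, and 0.34 Wh is indeed "a little over one second" of a 1.2 kW oven.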
Add to that all the energy and water Gemini, Claude, Meta AI, Deep Research, and all the other chatbots out there consume, and you'll rack up quite a bill.

It's unclear what prompted the CEO to pen the post, which he describes on X as maybe 'the last one like this I write with no AI help at all.' It likely wasn't to reveal the energy costs associated with ChatGPT chats, though energy is one of the big topics in the blog. On that note, I'll point out that OpenAI's o3 model just got dramatically cheaper than before.

Unsurprisingly, Altman is quite optimistic about the future of AI. He presents superintelligence as inevitable, a foregone conclusion. We'll get to a world where smarter-than-human AI makes our jobs easier than ever, leading to massive discoveries that improve daily life:

'AI will contribute to the world in many ways, but the gains to quality of life from AI driving faster scientific progress and increased productivity will be enormous; the future can be vastly better than the present. Scientific progress is the biggest driver of overall progress; it's hugely exciting to think about how much more we could have.'

Altman expects AI to deliver novel insights in 2026. A year after that, the world will start getting robots that can do tasks in the wild. Then, 'the 2030s are likely going to be wildly different from any time that has come before,' Altman writes, even though we humans will continue to enjoy our lives much as we did before.

The OpenAI CEO also says that intelligence and energy will be 'wildly abundant' in the 2030s. Once that happens, the world will be able to do things that weren't possible before. Speaking of energy, Altman also sees data center production becoming automated, with AI and robots powering everything:

There are other self-reinforcing loops at play. The economic value creation has started a flywheel of compounding infrastructure buildout to run these increasingly-powerful AI systems.
And robots that can build other robots (and in some sense, datacenters that can build other datacenters) aren't that far off.

AI will help humanity achieve new 'wonders' by 2035. Altman even sees a future where some people choose to 'plug in' via 'true high-bandwidth brain-computer interfaces.' It all sounds amazing, and it certainly beats the gloomier pictures others paint of AI's future.

Altman's essay also downplays the downsides, like the massive job upheaval we're about to witness. The CEO isn't ready to propose a solution for AI taking jobs, other than suggesting that humans will adapt and that some sort of new social contract might emerge:

'There will be very hard parts like whole classes of jobs going away, but on the other hand, the world will be getting so much richer so quickly that we'll be able to seriously entertain new policy ideas we never could before. We probably won't adopt a new social contract all at once, but when we look back in a few decades, the gradual changes will have amounted to something big.'

Toward the end of the post, Altman also addresses the obvious challenges. AI has to be aligned with our interests to deliver the rosy future he paints. The other challenge is making sure superintelligence is 'cheap, widely available, and not too concentrated with any person, company, or country.' How will OpenAI and every other firm developing frontier AI ensure it's safe, cheap, and widely available? Altman doesn't say. We'll just have to wait and see, while we continue chatting with bots like ChatGPT, a fifteenth of a teaspoon of water at a time.