Microsoft has started testing its AI agent in the Windows 11 Settings app.

The Verge · 15 hours ago

First announced last month, the feature lets users describe what they need in Settings, like "my mouse pointer is too small." The AI agent then suggests ways to address the issue and, if the user grants permission, can even apply the fix itself.
Microsoft is bringing the AI agent to Windows 11 Insiders in the Dev Channel, but only for Snapdragon-based Copilot Plus PCs to start.
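As a rough illustration of this kind of permission-gated agent flow, here is a minimal Python sketch. Every name in it is hypothetical; this is not Microsoft's implementation or API, just the general pattern of suggest-then-confirm described above.

```python
# A minimal sketch of a permission-gated settings agent, assuming the flow
# described above: the user states a problem, the agent proposes fixes, and
# nothing changes without explicit consent. All names are hypothetical.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Suggestion:
    description: str               # human-readable explanation of the change
    apply_fix: Callable[[], None]  # action that would modify the setting

def handle_query(query: str,
                 find_suggestions: Callable[[str], List[Suggestion]],
                 ask_permission: Callable[[Suggestion], bool]) -> None:
    """Surface suggested fixes for a plain-language problem description."""
    for suggestion in find_suggestions(query):
        print(f"Suggested fix: {suggestion.description}")
        # The agent only changes a setting after the user grants permission.
        if ask_permission(suggestion):
            suggestion.apply_fix()

# Example: "my mouse pointer is too small" might map to a pointer-size change.
handle_query(
    "my mouse pointer is too small",
    find_suggestions=lambda q: [Suggestion(
        "Increase mouse pointer size",
        lambda: print("Pointer size updated."))],
    ask_permission=lambda s: input(f"Apply '{s.description}'? [y/n] ") == "y",
)
```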

Related Articles

ISS issues, technical setbacks delay historic Axiom-4 mission

Yahoo · 39 minutes ago

The Brief

Axiom-4's mission remains delayed due to weather, equipment issues, and an ongoing ISS leak. Experts urge patience and caution as global partners assess safety and future station plans. The uncertainty underscores growing challenges in the transition to commercial spaceflight.

CAPE CANAVERAL, Fla. - A series of technical and international setbacks is continuing to delay the highly anticipated Axiom-4 mission, originally expected to mark a major milestone in private spaceflight.

What we know

The Axiom-4 mission, a significant private spaceflight effort, has faced repeated delays. Initial postponements stemmed from weather conditions and a liquid oxygen leak. Now, a leak in a Russian module aboard the International Space Station (ISS) is adding to the holdup, prompting further safety reviews and coordination among international partners.

What we don't know

There is no clear timeline for when Axiom-4 will launch, nor a definitive assessment of how long the Russian module repair will take. The potential impact of repeated delays on the crew, mission goals, and future private spaceflight schedules remains uncertain. It's also unclear how soon a viable replacement for the ISS could be operational if early deorbiting plans gain traction.

The backstory

The ISS has served as a hub for international space collaboration since the early 2000s. Axiom-4 is part of the broader trend of commercializing low-Earth orbit, with companies like Axiom Space aiming to build private stations. The delays highlight both the promise and complexity of transitioning from government-led to commercial space operations.

What they're saying

Russia's space agency, Roscosmos, is currently addressing a leak on one of its ISS modules, raising concerns about the station's readiness to accommodate the incoming Axiom-4 crew.

"This isn't something that they do on a normal, regular basis," said Don Platt, of the Florida Institute of Technology. "They really need to sit back and first make sure that all sides are happy with a plan to move forward, and then it seems to be working, and then take a little bit of time after they do a fix to see how well that actually works."

The delays come as the future of the ISS itself becomes increasingly uncertain. The orbiting laboratory has been in service for nearly 30 years, and some, including SpaceX CEO Elon Musk, are calling for it to be deorbited as early as 2026, four years ahead of NASA's current timeline.

But experts caution against acting too quickly. "We've got to have a replacement before we willy-nilly decide, 'It's too old. Let's junk it,'" Platt added. "You know, it's kind of like, again, with your car: you've got to have another car to go to work the next day."

Companies like Axiom Space are developing next-generation commercial space stations, but those projects are still in the early stages of construction and testing. For now, Axiom-4's launch remains in limbo, as engineers and international partners work to resolve both Earth-bound and orbital obstacles.

The Source

This story was written based on information shared by Axiom Space, NASA, and Don Platt, of the Florida Institute of Technology.

As disinformation and hate thrive online, YouTube quietly changed how it moderates content

Yahoo · an hour ago

YouTube, the world's largest video platform, appears to have changed its moderation policies to allow more content that violates its own rules to remain online.

The change happened quietly in December, according to The New York Times, which reviewed training documents for moderators indicating that a video could stay online if the offending material did not account for more than 50 per cent of the video's duration, double what it was prior to the new guidelines.

YouTube, which sees 20 million videos uploaded a day, says it updates its guidance regularly and that it has a "long-standing practice of applying exceptions" when it suits the public interest or when something is presented in an educational, documentary, scientific or artistic context.

"These exceptions apply to a small fraction of the videos on YouTube, but are vital for ensuring important content remains available," YouTube spokesperson Nicole Bell said in a statement to CBC News this week.

But in a time when social media platforms are awash with misinformation and conspiracy theories, there are concerns that YouTube is only opening the door for more people to spread problematic or harmful content, and to make a profit doing so.

YouTube isn't alone. Meta, which owns Facebook and Instagram, dialled back its content moderation earlier this year, and Elon Musk sacked Twitter's moderators when he purchased the platform in 2022 and rebranded it as X.

"We're seeing a race to the bottom now," Imran Ahmed, CEO of the U.S.-based Center for Countering Digital Hate, told CBC News. "What we're going to see is a growth in the economy around hate and disinformation."

WATCH | Experts warn Meta's moderation move will likely increase misinformation:

YouTube's goal is "to protect free expression," Bell said in her statement, explaining that the eased community guidelines "reflect the new types of content" on the platform. For example, she said, a long-form podcast containing one short clip of violence may no longer need to be removed.

The Times reported Monday that examples presented to YouTube staff included a video in which someone used a derogatory term for transgender people during a discussion about hearings for U.S. President Donald Trump's cabinet appointees, and another that shared false information about COVID-19 vaccines but did not outright tell people not to get vaccinated.

A platform like YouTube does have to make some "genuinely very difficult decisions" when moderating content, says Matt Hatfield, executive director of the Canadian digital rights group OpenMedia.

LISTEN | How Canada has come to play an outsized role in far-right misinformation:

He believes platforms do take the issue seriously, but he says there's a balance between removing harmful or illegal content, such as child abuse material or clear incitements to violence, and allowing content to stay online, even if it's offensive to many or contains some false information.

The problem, he says, is that social media platforms also "create environments that encourage some bad behaviour" among creators, who like to walk the line of what's acceptable.

"The core model of these platforms is to keep you clicking, keep you watching, get you to try a video from someone you've never experienced before and then stick with that person."

And that's what concerns Ahmed. He says these companies put profits over online safety and that they don't face consequences because there are no regulations forcing them to limit what can be posted on their platforms.
He believes YouTube's relaxed policies will only encourage more people to exploit them.

In a recent transparency report, YouTube said it had removed nearly 2.9 million channels containing more than 47 million videos for community guideline violations in the first quarter, a period that came after the reported policy change. The overwhelming majority of those, 81.8 per cent, were considered spam, but other reasons included violence, hateful or abusive material and child safety.

LISTEN | Why you're being tormented by ads, algorithms and AI slop:

Hatfield says there is a public interest in having harmful content like that removed, but that doesn't mean all controversial or offensive content must go. However, he says YouTube does make mistakes in content moderation, explaining that it judges individual videos in a sort of "vacuum" without considering how each piece of content fits into a broader context. "Some content can't really be fairly interpreted in that way."

Ahmed says companies should be held accountable for the content on their platforms through government regulation. He pointed to Canada's controversial but now-scuttled Online Harms Act, also known as Bill C-63, as an example. It proposed heavier sentences, new regulatory bodies and changes to a number of laws to tackle online abuse. The bill died when former prime minister Justin Trudeau announced his resignation and prorogued Parliament back in January. Ahmed says he hopes the new government under Prime Minister Mark Carney will enact similar legislation.

Hatfield says he liked parts of that act, but his group ultimately opposed it after it tacked on other changes to the Criminal Code and Human Rights Act that he says were unrelated to the platforms. He says groups like OpenMedia would have liked to see a strategy addressing business models that encourage users to post and profit off of "lawful but awful" content.

"We're not going to have a hate-free internet," he said. "We can have an internet that makes it less profitable to spread certain types of hate and misinformation."

WATCH | How people can become more discerning news consumers:
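To make the reported threshold change concrete, here is a minimal Python sketch of the arithmetic the Times describes: a video may reportedly stay up if violating material is at most 50 per cent of its runtime, double the earlier limit. The constants and function are illustrative assumptions, not YouTube's actual moderation code.

```python
# Sketch of the moderation threshold as reported by The New York Times.
# Values and structure are assumptions for illustration only.
OLD_THRESHOLD = 0.25  # implied pre-December limit (half of the new one)
NEW_THRESHOLD = 0.50  # reported limit after the quiet December change

def may_stay_online(violating_seconds: float, total_seconds: float,
                    threshold: float = NEW_THRESHOLD) -> bool:
    """True if the violating share of the runtime is within the threshold."""
    return violating_seconds / total_seconds <= threshold

# Example: 12 violating minutes in a 30-minute video is a 40 per cent share;
# it would reportedly stay up under the new rule but not under the old one.
assert may_stay_online(12 * 60, 30 * 60)
assert not may_stay_online(12 * 60, 30 * 60, OLD_THRESHOLD)
```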
