
The Switch 2's latest update brings support for two great games.
Posted Jul 15, 2025 at 11:18 PM UTC
Related Articles


Business Insider
Bitget Lists NERO Chain (NERO) for Spot Trading with 65M in Token Rewards
Seychelles, Victoria, July 28th, 2025, Chainwire

Bitget, the leading cryptocurrency exchange and Web3 company, has announced the listing of NERO Chain (NERO) in the Innovation and Public Chain Zone, adding it for spot trading. Trading for the NERO/USDT pair will begin on 28 July 2025, 7:00 (UTC), with withdrawals available from 29 July 2025, 8:00 (UTC).

Alongside the listing, Bitget will launch a CandyBomb campaign with 65,000,000 NERO available in rewards. Of this, 7,500,000 NERO will be allocated to the NERO, ETH and SUI trading pool, 15,000,000 NERO to the NERO trading pool, and 42,500,000 NERO to the ETH and SUI trading pool. The campaign will run from 28 July 2025, 7:00 (UTC) until 4 August 2025, 7:00 (UTC).

NERO Chain is a modular blockchain designed to prioritize value creation for applications rather than relying solely on infrastructure-level storage. It redefines traditional blockchain economics by enabling dApps to capture and share transaction value, ensuring that developers and users benefit directly from an application's success rather than value accruing solely to the base layer. Fully EVM-compatible and built on a high-performance settlement layer, NERO offers powerful scalability and flexibility. Features like native account abstraction, gas sponsorship through paymasters, and cutting-edge innovations such as Blockspace 2.0 provide developers with a seamless and efficient building experience.

Bitget continues to expand its offerings, positioning itself as a leading platform for cryptocurrency trading. The exchange has established a reputation for innovative solutions that empower users to explore crypto within a secure CeDeFi ecosystem. With an extensive selection of over 800 cryptocurrency pairs and a commitment to broaden its offerings to more than 900 trading pairs, Bitget connects users to various ecosystems, including Bitcoin, Ethereum, Solana, Base, and TON. The addition of NERO to Bitget's portfolio marks a significant step toward expanding its ecosystem by advancing privacy-focused infrastructure and enabling seamless migration of Web2 applications into the Web3 space through scalable, developer-friendly solutions. For more details on NERO, visit here.

About Bitget

Established in 2018, Bitget is the world's leading cryptocurrency exchange and Web3 company. Serving over 120 million users in 150+ countries and regions, the Bitget exchange is committed to helping users trade smarter with its pioneering copy trading feature and other trading solutions, while offering real-time access to Bitcoin price, Ethereum price, and other cryptocurrency prices. Formerly known as BitKeep, Bitget Wallet is a leading non-custodial crypto wallet supporting 130+ blockchains and millions of tokens. It offers multi-chain trading, staking, payments, and direct access to 20,000+ DApps, with advanced swaps and market insights built into a single platform.

Bitget is driving crypto adoption through strategic partnerships, such as its role as the Official Crypto Partner of the World's Top Football League, LALIGA, in EASTERN, SEA and LATAM markets, as well as a global partner of Turkish national athletes Buse Tosun Çavuşoğlu (wrestling world champion), Samet Gümüş (boxing gold medalist) and İlkin Aydın (volleyball national team), to inspire the global community to embrace the future of cryptocurrency. Aligned with its global impact strategy, Bitget has joined hands with UNICEF to support blockchain education for 1.1 million people by 2027.
In the world of motorsports, Bitget is the exclusive cryptocurrency exchange partner of MotoGP, one of the world's most thrilling championships.

For more information, visit: Website | Twitter | Telegram | LinkedIn | Discord | Bitget Wallet

For media inquiries, please contact: media@

Risk Warning: Digital asset prices are subject to fluctuation and may experience significant volatility. Investors are advised to only allocate funds they can afford to lose. The value of any investment may be impacted, and there is a possibility that financial objectives may not be met, nor the principal investment recovered. Independent financial advice should always be sought, and personal financial experience and standing carefully considered. Past performance is not a reliable indicator of future results. Bitget accepts no liability for any potential losses incurred. Nothing contained herein should be construed as financial advice. For further information, please refer to our Terms of Use.
Yahoo
Trump pauses export controls to bolster China trade deal, FT says
(Reuters) - The U.S. has paused curbs on tech exports to China to avoid disrupting trade talks with Beijing and to support President Donald Trump's efforts to secure a meeting with President Xi Jinping this year, the Financial Times reported on Monday.

The industry and security bureau of the Commerce Department, which oversees export controls, has been told in recent months to avoid tough moves on China, the newspaper said, citing current and former officials. Reuters could not immediately verify the report. The White House and the department did not respond to Reuters' requests for comment outside business hours.

Top U.S. and Chinese economic officials are set to resume talks in Stockholm on Monday to tackle longstanding economic disputes at the centre of a trade war between the world's top two economies.

Tech giant Nvidia said this month it would resume sales of its H20 graphics processing units (GPUs) to China, reversing an export curb the Trump administration imposed in April to keep advanced AI chips out of Chinese hands over national security concerns. The planned resumption was part of U.S. negotiations on rare earths and magnets, Commerce Secretary Howard Lutnick has said.

However, the paper said 20 security experts and former officials, including former deputy U.S. national security adviser Matt Pottinger, will write to Lutnick on Monday to voice concern. "This move represents a strategic misstep that endangers the United States' economic and military edge in artificial intelligence," they write in the letter, it added.


Forbes
AI Apps Are Undressing Women Without Consent And It's A Problem
AI nudification apps are making it frighteningly easy to create fake sexualized images of women and teens, sparking a surge in abuse, blackmail and online exploitation.

The rise of AI 'nudification' tools makes it shockingly easy for anyone to create a fake naked image of you, or of any of your family, friends or colleagues, using nothing more than a photo and one of many readily available AI apps. The existence of tools that let users create non-consensual sexualized images might seem like an inevitable consequence of the development of AI image generation. But with 15 million downloads since 2022, and deepfaked nude content increasingly used to bully victims and expose them to danger, it's not a problem that society can or should ignore.

There have been calls for the apps to be banned, and criminal penalties for creating and spreading non-consensual intimate images have been introduced in some countries. But this has done little to stem the flood, with one in four 13- to 19-year-olds reportedly exposed to fake, sexualized images of someone they know. Let's look at how these tools work, what the real risks are, and what steps we should be taking to minimize the harms that are already being caused.

What Are Nudification Apps And What Are The Dangers?

Nudification apps use AI to create naked or sexualized images of people from the sort of everyday, fully clothed images that anyone might upload to Facebook, Instagram or LinkedIn. While men are occasionally the targets, research suggests that 99 percent of non-consensual, sexualized deepfakes feature women and girls. Overwhelmingly, the technology is used as a form of abuse to bully, coerce or extort victims. Media coverage suggests that this is increasingly having a real impact on women's lives. While faked nude images can be humiliating and potentially career-damaging for anyone, in some parts of the world they could leave women at risk of criminal prosecution or even serious violence.

Another shocking factor is the growing number of fake images of minors being created, which may or may not be derived from images of real children. The Internet Watch Foundation reported a 400 percent rise in the number of URLs hosting AI-generated child sex abuse content in the first six months of 2025. This type of content is seen as particularly dangerous, even when no real children are involved, with experts saying it can normalize abusive images, fuel demand, and complicate law enforcement investigations. Unfortunately, media reports suggest that criminals have a clear financial incentive to get involved, with some making millions of dollars from selling fake content.

So, given the simplicity and scale with which these images can be created, and the devastating consequences they can have on lives, what's being done to stop it?

How Are Service Providers And Legislators Reacting?

Efforts to tackle the issue through regulation are underway in many jurisdictions, but so far progress has been uneven. In the US, the Take It Down Act makes online services, including social media, responsible for taking down non-consensual deepfakes when asked to do so. And some states, including California and Minnesota, have passed laws making it illegal to distribute sexually explicit deepfakes. In the UK, there are proposals to go further by imposing penalties for making, not simply distributing, non-consensual deepfakes, as well as an outright ban on nudification apps themselves.
However, it isn't clear how the tools would be defined and differentiated from AI used for legitimate creative purposes. China's generative AI measures contain several provisions aimed at mitigating the harm of non-consensual deepfakes. Among these are requirements that tools should have built-in safeguards to detect and block illegal use, and that AI content should be watermarked in a way that allows its origin to be traced.

One frustration for those campaigning for a solution is that authorities haven't always seemed willing to treat AI-generated image abuse as seriously as photographic image abuse, due to a perception that it 'isn't real'. In Australia, this prompted the government commissioner for online safety to call on schools to ensure all incidents are reported to police as sex crimes against children.

Of course, online service providers have a hugely important role to play, too. Just this month, Meta announced that it is suing the makers of the CrushAI app for attempting to circumvent its restrictions on promoting nudification apps on its Facebook platform. This came after online investigators found that the makers of these apps are frequently able to evade measures put in place by service providers to limit their reach.

What Can The Rest Of Us Do?

The rise of AI nudification apps should act as a warning that transformative technologies like AI can change society in ways that aren't always welcome. But we should also remember that the post-truth age and 'the end of privacy' are just possible futures, not guaranteed outcomes. How the future turns out will depend on what we decide is acceptable or unacceptable now, and on the actions we take to uphold those decisions.

From a societal point of view, this means education. Critically, there should be a focus on the behavior and attitudes of school-age children, to make them aware of the harm that can be caused. From a business point of view, it means developing an awareness of how this technology can affect workers, particularly women. HR policies should ensure there are systems in place to help those who become victims of blackmail or harassment campaigns involving deepfaked images or videos. And technological solutions have a role to play in detecting when these images are transferred and uploaded, and potentially removing them before they can cause harm. Watermarking, filtering and collaborative community moderation could all be part of the solution.

Failing to act decisively now means that deepfakes, nude or otherwise, are likely to become an increasingly problematic part of everyday life.