XRP Price Outperforms Bitcoin, Ethereum As 'Strange Signal' Emerges, Why The Target Is $4


Business Mayor | 12-05-2025

The XRP price, while still well below its all-time high, has been doing incredibly well since the market rebound. The altcoin has outperformed the likes of Bitcoin and Ethereum by a large margin, showing its strength in the market recently. This comes as crypto analysts in the community have been predicting and calling for higher prices, expecting XRP to keep outperforming the heavy-hitters.
XRP Leaves Bitcoin And Ethereum In The Dust
Pro-XRP lawyer Bill Morgan took to social media to highlight XRP's recent price performance. The screenshot shared in the X post showed that, compared to Bitcoin and Ethereum, XRP has performed exceptionally well.
The XRPUSDT pair showed XRP up 9.96% over the last week at the time of the post. In contrast, the XRP/BTC pair showed a 1.18% increase over seven days and a 1.50% increase over 24 hours. When compared to Ethereum, the numbers were even more lopsided: the XRP/ETH pair showed a 20.73% move over seven days, while the 24-hour performance showed a 5.18% change.
'Strange Signal' Says A Blowout Is Coming
Crypto analyst MasterAnanda has pointed out a 'strange signal' forming on the XRP price chart. The signal comes from rapid changes in candle formation across the bear and bull cycles, with candle size reflecting the volatility and price swings the altcoin experiences in each phase.
The first instance the crypto analyst points out is from December 2024, when XRP was forming big candles with peaks at the beginning of the month. This continued even when the price turned and began heading downward.
Read More: ChatGPT thinks XRP will surge by 8x by the end of 2024
Then, in the middle of January 2025, the same thing happened: the candles grew bigger as the price peaked again. The pattern continued into April as XRP struggled through high volatility and rapid price swings.
Source: TradingView
However, the trend in candle formation began to change as volatility and price swings fell. From mid-April toward the end of the month, the analyst explains, the candles got smaller while the price kept rising. As volatility fell, XRP pulled back but never made a new low, which the analyst calls a bullish signal.
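For readers who want to see the pattern being described in concrete terms, the sketch below checks for shrinking candle ranges alongside higher lows and rising closes. It is a simplified illustration only, using made-up OHLC values rather than real XRP data, and is not MasterAnanda's actual method.

```python
# A minimal sketch of the pattern described above: candle ranges shrinking
# (falling volatility) while lows hold and closes drift higher.
# The (high, low, close) values below are hypothetical placeholders, not real XRP data.

candles = [
    (2.60, 2.10, 2.20),
    (2.50, 2.12, 2.30),
    (2.48, 2.20, 2.35),
    (2.46, 2.28, 2.40),
    (2.45, 2.34, 2.42),
    (2.47, 2.38, 2.45),
]

ranges = [high - low for high, low, _ in candles]   # candle size as a volatility proxy
lows = [low for _, low, _ in candles]
closes = [close for _, _, close in candles]

shrinking_volatility = all(later <= earlier for earlier, later in zip(ranges, ranges[1:]))
no_new_lows = all(later >= earlier for earlier, later in zip(lows, lows[1:]))
rising_closes = closes[-1] > closes[0]

if shrinking_volatility and no_new_lows and rising_closes:
    print("Consolidation with no new lows -- the kind of setup the analyst calls bullish")
```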
Related Reading: Still Holding TRUMP Coin? This Analyst Says Recovery To $79 Is Coming
'When the market was bearish, prices were moving down with force,' MasterAnanda wrote. 'Now the market is bullish confirmed because when resistance is hit, there is no bearish force, no bearish action, no bearish momentum; just consolidation before additional growth.'
If this bullish signal plays out, then, going by the analyst's chart, the XRP price could be looking at a rally toward $3.6. That would be a jump of more than 50% from current levels and could push the altcoin toward new all-time highs.
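As a rough check of that arithmetic, the snippet below works out the implied gain; the current price used here is an assumed placeholder for illustration, not a live quote.

```python
# Hypothetical current price assumed purely for illustration; not a live quote.
current_price = 2.35
target_price = 3.6

gain_pct = (target_price - current_price) / current_price * 100
print(f"Move to ${target_price} from ${current_price}: {gain_pct:.1f}% gain")
# A gain of just over 50% implies a starting price near $3.6 / 1.5 = $2.40.
```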
Market rebound sends price above $0.4 | Source: XRPUSDT on TradingView.com
Featured image from Dall.E, chart from TradingView.com


Related Articles

A Comprehensive Guide to Deposit and Withdrawal Methods at 7XL

Time Business News | 31 minutes ago

The 7XL platform offers a comprehensive suite of payment methods designed to accommodate the diverse preferences and needs of players worldwide. Each method provides unique advantages in terms of speed, convenience, and security, allowing users to choose the option that best suits their individual requirements.

Credit and debit cards, particularly Visa and Mastercard, represent one of the most straightforward and widely recognized deposit methods available on the platform. The process is intuitive for most users, with funds typically appearing in the player's account almost immediately after transaction approval. This method provides maximum convenience, though it does require sharing card details with the platform's secure payment processing system.

Digital wallet services such as Skrill, Neteller, and ecoPayz have become increasingly popular among online gaming enthusiasts. These services act as an intermediary between the user's bank account and the gaming platform, adding an essential layer of privacy and security. Since there's no need to expose sensitive financial details directly to 7XL, e-wallets provide peace of mind while maintaining instant deposit capabilities.

For technologically oriented users, 7XL supports deposits via leading digital currencies including Bitcoin (BTC) and Ethereum (ETH). According to Coinbase, cryptocurrency transactions offer a high level of anonymity, relatively low transaction fees, and impressive speed. Funds become available for play after receiving the required number of confirmations on the blockchain network, a process that usually takes just a few minutes.

While slower than the more modern alternatives, direct bank transfers remain a reliable and secure option for players who prefer traditional methods or need to deposit larger sums. These transactions benefit from the established security protocols of the banking industry.

The withdrawal process at 7XL is designed for efficiency, though processing times vary significantly between methods. Withdrawals to e-wallets typically represent the fastest option, with requests processed within 24 hours. Cryptocurrency withdrawals also offer high speed, with funds transferred to the user's digital wallet usually within a few hours. In contrast, withdrawals to credit cards or via bank transfer are subject to traditional banking system processing times and may take between 1 and 5 business days to complete.

Understanding the fee structure is crucial for effective bankroll management. 7XL generally does not charge fees for deposits, though charges may apply from the payment providers themselves, such as network fees in cryptocurrency transactions. For withdrawals, the platform may charge a small handling fee depending on the chosen method and withdrawal frequency. Players are strongly advised to consult the detailed and up-to-date fee table available on the 'Cashier' page of the 7XL website before conducting any financial transactions.

As a regulated platform, 7XL maintains strict security processes in compliance with industry standards. An integral component of this security framework is the 'Know Your Customer' (KYC) process. Before processing the first withdrawal, players are typically required to verify their identity by uploading identification documents, such as a government-issued ID and proof of address. According to the Financial Crimes Enforcement Network (FinCEN), these verification procedures are essential for preventing fraud and money laundering and for protecting player funds.
While the process may seem comprehensive, completing verification in advance ensures smoother and faster processing of future withdrawals.

The platform employs multiple layers of security to protect user data and financial transactions. This includes SSL encryption protocols that secure all data transmission between users and the platform, ensuring that sensitive information remains protected from unauthorized access.

The 7XL platform provides a flexible and sophisticated financial ecosystem that empowers users to select deposit and withdrawal methods that align with their specific preferences and requirements. Understanding the distinctions between the various options, including processing speeds, associated costs, and security levels, enables players to manage their funds efficiently while maintaining complete confidence in their financial activities. By making informed decisions about payment methods, users can ensure quick access to their funds and enjoy a seamless, worry-free gaming experience on the platform. Whether prioritizing speed, security, or convenience, 7XL's diverse payment portfolio accommodates every player's unique financial management needs.
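As a rough way to compare the options described above, the sketch below tabulates processing times and fee assumptions for a hypothetical withdrawal. All figures are placeholders for illustration, not 7XL's published rates, which should be checked on the platform's 'Cashier' page.

```python
# Illustrative comparison of withdrawal methods. The processing times echo the
# article's descriptions; the fee percentages are assumptions for the example only.

methods = {
    "e-wallet":       ("within 24 hours", 0.0),
    "cryptocurrency": ("a few hours (plus network fee)", 0.5),
    "credit card":    ("1-5 business days", 1.0),
    "bank transfer":  ("1-5 business days", 1.0),
}

withdrawal = 200.00  # hypothetical withdrawal amount

for name, (speed, fee_pct) in methods.items():
    net = withdrawal * (1 - fee_pct / 100)
    print(f"{name:>14}: ~{speed}, net of assumed fee: {net:.2f}")
```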

Baffled Facebook users share embarrassing personal details with world

Yahoo | 42 minutes ago

Facebook users are accidentally sharing legal woes, relationship dramas and health problems with the world after failing to realise that a chatbot they were speaking to was making the messages public.

Internet users have publicly disclosed potentially embarrassing information or private personal details in conversations with an artificial intelligence (AI) app built by Meta. While the messages do not appear to have been meant for the public, dozens of posts have been shared on Meta AI's public 'Discover' feed.

In one post seen by The Telegraph, a user asked the chatbot to write a character reference ahead of a court hearing, giving their full name. 'A character letter for court can be a crucial document,' Meta's chatbot said. 'To help me write a strong letter, can you tell me a bit more.' The person posting replied: 'I am hoping the court can find some leniency.'

In another, a man appears to be asking for advice choosing between his wife and another woman. Other users shared long, rambling voice notes.

Mark Zuckerberg's company launched its standalone Meta AI app in April. On it, users can speak to the company's chatbot, asking it questions in a manner similar to OpenAI's ChatGPT.

Public sharing of conversations is not turned on by default, and users have to log in and confirm that they want to publish a conversation. However, many of the posts suggest users are unaware that their conversations have been aired in public. It suggests people may have opted to publish their conversations without fully realising what they were doing.

In a post on X, Justine Moore, a partner at venture capital firm Andreessen Horowitz, said: 'Wild things are happening on Meta's AI app. The feed is almost entirely boomers who seem to have no idea their conversations with the chatbot are posted publicly.'

In other shared conversations, users appeared to confuse Meta AI for a customer service bot, or asked it to provide technical support, such as helping them to log in. One chat begins: 'Dear Instagram Team, I am writing to respectfully request the reactivation of my Instagram account.'

When it launched Meta AI, the tech company said its public feed was intended as a 'place to share and explore how others are using AI'. It said: 'You can see the best prompts people are sharing, or remix them to make them your own. And as always, you're in control: nothing is shared to your feed unless you choose to post it.'

Technology giants have been aggressively pushing AI features despite fears that the tools are leaving social media filled with so-called AI 'slop' – nonsense images and conversations generated by bots.

AI chatbots have been involved in a series of blunders. A Google chatbot last year told its users it was safe to eat rocks. In 2023, a chatbot from Microsoft went rogue and repeatedly expressed its love for users. Meta was contacted for comment.

AI tools collect and store data about you from all your devices – here's how to be aware of what you're revealing

Yahoo | an hour ago

Like it or not, artificial intelligence has become part of daily life. Many devices – including electric razors and toothbrushes – have become 'AI-powered,' using machine learning algorithms to track how a person uses the device and how the device is working in real time, and to provide feedback. From asking questions to an AI assistant like ChatGPT or Microsoft Copilot to monitoring a daily fitness routine with a smartwatch, many people use an AI system or tool every day.

While AI tools and technologies can make life easier, they also raise important questions about data privacy. These systems often collect large amounts of data, sometimes without people even realizing their data is being collected. The information can then be used to identify personal habits and preferences, and even predict future behaviors by drawing inferences from the aggregated data.

As an assistant professor of cybersecurity at West Virginia University, I study how emerging technologies and various types of AI systems manage personal data and how we can build more secure, privacy-preserving systems for the future.

Generative AI software uses large amounts of training data to create new content such as text or images. Predictive AI uses data to forecast outcomes based on past behavior, such as how likely you are to hit your daily step goal, or what movies you may want to watch. Both types can be used to gather information about you.

Generative AI assistants such as ChatGPT and Google Gemini collect all the information users type into a chat box. Every question, response and prompt that users enter is recorded, stored and analyzed to improve the AI model. OpenAI's privacy policy informs users that 'we may use content you provide us to improve our Services, for example to train the models that power ChatGPT.' Even though OpenAI allows you to opt out of content use for model training, it still collects and retains your personal data. Although some companies promise that they anonymize this data, meaning they store it without naming the person who provided it, there is always a risk of data being reidentified.

Beyond generative AI assistants, social media platforms like Facebook, Instagram and TikTok continuously gather data on their users to train predictive AI models. Every post, photo, video, like, share and comment, including the amount of time people spend looking at each of these, is collected as data points that are used to build digital data profiles for each person who uses the service. The profiles can be used to refine the social media platform's AI recommender systems. They can also be sold to data brokers, who sell a person's data to other companies to, for instance, help develop targeted advertisements that align with that person's interests.

Many social media companies also track users across websites and applications by putting cookies and embedded tracking pixels on their computers. Cookies are small files that store information about who you are and what you clicked on while browsing a website. One of the most common uses of cookies is in digital shopping carts: when you place an item in your cart, leave the website and return later, the item will still be in your cart because the cookie stored that information. Tracking pixels are invisible images or snippets of code embedded in websites that notify companies of your activity when you visit their page. This helps them track your behavior across the internet.
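As a concrete illustration of the shopping-cart example above, the sketch below shows how a site could encode cart state in a cookie using Python's standard library. The cookie name and product identifier are hypothetical and chosen only for the example.

```python
# A minimal, hypothetical illustration of a shopping-cart cookie, using only
# Python's standard library. Names and values are made up for the example.
from http.cookies import SimpleCookie

# Server side: remember which item the visitor added to their cart.
cookie = SimpleCookie()
cookie["cart_item"] = "item_42"                  # hypothetical product identifier
cookie["cart_item"]["max-age"] = 7 * 24 * 3600   # keep it for a week
print(cookie.output())                           # the Set-Cookie header sent to the browser

# On a later visit the browser sends the header back and the site reads it,
# which is why the item is still in the cart when you return.
returned = SimpleCookie()
returned.load("cart_item=item_42")
print(returned["cart_item"].value)
```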
This is why users often see or hear advertisements that are related to their browsing and shopping habits on many of the unrelated websites they browse, and even when they are using different devices, including computers, phones and smart speakers. One study found that some websites can store over 300 tracking cookies on your computer or mobile phone.

Like generative AI platforms, social media platforms offer privacy settings and opt-outs, but these give people limited control over how their personal data is aggregated and monetized. As media theorist Douglas Rushkoff argued in 2011, if the service is free, you are the product.

Many tools that include AI don't require a person to take any direct action for the tool to collect data about that person. Smart devices such as home speakers, fitness trackers and watches continually gather information through biometric sensors, voice recognition and location tracking.

Smart home speakers continually listen for the command to activate or 'wake up' the device. As the device is listening for this word, it picks up all the conversations happening around it, even though it does not seem to be active.

Some companies claim that voice data is only stored when the wake word – what you say to wake up the device – is detected. However, people have raised concerns about accidental recordings, especially because these devices are often connected to cloud services, which allow voice data to be stored, synced and shared across multiple devices such as your phone, smart speaker and tablet. If the company allows, it's also possible for this data to be accessed by third parties, such as advertisers, data analytics firms or a law enforcement agency with a warrant.

This potential for third-party access also applies to smartwatches and fitness trackers, which monitor health metrics and user activity patterns. Companies that produce wearable fitness devices are not considered 'covered entities' and so are not bound by the Health Insurance Portability and Accountability Act. This means that they are legally allowed to sell health- and location-related data collected from their users.

Concerns about HIPAA data arose in 2018, when Strava, a fitness company, released a global heat map of users' exercise routes. In doing so, it accidentally revealed sensitive military locations across the globe by highlighting the exercise routes of military personnel.

The Trump administration has tapped Palantir, a company that specializes in using AI for data analytics, to collate and analyze data about Americans. Meanwhile, Palantir has announced a partnership with a company that runs self-checkout systems. Such partnerships can expand corporate and government reach into everyday consumer behavior. This one could be used to create detailed personal profiles on Americans by linking their consumer habits with other personal data. That raises concerns about increased surveillance and loss of anonymity, and it could allow citizens to be tracked and analyzed across multiple aspects of their lives without their knowledge or consent.

Some smart device companies are also rolling back privacy protections instead of strengthening them. Amazon recently announced that starting on March 28, 2025, all voice recordings from Amazon Echo devices would be sent to Amazon's cloud by default, and users would no longer have the option to turn this function off. This is different from previous settings, which allowed users to limit private data collection.
Changes like these raise concerns about how much control consumers have over their own data when using smart devices. Many privacy experts consider cloud storage of voice recordings a form of data collection, especially when used to improve algorithms or build user profiles, which has implications for data privacy laws designed to protect online privacy.

All of this brings up serious privacy concerns for people and governments about how AI tools collect, store, use and transmit data.

The biggest concern is transparency. People don't know what data is being collected, how the data is being used, and who has access to that data. Companies tend to use complicated privacy policies filled with technical jargon to make it difficult for people to understand the terms of a service that they agree to. People also tend not to read terms of service documents. One study found that people averaged 73 seconds reading a terms of service document that had an average read time of 29-32 minutes.

Data collected by AI tools may initially reside with a company that you trust, but can easily be sold or given to a company that you don't trust.

AI tools, the companies in charge of them and the companies that have access to the data they collect can also be subject to cyberattacks and data breaches that can reveal sensitive personal information. These attacks can be carried out by cybercriminals who are in it for the money, or by so-called advanced persistent threats, which are typically nation-state-sponsored attackers who gain access to networks and systems and remain there undetected, collecting information and personal data to eventually cause disruption or harm.

While laws and regulations such as the General Data Protection Regulation in the European Union and the California Consumer Privacy Act aim to safeguard user data, AI development and use have often outpaced the legislative process. The laws are still catching up on AI and data privacy. For now, you should assume any AI-powered device or platform is collecting data on your inputs, behaviors and patterns.

Although AI tools collect people's data, and the way this accumulation of data affects people's data privacy is concerning, the tools can also be useful. AI-powered applications can streamline workflows, automate repetitive tasks and provide valuable insights. But it's crucial to approach these tools with awareness and caution.

When using a generative AI platform that gives you answers to questions you type in a prompt, don't include any personally identifiable information, including names, birth dates, Social Security numbers or home addresses. At the workplace, don't include trade secrets or classified information. In general, don't put anything into a prompt that you wouldn't feel comfortable revealing to the public or seeing on a billboard. Remember, once you hit enter on the prompt, you've lost control of that information.

Remember that devices which are turned on are always listening – even if they're asleep. If you use smart home or embedded devices, turn them off when you need to have a private conversation. A device that's asleep looks inactive, but it is still powered on and listening for a wake word or signal. Unplugging a device or removing its batteries is a good way of making sure the device is truly off.

Finally, be aware of the terms of service and data collection policies of the devices and platforms that you are using. You might be surprised by what you've already agreed to.
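One practical way to act on that advice is to screen text before pasting it into a prompt. The sketch below is a rough, deliberately incomplete check for a few obvious patterns; the regular expressions and example text are assumptions made for illustration, not a complete PII detector or any vendor's tooling.

```python
# Rough, illustrative screen for a few obvious kinds of personal data before
# sending text to a chatbot. It will miss plenty; the patterns are examples only.
import re

PATTERNS = {
    "possible SSN":          r"\b\d{3}-\d{2}-\d{4}\b",
    "possible phone number": r"\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b",
    "possible email":        r"\b[\w.+-]+@[\w-]+\.[\w.]+\b",
}

def flag_sensitive(prompt: str) -> list[str]:
    """Return warnings for patterns that look like personal data."""
    return [label for label, pattern in PATTERNS.items() if re.search(pattern, prompt)]

draft = "Write a character letter for John, SSN 123-45-6789, email john@example.com"
warnings = flag_sensitive(draft)
if warnings:
    print("Think twice before sending this prompt:", ", ".join(warnings))
```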
This article is part of a series on data privacy that explores who collects your data, what and how they collect, who sells and buys your data, what they all do with it, and what you can do about it.

Previous articles in the series: How illicit markets fueled by data breaches sell your personal information to criminals

The Conversation will be hosting a free webinar on practical and safe use of AI with our tech editor and an AI expert on June 24 at 2pm ET/11am PT. Sign up to get your questions answered.

This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Christopher Ramezan, West Virginia University

Read more:
From help to harm: How the government is quietly repurposing everyone's data for surveillance
23andMe is potentially selling more than just genetic data – the personal survey info it collected is just as much a privacy problem
Governments continue losing efforts to gain backdoor access to secure communications

Christopher Ramezan receives funding from the Appalachian Regional Commission.
