
AstroAI recalls minifridges over fire hazard that caused $360,000 in property damage
AstroAI on Wednesday recalled 249,100 minifridges over fire and burn hazards after two of the compact refrigerators caught fire, causing hundreds of thousands of dollars in property damage.
The personal refrigerators' electrical switch can short-circuit, posing fire and burn risks, according to a notice from the Consumer Product Safety Commission (CPSC).
The CPSC has received at least 70 reports of the minifridges emitting smoke, burning, melting or catching fire. The two fires caused more than $360,000 in property damage, the recall notice states.
Manufactured in China, the recalled 4-liter, 6-can-capacity minifridges measure 9.45 inches deep, 6.9 inches wide and 10 inches tall. Available in black, white, blue and pink, the fridges were sold online for about $40 on Amazon.com and Astroai.com from June 2019 through June 2022.
The recalled minifridges' model number is LY0204A, which is printed on a label on the back of the product. The nine-digit serial numbers of affected units appear after "S/N" on the same label and begin with 19, 20, 21, 2201, 2202 or 2203.
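For anyone checking multiple units at once, such as a retailer or facilities team reviewing inventory, the identification rule above can be expressed as a short script. This is a minimal sketch based only on the description in the notice, not an official AstroAI or CPSC tool; the function name and the label parsing are assumptions.

```python
# Minimal sketch (not an official AstroAI or CPSC tool) of the identification
# rule described in the recall notice. Function name and parsing are assumptions.

AFFECTED_MODEL = "LY0204A"
AFFECTED_SERIAL_PREFIXES = ("19", "20", "21", "2201", "2202", "2203")

def is_recalled(model_number: str, serial_number: str) -> bool:
    """Return True if a unit matches the recalled model and serial-number prefixes."""
    # The serial number follows "S/N" on the back label; strip that marker if present.
    serial = serial_number.upper().removeprefix("S/N").strip()
    return (
        model_number.strip().upper() == AFFECTED_MODEL
        and len(serial) == 9
        and serial.isdigit()
        and serial.startswith(AFFECTED_SERIAL_PREFIXES)
    )

# Example: a nine-digit serial beginning with 2203 on model LY0204A is affected.
print(is_recalled("LY0204A", "S/N 220312345"))  # True
print(is_recalled("LY0204A", "180012345"))      # False: prefix 18 is not listed
```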
Consumers are urged to stop using the recalled fridges immediately and to contact AstroAI for a replacement product, by email at recall@astroai.com or through a dedicated page on the company's website at https://www.astroai.com/product-recall.
The recalled minifridges should be disposed of in accordance with state and local waste disposal procedures, the CPSC states.
