
ChatGPT was down for some users

CNN · a day ago

Source: CNN
Popular AI chatbot ChatGPT is experiencing degraded performance after a partial outage on Tuesday, according to parent company OpenAI and the website Downdetector, which tracks outages of major web services.
OpenAI said it began investigating issues at 2:36 am on Tuesday, and the problems began to spike around 5:30 am, according to Downdetector's data. In addition to ChatGPT, the company's video generator, Sora, and application programming interface for developers are affected. At its peak, Downdetector received nearly 2,000 error reports on Tuesday morning.
The company is 'seeing a recovery' on its developer tools and ChatGPT, but previously said a full recovery across all impacted services could take hours.
ChatGPT worked normally in the 11 am ET hour when CNN asked it a question, but as of early Tuesday afternoon, OpenAI's website and Downdetector indicated the service was still experiencing problems.
On Tuesday morning, OpenAI said it was 'observing elevated error rates and latency across ChatGPT' in a post on X, adding that it has 'identified the root cause' and is 'working as fast as possible to fix the issue.'
OpenAI pointed CNN to its status page and X post when asked for further information on the issue.
The partial outage comes as ChatGPT has become a bigger presence in the office, with Glassdoor reporting that ChatGPT usage in the workplace had doubled within a year. Pew Research reports that 26% of US teens are using ChatGPT for schoolwork, up from 13% in 2023.
Some users joked on social media about struggling to answer basic questions and declining productivity during the outage. 'ChatGPT is down…Which means I actually have to type out my own emails at work. Send prayers,' one X post read.
Services like Zoom and X have also had high-profile outages this year.

Related Articles

Fapon Biopharma to Showcase Differentiated Pipeline, Including Phase 1 Immunocytokine FP008, and Innovative Technology Platforms at BIO 2025

Yahoo

23 minutes ago



BOSTON, June 12, 2025 /PRNewswire/ -- Fapon Biopharma, a clinical-stage biotech company innovating therapeutic antibodies and fusion proteins, is pleased to announce its participation in the BIO International Convention 2025 (BIO 2025), taking place June 16-19 at the Boston Convention & Exhibition Center. The company will exhibit at Booth #1851, presenting its differentiated pipeline, including the flagship Phase 1 immunocytokine FP008 and a portfolio of promising early-stage candidates for oncology and autoimmune diseases, along with its suite of proprietary technology platforms, while actively seeking global partnerships.

FP008, Fapon Biopharma's lead asset, is a first-in-class immunocytokine currently in Phase 1 clinical trials, designed to address significant unmet needs in solid tumor patients refractory to anti-PD-1 therapy. The company will also feature promising preclinical candidates targeting oncology (FP010, FP011, FPE021) and autoimmune diseases (FPE022, FPE024), highlighting its expanding research capabilities.

Fapon Biopharma will also feature its proprietary and innovative technology platforms, engineered to overcome complex drug development challenges:

  • Bi/Tri-TCE Platform: Human-monkey cross-reactive TCR/CD3 nanobody, enabling the design of potent multi-specific antibodies for targeted cancer immunotherapy.
  • FILTEN™ (IL-10M Fusion Protein Platform): Overcoming IL-10 limitations for broad applications in cancer and autoimmune diseases.
  • PROTiNb™ (Proteolysis Targeting Intra-Nanobody): A pioneering platform targeting previously "undruggable" intracellular targets, demonstrating a strong competitive edge.
  • FIND™ Mammalian Cell Display Platform: Accelerating antibody discovery by combining mammalian cell expression with high-throughput screening.

"We are excited to connect with the global biopharma community at BIO 2025," said Vincent Huo, President of Fapon Biopharma.
"We look forward to demonstrating the exciting progress of our internal pipeline and how our technology platforms can empower external partners to bring transformative therapies to patients faster."

Engagement Opportunities:

  • Exhibition Booth: #1851
  • Company Presentation: 11:30 a.m., Tuesday, June 17, 2025, Room 153A, Boston Convention & Exhibition Center

To schedule a meeting in advance or during the conference, please contact our BD representatives, Max Wang and Liyan Gao, or visit us at Booth #1851. Meetings can also be requested via the BIO partnering system.

About Fapon Biopharma

Fapon Biopharma specializes in discovering and developing biologics for treating cancers, autoimmune diseases, and other diseases where there are unmet medical needs. Leveraging cutting-edge technologies, we have built advanced drug discovery platforms, including an antibody discovery platform based on the globally leading mammalian cell display technology, a platform for generating IL-10M fusion proteins, and a platform for developing multispecific antibodies using Fibody and nanobodies. With a differentiated pipeline of leading drug candidates, we have established capabilities that cover the entire drug development process, from drug discovery, preclinical research, and Chemistry, Manufacturing and Controls (CMC) to early clinical development. Committed to innovation, we strive to deliver safer, more efficacious, affordable, and accessible biologics for everyone.

SOURCE Fapon Biopharma

After three days with iOS 26, I'm amazed by Apple's Liquid Glass redesign, but I have concerns

Android Authority

an hour ago



Dhruv Bhutani / Android Authority

The biggest buzz at WWDC 2025 was around Apple's spanking new Liquid Glass interface. From a unified year-based naming scheme for its platforms to what might be the most extensive visual overhaul to iOS in years, iOS 26 marks a significant shift in Apple's software approach. But is there substance beneath the divisive shiny sheen? I dove into the developer betas to give it a try.

Let me preface this by saying this first beta is very buggy, and I wouldn't recommend installing it on your primary phone. Still, if you're eager to explore it, just go to the 'Software Update' section under Settings and select 'Beta Updates.' That's all it takes. Since last year, Apple has dramatically simplified the beta sign-up process. Regardless, I'd highly recommend waiting for next month's public beta before installing the update. With that said, here are some of the most significant additions to iOS 26.

Liquid Glass: The most dramatic design overhaul since iOS 7

Apple's biggest change this year is the introduction of a new design language called Liquid Glass. If you're a design enthusiast or have experience in web design, you're likely familiar with glassmorphism. Liquid Glass builds on that aesthetic and makes extensive use of transparency and floating elements. More importantly, this redesign spans every Apple platform, from the iPhone to the iPad, Mac, Watch, TV, and even Vision Pro. It's Apple's first real attempt to unify the visual language across its entire ecosystem.

In practice, Liquid Glass means layers of translucent color, soft reflections, and depth that shift as you interact with your device. It's playful, dramatic, and distinctly Apple — for better or worse. The Home Screen shows this off best. App icons appear like digital glass, glinting based on the background. You'll notice bubble-like UI elements across the Photos app, the Fitness app, and even the Camera.
On the Lock Screen and in Control Center, most flat backgrounds are now translucent layers. It's a subtle but impactful shift that makes everything feel like it's floating rather than just sitting on top of your wallpaper.

In day-to-day use, not everything works perfectly yet. Readability suffers under all that transparency, especially in Control Center when it overlaps busy apps like the music player. The Lock Screen has similar issues. Some animations also feel inconsistent.

The interface tweaks continue in the browser, where you now get a near-full-screen view of the webpage with glass-inspired elements that pop out. As with the rest of the interface, there is ample reason to be concerned about readability (especially for those with accessibility needs), and your experience is entirely dependent on the background. Still, this is early beta territory, and Apple typically refines this by the time of public release.

Despite the mixed public consensus, I quite like the general direction that Apple is taking here. The interface looks futuristic to a fault, like something straight out of an Apple TV science fiction show, and I'm personally here for it. But even at this early stage, it is clear that a lot of pain points need to be addressed before the public rollout this September.

The new camera experience

The Camera app, too, has received a major, and much-needed, overhaul. In fact, this is the first time in years that Apple has rethought the camera UI from the ground up. While the basics remain the same, Apple has refined the layout to provide quicker access to controls. The refreshed interface makes it easy to move between modes like photos, videos, portrait, and more with a single swipe along the bottom edge. This feels intuitive and much more useful when composing shots.
Similarly, a subtle but welcome touch is how Apple now surfaces adjustments. In some ways, the Camera app has finally gained the 'Pro' mode users have been waiting for. Features such as switching between different recording settings, LOG video, and camera resolution are far more straightforward to access. While it's nowhere close to the level of Pro mode features in the best Android camera phones or dedicated third-party camera apps, it's a good compromise for casual enthusiasts who want more control without sacrificing simplicity.

A side effect of these changes is that the overwhelming number of animations and floating elements makes the interface feel slower than it is, with everything taking just half a second too long. I can't say for sure whether Apple will allow toned-down animations, but as it stands, the floaty feeling of the UI wears you down pretty quickly.

Apple Intelligence everywhere

It's fair to say that Apple's initial AI push has been somewhat underwhelming. When Apple Intelligence was announced last year, well behind the competition, it distinguished itself with a strong promise of privacy. A year later, a large portion of last year's promised features are still unavailable, making it difficult to take Apple's 2025 claims entirely seriously.

Regardless, among the newly announced features is deeper integration with the entire suite of on-device communication apps. Moreover, this year Apple is opening up access to its on-device LLM to third-party developers, which is bound to enable some very interesting and innovative use cases.

In Messages, FaceTime, and the Phone app, Live Translation now enables real-time translation of both text and audio. It functions within message threads and during calls, providing quick responses without requiring you to leave the app. I couldn't find a way to activate the feature in the beta.
Apple Intelligence still lags in effectiveness despite the interesting platter of system-wide integrations.

Similarly, Visual Intelligence now understands what's on your screen and can surface related results, links, or suggestions. For instance, if someone sends you a product image, you can ask the on-device intelligence to show you similar items from the web or pull up information about it without ever leaving the thread. Think of it as Apple's take on 'Circle to Search,' but leveraging the power of Apple's on-device LLM and ChatGPT. This is one of iOS 26's more exciting features, but once again, it is not yet available in the developer beta.

Genmoji and Image Playground are also part of this AI layer. The feature works exactly as you'd imagine, letting you combine existing emoji, photos, and descriptive text prompts to generate custom stickers and images. The results are pretty good, as you can see in the screenshot above. It's not something I'd use very often, but better on-device image and emoji generation is effectively table stakes, so an improved experience is very welcome. While these tools feel like fun party tricks for now, their true power lies in deep system-wide integration.

The other feature I found exciting was deeper integration of AI into Apple's on-device scripting service. Apple Intelligence is now available in the Shortcuts app, enabling you to create smarter automations. This means you can integrate Apple's on-device LLM, or even ChatGPT, into a shortcut and use it to parse data before passing it on to another app. I can envision use cases like instantly splitting a tab or summarizing any on-screen content, such as an Instagram post. In fact, it took me minutes to get a shortcut up and running that automatically creates a note from a shared Instagram post after passing it through the on-device LLM. That's very cool.
A smarter battery dashboard

Speaking of everyday features, Apple has finally overhauled the Battery section in Settings. The new interface replaces the 24-hour and 10-day views with a more digestible weekly breakdown. It then compares your average battery consumption to your daily usage, highlighting which apps are consuming power and why. Tapping into any given day reveals a split between active screen time and idle background use. It's very similar to the battery insights available to Android users and is a welcome addition.

Dig deeper, and you'll also find a new Adaptive Power Mode. Unlike the static Low Power Mode, Adaptive adjusts in real time based on how aggressively you're using your phone. It can dim the screen or scale back background tasks without requiring user input. You still get the manual 80% charge limiter and battery health metrics, but the focus here is on smarter defaults.

Settings, Keyboard, Messages, and other subtle improvements

In addition to the big hits, numerous smaller quality-of-life improvements are sprinkled throughout the OS. The keyboard feels chunkier and more precise, with better haptic feedback. There's a new Preview app that lets you perform a wide range of file-based functions, including, of course, previewing files. The Settings app has undergone minor restructuring. While not a radical shift, the app feels cleaner and faster to navigate with its revamped font sizing and kerning.

In Messages, you can now set custom backgrounds per conversation, adding a bit more personality to threads. Apple has also added a polls feature for group chats, something that arguably should have existed years ago.

The Phone app has also received some attention. It now unifies the Recents, Favorites, and Voicemails tabs into a single, streamlined interface. The most significant addition is Call Screening. It screens unknown callers by gathering context and offering options to respond or dismiss them without ever answering.
Hold Assist is another helpful tool. If you're stuck in a call queue with customer support, your iPhone can now wait on hold for you and alert you when a human finally joins the line.

iOS 26 also introduces a dedicated Apple Games app. It acts as a central hub for all things gaming on your device, effectively serving as a lightweight but genuinely useful Game Center replacement. The app pulls in your installed games, offers personalized recommendations, and lets you see what your friends are playing. Achievements, leaderboards, and Game Center invites are now neatly tucked into this space. Apple is clearly trying to make iOS gaming feel more like a platform and less like a series of one-off downloads, but it remains to be seen whether it will win significant adoption.

So, is iOS 26 worth the hype?

It's hard to say definitively at this early stage. There's no doubt that Liquid Glass gives iOS a bold new face, and the updated Apple Intelligence features feel like the beginning of something genuinely useful. But right now, it's mostly potential. Many features are buggy or half-baked, and even improvements like those in the camera app require further polish. To be fair, this is a developer beta. I'll reserve judgment until the final release rolls out later this year, but what is undeniable is that this is the most ambitious update Apple has shipped in years.

AOSP isn't dead, but Google just landed a huge blow to custom ROM developers

Android Authority

an hour ago



Mishaal Rahman / Android Authority

TL;DR

  • Google has made it harder to build custom Android ROMs for Pixel phones by omitting their device trees and driver binaries from the latest AOSP release.
  • The company says this is because it's shifting its AOSP reference target from Pixel hardware to a virtual device called 'Cuttlefish' to be more neutral.
  • While Google insists AOSP isn't going away, developers must now reverse-engineer changes, making the process of supporting Pixel devices more difficult.

Earlier this year, Google announced it would develop the Android OS fully in private to simplify its development process. By focusing its efforts on a single internal branch, Google aimed to streamline work that was previously split. The news initially spooked some in the Android development community, but the controversy quickly subsided. The impact was minimal, as Google was already developing most of Android behind closed doors and had promised that source code releases would continue. Now, however, a recent omission from Google has rekindled fears that the company might stop sharing source code for new Android releases, though Google has stated these concerns are unfounded.

As promised, Google published the source code for Android 16 this week, allowing independent developers to compile their own builds of the new operating system. This source code was uploaded to the Android Open Source Project (AOSP), as usual, under the permissive Apache 2.0 license. However, multiple developers quickly noticed a glaring omission from the Android 16 source code release: the device trees for Pixel devices were missing. Google also failed to upload new driver binaries for each Pixel device and released the kernel source code with a squashed commit history. Since Google has shared the device trees, driver binaries, and full kernel source code commit history for years, their omission from this week's release was concerning.
These omissions led some to speculate this week that Google was taking the first step in a plan to discontinue AOSP. In response, Google's VP and GM of Android Platform, Seang Chau, refuted these claims. In a post on X, he addressed the speculation, stating that 'AOSP is NOT going away.'

He also confirmed that the omission of Pixel device trees is intentional, stating that 'AOSP needs a reference target that is flexible, configurable, and affordable — independent of any particular hardware, including those from Google.' Instead of supporting AOSP builds on Pixel devices, Google will support the virtual Android device 'Cuttlefish' as its reference target. Cuttlefish runs on PCs, allowing Google and platform developers to test new hardware features. Google will also continue to support GSI targets, which are generic system images that can be installed on nearly any Android device.

On one hand, this logic is sound. Google wants to move away from using Pixels as the AOSP reference device and is making changes to that effect. As Seang Chau notes, 'AOSP was built on the foundation of being an open platform for device implementations, SoC vendors, and instruction set architectures.' In that regard, Cuttlefish is a more appropriate reference target because it isn't a heavily customized piece of consumer hardware like a Pixel phone. However, since Cuttlefish is a virtual device, it can only simulate how hardware features behave, making it an imperfect reference in some ways.

The more significant issue, however, is the impact this decision will have on developers who build custom ROMs — the community term for hobbyist forks of AOSP. Nolen Johnson, a long-time contributor and reviewer for the LineageOS project, says the process of building these ROMs for Pixel phones will become 'painful' moving forward. Previously, Google made it simple for developers to build AOSP for Pixel devices, but that support is now gone.
Developers previously had to simply 'pull the configurations [that] Google created,' add their customizations, and then build. Now, however, they will need to take the old device trees that Google released for Android 15 and 'blindly guess and reverse engineer from the prebuilt [binaries] what changes are needed each month.' This is because making a full Android build for a device — not just a GSI — requires a device tree: a 'collection of configuration files that define the hardware layout, peripherals, proprietary file listings, and other details for a specific device, allowing the build system to build a proper image for that device.' While Google previously handled this work, developers must now create their own device trees without access to the necessary proprietary source code.

Furthermore, Google's decision to squash the kernel source code's commit history also hinders custom development. The Pixel's kernel source code was often used as a 'reference point for other devices to take features, bug fixes, and security patches from,' but with the history now reduced to a single commit, this is no longer feasible.

Google is under no obligation to release device trees, provide driver binaries, or share the full kernel commit history (in fact, it's one of the few device makers to do these things), but it has done so for years. The company's reason was that the Pixel was treated as a reference platform for AOSP, so developers needed an easy way to build for it. Google's decision to discontinue the Pixel as an AOSP reference device is unfortunate, as it has pulled the rug out from under developers like the teams at LineageOS and GrapheneOS who build Android for Pixel devices. These developers will still be able to build AOSP for Pixels, but doing so will be more difficult and painful than before.

Got a tip? Talk to us! Email our staff at news@ . You can stay anonymous or get credit for the info; it's your choice.
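To make the device-tree concept concrete, here is a hypothetical, heavily simplified sketch of the kind of product makefile such a tree contains. The file layout and variable names follow AOSP's documented build conventions, but the device name and every value below are invented for illustration; real Pixel trees span many files and thousands of lines:

```makefile
# device/google/example/device.mk (hypothetical, illustrative fragment)
# A device tree tells the AOSP build system how to assemble an image
# for one specific piece of hardware.

# Inherit common 64-bit and telephony product configuration from AOSP.
$(call inherit-product, $(SRC_TARGET_DIR)/product/core_64_bit.mk)
$(call inherit-product, $(SRC_TARGET_DIR)/product/full_base_telephony.mk)

# Basic product identity.
PRODUCT_NAME := aosp_example
PRODUCT_DEVICE := example
PRODUCT_BRAND := Android
PRODUCT_MODEL := AOSP on Example

# Device-specific configuration files copied into the vendor image.
PRODUCT_COPY_FILES += \
    device/google/example/media_codecs.xml:$(TARGET_COPY_OUT_VENDOR)/etc/media_codecs.xml
```

It is files like these, plus the matching driver binaries, that Google stopped publishing for Android 16; without them, ROM maintainers must reconstruct equivalents by comparing prebuilt factory images against the old Android 15 trees.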
