
Forget iPhone 17 — iPhone 18 Pro could have this long-awaited Face ID feature
We haven't officially seen the iPhone 17 yet, but we're already hearing rumors about what the iPhone 18 might have to offer next year. And the latest tidbit suggests one long-running rumor may finally come to pass on the iPhone 18 Pro.
According to display analyst Ross Young, who has a solid track record with display-related leaks, we could see the iPhone 18 Pro sport an under-display Face ID system.
In a recent tweet, Young relayed a claim from OTI Lumionics CEO Michael Helander that "they expect phones with under panel Face ID using their materials to be available for sale in 2026."
Young added this suggests under-display Face ID could be coming to iPhone 18 Pro — followed by other phones in the future.
"At the SID Business Conference today, OTI Lumionics CEO Michael Helander confirmed that they expect phones with under panel Face ID using their materials to be available for sale in 2026. This suggests that iPhone 18 Pro models will have under panel Face ID with other brands and…" (Ross Young, May 14, 2025)
This is far from the first time we've heard reports about this. Earlier this month, a report from The Information claimed that under-display Face ID was coming to the iPhone 18 Pro, while another rumor from Digital Chat Station claimed that there could be under-display camera tech on the iPhone 18 Pro series.
It's unclear whether that rumor counted Face ID as camera tech, but either way it would mean slightly more room on the display.
While Face ID hardware could be vanishing beneath the display by next year, word is that Apple plans to go even further. The 20th-anniversary iPhone 20 is expected to have a full-screen, edge-to-edge display with absolutely no cutouts, notches or pills to speak of.
If the iPhone X was anything to go by, the iPhone 20 will likely be the template for all flagship iPhones going forward. So while the iPhone 18 may still have a hole-punch for the selfie camera, the rumors suggest that won't be sticking around for much longer either.
I just hope that Apple can figure out how to make an under-display camera work without compromising too much on quality. Hidden selfie cameras are nothing new, but the fact that they're nowhere near as good as an unobstructed camera has always been a problem.
If Apple can nail that, then it could bring in a whole new era of full-screen phones.
Related Articles


Fast Company
4 observations about Apple's low-key WWDC 2025
At Apple's annual WWDC keynote, the highest-level subject is always the future of its software platforms. And the big news in that department usually stares us right in the face. In 2023, for example, it was the debut of Apple Vision Pro, the company's entry into the headset market and its first all-new experience since the Apple Watch. Last year brought Apple Intelligence, its branded take on what AI should look like as a core element of computing experiences. And then there was Monday morning's WWDC 2025 keynote, as streamed online to millions and screened to a select audience of in-person attendees at Apple Park.

After Apple's embarrassing inability to ship the AI-infused update to Siri it showed off at WWDC 2024, it was hardly surprising that this year's event didn't bet everything on whipping up a further AI frenzy. That alone set it apart from last month's Google I/O keynote, whose topics consisted of AI, AI, and more AI, with some AI drizzled on top. Apple did introduce some new AI during the keynote—quite a bit of it. Overall, though, the event felt like an act of counterprogramming. Instead of positioning itself as a leader in AI—or at least quashing fears that it's a laggard—the company seemed happy being itself. From the unified new design to old features (phone calls!) turning up in new places (the Mac!), it focused on giving consumers even more reasons to own and use as many of its products as possible. Herewith a few of the impressions I took away from my morning at Apple Park:

Liquid Glass is classic Apple, in the Steve Jobs sense. In 2012, one of Tim Cook's first dramatic moves after succeeding Jobs as CEO was to oust software chief Scott Forstall. That led to a reorganization that put Jony Ive in charge of design for software as well as hardware. Ive's influence was seen in the iPhone's iOS 7 upgrade the company shipped the following year.
It ditched the lush skeuomorphism of the iPhone's software up until that time for a far flatter look, bringing to mind the understated, Dieter Rams-like feel of an Ive MacBook, manifested in pixels rather than aluminum. Ive left in 2019, but the principles he instilled have informed Apple software ever since. But now there's Liquid Glass, a new aesthetic Apple is rolling out across its portfolio of platforms. It's glossy, dimensional, pseudorealistic, and animated—a dramatic departure from iOS 7-era restraint, but reminiscent of both earlier iOS releases and older Apple software all the way back to the first version of the Mac's OS X in 2000. That was the one with buttons that Jobs said people would want to lick—a memorable design imperative that is suddenly relevant again. As my colleague Mark Wilson writes, Liquid Glass isn't about adding new functionality to Apple devices. It might not even be about making them easier to use—in fact, when an interface introduces transparency effects and other visual flourishes, legibility is at risk. It does, however, look cool in a way that's classically Apple, and which the Apple of recent years had deemphasized.

The iPad has left limbo . . . for Macland. For years, Apple seemed to have reached a mental standstill with the iPad. The company clearly wanted its tablet to be something distinct from a Mac, but it also appeared to be short on ideas that were different from the Mac, especially when it came to building out iPadOS as a productivity platform. End result: The platform has foundered rather than matured. With iPadOS 26, the iPad will finally see a lot of meaningful change all at once, and most of it is distinctly Maclike. It's getting a menu bar. Windows that float and overlap. A more full-featured Files app and, for the first time, a Preview app. Even the quirky circular cursor gives way to a more conventional pointy one. As an unabashed iPad diehard, I admit to my fair share of trepidation about all this.
The iPad's abandonment of interface cruft in favor of considered minimalism is a huge reason why I've been using one as my primary computer since 2011: I don't like to wrangle windows or scour menus for the features I need, hidden among those I don't. Maybe Apple has figured out how to retain what's great about the iPad even as it gives in to the temptation to borrow from the Mac. But I'm alarmed by the apparent disappearance of the iPad's foundational multitasking features in the first iPadOS 26 beta, and hope they'll return before the software ships this fall.

VisionOS is still evolving, and that's good. It's been two years since Apple unveiled the Vision Pro and 17 months since it shipped. Rumors aside, we still aren't any closer to clarity on how the $3,500 headset might lead to a product that caters to a larger audience than, well, people who will pay $3,500 for a headset. Even Tim Cook says it isn't a mass-market product. Still, Apple's enthusiasm for spatial computing doesn't seem to be flagging. As previewed during the WWDC keynote, VisionOS 26 looks downright meaty, with more realistic-looking avatars for use in video calls, features for watching movies and playing games with Vision Pro-wearing friends, widgets you can stick on a wall or place on a mantel in the real world, AI-powered 3D effects for 2D photos, partnerships with companies such as GoPro and Sony, and more. None of these additions will prompt radically more people to spring for a Vision Pro in its current form. But assuming that the headset doesn't turn out to be a dead end, Apple's current investment could help a future, more affordable version offer compelling experiences from day one.

It's still unclear whether ChatGPT is a feature or a stopgap.
Apple's own AI assistant, Siri, was acknowledged only at the start of the keynote, when Craig Federighi, senior VP of software engineering, mentioned last year's announcements and the decision to delay the newly AI-savvy version until it meets Apple's 'high-quality bar.' Another AI helper did pop up several times during the presentation, though: ChatGPT. For example, it powers a new Visual Intelligence feature that will let users ask questions about the stuff on-screen in any app. The keynote's example: Upon seeing an image of a mandolin in a social post, you can ask, 'Which rock songs is this instrument featured in?' Given that the new Siri features Apple revealed a year ago remain unfinished, adding a dash of ChatGPT here and there is an expedient way to maintain some AI momentum. But does the company see integrating the world's highest-profile LLM-based assistant as an attractive user benefit in itself—or just a placeholder until it can offer similar technology that's entirely under its own control? I'm still not sure. At WWDC 2024, Federighi also talked about incorporating other AI models, such as Google's Gemini, but no news has emerged on that front since.

Even during a pivotal, unpredictable time for the tech industry, one of the WWDC keynote's purposes remains straightforward. Apple needs to get consumers excited for the software it will ship in the fall, which isn't necessarily synonymous with blowing them away through sheer force of AI breakthroughs. In a Bluesky conversation, one commenter suggested to me that people aren't actually clamoring for AI at all—a take that has a whiff of truth to it even if it isn't the whole story. Ultimately, users want pleasant products that help them get stuff done, whether in a personal context, a work environment, or somewhere in between.


The Verge
Craig Federighi confirms Apple's first attempt at an AI Siri wasn't good enough
In March, Apple delayed its upgraded Siri, saying that 'it's going to take us longer than we thought to deliver' the promised features. At WWDC this week, Apple's SVP of software Craig Federighi and SVP of worldwide marketing Greg Joswiak shared more details about the decision to delay in an interview with The Wall Street Journal's Joanna Stern.

As part of its initial Apple Intelligence announcements at WWDC 2024, Apple said that the improved Siri would have awareness of your personal context and the ability to take actions for you in apps. While Apple was showing real software at that show, Siri 'didn't converge in the way, quality-wise, that we needed it to,' Federighi said. Apple wanted it to be 'really, really reliable. And we weren't able to achieve the reliability in the time we thought.'

'Look, we don't want to disappoint customers,' Joswiak said. 'We never do. But it would've been more disappointing to ship something that didn't hit our quality standard, that had an error rate that we felt was unacceptable. So we made what we thought was the best decision. I'd make it again.'

Stern asked why Apple, with all of its resources, couldn't make it work. 'When it comes to automating capabilities on devices in a reliable way, no one's doing it really well right now,' Federighi said. 'We wanted to be the first. We wanted to do it best.' While the company had 'very promising early results and working initial versions,' the team came to feel that 'this just doesn't work reliably enough to be an Apple product,' he said.

At WWDC, Federighi also spoke to YouTuber iJustine, and both Federighi and Joswiak were interviewed by Tom's Guide's Mark Spoonauer and TechRadar's Lance Ulanoff. In its March statement, Apple said it anticipated rolling out the Siri upgrades 'in the coming year,' which Joswiak clarified to Spoonauer meant 2026.


The Verge
Hands on with macOS Tahoe 26: Liquid Glass, new theme options, and Spotlight
At WWDC, Apple announced its new Liquid Glass design language, which is coming to all of its devices, including Macs. I've been tinkering with the macOS Tahoe 26 developer beta on the M4 MacBook Air for about a day. So far, the aesthetic changes range from slick to slightly overwrought, but the new Spotlight search features are nifty and useful.

There are new touches of glassy transparency all over macOS 26, including the Dock, Finder, widgets, and built-in apps. It's more subtle than on the iPhone, mostly because the Mac's much larger screen real estate makes the Liquid Glass elements more like accents than whatever this mess is supposed to be. I'm not very fond of it just yet, but maybe it will grow on me, as UI changes tend to. The Dock now has a frosted background that's more translucent than Sequoia's flatter design. The hazy, frozen-glass aesthetic also extends to widgets, like the calendar and weather, and drop-down menus — though the latter have much higher opacity. The pop-ups for volume and brightness now use this distorted glass look as well, though they've moved to the top-right corner of the screen instead of being centered above the Dock. Frankly, they're ugly, and I find their new elongated horizontal look strange and out of place.

Surprisingly, the Menu Bar at the top of the screen is now invisible, so it no longer masks the screen's notch cutout with a dark gray bar. At first I found this slightly jarring, but I adjusted to it quickly, just as I did the first time I saw a notched MacBook. Even with a bright wallpaper exposing its edges, it became mostly innocuous. (If you really hate it, you can enable 'Reduce transparency' in the accessibility menu, bringing back the filled-in Menu Bar and killing pretty much all of Tahoe's other transparent effects.)
The one cool thing the invisible Menu Bar enables is a new animation: when you three-finger swipe up for Mission Control, a glass pane descends from the top and distorts the view of the wallpaper underneath. It's a kitschy flourish, but it's one of the few effects in Tahoe that tickles me. Widgets now live on the desktop instead of requiring a swipe-over of the Notification Center, allowing you to populate your desktop with lots of glanceable info, like an iPad home screen, if you choose.

Open a Finder window and you see more of Tahoe's rounded design, with the sidebar now looking like its own tall, oval-ish nested window. Dark mode and light mode show some differences here, with light mode flattening the Finder windows quite a bit more than its darker version, which looks more glassy to me. The theme controls that launched with iOS 18 are now in macOS. Opening the Appearance menu lets you change Tahoe's overall look (light, dark, and auto), highlight colors, and icon and widget styles. The right (or wrong) combination of these settings can dramatically change macOS's looks, from minimalist to garish.

More exciting for power users are the changes to Spotlight that make it much easier to operate your Mac by keyboard alone. Spotlight search now gives you shortcuts to finding files, launching apps, performing actions, and accessing clipboard history. Pressing Command and Space calls up Spotlight as it always has, but now if you hover over the search bar with the mouse, you're shown four icons for those new functions, with each offering a handy keyboard shortcut. Now this is spotlighting: pressing Command plus 1, 2, 3, or 4 gives you quick access to Apps, Files, Shortcuts, and Clipboard. Then you can type out whatever you're searching for or trying to do. The Apps drawer can act as a mini categorized launcher. Files puts suggestions and recents at the top.
Shortcuts allows you to type out functions you'd like your Mac to perform via compatible apps. Clipboard is a reverse-chronological history of the most recent stuff you copied. I really like the ability to set custom quick key commands. For example, I set 'M' to be the quick key for a message, and 'TM' to set a timer. Each of those actions requires typing out some part of the prompt, like the number of minutes in your timer or the contents of a message and the recipient. But if you like using lots of hotkeys and navigating around an app with the Tab and Alt keys, you're likely to feel right at home.

Several readers were quick to comment that this is Apple 'sherlocking' Raycast, a much more customizable and expansive Spotlight alternative. Raycast can do math and unit conversions, set timers, and keep its own appendable clipboard history, among other things, and it also supports third-party extensions. While the changes in macOS Tahoe let Spotlight encroach on some of the things Raycast can do, it's not quite as expansive. At least, not yet. Raycast is a power-user tool, and it could take Apple some time and a lot more development to win over those users.

I've been using the first Tahoe developer beta for about a day. There will be plenty more to learn about macOS Tahoe as developers continue using it in its current beta form and Apple delivers more updates. The public beta isn't coming until sometime next month, and it's possible that Apple will push out some sizable changes and UI tweaks even before then.