New iPhone 17 Pro Max renders give us the best look yet at the flagship phone
When you buy through links on our articles, Future and its syndication partners may earn a commission.
Quick Summary
A new render compiles all of the rumours about the iPhone 17 Pro Max into a single design.
The result is a big departure from iPhone designs of old.
If you're an iPhone fan, chances are you're already looking forward to the next big release from the brand. Historically, we see the next generation of its flagship handsets around September each year, though the ever-turning rumour mill ensures we get a pretty good idea beforehand.
This year, the iPhone 17 range will have a lot riding on it. The brand has had a strong start to 2025, with the launch of the iPhone 16e and a first-ever Q1 lead over Samsung.
Now, we've got the best look yet at the iPhone 17 Pro Max model. That comes from renowned concept maker 4RMD, who created the renders based on the flurry of leaks and rumours we've heard so far.
As you've probably already guessed, they're interesting to look at. We've already reported on the design changes expected for the new handsets, which in the case of the Pro Max includes a wholly new camera bar.
That spans the full width of the handset, with the camera modules themselves staying in the same spot as on current models, and the flash and microphone shifted to the far side of the case. Think of the Google Pixel 9 camera bar but about twice the size and you're in the right ballpark.
Elsewhere, the renders suggest a return of Touch ID via the haptic Camera Control button which launched on the iPhone 16 range. The handset is also said to feature a Dynamic Island that is 30% smaller than on current devices.
The cameras are also expected to get a boost, with a rear trio of 48MP sensors and 5x optical zoom on the periscope telephoto lens. Still, it's the front-facing camera that's arguably more exciting: the video suggests it could be a 24MP unit.
Of course, all of this is speculation – we won't know the actual specs of the device until the range launches later in the year. Still, it's fun to think about. Ultimately, if the device showcased here were to launch at the end of the year, it would probably be quite a hit among the iPhone crowd.

Related Articles
Yahoo · 21 minutes ago
Crashed lander looks back at Earth from the moon
Resilience, a lunar lander built and operated by the Japanese company ispace, was part of the Hakuto-R Mission 2, aiming to deploy a "Moonhouse," a tiny colorful art piece, on the moon while also exploring its surface features using its Tenacious rover.

After launching on Jan. 15 from Florida's Space Coast on Hakuto-R Mission 2, ispace's Resilience lander arrived in lunar orbit on May 6. The lander then deftly shifted its path to an orbit just 62 miles (100 kilometers) above the moon's surface. On May 27, Resilience took this photo, with the view of Japan blocked by clouds on Earth's surface, according to an ispace tweet. At the time, Resilience was circling the moon with Mare Frigoris ("Sea of Cold"), on the moon's near side, as its planned landing site.

With Resilience, ispace hoped to be one of the first companies to land a private spacecraft on the moon. Unfortunately, its first mission, in April 2023, ended in failure when the lander crashed during its touchdown attempt. Undeterred, ispace — in collaboration with other agencies like NASA and JAXA — designed and tested Resilience as part of Hakuto-R Mission 2 (the R stands for "reboot"). Resilience carried five payloads, including a small 11-pound (5-kilogram) rover named Tenacious, which would have been used to collect lunar samples, according to NASA.

The landing, scheduled for June 5, 2025, came to an abrupt halt when telemetry data from the lander stopped coming in just before the soft landing, leaving the world wondering what had happened to Resilience. A few hours later, ispace announced that Resilience had likely crashed on the moon, bringing an end to the mission. You can read more about Resilience and ispace's other missions as the company tries to return to the moon.
Yahoo · 30 minutes ago
AI wasn't the focus of Apple WWDC. Here's why.
Apple's (AAPL) Worldwide Developers Conference (WWDC), the company's annual developer event, is underway. Yahoo Finance Tech Editor Dan Howley joins Market Domination from Cupertino, California to share what Apple's WWDC signals about the company's AI efforts and where the iPhone maker stands in the AI race compared to other Big Tech players like Meta (META) and Alphabet (GOOG, GOOGL). To watch more expert insights and analysis on the latest market action, check out more Market Domination here.

All right, well, it is day two of Apple's Worldwide Developers Conference, which is taking place in California. And joining us now from that conference is Yahoo Finance tech editor Dan Howley. Dan.

Yeah, Josh, this is day two, as you said. Developers are still milling around, and analysts and journalists like myself are also here. There are some takeaways we're getting from the event yesterday. One of those is Apple's approach to AI and how that's been a slow trickle, from the huge explosion we saw last year with Apple Intelligence to what we saw this year, where instead of a new big splashy announcement, they had smaller bits of AI that they're weaving throughout their operating systems. It looks like that's going to be Apple's path forward, at least for now. And then we hear something about that new generative AI-powered Siri that's supposed to come at some point, either this year or next year; we're still not entirely sure. That seems to be the way that Apple can compete and potentially win this AI race they have against the likes of Google and Samsung.

I think one of the things that gets conflated, because the idea of AI is so large, is comparing Apple's approach to something like Microsoft's, which obviously provides the majority of its AI services to enterprises. Yes, they do have consumers, but if you look at the balance, it's going to be on the enterprise side. Google obviously competes with Apple on smartphones as well, but for Google it really comes down to the enterprise and what AI can offer to advertising. So Apple's approach is naturally going to be different, but with this way of moving forward in dribs and drabs, I got to see a few of these features, and they are truly helpful updates. It's not just the make-an-emoji stuff; there are also ways of adding functionality to existing apps with AI that so far seem pretty helpful for users.

Dan, here's my big question for you as a reporter who has followed Apple for a long time. Apple is not interested in being first; as you know, they're interested in getting it right and then succeeding in that market. And listen, that has proved to be a very successful tactic and strategy for Apple. Can Apple do that again, Dan, in your opinion, when it comes to AI?

I think they can, right? I mean, look, as you said, they've done this again and again. They were not the first with music players. I think the Microsoft Zune might have been before that, or maybe not the Zune, but there were things like MiniDisc players that tried to make smaller music players, and then the iPod came out and killed them. They weren't the first when it came to smartphones, or the idea of smartphones; there were ideas out there already, but they honed it and created the iPhone.

And it's the same thing with wireless earbuds and smartwatches and fitness trackers, what have you. They've done this over and over and over again. It's their MO: they let everyone trip over themselves getting out of the starting blocks to be first, and then they take their time and provide products that, more often than not, don't miss and give general utility to users. I think that's what they're doing here. Some of the offerings they have now include the ability to create spreadsheets very quickly from recipes that you may have and scale that up for, say, parties and things along those lines. I think those are really truly helpful, rather than the whole idea of let's take pieces out of images. They also spent a lot of this event catching up, really. Google has what's called Circle to Search, and Samsung also offers it, where you can circle something on your phone, whether you take a picture of it or it's on your screen, and then search for it really quickly. Now Apple has that in its Visual Intelligence feature, and that's a truly helpful one. I used something like that on my own to search for a flower that I had seen and wanted to know if it was bad for my cats or not. It turns out it was. Got it out of the house. This adds to that. So I think that strategy could absolutely take off here. Wall Street obviously wants to see a big, explosive, splashy announcement. They're not getting that yet. We'll see what happens when the new Siri comes out, and whether Siri is even helpful for users: are we going to want to use it daily, or is it just going to be something we use for a minute and then fall out of love with, like the prior Siri? Then these smaller announcements become the more important ones. We shall see. Still a lot of questions after this event.

Dan, thank you. Appreciate it.


Atlantic · 34 minutes ago
Good Taste Is More Important Than Ever
There's a lesson I once learned from a CEO—a leader admired not just for his strategic acumen but also for his unerring eye for quality. He's renowned for respecting the creative people in his company. Yet he's also unflinching in offering pointed feedback. When asked what guided his input, he said, 'I may not be a creative genius, but I've come to trust my taste.' That comment stuck with me. I've spent much of my career thinking about leadership. In conversations about what makes any leader successful, the focus tends to fall on vision, execution, and character traits such as integrity and resilience. But the CEO put his finger on a more ineffable quality. Taste is the instinct that tells us not just what can be done, but what should be done.

A corporate leader's taste shows up in every decision they make: whom they hire, the brand identity they shape, the architecture of a new office building, the playlist at a company retreat. These choices may seem incidental, but collectively, they shape culture and reinforce what the organization aspires to be. Taste is a subtle sensibility, more often a secret weapon than a person's defining characteristic. But we're entering a time when its importance has never been greater, and that's because of AI. Large language models and other generative-AI tools are stuffing the world with content, much of it, to use the term du jour, absolute slop. In a world where machines can generate infinite variations, the ability to discern which of those variations is most meaningful, most beautiful, or most resonant may prove to be the rarest—and most valuable—skill of all.

I like to think of taste as judgment with style. Great CEOs, leaders, and artists all know how to weigh competing priorities, when to act and when to wait, how to steer through uncertainty. But taste adds something extra—a certain sense of how to make that decision in a way that feels fitting. It's the fusion of form and function, the ability to elevate utility with elegance. Think of Steve Jobs unveiling the first iPhone. The device itself was extraordinary, but the launch was more than a technical reveal—it was a performance. The simplicity of the black turtleneck, the deliberate pacing of the announcement, the clean typography on the slides—none of this was accidental. It was all taste. And taste made Apple more than a tech company; it made it a design icon. OpenAI's recently announced acquisition of Io, a startup created by Jony Ive, the longtime head of design at Apple, can be seen, among other things, as an opportunity to increase the AI giant's taste quotient.

Taste is neither algorithmic nor accidental. It's cultivated. AI can now write passable essays, design logos, compose music, and even offer strategic business advice. It does so by mimicking the styles it has seen, fed to it in massive—and frequently unknown or obscured—data sets. It has the power to remix elements and bring about plausible and even creative new combinations. But for all its capabilities, AI has no taste. It cannot originate style with intentionality. It cannot understand why one choice might have emotional resonance while another falls flat. It cannot feel the way in which one version of a speech will move an audience to tears—or laughter—because it lacks lived experience, cultural intuition, and the ineffable sense of what is just right. This is not a technical shortcoming. It is a structural one.
Taste is born of human discretion—of growing up in particular places, being exposed to particular cultural references, developing a point of view that is inseparable from personality. In other words, taste is the human fingerprint on decision making. It is deeply personal and profoundly social.

That's precisely what makes taste so important right now. As AI takes over more of the mechanical and even intellectual labor of work—coding, writing, diagnosing, analyzing—we are entering a world in which AI-generated outputs, and the choices that come with them, are proliferating across, perhaps even flooding, a range of industries. Every product could have a dozen AI-generated versions for teams to consider. Every strategic plan, numerous different paths. Every pitch deck, several visual styles. Generative AI is an effective tool for inspiration—until that inspiration becomes overwhelming. When every option is instantly available, when every variation is possible, the person who knows which one to choose becomes even more valuable.

This ability matters for a number of reasons. For leaders or aspiring leaders of any type, taste is a competitive advantage, even an existential necessity—a skill they need to take seriously and think seriously about refining. But it's also in everyone's interest, even people who are not at the top of the decision tree, for leaders to be able to make the right choices in the AI era. Taste, after all, has an ethical dimension. We speak of things as being 'in good taste' or 'in poor taste.' These are not just aesthetic judgments; they are moral ones. They signal an awareness of context, appropriateness, and respect. Without human scrutiny, AI can amplify biases and exacerbate the world's problems. Countless examples already exist: Consider a recent experimental AI shopping tool released by Google that, as reported by The Atlantic, can easily be manipulated to produce erotic images of celebrities and minors. Good taste recognizes the difference between what is edgy and what is offensive, between what is novel and what is merely loud. It demands integrity.

Like any skill, taste can be developed. The first step is exposure. You have to see, hear, and feel a wide range of options to understand what excellence looks like. Read great literature. Listen to great speeches. Visit great buildings. Eat great food. Pay attention to the details: the pacing of a paragraph, the curve of a chair, the color grading of a film. Taste starts with noticing.

The second step is curation. You have to begin to discriminate. What do you admire? What do you return to? What feels overdesigned, and what feels just right? Make choices about your preferences—and, more important, understand why you prefer them. Ask yourself what values those preferences express. Minimalism? Opulence? Precision? Warmth?

The third step is reflection. Taste is not static. As you evolve, so will your sensibilities. Keep track of how your preferences change. Revisit things you once loved. Reconsider things you once dismissed. This is how taste matures—from reaction to reflection, from preference to philosophy.

Taste needs to be considered in both education and leadership development. It shouldn't be left to chance or confined to the arts. Business schools, for example, could do more to expose students to beautiful products, elegant strategies, and compelling narratives. Leadership programs could train aspiring executives in the discernment of tone, timing, and presentation.
Case studies, after all, are about not just good decisions, but how those decisions were expressed, when they went into action, and why they resonated. Taste can be taught, if we're willing to make space for it.