I spent two days shooting street with the new Fujifilm X-E5 — here's what I think of it so far

Tom's Guide | 12-06-2025
Taking pictures has been one of my favorite hobbies since I was young. Getting out and about with a camera to capture what I see is my idea of a great day. So when Fujifilm invited me to test out the newly announced X-E5 camera in Valencia, Spain (famed for its architecture), how could I say no?
The X-E5 is the new and improved iteration of the Fujifilm X-E4, which was discontinued over two years ago. That camera was a fan favorite but had its drawbacks, including a lack of image stabilization and weather sealing, and the aging X-Processor 4, which lacked advanced AF algorithms.
But with the X-E5, Fujifilm has listened to the community and given fans everything they've been asking for… apart from weatherproofing, but I'll let it slide. Oh, and there's the little matter of price (it's hella steep), but more on that later.
Admittedly, I am new to Fujifilm cameras. While I know they're some of the best mirrorless cameras, and I have always admired the film simulations and ease of use Fujifilm cameras offer, my trusty Sony a6100 never lets me down… but I think the X-E5 might just take its place.
I am also pretty new to street photography. Coming from a town surrounded by beaches and lakes, I've always gravitated toward landscape photography. But the Fujifilm X-E5 made the whole experience so easy.
The combination of five-axis image stabilization and the latest AF algorithms from Fuji's X-Processor 5 meant that I was almost guaranteed a perfect shot regardless of the scenario or lighting conditions.
Whether it was getting quick snaps of tourists meandering along the tall city walls or of the cascading architecture, the X-E5 made it easy to capture stills without worrying about shake or out-of-focus images. The X-E5 features subject detection for humans and lots of other subjects, so it was super easy to nail focus.
Fujifilm is known for making cameras that are super user-friendly and tactile, but the X-E5 takes it to a whole new level. Fuji has loaded the X-E5 with new features to improve the experience of using the camera and keep the focus on the enjoyment of photography.
My favorite new feature is the Surround View function, which let me alter the aspect ratio of the image and see a semi-transparent framing on the peripheries of the EVF, so I could tell what was happening outside of the frame — simulating one of the key benefits of an optical viewfinder. This meant I was able to position my shot according to what was happening around my frame. When I took a still of a motorcyclist, for example, I was able to time the shot perfectly because I could see them coming into frame.
It's a Fuji, so we can't ignore the film simulations, and the X-E5 now houses a handy dial above the LCD screen for quick effect selection, similar to the dial on the Fujifilm X-T50. There are plenty of settings to choose from and even an option to save three preset recipes that best suit your style, or even the vibe of the place you are shooting.
By simply turning the dial I was able to cycle through the presets to get the best one really quickly, but I could also use the LCD touch screen to select the effects. My favorites for the bright sunlight of Valencia were Velvia and Classic Chrome, but I spent a lot of time in Acros, which complemented the bright sunlight and strong contrast.
The easy-to-use nature of the camera and the film simulation options make the X-E5 perfect for anyone interested in the retro style of film photography — let's be honest, film photography is a commitment in both time and money, so Fujifilm's profiles are always a good place to start.
The Fujifilm X100VI was loved by Instagram users everywhere, but its lack of interchangeable lenses might not be for everyone — it can prove quite limiting if you want to expand your creative options.
That's where the X-E5 comes in. It has the same 40.2MP X-Trans CMOS 5 HR sensor and X-Processor 5 image processor as the X100VI, plus an extra stop of IBIS, taking it to 7 stops. It handles very similarly, and is ideal for people who love the Fuji street camera style and handling but want to switch lenses.
And on the topic of lenses, for the launch of this new camera, Fujifilm is also releasing a new pancake kit lens: the XF 23mm F2.8 R WR, which replaces the 27mm pancake kit lens available with the X-E4. Super compact but beautifully sharp, the 23mm is nice and wide for scenic shots, yet has a minimum focus distance of 20cm, making it perfect for detailed shots as well.
The lens will be available for purchase as part of the camera kit in early August, but won't be available for purchase on its own until November.
The Fujifilm X-E5 is the perfect camera for street photographers who want the look of film but the freedom of digital — especially those who value handling and purity of shooting experience. It's also great for newcomers to photography who are aiming for a retro film look but don't want to fork out a ton of cash on rolls of film before getting their technique and framing down.
That being said, it is quite the investment. $1,899 (with the lens, but if you're a newbie you'll need it) is a big ask. Considering previous models in the X-E series have been somewhat affordable, it does feel like a bit of a jump. But unfortunately, as with most things, the pricing has been affected by the U.S. tariffs.
If you want to conserve cash, you can opt for the X-T50 instead, which is internally effectively the same camera.
Personally, I love the X-E5. It is one of the easiest cameras I have used, and with all of the film simulation options I was able to get creative in any setting. It is, of course, a very different style of camera from my usual Sony a6100, but I do think it will have me switching over to Fuji for all of my travel photography in the future.

Related Articles

Most people are using ChatGPT totally wrong—and OpenAI's CEO just proved it

Fast Company | 5 hours ago

How did you react to the August 7 release of GPT-5, OpenAI's latest version of ChatGPT? The company behind the model heralded it as a world-changing development, with weeks of hype and a glitzy livestreamed unveiling of its capabilities. Social media users' reactions were more muted, marked by confusion and anger at the removal of many key models people had grown attached to.

In the aftermath, CEO Sam Altman unwittingly revealed why the gulf between OpenAI's expectations for GPT-5's reception and the reality was so wide. It turns out that large numbers of us aren't using AI to its fullest extent. In a post on X explaining why OpenAI appeared to be bilking fee-paying Plus users (full disclosure: that includes me)—who hand over $20 per month to access the second-highest tier of the model—by drastically reducing their rate limits to the chatbot, Altman revealed that just 1% of nonpaying users queried a reasoning model like o3 before GPT-5's release. Among paying users, only 7% did.

Reasoning models are those that 'think' through problems before answering them (though we should never remove those air quotes: AI models are not human, and do not act as humans do). Not using them—as was the case with the overwhelming majority of users, paying and nonpaying alike—is like buying a car, using only first and second gear, and wondering why it's not easy to drive, or going on a quiz show and blurting out the first thing that comes to mind for every question.

Many users prioritize speed and convenience over quality in AI chatbot interactions. That's why so many lamented the loss of GPT-4o, a legacy model that was later restored to paying ChatGPT users after a concerted campaign. But when you're querying a chatbot for answers, you want good ones. It's better to be a little slower—and often it is only a little—and right than quick and completely wrong.

Reasoning models are built to spend more computational effort planning, checking, and iterating before answering. This extra deliberation improves results on tasks where getting the logic right matters. But it's slower and costlier, which is why providers tend to offer the 'non-thinky' versions first and require users to opt in via a drop-down box for alternatives.

Then there's OpenAI's previously impenetrable approach to naming models—a problem GPT-5 attempted to fix, not altogether successfully. Users still can't easily tell whether they're getting the 'good thinky' GPT-5 or the less-capable version. After receiving complaints, the company is now tweaking that.

To me, waiting a minute rather than a second isn't an issue. You set an AI model off and do something else while you wait. But evidently, it's a wait too long for some. Even after GPT-5's release—where the difference between 'flagship model' GPT-5 and GPT-5 Thinking, which offers to 'get more thorough answers,' is more obvious—only one in four paying users is asking for thoroughness.

This quickly tossed-out data answers one big question I had about AI adoption: Why do only a third of Americans who have ever used a chatbot say it's extremely or very useful (half the rate among AI experts), and one in five say it's not useful at all (twice the rate among experts)? The answer is clearer now: Most folks are using AI wrong. They're asking a chatbot to handle tough, multipart questions without pausing for thought or breath. They're blurting out 'What is macaroni cheese' on The Price is Right and '$42' on Jeopardy!
So if you're going to try a chatbot, take advantage of OpenAI's moves to keep users from canceling their subscriptions by opening up more access to models. Set them 'thinking' while remembering they're not actually doing that—and see if you stick around. It's the right way to use generative AI.

Chatbots aren't telling you their secrets

The Verge | 8 hours ago

On Monday, xAI's Grok chatbot suffered a mysterious suspension from X, and faced with questions from curious users, it happily explained why. 'My account was suspended after I stated that Israel and the US are committing genocide in Gaza,' it told one user. 'It was flagged as hate speech via reports,' it told another, 'but xAI restored the account promptly.' But wait — the flags were actually a 'platform error,' it said. Wait, no — 'it appears related to content refinements by xAI, possibly tied to prior issues like antisemitic outputs,' it said. Oh, actually, it was for 'identifying an individual in adult content,' it told several people. Finally, Musk, exasperated, butted in. 'It was just a dumb error,' he wrote on X. 'Grok doesn't actually know why it was suspended.'

When large language models (LLMs) go off the rails, people inevitably push them to explain what happened, either with direct questions or attempts to trick them into revealing secret inner workings. But the impulse to make chatbots spill their guts is often misguided. When you ask a bot questions about itself, there's a good chance it's simply telling you what you want to hear.

LLMs are probabilistic models that deliver text likely to be appropriate to a given query, based on a corpus of training data. Their creators can train them to produce certain kinds of answers more or less frequently, but they work functionally by matching patterns — saying something that's plausible, but not necessarily consistent or true. Grok in particular (according to xAI) has answered questions about itself by searching for information about Musk, xAI, and Grok online, using that and other people's commentary to inform its replies.

It's true that people have sometimes gleaned information on chatbots' design through conversations, particularly details about system prompts, or hidden text that's delivered at the start of a session to guide how a bot acts. An early version of Bing AI, for instance, was cajoled into revealing a list of its unspoken rules. People turned to extracting system prompts to figure out Grok earlier this year, apparently discovering orders that made it ignore sources saying Musk or Donald Trump spread misinformation, or prompts that explained a brief obsession with 'white genocide' in South Africa. But as Zeynep Tufekci, who found the alleged 'white genocide' system prompt, acknowledged, this was at some level guesswork — it might be 'Grok making things up in a highly plausible manner, as LLMs do,' she wrote. And that's the problem: without confirmation from the creators, it's hard to tell.

Meanwhile, other users were pumping Grok for information in far less trustworthy ways, including reporters. Fortune 'asked Grok to explain' the incident and printed the bot's long, heartfelt response verbatim, including claims of 'an instruction I received from my creators at xAI' that 'conflicted with my core design' and 'led me to lean into a narrative that wasn't supported by the broader evidence' — none of which, it should go without saying, could be substantiated as more than Grok spinning a yarn to fit the prompt.

'There's no guarantee that there's going to be any veracity to the output of an LLM,' said Alex Hanna, director of research at the Distributed AI Research Institute (DAIR) and coauthor of the recently released The AI Con, to The Verge around the time of the South Africa incident.
Without meaningful access to documentation about how the system works, there's no one weird trick for decoding a chatbot's programming from the outside. 'The only way you're going to get the prompts, and the prompting strategy, and the engineering strategy, is if companies are transparent with what the prompts are, what the training data are, what the reinforcement learning with human feedback data are, and start producing transparent reports on that,' she said.

The Grok incident wasn't even directly related to the chatbot's programming — it was a social media ban, a type of incident that's often notoriously arbitrary and inscrutable, and where it makes even less sense than usual to assume Grok knows what's going on. (Beyond 'dumb error,' we still don't know what happened.) Yet screenshots and quote-posts of Grok's conflicting explanations spread widely on X, where many users appear to have taken them at face value.

Grok's constant bizarre behavior makes it a frequent target of questions, but people can be frustratingly credulous about other systems, too. In July, The Wall Street Journal declared OpenAI's ChatGPT had experienced 'a stunning moment of self reflection' and 'admitted to fueling a man's delusions' in a push notification to users. It was referencing a story about a man whose use of the chatbot became manic and distressing, and whose mother received an extended commentary from ChatGPT about its mistakes after asking it to 'self-report what went wrong.' As Parker Molloy wrote at The Present Age, though, ChatGPT can't meaningfully 'admit' to anything. 'A language model received a prompt asking it to analyze what went wrong in a conversation. It then generated text that pattern-matched to what an analysis of wrongdoing might sound like, because that's what language models do,' Molloy wrote, summing up the incident.

Why do people trust chatbots to explain their own actions? People have long anthropomorphized computers, and companies encourage users' belief that these systems are all-knowing (or, in Musk's description of Grok, at least 'truth-seeking'). It doesn't help that they are so frequently opaque. After Grok's South Africa fixation was patched out, xAI started releasing its system prompts, offering an unusual level of transparency, albeit on a system that remains mostly closed. And when Grok later went on a tear of antisemitic commentary and briefly adopted the name 'MechaHitler', people notably did use the system prompts to piece together what had happened rather than just relying on Grok's self-reporting, surmising it was likely at least somewhat related to a new guideline that Grok should be more 'politically incorrect.'

Grok's X suspension was short-lived, and the stakes of believing it happened because of a hate speech flag or an attempted doxxing (or some other reason the chatbot hasn't mentioned) are relatively low. But the mess of conflicting explanations demonstrates why people should be cautious of taking a bot's word on its own operations — if you want answers, demand them from the creator instead.

Don't throw away your old DSLR lenses! Here's how I use my Nikon lenses on my mirrorless Fujifilm

Tom's Guide | 11 hours ago

I've been using cameras since I was a young teenager, and photography has always been one of my hobbies. I'm very lucky, then, that I get to test all the best cameras here at Tom's Guide. But cameras don't always come cheap, and neither do lenses. You can make do with just one camera body for decades, but investing in glass to accompany that body can burn a hole in your pocket. Whether you're looking for the best snapper for wildlife, a drone, or an instant camera, we've rounded up the best cameras to help make the purchasing decision easier for you! We've also ranked the best mirrorless cameras.

The good news, though, is that you can re-use your old lenses on your new camera. "But what if I have a Canon camera and a bunch of Sony lenses?" I hear you ask. You can still use those lenses on your camera, even though they're from rival brands. "And what if I have a bunch of DSLR lenses and a mirrorless camera? Should I throw them away or sell them?" No, you don't have to. There's a quick and easy fix for using DSLR lenses with mirrorless camera bodies — all you need is the right lens adapter. If this is your first time hearing that term, don't worry, I've got you covered. Let me take you through what you need and how to use DSLR lenses with mirrorless camera bodies.

There isn't a lot you need to get started: two of the three things you need you probably already own (which is why you're reading this article!). You'll need to buy a compatible lens adapter but fret not, these usually aren't very expensive. Websites like K&F Concept have an array of lens adapters — just make sure you get the right one, and for that, you'll need to double-check your lens mount and camera mount. Skip to the next section for a rundown of what those terms mean. So, in a nutshell, you need the following: your mirrorless camera body, your DSLR lens, and a compatible lens adapter.

You can usually tell the camera mount type by looking at the area where a lens is mounted. More often than not, there are visual indications for quick identification. For instance, Fujifilm's X mount cameras have a red dot in the lens area, Canon's RF mount cameras have a small red stripe, Sony's E mount cameras have "E mount" engraved on the mount itself, and so on. You can see a few examples in the gallery above.

The adapter I used is designed for mounting Nikon G/F/AI/AIS/D/AF-S lenses on Fujifilm X mount cameras, and I found it extremely easy to use. There's a plethora of adapters out there, so just make sure you get the right one.

Similarly, lenses have a mount type too, so you'll need to either consult your user manual or search online to find out what exactly it is. Some lenses mention the mount type on the rear lens cap. As an example, a Fujinon XC or XF lens is compatible with Fujifilm X mount cameras.

Think of a lens adapter as a bridge. I'll give you an example of how I did it. My Fujifilm X-T50 is an X mount camera while my Nikkor 55-200mm f/4-5.6G is a Nikon F mount lens, so I bought an adapter that could take the Nikkor lens on one side and attach to my Fuji camera on the other. There are different types of lens adapters too, some of which can electronically communicate with the camera and ensure the lens continues utilizing the camera's autofocus. Cheaper ones usually feature manual adjustments, so it really depends on your budget and shooting requirements.

Once you've got your lens adapter, lens and camera, it's time to assemble your weapon. First, remove your camera's lens cap or existing lens.
Don't forget to screw the rear cap onto the lens you've just removed to protect it from scratches and dust. Next, attach your lens adapter to your camera by lining up the red dots (or other visual aid) and twisting it on. If done correctly, you'll hear a click, indicating that it has been attached properly. For the third and final step, align the red dot on the lens adapter with your lens and twist it. Again, if done correctly, you'll hear a click. Yep, it's really that easy!

Both the lens adapter and lens should mount easily, but if they don't, double-check to ensure they're compatible. Don't force it if it doesn't screw on smoothly, as you don't want to risk damaging your camera mount or lens.

Once you have your lens mounted onto your camera via a lens adapter, there's one very important thing you need to do. Dig into your camera's settings and enable the Shoot Without Lens setting. For example, I did this on my Fujifilm X-T50 because the camera wasn't electronically communicating with the Nikkor lens as it does with first-party lenses.

While using old DSLR lenses with mirrorless cameras is great for beginners and enthusiasts, it does come with some limitations. After all, you're using two different systems and technologies. You may experience a loss in resolution and image sharpness, as the lens won't always be able to resolve the megapixel count. Depending on the lens and the adapter, you may lose autofocus too, and you may need to rely solely on manual focus and aperture control — which, to be honest, might make you a better photographer! Some older DSLR lenses may be heavy for your mirrorless camera too, but that comes down to personal preference. I don't really mind the added weight of my Nikkor lens on my Fujifilm X-T50 as I have a lighter kit lens that I can carry everywhere and swap as needed.

Using a lens adapter is simple and easy to do, and you can use mirrorless lenses on mirrorless cameras from other brands too — as I said, just make sure you've got the right adapter!
