iPhone users can now capture DSLR-like photos with Adobe's new camera app

Hindustan Times

Adobe has released Project Indigo, a free experimental camera app that brings computational photography to iPhones. The app was created by Adobe's Nextcam team, which includes Marc Levoy and Florian Kainz, both known for their work on the Pixel Camera's computational photography features. The app is a work in progress and is available to download on the App Store.
Project Indigo combines computational imaging techniques with traditional camera controls. Unlike a stock smartphone camera app that captures a single photo, Indigo captures up to 32 underexposed frames per photo. It then aligns and merges these frames to produce images with significantly lower noise, higher dynamic range and a more natural look.
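To illustrate the general idea rather than Adobe's actual pipeline, here is a minimal Python sketch of why merging a burst helps: averaging N aligned frames reduces random sensor noise by roughly 1/sqrt(N). The 32-frame burst, the noise level and the static-scene assumption (so no alignment step is needed) are illustrative only.

```python
import numpy as np

def merge_burst(frames):
    """Average a burst of already-aligned frames; noise drops by ~1/sqrt(N)."""
    # Real pipelines first align each frame to a reference to handle hand
    # shake and subject motion; this sketch assumes a perfectly static scene.
    return np.mean(np.stack(frames), axis=0)

# Simulate a 32-frame underexposed burst of a flat grey scene with sensor noise.
rng = np.random.default_rng(0)
scene = np.full((480, 640), 0.2)
burst = [scene + rng.normal(0.0, 0.05, scene.shape) for _ in range(32)]

print("single-frame noise:", round(float(np.std(burst[0] - scene)), 4))
print("merged noise:      ", round(float(np.std(merge_burst(burst) - scene)), 4))
```

Running this, the merged result shows roughly a fivefold to sixfold reduction in noise over any single frame, which is the same principle that lets the app shoot underexposed frames and still deliver clean shadows.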
To maintain a natural aesthetic, the app uses subtle, globally tuned image processing rather than aggressive enhancements. It supports both JPEG and RAW formats, giving photographers the flexibility to edit photos extensively while retaining the low noise and improved dynamic range.
The Project Indigo app is available on the Apple App Store and is compatible with iPhone Pro and Pro Max models starting from the iPhone 12 series; for non-Pro models, it supports the iPhone 14 and onward. For the best experience, Adobe recommends the iPhone 15 Pro or a newer model due to the app's heavy processing requirements.
The app offers a simple camera interface with Photo and Night modes and full manual controls, including shutter speed, ISO, white balance and focus, similar to a professional DSLR. It also integrates seamlessly with the Lightroom mobile app, allowing users to export images directly to Lightroom to adjust colours and tone. An early access setting lets users run Project Indigo as the camera inside the Lightroom app.
Project Indigo is available only on iOS devices and is free to use with no sign-up needed. Adobe plans to bring the app to Android in the future. The roadmap also includes more photography modes, such as portrait, panorama, video and advanced exposure, and Adobe plans to introduce tone presets and looks to give users more creative control.


Related Articles

Adobe launches Project Indigo: A next-gen camera app for iPhone with AI and computational photography

Mint

Adobe has launched a new experimental camera application for iPhone users, Project Indigo. This expands Adobe Labs' suite of mobile tools following the recent arrivals of Photoshop and Firefly on the App Store. The new app harnesses artificial intelligence and advanced computational photography to deliver images with greater depth, detail, and realism. Currently available as a free download, Project Indigo offers a refined photography experience aimed at addressing the limitations of traditional smartphone imaging.

Adobe says the app is designed to move away from the typical "smartphone look", characterised by overly bright images, excessive smoothing, and exaggerated colour saturation, which can appear unnatural when viewed on larger displays. Unlike the default camera apps on most phones, Project Indigo prioritises image fidelity by using sophisticated algorithms to capture up to 32 individual frames per shot. These are then merged to produce a single image with improved dynamic range, fewer blown-out highlights, and significantly reduced noise, especially in shadowed areas.

The app offers extensive manual controls, including settings for aperture, shutter speed, ISO, focus, and white balance, with additional tweaks for temperature and tint. Users can choose between two modes: Photo for regular daytime shots and Night, which leverages longer exposure and enhanced stabilisation to capture clearer images in low light with less motion blur.

A standout feature of Project Indigo is its use of multi-frame super resolution. This function combats the quality loss typically associated with digital zoom by stacking multiple frames of the same scene, resulting in sharper, more detailed 'super resolution' images, particularly useful when zooming in on distant subjects.

Project Indigo stores photos in both standard dynamic range (SDR) and high dynamic range (HDR), and the output is compatible with Adobe's own Camera Raw and Lightroom platforms. Adobe notes that its under-exposure technique in image capture allows for a more natural, DSLR-style output without heavy reliance on post-processing.

Available for iPhones starting from the iPhone 12 Pro series, and select non-Pro models from the iPhone 14 onwards, the app does not currently require user sign-in and remains completely free to use. Adobe also confirmed plans to release an Android version of Project Indigo at a later stage.
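For readers curious how stacking frames can recover detail when zooming, the toy Python sketch below shows one simple form of multi-frame super-resolution: frames offset by known sub-pixel shifts are scattered onto a finer grid and the overlapping samples averaged. The known shifts, nearest-neighbour placement and 2x scale factor are simplifying assumptions for illustration; Adobe has not published the details of its implementation.

```python
import numpy as np

def super_resolve(frames, shifts, scale=2):
    """Toy multi-frame super-resolution.

    frames: list of HxW arrays of the same scene.
    shifts: per-frame (dy, dx) sub-pixel offsets, in low-res pixels, relative
            to the first frame (assumed known here; real systems estimate them
            by aligning the frames).
    Each low-res sample is placed at its nearest cell on a `scale`-times finer
    grid, and samples landing in the same cell are averaged.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    weight = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        hy = np.clip(np.round((ys + dy) * scale).astype(int), 0, h * scale - 1)
        hx = np.clip(np.round((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (hy, hx), frame)
        np.add.at(weight, (hy, hx), 1.0)
    filled = weight > 0
    acc[filled] /= weight[filled]
    return acc

# Four frames shifted by half a pixel fill every cell of a 2x finer grid.
rng = np.random.default_rng(1)
frames = [rng.random((8, 8)) for _ in range(4)]
shifts = [(0.0, 0.0), (0.0, 0.5), (0.5, 0.0), (0.5, 0.5)]
print(super_resolve(frames, shifts).shape)  # (16, 16)
```

The point of the sketch is that the extra detail comes from real samples taken at slightly different positions across the burst, not from generative upscaling, which matches how the articles describe the feature.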

Apple Eyes Generative AI to Speed Up iPhone and Mac Chip Design, Confirms Hardware Chief

Hans India

Apple is preparing to bring generative AI into the heart of its chip development process, a move that could transform how the company designs processors for its iPhones, Macs, and other devices. Johny Srouji, Apple's Senior Vice President of Hardware Technologies, recently confirmed the company's growing interest in using AI tools to make its silicon design process faster and more efficient.

Speaking in Belgium while receiving an award from Imec, a renowned semiconductor research institute, Srouji revealed that Apple sees strong potential in applying AI-driven automation to speed up its chip development timeline. According to a report by Reuters, which reviewed a recording of the event, Srouji stated, 'Generative AI techniques have a high potential in getting more design work done in less time, and it can be a huge productivity boost.'

Srouji emphasized that while AI offers immense benefits, it must be complemented by robust design infrastructure. He pointed out the crucial support Apple receives from electronic design automation (EDA) partners such as Cadence Design Systems and Synopsys. 'EDA companies are super critical in supporting our chip design complexities,' he said.

Apple's exploration of generative AI for chipmaking reflects a broader industry trend, with tech rivals like Google and OpenAI investing heavily in artificial intelligence. Google, during its I/O 2025 developer event, showcased a range of AI innovations, while OpenAI continues to lead advancements in conversational AI through ChatGPT, further intensifying competition in the AI race. Though Apple has faced criticism for lagging in the consumer-facing AI space, especially after delays in rolling out promised Apple Intelligence features, its latest announcement indicates a shift toward strengthening its behind-the-scenes AI capabilities.

Apple has been developing its in-house chips since 2010, beginning with the A4 processor for the iPhone. Since then, the tech giant has expanded its custom silicon portfolio to include the A-series for mobile devices and the M-series for its Mac lineup. The transition from Intel processors to Apple-designed M-series chips was a bold move, but one that has paid off significantly. These chips have helped Apple achieve industry-leading performance, battery efficiency, and tighter hardware-software integration across its devices. 'Moving the Mac to Apple Silicon was a huge bet for us,' Srouji reflected. 'There was no backup plan, no split-the-lineup plan, so we went all in, including a monumental software effort.'

Now, with plans to incorporate AI into its chip design workflow, Apple is signaling a new chapter in its silicon strategy. While consumer-facing features may still be in development, the company is clearly investing in foundational technologies that could sharpen its competitive edge in the years ahead. As Apple continues to explore generative AI applications in hardware, the future of Apple Silicon may be even faster, more powerful, and more intelligently designed, bringing the company into closer competition with AI pioneers across the tech landscape.

Adobe launches free camera app for iPhone users, it is made by same team that made Google Pixel camera

India Today

If you've ever felt your iPhone photos looked a bit too bright, too smooth, or just too 'smartphone-y,' Adobe may have just created your new favourite camera app. Project Indigo, which is now available as a free download on the App Store, is a new camera app designed by Adobe Labs, and it's built by the same team that helped create the iconic Pixel camera at Google. This time, the goal is different: give iPhone users more manual control and a more realistic, DSLR-style photo experience. For now, Indigo is free to try and available only on iPhone.

Here's what iPhone users need to know. Smartphone cameras today heavily process your photos – they brighten the shadows, smooth your skin, sharpen edges, and boost colours to make things pop on a small screen. While this can make pictures look good at a glance, they often feel artificial, especially when viewed on a bigger display. Adobe says Indigo is designed to produce a more natural, true-to-life image, closer to what you'd get from a DSLR. It applies less smoothing and sharpening, and its colour enhancements are subtle. The app avoids the common 'HDR-ish' or overly edited style that's typical of most default camera apps.

Indigo offers full manual camera controls – including focus, shutter speed, ISO, and white balance. You can shoot in JPEG or raw (DNG), and even control how many frames are captured for each photo. This matters because Indigo uses computational photography to combine up to 32 images to reduce noise and preserve detail. There's also a Night mode that automatically suggests longer exposures in dark scenes, and even a Long Exposure setting to capture dreamy motion blur – perfect for waterfalls or city lights.

Adobe also promises that with the Indigo app, your zoomed-in pictures won't be blurry or noisy anymore. According to the Project Indigo blog post, when you pinch to zoom in the app, it uses a smart feature called multi-frame super-resolution that quietly captures several photos and blends them for sharper results. No AI guessing, just smarter shooting.

And, because Indigo is by Adobe, it also seamlessly integrates with Lightroom Mobile. When you review photos in Indigo's gallery, you can launch Lightroom with a single tap to start editing right away – whether it is a JPEG or a raw DNG file. If you're already using Adobe's editing tools, this makes your workflow smoother than ever.

Adobe says it is also working on a live preview system, where you will be able to see the final edited look of your photo right in the viewfinder before you take the shot. This could dramatically change how people compose photos on their phones.
