
Latest news with #VisualIntelligence

Penlink Introduces New Media Intelligence Platform Powered by Generative AI

Associated Press

28-05-2025

  • Business
  • Associated Press


Washington, DC, May 28, 2025 (GLOBE NEWSWIRE) -- Penlink, the leading authority in AI-powered digital intelligence, proudly announces the launch of VIA, its next-generation Visual Intelligence Platform. This marks a major advancement in the company's suite of CoAnalyst GenAI solutions, designed to deliver intuitive access and powerful analysis of large-scale digital media datasets. VIA provides analysts and investigators with unprecedented access to media intelligence by transforming images and videos into actionable insights. The platform enables users to seamlessly search and analyze digital media such as open-source intelligence (OSINT), evidentiary, or forensic data. Investigators can now detect relevant images and frames through contextual and natural language queries, extract geospatial intelligence, identify objects and relationships, interpret scene context, verify media sources, and trace how visual content spreads across networks.

'VIA represents a major step forward in making unstructured visual data searchable, analyzable, and trustworthy,' said Shay Attias, Chief Technology Officer at Penlink. 'We've engineered it as a multimodal system, leveraging the latest generative AI models to extract and correlate insights across visual, textual, and spatial domains, unlocking layers of intelligence that were previously inaccessible.'

Penlink has long focused on bridging the gap between overwhelming data and operational intelligence. With VIA, investigators no longer need to sift through irrelevant media - they can focus directly on the content that matters, gaining instant insights into time, place, and context. The platform extends Penlink's mission into the visual domain, offering accelerated workflows for investigations, threat detection, and uncovering new layers of intelligence.

'Penlink is committed to building solutions that empower our law enforcement, defense, intelligence, and enterprise partners,' said Peter Weber, CEO of Penlink. 'With VIA, we are unlocking the potential of visual data at scale - bringing clarity to complex investigations and accelerating the path from raw media to real understanding and better conclusions.'

VIA is now available to selected partners and will be showcased during upcoming innovation briefings and public safety summits. For more information, visit Penlink's website.

About Penlink

Penlink is the leading provider of digital intelligence solutions, integrating open-source intelligence and digital evidence for law enforcement, national security, defense, and enterprise sectors. Leveraging advanced AI technologies, Penlink enables end-to-end digital investigations and threat monitoring. Its comprehensive data intelligence platform accelerates the identification of leads and critical connections in complex investigations. Headquartered in the U.S. with global operations, Penlink is proud to support organizations worldwide with solutions that enhance safety, security, and operational impact.

A.J. Guenther, Yes&, [email protected]

iPhone Visual Intelligence: What is it and how to use

Hindustan Times

28-04-2025

  • Hindustan Times


The Visual Intelligence feature brings powerful AI tools to iPhones, allowing users to interact with the world around them using their phone's camera. Integrated with iOS 18 and later versions, it lets users gather information about their surroundings, from identifying plants and animals to learning more about businesses and locations. However, Visual Intelligence is only available on specific devices. Users must have iOS 18.2 or later installed on iPhone 16 models, iOS 18.3 on the iPhone 16e, or iOS 18.4 on iPhone 15 Pro models. Additionally, Apple Intelligence needs to be enabled through the Settings app under Apple Intelligence & Siri. To start using Visual Intelligence, follow the steps for your iPhone model. Visual Intelligence offers several ways to gather and act on information. To exit Visual Intelligence at any time, simply swipe up from the bottom of the screen. Explore and experiment with Visual Intelligence to unlock its full potential and enhance your everyday tasks.

I've been testing iOS 18.4 — try these 5 features first after you upgrade

Yahoo

03-04-2025

  • Yahoo


When you buy through links on our articles, Future and its syndication partners may earn a commission.

iOS 18.4 is out of beta and available as a full release for anyone to download on their iPhone. Though at this point, the update seems to be getting more attention for what's not included than the features that actually are there.

This update was expected to be the last significant one ahead of this summer's iOS 19 preview. As a result, we were looking for iOS 18.4 to deliver some promised Apple Intelligence features that had yet to materialize during the rollout of Apple's AI tools. Specifically, iOS 18.4 was supposed to bring new capabilities to Siri that made the assistant more aware of context and capable of interacting with apps on your phone.

That's not happening, nor will it occur any time soon. Earlier this month, Apple confirmed that the Siri revamp would be delayed, with some people speculating that we may not see the features promised to us last year until 2026. "It's going to take us longer than we thought to deliver on these features and we anticipate rolling them out in the coming year," an Apple spokesperson conceded in a statement to Daring Fireball.

While Apple's AI struggles are certainly disappointing, it would be a shame if they were to overshadow the enhancements that Apple has included with iOS 18.4. While not as significant as a Siri overhaul, the features included in iOS 18.4 do bring some new capabilities and quality-of-life improvements to the iPhone. Even better, with a couple of noteworthy exceptions, the big changes aren't tied to Apple Intelligence. That means anyone with a compatible iPhone — i.e., an iPhone XR, iPhone XS/XS Max or later — can enjoy the benefits that iOS 18.4 delivers.

I've been trying out iOS 18.4 across multiple iPhones since the initial betas came out. And I've found a few features that are definitely worth trying out if you're only now upgrading to the latest version of Apple's iPhone software.
Let's start with an Apple Intelligence feature, if only because this is one of the better instances of Apple's AI efforts. Visual Intelligence lets you use the camera on your iPhone as a search tool. You can point your iPhone camera at signs in another language to get a translation, or you can use it to capture times and dates of upcoming events you see on a flyer. And yes, you can turn to Apple Intelligence to look up information about what your camera captures, similar to Google Lens.

Previously, that functionality was limited to iPhone 16 models, largely because they came with a Camera Control button that had been necessary for accessing Visual Intelligence. iOS 18.4 adds a shortcut to launch Visual Intelligence that you can tie to your phone's Action button, making the feature available to the iPhone 15 Pro and iPhone 15 Pro Max. Am I flagging this up because I have an iPhone 15 Pro on hand, and this immediately makes my own phone more useful? I will not pretend otherwise. But the occasional hiccup aside, Visual Intelligence performs well, and it's nice to see feature parity restored to all iPhones capable of supporting Apple Intelligence.

An iOS 18.4 addition that everyone can enjoy — and one that's probably my favorite new feature in the update — is the arrival of ambient music in Apple's Control Center. If music playing in the background helps you concentrate on work or wind down before bed, you should definitely check this out. Just swipe down from the upper-right corner of your iPhone screen to access the Control Center, and then press and hold on the screen to edit it. If you tap Add a Control, you'll find four new ambient music options as you scroll through the list of various Control Center shortcuts. You can even add an ambient music control to your phone's lock screen for quicker access.
If I have a criticism about the feature, it's that I wish there were a way to toggle between various ambient music modes like Productivity and Chill without having to dedicate different spots on the Control Center to those specific shortcuts. But still, as a person who needs music in the background to focus sometimes, I find the addition of ambient music to be a welcome one.

Apple includes some minor changes in each iOS update that aren't exactly headline-grabbers, but still make the iPhone an easier device to use. iOS 18.4 includes a couple of these changes that are worth noting. When using Safari, you may notice that a list of your recent searches now appears when you open a new tab and tap the search bar. Maybe that doesn't bug you, but it's not the most secure way to go through life. The shipping version of iOS 18.4 adds a toggle in Settings to hide your recent Safari searches from anyone who happens to glance at your iPhone screen at an inopportune moment.

The other minor change in iOS 18.4 that's caught my eye is the ability to pause app downloads rather than stopping them completely. That way, you don't lose any progress when you resume downloading an app once your network connection is stronger or you're back on Wi-Fi — or really whatever reason you have for mashing that pause button.

Switching back to Apple Intelligence-specific improvements, iOS 18.4 sees the arrival of priority notifications. These are alerts that are deemed to be time-sensitive by the on-board AI within your phone, which then floats them to the top of your notification stack. Here's an example: My wife texted me in Messages about an upcoming dentist appointment. The priority notifications feature figures that's a more pressing concern than the goofy meme my daughter texted to me, so my wife's alert appears first, regardless of the order in which those texts were sent.
When I was testing the iOS 18.4 beta, I didn't always appreciate how another Apple Intelligence feature — notification summaries — would truncate the texts into an alert I couldn't easily parse. But as someone who's not always paying attention to messages as they come into my iPhone, I do like the fact that the most important ones will be displayed more prominently. To take advantage of priority notifications, you've got to go into the Notifications menu of the Settings app and turn the feature on. If you've got a phone that supports Apple Intelligence, I suggest that you do, if only to see if the feature fits into your workflow.

Apple Intelligence has been rolling out slowly to different parts of the world, debuting in the U.S. last October before arriving in the U.K., Australia, Canada and New Zealand late last year. iOS 18.4 lets the EU join the party, and there's support for more languages. Specifically, iOS 18.4 now supports simplified Chinese, German, French, Italian, Japanese, Korean, Brazilian Portuguese and Spanish. Localized English is also available in Singapore and India. If you speak those languages or live in those countries and have an Apple Intelligence-capable iPhone, that's a big deal. And it's an even bigger deal for Apple, which has said that its new iPhones sell better in places where Apple Intelligence is available.

Those aren't the only iOS 18.4 additions. Apple has added other things that I either haven't had a chance to try or that I'm a bit more dubious about. But I'll highlight them here in case they're capabilities that matter to you. I am not now, nor have I ever been, someone who cares about emojis, though I recognize that puts me in a very small minority. If you're someone who does find emojis to be the ultimate form of human expression, you've got eight new ones courtesy of iOS 18.4. Study them well, and use them as you will.
On the Apple Intelligence front, the Image Playground app adds a third drawing style — Sketch joins Animation and Illustration. That certainly starts to tackle one complaint I've had about Apple's image generation feature — not enough styles. But my overall observation that images generated in Image Playground really don't have practical uses remains.

One iOS 18.4 addition that I haven't tried but would like to centers on the addition of food-related content to the News Plus subscription service. Specifically, Apple is bringing recipes to its News app. Most of the "tens of thousands" of recipes Apple is introducing will be for News Plus subscribers, but a few will be available to everyone — think of it as your free taste. I'm a bit of a foodie myself, and I'm always looking for new ideas of what to whip up for dinner. So that might just tempt me to part with the $12.99/month that Apple charges for News Plus. That service's growth beyond just a collection of magazine articles and news stories to include games and now recipes is one of the more interesting developments among Apple's various subscription packages.

iOS 18.4 adds a crucial Apple Intelligence feature to the iPhone 15 Pro — and it makes your phone more powerful

Yahoo

11-03-2025

  • Yahoo


When you buy through links on our articles, Future and its syndication partners may earn a commission.

I have an iPhone 15 Pro in my possession, which means that I also have the ability to access most Apple Intelligence tools after Apple launched its suite of AI features with the iOS 18.1 release last fall. Most tools, but not all.

When Apple launched the iPhone 16 lineup last year, it also announced a new feature called Visual Intelligence. With the Camera Control button on those iPhone 16 models, you could summon Apple's answer to Google Lens and get more information and even a handful of actionable commands based on whatever it was you were pointing your phone's camera at. Though the iPhone 15 Pro and iPhone 15 Pro Max have enough RAM and processing power to run Apple Intelligence features, Visual Intelligence has not been one of them. That means someone's $799 iPhone 16 could do something the phone you paid at least $999 for couldn't. And when the $599 iPhone 16e debuted last month, we learned that it, too, could access Visual Intelligence while iPhone 15 Pro and Pro Max owners remained shut out. Why, the very idea!

That's changed, though, with the arrival of the second public beta of iOS 18.4. If you're trying out that beta on an iPhone 15 Pro, you've now gained the ability to run Visual Intelligence. And while that's not necessarily a game-changing addition, it does give your older iPhone new powers it didn't have previously. And some of those powers are proving to be quite useful.

Here's a quick rundown of how iPhone 15 Pro and iPhone 15 Pro Max users can set up their devices to take advantage of Visual Intelligence, along with a reminder of just what you can use that AI-powered feature to do. If you want to use Visual Intelligence on your iPhone 15 Pro, you'll need to find a way to launch the feature since only iPhone 16 models come with a Camera Control button. Fortunately, you've got two other options, thanks to the iOS 18.4 update.
iOS 18.4 adds Visual Intelligence as an option for the Action button, so you can use that button on the left side of your iPhone to trigger Visual Intelligence. Here's how:

  • Launch the Settings app and, on the main screen, tap Action Button.
  • You'll see a list of possible shortcuts to trigger with the Action button. Swipe through until you see Visual Intelligence.
  • Tap Settings to exit. From that point, whenever you press and hold the Action button, it will launch Visual Intelligence.

If you don't want to tie up your Action button with Visual Intelligence, you can also use iOS 18's customizable lock screen shortcuts to add a Visual Intelligence control. iOS 18.4 adds a Visual Intelligence control that you can place at the bottom of your lock screen. To add that control, edit your lock screen and select the control you want to customize. Select Visual Intelligence from the available options — you'll find it under Apple Intelligence & Siri controls, though you can also use the Search bar at the top of the screen to track down the control. Tap the icon to add it to your lock screen.

So your iPhone 15 Pro is now set up to launch Visual Intelligence, either from the Action button or a lock screen shortcut. What can you do with this feature? Essentially, Visual Intelligence turns the iPhone's camera into a search tool. We have step-by-step instructions on how to use Visual Intelligence, but if your experience is like mine, you'll find things very intuitive. Once Visual Intelligence launches, point your camera at the thing you want to look up — it could be a business's sign, a poster or just about anything. The iOS 18.3 update that arrived last month added the ability to identify plants and animals, for example. The information that appears on your screen varies depending on what you point at.
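For developers curious how actions end up in that Action button picker at all: Apple exposes it through the App Intents framework, and any app-defined intent shows up alongside system actions. As a rough, hypothetical sketch (the intent name and behavior here are invented for illustration; Visual Intelligence itself is a built-in system action you select, not something you implement):

```swift
import AppIntents

// Hypothetical example: an app-defined action that, once compiled into an app,
// appears in the Shortcuts catalog and can be assigned to the Action button.
struct OpenScannerIntent: AppIntent {
    // Name shown in the Action button / Shortcuts picker.
    static var title: LocalizedStringResource = "Open Scanner"

    // Bring the app to the foreground when the intent runs.
    static var openAppWhenRun: Bool = true

    func perform() async throws -> some IntentResult {
        // In a real app, you would navigate to the camera/scanner screen here.
        return .result()
    }
}
```

From the user's side, assigning any such action — system or third-party — works the same way described above: Settings, then Action Button, then pick it from the list.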
A restaurant facade might produce the hours the place is open, while you can also collect phone numbers and URLs by capturing them with Visual Intelligence. I captured a poster of an upcoming event with my iPhone 15 Pro, and Visual Intelligence gave me the option of creating a calendar event with the date and time already filled in. It would be nice if the location were copied over, too, since that information was also on the poster, but we'll chalk that up to this being early days for Visual Intelligence. Visual Intelligence can also get flummoxed in situations like these. When I tried to add a specific soccer match to my calendar from a schedule listing multiple dates, Visual Intelligence got confused as to which one to pick. (It seems to default to the date at the top of the list.) Having to edit incorrect data defeats most of the purpose of this particular capability, but you'd expect Apple to expand Visual Intelligence's bag of tricks over time.

You have two other options for expanding on the info Visual Intelligence gives you. If you've enabled ChatGPT in Apple Intelligence, you can share the information with ChatGPT by selecting the Ask button, or you can tap Search to run a Google search on the image you've collected. Of those two options, ChatGPT seems to be the more fully featured in my experience. When I captured a recipe for a bean dip, ChatGPT initially summarized the article, but by asking a follow-up question, I could get the chatbot to list the ingredients and the steps, which I could then copy and paste into the Notes app on my own. To me, that's a lot more handy than having to pinch and zoom on a photo of a recipe or, worse, transcribing things myself. Google searches of Visual Intelligence image captures can be a lot more hit and miss. A photo of a restaurant marquee near me produced search results with similarly named restaurants, but not the actual restaurant I was in front of.
Google Search did a much better job when I took a picture of a book cover via Visual Intelligence, and the subsequent search results produced reviews of the book from various sites. That could really be useful the next time I'm in a book store — it's a place that sells printed volumes, youngsters, ask your parents — and want to know if the book I'm thinking of buying is actually as good as its cover. That's been my experience with Visual Intelligence so far, but my colleagues have been using it since it came out last year as everything from a virtual guide in an art museum to a navigation tool for getting out of a corn maze. If you've got an iPhone 15 Pro, you can now try out your own uses for Visual Intelligence.

Apple's "Google Lens" update is confirmed for older iPhones too, but there's a small catch
Apple's "Google Lens" update is confirmed for older iPhones too, but there's a small catch

Yahoo

22-02-2025

  • Yahoo

Apple's "Google Lens" update is confirmed for older iPhones too, but there's a small catch

When you buy through links on our articles, Future and its syndication partners may earn a commission.

Quick Summary

Apple will introduce Visual Intelligence to its last-gen iPhones in a forthcoming iOS update. The Apple Intelligence feature has, until now, required the use of the Camera Control button that neither device has.

Apple will soon give iPhone 15 Pro and 15 Pro Max owners the opportunity to use one of its new Apple Intelligence features that they've not yet had access to. Visual Intelligence is already available on the iPhone 16 family of phones, and will be accessible on the iPhone 16e when it arrives next week, but last-gen Pro users have not received it yet for good reason – it requires the use of the Camera Control button they don't have on their handsets.

However, a future iOS update will reportedly introduce the option to switch its use to the Action button instead, which the iPhone 15 Pro and its super-sized stablemate do have. A Control Center shortcut is also said to be coming as an alternative. Both of these are going to be available on the iPhone 16e from the off, as it too is lacking a Camera Control button.

Visual Intelligence is essentially Apple's answer to Google Lens. You currently activate it on supported iPhones by holding down the Camera Control button and pointing the camera at an object you want information on, such as a building or something you'd like to buy online. It'll then give you the option to ask ChatGPT for details or search for the item through Google. Of course, you also need to approve Apple Intelligence support on your device, plus sign into ChatGPT through settings to get the full functionality.

According to John Gruber of Daring Fireball, Apple is yet to reveal when Visual Intelligence will be made available on last-gen iPhones – i.e. which iOS update will carry the changes.
However, he guesses that it could come with iOS 18.4, which should be available in its beta form "any day now", so we're likely to find out more about it when that arrives. We'll let you know more as and when we find out.
