Latest news with #MishaalRahman


The Verge
6 hours ago
- Entertainment
- The Verge
YouTube makes it easy for TV users to skip to the best bits of videos
It seems YouTube is finally giving its TV app the AI feature that lets you skip to the most interesting parts of a video. Android Authority's Mishaal Rahman reports that the Jump Ahead perk for YouTube Premium subscribers appeared on his Nvidia Shield TV yesterday, a feature that was previously exclusive to YouTube's web and mobile platforms. Jump Ahead gives users an easy way to automatically get to the best bits of a video by using AI to analyze the most-watched segments that viewers typically skip to. YouTube started testing the feature last year before later releasing it for Premium subscribers on web and mobile, but those who prefer watching on the big screen — which is now the primary viewing source in the US — have been left wanting until now. Premium subscribers can activate Jump Ahead by double-tapping the fast-forward button on the video player, which then takes viewers to the next point in the video that most users view. This works differently on TVs, according to YouTube's support page, requiring users to press the right arrow on their remote to see the next most-watched section, as indicated by a dot on the progress bar. Pressing the right arrow again will then take users to that point in the video, instead of skipping ahead by ten seconds as usual. Rahman says that a message reading 'Jumping over commonly skipped section' appeared when using the feature. While YouTube's support page confirms that Jump Ahead is now 'available on Living Room,' the scale and pace of the rollout are unclear. The feature doesn't appear to be widely available on TVs yet, and YouTube hasn't made a launch announcement. A Reddit user has reported seeing the feature appear on their Samsung TV, however, and Android Police also spotted it on a Google TV streamer. We have asked Google for clarity on the rollout.


Android Authority
6 hours ago
- Android Authority
Real life or AI? Watch these videos and see if you can spot the difference
Mishaal Rahman / Android Authority Having spent some time with generative AI, I thought I had a fair idea of what to expect from Veo 3 — Google's cutting-edge AI video generator. But when I finally ponied up the $20 for a Google AI Pro subscription a few weeks ago, I was surprised to find that it outperformed even my most optimistic expectations. Unlike early AI image generators that would produce obvious deformities like extra fingers or absurd architecture, Google's Veo 3 can generate videos that look strikingly similar to their real-world equivalents. In fact, some of Veo's videos can look so convincing on social media that I had to double-check whether I was looking at AI-generated content or a stock clip. Naturally, that led to the question: how good is Veo 3, really — and could the average person even tell that they're looking at an AI-generated video? To find out, I've put together a short quiz below with six Veo-generated clips pitted against real-world videos. Can you tell the difference? AI-generated videos with Veo 3: Scarily good Mishaal Rahman / Android Authority Veo 3's ability to generate extremely convincing clips is impressive in its own right, but it also goes one step further: it can produce synchronized speech or sound effects. This means the results it produces can seem nearly indistinguishable from the real deal to the untrained eye. Of course, there are telltale signs pointing to a synthetic video's AI origins if you look closely, but you can expect those minor imperfections to be gone sooner rather than later. Google has already dispatched numerous fixes to Veo 3 since its debut at I/O, including a recent one that prevents glitchy subtitle-like text from appearing. To generate a video using Veo 3, you will need a Google AI Pro or Ultra subscription. That will set you back a minimum of $20 per month, to say nothing of the higher tier, which costs an eye-watering $250 each month.
And even then, you only get a limited number of generation credits per month. Veo 3 is expensive and extremely limited, but it's still very capable. The list of Veo 3 limitations doesn't end there. You can only generate extremely short videos at the moment — no longer than eight seconds each. That said, Google Flow, an experimental AI filmmaking tool, allows you to chain multiple Veo-generated clips together to create a longer video. Length aside, the other big limitation is that you can only generate 720p videos with Veo 3 at the moment. Veo 3 costs Google a lot of money in terms of processing — and while we don't know the exact internal cost to Google, we do know what developers are charged to use Veo 3 via an API. Each second of video with audio costs $0.75 to generate, while silent clips cost $0.50 per second. That means an 8-second video costs developers up to $6 per generation. Multiply that by just a few clips and it becomes clear why Google limits how many generations you get with a $20 Pro subscription. The cost of this tech is likely far from trivial. So, is Veo 3 worth that princely price tag? That brings us back to the original question: can you actually tell the difference between a real-world video and an AI-generated one? Below, I've lined up six short clips — let's see if you can spot which is which. Video 1: Combine harvester Let's start with an easy one. This one's relatively simple to pick out if you're looking closely. The AI-generated version doesn't replicate many of the real-world details you'd expect in a genuine farming scene. The sky, farm machinery, and smaller background elements look a little too clean and uniform. But to be fair, I gave Veo 3 a pretty short and non-descriptive prompt. Considering that, Veo 3 actually did an excellent job. If you weren't looking at the video side by side with real footage, it could easily pass for the real thing at a glance.
What's more impressive is that I asked for a specific machinery color scheme and even mentioned the brand name, and Veo 3 delivered on both fronts. That shows just how good this model is at following context and direction — even if it doesn't nail the finer details just yet. Video 2: Squirrel eating a nut Another relatively easy one. While Veo 3's version comes impressively close, especially with the subtle body movements and surprisingly convincing ambient sound, it falls short when placed next to real stock footage. The AI squirrel looks just a bit too clean, and the background is too dark — although my prompting could be to blame. The most impressive part, though? I instructed Veo 3 to focus on the squirrel's fur with a shallow depth of field, and it delivered. I think what gives it away is the lack of any unpredictable authenticity you get with real animals. In the stock clip, the squirrel fumbles with the nut, bites off more than it can chew (literally), and has a bit more character. Still, if you saw the AI clip on its own, you'd probably never question it. Video 3: A busy night market in Thailand Veo 3 shows off its strengths here, nailing the overall atmosphere — the bustling energy and sense of movement. If you've never been to Thailand, both videos might look equally convincing. But look closer and the cracks begin to show. The stalls are too uniform and lack the visual clutter that you'd see in a real night market. The vendors also seem to be selling random, mismatched items that don't make much sense side by side. And if you look at the vendors' hand movements, you'll see that they're rather unnatural. This is a classic telltale sign of generative AI, and Google's video generator is not immune to that problem. Still, this is a difficult scene to pull off, and considering the complexity, Veo 3's attempt is half decent. Video 4: A hiker and rolling fog This scene is perhaps the most impressive of the bunch. 
Without the clutter of city elements or complex character interactions, Veo 3 can truly shine. Even with dramatic lighting, scenic landscapes, and atmospheric effects like fog, it doesn't really break a sweat. It helps that the real-world clip looks striking too, almost like something out of a video game. That makes this one genuinely tricky to guess. Need a hint? Look closely at the hiker's left hand and you'll notice a subtle rendering hiccup that breaks the illusion. Video 5: Herd of goats Another difficult one. Veo 3 delivers an impressive result here, and at first glance, it's genuinely hard to tell the AI-generated video apart from the real thing. The pacing and the motion of the goats look convincing enough. I don't know if I would be able to tell them apart, but knowing which one is AI-generated, I can pick out subtle oddities. For example, the ground in the AI clip feels just a bit too flat. The goats' faces and bodies are also strangely smooth, whereas the real animals have some grime on them. Still, there's no single glaring flaw — it's more of a gut feeling. How many videos did you guess correctly? Some of the above clips were easier to detect than others, but if you found yourself second-guessing even the obvious ones, you're not alone. When AI-generated videos get the lighting, camera angles, and subjects mostly right, it can become surprisingly difficult to spot. I'm not sure I would've picked up on many of the fakes without a direct comparison, even though I've looked at hundreds or thousands of AI-generated images. As the technology becomes cheaper, you can expect videos made using Veo 3 to become more commonplace. Google currently adds a small watermark to the bottom-right corner of all AI-generated videos, but if you didn't notice it above, that's because I cropped it out of every single clip.
Doing that took all of a few minutes per video, meaning we need to find a new and more effective way of dealing with the impending deluge of fake videos on the Internet. I don't know what the solution is, but I hope Google's AI ethics team does.
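As a quick sanity check on the API pricing discussed above, here is a minimal sketch of the per-generation cost math. It assumes only the per-second rates reported in the article ($0.75 with audio, $0.50 silent) and no other fees; real billing may differ.

```python
def veo3_api_cost(seconds, with_audio=True):
    """Estimate the developer API cost (USD) of one Veo 3 generation.

    Uses the per-second rates quoted in the article; actual billing
    may add tiers, rounding, or regional differences.
    """
    rate = 0.75 if with_audio else 0.50  # USD per second of output
    return round(seconds * rate, 2)

# The article's example: an 8-second clip with audio costs $6.
print(veo3_api_cost(8))                     # 6.0
print(veo3_api_cost(8, with_audio=False))   # 4.0
```

At those rates, even a handful of eight-second drafts quickly adds up to more than the $20 monthly subscription price, which is consistent with the tight generation limits on the Pro tier.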


Android Authority
7 hours ago
- Android Authority
PSA: New 'choicejacking' attacks can steal your Android or iPhone's data without your knowledge
Mishaal Rahman / Android Authority TL;DR Researchers have identified new methods that bypass Android and iOS safeguards to steal data. 'Choicejacking' is an evolution of the infamous juice jacking technique and also uses a rigged USB charger or cable to initiate data theft on your mobile devices. Choicejacking uses a combination of techniques to bypass existing juice jacking protection while faking user input to enable permissions illicitly. Juice jacking is a decade-old technique where hackers can install spyware and gain access to your phone when you use a public charging point to juice up (hence, the name) the phone's battery. Over the years, Google and Apple have enforced restrictions that prevent data transfer, especially when your phone is locked. Although these measures were believed to suffice, researchers recently discovered they may not be enough, particularly in the face of more sophisticated attacks. Researchers at TU Graz, Austria, recently identified a series of novel techniques that can bypass existing preventive restrictions and access data on anyone's iPhone or Android device using the USB port. They have named the new technique 'choicejacking,' a play on the familiar technique of juice jacking. In the paper, the researchers claim they were able to spoof user actions, such as actively switching from just charging to data transfer and allowing a prompt that enables an external system or device to access files and settings on your phone. The attack works by replicating user choices, which likely inspired the name. Like juice jacking, choicejacking uses malicious chargers to initiate attacks on the users' phones. Unlike connections to PCs, both Android and iOS allow direct access to wired accessories without explicit permission, which can be exploited for attacks. On Android, specifically, the attacks work by exploiting permissions for peripherals (via AOAP, the Android Open Accessory Protocol), such as mice or keyboards.
Attackers can then begin hijacking system input through ADB (the Android Debug Bridge), which can simulate user input and change the USB mode to allow data transfer. The attack then proceeds with a series of commands aimed at gaining complete control of the device and obtaining key access for further control. On iOS, a rigged USB cable or charger can be used to trigger a connection event for a Bluetooth device. Although it may appear as a regular Bluetooth-based audio accessory to your iPhone, it can secretly enable data transfer and gain access to specific files and photos. However, it cannot access the entire iOS system as it can on Android. The team says it tested these attacks on eight top phone brands, including Xiaomi, Samsung, Google, and Apple. It notified these brands, and six out of eight have already patched — or are in the process of patching — the vulnerability. Despite these fixes, the best defense against choicejacking is to avoid using public chargers at all costs. If you're traveling or anticipate that your phone's battery may not last while you're out, we suggest carrying your own charger or power bank; there are plenty we recommend so you can avoid attacks like choicejacking, keep malware off your phone, and protect your personal data. Other safeguards, such as Android's Lockdown mode, can also help, but you would need to activate it manually every time you charge your phone from an unknown charger.
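The Android attack chain described above (a rigged charger posing as an AOAP input device, then spoofing the taps that approve file transfer and ADB access) can be illustrated with a toy state model. This is purely conceptual: the state names and event strings are invented for illustration and bear no relation to real Android internals.

```python
from dataclasses import dataclass, field

@dataclass
class PhoneModel:
    """Toy model of the settings a choicejacking attack flips."""
    usb_mode: str = "charge_only"
    adb_enabled: bool = False
    injected_events: list = field(default_factory=list)

def spoof_user_choice(phone, event):
    # The rigged charger sends input events the OS cannot
    # distinguish from genuine taps on the permission prompts.
    phone.injected_events.append(event)
    if event == "approve_file_transfer":
        phone.usb_mode = "file_transfer"
    elif event == "approve_adb":
        phone.adb_enabled = True

phone = PhoneModel()
spoof_user_choice(phone, "approve_file_transfer")
spoof_user_choice(phone, "approve_adb")
# The model now exposes file access and a debug channel, even
# though no genuine user interaction occurred at any step.
```

The point of the sketch is that the permission prompts themselves are sound; the weakness is that the OS trusts input events from a wired accessory as if they were the user's own choices.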


Android Authority
21 hours ago
- Entertainment
- Android Authority
YouTube's best Premium feature is finally coming to the big screen
Hadlee Simons / Android Authority TL;DR YouTube's 'Jump Ahead' feature, which lets Premium subscribers skip to the most interesting part of a video, is now rolling out to the YouTube app on TVs. The feature uses AI and viewing data to identify the segment of the video that most viewers typically skip ahead to. Unlike on mobile and web, the TV version lets users tap their remote's fast-forward key to automatically skip to a designated point on the progress bar. YouTube is finally bringing one of its best Premium perks to the big screen. The 'Jump Ahead' feature, which lets you skip to the most interesting part of a video, is now rolling out to YouTube's TV app. The feature first launched in May of last year on web and mobile, and its absence on TVs was a notable omission, especially since nearly half of all YouTube users watch on their televisions. I discovered the change on my NVIDIA Shield TV earlier today. When I pressed my remote's fast-forward button during a certain part of a video, instead of skipping ahead 10 seconds as usual, the app automatically 'jumped ahead.' A message appeared in the top-right corner, noting it was 'jumping over a commonly skipped section,' automatically taking me to the most replayed part. Mishaal Rahman / Android Authority This is the same 'Jump Ahead' feature that YouTube rolled out on mobile and the web last year. The service uses AI and viewing data to identify the part of the video that most viewers skip ahead to, then presents Premium subscribers with a button to fast forward to that segment. On TVs, the experience is slightly different. Instead of a dedicated button, a dot appears on the video's progress bar to indicate the most common skip point. Tapping your remote's fast-forward key again will automatically jump the video to that spot. While a YouTube support page confirms the feature is now available on 'Living Room' (AKA TV) devices, the scope of the rollout is unclear.
I've only seen one other user report having it, so if you've spotted the 'Jump Ahead' feature on your TV's YouTube app, let us know in the comments below!
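The TV interaction described above (one press surfaces the dot on the progress bar, a second press jumps to it, and a press with no skip point performs an ordinary 10-second seek) can be sketched as a tiny decision function. This is an illustrative model only, not YouTube's actual client logic.

```python
def handle_fast_forward(position, skip_point, dot_shown):
    """Decide where a fast-forward press should seek.

    Returns (new_position, dot_shown). `dot_shown` means the
    skip-point marker is on screen, so the next press jumps to it.
    All behavior here is inferred from the article, not real code.
    """
    if skip_point is not None and dot_shown:
        return skip_point, False       # second press: jump ahead
    if skip_point is not None:
        return position, True          # first press: reveal the dot
    return position + 10.0, False      # no skip point: normal seek

pos, dot = handle_fast_forward(42.0, 120.0, dot_shown=False)  # dot appears
pos, dot = handle_fast_forward(pos, 120.0, dot)               # jumps to 120.0
```

Modeling it this way highlights why the TV flow differs from mobile: without a touchscreen, the second key press substitutes for tapping a dedicated on-screen button.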


Android Authority
2 days ago
- Android Authority
Google's Linux Terminal plays a big part in turning Android into a true desktop OS
Mishaal Rahman / Android Authority TL;DR Google has revealed that it's developing a Linux Terminal app to transform Android into a platform for on-device app development and eventually gaming. The app runs a Debian Linux environment in a virtual machine on select Android devices. Recent Android builds can already run graphical Linux apps, paving the way for Android to become a true desktop computing platform. When Google released a Linux Terminal app earlier this year, it generated a lot of buzz among enthusiasts and developers. Despite the excitement, Google has been quiet about the release, even declining to mention it at its annual I/O developer conference. Recently, however, Google published documentation for the Terminal app, revealing its ambitious plans for the feature. With the Linux Terminal, Google aims to let developers build Android apps directly on Android devices. Eventually, the company plans to allow users to run full-fledged graphical Linux apps and games. The ultimate goal could be to transform Android into a first-class desktop platform that rivals macOS and Windows — and we couldn't be more excited. These reports reflect developments at the time of writing. Some features or details uncovered in leaks may change before official release. The Linux Terminal app arrived in the second quarterly release of Android 15, which Google rolled out this past March. It uses the Android Virtualization Framework (AVF) to boot a Debian OS image in a virtual machine (VM), providing users with a terminal interface to run Linux commands. Google recently updated its official documentation for AVF, highlighting the Linux Terminal app as a key use case.
The documentation addresses a long-standing limitation, noting that 'Android has traditionally been the only major operating system that doesn't let users develop apps on the platform itself.' Unlike on macOS or Windows, building apps for Android has always required a separate computer because the development tools aren't natively available on the OS. By introducing the Linux Terminal app, Google can 'provide a Linux-based development environment to Android users who are developers.' This is crucial because many development tools, including Google's official Android Studio, are available for traditional Linux distributions. While Android Studio is available for Linux, there's a caveat: it doesn't currently support the ARM-based CPUs that power the vast majority of Android devices. To enable true on-device development, Google will likely need to add ARM support to the Linux version of Android Studio, allowing it to run in a virtual machine on Android devices much like it already does on Chrome OS. More interestingly, Google plans to enable OEMs to 'implement innovative VM use cases like running graphical user interface apps and even games.' To accomplish this, the company has been working to add graphics, audio, and hardware acceleration support to AVF. This work is already bearing fruit. The Android Canary build released last week allows the Terminal app to run graphical Linux applications. We tested this new capability over the weekend and successfully ran several full-fledged Linux apps — including the desktop version of Chromium, GIMP, and LibreOffice — on a Pixel 8 Pro. Compared to our initial tests back in January, the Terminal app now runs Linux apps far more reliably, though performance is still sluggish. When we ran the Speedometer benchmark inside the VM, for example, it scored less than half of what it did natively.
Google still has a lot of work to do to fix bugs and improve performance, but it's impressive how far the feature has come since its initial release. If implemented well, the Linux environment could even provide the means for Android to become a desktop gaming platform. Chromebooks can currently run Windows games through their Linux environment thanks to the Proton compatibility layer, so the same could theoretically be done on Android. However, the CPU architecture would again pose a challenge, as Proton doesn't support ARM-based CPUs — at least, not yet. Valve is rumored to be working on ARM support for Proton. If true, this could open the door to running many Windows games on Android through its new Linux environment. The addition of AVF and the Linux environment introduces a wealth of new possibilities, and we're excited to see Google continue this work. These features may even be key to Google's long-term ambition of merging Chrome OS and Android into a single, unified platform. For Android to truly compete with macOS and Windows on the desktop, it needs to be more than just a blown-up version of its mobile OS. It needs to win over the developers and gamers who demand a powerful, versatile platform, and these new features are a crucial step in that direction.