Latest news with #StephenSchenck


Android Authority
2 days ago
- Android Authority
We asked Gemini for a story about the Pixel 10 launch and it hallucinated a bizarre foldable
Stephen Schenck / Android Authority

Summer's been going by so fast, it's hard to believe that we're already down to just two weeks until Google's August 20 event, where we're expecting to see the company introduce the Pixel 10 series, the Pixel Watch 4, and some new Pixel Buds. As that date keeps creeping up on us, we're learning more and more about what we're likely to get. Some of it sounds great, like the full-featured Qi2 magnetic charging support, while we're a little bummed by other developments, like word that the Pixel 10 Pro Fold might not hit stores until October.

While those are important details, granted, with the event so palpably close even those are starting to feel overshadowed by the general excitement that's consuming us. After all, this isn't just about some fresh new hardware — Google is going to want to sell us on the experience these devices deliver, and that's going to mean showing off new software features, hyping up creative tools, and all the fun and showmanship that a launch like this calls for.

Alright, so cool? That will probably be really neat to experience two weeks from now? So what? Well, on Tuesday Google introduced its latest Gemini tool, with the AI system picking up the ability to easily generate illustrated storybooks. We already had a bit of fun imagining some fictional Android Authority offices, but it felt like this feature has even more potential. Gemini lets you steer the story in any direction you want, accepts file uploads for adding richness to the world it creates, and supports revisions to help tweak things just right.

That got us thinking: If we fed Gemini everything we know about Google's plans for the Pixel 10 launch, could it put together a story of what we're likely to expect? We prompted Gemini with a PDF full of Pixel 10 rumors, along with a general description of Google's upcoming event and our own plans for covering it.

Gemini's first draft was in the right ballpark, but needed a bit of tweaking. To an extent, Gemini's storybook tool feels resistant to taking too many notes. We were able to insert our own C. Scott Brown into the story easily enough (or, at least, someone wearing his glasses), but Gemini didn't like our attempt to correct Rick Osterloh's job title. Then there are the hallucinations, from a Pixel with a several-generations-old camera bar … in the middle of the screen, to a Pixel 10 Pro Fold that looks like one of those origami fortune tellers.

But enough spoilers; let's check out the actual story!

(Storybook pages: images by Stephen Schenck / Android Authority.)

Wat.

Gemini, you have just turned the Pixel into an iPhone. We hope you're happy with yourself.


Android Authority
4 days ago
- Entertainment
- Android Authority
Gemini storybooks let you be the star of your kids' bedtime stories
Stephen Schenck / Android Authority

TL;DR
- Gemini now lets you generate illustrated storybooks.
- You can direct output towards a specific art style, and even upload your own imagery.
- Gemini lets you direct how the story unfolds, and can read it aloud when completed.

As Google builds out its AI-fueled tools and services, we keep seeing impressive new ways the company manages to 'connect the dots' and create something new and useful out of existing pieces. Just look at Audio Overviews: Gemini could already summarize content, and Google has tons of experience when it comes to synthesizing speech, so combining those to make virtual podcasts made perfect sense.

Last month we checked out some early evidence of another new feature that would smartly combine a number of Gemini's skills, and today it's finally going official. We're talking about Gemini storybooks, which Google has just launched.

The idea is simple: Ask Gemini to tell you a story, and it will combine its generative text and imagery capabilities to weave together a 10-page tale. You can provide as much story direction as you please, and can also steer how the artwork turns out, having Gemini render its pages in the art style of your choice. There's even support for uploading pictures of people or elements you want included.

While this is clearly a feature designed to entertain and educate young children, it is a heck of a lot of fun to play with for Gemini users of all ages, and we've already been pretty impressed with some of what it's come up with based on our prompts.

For the record, that is indeed exactly how well-groomed and attractive everyone at Android Authority appears.

While we're generally happy with our first attempts at playing with Gemini storybooks, there are still occasionally a few rough edges, and most popped up with the artwork it generated — the occasional wonky-looking logo, or sometimes forgetting the art style entirely and switching to photo-realistic characters. But this is technically still an experiment for the moment, so that sort of thing is probably only to be expected.

The more important factor is that Gemini makes it easy to go back and revise pages. Even there, though, getting exactly what you want out of the tool can be a little delicate. For instance, we requested a specific change on page 8 of our story, and Gemini still went back and changed the art of page 1, inexplicably putting a screen on the backside of a monitor.

Issues like that can be a little frustrating, but ultimately don't take away much of the fun of this tool. And let's face it, with the audience Gemini storybooks are intended for, we doubt those young readers will be especially picky about a random hallucination or two.

Storybooks are available now in Gemini both on the desktop and in the mobile app. Share the best of what you're able to create with us down in the comments.
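A quick aside for anyone who tinkers with the Gemini API: the storybook tool itself lives only in the Gemini app, so there's no storybook endpoint to call. If you're curious what a roughly equivalent prompt might look like programmatically, here's a minimal sketch using the google-generativeai Python SDK. The model name, reference file, and prompt are just placeholders, and this only returns story text, not the illustrated pages or read-aloud narration the app generates.

```python
# A rough, unofficial approximation of the storybook prompt flow, written
# against Google's public google-generativeai Python SDK. This is NOT the
# Gemini app's storybook feature: the API returns plain text rather than
# the illustrated, read-aloud pages the app builds, and the model name,
# file name, and prompt below are illustrative placeholders only.
import google.generativeai as genai

genai.configure(api_key="YOUR_API_KEY")  # assumes you have a Gemini API key

# The app accepts uploads (photos of people, reference docs) to enrich the
# story; the Files API is the closest programmatic equivalent.
reference = genai.upload_file("family_photo.jpg")  # hypothetical file

model = genai.GenerativeModel("gemini-1.5-flash")
prompt = (
    "Write a ten-page children's storybook about a robot learning to bake "
    "bread. Keep each page to a short paragraph, and describe the artwork "
    "for each page in a watercolor style."
)

response = model.generate_content([reference, prompt])
print(response.text)
```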


Android Authority
31-07-2025
- Android Authority
Everyone hates the new Google Photos editing interface
Stephen Schenck / Android Authority

Change isn't always easy, and while sometimes we resist it for as long as we can, dealing with it is more often than not an inevitability. Though our attitudes may sometimes gravitate more towards acceptance than full-on embracing that change, swallowing that pill can be a lot easier if we feel reassured that we're moving in a positive direction — that we're at least going to get some benefit from the change. Right now, Google Photos is going through some growing pains: a moment of temporary upheaval — and we really are trying to emphasize 'temporary' there — as the app introduces a rejiggered approach to its editing suite that has a lot of users feeling somewhere between confused, frustrated, and angry.

Why did Google change my Photos editor?

Let's back up for a second. Google Photos first debuted back in 2015, emerging from the rubble of Google+, and while celebrating its 10th birthday this year, Google announced a few upcoming changes. Those included that updated QR code scanner as well as a new editor experience that 'provides helpful suggestions and puts all our powerful editing tools in one place.' While Google did mention a few concrete changes, like being able to tap on part of an image to get editing suggestions, the full scope of the reorganization wasn't immediately clear. And while Google missed its initial June release timetable for distributing the update, that also didn't sound like a bad thing, with Google talking seriously about how much it wanted to get this refresh right:

'This is a major redesign for our editor — providing all new helpful suggestions and bringing all our powerful editing tools together in one place — so we are taking our time rolling it out and making sure that it is working well for users before bringing it to everyone.'

This week, on the cusp of August, the new editor has finally started rolling out widely. Did that extra time pay off? Based on the reactions we're hearing from users, Google may have wanted to keep testing some of these tweaks just a little bit longer.

Worse, or just different?

Pull up the new Google Photos on your Android device, and it's going to look reasonably similar to what we had just earlier this week. And even when you tap on that 'Edit' button, it's still clear this is very much the Google Photos editor, even as our editing options present themselves in a quite different way.

Maybe the first thing you'll notice is the persistent cropping interface. While you could tap on an edge to immediately begin cropping in the old UI, here you're more or less always in cropping mode, which Google makes practical by parking those crop options up above your image.

That's a big change already, but one that's not too hard to get behind. What's more frustrating is that this shift seems to have resulted in us losing the ability to perform perspective correction while cropping — if that option is hiding somewhere in the new UI, we haven't spotted it yet. Hopefully that's not a permanent oversight, as this was quite a useful tool we'd love to have back.

One of the issues generating the most frustration is that a lot of editing options are no longer where you'd expect them to be. With this new UI, Google has seriously reorganized where many editing tools live, and while this new approach arguably makes a bit more sense than the old implementation, having to relearn everything is slowing users down.

Before, Google split the Photos editing options into a few main categories: Suggestions, Crop, Tools, Adjust, Filters, and Markup. With the new editor, those categories receive a big overhaul: Auto, Actions, Markup, Filters, Lighting, and Color.

Even where the same categories exist across the two interfaces, the options within them are changing. For instance, if you wanted to play with sky options before, you'd find that control grouped under Tools. With that category going away in the new UI (aren't all of these tools, after all?), Google has instead started categorizing it with Filters.

Let's go through all of these and look at what has — and what hasn't — changed. Markup offers the same selection as before, with pen, highlighter, and text tools. Filters starts by adding that distinction between old filters and sky styles, but once you tap through, you'll see the same filter choices as you had before, now joined by an 'auto' option at the end. You can tap a filter once to select it, and again to control its intensity. While that works the same as it did before, Google now gives us a slightly tweaked look for how those sliders are presented.

Actions combines the pop, sharpen, and denoise features from the old Adjust with all the old Tools (with the exception of sky styles). And if you ever need to get really explicit about your intent to crop a pic (despite already being able to do so in most places across this new editing interface), an option for it also lives here. Lighting contains a subset of options that used to be under Adjust — brightness and contrast stuff. And the new editor splits the saturation and tint family of options from the old Adjust off into a new Color section, all by themselves.

To Google's credit, it seems aware that people are going to stumble a bit (at least initially) as they pick up this new editor, and all the way over on the right you'll find an incredibly handy magnifying glass icon that pulls up a detailed, searchable list explaining all available options. In a pinch, you could even just skip the rest of the interface and work straight from here.

If that were all this update amounted to, we'd offer a little sympathy for the haters, acknowledge that change can be tough, and suggest they keep a stiff upper lip and learn to deal with the new placement of so many editing options. But that's not the complete story, as we alluded to when mentioning the vanishing perspective crop tool earlier.

Magic Eraser is back, but in the new Photos editing interface the old camouflage option no longer appears. This would recolor objects to help make them less distracting, without outright removing them. Granted, it's not one we used nearly as much as the object removal tool, but it's still odd to see it unceremoniously disappear like this. We've tried long-pressing buttons and everything else we can think of to find another way to access it, but if it is there, it is very much not proving intuitive. We spotted this pair of feature omissions while attempting to catalog the overarching editor reorganization, and it's entirely possible that there are even more cuts that impact the tools available to you in Google's new editor.

Keep calm, and edit on

Those missing tools aside, it's hard to honestly categorize this refresh as anything other than positive. For all the complaints we've heard voiced, most don't amount to more than 'I need to make an extra tap' or 'I have to remember where this moved to.' When we look at how Google is now grouping the editing options in Photos, none of the categorization really seems 'wrong' — it's just 'wrong to us' because we're used to the old sorting. After a few weeks of experience with this new interface, it's going to feel just as familiar as the old one did, rest assured.

Get past that mental hump, and you're well on your way towards learning to appreciate these editor improvements for what they really offer. The way the initial editing screen now includes engaging previews of its auto-edit suggestions is a step in the right direction. Being able to tap for suggestions on a specific area of an image is also a fun new way to get started, and we especially like the way it helps highlight some of the newer, AI-powered tools that longtime Photos users might have glossed over when they first arrived.

Ultimately, this is still very much the Google Photos we know and love. It's fine to be a little frustrated when you've got to learn a new workflow, but it's definitely going to be worth your effort. Take some time getting comfortable with the new tool placement, and get yourself ready for the next decade of Google Photos.


Android Authority
13-06-2025
- Android Authority
Google's latest experiment brings NotebookLM's best features to Search
TL;DR
- Audio Overviews emerged as one of Google's breakout AI hits, synthesizing virtual podcasts with a pair of hosts.
- After debuting with NotebookLM and spreading to other Google services, the company is experimenting with Audio Overviews in Search.
- For this initial test, access is limited to the US and English only.

Forgive us for sounding like a broken record by this point, but Google's Audio Overviews have easily emerged as one of the company's most genuinely impressive and useful AI tools. First debuting as part of the NotebookLM research assistant, Audio Overviews make text summaries a whole lot more accessible by crunching them down into what's essentially a mini podcast, with a pair of virtual hosts chatting back and forth. We've seen Google expand access and bring Audio Overviews to more of its services since then, like Gemini this past spring, and now it's coming to the granddaddy of them all: Search itself.

Google is offering users the opportunity to opt in to a new Search Labs experiment, giving them early access to Audio Overviews in Search results. Once you flip it on, you'll start finding a new 'Generate Audio Overview' button when running a Search. If you're not immediately seeing it, try scrolling down, as right now Google is presenting it separately from AI Overviews.

The company warns that it can take up to 40 seconds to do the research, synthesize the voices, and assemble your completed Audio Overview, but in our initial testing we saw results that were much more manageable — closer to 10 seconds. Obviously your mileage will vary, and a more complicated or obscure query may take a little longer. Just make sure you sit tight if you're actually interested in hearing the results, as Google informs us that navigating away will discard the in-progress Overview.

This kind of feature makes perfect sense for Search, as it's basically just repackaging solutions we already had. Back in April we saw NotebookLM pick up the ability to track down new source material on your behalf. This is more or less Search doing the same thing, and then piping that output into an Audio Overview, just like NotebookLM could. But by baking it right into Google's most public-facing service, this experiment has the potential to expose this impressive functionality to a whole lot more users.

Right now this experiment is only available in English and in the US, but considering how Audio Overviews in NotebookLM already support over 50 languages, we're hopeful that won't be the case for long.


Android Authority
03-06-2025
- Business
- Android Authority
Google's awesome NotebookLM tool simplifies sharing with this highly requested feature
Stephen Schenck / Android Authority

TL;DR
- NotebookLM now lets you share your notebooks publicly.
- The tool lets you create a shareable URL that allows anyone with a Google account to view your notebook.
- The public sharing feature is not available for Workspace Enterprise and Education users.

NotebookLM is undoubtedly one of the best AI tools in Google's arsenal. The tool makes it incredibly easy to organize notes collected from various sources, draw information from them, and even generate podcasts based on your notes featuring AI hosts. Google recently released NotebookLM on mobile, bringing these features to Android and iOS users. Now, the company is adding a highly requested feature that simplifies how you share your notebooks with others.

So far, NotebookLM has required users to provide the recipient's email ID to share notebooks, and users with personal Gmail accounts could only share their notebooks with up to 50 other users. Google has now dropped this limitation and added a link-based sharing feature that lets you share your notebooks publicly.

The feature works much like link-based sharing in Google Docs and Sheets, allowing you to create a shareable URL that lets anyone view your notebook, so long as they have a Google account. You can generate this link by selecting the 'Share' button in the top-right corner of your notebook and then setting access to 'Anyone with a link.' Google notes that viewers 'won't be able to edit source content, but can still interact with a public notebook by asking questions or exploring generated content, such as audio overviews, FAQs, or briefing documents.'

A support page highlighting the sharing feature reveals that only owners and editors can generate a public share link, and that public sharing is not available on Workspace Enterprise and Education accounts. It adds that while users with free personal Google accounts can publicly share notebooks, only those with a paid subscription can view usage analytics for public notebooks.