
Google Pixel 10 Leaks: Triple Camera, Tensor G5 Chip & More!
Design and Camera Upgrades
The leaked images suggest that the Pixel 10 will bear a strong resemblance to the Pixel 10 Pro prototype in terms of overall design. However, one of the most striking updates is the inclusion of a triple rear camera setup, a notable improvement over the dual-camera configuration seen in its predecessor. This enhanced camera system is expected to include:
A 5x telephoto lens, allowing users to capture detailed zoom shots with precision.
A wide-angle lens, offering versatility for everyday photography needs.
An ultra-wide lens, ideal for capturing expansive landscapes or group photos.
Another intriguing addition is a new sensor positioned beneath the flash. While its exact purpose remains unconfirmed, speculation suggests it could function as a temperature sensor, similar to the one found on the Pixel 9 Pro. If accurate, this feature may unlock advanced capabilities such as thermal imaging or health-related applications, further expanding the phone's utility beyond traditional smartphone functions.

Tensor G5 Chip: Balancing Efficiency and Performance
At the core of the Pixel 10 lies the Tensor G5 chip, developed using TSMC's 3nm process technology. This advanced manufacturing process is expected to deliver several key benefits:
Improved energy efficiency, potentially leading to longer battery life for users.
Enhanced heat management, addressing common overheating concerns during intensive tasks.
Despite these advancements, reports indicate that the Tensor G5 may incorporate older CPU cores. While this choice could limit raw performance improvements, the focus on efficiency reflects a broader industry trend toward optimizing power consumption and thermal performance. For users, this translates to a device that remains cooler and more reliable during prolonged use, making it well-suited for both casual and demanding applications.

Prototype Motherboard Reveals Hardware Insights
Additional insights into the Pixel 10's hardware have emerged from the sale of an engineering prototype motherboard labeled EVT 1.0 (Engineering Validation Test). Although this component represents an early stage of development, it provides valuable clues about the phone's internal architecture.
The motherboard prominently features the Tensor G5 chip and hints at potential improvements in areas such as heat dissipation and hardware reliability. These refinements suggest that Google is placing a strong emphasis on creating a device that balances performance, durability, and user experience. Such advancements could enhance the Pixel 10's appeal to a wide range of users, from tech enthusiasts to everyday consumers.

Growing Anticipation Ahead of Launch
With the Pixel 10's official launch expected in just over a month, anticipation continues to build within the tech community. The recent leaks have fueled speculation about the device's capabilities, particularly in areas such as photography, performance, and design.
Google's focus on camera innovation, chip efficiency, and hardware engineering positions the Pixel 10 as a compelling option in the competitive smartphone market. As the launch date approaches, the Pixel 10's potential to redefine user expectations and set new standards in smartphone technology remains a central topic of discussion. The combination of advanced features and thoughtful design could make it a standout choice for those seeking a premium mobile experience.
Source & Image Credit: Demon's Tech
Related Articles


Auto Car
2 hours ago
Chinese car firm GAC confirms UK launch!
State-owned car maker – a partner of Honda and Toyota – plans to undercut established EVs here.

GAC, one of China's biggest car makers, will launch in the UK early next year with an electric hatchback and SUV aimed squarely at Volkswagen's ID 3 and ID 4. The firm's announcement confirms Autocar's exclusive report that it was planning an imminent launch here, with the company viewing the UK as an especially bountiful growth market for EVs. As previously reported, the joint-venture partner of Honda and Toyota – which sold just over two million cars last year – plans to undercut established brands on price to attract buyers.

The smaller of its first two models is the Aion UT hatchback, billed as 'China's version of the Mini'. Designed to suit city commuters, it's slightly longer and wider than the ID 3 – a decision informed by Chinese buyers' expectations of interior space. It is priced from the equivalent of just £7500 in its homeland but is expected to cost significantly more in the UK, due to the cost of shipping the car across the globe, plus taxes.

But, according to GAC COO Thomas Schemera (who was previously a leading figure at Hyundai's N performance division and at BMW before that), the Aion UT must come in at a competitive price to generate any traction in a difficult market. 'From a price positioning point of view, to be higher than other competitors which have a well-established brand, that would be very challenging,' he said. 'Let me put it this way: we have to be smart and clever here to put our models on the streets and also to simultaneously build the brand, not just with the product presence but supported by campaigns. As long as our brand value is on such a low level, we have to do everything to increase it. With this mechanism, you can also increase the price step by step for your products. But first of all, you have to build confidence.'

To that end, the Aion UT is expected to land in the UK in the mid-to-high £20,000s. This would have it significantly undercutting the ID 3 (£30,795), aligning it more closely with the MG 4 EV and upper versions of the Renault 5.

UK specifications have yet to be confirmed, but in China it employs a front-mounted 134bhp electric motor and a 60kWh lithium-iron-phosphate (LFP) battery, yielding a range of 267 miles. Inside, the dashboard is dominated by a 14.6in infotainment touchscreen, which is used to operate key functions such as the climate controls. There is also an 8.8in digital instrument panel showing critical information such as the car's speed and range. In keeping with a Chinese trend, the front seats can be folded flat to form a bed-like arrangement with the rear seats. The hard plastics throughout the interior are moulded to lift perceived quality. For example, the windowsills give the impression of stitched leather, while the backing of the doorcards has a geometric finish, as if to mimic carbonfibre.

Arriving alongside the Aion UT is the Aion V, a Tesla Model Y and Skoda Enyaq-rivalling electric SUV revealed at last year's Paris motor show. It gets a 224bhp motor and a 90kWh LFP battery, giving a range of 324 miles. In addition to EVs, GAC intends to launch a range of hybrids, plug-in hybrids and 'in some cases' pure-ICE cars.
The cars will be sold in the UK through a joint venture with Jameel Motors, currently responsible for retailing Farizon electric vans and soon to add Geely-branded cars to its portfolio.

Q&A: Thomas Schemera, chief operating officer, GAC Motor

What sets GAC apart from its rivals?
'First and foremost, our premium quality. This is nothing outstanding, because every customer expects premium quality, but we have run joint ventures with Toyota and Honda and we learned the ropes from the beginning. If you take a walk through our facilities – especially for Aion [models] – and have a look at our production, we really know how lean production works. Quality is not just a word; we take it very seriously from a customer perspective.'

What are your expectations for GAC's European launch?
'If you enter a marketplace and your brand awareness is low, you have to build brand awareness. The second step is to interact with customers – walking through this valley of tears, so to speak. You have to invest and you have to be absolutely aware that you cannot make money from the very beginning: this is impossible.'

Why headquarter your design operations in Milan specifically?
'I'm a big fan of globalisation. It's very, very important from a Chinese perspective to understand globalisation, to want to know what our consumer benefits from across the world, and we have a lot of diversity. Not everything that works in China can be shifted across to another market, and that has to be very clearly understood – and vice versa.'

As well as fleshing out plans for global expansion, GAC has shared its vision of a personal luxury car inspired by the legendary Bugatti Type 57SC, Concorde and Chanel handbags. The Chanel inspiration manifests in the Hyperluxury concept's livery: a black exterior largely free of branding (limited to the shrouds on the wing-mounted pop-up headlights) is contrasted against a bright orange interior, with a huge GAC logo embossed into the dashboard. The seats are inspired by the minimalist look of the famed Le Corbusier LC4 lounge chair, first produced in 1928, and contain no heating or cooling elements. The driver is instead supposed to wear a large jacket containing these functions – a reference to the bold outfits worn by motorists in open-top cars throughout the '20s and '30s. Instead of a traditional key, the car is opened and started using a gold ring that the driver can wear as a piece of jewellery.

GAC executives added that the concept was designed without the assistance of artificial intelligence (AI) tools. Stephane Janin, design director for GAC's Milan studio, explained: 'We thought it should be human-made, because we compare it with luxury goods, and we thought that's a value we should work on for this very specific project. It was interesting for us to ask ourselves: what does hyperluxury mean nowadays?' GAC doesn't plan to produce the car, however.


Geeky Gadgets
3 hours ago
Google Memory Bank Released: Long-Term AI Memory for Your Agents
What if your AI assistant could truly remember you – your preferences, your habits, even the context of past conversations? With the release of Google's new Memory Bank, this vision is no longer a distant dream but a tangible reality. Designed to equip AI agents with long-term memory capabilities, Memory Bank addresses one of the most persistent challenges in artificial intelligence: the inability to recall and build upon past interactions. This innovation promises to transform AI systems from reactive tools into proactive, context-aware companions, capable of delivering personalized and seamless experiences. Imagine an AI that doesn't just respond but evolves with you – this is the future Google is unveiling.

In this update, MG explores how Memory Bank redefines what's possible for AI by introducing adaptive memory storage and retrieval. You'll discover why traditional stateless models fall short in personalization and continuity, and how Google's solution bridges that gap with features like prospective reflection and retrospective refinement. Whether you're a developer eager to integrate this technology or simply curious about its potential, Memory Bank's capabilities open up a world of possibilities for smarter, more intuitive AI systems. As we delve deeper, consider this: what could it mean for technology to truly remember you?

Google's Memory Bank Unveiled

Why Memory Bank Matters
Memory Bank is a response to the challenges posed by traditional AI memory systems. Stateless models, while effective for single-session tasks, are inherently limited in their ability to maintain continuity across multiple interactions. This lack of continuity often results in repetitive or impersonal responses. Memory Bank bridges this gap by allowing AI agents to store and retrieve relevant information over extended periods. This capability significantly enhances personalization, allowing AI systems to adapt to individual user preferences and deliver contextually appropriate responses. For example, an AI-powered virtual assistant equipped with Memory Bank can remember a user's preferences for specific services or products, ensuring a more tailored and engaging experience.

Limitations of Existing AI Memory Systems
Traditional approaches to AI memory management often fail to deliver the efficiency and relevance required for modern applications. Common methods, such as storing entire conversation histories or using similarity searches to retrieve past interactions, come with significant drawbacks:
Inefficient: Storing large volumes of data increases operational costs and slows down processing times, making these systems impractical for large-scale applications.
Error-Prone: Retrieval mechanisms frequently surface irrelevant or outdated information, leading to inconsistent user experiences.
Rigid: These systems lack the flexibility to adapt to evolving user behavior, limiting their ability to refine memory retrieval processes effectively.
Memory Bank addresses these shortcomings by introducing a more intelligent and scalable approach to memory management. Its design ensures that only the most relevant and valuable information is stored and retrieved, optimizing both performance and user satisfaction.
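To make the contrast above concrete, here is a small, self-contained Python sketch of the two approaches the article compares: replaying an ever-growing transcript on every turn versus keeping a handful of distilled facts and retrieving only the relevant ones. It is purely illustrative and does not reflect Memory Bank's internals; the fact store and keyword matching are stand-ins for whatever storage and retrieval the real service uses.

```python
# Illustrative only: contrasts full-history replay with a distilled fact store.
# None of this mirrors Memory Bank's actual implementation.

# Naive approach: replay the entire transcript on every turn.
transcript: list[str] = []

def naive_prompt(user_msg: str) -> str:
    transcript.append(f"User: {user_msg}")
    # Prompt size, cost, and latency grow with every turn.
    return "\n".join(transcript)

# Memory-style approach: keep short, reusable facts and retrieve only
# the ones relevant to the current turn (here, by a crude keyword match).
facts = {
    "units": "User prefers metric units.",
    "format": "User likes short, bulleted answers.",
}

def memory_prompt(user_msg: str) -> str:
    relevant = [fact for key, fact in facts.items() if key in user_msg.lower()]
    return "\n".join(relevant + [f"User: {user_msg}"])

print(naive_prompt("What's the weather in Berlin tomorrow?"))
print(memory_prompt("Give me tomorrow's forecast in my preferred units and format."))
```

The second approach illustrates the point the article makes: prompt size stays roughly constant from turn to turn, and only information judged relevant ever reaches the model.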
Key Features of Memory Bank
Memory Bank incorporates several innovative features that set it apart from traditional memory systems. These features are designed to enhance the functionality and adaptability of AI agents:
Personalization: Automatically identifies and stores user-specific preferences and interactions, allowing AI systems to deliver responses that are tailored to individual needs.
Continuity: Selectively stores and retrieves only the most relevant portions of past interactions, ensuring that conversations remain seamless and contextually coherent across sessions.
Adaptability: Employs reinforcement learning techniques to refine memory retrieval processes over time, improving the relevance and accuracy of responses.
Cohesive Memory Creation: Consolidates fragmented session data into unified, meaningful memories, enhancing the AI's ability to understand and anticipate user needs.
These features collectively enable AI agents to deliver more intelligent, responsive, and user-centric interactions, making Memory Bank a valuable tool for developers and businesses alike.

How Memory Bank Works
Memory Bank is integrated with the Google Cloud Platform (GCP) and the Google Agent Development Kit (ADK), making it accessible to developers across a wide range of industries. Its functionality is supported by REST API integration, ensuring compatibility with various frameworks and systems. Developers using the Google ADK gain access to several key benefits:
Automatic memory storage and retrieval capabilities, reducing the need for manual intervention.
Seamless integration into existing AI workflows, minimizing disruption to ongoing projects.
Minimal coding effort required for implementation, allowing for faster deployment.
This flexibility ensures that Memory Bank can be easily adopted by developers, allowing them to enhance their AI systems with long-term memory capabilities without significant overhead.

Technical Innovations
Memory Bank introduces two new techniques that redefine how AI systems manage and use memory:
Prospective Reflection: Consolidates fragmented session data into cohesive memory summaries, allowing AI agents to maintain a clear and organized understanding of past interactions. By creating structured memories, the system ensures that relevant information is readily accessible when needed.
Retrospective Reflection: Analyzes user interactions over time to refine the relevance of retrieved information, ensuring that future responses are more accurate and contextually appropriate. This adaptive learning process enhances the overall user experience.
These innovations allow Memory Bank to go beyond simple data storage, enabling AI systems to evolve and improve in response to user behavior and preferences.
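The sections above note that Memory Bank is reachable either natively through the Google ADK or over a REST API, but they do not spell out the calls involved. The Python sketch below shows the general shape of an agent turn that stores and recalls long-term memories over HTTP; the base URL, endpoint paths, and payload fields are hypothetical placeholders rather than Google's actual API, so treat it as a rough outline and follow the official ADK or Vertex AI documentation for the real integration.

```python
import requests

# Hypothetical endpoint and identifiers -- placeholders, not Google's real API.
MEMORY_API = "https://memorybank.example.com/v1/agent-engines/demo-engine"
USER_ID = "user-1234"

def store_memory(fact: str) -> None:
    """Persist a user-specific fact so later sessions can recall it."""
    resp = requests.post(
        f"{MEMORY_API}/memories",
        json={"userId": USER_ID, "fact": fact},
        timeout=10,
    )
    resp.raise_for_status()

def recall_memories(query: str, top_k: int = 3) -> list[str]:
    """Fetch only the stored facts most relevant to the current turn."""
    resp = requests.get(
        f"{MEMORY_API}/memories:search",
        params={"userId": USER_ID, "query": query, "pageSize": top_k},
        timeout=10,
    )
    resp.raise_for_status()
    return [m["fact"] for m in resp.json().get("memories", [])]

if __name__ == "__main__":
    # A typical turn: recalled facts are prepended to the prompt so the model
    # answers with long-term context instead of starting from scratch.
    store_memory("User prefers metric units and short, bulleted answers.")
    context = recall_memories("How should I format tomorrow's weather summary?")
    prompt = "\n".join(context + ["User: What's the weather like tomorrow?"])
    print(prompt)
```

In the ADK-native path described above, the same store-and-recall steps would be handled by the agent framework rather than explicit HTTP calls, which is the "minimal coding effort" the article refers to.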
Real-World Applications
The introduction of Memory Bank opens up a wide range of practical applications for AI agents across various industries. Some notable use cases include:
Personalized Experiences: AI systems equipped with Memory Bank can recall user-specific details, such as preferences, past queries, or frequently used services, to deliver a more tailored and engaging experience.
Improved Continuity: By maintaining context across multiple interactions, Memory Bank ensures that users receive consistent and relevant responses, which is particularly valuable in fields like customer support, education, and healthcare.
Enhanced Decision-Making: In industries such as finance or logistics, Memory Bank can help AI systems analyze historical data to provide more informed recommendations and predictions.
These capabilities make Memory Bank a powerful tool for businesses seeking to use AI for consistent, personalized, and context-aware interactions.

Getting Started with Memory Bank
Implementing Memory Bank is a straightforward process for developers working within the Google ecosystem. To get started, you create an agent engine session within GCP. The system can be accessed via REST API or natively integrated with the Google ADK. Key benefits of this setup include:
Automated storage and retrieval of long-term memory, reducing the need for manual configuration.
Scalability to handle large datasets and complex interactions, making it suitable for enterprise-level applications.
Ease of integration into existing AI workflows, allowing rapid deployment and minimal disruption.
This streamlined approach ensures that developers can quickly enhance their AI systems with advanced memory capabilities, unlocking new possibilities for innovation and user engagement.

Performance and Efficiency
Benchmark testing has demonstrated that Memory Bank outperforms traditional long-term memory systems in both efficiency and relevance. Its ability to adapt to user behavior and refine memory retrieval processes ensures a superior user experience. These advancements make Memory Bank an essential tool for developers aiming to create context-aware AI agents that deliver consistent and meaningful interactions. By addressing the limitations of stateless AI models and introducing a scalable, intelligent approach to memory management, Memory Bank represents a significant advancement in the field of artificial intelligence.

Media Credit: MG


Geeky Gadgets
4 hours ago
The Windsurf Story So Far – Windsurf's Wild Ride: OpenAI, Google and Acquisition by Cognition
What happens when a $3 billion deal collapses in just 72 hours? For Windsurf, the AI coding assistant once hailed as a rising star in developer tools, the answer is a whirlwind of chaos, controversy, and reinvention. In a matter of days, the company went from being courted by OpenAI to witnessing a controversial reverse acqui-hire by Google, only to ultimately land in the hands of Cognition Labs. The implosion of Windsurf's high-stakes acquisition deal reveals not just the fragility of Silicon Valley's startup ecosystem but also the growing tension between AI model creators and tool developers. In an industry where innovation moves faster than trust can be built, Windsurf's saga is a cautionary tale of ambition colliding with reality.

Nate B Jones explains the dramatic rise and fall of Windsurf, offering a rare glimpse into the high-stakes world of AI development. You'll discover how a company that once boasted $100 million in annual recurring revenue and a user base of over 1 million developers was brought to its knees by intellectual property disputes, competitive pressures, and shifting industry priorities. But this isn't just a story of failure; it's also one of reinvention. From Google's controversial talent grab to Cognition Labs' inclusive acquisition strategy, the Windsurf saga offers critical insights into the future of AI coding tools, the battle for top engineering talent, and the evolving power dynamics within the AI ecosystem. What does this mean for the future of AI-driven development? The answer lies in the lessons Windsurf's journey has to offer.

Windsurf's Tumultuous 72 Hours

Windsurf's Evolution: From GPU Optimization to AI-Native Development
Windsurf's journey began in 2021 under its original name, Exafunction, founded by MIT graduates Varun Mohan and Douglas Chen. Initially, the company focused on creating GPU optimization tools. Recognizing the growing demand for AI-driven solutions, it pivoted in 2024 to focus on AI coding assistance, rebranding itself as Windsurf and launching an AI-native development environment. This strategic shift positioned Windsurf as a direct competitor to Cursor, offering developers a more affordable premium tier and a streamlined coding experience. By 2025, Windsurf had achieved several significant milestones that solidified its position in the market:
FedRAMP High certification: Enabled Windsurf to be used in U.S. government workloads, opening doors to regulated industries.
$100 million in annual recurring revenue (ARR): A testament to its growing customer base and financial stability.
A user base of over 1 million developers: Including 350 enterprise customers, showcasing its widespread adoption.
These achievements reflected Windsurf's ability to adapt to the evolving needs of developers and enterprises, cementing its reputation as a leader in AI coding tools.

The $3 Billion OpenAI Deal That Fell Apart
In April 2025, OpenAI announced its intention to acquire Windsurf for $3 billion, aiming to integrate the IDE into its ChatGPT developer suite. However, the deal fell apart due to complications surrounding Microsoft's intellectual property (IP) rights, which stemmed from a 2023 agreement between Microsoft and OpenAI. The situation worsened when Anthropic, a key partner, withdrew its Claude AI model from Windsurf, citing competitive concerns.
The fallout from the failed acquisition was swift and damaging:
Developer migration: Many developers began moving to competing platforms, eroding Windsurf's user base.
Enterprise uncertainty: Contracts with enterprise customers faced delays and renegotiations, creating instability.
Product roadmap disruptions: The company's plans for future development were stalled, leaving it vulnerable to competitors.
These challenges left Windsurf in a precarious position, struggling to regain momentum in a highly competitive market.

Google's Reverse Acqui-Hire: A Controversial Move
Following the collapse of the OpenAI deal, Google stepped in with a $2.4 billion reverse acqui-hire. This deal focused on acquiring Windsurf's founders and 40 of its top engineers while licensing its technology on a non-exclusive basis. However, the remaining 250 employees were excluded from the deal, sparking widespread criticism on social media and within industry circles. This move highlighted a growing trend in Silicon Valley: prioritizing top-tier talent with lucrative compensation packages while sidelining broader employee participation. The backlash underscored deeper issues within the startup ecosystem, where equity and inclusivity are becoming increasingly contentious topics. Google's approach, while strategic, raised questions about the long-term impact of such practices on organizational culture and employee morale.

Cognition Labs' Acquisition: A More Inclusive Approach
Cognition Labs ultimately acquired Windsurf's remaining assets, including its intellectual property, product, and workforce. Unlike Google, Cognition Labs adopted a more inclusive approach, offering financial participation and vesting opportunities to all employees. This strategy not only boosted morale but also positioned the company as a more equitable player in the industry. Cognition Labs announced plans to use Windsurf's strengths in several key areas:
Integration with Devin: Windsurf's tools will be integrated with Cognition Labs' AI agent, Devin, to enhance its capabilities.
Restoration of Anthropic's Claude model: Cognition Labs aims to rebuild partnerships and restore access to critical AI models.
Expansion into regulated industries: Windsurf's FedRAMP High certification will be used to target government and enterprise markets.
This inclusive and strategic approach positions Cognition Labs to capitalize on Windsurf's assets while addressing the gaps left by previous deals.

Key Industry Takeaways
The Windsurf saga offers valuable insights into the evolving dynamics of the AI development industry. Several key trends and tensions have emerged:
Model Makers vs. Tool Developers: The collapse of the OpenAI deal highlights the growing power imbalance between AI model creators, such as OpenAI and Anthropic, and tool developers like Windsurf. Model makers increasingly dictate the terms of collaboration, creating challenges for smaller players.
Talent Wars: Google's reverse acqui-hire underscores the escalating competition for top engineering talent in Silicon Valley. However, this approach often comes at the expense of broader employee equity and organizational cohesion.
Advancing AI Coding Tools: Windsurf's evolution from GPU optimization tools to an AI-native IDE reflects a broader industry trend toward more integrated and sophisticated development solutions. This shift is reshaping how developers interact with AI in their workflows.

Future Implications
Cognition Labs' acquisition of Windsurf signals a strategic focus on integrating advanced AI coding tools into regulated industries such as government and enterprise. By using Windsurf's FedRAMP High certification and combining its tools with the Devin AI agent, Cognition Labs is well-positioned to carve out a competitive edge in the market. At the same time, the competition between AI model makers and tool developers is expected to intensify. As AI coding agents become increasingly central to software development, the balance of power within the AI ecosystem will continue to shift. Windsurf's story serves as a reminder of the challenges and opportunities that lie ahead, offering valuable lessons for startups and established players alike.

Media Credit: AI News & Strategy Daily | Nate B Jones