
Jony Ive and Sam Altman are about to give us 'the coolest piece of technology the world has ever seen'
Learning from the past
Attempts at AI-specific hardware have had a rocky start. The Humane AI Pin, a screenless wearable device unveiled in 2023, promised hands-free access to an AI assistant. However, it suffered from short battery life, overheating and sluggish performance tied to its cloud-based processing—ultimately falling short of expectations.
The Rabbit R1, released in 2024, fared no better. Designed as a pocket-sized AI assistant with a physical scroll wheel and screen, it drew criticism for its inaccurate AI responses and inconsistent user interface.
Above Rabbit Inc. CEO Jesse Lyu presenting the Rabbit R1 AI device (Photo: Rabbit Inc.)
As an AI company first and foremost, OpenAI can learn from these failures and become the first to bring its models to market on custom-designed hardware. With a world-class design team and firsthand expertise in AI software, the company is well positioned to bridge the gap between sleek hardware and powerful intelligence. Furthermore, Ive's legacy of refining existing concepts—turning the MP3 player into the iPod and the touchscreen phone into the iPhone—demonstrates his talent for transforming familiar ideas into boundary-breaking products.
Moreover, OpenAI has the potential to challenge Apple and Google in creating tightly integrated hardware-software ecosystems. Just as Google uses its Pixel phones to showcase Android's capabilities, OpenAI could do the same for its own models with dedicated hardware.
Above Ray-Ban Meta glasses can access Meta AI's live translate and search functions when connected to a smartphone with the Meta AI app (Photo: Meta)
Smart glasses and visual computing devices represent another promising area. Meta, for example, has teamed up with Ray-Ban to develop glasses that integrate real-time translation and AI-powered search. OpenAI could similarly redefine how AI is embedded in everyday experiences—especially if it controls both the interface and the intelligence behind it.
With AI capabilities evolving rapidly, OpenAI's in-house hardware team may prove crucial in ensuring its next breakthroughs are matched by equally advanced, intuitive physical forms.

Related Articles




Tatler Asia, 23-05-2025
Jony Ive's 8 most influential Apple product designs—and how they will shape the rise of physical AI
The iPod (2001): mastering intuitive complexity
Above The iPod's click wheel allowed users to effortlessly navigate massive digital libraries. (Photo: Cartoons Plural / Unsplash)
'A thousand songs in your pocket' represented more than storage capacity—it demonstrated Ive's genius for creating intuitive interfaces that make vast complexity feel simple. The iPod's revolutionary click wheel transformed navigation from a technical task into an almost meditative experience, allowing users to effortlessly traverse massive digital libraries.
This principle directly informs AI hardware design. Future physical AI devices must provide seamless access to virtually unlimited intelligence while retaining the iPod's elegant simplicity. Ive's mastery of tactile interaction design—making complex systems feel natural and responsive—represents precisely the expertise required for crafting humanity's first truly intelligent physical companions.

The iPhone (2007): redefining human-machine interaction
Above The iPhone's integration of hardware and software allowed users to focus on their goals rather than the tools.
The iPhone's multi-touch interface eliminated the barrier between user intent and digital response, creating what felt like direct manipulation of information itself. Ive's seamless integration of hardware and software established a new paradigm where technology receded, allowing users to focus on their goals rather than the tools.
This philosophy will be fundamental to physical AI design. Just as the iPhone made computing feel magical rather than mechanical, Ive's AI hardware must make AI feel like a natural extension of human capability rather than an alien presence requiring conscious interaction.

The MacBook Air (2008): engineering impossible simplicity
Above The MacBook Air redefined what portable computing means. (Photo: Maxim Hopman / Unsplash)
When Steve Jobs pulled the MacBook Air from an envelope, it demonstrated Ive's ability to achieve the seemingly impossible through relentless pursuit of essential design. The precision aluminium unibody construction wasn't merely aesthetic—it was a fundamental rethinking of what portable computing could be.
Physical AI hardware will demand similarly revolutionary thinking. Ive must distil AI's vast computational requirements into forms that feel effortless, just as the MacBook Air made laptop portability seem intuitive.

The iPad (2010): creating new interaction categories
Above The iPad pioneered a new category of intimate, immersive computing. (Photo: Maury Page)
The iPad succeeded by refusing to be a smaller laptop or a larger phone, instead pioneering an entirely new category of intimate, immersive computing. Ive's insight that different technologies demand different interaction paradigms will be critical for physical AI design.
Current AI interfaces remain constrained by smartphone and computer metaphors. Ive's proven instinct for knowing when to break from tradition sets him apart, making him uniquely suited to create the first truly native physical AI experiences.

iOS 7 (2013): designing deferential intelligence
Above iOS 7 represented the evolution toward unobtrusive and deferential technology.
Ive's dramatic shift from skeuomorphic to flat design wasn't merely aesthetic—it represented a profound philosophical evolution toward unobtrusive and deferential technology. His goal of designing interfaces that fade into the background to elevate the user's content anticipates physical AI's primary challenge.
The most successful AI hardware will be nearly invisible, amplifying human capability without demanding attention. Ive's iOS 7 philosophy of 'bringing order to complexity' through restraint and refinement provides the perfect framework for AI that enhances rather than interrupts daily life.

The Apple Watch (2015): pioneering ambient intelligence
Above The Apple Watch lives with users rather than demanding dedicated attention. (Photo: Luke Chesser / Unsplash)
The Apple Watch marked Ive's first foray into truly personal, ambient computing—technology that lives with users rather than demanding dedicated attention. The Digital Crown and Force Touch demonstrated his ability to invent novel interaction methods for unprecedented use cases.
These innovations directly inform physical AI challenges. Ive's understanding of how to make technology feel personal and contextually aware, rather than universal and demanding, will be crucial for AI hardware that must integrate seamlessly into diverse human environments.

The AirPods (2016): mastering invisible assistance
Above The AirPods made wireless audio feel effortless and magical.
Despite initial scepticism, AirPods became a cultural phenomenon by making wireless audio feel effortless and magical. Ive's ability to create technology that works beautifully without user intervention—automatic pairing, intuitive controls, seamless switching—represents the gold standard for physical AI interaction.
Future AI hardware must exhibit similar invisible competence, anticipating needs and responding naturally without requiring conscious management. Ive's mastery of helpful, non-disruptive technology provides a perfect blueprint for AI that assists rather than demands attention.

The AI renaissance
Jony Ive's collaboration with OpenAI represents more than another product launch—it is the convergence of three decades of design philosophy with technology's most transformative frontier. His ability to make complex technology feel natural, personal and delightful positions him to solve AI's greatest challenge: creating physical forms that amplify human potential while remaining beautifully, invisibly present.
The future of human-AI interaction will not be shaped by algorithms alone, but by a rare human ability to create experiences that feel both revolutionary and inevitable—which is the true mark of great design.
Credits: This article was created with the assistance of AI tools.


Tatler Asia, 22-05-2025
Move over, Duolingo: These experimental AI tools from Google will change the way we learn languages
These tools currently support a wide range of languages, including Arabic, Chinese, English, French, German, Greek, Hebrew, Hindi, Italian, Japanese, Korean, Portuguese, Russian, Spanish, and Turkish. Of course, at this experimental stage, text-to-speech quality may vary by language. These experiments are part of Google's wider mission to make independent language learning more engaging, dynamic, and tailored to individual needs through AI. By focusing on real-world scenarios, authentic speech and visual learning, Google is signalling a bold new direction—one where technology adapts to the learner, not the other way around.