
Taking An AI-Native Approach To Business Innovation
Many organizations are racing to embed AI into nearly every aspect of their operations. From marketing automation to customer service to supply chain optimization, new AI tools that enable more efficient workflows are emerging almost daily.
While this rapid adoption of AI often signals a forward-looking mindset, that mindset does not always extend to corporate innovation. Corporate innovation—the ability of organizations to turn novel ideas and concepts into financial impact—remains paradoxically constrained in many organizations.
The Innovation Challenge
Although embracing technology is seen as a strategic imperative, the systems designed to drive innovation in a company often remain stuck in the past. Outdated innovation processes, top-down decision-making structures and a lack of innovation culture can breed an environment where ideas are stifled rather than scaled.
I see this misalignment between innovation methods, systems and today's demands manifesting in four recurring problems inside large enterprises: a lack of strategically relevant ideas, a lack of data-driven validation, limited time to innovate and a lack of strategic portfolio thinking.
But the solution is not to plaster AI onto these broken innovation practices like a Band-Aid. Instead, what is required is a complete reimagining of these systems and approaches through an AI-native lens.
Problem No. 1: Not Enough Strategically Relevant Ideas
Where people are given the time, space and autonomy to innovate, I've found creativity tends to flourish. Yet when companies launch a call for ideas, they tend to face a recurring challenge: a shortage of ideas that are strategically aligned with, and timely for, the company's overarching business vision and priorities.
When such ideas are scarce, organizations risk spending both money and resources on the wrong initiatives. Even the most advanced tools—if fed by irrelevant and low-quality input—will produce misaligned output.
Traditionally, assessing an idea's relevance required manual effort, including well-aligned ideation workshops and hands-on idea evaluation. AI is shifting this, and as it evolves, companies will likely rely less on traditional idea submissions to identify challenges and more on internal signals: process data as well as feedback and observations gathered across the company's data pools.
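To make this concrete, here is a minimal sketch of how such relevance screening could work: submitted ideas (or signals mined from internal data) are embedded and scored against leadership's strategic priorities by semantic similarity. The library, model name and example texts are illustrative assumptions, not a specific vendor's implementation.

# Minimal sketch: score ideas against strategic priorities by semantic similarity.
# The model name and the example texts below are illustrative assumptions.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

priorities = [  # strategic priorities as framed by leadership (illustrative)
    "Reduce logistics cost through supply chain automation",
    "Grow recurring revenue from digital services",
]
ideas = [  # ideas harvested from submissions or internal signals (illustrative)
    "Pilot a dynamic routing tool to cut last-mile delivery costs",
    "Redesign the office cafeteria menu",
]

prio_emb = model.encode(priorities, convert_to_tensor=True)
idea_emb = model.encode(ideas, convert_to_tensor=True)
scores = util.cos_sim(idea_emb, prio_emb)  # ideas x priorities similarity matrix

for idea, row in zip(ideas, scores):
    best = int(row.argmax())  # best-matching priority for this idea
    print(f"{float(row[best]):.2f}  {idea}  ->  {priorities[best]}")

In a sketch like this, low-scoring submissions are not discarded automatically; they are simply flagged for human review, keeping people in the loop on relevance decisions.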
However, adopting AI-native approaches can present challenges. Many companies struggle to adapt to a fundamentally new way of running certain processes, and as noted earlier, legacy mindsets and a general discomfort surrounding AI can slow adoption. Additionally, fragmented systems, silos across departments and a lack of centralized oversight can make it difficult to get a clear understanding of all current innovation initiatives.
Overcoming these challenges begins with building a certain level of confidence and literacy around AI among employees. This ensures a healthy understanding of how AI can augment human creativity in the innovation process, significantly reducing the time to value. When selecting tooling, it is also important to ensure that it complies with robust data protection standards and integrates seamlessly with existing systems.
Problem No. 2: Lack Of Data-Driven Validation
In many companies, top-level executives still have the final say when it comes to deciding which projects are worth pursuing. These decisions are often based on gut feeling and past experience, without the necessary data or feedback.
An AI-native approach addresses this by using AI tools directly when validating an idea—provided that companies have access to quality data. For example, to test market demand for a product or service, AI can model customer personas or simulate responses to different value propositions. Companies with extensive customer data can train AI models on that data to yield even better and more accurate results. This accelerates validation and helps ensure that decisions are grounded in real data.
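As an illustration of persona-based validation, the following sketch asks a language model to react to different value propositions while role-playing a target customer. The model name, persona and offers are hypothetical placeholders; a real setup would ground the personas in the company's own customer data.

# Minimal sketch of persona-based concept testing with an LLM.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

persona = (
    "You are a 42-year-old operations manager at a mid-sized logistics firm. "
    "You are price-sensitive, risk-averse and short on time."
)
value_props = [
    "A subscription dashboard that predicts shipment delays 48 hours in advance.",
    "A pay-per-use API that automates customs paperwork.",
]

for prop in value_props:
    # Ask the model to react to the offer while staying in character.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": persona},
            {"role": "user", "content": (
                "On a scale of 1-10, how likely are you to buy this, and why? "
                f"Offer: {prop}"
            )},
        ],
    )
    print(prop, "->", response.choices[0].message.content)

Simulated responses like these are a complement to, not a substitute for, talking to real customers; they are most useful for quickly narrowing down which propositions deserve real-world testing.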
Problem No. 3: Not Enough Time To Innovate
While it would be nice for innovation to be a daily byproduct of our efforts, it requires mental bandwidth. This means that employees need to be given the headspace and the time to innovate.
For companies to achieve tangible results from their innovation efforts, it is crucial to give employees time for creative activities. Google's famous 20% time rule allocated employees one day a week for projects outside their primary responsibilities. This led to some of the company's most successful products, including AdSense and Google News.
AI tools can also play an important role here by augmenting innovation processes, freeing up time for the most important projects and focus areas while accelerating workflows. Incorporating agentic AI to evaluate, improve and even execute ideas can reduce the time usually spent on the initial phases of an idea.
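For instance, an agentic evaluate-and-improve loop might look something like the sketch below: one call critiques a raw idea against fixed criteria and a second call revises it. The model name, criteria and stopping rule are assumptions for illustration only, not a prescribed workflow.

# Minimal sketch of an agentic critique-and-revise loop for early-stage ideas.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment
CRITERIA = "strategic fit, feasibility within 12 months, measurable ROI"

def refine(idea: str, rounds: int = 2) -> str:
    """Alternate critique and revision for a fixed number of rounds."""
    for _ in range(rounds):
        critique = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[{"role": "user", "content":
                       f"Critique this idea against {CRITERIA}:\n{idea}"}],
        ).choices[0].message.content
        idea = client.chat.completions.create(
            model="gpt-4o-mini",
            messages=[{"role": "user", "content":
                       f"Rewrite the idea to address the critique.\n"
                       f"Idea: {idea}\nCritique: {critique}"}],
        ).choices[0].message.content
    return idea

print(refine("Offer a carbon-footprint report as an add-on to every order"))

The value of a loop like this is not the final text it produces but the hours it saves employees on first-pass evaluation, which can then be reinvested in the ideas that survive.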
Problem No. 4: Lack Of A Portfolio Approach
A common response I get from prospects is: "We already manage our innovation portfolio." Now, this might be true for some organizations, but for many, this is not the case. In reality, these portfolios are often top-heavy, lack thematic diversity and fail to adapt dynamically. Beyond this, innovation activities are often spread across systems—hindering oversight and creating redundancies.
Using an AI-native approach can help improve visibility so leaders can then effectively prune or scale ideas based on multidimensional criteria, creating an evenly weighted innovation portfolio.
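As a hedged sketch of what "multidimensional criteria" can mean in practice, each initiative below is scored on a shared rubric and weighted into a single comparable number, while a horizon label helps reveal whether the portfolio is top-heavy. The dimensions, weights and example projects are invented for illustration.

# Minimal sketch of multidimensional portfolio scoring with illustrative weights.
from dataclasses import dataclass

WEIGHTS = {"strategic_fit": 0.4, "validation_evidence": 0.35, "time_to_impact": 0.25}

@dataclass
class Initiative:
    name: str
    horizon: str   # "core", "adjacent" or "transformational"
    scores: dict   # each criterion scored 0-1

    def weighted_score(self) -> float:
        return sum(WEIGHTS[k] * self.scores[k] for k in WEIGHTS)

portfolio = [
    Initiative("Dynamic pricing pilot", "core",
               {"strategic_fit": 0.9, "validation_evidence": 0.7, "time_to_impact": 0.8}),
    Initiative("Circular packaging venture", "transformational",
               {"strategic_fit": 0.6, "validation_evidence": 0.3, "time_to_impact": 0.2}),
]

# Rank for prune/scale decisions; the horizon column exposes thematic balance.
for item in sorted(portfolio, key=lambda i: i.weighted_score(), reverse=True):
    print(f"{item.weighted_score():.2f}  {item.horizon:17}  {item.name}")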
To begin this transition, organizations should start by mapping all ongoing innovation activities within the company and establishing centralized oversight to create a single source of truth. From there, a structured framework can be built in which clear processes and governance are introduced, ensuring that each project moves from ideation to implementation in a systematic way.
Making this transition a success will likely require companies to challenge long-standing processes and question how efficient they really are. Embracing an objective mindset can allow teams to eliminate inefficiencies and uncover obstacles to establishing a portfolio-based approach to innovation.
Overall, solving the innovation challenge is not about bolting another chatbot onto the innovation process—it's about rethinking and rebuilding the entire innovation system. An AI-native approach can transform that system into one that proactively adapts and responds to internal needs and external market shifts.
Related Articles


CNET
Nvidia Will Let You Add Your Own Steam Games to Stream Via GeForce Now
Nvidia's announcements at the 2025 Gamescom show were sprinkled with the usual minor technology refreshes across its software platforms, but the biggest updates are coming to its GeForce Now cloud gaming service. The company upgraded its highest-end GFN server clusters ("pods") used for Ultimate members with its new GeForce RTX 5080-equivalent graphics processors and is in the process of upgrading the network pipeline to deliver faster streams with less latency. In practice, that means a lot of new capabilities, mostly but not entirely for top-tier subscribers. Subscription prices are the same, for now at least. Nvidia added annual payment plans, though they're not any cheaper at double the price of the six-month plan.

One of the most exciting new features, available to both Premium and Ultimate members, is Install-to-Play, which essentially brings back a capability that was dropped between the original beta tests and the 2020 launch: the ability to play Steam games not already in GFN's optimized library. (If you're not familiar with how GFN works, it's not a game vault. It allows you to stream games you already own, plus the usual free-to-play games.) Install-to-Play installs your games — still not all of them — to a virtual machine in the cloud, which is much faster than downloading to your device because it all happens through fat network pipes. It then streams to your device. These games differ from typical GFN titles since they're not as optimized and they don't get automatically updated the same way; I'm not quite sure how the updating will actually work. Install-to-Play comes with 100GB of temporary storage that vaporizes once your session ends. Nvidia will offer persistent storage options for $3/month (200GB), $5/month (500GB) and $8/month (1TB). Aside from the convenience of not having to repeatedly reinstall a game and getting access to over 2,000 more titles, it theoretically makes it possible to stream games with local-only saves.

Nvidia also upped some of the top frame rate and resolution specs for the Ultimate plan. They rely on Multi Frame Gen technology, which is only available with the Blackwell-based 50-series GPUs. Steam Deck players get a boost to a maximum of 90fps, and there is a new 1080p/360fps option (which I believe requires an Nvidia Reflex-compatible monitor). Supported LG TVs will be able to hit up to 4K at 120Hz, and LG OLED monitors connected to Windows or Mac systems can reach 5K. (LG is the first partner for this feature.) Plus, the company has added Logitech racing wheels to its list of supported devices.

Ultimate members also get the option of Cinematic Quality Streaming. CQS uses more bandwidth, up to 100Mbps, to deliver higher bit-depth HDR (YUV 4:4:4) and resolution (by autodetecting your screen's native res) with optimized sharpening that better resolves text and reduces blur. All of the above are slated to roll out starting in September.

Later in the year, Nvidia plans to debut Play Instantly on Discord, starting with Fortnite. Play Instantly will let streamers invite viewers to launch the game as a one-hour trial directly from the stream, for free and without an account. This capability was in Google's grand plan for Stadia, which never got off the ground. Notable titles coming to GFN as they launch include Borderlands 4, Call of Duty: Black Ops 7 and The Outer Worlds 2.
Bits and pieces

Other announcements the company made at the show include the first implementation of its RTX Hair, coming to Indiana Jones and the Great Circle in a game update slated for September. RTX Hair uses path tracing and linear swept spheres (a fascinating new type of primitive) to produce more natural-looking hair and fur with less of a performance hit than current techniques. Modding fans of RTX Remix will be able to incorporate path-traced particles into older games for more realistic rendering.

On the AI invasion front, Nvidia debuted The Oversight Bureau, a voice-driven puzzle game from Iconic that uses the company's speech-to-text technology to feed voice commands into the game, which combines pre-recorded dialog and AI-based contextual analysis to decide which responses to use.

Nvidia made some minor optimizations to its new G-Assist AI app, reducing the amount of memory it requires and adding laptop-specific help, mostly for improving battery life. It also brought more user-requested features to the Nvidia App (which merged the GPU driver with GeForce Experience), adding DLSS global overrides and bringing back some legacy 3D control panel settings.


The Verge
‘Play Instantly on Discord': Fortnite will be Nvidia and Discord's first instant game demo
Nvidia's GeForce Now is getting a big upgrade next month — and it's also part of an intriguing new experiment. Nvidia, Discord, and Epic Games have teamed up for an early test of instant game demos for Discord servers, which could theoretically let you immediately try a game without buying it, downloading it, or signing up for an account. Sound familiar? That's probably because instant try-before-you-buy was the original vision for Gaikai, one of the first cloud gaming services, and Google's Stadia cloud gaming service also tried it by letting you demo games in a web browser. Now, Nvidia will be showing off the idea at Gamescom this week by letting people try Fortnite from within Discord.

'You can simply click a button that says 'try a game' and then connect your Epic Games account and immediately jump in and join the action, and you'll be playing Fortnite in seconds without any downloads or installs,' says Nvidia product marketing director Andrew Fear.

That doesn't sound completely frictionless if you still need an Epic Games account to play, and it's not clear if Nvidia, Epic and Discord will offer the demo outside of Gamescom just yet. Nvidia is calling it a 'technology announcement' rather than a confirmed feature, one that will hopefully see game publishers and developers reach out if they're interested in adding it to their games. After Sony bought Gaikai in 2012, it initially suggested it would offer instant try-before-you-buy game demos on the PlayStation 4 too, but that never happened. Years later, Gaikai's founder told me that publishers didn't necessarily want it.


Forbes
Cold Data At FMS And Magnetic Tape Data Recovery
The FMS conference has moved from a concentration on solid-state storage to other types of storage as well. In this article, we will explore a session on cold storage. In addition, we will report on an ongoing project to recover data from old tapes for training AI and other applications.

The cold storage session was moderated by Rich Godomski from Fujifilm and included talks by John Monroe of Further Market Research; Ilya Kazansky, CEO, SPhotonix; Steffen Hellmold, President, Cerabyte, Inc.; Dave Landsman, Distinguished Engineer, Director Industry Standards, Western Digital; and Alistair Symons, VP, Storage Systems Development, IBM.

John Monroe's message was that about 70% of the data stored in enterprise and data center applications is cold data, but that it may need to be accessed at any time to support modern AI workflows. Thus, it should be stored in an active archive, which would save on power consumption and operating costs. He also felt that in the future AI will be used to manage what could otherwise become an enormous amount of stored data. Although much of this cold storage is currently on HDDs, he projected that magnetic tape and emerging optical archive storage technologies would grow to over 20% of the total shipped storage capacity by 2030, compared to about 16% in 2024 (which is somewhat higher than my projection of about 12% in 2024).

Ilya Kazansky, CEO of SPhotonix, spoke about the company's optical storage technology, its 5D fused quartz-based memory crystals; an example is shown below. He argued that conventional storage technologies need to be replaced every 10-15 years, that migrating content to new storage technology takes a significant amount of effort and money, and that it results in significant waste. The commonly used storage technologies are also susceptible to damage from heat, humidity, chemical reactions and electromagnetic pulses, and they require controlled environments to preserve the storage media. The company's memory crystals provide a write once, read many times (WORM) medium that can persist for a thousand years or more. Recordings on this media are also not subject to damage from heat, humidity and the other factors that can cause loss of data in other storage media. Thus, it would not require migration to new media on a regular schedule, and it is easy to recycle since it has a composition similar to sand. He said this volumetric storage media can support storage densities up to 10GB/mm³. The femtosecond lasers used for recording create nanostructures that can encode data in their width, height, depth, polarization and birefringence, hence the 5D name.

Steffen Hellmold, President of Cerabyte, spoke about their optical media technology for archiving applications. He also stressed the costs of archiving and management required with current digital storage media. Similar to the SPhotonix media, Cerabyte media could last a long time since it is not sensitive to heat, humidity and other factors that can damage the media or the data on conventional digital storage media. He stressed the need to move from traditional archiving approaches to active archiving to support the data needs of modern AI workflows. The Cerabyte media uses glass substrates coated with a ceramic material. The Cerabyte solution uses digital light processors (DLPs) from Texas Instruments to write and read millions of bits at once, providing high data throughput, as shown below.
Although not represented in this session, there are other optical storage startups working on archival technology. These include Folio Photonics and Optera, both of which are developing higher capacity circular optical media that could be placed into traditional optical storage libraries.

David Landsman, Distinguished Engineer and Director of Industry Standards at Western Digital, spoke about why HDDs will remain relevant for cold storage applications. He said that HDDs are where cool data, as opposed to cold data, lives in data centers. He said that HDDs for data centers are 6X less expensive than data center SSDs and would remain so through 2030 and beyond. One of his slides showed 36-44TB HDDs in 2026, 80-100TB HDDs by 2030 and over 100TB in the years after, as shown below.

Alistair Symons, VP of Storage Systems Development at IBM, spoke in favor of magnetic tape for archive storage in the session. He argued that the increasing amount of storage that AI and other applications require is a significant factor in the increasing energy requirements at data centers, and this will favor storing less frequently accessed data on magnetic tape. Compared to SSDs and HDDs, the energy consumption of magnetic tape storage is considerably lower.

Chuck Sobey of Channel Science, the organizer of the Monday training sessions held before the regular FMS sessions, has been pursuing the development of a multi-format, minimal-contact legacy tape reader that can recover data from old magnetic tapes. There are vast libraries of older magnetic tapes in now obsolete formats that contain valuable data for AI training and other applications, or content with long-standing cultural value. Training on this often-irreplaceable data, such as cultural, seismic, climate or astronomy data, could make better domain-specific AI models. Currently, many efforts to recover data from older tapes require having the original tape equipment to read the tapes. Even if you can find the old tape drives, replacing worn-out parts such as heads is hard because these heads are no longer in production. And even a functioning vintage tape drive only has the capabilities that were designed into the original drives.

To meet the need to read these old magnetic tapes, Channel Science created its multiformat tape reader. The company has one patent granted on the technology and more pending. The prototype tape transport device is shown below. Developed using SBIR grants, the device uses GMR sensors, which are in mass production for HDD heads, to read the data from the tapes with a gentle tape path and a high-speed tape transport. It applies advanced signal processing, detection and decoding, and uses AI to optimize its operation for different types of tape. To find out more, contact Chuck Sobey at csobey@

The 2025 FMS Conference included sessions and exhibitors working on the increasing demand for colder storage for AI and other applications. Channel Science developed a multiformat magnetic tape reader for recovering data from old tapes.