End subscription fatigue with this $30 lifetime Microsoft Office deal

New York Post · a day ago
Discover startups, services, products and more from our partner StackCommerce. New York Post edits this content, and may be compensated and/or receive an affiliate commission if you buy through our links.
TL;DR: Stop paying monthly for essential programs and own Microsoft Office for life for just $29.97.
Pay once and keep it forever, no subscriptions, no 'renew now' pop‑ups
Seven must‑have apps: Word, Excel, PowerPoint, Outlook, OneNote, Publisher, and Access
Fresh upgrades, including better inking, sharper Excel analysis tools, new PowerPoint tricks, and smarter Outlook email handling
Works perfectly on Windows 10 or 11 so you can plug in and get moving
Instant digital delivery so you can start downloading in minutes, not days
All languages supported, making it perfect for global work and multilingual projects
Updates are included so your software stays current without costing extra
Officially licensed from a Microsoft Partner
Stop renting your office tools. Get a lifetime license to Microsoft Office Professional Plus 2019 for just $29.97 and own your productivity for life.
StackSocial prices subject to change.

Related Articles

Skipping Nvidia Left Amazon, Apple And Tesla Behind In AI

Forbes · an hour ago

Everyone thinks they are a comic. And everyone in big-cap high tech thinks they can design better and/or cheaper AI chip alternatives to the industry leader, Nvidia. It turns out it's simply not that easy. Apple and AWS have recently run aground in AI growth, and Tesla has just abandoned its own Dojo supercomputer chip development, saying it is switching to Nvidia and AMD for training AI models. (Like many semiconductor developers, Nvidia is a client of Cambrian-AI Research.) Oh, and today, The Information reported that 'Microsoft's AI Chip Effort Falls Behind'. There is definitely an important trend here.

A few companies have eschewed getting locked into Nvidia and paying the high prices state-of-the-art AI technology commands. This penny-smart but pound-foolish approach left the world's largest consumer electronics company (Apple) and the undisputed cloud leader (AWS) far behind, just when generative AI created massive end-user opportunities they could not adequately address.

Nvidia's CUDA platform is the de facto standard for training and deploying large language models for generative AI. CUDA offers unmatched performance, developer tooling and ecosystem support. Companies that build on CUDA — like Microsoft (with OpenAI) and Meta (with LLaMA) — have been able to scale quickly and deliver cutting-edge AI products. By contrast, Amazon and Apple chose to go their own way, and Tesla took the Nvidia off-ramp in 2019. Let's take a look, as each took a different approach, and mostly failed.

Apple Maintains Its ABN Strategy (Anybody But Nvidia)

Apple's generative AI journey has been even more problematic. After unveiling 'Apple Intelligence' in 2024, the company's most anticipated upgrade — a fully LLM-powered Siri — has been delayed until 2026. Apple has some serious semiconductor bona fides, with its M-class Arm-based chips for desktops and the A-class for mobile. The company is justifiably proud of these efforts. But Apple tried its hand at AI acceleration early on using its own chips and then shifted to Google TPU-based AI development. Not a bad choice, mind you, but the TPU has neither the performance nor the AI development tool set that Nvidia Blackwell enjoys. The result? Well, how's that AI-enhanced Siri and Apple Intelligence working out for you? Yeah, not at all.

To be sure, Apple has significant technical challenges that come with having an installed base and a focus on privacy above all, but not using Nvidia from the start probably cost it more in extra work and time to market than the 'expensive' Nvidia infrastructure would have cost. Siri's architecture, built over a decade ago, wasn't designed for generative AI, and retrofitting it has proven more difficult and taken longer than Apple expected. To make matters worse, Apple's AI teams have faced internal fragmentation, with some pushing for in-house AI models and others advocating partnerships with OpenAI, Perplexity or Google. The company also lost key talent to competitors. Ruoming Pang, who led Apple's foundation models team, left for Meta in 2025. Other researchers followed, citing slow progress and a lack of clarity in Apple's AI strategy.

Amazon AWS Does Offer Nvidia GPUs, but Prefers Its Own Silicon

AWS recently paid the price on Wall Street for its slow generative AI sales, a result of Amazon's hubris and NIH (not invented here) thinking. The market share of new generative AI use cases landing on AWS is reportedly lower than its overall cloud share, with Microsoft taking over the lead.
According to IoT Analytics, Microsoft has about a 16% share of new genAI case studies, as does AWS, well below AWS's 2023 leadership share of 37%. AWS is not losing its first-place share in the overall cloud market, at least not yet, but for genAI-specific apps and new enterprise AI workloads, Azure and Google are increasingly competitive, and in some cases are outpacing AWS in genAI-related tools and adoption.

Reducing reliance on Nvidia and lowering costs sounded like a good strategy. So Amazon's AWS division, like Google and Microsoft, invested heavily in custom silicon for training and inference, named, of course, Trainium and Inferentia. The latest release, Trainium2, was launched in 2024 and appears to offer impressive specs: up to 83.2 petaflops of FP8 compute and 6 TB of HBM3 memory. Amazon even created a 40,000-chip Trainium UltraCluster to support generative AI workloads.

But accelerator performance alone doesn't create AI. You need software, great chip-to-chip networking and a thriving developer ecosystem. AWS developers found Trainium software harder to work with than CUDA, and they reportedly pushed back to management against Trainium's limitations. Management essentially said shut up and get to work. So Trainium adoption lagged. Amazon realized it needed to invest even more to create the developer ecosystem, and it launched the Build on Trainium initiative — a $110 million investment in university research. While appealing, this effort came years after Nvidia had firmly cemented its dominance in AI research and development. That is $110 million that could have been better spent on Nvidia hardware and better AI. And that $110 million is on top of the money that AWS spent developing the Trainium and Inferentia chips, probably well over $1 billion.

So Amazon decided to invest another $4 billion in Anthropic, the company behind Claude. Anthropic agreed to use Trainium chips for training its models in return. But behind the scenes, tensions emerged. Engineers at Anthropic reportedly also pushed back against Trainium. Many preferred Nvidia's stack for its maturity and tooling. Anthropic teams had to rework their CUDA-based pipelines to run on Trainium, leading to delays and performance issues. While Amazon touted the partnership as a breakthrough, it was a compromise — Anthropic needed funding, and Amazon needed a flagship AI partner.

Amazon lately appears to be changing course, deepening its partnership with Anthropic and expanding support for Nvidia GPUs. AWS is building a massive Nvidia cloud infrastructure, Project Ceiba, with over 20,000 Nvidia GPUs. But it is only available to Nvidia engineers for use in developing AI and chips, not for public cloud access.

Now Tesla Has Seen the Light

In 2019, Tesla shifted from Nvidia to its custom FSD chip for vehicle Autopilot hardware and neural network inference, replacing Nvidia's Drive PX2 system. And it began a major effort to build its own AI supercomputer, Dojo, with its in-house chips. Since 2019, Tesla has reportedly spent over $1 billion developing Dojo, along with another $500 million building a Dojo supercomputer in Buffalo, New York. Last week, Elon Musk announced on X that he was ending this program and would instead deploy Nvidia and AMD GPUs. I suspect Tesla will mostly deploy Nvidia this year and see how AMD's MI400 looks in 2026.

Should Cloud Service Providers Even Build Their Own AI Chips?

Well, first, let's look at a company that did not.
OpenAI has recently reached $12 billion in annualized revenue and crossed 700 million ChatGPT weekly active users. And guess what it uses? Yep, Nvidia. Sam Altman does have the gleam of OpenAI chips in his eye, to be sure, but he also realizes that speed, ease of use and development time matter more to OpenAI than the savings that proprietary chips could provide. At least for now.

Meta has its own MTIA chip, but it is used for internal workloads, like recommendation engines for Facebook and its other properties. Microsoft has its own Maia chips, starting with the Maia 100, announced in 2023 and used primarily for internal testing and select workloads. The planned successor, Maia 200, is now expected in 2026 due to delays. Maia 200 is designed for data center AI acceleration and inference workloads. We will see if Microsoft learns from Tesla's and Apple's mistakes. I suspect Google is perhaps alone in realizing a decent return on its TPU investments, but it has generally failed to attract large outside customers, aside from Apple. But it gets a lot of bang for the buck for internal workloads and training.

My advice to CSPs is this: if you can get Nvidia GPUs, use them. If you have a workload for which you believe they are not ideal and can model a decent ROI, then go for it. Otherwise, save your capital.

The Consequences of Skipping Nvidia Can Be Dire

A year in the world of generative AI can mean the difference between heaven and hell, or at least between multi-billion-dollar success and failure. The hubris of some high-tech companies has cost them billions of dollars, spent needlessly. Amazon ceded early leadership in cloud AI to Microsoft Azure, which now hosts many of the world's top models. Apple missed the 'AI supercycle' for iPhone upgrades, as consumers saw little reason to buy new devices without meaningful Siri improvements. Tesla has seen the light and is moving fast. All three of these companies now face pressure to catch up — not just in model performance, but in developer mindshare and ecosystem momentum. Yeah, you can build your own AI chip. But you might regret it.

Two of Microsoft's biggest products, one absurdly low lifetime price

New York Post · an hour ago

Discover startups, services, products and more from our partner StackCommerce. New York Post edits this content, and may be compensated and/or receive an affiliate commission if you buy through our links.

TL;DR: For $54.97, get lifetime access to Microsoft Office and Windows 11 Pro to install on one PC.

If your workday runs on Microsoft, this bundle gets you set for the long haul — and for a fraction of what you'd expect to pay. For $54.97, you get lifetime licenses to both Microsoft Office Professional 2021 and Windows 11 Pro. That means no monthly fees, no renewal notices, just the full software, installed and ready to go on your PC.

Office Pro 2021 delivers all the essentials: Word, Excel, PowerPoint, Outlook, Teams (free version), OneNote, Publisher, and Access. Whether you're handling data-heavy reports, building presentations from scratch, or keeping client projects organized, the tools are all here.

On the OS side, Windows 11 Pro gives you Microsoft's latest interface, faster workflows, and enterprise‑grade security. Features like BitLocker encryption, Windows Sandbox, Hyper‑V virtualization, and Azure AD support make it a strong choice for professionals, while everyday enhancements like snap layouts, improved search, and built‑in Teams keep you moving quickly. Gamers get DirectX 12 Ultimate for high‑end graphics, and creative users can take advantage of touchscreen support and the integrated Copilot AI assistant for faster answers, summaries, and even code suggestions.

Both licenses are lifetime — install them once and they're yours for good. Office Pro 2021 is compatible with Windows 10 and 11, while Windows 11 Pro's requirements include a 1 GHz or faster processor, 4 GB of RAM, and TPM 2.0 support.

Skip the subscriptions and download Office Pro 2021 and Windows 11 Pro for a one-time payment of $54.97 (MSRP $418.99). StackSocial prices subject to change.

Free AI training comes to California colleges — but at what cost?

Associated Press · 2 hours ago

As artificial intelligence replaces entry-level jobs, California's universities and community colleges are offering a glimmer of hope for students: free AI training that will teach them to master the new technology.

'You're seeing in certain coding spaces significant declines in hiring for obvious reasons,' Gov. Gavin Newsom said Thursday during a press conference from the seventh floor of Google's San Francisco office. Flanked by leadership from California's higher education systems, he called attention to the recent layoffs at Microsoft, at Google's parent company, Alphabet, and at Salesforce, headquartered in Salesforce Tower just a few blocks away and still the city's largest private employer.

Now, some of those companies — including Google and Microsoft — will offer a suite of AI resources for free to California schools and universities. In return, the companies could gain access to millions of new users. The state's community colleges and its California State University campuses are 'the backbone of our workforce and economic development,' Newsom said, just before education leaders and tech executives signed agreements on AI.

The new deals are the latest developments in a frenzy that began in November 2022, when OpenAI publicly released the free artificial intelligence tool ChatGPT, forcing schools to adapt. The Los Angeles Unified School District implemented an AI chatbot last year, only to cancel it three months later without disclosing why. San Diego Unified teachers started using AI software that suggested what grades to give students, CalMatters reported. Some of the district's board members were unaware that the district had purchased the software. Last month, the company that oversees Canvas, a learning management system popular in California schools and universities, said it would add 'interactive conversations in a ChatGPT-like environment' to its software. To combat potential AI-related cheating, many K-12 and college districts are using a new feature from the software company Turnitin to detect plagiarism, but a CalMatters investigation found that the software wrongly flagged students who had done their own work.

Mixed signals?

These deals are sending mixed signals, said Stephanie Goldman, the president of the Faculty Association of California Community Colleges. 'Districts were already spending lots of money on AI detection software. What do you do when it's built into the software they're using?'

Don Daves-Rougeaux, a senior adviser for the community college system, acknowledged the potential contradiction but said it's part of a broader effort to keep up with the rapid pace of changes in AI. He said the community college system will frequently reevaluate the use of Turnitin along with all other AI tools.

California's community college system is responsible for the bulk of job training in the state, though it receives the least state funding per student. 'Oftentimes when we are having these conversations, we are looked at as a smaller system,' said Daves-Rougeaux. The state's 116 community colleges collectively educate roughly 2.1 million students. In the deals announced Thursday, the community college system will partner with Google, Microsoft, Adobe and IBM to roll out additional AI training for teachers. Daves-Rougeaux said the system has also signed deals that will allow students to use exclusive versions of Google's counterpart to ChatGPT, Gemini, and Google's AI research tool, NotebookLM.
Daves-Rougeaux said these tools will save community colleges 'hundreds of millions of dollars,' though he could not provide an exact figure.

'It's a tough situation for faculty,' said Goldman. 'AI is super important but it has come up time and time again: How do you use AI in the classroom while still ensuring that students, who are still developing critical thinking skills, aren't just using it as a crutch?' One concern is that faculty could lose control over how AI is used in their classrooms, she added.

The K-12 system and Cal State University system are forming their own tech deals. Amy Bentley-Smith, a spokesperson for the Cal State system, said it is working on its own AI programs with Google, Microsoft, Adobe and IBM, as well as Amazon Web Services, Intel, LinkedIn, OpenAI and others. Angela Musallam, a spokesperson for the state government operations agency, said California high schools are part of the deal with Adobe, which aims to promote 'AI literacy,' the idea that students and teachers should have basic skills to detect and use artificial intelligence. Musallam said that, much like in the community college system, which is governed by local districts, individual K-12 districts would need to approve any deal.

Will deals make a difference to students, teachers?

Experts say it's too early to tell how effective AI training will actually be. Justin Reich, an associate professor at MIT, said a similar frenzy took place 20 years ago when teachers tried to teach computer literacy. 'We do not know what AI literacy is, how to use it, and how to teach with it. And we probably won't for many years,' Reich said. The state's new deals with Google, Microsoft, Adobe and IBM allow these tech companies to recruit new users — a benefit for the companies — but the actual lessons aren't time-tested, he said. 'Tech companies say: 'These tools can save teachers time,' but the track record is really bad,' said Reich. 'You cannot ask schools to do more right now. They are maxed out.'

Erin Mote, the CEO of an education nonprofit called InnovateEDU, said she agrees that state and education leaders need to ask critical questions about the efficacy of the tools that tech companies offer, but that schools still have an imperative to act. 'There are a lot of rungs on the career ladder that are disappearing,' she said. 'The biggest mistake we could make as educators is to wait and pause.'

Last year, the California Community Colleges Chancellor's Office signed an agreement with NVIDIA, a technology infrastructure company, to offer AI training similar to the kinds of lessons that Google, Microsoft, Adobe and IBM will deliver. Melissa Villarin, a spokesperson for the chancellor's office, said the state won't share data about how the NVIDIA program is going because the cohort of teachers involved is still too small.

___

This story was originally published by CalMatters and distributed through a partnership with The Associated Press.
