TechTalk Daily
By: Daniel W. Rasmus for Serious Insights
A few years ago, I wrote a white paper for Cisco on the post-PC world. We haven’t arrived there yet. The personal computer market continues to evolve. Artificial intelligence is on the cusp of driving the next transformative shift, and that shift requires more processing power than a tablet or mobile phone can provide, though those devices already combine their own onboard AI processing with the cloud to bring AI into the mix.
For major AI workloads, however, the PC will become a necessary component: it will host local, private AI and serve as the portal into enterprise and cloud AI systems, likely leveraging onboard capabilities for pre- and post-processing and for some handoffs during an agentic (I hate that word) interaction.
AI-powered PCs will likely redefine how we interact with technology, unlocking new levels of performance, efficiency, and personalization. Don’t get hung up on the Copilot key; for the most part, Lenovo didn’t. AI PCs aren’t about anything on the outside; they are all about the new chips and memory on the inside.
At the heart of AI-enabled PCs lies the Neural Processing Unit (NPU), a specialized processor designed to handle the intensive mathematical computations required by AI algorithms. Traditional CPUs and even GPUs have limits in efficiently processing AI workloads, such as deep learning inference, natural language processing, and computer vision. NPUs, however, are designed specifically for these tasks, enabling real-time AI-driven applications to run locally.
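Much of an NPU's efficiency comes from running dense linear algebra in low precision. The toy sketch below (my own illustration, not vendor code) quantizes float32 weights and activations to int8, performs the matrix multiply with integer accumulation, and dequantizes the result. This is the core arithmetic pattern that NPU multiply-accumulate arrays are built to execute far more efficiently than a general-purpose CPU.

```python
import numpy as np

def quantize_int8(x: np.ndarray):
    """Symmetric per-tensor quantization: float32 -> (int8 values, scale)."""
    max_abs = np.abs(x).max()
    scale = max_abs / 127.0 if max_abs > 0 else 1.0
    q = np.clip(np.round(x / scale), -127, 127).astype(np.int8)
    return q, scale

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 4)).astype(np.float32)
inputs = rng.standard_normal((4, 4)).astype(np.float32)

qw, sw = quantize_int8(weights)
qx, sx = quantize_int8(inputs)

# Integer matmul with int32 accumulation (as NPU MAC arrays do), then rescale.
int_result = qx.astype(np.int32) @ qw.astype(np.int32)
approx = int_result.astype(np.float32) * (sx * sw)

# Compare against the full-precision result; the gap is the quantization error.
exact = inputs @ weights
print(np.max(np.abs(approx - exact)))
```

The trade-off shown here, a small accuracy loss for integer arithmetic, is what lets NPUs deliver inference at a fraction of the power cost.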
For example, Qualcomm’s Snapdragon processors, featured in the latest generation of AI-powered laptops, include built-in NPUs that accelerate tasks like voice recognition, real-time translation, and adaptive power management. Similarly, Apple’s M1 and M2 chips include a dedicated Neural Engine that delivers machine learning performance at a fraction of the power cost of previous architectures. As these NPUs proliferate, PCs are moving beyond passive devices toward predictive and adaptive computing.
Most AI PCs will ship not just with the emerging NPUs but also with beefed-up GPUs, some perhaps even discrete GPUs, the hardware that served AI workloads before Apple, Intel, Qualcomm, and others rolled out NPU-equipped chips. This will make the new devices even more capable of handling not only verbal intelligence but visual intelligence as well.
One of the key advantages of AI PCs is their ability to process AI tasks locally without relying on cloud infrastructure. This shift offers several benefits, including stronger privacy (sensitive data never leaves the device), lower latency, and the ability to keep working when connectivity drops.
Microsoft has already begun integrating AI capabilities into its operating system with Windows 11, where AI features like automatic window management, background blur in video calls, and personalized system settings run seamlessly with built-in NPUs.
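To make the local-versus-cloud trade-off concrete, here is a hypothetical sketch of how an application might route an AI task between an on-device model and a cloud service. The policy inputs (a privacy flag, prompt size, connectivity) and all names here are my own illustrative assumptions, not any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class Task:
    prompt: str
    contains_private_data: bool = False

# Assumed capacity of a small on-device model (illustrative number).
LOCAL_TOKEN_LIMIT = 512

def route(task: Task, online: bool) -> str:
    """Decide whether a task runs on the local NPU or in the cloud."""
    if task.contains_private_data:
        return "local"   # keep sensitive data on the device
    if not online:
        return "local"   # no connectivity: degrade gracefully
    if len(task.prompt.split()) > LOCAL_TOKEN_LIMIT:
        return "cloud"   # too large for the on-device model
    return "local"       # default: lower latency, no data egress

print(route(Task("summarize my notes", contains_private_data=True), online=True))
```

A real system would weigh battery state, model quality, and cost as well, but the pattern is the same: the PC's onboard AI handles what it can, and the cloud handles the rest.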
It is important to differentiate this class of AI, which I think of as autonomous intelligence, from generative AI and other forms of AI, such as rule-based AI and machine learning. These PC functions are the equivalent of biological functions that don’t require conscious thought: you don’t need to think to breathe or blink your eyes, but your brain still controls those functions.
Read the rest of the article to learn more about AI PCs and what comes next here: AI PCs: First Looks at 5 Laptops that Redefine the Future of Personal Computing
About the Author:
Daniel W. Rasmus, the author of Listening to the Future, is a strategist and industry analyst who has helped clients put their future in context. Rasmus uses scenarios to analyze trends in society, technology, economics, the environment, and politics in order to discover implications used to develop and refine products, services, and experiences. He leverages this work and methodology for content development, workshops, and professional development.
Interested in AI? Check here to see what TechTalk AI Impact events are happening in your area. Also, be sure to check out all upcoming TechTalk Events here.