AI has been the buzzword on everyone’s lips for the last few years, ever since generative AI took off with the launch of ChatGPT. Over time, generative AI models have matured and become far more capable, but the trend we’ve been noticing lately is local compute: data is precious, and nobody wants to hand it over to big corporations to train their models.
Therefore, powerful hardware that can run these models and AI features natively is the need of the hour. Consider Apple Intelligence: it runs only on the iPhone 15 Pro and on Macs and iPads with M-series chips. Similarly, only Google’s Pixel 8 Pro gets the top-end Gemini features. With this context in mind, it is manufacturers like Intel that are bridging the gap at the other end of the spectrum, in the personal computing space.
Think about it: AI features on smartphones are exciting, but they remain a niche. Reaching a wider user base will happen through enterprise-grade use cases. The real benefits of AI will only be realised when a large segment of the working population uses it daily, and that will only happen when capable hardware becomes accessible to them. This is where brands like Intel come in, making competent hardware accessible and enabling “on-device” AI compute.
What Are AI PCs, and How Can They Make Compute Accessible?
It’s simple, really. We’ve grown up hearing about CPUs (central processing units) and GPUs (graphics processing units), but a third component has recently made its way into chipsets: the NPU, short for Neural Processing Unit. It handles machine learning and lighter AI tasks locally on the system, instead of first sending data to online servers. We have already seen how NPUs power on-device features on smartphones. Now, with a renewed focus on PCs, we will gradually see on-device AI features become more common on AI PCs from brands like Intel.
At the recently concluded Computex in Taipei, we saw how chipmakers are working towards building efficient processors to handle increased workloads in data centres worldwide. Intel, in particular, launched the Xeon 6 processors, which deliver much higher performance per watt; in the long term, this will make compute cheaper. On the consumer side, the Lunar Lake chipsets also focus on better efficiency. These new Lunar Lake chips feature the fourth-generation Intel NPU, rated at 48 tera-operations per second (TOPS), up to four times the AI compute of the previous generation.
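To make the idea of on-device inference a little more concrete, here is a minimal sketch using Intel’s open-source OpenVINO toolkit, which can target the NPU on recent Intel chips. The model file name and the presence of an “NPU” device are assumptions for illustration; the actual devices exposed depend on your hardware and drivers.

```python
# Minimal sketch: checking for an NPU and compiling a model to run on it with OpenVINO.
# Assumes OpenVINO is installed (`pip install openvino`) and that "model.xml" is a
# placeholder for an OpenVINO IR model you already have; the "NPU" device only shows up
# on supported Intel chips with the right drivers.
import openvino as ov

core = ov.Core()
print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU']

# Prefer the NPU when present so light AI tasks stay on-device; fall back to the CPU otherwise.
device = "NPU" if "NPU" in core.available_devices else "CPU"
compiled_model = core.compile_model("model.xml", device_name=device)
print(f"Model compiled for {device}; inference runs locally, so data never leaves the machine.")
```

The point of the sketch is simply that the same model can be routed to whichever accelerator the PC exposes, which is what lets these AI features run without an internet connection.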
AI-enabled PCs Can Change the Way Enterprises Work
According to Statista, Windows PCs still hold a 72.17% share of the desktop market. That alone highlights how central these machines are to everyday computing: they’re handed to new employees in corporate settings and found in offices, libraries, and many other places. This is exactly where enhanced, on-device AI computing will be most beneficial.
Moreover, it’s not just about hardware; collaboration with developers is equally important, because hardware without use cases built for it doesn’t accomplish much. Here too, brands like Intel are working with developers to bring various features and experiences to AI PCs.
It’s about the simple tasks that eat up hours of a worker’s day: summarising and drafting emails, removing unwanted objects from photos, filtering through data. With local compute and easy-to-access software features, handling these on-device can make the workforce more efficient with its time and, of course, help people achieve the elusive work-life balance everyone craves.
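As a rough illustration of how one of these features can run entirely on local hardware, here is a small sketch of on-device email summarisation using the open-source Hugging Face transformers library. The specific model named below is just one publicly available example chosen for illustration, not the model any particular AI PC feature actually ships with.

```python
# Sketch: summarising an email thread locally with a small open-source model.
# Assumes `pip install transformers torch`; the model is downloaded once and then
# runs entirely on the local machine, so the email text never leaves the device.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

email_thread = (
    "Hi team, following up on yesterday's review: the vendor has confirmed the new "
    "delivery date of 14 August, finance has approved the revised budget, and we still "
    "need a volunteer to update the rollout checklist before Friday's stand-up."
)

summary = summarizer(email_thread, max_length=40, min_length=10, do_sample=False)
print(summary[0]["summary_text"])
```

Small models like this are exactly the kind of workload an NPU is designed to offload, keeping the CPU free and the data on the device.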