The chips running AI on your device

Emily Hempel | Artificial Intelligence, Press, Semiconductors

It was inevitable that the AI boom animating the Nvidia share price would finally find its way into your phone, laptop and PC. And as of last week, the latest AI-enabled PC (the Microsoft Surface) is available for pre-order, with an 18 June delivery date. This Surface is powered by a Qualcomm Snapdragon X ARM chip, with the same chip also appearing in units from Dell, Acer, Asus, Samsung, HP and Lenovo. The launch is a jolt of innovation likely to enrich chipmakers and PC vendors, and maybe even delight consumers.

A PC buying rush cannot come quickly enough for some players. The market has been in decline since an out-of-sequence buying surge during the Covid pandemic, with market researchers seeing the first shoots of PC growth in the final quarter of 2023, following eight consecutive quarters of decline.

The first heart-starter for the PC market is the advent of the so-called AI PC, while the second is the long-awaited Windows on Arm (WoA) laptop that looks to have the performance of Apple’s M series laptops and, more importantly, the Apple machine’s superior battery life. Beyond that, it will also be able to deal with a range of AI tasks without a need to connect to the cloud. These tasks could include simultaneous translation, picture search, speech to text and a host of others.

We covered off ARM chips earlier this month.

So what is an AI PC?

Loosely speaking, an AI PC is a laptop with a relatively new sliver of silicon attached to its CPU called a neural processing unit or NPU. In Applespeak, it’s called a neural engine. This NPU catches AI workloads that would normally bog down laptops running only a CPU.

NPUs will be put to work on tasks like querying cut-down large language models (LLMs) designed to run locally on the PC instead of in the cloud, as well as driving AI assistants (a.k.a. copilots) that will help with everything from tweaking Windows settings to working up documents, images and videos.

As suggested above, the benefits of running an LLM locally include not having to pay cloud hosting costs, improved data privacy, lower latency, and always-on capability for AI apps, since they don't have to rely on the vagaries of network connectivity and bandwidth.

NPUs are particularly good at taking the strain of localised AI inferencing work, such as translating text or generating code, while sticking to a low power budget. They will also be put to work on new Windows features like Recall, which continuously records your screen activity and lets you instantly call up half-forgotten things like "where's that email where I chatted with Bradley about roller skates?"

GPUs can also do the AI tricks that NPUs do, but using a full-blown GPU from the likes of Nvidia in an ultra-portable laptop hurts the power budget as well as adding bulk and extra expense. Mobile compute devices do have a GPU to render graphics, but they are miniaturised, resulting in lower-power, lower-performance designs. Note the Qualcomm Hexagon NPU in the schematic below.

All these parts are on the Snapdragon X Elite

Source: Qualcomm

Watch this space, though. Mediatek and Nvidia are working on a laptop chipset that employs an ARM CPU and a special, low power but high AI performance Nvidia GPU. Mediatek/Nvidia AI PC models are due out next year.

The advent of the NPU brings a new marketing metric: TOPS, or trillions of operations per second, a rough measure of AI performance.

To make matters more confusing, chipmakers quote NPU specific TOPS and total system TOPS (NPU plus CPU and GPU). For instance, the latest Hawk Point mobile CPU from Advanced Micro Devices has 16 NPU TOPS and 39 total TOPS.
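As a rough illustration of where a TOPS figure comes from (counting conventions vary by vendor, and the MAC count and clock speed below are assumptions for the sake of the arithmetic, not published specs): peak TOPS is typically the number of multiply-accumulate (MAC) units times clock frequency times two, since a MAC counts as two operations.

```python
def peak_tops(mac_units: int, clock_hz: float, ops_per_mac: int = 2) -> float:
    """Peak throughput in trillions of operations per second (TOPS).

    A multiply-accumulate (MAC) is conventionally counted as two
    operations. Vendors sometimes count differently, so treat this
    as a rule of thumb rather than a spec.
    """
    return mac_units * clock_hz * ops_per_mac / 1e12

# Hypothetical NPU: 8,192 MAC units clocked at 1 GHz lands at
# roughly the 16 NPU TOPS quoted for AMD's Hawk Point above.
print(round(peak_tops(8_192, 1.0e9), 1))  # → 16.4
```

The same arithmetic explains the two headline numbers chipmakers quote: the NPU-only figure counts just the NPU's MAC array, while "total system TOPS" adds the CPU and GPU contributions on top.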

Talk of TOPS and AI PCs brings us to Qualcomm

Microsoft is using the Copilot+ branding to distinguish Windows AI PCs from their less artificially intelligent brethren. Copilot+ PCs must have an NPU with a minimum of 40 TOPS and a special Copilot key.

For now, the Snapdragon X is the only ultra-mobile AI PC part that cuts the Copilot+ mustard. Copilot+ PCs are also required to carry a minimum of 16GB of system memory and 256GB of storage. These uplifts in PC memory are positive for Micron and Samsung, which are in portfolios we manage along with Qualcomm.
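The Copilot+ floor is simple enough to express as a spec check. The sketch below is illustrative only — the class and field names are invented for this example, not any Microsoft API — using Microsoft's published minimums of 40 NPU TOPS, 16GB of memory and 256GB of storage:

```python
from dataclasses import dataclass

@dataclass
class PCSpec:
    npu_tops: float   # NPU-only TOPS, not total system TOPS
    ram_gb: int
    storage_gb: int

def meets_copilot_plus(spec: PCSpec) -> bool:
    """Check a machine against the Copilot+ hardware minimums:
    40+ NPU TOPS, 16GB memory, 256GB storage."""
    return (spec.npu_tops >= 40
            and spec.ram_gb >= 16
            and spec.storage_gb >= 256)

# A Snapdragon X-class machine (45 NPU TOPS) clears the bar;
# AMD's Hawk Point (16 NPU TOPS) does not, whatever its RAM.
print(meets_copilot_plus(PCSpec(45, 16, 256)))  # → True
print(meets_copilot_plus(PCSpec(16, 16, 512)))  # → False
```

Note that the check uses NPU-only TOPS: a chip whose total system TOPS crosses 40 only with CPU and GPU help still misses the branding.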

Fresh silicon coming from AMD and Intel will see x86 mobile laptops with AI performance parity with Snapdragon X. The x86 AI competition is expected to be on shelves during the fourth quarter of 2024.

This gives Qualcomm and its PC maker partners three or four months of clear air to make a mark with the latest WoA PCs and lay the groundwork for a whole new PC segment as well as a fresh revenue line.

The PC industry has been hungry for some innovation – and here it comes in a double dose from ARM and AI.
