Qualcomm AI Engine in Snapdragon 8 Elite Gen 5
Honoree · Artificial Intelligence · On-Device AI · Snapdragon · AI Engine · Machine Learning · Edge Computing · Qualcomm


Qualcomm Technologies, Inc.

One-Line Product Definition

Next-generation AI engine for smartphones: an AI acceleration hardware and software platform embedded in the Snapdragon 8 Elite Gen 5 mobile chipset, enabling on-device personalized AI experiences such as an in-device AI assistant that understands user needs and performs tasks automatically. It delivers 37% higher NPU performance than the previous generation and ultra-low-latency AI processing via a CPU with dedicated matrix-operation acceleration.

Problem Definition

Users want AI features such as voice assistants, camera AI, and translation on their smartphones, but most of these rely on the cloud, raising concerns about response latency and privacy.

For example, an AI assistant that helps with scheduling or message replies must send data to a server for processing, which raises privacy concerns, and it becomes unusable without an internet connection. It has also been difficult to run large language models like ChatGPT on phones due to insufficient computing power.

In other words, the challenge was to run rich AI functions within the limited compute and battery budget of mobile devices. Previous-generation chips also had NPUs (neural processing units), but they were insufficient for driving large models or handling multiple AI tasks simultaneously, and collaboration between the CPU/GPU and AI cores was not optimized, leaving heterogeneous computing underutilized.

As a result, smartphones have been limited to simple AI processing and have failed to deliver user-customized AI assistants or instant generative AI.

Key Differentiators

Qualcomm's new AI Engine is designed to comprehensively utilize the AI capabilities of the various cores within the SoC, such as the CPU, NPU, and DSP.

First, the Hexagon NPU has been upgraded to run 37% faster than the previous generation while running larger models on less power. In addition, the custom Oryon CPU core is the first to include a built-in matrix-operation acceleration unit, allowing the CPU itself to perform AI matrix calculations (e.g., multiply-accumulate) quickly. Because the CPU, normally responsible for running apps, can directly assist with AI model inference, deep-learning inference latency drops significantly.
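The multiply-accumulate (MAC) operation mentioned above is the basic step a matrix acceleration unit executes in hardware. As a purely illustrative sketch (not Qualcomm's implementation), the inner MAC loop of a matrix multiply looks like this:

```python
# A minimal sketch of the multiply-accumulate (MAC) pattern that
# matrix acceleration units execute in hardware. Purely illustrative
# Python, not Qualcomm's implementation.

def matmul_mac(a, b):
    """Multiply two matrices via explicit multiply-accumulate steps."""
    rows, inner, cols = len(a), len(b), len(b[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            acc = 0.0
            for k in range(inner):
                acc += a[i][k] * b[k][j]  # one MAC: multiply, then accumulate
            out[i][j] = acc
    return out
```

A dedicated matrix unit performs many of these MACs per clock cycle in parallel, which is why moving them off the scalar CPU pipeline cuts inference latency.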

By combining the strengths of heterogeneous cores according to the situation, the engine enables ultra-low-latency, on-demand responses for any AI task. As a result, an always-on LLM (large language model) becomes possible on the smartphone itself, enabling a true agent-style AI assistant experience in which the device continuously learns about and understands the user and performs tasks across apps automatically according to context.

For example, the phone's AI learns the user's schedule, messages, and location, and opens a meeting link ahead of a meeting, or instantly edits a freshly taken photo in personal-preference styles stored on the device. All of this personalized AI is processed on the device itself, so personal data is never sent externally, protecting privacy, and it works even offline.

In addition, the AI Engine operates with improved power management, raising overall performance per watt and minimizing the battery impact of always-on AI tasks (e.g., always-listening triggers).

In conclusion, the Qualcomm AI Engine leads competitors in heterogeneous-computing optimization and continuous AI operation, and it is the key to realizing the AI experiences long imagined for mobile (generative AI, real-time translation subtitles, personal assistants, and more).

Key Adopters

Smartphone OEMs (B2B) will purchase the Snapdragon 8 Elite Gen 5 chipset, which includes this AI Engine, and build it into their next flagship phones. Major Android smartphone makers such as Samsung Electronics, Xiaomi, and Oppo are the primary customers.

In addition, XR (extended reality) headset makers and in-vehicle infotainment system suppliers can adopt this chipset (or derivative versions), so those companies are also potential customers.

General consumers do not buy this engine directly, but they benefit by purchasing devices that contain it, so the perceived B2C effect is significant. However, since commercial purchasing decisions are made by smartphone/device makers, it is fundamentally a B2B2C structure.

Scalability

The mobile AP market is global in scale, and Qualcomm leads the top-tier Android chipset sector, so the scope of application is wide.

This AI Engine will be extended to other product lines such as mid-range chips and automotive platforms, making it applicable to IoT devices and laptops as well as smartphones. Given the clear on-device AI trend, it could also expand to smart TVs and home appliances in the future.

There are no regional restrictions; on the contrary, as regulations in each country increasingly emphasize personal-data protection, on-device AI is preferred over cloud AI, which further highlights the strengths of this technology. It is expected to be strengthened continuously in future chips and to become a standard feature across the entire Qualcomm platform.

Judges' Evaluation

The CES Innovation Award in the AI category reflects recognition for *"injecting new AI intelligence into smartphones"*.

Industry experts are paying particular attention to the "always-on LLM" concept, seeing the ability to run very large language models on small devices as a major leap forward. At the Snapdragon announcement, demonstrations of running a roughly 1-billion-parameter LLM entirely on the device became a hot topic, raising expectations of *"an era where ChatGPT-class models run directly on mobile devices"*.

In addition, matrix acceleration in the CPU is a first in the ARM ecosystem, and outlets such as AnandTech have described it as a "paradigm shift in mobile CPUs." With phones built on this engine certain to ship in volume in 2026, consumer expectations are high.

Users hope to generate images quickly and get real-time interpretation without the cloud, and manufacturers plan to use the engine actively as a point of differentiation. Although it is still early, open questions remain, such as how large an LLM can actually run with usable performance and how much software optimization OEMs will deliver; overall, however, the prevailing view is that this is genuine technological progress rather than hype.

Ultimately, real-world perceived experience matters more to the public than specifications, so the polish of the AI features in phones that ship with this engine will determine its evaluation.

Analyst Insights

🔥 High marketability / business-connection potential – By implementing the on-device AI that will be central to next-generation smartphone differentiation, it delivers clear value to both device makers and consumers. It is a technological engine that will be adopted across a wide range of devices, from mobile to XR, driving Qualcomm's business growth.

The award list data is based on the official CES 2026 website, and detailed analysis content is produced by USLab.ai. For content modification requests or inquiries, please contact contact@uslab.ai. Free to use with source attribution (USLab.ai) (CC BY)
