Qualcomm president and CEO Cristiano Amon speaks at a news conference during CES 2022 in Las Vegas, Nevada, U.S. January 4, 2022.
Steve Marcus | Reuters
Qualcomm on Tuesday announced two new chips that are designed to run AI software — including the large language models, or LLMs, that have captivated the technology industry — without having to connect to the internet.
Interest in AI applications has exploded since the Stable Diffusion image generator and OpenAI’s ChatGPT were released in 2022. Both so-called “generative AI” applications require a lot of processing power, and so far they’ve primarily been run in the cloud on powerful, power-hungry Nvidia graphics processors.
The new silicon includes Qualcomm’s Snapdragon X Elite chip for PCs and laptops and the Snapdragon 8 Gen 3 for high-end Android phones.
How quickly a smartphone chip can run AI models could become a new feature battleground between high-end Android phones from companies such as Asus, Sony and OnePlus and Apple’s iPhones, which also gain new AI features every year.
Qualcomm’s latest Snapdragon chip can perform AI tasks significantly faster than last year’s processor, cutting the time to generate an image from 15 seconds to less than a second, a Qualcomm executive said in an interview.
“If someone would go to buy a phone today, they would say, how fast is the CPU, how much memory do I get on this thing? What does the camera look like?” said Alex Katouzian, Qualcomm senior vice president for mobile. “Over the next two or three years, people are going to say, what kind of AI capability am I going to have on this?”
Generative AI in your pocket
The AI boom has boosted Nvidia’s stock but has largely bypassed Qualcomm, even though its smartphone chips ship in huge volumes and have included dedicated AI processors, called NPUs (neural processing units), since 2018.
Qualcomm’s NPUs have been used to improve photos and other features. Now, Qualcomm said, its smartphone chip can handle the bigger AI models used in generative AI — as many as 10 billion parameters. That’s still far fewer than the largest models, such as OpenAI’s GPT-3, which has about 175 billion parameters.
Qualcomm executives said these kinds of AI models can run on devices if the chips are fast enough and equipped with enough memory. They said it makes more sense to run language models locally rather than in the cloud because it can be faster and is more private. Qualcomm said its chips can run a version of Meta‘s Llama 2 model and that it expects its clients — smartphone makers — to develop their own models, too. Qualcomm is also developing its own AI models, it said.
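For a sense of scale, here is a rough back-of-envelope sketch (illustrative arithmetic, not Qualcomm’s figures) of why a model with roughly 10 billion parameters is plausible on a phone while a 175-billion-parameter model is not: the memory needed just to hold the weights depends on the number of parameters and the numeric precision they are stored at.

```python
# Rough, illustrative arithmetic (not Qualcomm's figures): estimate how much
# memory a language model needs just to hold its weights at different
# numeric precisions.

GIB = 1024 ** 3

def weight_memory_gib(num_params: float, bytes_per_param: float) -> float:
    """Memory needed to store the model weights alone, in GiB."""
    return num_params * bytes_per_param / GIB

for name, params in [("~10B on-device model", 10e9), ("GPT-3 (~175B)", 175e9)]:
    for precision, nbytes in [("16-bit", 2.0), ("8-bit", 1.0), ("4-bit", 0.5)]:
        print(f"{name}, {precision}: {weight_memory_gib(params, nbytes):.1f} GiB")

# A 10-billion-parameter model quantized to 4 bits is on the order of 5 GiB
# of weights -- within reach of a flagship phone with 12-16 GiB of RAM --
# while a 175-billion-parameter model needs tens of GiB even when heavily
# quantized, which is why the largest models stay in the cloud.
```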
Qualcomm provided an example of a device running the freely available Stable Diffusion model, which generates images from a text prompt. It also demonstrated a related ability to expand or fill in parts of photos using AI.
Last year’s Qualcomm chip, the Snapdragon 8 Gen 2, ran the same model successfully, but it took 15 seconds to crunch the numbers and create an image of, say, a cat at the beach. This year’s chip can do it in half a second, which Katouzian said could greatly improve responsiveness for AI applications such as personal assistants.
Future applications, such as a personal voice assistant, could handle easy queries with an AI model running on the device’s own chip and send tougher questions to more powerful computers in the cloud, Qualcomm said.
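A minimal sketch of that hybrid pattern might look like the following. The models, the routing heuristic and the cloud endpoint here are hypothetical placeholders for illustration, not Qualcomm’s or Microsoft’s actual software.

```python
# Hypothetical sketch of hybrid AI routing: answer easy queries with a small
# model on the device, hand harder ones to a larger model in the cloud.

def run_on_device(prompt: str) -> str:
    # Placeholder for a small quantized model executing on the phone's NPU.
    return f"[on-device answer to: {prompt}]"

def run_in_cloud(prompt: str) -> str:
    # Placeholder for a request to a larger model hosted on a cloud service.
    return f"[cloud answer to: {prompt}]"

def is_simple(prompt: str) -> bool:
    # Toy heuristic: treat short prompts as "easy." A real assistant would
    # use a smarter router (intent classification, confidence scores, etc.).
    return len(prompt.split()) < 20

def answer(prompt: str) -> str:
    return run_on_device(prompt) if is_simple(prompt) else run_in_cloud(prompt)

print(answer("What time is my first meeting tomorrow?"))
print(answer("Summarize these three research papers and draft a reply to my "
             "manager explaining how their findings affect our Q3 roadmap."))
```

The appeal of this split, as Qualcomm describes it, is that on-device answers are faster and more private, while cloud requests are reserved for work the phone can’t handle.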
Qualcomm said this hybrid approach is one reason it works closely with Microsoft to make sure its chips are optimized for AI software.
“The more of these devices get used on the edge for running AI capabilities, the less money they have to spend on the Azure cloud, running super expensive inference capabilities,” Katouzian said. “All of that stuff can get offloaded. A hybrid situation where an offload of the cloud onto the client devices at the edge gives them a huge advantage.”
This year’s top smartphone chip from Qualcomm, the Snapdragon 8 Gen 3, will start appearing early next year in “premium” Android devices costing more than $500 from brands including Asus, Sony and OnePlus, Qualcomm said. Features from the high-end chip eventually trickle down to less expensive devices.
Qualcomm X Elite
Qualcomm’s new PC chip, the X Elite, is based on the Arm architecture and will compete with Intel’s x86 chips in laptops and desktops.
It uses technology from Qualcomm’s acquisition of Nuvia, a startup founded by former Apple chip engineers, and that deal is at the heart of a legal dispute with Arm. Laptops based on the chip, which uses CPU cores Qualcomm calls Oryon, are expected to hit the market in the middle of next year. Qualcomm said the chip beats Apple’s M2 Max in performance while using less power.