Microsoft Snapdragon X Copilot+ PCs get local DeepSeek-R1 support — Intel, AMD in the works


Microsoft just announced that it will release NPU-optimized versions of DeepSeek-R1, allowing the model to take advantage of the AI-optimized hardware found in Copilot+ PCs. According to the announcement, the feature will first arrive on Qualcomm Snapdragon X PCs, followed by Intel Core Ultra 200V (Lunar Lake) and other chips. The initial release will feature DeepSeek-R1-Distill-Qwen-1.5B, which an AI research team from UC Berkeley found to be the smallest model that still delivers correct answers; larger models with 7 billion and 14 billion parameters will arrive shortly thereafter.

DeepSeek’s optimizations mean it needs roughly 11x less compute than its Western competitors, making it a great model to run on consumer devices. The release also plugs into Windows Copilot Runtime, so developers can use on-device DeepSeek APIs within their apps.
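Microsoft hasn’t detailed the Copilot Runtime API surface here, so as a rough illustration of what local inference with the 1.5B distilled model looks like today, here is a minimal Python sketch using the openly published Hugging Face checkpoint. The transformers-based approach is an assumption for illustration only; it is not Microsoft’s on-device API.

```python
# Illustrative sketch only: this calls the openly published distilled checkpoint
# through Hugging Face transformers, NOT the Windows Copilot Runtime APIs the
# article mentions (those aren't detailed here).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# R1-style models reason step by step before giving a final answer.
messages = [{"role": "user", "content": "Explain what an NPU does in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```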

Furthermore, Microsoft claims that this NPU-optimized version of DeepSeek will deliver “very competitive time to first token and throughput rates, while minimally impacting battery life and consumption of PC resources.” This means Copilot+ PC users can expect performance comparable to competing models like Meta’s Llama 3 and OpenAI’s o1 while their devices still offer great battery life.

That said, DeepSeek’s availability on Copilot+ PCs is geared more toward programmers and developers than toward consumers. Perhaps Microsoft is using it to encourage them to build more apps that take advantage of AI PCs, since many people still don’t see the need for one and market research suggests users often buy these devices only because they’re the only option available nowadays.

Another thing that got us curious is Microsoft’s preferential treatment of Qualcomm Snapdragon X PCs at this time. While it launched the Copilot+ branding with these chips last July, the latest mainstream Intel and AMD laptops now also have built-in NPUs. AMD has even released instructions on how users can run DeepSeek on Ryzen AI CPUs and Radeon GPUs, with the company claiming that the RX 7900 XTX runs DeepSeek better than the RTX 4090.

Whatever the case, we’re still excited about the possibilities DeepSeek unlocks for AI. Since it’s open source, nearly anyone can download it and run it locally, letting others build on the advancements and optimizations of the original model.
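For anyone who wants to try that, a minimal sketch of grabbing the openly published 1.5B distilled weights for local use might look like this. The huggingface_hub package and the local directory name are assumptions for illustration, not something from the article.

```python
# Sketch: download DeepSeek's openly published distilled weights for local use.
# Assumes the huggingface_hub package; the local directory name is arbitrary.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(
    repo_id="deepseek-ai/DeepSeek-R1-Distill-Qwen-1.5B",
    local_dir="./deepseek-r1-distill-qwen-1.5b",
)
print(f"Model files downloaded to: {local_dir}")
```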




