{"id":30031,"date":"2025-03-04T18:33:24","date_gmt":"2025-03-04T10:33:24","guid":{"rendered":"https:\/\/www.1ai.net\/?p=30031"},"modified":"2025-03-04T18:33:24","modified_gmt":"2025-03-04T10:33:24","slug":"%e5%be%ae%e8%bd%af%e6%8b%a5%e6%8a%b1-deepseek%ef%bc%8ccopilot-pc-%e6%9c%ac%e5%9c%b0%e8%bf%90%e8%a1%8c-7b-%e5%92%8c-14b-%e6%a8%a1%e5%9e%8b","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/30031.html","title":{"rendered":"Microsoft Embraces DeepSeek, Copilot+ PCs Run 7B and 14B Models Locally"},"content":{"rendered":"<p>March 4 News.<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%be%ae%e8%bd%af\" title=\"[View articles tagged with [Microsoft]]\" target=\"_blank\" >Microsoft<\/a>today announced the availability of the Azure AI Foundry to access the <a href=\"https:\/\/www.1ai.net\/en\/tag\/deepseek\" title=\"[View articles tagged with [DeepSeek]]\" target=\"_blank\" >DeepSeek<\/a>-R1 7B and 14B distillation models.<strong>because of <a href=\"https:\/\/www.1ai.net\/en\/tag\/copilot\" title=\"[View articles tagged with [Copilot]]\" target=\"_blank\" >Copilot<\/a>+ PC provides the ability to run 7B and 14B models locally<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-30032\" title=\"883c8c0aj00sslhao00k5d000v900hhp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/03\/883c8c0aj00sslhao00k5d000v900hhp.jpg\" alt=\"883c8c0aj00sslhao00k5d000v900hhp\" width=\"1125\" height=\"629\" \/><\/p>\n<p>Back in January, Microsoft announced plans to bring an NPU-optimized version of the DeepSeek-R1 model directly to Copilot+ PCs with Qualcomm Snapdragon X processors. 
Now, that promise has finally been realized.<\/p>\n<p>1AI learned from the official Microsoft blog that <strong>the models will roll out first to Copilot+ PCs with Qualcomm Snapdragon X processors<\/strong>, followed by Intel Core Ultra 200V and AMD Ryzen devices.<\/p>\n<p>Because the models run on the NPU, AI compute is continuously available with minimal impact on battery life and thermal performance, leaving the CPU and GPU free for other tasks.<\/p>\n<p>Microsoft emphasized that it used Aqua, its internal automatic quantization tool, to quantize all DeepSeek model variants to int4 weights. Unfortunately, token throughput is currently quite low: Microsoft reported only 8 tok\/sec for the 14B model, while the 1.5B model reached nearly 40 tok\/sec. Microsoft said it is working on further optimizations to improve these speeds.<\/p>\n<p>Developers can download and run the 1.5B, 7B, and 14B versions of the DeepSeek models on Copilot+ PCs via the AI Toolkit VS Code extension.<\/p>","protected":false},"excerpt":{"rendered":"<p>March 4 - Microsoft today announced that it is providing Copilot+ PCs with the ability to run 7B and 14B models locally by accessing the DeepSeek-R1 7B and 14B distillation models through Azure AI Foundry. Back in January, Microsoft announced plans to bring NPU-optimized versions of the DeepSeek-R1 models directly to Copilot+ PCs powered by Qualcomm Snapdragon X processors. Now, that promise has finally been realized. 
1AI has learned from Microsoft's official blog that the models will go live starting with Qualcomm Snapdragon X processor-powered Copilot+ PCs, followed by Intel Core Ultra 200V and<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[442,3606,280],"collection":[],"class_list":["post-30031","post","type-post","status-publish","format-standard","hentry","category-news","tag-copilot","tag-deepseek","tag-280"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/30031","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=30031"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/30031\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=30031"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=30031"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=30031"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=30031"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}