{"id":15812,"date":"2024-07-18T08:51:06","date_gmt":"2024-07-18T00:51:06","guid":{"rendered":"https:\/\/www.1ai.net\/?p=15812"},"modified":"2024-07-18T08:51:06","modified_gmt":"2024-07-18T00:51:06","slug":"%e5%8f%af%e5%9c%a8%e6%89%8b%e6%9c%ba%e8%bf%90%e8%a1%8chugging-face%e6%8e%a8%e5%b0%8f%e8%af%ad%e8%a8%80%e6%a8%a1%e5%9e%8bsmollm-%e4%bd%8e%e5%8f%82%e6%95%b0%e8%a1%a8%e7%8e%b0%e4%bc%98%e7%a7%80","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/15812.html","title":{"rendered":"Can run on mobile phones! Hugging Face launches small language model SmolLM with low parameters and excellent performance"},"content":{"rendered":"<p data-pm-slice=\"1 1 []\">Recently, <a href=\"https:\/\/www.1ai.net\/en\/tag\/hugging-face\" title=\"[See articles with [Hugging Face] label]\" target=\"_blank\" >Hugging Face<\/a> launched a new <a href=\"https:\/\/www.1ai.net\/en\/tag\/ai%e5%b7%a5%e5%85%b7\" title=\"[See articles with [AI tool] label]\" target=\"_blank\" >AI tool<\/a>\u2014\u2014<a href=\"https:\/\/www.1ai.net\/en\/tag\/smollm\" title=\"[See articles with [SmolLM] label]\" target=\"_blank\" >SmolLM<\/a>, a series of high-performance small language models ranging from 135M to 1.7B parameters, designed for a wide variety of devices and applications. These small models can run efficiently even on mobile phones and laptops.<\/p>\n<p data-track=\"22\">The SmolLM models are small but powerful: they perform well with limited computing resources, and running locally helps users protect their privacy. Hugging Face trained these models on a carefully curated dataset called SmolLM-Corpus, which combines rich educational and synthetic data so that the models learn a broad range of knowledge.<\/p>\n<p data-track=\"23\">Specifically, SmolLM comes in three sizes: 135M, 360M and 1.7B parameters. These models can not only handle a variety of tasks, but can also be chosen to match the user&#039;s hardware configuration. 
For example, the SmolLM-135M model outperforms many comparable models, leading the field among models with fewer than 200M parameters.<\/p>\n<div class=\"pgc-img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-15813\" title=\"get-536\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/07\/get-536.jpg\" alt=\"get-536\" width=\"721\" height=\"400\" \/><\/div>\n<p data-track=\"25\">The SmolLM models were evaluated on a range of benchmarks testing commonsense reasoning and world knowledge, and they showed impressive results, outperforming other models in their respective size categories. For example, despite being trained on fewer tokens, SmolLM-135M outperformed MobileLM-125M, the current best model with fewer than 200M parameters. Similarly, SmolLM-360M and SmolLM-1.7B outperformed all other models with fewer than 500M and 2B parameters, respectively.<\/p>\n<p data-track=\"26\">Beyond raw benchmark performance, SmolLM has also been specially tuned to better follow instructions and answer questions. Hugging Face also provides a WebGPU demo so that everyone can try these models directly.<\/p>\n<p data-track=\"27\">The release of SmolLM demonstrates that even small models can achieve impressive performance when trained on high-quality data.<\/p>","protected":false},"excerpt":{"rendered":"<p>Hugging Face recently launched a new AI tool, SmolLM: a series of high-performance small language models with parameters ranging from 135M to 1.7B, designed for a wide range of devices and applications. These small models can run efficiently on cell phones and laptops. SmolLM models are characterized by being small and powerful. 
They perform well even with limited computational resources, which helps users protect their privacy. Hugging Face trained these models on a carefully selected dataset called SmolLM-Corpus, which contains rich educational and synthetic data to ensure that the models learn a wide range of knowledge. Specifically, SmolLM has<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[387,384,3566,3565],"collection":[],"class_list":["post-15812","post","type-post","status-publish","format-standard","hentry","category-news","tag-ai","tag-hugging-face","tag-smollm","tag-3565"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15812","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=15812"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/15812\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=15812"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=15812"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=15812"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=15812"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}