{"id":11451,"date":"2024-05-28T09:26:55","date_gmt":"2024-05-28T01:26:55","guid":{"rendered":"https:\/\/www.1ai.net\/?p=11451"},"modified":"2024-05-28T09:26:55","modified_gmt":"2024-05-28T01:26:55","slug":"%e6%96%87%e5%ad%97%e7%94%9f%e6%88%90%e6%89%8b%e8%af%ad%e8%a7%86%e9%a2%91%e5%a4%a7%e6%a8%a1%e5%9e%8bsignllm-%e5%b8%ae%e5%8a%a9%e5%90%ac%e9%9a%9c%e4%ba%ba%e7%be%a4%e5%ae%9e%e7%8e%b0%e6%97%a0%e9%9a%9c","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/11451.html","title":{"rendered":"SignLLM, a large model for generating sign language videos from text, helps the hearing-impaired achieve barrier-free communication"},"content":{"rendered":"<p>Recently, a product called <a href=\"https:\/\/www.1ai.net\/en\/tag\/signllm\" title=\"[See articles with [SignLLM] label]\" target=\"_blank\" >SignLLM<\/a>, a multilingual <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%89%8b%e8%af%ad\" title=\"[Sees articles with [sign language] labels]\" target=\"_blank\" >sign language<\/a> model, has attracted widespread attention. <span class=\"spamTxt\">It is said to be the first<\/span> model that can generate sign language gestures from input text.<\/p>\n<p>SignLLM uses the rich \"Prompt2Sign\" multilingual sign language dataset to ensure that the motions in the generated sign language videos are natural and consistent. In the past, sign language interpretation often required professional interpreters and was relatively inefficient. 
SignLLM provides instant, autonomous sign language conversion for people with hearing impairments, significantly improving communication efficiency and helping them integrate better into society.<\/p>\n<p class=\"article-content__img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-11452\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/05\/6385248437409435784560296.jpg\" alt=\"\" width=\"554\" height=\"254\" \/><\/p>\n<p>By generating sign language gestures directly from input text, the model departs from the traditional interpreter-based approach to sign language translation. However, some observers question whether the rendered hand movements are accurate and credible, and some users say that, because they do not know sign language, it is difficult for them to judge the model's accuracy.<\/p>\n<p>So far, SignLLM has released nine examples with accompanying links, drawing considerable attention and discussion and giving users a closer look at the model's performance. It is hoped that this technology will make sign language communication more convenient and accessible, so that more people can benefit from its convenience and diversity.<\/p>","protected":false},"excerpt":{"rendered":"<p>In recent days, a multilingual sign language model called SignLLM has attracted widespread attention. It is said to be the first model to generate sign language gestures from input text. SignLLM uses the rich \"Prompt2Sign\" multilingual sign language dataset to ensure that the motions in the generated sign language videos are natural and consistent. In the past, sign language interpretation often required professional interpreters and was relatively inefficient. 
SignLLM provides instant, autonomous sign language conversion for people with hearing impairments, significantly improving communication efficiency and helping them integrate better into society. By generating sign language gestures directly from input text, the model departs from the traditional interpreter-based approach to sign language translation. However, there are concerns as to whether the rendered hand movements are accurate and credible. One<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[2797,216,2796],"collection":[],"class_list":["post-11451","post","type-post","status-publish","format-standard","hentry","category-news","tag-signllm","tag-216","tag-2796"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/11451","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=11451"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/11451\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=11451"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=11451"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=11451"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=11451"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}