{"id":39605,"date":"2025-07-16T20:45:07","date_gmt":"2025-07-16T12:45:07","guid":{"rendered":"https:\/\/www.1ai.net\/?p=39605"},"modified":"2025-07-16T20:45:07","modified_gmt":"2025-07-16T12:45:07","slug":"mistral-%e6%8e%a8%e5%87%ba-voxtral-%e7%b3%bb%e5%88%97%e8%af%ad%e9%9f%b3%e7%90%86%e8%a7%a3%e6%a8%a1%e5%9e%8b%ef%bc%9a%e4%bb%a5%e5%bc%80%e6%ba%90%e5%bd%a2%e5%bc%8f%e6%8f%90%e4%be%9b%e5%87%ba%e8%89%b2","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/39605.html","title":{"rendered":"Mistral Launches Voxtral Series of Speech Understanding Models: Delivering Outstanding Accuracy in an Open-Source Format"},"content":{"rendered":"<p>July 16 news: <a href=\"https:\/\/www.1ai.net\/en\/tag\/mistral\" title=\"View articles tagged Mistral\" target=\"_blank\" >Mistral<\/a> AI yesterday (local time) announced its <a href=\"https:\/\/www.1ai.net\/en\/tag\/voxtral\" title=\"View articles tagged Voxtral\" target=\"_blank\" >Voxtral<\/a> family of <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e8%af%ad%e9%9f%b3%e7%90%86%e8%a7%a3%e6%a8%a1%e5%9e%8b\" title=\"View articles tagged Speech Understanding Model\" target=\"_blank\" >speech understanding models<\/a>. 
The series is released as <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"View articles tagged open source\" target=\"_blank\" >open source<\/a> and delivers leading low-error-rate performance at a lower price point, <strong>supporting real-world speech intelligence applications in production<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-39606\" title=\"2e3f06e2j00szhsq4002td000v900gup\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/07\/2e3f06e2j00szhsq4002td000v900gup.jpg\" alt=\"2e3f06e2j00szhsq4002td000v900gup\" width=\"1125\" height=\"606\" \/><\/p>\n<p>The Voxtral family of models, derived from Mistral Small 3.1, includes Voxtral Small, a 24B version for production applications; Voxtral Mini, a 3B version for local\/edge deployments; and Voxtral Mini Transcribe, a speech-to-text-only model.<\/p>\n<p>The models <strong>support a 32K-token context length and can handle 30 minutes of audio for transcription or 40 minutes for comprehension<\/strong>. They have built-in abilities to generate relevant questions and structured summaries, and support Indo-European languages such as English, Spanish, French, Portuguese, Hindi, German, Dutch, and Italian.<\/p>\n<p>Mistral AI claims that Voxtral Mini Transcribe outperforms OpenAI Whisper in cost-sensitive use cases at less than half the price, and that in advanced use cases Voxtral Small approaches ElevenLabs Scribe, the top model in the field, also at less than half the price.<\/p>","protected":false},"excerpt":{"rendered":"<p>July 16, 2025 news: Mistral AI announced its Voxtral speech understanding models yesterday, local time. The family of models, available as open source, provides leading low-error-rate performance at a lower price point to support real-world speech intelligence applications in production. 
The Voxtral family of models, derived from Mistral Small 3.1, includes Voxtral Small, a 24B version for production applications; Voxtral Mini, a 3B version for local\/edge deployments; and Voxtral Mini Transcribe, a speech-to-text-only model. The models support a 32K-token context length<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[559,7212,219,7211],"collection":[],"class_list":["post-39605","post","type-post","status-publish","format-standard","hentry","category-news","tag-mistral","tag-voxtral","tag-219","tag-7211"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/39605","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=39605"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/39605\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=39605"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=39605"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=39605"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=39605"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}