{"id":16549,"date":"2024-07-27T09:20:01","date_gmt":"2024-07-27T01:20:01","guid":{"rendered":"https:\/\/www.1ai.net\/?p=16549"},"modified":"2024-07-27T09:20:09","modified_gmt":"2024-07-27T01:20:09","slug":"%e7%99%bd%e5%ab%96%e5%85%9a%e7%8b%82%e5%96%9c%ef%bc%81gpt-4o-mini%e5%85%8d%e8%b4%b9%e5%be%ae%e8%b0%832%e4%b8%aa%e6%9c%88%ef%bc%8c%e6%af%8f%e5%a4%a9200%e4%b8%87token%e9%9a%8f%e4%be%bf%e8%96%85","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/16549.html","title":{"rendered":"Freeloaders are delighted! GPT-4o mini fine-tuning is free for 2 months, with 2 million tokens up for grabs every day"},"content":{"rendered":"<p class=\"pgc-p\" data-track=\"12\" data-pm-slice=\"0 0 []\">While the news of Llama 3.1 going open source is still echoing, <a href=\"https:\/\/www.1ai.net\/en\/tag\/openai\" title=\"[View articles tagged with [OpenAI]]\" target=\"_blank\" >OpenAI<\/a> has stolen the spotlight: from now until September 23, it is offering 2 million training tokens per day to fine-tune its model for free. This is not only a generous gift to <a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e5%8f%91%e8%80%85\" title=\"[Sees articles with [developer] labels]\" target=\"_blank\" >developers<\/a>, but also a bold push for the advancement of AI technology.<\/p>\n<p data-track=\"19\">The debut of <a href=\"https:\/\/www.1ai.net\/en\/tag\/gpt-4o-mini\" title=\"[See article with [GPT-4o Mini] label]\" target=\"_blank\" >GPT-4o mini<\/a> has excited countless developers. It is tied for first place with GPT-4o on the LMSYS Chatbot Arena leaderboard, delivering powerful performance at only 1\/20 the price of GPT-4o. 
This move by OpenAI is undoubtedly a major boon to the AI field.<\/p>\n<div class=\"pgc-img\"><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-16551\" title=\"get-839\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/07\/get-839.jpg\" alt=\"get-839\" width=\"750\" height=\"544\" \/><\/div>\n<p data-track=\"20\">Developers who received the email excitedly spread the word: such a bargain should be grabbed as soon as possible. OpenAI announced that from July 23 to September 23, developers can use 2 million training tokens for free every day; usage beyond that is charged at US$3 per million tokens.<\/p>\n<p data-track=\"21\">More affordable: GPT-4o mini\u2019s input token fee is 90% lower than GPT-3.5 Turbo\u2019s, and its output token fee is 80% lower. Even after the free period ends, GPT-4o mini\u2019s training cost is half that of GPT-3.5 Turbo.<\/p>\n<p data-track=\"22\">Longer context: GPT-4o mini\u2019s training context length is 65k tokens, 4 times that of GPT-3.5 Turbo, and its inference context length is 128k tokens, 8 times that of GPT-3.5 Turbo.<\/p>\n<p data-track=\"23\">Smarter and more capable: GPT-4o mini is smarter than GPT-3.5 Turbo and supports vision capabilities (although fine-tuning is currently limited to text).<\/p>\n<p data-track=\"24\">GPT-4o mini fine-tuning will first be open to enterprise customers and Tier 4 and Tier 5 developers, with access gradually expanding to users at all tiers. OpenAI has released a fine-tuning guide to help developers get started quickly.<\/p>\n<p data-track=\"25\">Some netizens are skeptical, suspecting that this is OpenAI\u2019s way of collecting data to train and improve its AI models. 
Other netizens argue that GPT-4o mini\u2019s success is solid evidence that AI has become smart enough to fool even us.<\/p>\n<p data-track=\"26\">The release of GPT-4o mini and the free fine-tuning policy will undoubtedly further the development and popularization of AI technology. For developers, this is a rare opportunity to build more powerful applications at a lower cost. And for AI technology itself, does this mark a new milestone? Let us wait and see.<\/p>\n<p data-track=\"27\">Fine-tuning documentation: https:\/\/platform.openai.com\/docs\/guides\/fine-tuning\/fine-tuning-integrations<\/p>\n<p>&nbsp;<\/p>","protected":false},"excerpt":{"rendered":"<p>While the news of Llama 3.1 going open source is still echoing in our ears, OpenAI has come back to steal the show: from now until September 23, developers get 2 million training tokens per day to fine-tune the model for free. This is not only a generous offer to developers, but also a bold push for the advancement of AI technology. The debut of GPT-4o mini has thrilled countless developers. It tied for first place with GPT-4o on the LMSYS Chatbot Arena leaderboard, with strong performance at 1\/20 the price of GPT-4o. OpenAI's move is undoubtedly a major boon to the AI field. 
Developers who received the email excitedly spread the word: such a bargain should be grabbed quickly. OpenAI announced that from July 23 to September 23, developers<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[3578,190,1903],"collection":[],"class_list":["post-16549","post","type-post","status-publish","format-standard","hentry","category-news","tag-gpt-4o-mini","tag-openai","tag-1903"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/16549","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=16549"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/16549\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=16549"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=16549"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=16549"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=16549"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}