{"id":25491,"date":"2024-12-23T08:55:09","date_gmt":"2024-12-23T00:55:09","guid":{"rendered":"https:\/\/www.1ai.net\/?p=25491"},"modified":"2024-12-23T08:55:09","modified_gmt":"2024-12-23T00:55:09","slug":"%e6%b6%88%e6%81%af%e7%a7%b0-openai-%e6%96%b0%e6%a8%a1%e5%9e%8b-gpt-5-%e7%a0%94%e5%8f%91%e9%81%87%e9%98%bb%ef%bc%8c%e6%88%90%e6%9c%ac%e9%ab%98%e6%98%82%e3%80%81%e6%95%88%e6%9e%9c%e6%9c%aa%e8%be%be","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/25491.html","title":{"rendered":"OpenAI's New Model GPT-5 Development Stalls: Costly, with Results Falling Short, Sources Say"},"content":{"rendered":"<p>According to a Dec. 22 report in the Wall Street Journal, development of <a href=\"https:\/\/www.1ai.net\/en\/tag\/openai\" title=\"View articles tagged with OpenAI\" target=\"_blank\" >OpenAI<\/a>'s next-generation large language model, <a href=\"https:\/\/www.1ai.net\/en\/tag\/gpt-5\" title=\"View articles tagged with GPT-5\" target=\"_blank\" >GPT-5<\/a>, is behind schedule, and <strong>the results achieved so far have not reached a level commensurate with its enormous cost<\/strong>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-25492\" title=\"3b579e5cj00sox971006nd000lc00csp\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2024\/12\/3b579e5cj00sox971006nd000lc00csp.jpg\" alt=\"3b579e5cj00sox971006nd000lc00csp\" width=\"768\" height=\"460\" \/><\/p>\n<p>The news echoes an earlier report in The Information, which suggested that OpenAI was looking for new strategies <strong>because GPT-5 may not be able to achieve the significant performance leaps that previous models did<\/strong>. The Wall Street Journal report reveals further details of GPT-5's 18-month development, code-named Orion.<\/p>\n<p>OpenAI has reportedly completed at least two large-scale training runs aimed at improving model performance through training on massive amounts of data. The first run went more slowly than expected, signaling that larger-scale training would be both time-consuming and costly. While GPT-5's performance is said to be superior to that of its predecessor, <strong>it has not progressed far enough to justify the huge cost of keeping the model running<\/strong>.<\/p>\n<p>The report also says that, in addition to relying on publicly available data and licensing agreements, OpenAI employs people to create entirely new data by writing code or solving math problems. The company is also using synthetic data generated by another of its models, o1.<\/p>\n<p>As of 1AI's publication time, OpenAI had not yet responded; the company has previously said that it will not release a model code-named \"Orion\" this year.<\/p>","protected":false},"excerpt":{"rendered":"<p>According to the Wall Street Journal report of Dec. 22, the next-generation large language model GPT-5 being developed by OpenAI is behind schedule, and the results achieved are not yet at a level commensurate with its huge cost. The news echoes an earlier article in The Information suggesting that OpenAI is looking for new strategies because GPT-5 may not be able to make a significant performance leap like previous models. The Wall Street Journal reports further details of GPT-5's 18-month development, code-named Orion. 
It was reported that OpenAI had completed at least two major training runs<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[719,190],"collection":[],"class_list":["post-25491","post","type-post","status-publish","format-standard","hentry","category-news","tag-gpt-5","tag-openai"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/25491","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=25491"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/25491\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=25491"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=25491"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=25491"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=25491"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}