OpenAI Launches GPT-5-Codex-Mini, a "Cost-Efficient" AI Coding Model

November 9 news: In September this year, OpenAI launched GPT-5-Codex, a version of GPT-5 optimized for "autonomous coding" tasks on the Codex platform. Built on the GPT-5 architecture, it delivers a significant improvement in reasoning and programming capabilities.

GPT-5-Codex is a genuine software engineering model, capable of handling everything from creating new projects, adding features, and writing tests to large-scale code refactoring.

According to foreign media outlet Neowin, OpenAI has released GPT-5-Codex-Mini. As the name suggests, it is a smaller, cheaper version of GPT-5-Codex: performance drops only slightly compared to the original, while developers get roughly 4 times the usage. On the SWE-bench Verified benchmark, GPT-5 High scores 72.8%, GPT-5-Codex scores 74.5%, and GPT-5-Codex-Mini scores 71.3%.


OpenAI recommends using GPT-5-Codex-Mini for lightweight engineering tasks or when approaching the rate limit; once usage reaches 90%, the Codex system automatically prompts the user to switch. The Mini version is already available in the CLI and IDE extension, with API support coming soon.

Thanks to GPU efficiency improvements, ChatGPT Plus, Business, and Edu users will see their rate limits raised by 50%, while ChatGPT Pro and Enterprise users will get priority scheduling for faster responses.

1AI understands that OpenAI has also optimized the Codex backend so that developers get a stable, predictable experience whenever they use it, avoiding the fluctuations previously caused by caching and traffic-routing issues.
