{"id":33300,"date":"2025-04-17T10:54:39","date_gmt":"2025-04-17T02:54:39","guid":{"rendered":"https:\/\/www.1ai.net\/?p=33300"},"modified":"2025-04-17T10:54:39","modified_gmt":"2025-04-17T02:54:39","slug":"%e5%9c%a8%e7%bb%88%e7%ab%af%e5%b0%b1%e8%83%bd%e8%b7%91%e7%9a%84%e8%bd%bb%e9%87%8f%e7%ba%a7%e6%8e%a8%e7%90%86%e6%99%ba%e8%83%bd%e4%bd%93%ef%bc%8copenai-%e5%8f%91%e5%b8%83%e5%ae%8c%e5%85%a8%e5%bc%80","status":"publish","type":"post","link":"https:\/\/www.1ai.net\/en\/33300.html","title":{"rendered":"OpenAI Releases Fully Open Source Codex CLI Tool for Lightweight Reasoning Intelligence That Runs at the Terminal"},"content":{"rendered":"<p>April 17 news.<a href=\"https:\/\/www.1ai.net\/en\/tag\/openai\" title=\"[View articles tagged with [OpenAI]]\" target=\"_blank\" >OpenAI<\/a> A lightweight terminal runtime code was released today<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e6%99%ba%e8%83%bd%e4%bd%93\" title=\"[View articles tagged with [intelligent body]]\" target=\"_blank\" >Agent<\/a> \u2014\u2014\u00a0<strong><a href=\"https:\/\/www.1ai.net\/en\/tag\/codex-cli\" title=\"_Other Organiser\" target=\"_blank\" >Codex CLI<\/a><\/strong>The tool is now available at <a href=\"https:\/\/www.1ai.net\/en\/tag\/github\" title=\"_Other Organiser\" target=\"_blank\" >GitHub<\/a> wholly<a href=\"https:\/\/www.1ai.net\/en\/tag\/%e5%bc%80%e6%ba%90\" title=\"[View articles tagged with [open source]]\" target=\"_blank\" >Open Source<\/a>.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"alignnone size-full wp-image-33301\" title=\"17df026ej00suuddd000pd000sm00cap\" src=\"https:\/\/www.1ai.net\/wp-content\/uploads\/2025\/04\/17df026ej00suuddd000pd000sm00cap.jpg\" alt=\"17df026ej00suuddd000pd000sm00cap\" width=\"1030\" height=\"442\" \/><\/p>\n<p>The Codex CLI works directly on the user's computer.<strong>Designed to maximize the inference power of models such as o3 and o4-mini<\/strong>and will soon support additional API models such as GPT-4.1.<\/p>\n<p><strong>Users 
have access to multimodal reasoning via the command line<\/strong>, e.g. by passing screenshots or low-fidelity sketches to the model, combined with access to local code.<\/p>\n<p>OpenAI describes Codex CLI as a minimalist interface that connects its models to users and their computers. <strong>Codex CLI is designed for developers who already live in the terminal<\/strong> and want ChatGPT-level reasoning plus the power to actually run code, manipulate files, and iterate, all under version control. In short, it is a chat-driven development tool that understands and executes repositories.<\/p>\n<ul>\n<li>Zero configuration: enter your OpenAI API key and you are ready to go<\/li>\n<li>Full-auto approval mode, secured by disabling network access and sandboxing execution to the working directory<\/li>\n<li>Multimodal: pass in screenshots or diagrams for the model to reason over<\/li>\n<\/ul>\n<p>Codex CLI runs on macOS 12+, Ubuntu 20.04+\/Debian 10+, and Windows 11 via WSL2, and requires a minimum of 4 GB of RAM (8 GB recommended).<\/p>\n<p data-vmark=\"aecf\">1AI attaches the link to the Codex CLI open-source repository:<\/p>\n<p data-vmark=\"571e\"><a href=\"https:\/\/github.com\/openai\/codex\" target=\"_blank\" rel=\"noopener\"><span class=\"link-text-start-with-http\">https:\/\/github.com\/openai\/codex<\/span><\/a><\/p>","protected":false},"excerpt":{"rendered":"<p>April 17 news: OpenAI today released Codex CLI, a lightweight terminal coding agent that is now fully open source on GitHub. Codex CLI runs directly on the user's computer and is designed to make the most of the reasoning capabilities of models such as o3 and o4-mini, with support for additional API models such as GPT-4.1 coming soon. Users have access to multimodal reasoning via the command line, e.g. by passing screenshots or low-fidelity sketches to the model, combined with access to local code. 
OpenAI describes Codex CLI as a minimalist interface that connects its models to users and their computers. Codex CLI is designed for<\/p>","protected":false},"author":1,"featured_media":0,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[146],"tags":[6328,385,190,219,6327,1355],"collection":[],"class_list":["post-33300","post","type-post","status-publish","format-standard","hentry","category-news","tag-codex-cli","tag-github","tag-openai","tag-219","tag-6327","tag-1355"],"acf":[],"_links":{"self":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/33300","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/comments?post=33300"}],"version-history":[{"count":0,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/posts\/33300\/revisions"}],"wp:attachment":[{"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/media?parent=33300"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/categories?post=33300"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/tags?post=33300"},{"taxonomy":"collection","embeddable":true,"href":"https:\/\/www.1ai.net\/en\/wp-json\/wp\/v2\/collection?post=33300"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}