June 18 news: Google published a blog post today announcing that the Gemini 2.5 Flash and Gemini 2.5 Pro models have moved past the stabilization phase into stable, general availability. A new lightweight model, Gemini 2.5 Flash-Lite, has also been launched.

1AI cited the blog post, in which Google officially stated that the Gemini 2.5 Flash and Gemini 2.5 Pro models have passed large-scale testing and can reliably support production-grade application development. Companies such as Spline, Rooms, Snap, and SmartBear have been building real-world applications on the latest versions over the past few weeks.
Google says the Gemini 2.5 series was designed around the core goal of balancing cost, speed, and performance, combining efficient reasoning with affordability; the stable release is meant to give developers greater confidence when building complex systems on these models.
Google is simultaneously releasing a preview version of Gemini 2.5 Flash-Lite, which it describes as the most cost-effective and fastest reasoning model in the family to date.
Tests show that 2.5 Flash-Lite outperforms its predecessor, 2.0 Flash-Lite, in overall quality on tasks such as code writing, scientific computing, and multimodal analysis; it also delivers lower latency than the 2.0 models, especially in fast-response scenarios such as translation and classification.
The model inherits the core capabilities of the Gemini 2.5 family, including flexible control over the thinking budget, connections to external tools such as Google Search and code execution, and support for an ultra-long context window of 1 million tokens.
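As an illustration, the following is a minimal sketch of how a developer might exercise these controls through the Gemini API's Python SDK (google-genai); the placeholder API key, the model identifier "gemini-2.5-flash-lite", and the specific thinking-budget and tool settings are assumptions for illustration and are not taken from the blog post.

# Minimal sketch, assuming the google-genai Python SDK; names and values are illustrative.
from google import genai
from google.genai import types

client = genai.Client(api_key="YOUR_API_KEY")  # key obtained via Google AI Studio (assumed setup)

response = client.models.generate_content(
    model="gemini-2.5-flash-lite",  # illustrative model identifier
    contents="Classify this ticket as bug, feature request, or question: "
             "'The export button does nothing when I click it.'",
    config=types.GenerateContentConfig(
        # A thinking budget of 0 skips extended reasoning for latency-sensitive
        # tasks such as classification; larger budgets trade speed for depth (assumed value).
        thinking_config=types.ThinkingConfig(thinking_budget=0),
        # Optional tool connectivity, e.g. grounding with Google Search (assumed configuration).
        tools=[types.Tool(google_search=types.GoogleSearch())],
    ),
)
print(response.text)

In this sketch, the thinking budget is the main cost-versus-quality lever: keeping it low makes simple requests fast and cheap, while raising it lets the model spend more reasoning effort on harder prompts.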
Developers can access the stable versions of 2.5 Flash and 2.5 Pro, as well as the preview version of Flash-Lite, through Google AI Studio and the Vertex AI platform. In addition, 2.5 Flash and Pro have been integrated into the Gemini app, while Google Search has deployed customized versions of the Flash-Lite and Flash models to improve service efficiency.