OpenAI's o3-mini Launch and AGI Power Needs
OpenAI's o3-Mini and Its Anticipated Launch
The tech world is eagerly awaiting the release of OpenAI's o3-mini, expected in the coming weeks. OpenAI CEO Sam Altman confirmed the timeline himself, putting earlier industry speculation to rest. The o3-mini, a streamlined version of the full o3 model, will be accessible through both an API and a web interface, with the aim of making advanced AI more widely available.
OpenAI research scientist Hongyu Ren also revealed that the company plans to launch three versions of the o3-mini at once: high, medium, and low. The timing itself is not entirely new information: Altman had previously pointed to a late-January release for the o3-mini, with the full o3 model to follow.
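If the o3-mini does arrive in the API as described, selecting one of the three variants might look something like the sketch below. This is a hypothetical illustration: the model name "o3-mini" and the reasoning-effort parameter used to pick a variant are assumptions, not confirmed identifiers.

```python
# A minimal sketch, assuming o3-mini is exposed through the existing Chat
# Completions API. The model name "o3-mini" and the "reasoning_effort"
# parameter (a guess at how the high/medium/low variants might be selected)
# are assumptions, not confirmed identifiers.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

response = client.chat.completions.create(
    model="o3-mini",            # assumed model identifier
    reasoning_effort="medium",  # assumed selector for the medium variant
    messages=[
        {"role": "user", "content": "Write a function that reverses a linked list."},
    ],
)
print(response.choices[0].message.content)
```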
Performance and Features of the o3-Mini
Altman has clarified that the o3-mini will not outperform the o1-pro but will provide improved speed. This might disappoint those expecting a significant performance leap, as the o3-mini may only be a minor improvement over the o1-mini.
However, OpenAI's benchmark data paints a more nuanced picture. While the low version of the o3-mini may fall short of o1 on tests like the Codeforces programming benchmark, the high version shows clear gains over it. The o3-mini's key advantage is its cost-effectiveness, which makes it particularly attractive for programming tasks, and OpenAI's Dylan Hunn has likewise highlighted its faster coding speed.
The Future of the o3 Series
To reassure users, Altman has emphasized the capabilities of the full o3 model, stating that it will be significantly more advanced than the o1-pro, and that the o3-pro will go further still. The o3-pro will be available to subscribers of the $200-per-month ChatGPT Pro plan.
Regarding the o3-mini's usage quota, Altman has described it as "really high," suggesting limits above those of the o1 series, and the model will be accessible to ChatGPT Plus subscribers. Altman has also hinted at a brand convergence between the GPT and o series models, planned for this year.
AGI's Computing Power Demands
In addition to the o3-mini, Altman also discussed AGI, reiterating that it is achievable but estimating it would require about 872 megawatts of power to run. To put this in perspective, the largest US nuclear power plant, the Alvin W. Vogtle Electric Generating Plant, has an installed capacity of roughly 4,536 megawatts, which would be enough to support only about five such AGIs.
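A quick back-of-envelope check of that comparison, using the figures quoted above:

```python
# Back-of-envelope check using the figures quoted above.
agi_power_mw = 872         # claimed power requirement for a single AGI, in MW
vogtle_capacity_mw = 4536  # installed capacity of the Vogtle plant, in MW

agis_supported = vogtle_capacity_mw / agi_power_mw
print(f"~{agis_supported:.1f} AGIs")  # prints ~5.2, i.e. roughly five
```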
According to situational-awareness.ai, AI's current power consumption is already approaching that level, suggesting that OpenAI may already have developed next-generation models, possibly even something that qualifies as AGI, depending on how the term is defined.