- Oracle claims its Zettascale10 system can hit 16 zettaFLOPS peak
- The project uses around 800,000 Nvidia GPUs spread across data centers
- OpenAI’s Stargate cluster in Texas runs on Oracle’s new infrastructure
Oracle has announced what it calls the largest AI supercomputer in the cloud, the OCI Zettascale10.
The company claims the system can deliver 16 zettaFLOPS of peak performance across 800,000 Nvidia GPUs.
That output, divided across the fleet, works out to about 20 petaFLOPS per GPU, roughly in line with the peak throughput Nvidia quotes for its Grace Blackwell GB300 Ultra chips used in high-end data-center AI systems.
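The per-GPU figure follows directly from Oracle's headline numbers (a quick sanity check; both inputs are the vendor's peak claims, not measured throughput):

```python
# Sanity-check the per-GPU arithmetic behind Oracle's headline claim.
# Both figures are vendor peak claims, not measured throughput.
total_peak_flops = 16e21   # 16 zettaFLOPS
gpu_count = 800_000

per_gpu_flops = total_peak_flops / gpu_count
print(f"{per_gpu_flops / 1e15:.0f} petaFLOPS per GPU")  # prints "20 petaFLOPS per GPU"
```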
Network design for large-scale AI workloads
Oracle says the platform is the foundation for OpenAI’s Stargate cluster in Abilene, Texas, built to handle some of the most demanding AI workloads now emerging in research and commercial use.
“The highly scalable custom RoCE design maximizes fabric-wide performance at gigawatt scale while keeping most of the power focused on compute…,” said Peter Hoeschele, vice president, Infrastructure and Industrial Compute, OpenAI.
At the core of the Zettascale10 system is Oracle Acceleron RoCE networking, designed to increase scalability and reliability for data-heavy AI operations.
This architecture uses network interface cards as mini switches, linking GPUs across several isolated network planes.
The design aims to reduce latency between GPUs and allow jobs to continue running if one network path fails.
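Oracle has not published the routing logic, but the failover idea can be illustrated with a hypothetical sketch: traffic is spread across several isolated planes, and a plane is simply dropped from the rotation when it reports a fault. The plane names and health model below are invented for illustration, not Oracle's design.

```python
# Hypothetical sketch of plane-level failover in a multi-plane fabric.
# Plane names and the health model are illustrative, not Oracle's design.
planes = {"plane-0": True, "plane-1": True, "plane-2": True}  # plane -> healthy?

def healthy_planes() -> list[str]:
    return [name for name, ok in planes.items() if ok]

def pick_plane(message_id: int) -> str:
    """Spread messages round-robin across whichever planes are still healthy."""
    candidates = healthy_planes()
    if not candidates:
        raise RuntimeError("no healthy network planes")
    return candidates[message_id % len(candidates)]

planes["plane-1"] = False  # simulate one network path failing
# The job keeps running: traffic simply routes around the failed plane.
assert all(pick_plane(i) != "plane-1" for i in range(10))
```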
“Featuring Nvidia full-stack AI infrastructure, OCI Zettascale10 provides the compute fabric needed to advance state-of-the-art AI research and help organizations everywhere move from experimentation to industrialized AI,” said Ian Buck, vice president of Hyperscale, Nvidia.
Oracle claims this structure can lower costs by simplifying tiers within the network while maintaining consistent performance across nodes.
It also introduces Linear Pluggable Optics (LPO) and Linear Receiver Optics (LRO) to reduce energy and cooling use without cutting bandwidth.
Although Oracle’s figures are impressive, the company has not provided independent verification of its 16 zettaFLOPS claim.
Cloud performance metrics can vary depending on how throughput is calculated, and Oracle’s comparison may rely on theoretical peaks rather than sustained rates.
Since the advertised total is simply the theoretical peak of 800,000 top-end GPUs added together, real-world efficiency will depend heavily on network design and software optimization.
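The gap between theoretical peak and sustained throughput is easy to quantify. The utilization rates below are assumed purely for illustration, in the range commonly reported for large training runs; only the 16 zettaFLOPS peak comes from Oracle's claim:

```python
# Illustrative only: how utilization shrinks a theoretical-peak figure.
# The 16 zettaFLOPS peak is Oracle's claim; the utilization rates are
# assumed values, in the range commonly reported for large training jobs.
peak_zettaflops = 16.0

for utilization in (0.30, 0.40, 0.50):
    sustained = peak_zettaflops * utilization
    print(f"{utilization:.0%} utilization -> {sustained:.1f} zettaFLOPS sustained")
```

Even at the optimistic end of that range, sustained output would be roughly half the headline number.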
Analysts will likely wait to see whether the configuration delivers performance comparable to the leading AI clusters already operated by other major cloud providers.
The Zettascale10 positions Oracle alongside other major players racing to provide the infrastructure behind the best GPUs and AI tools.
The company says customers could train and deploy large models across Oracle’s distributed cloud environment, supported by data sovereignty measures.
Oracle also says Zettascale10 offers operational flexibility through independent plane-level maintenance, allowing updates with less downtime.
“With OCI Zettascale10, we’re fusing OCI’s Oracle Acceleron RoCE network architecture with next-generation Nvidia AI infrastructure to deliver multi-gigawatt AI capacity at unmatched scale,” said Mahesh Thiagarajan, executive vice president, Oracle Cloud Infrastructure.
“Customers can build, train, and deploy their largest AI models into production using less power and will have the freedom to operate across Oracle’s distributed cloud with strong data and AI sovereignty…”
Still, observers note that other providers are building their own large-scale GPU clusters and advanced cloud storage systems, which could narrow Oracle’s advantage.
This system will roll out next year, and only then will it be clear whether the architecture can meet demand for scalable, efficient, and reliable AI computation.
Via HPCWire