Cerebras's WSE-3 is a silicon behemoth equivalent to 62 Nvidia H100 GPUs

Cerebras Systems has just pulled the wraps off its Wafer Scale Engine 3 (WSE-3), and it's a tech titan that has the industry buzzing. This AI chip isn't just big; it's colossal, packing a mind-blowing 4 trillion transistors and 900,000 AI cores.

The WSE-3 isn't just for show. It's the powerhouse behind the CS-3 supercomputer, a machine that can train AI models with up to 24 trillion parameters and address up to 1.2 PB of external memory. Need to fine-tune a 70-billion-parameter model in a day? The CS-3 can handle it. It's like the Usain Bolt of supercomputers, sprinting through data with the ease of a champion. And with support for PyTorch 2.0, it's not just fast; it's smart, too.

According to Tom's Hardware, Cerebras's CS-3 doubles the performance of its predecessor without increasing power draw. Meanwhile, the strategic partnership between Cerebras and G42 is also set to expand with the construction of the Condor Galaxy 3, an AI supercomputer featuring 64 CS-3 systems, packing a whopping 57,600,000 cores.

https://fudzilla.com/news/ai/58639-c...licon-behemoth
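The headline core count checks out from the figures in the article: 64 CS-3 systems, each built around one WSE-3 with 900,000 AI cores. A quick back-of-the-envelope sketch (the variable names are mine, the numbers are from the article):

```python
# Figures quoted in the article
CORES_PER_WSE3 = 900_000      # AI cores per Wafer Scale Engine 3
CS3_SYSTEMS = 64              # CS-3 systems planned for Condor Galaxy 3
H100_EQUIVALENT = 62          # claimed H100-GPU equivalence of one WSE-3

total_cores = CORES_PER_WSE3 * CS3_SYSTEMS
print(f"Condor Galaxy 3 total AI cores: {total_cores:,}")
# → Condor Galaxy 3 total AI cores: 57,600,000

# By the article's equivalence claim, that cluster stands in for
# roughly this many H100 GPUs (a rough, claim-based comparison):
print(f"Approximate H100 equivalence: {CS3_SYSTEMS * H100_EQUIVALENT:,}")
# → Approximate H100 equivalence: 3,968
```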