
Computer History Museum Honors Cerebras Systems

When Cerebras Systems debuted at Hot Chips in August 2019, the hardware community wasn’t sure what to think. Attendees were understandably skeptical of the novel “wafer-scale” technology, not to mention the estimated power envelope of ~15 kilowatts for the chip alone. In the intervening three years, the company – under the direction of founder and CEO Andrew Feldman – has won over early critics with a series of impressive milestones. The company signed a multi-lab contract with the Department of Energy just one month after its Hot Chips debut and now has several high-profile installations in government laboratories and commercial sites around the world.

The new Cerebras Wafer-Scale Engine exhibit at the Computer History Museum in Mountain View, California.

Today (Aug. 3), Cerebras was honored by the Computer History Museum (CHM), which opened a new display featuring Cerebras’ Wafer-Scale Engine (WSE) at its landmark Mountain View, Calif., location. About the size of a dinner plate, the Cerebras WSE-2 has 2.6 trillion transistors and 850,000 AI-optimized cores. Powered by the WSE-2, the Cerebras CS-2 system handles AI models with billions to trillions of parameters.

“There are more transistors – by far – in this one Cerebras chip, than in all 100,000 computing objects in the Museum’s permanent collection combined,” said Dag Spicer, Senior Curator, Computer History Museum.

The Computer History Museum hosted a live conversation with Cerebras Systems CEO Andrew Feldman and Computer History Museum President & CEO Dan’l Lewin.


Cerebras starts with a 300 mm wafer and removes the largest possible square.

The WSE-2, the second-generation Wafer Scale Engine, is manufactured by TSMC on its 7 nm node. It measures 46,225 mm², more than 50x larger than competing chips. Launched in 2021, the WSE-2 offers double the transistor count, core count, memory, memory bandwidth, and fabric bandwidth of the first-generation product, with a slight increase in its power footprint (23 kW, up from 20 kW). The next chip, planned for a 5 nm process, will pack in more cores to handle the rapidly growing computing needs of AI, Feldman said.
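As a rough sanity check on the figures above, the largest square that fits on a 300 mm wafer and the ">50x" size comparison can both be worked out in a few lines. The 826 mm² figure for a large conventional GPU die is an assumption for illustration (it matches publicly reported sizes for recent flagship GPUs), not a number from the article:

```python
import math

WAFER_DIAMETER_MM = 300        # standard 300 mm wafer
WSE2_AREA_MM2 = 46_225         # reported WSE-2 area
GPU_DIE_AREA_MM2 = 826         # assumed large-GPU die area, for comparison only

# Side of the largest square inscribed in a 300 mm circle: d / sqrt(2)
inscribed_side = WAFER_DIAMETER_MM / math.sqrt(2)   # ~212 mm
inscribed_area = inscribed_side ** 2                # exactly 45,000 mm^2

# How many assumed GPU dies' worth of silicon the WSE-2 represents
ratio = WSE2_AREA_MM2 / GPU_DIE_AREA_MM2            # ~56x

print(f"inscribed square: {inscribed_side:.1f} mm per side, {inscribed_area:.0f} mm^2")
print(f"WSE-2 vs. assumed GPU die: {ratio:.0f}x")
```

The ideal inscribed square works out to about 45,000 mm², in the same ballpark as the reported 46,225 mm², and the ratio against the assumed GPU die comes out around 56x, consistent with the article's ">50x" claim.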

Cerebras has customers in North America, Asia, Europe, and the Middle East. Flagship customers include GlaxoSmithKline, AstraZeneca, TotalEnergies, nference, Argonne National Laboratory, Lawrence Livermore National Laboratory, Pittsburgh Supercomputing Center, Leibniz Supercomputing Centre, National Center for Supercomputing Applications, Edinburgh Parallel Computing Centre (EPCC), National Energy Technology Laboratory, and Tokyo Electron Devices.

