
Cerebras’ enormous artificial intelligence chip is the size of an iPad


We’re used to microchips becoming ever more miniaturized, thanks to Moore’s Law letting engineers pack more and more transistors onto ever tinier chips. The same can’t be said for the Wafer Scale Engine (WSE) chip designed by California startup Cerebras, which recently emerged from stealth. Cerebras has created an immensely powerful chip designed for carrying out A.I. processes — and there’s absolutely no missing it. Partly because, unlike most microchips, this one is the size of an iPad.


The 46,225 square millimeter WSE chip boasts an enormous 1.2 trillion transistors, 400,000 cores, and 18 gigabytes of on-chip memory. That makes it the biggest chip ever created. The previous record-holder was a mere 815 square millimeters, with 21.1 billion transistors. As CEO and co-founder Andrew Feldman told Digital Trends, this means the WSE chip is “56.7 times larger” than the giant chip it beat for the title.
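As a quick sanity check on those figures (all numbers taken from the article; the script itself is purely illustrative), the area and transistor-count ratios between the WSE and the previous record-holder line up almost exactly:

```python
# Figures quoted in the article
wse_area = 46_225            # mm², Cerebras WSE
wse_transistors = 1.2e12     # 1.2 trillion
prev_area = 815              # mm², previous largest chip
prev_transistors = 21.1e9    # 21.1 billion

# Both ratios come out near 57x, matching Feldman's "56.7 times larger"
print(f"area ratio: {wse_area / prev_area:.1f}x")
print(f"transistor ratio: {wse_transistors / prev_transistors:.1f}x")
```

Notably, the transistor density is almost unchanged between the two chips; the WSE’s advantage comes from sheer silicon area, not a finer process.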


“Artificial intelligence work is one of the fastest-growing compute workloads,” Feldman said. “Between 2013 and 2018, it grew at a rate of more than 300,000 times. That means every 3.5 months, the amount of work done with this workload doubled.”
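Feldman’s two numbers can be checked against each other: 300,000x growth over the five years from 2013 to 2018 implies a doubling period of roughly 3.3 months, close to the 3.5 he cites. A minimal sketch of that arithmetic (figures from the quote; the script is illustrative):

```python
import math

growth = 300_000     # overall growth factor, 2013-2018
months = 5 * 12      # five years, in months

# 300,000x growth corresponds to log2(300,000) ≈ 18.2 doublings,
# i.e. one doubling roughly every 3.3 months
doublings = math.log2(growth)
period = months / doublings
print(f"{doublings:.1f} doublings, one every {period:.1f} months")
```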

This is where the need for bigger chips comes into play. Bigger chips process more information more quickly. That, in turn, means that the user can calculate their computationally heavy answer in less time.


“The WSE contains 78 times as many compute cores; it [has] 3,000 times more high speed, on-chip memory, 10,000 times more memory bandwidth, and 33 times more fabric bandwidth than today’s leading GPU,” Feldman explained. “This means that the WSE can do more calculations, more efficiently, and dramatically reduce the time it takes to train an A.I. model. For the researcher and product developer in A.I., faster time to train means higher experimental throughput with more data: less time to a better solution.”

Unsurprisingly, a computer chip the size of a freaking tablet isn’t one that’s intended for home use. Instead, it’s intended to be used in data centers where much of the heavy-duty processing behind today’s cloud-based A.I. tools is carried out. There’s no official word on customers, but it would seem likely that firms such as Facebook, Amazon, Baidu, and others will be keen to put it through its paces.

No performance benchmarks have been released yet. However, if this chip lives up to its promises, it could help drive A.I. innovation for months and even years to come.

Luke Dormehl
Former Digital Trends Contributor