Developers use 750 Raspberry Pi boards as supercomputing testbed

Developers who need a platform to test scalable software destined for supercomputers now have an inexpensive option. Designed and built by BitScope in collaboration with the Department of Energy’s Los Alamos National Laboratory, the new platform relies on the popular Raspberry Pi 3 board – 750 of them, to be exact – spread across five rack-mounted Pi Cluster Modules, sparing developers the need for a $250 million investment in full-scale hardware.

“It’s not like you can keep a petascale machine around for R&D work in scalable systems software,” said Gary Grider, leader of the High Performance Computing Division at Los Alamos National Laboratory. “The Raspberry Pi modules let developers figure out how to write this software and get it to work reliably without having a dedicated testbed of the same size.”


Each Raspberry Pi 3 board is a self-contained miniature PC packed with a quad-core processor, 1GB of system memory, wired and wireless networking, and a handful of USB ports. That means each 150-board Pi Cluster Module provides 600 processor cores for developing scalable software for high-performance computing (HPC), simulating large-scale sensor networks, and more, at a fraction of the cost of a dedicated HPC testbed.
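For a rough sense of scale, here is a quick back-of-the-envelope sketch in Python of how those core counts add up; the per-module board count is simply the 750-board total divided across the five modules, and nothing here comes from BitScope’s own tooling.

```python
# Back-of-the-envelope core math for the pilot system described above.
TOTAL_BOARDS = 750        # Raspberry Pi 3 boards in the pilot platform
MODULES = 5               # rack-mounted Pi Cluster Modules
CORES_PER_BOARD = 4       # each Pi 3 carries a quad-core processor

boards_per_module = TOTAL_BOARDS // MODULES              # 150 boards per module
cores_per_module = boards_per_module * CORES_PER_BOARD   # 600 cores per module
total_cores = TOTAL_BOARDS * CORES_PER_BOARD             # 3,000 cores in the pilot

print(boards_per_module, cores_per_module, total_cores)  # 150 600 3000
```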

The Los Alamos National Laboratory currently manages the Trinity supercomputer, which consists of 19,420 “nodes,” or self-contained PCs sporting Intel Xeon processors, memory, and storage. In total, these nodes add up to four petabytes of memory, four petabytes of flash-based storage, and 100 petabytes of hard drive space. The nodes are installed in clusters that can cost $250 million each just to build, on top of the ongoing expense of keeping them cool.

That said, not every HPC platform is the same size, and that makes developing HPC software difficult given the differences in processor “pipelines,” storage architectures, and network connections. According to BitScope, software that works on one HPC platform may not work correctly on a larger, scaled-out design, and spending $250 million just to find out whether it does simply isn’t an option for many developers.

“Cluster simulations can help to some extent but in many cases real-world issues can intervene to mitigate their effectiveness,” BitScope says. “What’s really needed is a low-cost development platform on which to research the design options and prototype new ideas without the expense of building and running a full-scale HPC cluster to do this research.”

Grider got the idea of using the Raspberry Pi 3 to build a development platform after searching for a low-cost, low-power solution for the HPC software development community. Given that each board costs around $35 and consumes up to 6.7 watts of power, one 150-board Pi Cluster Module costs about $5,250 in boards alone. That’s still not cheap, but it’s far better than sinking millions into hardware just for research and development.
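To see how those per-board figures roll up, here is a minimal sketch, assuming the $35 price and 6.7-watt peak draw quoted above apply to every one of the 150 boards in a module; a real bill of materials would also include racks, networking, and power supplies.

```python
# Rough cost and power estimate for a single 150-board Pi Cluster Module,
# using only the per-board figures quoted in the article.
BOARDS_PER_MODULE = 150
COST_PER_BOARD = 35.00    # dollars, approximate price of a Raspberry Pi 3
WATTS_PER_BOARD = 6.7     # peak power draw per board

board_cost = BOARDS_PER_MODULE * COST_PER_BOARD    # $5,250 in boards alone
peak_power = BOARDS_PER_MODULE * WATTS_PER_BOARD   # roughly 1,005 watts at full load

print(f"${board_cost:,.0f} in boards, about {peak_power:,.0f} W peak")
```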

The five-rack platform recently introduced at the SC17 supercomputing conference in Denver is a “pilot” system. Currently, BitScope is building Pi Cluster Modules packed with 150 Raspberry Pi 3 boards each, which will be distributed by SICORP, and the company says it also plans to create smaller 48- and 96-board modules at a later date.

Kevin Parrish