This prototype computer can hold the entire Library of Congress – five times over

Image: Hewlett Packard Labs engineers working on The Machine prototype. Used with permission by copyright holder.
In November 2016, Hewlett Packard Enterprise announced it had a working prototype of The Machine, its long-gestating research project investigating memory-based computing. Today, the company revealed a newer iteration of the potentially game-changing system. The new prototype represents a major step forward in terms of its computing power, and HPE is seeking partners to test its limits.

After years in the lab, The Machine is about to be put through its paces — and while the vast majority of us will never use this technology firsthand, we could feel its effects in everyday life.

What’s New?

HPE’s first working prototype of The Machine, unveiled last year, comprised just a couple of nodes with access to eight terabytes of memory in total. The specifications of the new model make clear it’s a massive step forward: it’s outfitted with a whopping 160TB of memory, spread across 40 nodes.

Put another way, the system can hold the text of every single book in the Library of Congress in memory – approximately 160 million books – five times over. Big data is getting bigger all the time, and The Machine is built to offer a better way of working with huge data sets.
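
That claim is easy to sanity-check with back-of-the-envelope arithmetic. Here’s a quick sketch in Python; the per-book size it implies is our own inference, not an HPE figure:

```python
# Rough sanity check of the Library of Congress comparison.
total_memory_bytes = 160 * 10**12   # 160TB prototype memory pool
books_in_library = 160_000_000      # approximate Library of Congress count
copies = 5                          # "five times over"

bytes_per_book = total_memory_bytes / (books_in_library * copies)
print(f"Implied size per book: {bytes_per_book / 1000:.0f} KB")  # ~200 KB
```

Around 200KB per book is roughly what a compressed plain-text novel occupies, so the comparison holds up.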

The system can hold the text of every single book in the Library of Congress in memory, five times over.

The initial reveal of the prototype last year prompted speculation regarding the future of the project, as HPE announced its intention to implement technologies developed for The Machine elsewhere in its product portfolio as early as 2018. Some interpreted this as an acknowledgement that The Machine itself would never become a standalone product. Andrew Wheeler, Vice President and Deputy Director of Hewlett Packard Labs, sought to set things straight when he spoke to Digital Trends about the project last week.

“There was probably even a little bit of confusion around some of that last year, because this prototype has always been just that — it’s been a prototype,” explained Wheeler. “We weren’t trying to say that we’re not working toward productization of something called The Machine, it’s just that that’s further out in time.”

HPE hopes to continue progress, confident that The Machine’s architecture can be scaled to an exabyte-scale single-memory system, and from there leap to a 4,096-yottabyte (yes, that’s a thing!) pool of memory. For now, though, the company is working on its relationships with partners, whose usage of the hardware will inform future development.

“It’s a little bit of a messy recipe, if you will,” observed Wheeler. “There’s no clear path to how innovation like this is done. You get a vision, you get working on it, and you see where both the technology, the problem, and the opportunity takes you.”

Machine Learning

The Machine is an ambitious project that seeks to rethink computing architecture at the most basic level. Unlike virtually every computer in use today, it is built around memory rather than a processor, making interaction between different components more efficient.
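
To make that distinction concrete, here is a loose single-machine analogy in Python (illustrative only; The Machine’s fabric-attached memory works at a vastly larger scale). Two “nodes,” modeled here as processes, address one shared pool of bytes directly instead of copying data between private memories:

```python
from multiprocessing import Process, shared_memory

def worker(pool_name: str):
    # Attach to the existing pool by name and write into it directly.
    pool = shared_memory.SharedMemory(name=pool_name)
    pool.buf[0] = 42
    pool.close()

if __name__ == "__main__":
    # One large pool of memory, visible to every "node."
    pool = shared_memory.SharedMemory(create=True, size=1024)
    node = Process(target=worker, args=(pool.name,))
    node.start()
    node.join()
    print(pool.buf[0])  # 42: the second node sees the write, no copy made
    pool.close()
    pool.unlink()
```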

“We can’t rely on the technologies of the past. We need a computer built for the Big Data era,” wrote HPE CEO Meg Whitman in a press release detailing the new prototype. The company is betting big on the advantages The Machine can offer to its partners and their customers. It’s the largest research and development project that it’s ever taken on.


This commitment, though massive, isn’t at all surprising. Research from IBM suggests 90 percent of all the data in the world today was created in the last two years. Terabytes upon terabytes of data are recorded every minute of every day. That data could be used in ways not yet dreamed of, but today’s computers can’t sort through it effectively.

To fix that, HPE has spent years researching technology that could create a new kind of computer. The Machine isn’t about a single breakthrough; it combines several different technologies at once. The benefit to HPE is that individual technologies can be deployed elsewhere in its lineup when appropriate.

When The Machine was first announced, much was made of its potentially groundbreaking memristor components. HPE is still hard at work finding a memory solution that will do the project justice, given that it’s central to the differences between this system and a normal computer.

“There’s no clear path to how innovation like this is done.”

“The architecture is a fundamental tenet, so we do want to exploit any memory technology that comes online and is made available from now, five years out, ten years out,” said Wheeler. “Certainly, we do believe one of the emerging non-volatile memories will provide us with the density, and really, the cost, that allows us to build for the problem at hand.”

But it’s not just about the memory. Shuttling data around the system is critical as well, and the company believes photonic interconnects are the likely solution. This technology, which uses silicon components to carry optical signals, offers higher bandwidth and consumes far less energy than conventional copper links.

The project is also making advances in areas that aren’t directly linked to its memory-based architecture. Since The Machine will be working with huge amounts of enterprise data, it’s imperative that it keeps that data safe and secure. According to Wheeler, the team at Hewlett Packard Labs was thinking about security “from day one.”

“That’s another great thing about this program. It’s allowed us to really think about security from the ground up, and build it in from the very primitive levels — from the silicon and the hardware, all the way through the firmware, and the operating system, and the applications,” explained Wheeler. “I think it’s going to pay great dividends for us in the future.”

The Machine and Me

The technology in The Machine could one day show up in a home computer, but it’d be on a much smaller scale, and wouldn’t appear for years – perhaps decades. Still, that doesn’t mean The Machine isn’t important to you. Once it’s rolled out in industry, The Machine could bring meaningful changes to everything from your next doctor’s appointment, to your Facebook feed, to your self-driving car.

The Machine could play a role in helping medical staff provide more personalized healthcare for their patients. Within the system’s memory, software could access the patient’s entire medical history, their family’s medical history, genomic data, environmental influences that they’ve been exposed to, and records of how well treatments worked for other, similar patients.


We often think about computers taking over roles in established professions. However, this kind of implementation demonstrates how a system like The Machine could provide an ancillary service, eliminating extraneous information and delivering only the most relevant data to the medical professional. Working with a broad, internationally sourced data set, The Machine could provide a wealth of contextual information that might allow doctors to offer a more accurate diagnosis to their patient.

Moving on to social media, Wheeler told us the current prototype of The Machine is capable of holding every action that takes place on Facebook worldwide, over the course of a 90-minute period, in memory. All at once. In doing so, it could help solve a problem that’s been blighting the social media giant.

The Machine could handle every action that takes place on Facebook worldwide over a 90-minute window, all at once.
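
Wheeler didn’t walk us through the math behind that figure, but a rough estimate suggests why it’s plausible. Every number in this sketch is our own assumption rather than a Facebook or HPE statistic:

```python
# Back-of-envelope plausibility check; all figures below are assumptions.
actions_per_second = 1_000_000     # assumed global Facebook activity rate
window_seconds = 90 * 60           # the 90-minute window Wheeler described
bytes_per_action = 20_000          # assumed ~20KB of data per logged action

total_bytes = actions_per_second * window_seconds * bytes_per_action
print(f"{total_bytes / 10**12:.0f} TB")  # ~108 TB, within the 160TB pool
```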

In theory, Facebook Live is a fun way for people to offer friends a window into their lives. In practice, it’s become a platform for highly offensive content, including livestreamed rape and murder. The entire appeal of the feature is that it’s live, which makes it very difficult for Facebook to police what users do with it. The Machine theoretically has enough processing power to keep an eye on everything that’s going on around the world, which could facilitate a solution to detect and block unwanted streams before they go viral.

Finally, there’s the nascent industry of self-driving vehicles. In this case, the advantages The Machine offers aren’t about a vast body of data, but about how quickly data can be accessed. Wheeler raised a concern about an autonomous car communicating with a central server to make decisions. In heavy traffic, when an unexpected situation arises, quick communication could be the difference between a safe journey and a horrible accident.

“We don’t have time,” he explained. “The speed of light is the speed of light, and latency matters. We need to make the decision within that vehicle, which means that vehicle needs to be able to store and process everything that’s happening, all those central inputs. That decision needs to happen then and there.”
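
The physics behind that statement is easy to check. Even in the best case, with signals traveling at the speed of light and a server answering instantly, a round trip takes real time; on a real network, routing and processing push the delay far higher. A sketch, with the distance and network latency as assumed values:

```python
# Latency arithmetic behind "the speed of light is the speed of light."
SPEED_OF_LIGHT = 3.0e8    # meters per second, in a vacuum
distance_m = 150_000      # assumed 150km to a regional data center
car_speed = 30            # meters per second, roughly 67 mph

best_case = 2 * distance_m / SPEED_OF_LIGHT  # physics-only round trip
realistic = 0.050                            # assumed 50ms with real routing

for label, latency in [("best case", best_case), ("realistic", realistic)]:
    print(f"{label}: {latency * 1000:.0f} ms -> car travels "
          f"{car_speed * latency:.2f} m before a reply arrives")
```

At highway speed, even modest network latency translates into meters traveled before an answer comes back, which is why HPE wants both the data and the processing inside the vehicle itself.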

The Next Big Thing?

The Machine is a mammoth undertaking, and that’s why it has such a broad range of potential applications. In truth, even HPE doesn’t know its full capabilities — that’s why the company is working with partners to test it out in all kinds of different situations.

It’s easier for most of us to swoon over the iPhone than enterprise-grade hardware, but the scale of The Machine is far beyond a typical enterprise product. A computer like The Machine could power the next great leap forward in computational capability, and in doing so, it would power innovations that are too difficult, or too expensive, for traditional computers to handle. The Machine could become a part of everyday life.

As the world has become dependent on computers, their foundational technology has stayed relatively static. However, between projects like The Machine, investigations into quantum computing, and other similar research, it’s clear that there’s a thirst to see what could be achieved if we rethink computer architecture.

What remains to be seen is whether the technology’s applications can live up to the promise of a huge leap forward in computing. With a working, large-scale prototype in its possession, HPE is about to find out whether The Machine offers a benefit to its customers — and, in turn, that will determine whether memory-based computing is a neat idea, or the next big thing.

Brad Jones