IBM to tackle the vastness of space with exabyte-crunching supercomputers

Google may have a pretty good handle on organizing the volumes of information generated by mankind here on Earth, but how do you handle the data produced by the entire universe? Ask IBM.

The hardware giant has been awarded a five-year contract teaming it with the Dutch Institute for Radio Astronomy (better known as ASTRON) to collaborate on energy-efficient, exascale computer systems. These systems are being built to handle the huge amounts of data that will be collected by the Square Kilometer Array (SKA) consortium.

The SKA is set to be the largest and most sensitive radio telescope on the planet when it comes online in 2024, and it should be capable of collecting multiple exabytes of data. For perspective on how big an exabyte is: a gigabyte is 1 billion bytes, a 10-digit number. An exabyte is a 1 followed by 18 zeros.
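To make that scale jump concrete, here is a small illustrative Python sketch. The unit values are the standard decimal (SI) byte prefixes, not figures from the article:

```python
# Decimal (SI) byte units: each step up is a factor of 1,000.
UNITS = {
    "gigabyte": 10**9,   # a 1 followed by 9 zeros
    "terabyte": 10**12,
    "petabyte": 10**15,
    "exabyte":  10**18,  # a 1 followed by 18 zeros
}

for name, size in UNITS.items():
    # len(str(size)) - 1 counts the zeros after the leading 1
    print(f"1 {name} = {size:,} bytes ({len(str(size)) - 1} zeros)")

# An exabyte is a billion gigabytes.
print(UNITS["exabyte"] // UNITS["gigabyte"])  # → 1000000000
```

In other words, one exabyte holds as many gigabytes as a gigabyte holds bytes.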

IBM’s challenge is in designing a system that can not only process the huge amounts of data from the SKA, but also analyze and store that data. A system of this kind will require groundbreaking technology. At the moment, IBM is talking about 3D chip stacking, water-cooling systems, optical interconnects, nanophotonics, and new tape and phase-change memory as potential ways to achieve its goal. It will also have to keep power consumption to a minimum, which means that this will also be an exercise in green supercomputing.

The next five years of research will be done at the new ASTRON & IBM Center for Exascale Technology in Drenthe, Netherlands. When the SKA finally powers up, scientists around the world should gain new insights into the universe. And even if you don’t care much about what’s beyond Earth, the project is sure to produce a number of breakthroughs in computing power that will someday make their way into the consumer market.