
IBM is cutting deep-learning processing times from days down to hours

Deep learning uses algorithms inspired by the way human brains operate to put computers to work on tasks too big for organic gray matter. On Monday, IBM announced a new record for the performance of a large neural network working with a large data set.

The company’s new deep-learning software brings together more than 256 graphics processing units across 64 IBM Power systems. The speed improvements brought about by the research come as a result of better communication between the array of GPUs.


Faster GPUs provide the necessary muscle to take on the kind of large scale problems today’s deep-learning systems are capable of tackling. However, the faster the components are, the more difficult it is to ensure that they are all working together as one cohesive unit.

As individual GPUs work on a particular problem, they share their learning with the other processors that make up the system. Conventional software is not capable of keeping up with the speed of current GPU technology, which means that time is wasted as they wait around for one another’s results.
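The synchronization step the article describes is, at its core, an all-reduce: every worker computes a gradient on its own slice of the data, then all workers average those gradients so each applies the identical update. The sketch below is a conceptual, single-process illustration of that averaging step only; the function name and structure are illustrative assumptions, not IBM's actual distributed deep-learning API.

```python
# Conceptual sketch of the all-reduce step in synchronous data-parallel
# training. Each "worker" stands in for a GPU that has computed a local
# gradient; averaging makes every worker apply the same update.
# Illustrative only -- not IBM's DDL implementation.

def all_reduce_mean(local_grads):
    """Element-wise average of per-worker gradients (the all-reduce)."""
    n_workers = len(local_grads)
    n_params = len(local_grads[0])
    return [
        sum(grad[i] for grad in local_grads) / n_workers
        for i in range(n_params)
    ]

# Four hypothetical workers, each holding a gradient for a two-parameter model.
local_grads = [
    [0.1, 0.4],
    [0.3, 0.0],
    [0.5, 0.2],
    [0.1, 0.2],
]
averaged = all_reduce_mean(local_grads)
print(averaged)  # every worker now steps with the same averaged gradient
```

The communication cost of this exchange is exactly what the article says dominates: the faster each GPU finishes its local gradient, the more of the wall-clock time is spent waiting on this averaging round.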

Hillery Hunter, IBM’s director of systems acceleration and memory, compared the situation to the well-known parable of the blind men and the elephant. The company’s distributed deep-learning project has resulted in an API that developers can use in conjunction with deep-learning frameworks to scale to multiple servers, making sure that their GPUs remain synchronized.

IBM recorded image recognition accuracy of 33.8 percent on a test run using 7.5 million images from the ImageNet-22K database. The previous best published result was 29.8 percent, posted by Microsoft in October 2014. In the past, accuracy has typically edged forward at a rate of about one percent with each new implementation, so an improvement of four percent is considered a very good result.

Crucially, IBM’s system managed to achieve this in seven hours; the process that allowed Microsoft to set the previous record took 10 days to complete.
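A quick back-of-the-envelope calculation, using only the two figures quoted in the article, shows what that difference amounts to:

```python
# Rough speedup implied by the article's figures:
# Microsoft's record-setting run took 10 days; IBM's took 7 hours.
previous_hours = 10 * 24   # 10 days expressed in hours
ibm_hours = 7
speedup = previous_hours / ibm_hours
print(f"roughly {speedup:.0f}x faster")
```

That works out to roughly a 34-fold reduction in training time, which is what turns retraining from a multi-day project into a same-day task.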

“Speed and scalability, which means higher accuracy, means that we can quickly retrain an AI model after there is a new cyber-security hack or a new fraud situation,” Hunter told Digital Trends. “Waiting for days or weeks to retrain the model is not practical — so being able to train accurately and within hours makes a big difference.”

These massive improvements in speed, combined with the gains in accuracy, make IBM’s distributed deep-learning software a major boon for anyone working with this technology. A technical preview of the API is available now as part of the company’s PowerAI enterprise deep-learning software.

Brad Jones
Brad is an English-born writer currently splitting his time between Edinburgh and Pennsylvania. You can find him on Twitter…