IBM's '5 in 5' predicts what crazy scientific inventions may emerge in the next few years

Five years ago, few would have predicted that an artificial intelligence system named AlphaGo would defeat Lee Sedol, a Go master, at his own game. Even the experts thought the victory was at least another decade away. But earlier this year, the Google DeepMind algorithm conquered its human opponent, winning the five-game match 4-1.

AlphaGo is just one example of unforeseen AI progress over the past few years. And, as new trends and technologies emerge every week, who’s to say what the next five years have in store?

IBM has decided to take another stab at that question with its annual IBM 5 in 5 predictions, in which the company uses market and societal trends to forecast the scientific innovations it expects to revolutionize our lives over the next half decade.

“5 in 5 began as a way to demonstrate the most exciting developments coming out of IBM Research, to generate collaborative conversations about the possibilities for innovation across various industries, and to promote excitement about how technology can be applied to solve certain societal problems and improve our daily lives,” Dario Gil, vice president of science and solutions at IBM Research, told Digital Trends.

This year, the tech giant considered what instruments might make the invisible visible. Here are its predictions.

“With AI, our words will be a window into our mental health”

Mental health is a growing concern in communities around the world, placing an emotional burden on individuals and an economic burden on the societies they live in. Part of the difficulty is that, unlike most physical ailments, mental health conditions aren’t always easy to predict, prevent, or diagnose. But our words offer insight into our minds, and researchers are unraveling the hidden messages within them.

“In five years, what we say and write will be used as indicators of our mental health and physical well being,” IBM predicts. “Patterns in our speech and writing analyzed by new cognitive systems will provide tell-tale signs of early-stage mental and neurological diseases that can help doctors and patients better predict, monitor, and track these diseases.”

Researchers have recently used brushstrokes to predict neurological disorders in artists. IBM predicts that AI will soon use things like syntax and intonation for similar ends.
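
To get a feel for what “patterns in our speech and writing” might look like in practice, here is a minimal, hypothetical sketch in Python. It computes a few coarse text features that language researchers have studied as possible health signals; the feature names and the sample text are invented for illustration and say nothing about IBM’s actual system.

```python
# Hypothetical sketch: pulling simple linguistic features out of a text sample.
# Feature choices are illustrative only, not IBM's method.
import re
from collections import Counter

def linguistic_features(text: str) -> dict:
    """Compute a few coarse features sometimes studied as language-based health signals."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[a-zA-Z']+", text.lower())
    counts = Counter(words)
    first_person = sum(counts[w] for w in ("i", "me", "my", "mine"))
    return {
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(counts) / max(len(words), 1),    # vocabulary variety
        "first_person_rate": first_person / max(len(words), 1),  # self-focus proxy
    }

sample = "I can't sleep. I keep thinking about my mistakes. I feel tired all the time."
print(linguistic_features(sample))
```

A real cognitive system would feed thousands of such signals, along with acoustic features like intonation, into trained models rather than a handful of hand-built ratios.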

“Hyperimaging and AI will give us superhero vision”

High-tech instruments already help us image difficult-to-see regions, from deep space to deep within our bodies. But, as IBM notes, these instruments are limited to specific functions.

“In five years, new imaging devices using hyperimaging technology and AI will help us see broadly beyond the domain of visible light by combining multiple bands of the electromagnetic spectrum to reveal valuable insights or potential dangers that would otherwise be unknown or hidden from view,” the company writes. “Most importantly, these devices will be portable, affordable, and accessible, so superhero vision can be part of our everyday experiences.”

As an example, imagine a car windshield that helps drivers peer through fog or detect black ice, or a smartphone camera that can snap a photo of your dinner and give a breakdown of its nutritional value.
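
Here is a toy illustration of the band-combining idea, using purely synthetic data: an object that is indistinguishable from fog in a low-contrast visible band becomes obvious once an infrared band is fused into the composite. The array sizes, pixel values, and weights are made up for the example.

```python
# Toy sketch of multi-band image fusion; all data and weights are synthetic.
import numpy as np

rng = np.random.default_rng(0)
visible = rng.uniform(0.4, 0.6, size=(64, 64))    # fog-like, low-contrast visible band
infrared = rng.uniform(0.0, 0.1, size=(64, 64))   # mostly dark infrared band
infrared[30:34, 30:34] = 0.9                      # a warm object hidden by the fog

def fuse(bands, weights):
    """Weighted combination of co-registered spectral bands into one composite image."""
    stack = np.stack(bands)
    w = np.asarray(weights, dtype=float)[:, None, None]
    return np.clip((stack * w).sum(axis=0) / w.sum(), 0.0, 1.0)

composite = fuse([visible, infrared], weights=[0.5, 0.5])
print("hidden object in visible band:", round(float(visible[31, 31]), 2))    # looks like ordinary fog
print("hidden object in composite:   ", round(float(composite[31, 31]), 2))  # clearly brighter
print("composite background level:   ", round(float(composite[infrared < 0.5].mean()), 2))
```

The devices IBM envisions would span many more bands of the electromagnetic spectrum, but the principle of overlaying what different wavelengths reveal is the same.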

“Macroscopes will help us understand Earth’s complexity in infinite detail”

IBM takes “macroscopes” to mean an instrument that offers insight into the interconnected and complex nature of our planet. Rather than revealing a small thing within a big thing, macroscopes reveal a big thing through the many small things that make it up.

“In five years, we will use machine-learning algorithms and software to help us organize the information about the physical world to help bring the vast and complex data gathered by billions of devices within the range of our vision and understanding,” IBM predicts. “We call this a ‘macroscope’ — but unlike the microscope to see the very small, or the telescope that can see far away, it is a system of software and algorithms to bring all of Earth’s complex data together to analyze it for meaning.”
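
As a rough, invented illustration of that “many small things reveal a big thing” idea, the sketch below rolls individual readings from hypothetical soil-moisture sensors up into a regional view; the device names, fields, and numbers are made up.

```python
# Illustrative-only sketch: many tiny device readings aggregated into one bigger picture.
# Device names, regions, and values are invented.
from collections import defaultdict
from statistics import mean

readings = [
    {"device": "soil-001", "region": "north_field", "moisture": 0.31},
    {"device": "soil-002", "region": "north_field", "moisture": 0.29},
    {"device": "soil-003", "region": "south_field", "moisture": 0.12},
    {"device": "soil-004", "region": "south_field", "moisture": 0.14},
]

by_region = defaultdict(list)
for reading in readings:
    by_region[reading["region"]].append(reading["moisture"])

# A single regional picture emerges from the individual sensors.
summary = {region: round(mean(values), 2) for region, values in by_region.items()}
print(summary)  # {'north_field': 0.3, 'south_field': 0.13}
print({region: "irrigate" for region, value in summary.items() if value < 0.2})  # {'south_field': 'irrigate'}
```

The macroscope IBM describes would do this at planetary scale, with machine learning standing in for the hand-written averaging.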

“Medical labs ‘on a chip’ will serve as health detectives for tracing disease at the nanoscale”

Take a functioning full-scale laboratory and crunch its capabilities down to something smaller than a USB stick — that’s a lab on a chip. Lab-on-a-chip technology may be able to detect biomarkers from a handheld device, diagnosing diseases like Parkinson’s with ease.

“In the next five years, new medical labs on a chip will serve as nanotechnology health detectives — tracing invisible clues in our bodily fluids and letting us know immediately if we have reason to see a doctor,” IBM writes. “The goal is to shrink down to a single silicon chip all of the processes necessary to analyze a disease that would normally be carried out in a full-scale biochemistry lab.”

The feasibility of labs on a chip has been debated, but if they prove practical, they could revolutionize medicine and bring health care to people who otherwise lack access to it.

“Smart sensors will detect environmental pollution at the speed of light”

We typically can’t see pollutants until they’ve built up well past the point of safety. Think toxic waste and smog. However, smart sensors can already pick up on otherwise invisible chemical patterns in the air, and even in our breath. By combining these sensors with the Internet of Things (IoT), we may be able to detect contaminants early enough to avoid catastrophic events.

“In five years, networks of IoT sensors wirelessly connected to the cloud will provide continuous monitoring of the vast natural gas infrastructure, allowing leaks to be found in a matter of minutes instead of weeks, reducing pollution and waste and the likelihood of catastrophic events,” IBM predicts.
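
A minimal sketch of how such a network might flag a leak, assuming invented methane readings, a rolling-baseline window, and an alert threshold; a real deployment would stream readings from thousands of sensors to the cloud and use far more sophisticated models.

```python
# Hypothetical sketch of leak detection over a stream of methane readings (ppm).
# Sensor values, window size, and threshold are invented for illustration.
from collections import deque
from statistics import mean

def detect_leaks(readings_ppm, window=5, factor=2.0):
    """Flag readings that jump well above the recent rolling baseline."""
    baseline = deque(maxlen=window)
    alerts = []
    for minute, value in enumerate(readings_ppm):
        if len(baseline) == window and value > factor * mean(baseline):
            alerts.append((minute, value))
        baseline.append(value)
    return alerts

# Normal background levels, then a sudden spike a cloud-connected network could flag within minutes.
stream = [2.1, 2.0, 2.2, 1.9, 2.1, 2.0, 2.2, 9.5, 11.2, 12.0]
print(detect_leaks(stream))   # -> [(7, 9.5), (8, 11.2), (9, 12.0)]
```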

Dyllan Furness
Former Digital Trends Contributor