
What is deep learning?

Curious how A.I. 'brains' work? Here's a super-simple breakdown of deep learning

Deep learning is a particular subset of machine learning, the branch of artificial intelligence concerned with systems that learn from data. While this branch of programming can become very complex, it started with a very simple question: “If we want a computer system to act intelligently, why don’t we model it after the human brain?”

That one thought spawned many efforts in past decades to create algorithms that mimicked the way the human brain worked—and that could solve problems the way that humans did. Those efforts have yielded valuable, increasingly competent analysis tools that are used in many different fields.


The neural network and how it’s used

[Image: Neural network chart, via Wikipedia]

Deep learning gets its name from the depth of its neural networks, which stack many layers of processing between input and output. It is especially suited to analyzing “unstructured” data, or data that hasn’t been previously labeled by another source and may need definition. That requires careful analysis of what the data is, and repeated tests of that data to end up with a final, usable conclusion. Computers are not traditionally good at analyzing unstructured data like this.

Think about it in terms of writing: If you had ten people write the same word, that word would look very different from person to person, from sloppy to neat, and from cursive to print. The human brain has no problem understanding that it’s all the same word, because it knows how words, writing, paper, ink, and personal quirks all work. A normal computer system, however, would have no way of knowing that those words are the same, because they all look so different.

That brings us to neural networks, the algorithms specifically created to mimic the way that neurons in the brain interact. Neural networks attempt to parse data the way that a mind can: Their goal is to take messy data, like writing, and draw useful conclusions, like the words that writing is meant to represent. It’s easiest to understand neural networks if we break them into three important parts:

The input layer: At the input layer, the neural network absorbs all the unclassified data it is given. This means breaking the information down into numbers and turning them into bits of yes-or-no data, or “neurons.” If you wanted to teach a neural network to recognize words, the input layer would mathematically define the shape of each letter, breaking it down into digital values so the network can start working. The input layer can be pretty simple or incredibly complex, depending on how easy it is to represent something mathematically.
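To make that concrete, here is a minimal sketch in Python of the kind of translation an input layer performs. The 5x5 bitmap, the letter, and the variable names are purely illustrative, not taken from any real system:

```python
# A hypothetical 5x5 black-and-white bitmap of the letter "T".
# Each pixel is a yes-or-no value: 1 for ink, 0 for blank paper.
letter_T = [
    [1, 1, 1, 1, 1],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
    [0, 0, 1, 0, 0],
]

# Flatten the 2D grid into 25 input "neurons", one per pixel.
input_neurons = [pixel for row in letter_T for pixel in row]
print(input_neurons)  # 25 ones and zeros the network can start working on
```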

[Image: A complex neural network with many hidden layers]

The hidden layers: At the center of the neural network are hidden layers, anywhere from one to many. These layers are made of their own digital neurons, which are designed to activate or not activate based on the layer of neurons that precedes them. A single neuron is a basic “if this, then that” model, but layers are made of long chains of neurons, and many different layers can influence each other, creating very complex results. The goal is to allow the neural network to recognize many different features and combine them into a single realization, like a child learning to recognize each letter and then putting them together to recognize a full word, even if that word is written a little sloppily.

The hidden layers are also where a lot of deep learning training goes on. For example, if the algorithm failed to accurately recognize a word, programmers would send back, “Sorry, that’s not correct,” and the algorithm would adjust how it weighed the data until it found the right answer. Repeating this process (programmers may also adjust weights manually) allows the neural network to build up robust hidden layers that are adept at seeking out the right answers through a lot of trial and error, plus some outside instruction. Again, this is much like how the human brain works. As the above image shows, hidden layers can become very complex!
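Here is a toy sketch of that feedback loop in Python. It uses a single artificial neuron (a perceptron) learning the logical AND function rather than a full deep network, and every number in it is made up for illustration, but the core idea is the same: a wrong answer nudges the weights.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=2)  # one weight per input, randomly initialized
bias = 0.0
learning_rate = 0.1

# Training data: inputs plus the "right answers" a programmer provides.
inputs = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
targets = np.array([0, 0, 0, 1])  # AND: output 1 only when both inputs are 1

for epoch in range(20):
    for x, target in zip(inputs, targets):
        # The neuron "activates" (outputs 1) if the weighted sum crosses zero.
        prediction = 1 if x @ weights + bias > 0 else 0
        error = target - prediction           # "Sorry, that's not correct"
        weights += learning_rate * error * x  # adjust how the inputs are weighed
        bias += learning_rate * error

print([1 if x @ weights + bias > 0 else 0 for x in inputs])  # [0, 0, 0, 1]
```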

The output layer: The output layer has relatively few “neurons” because it’s where the final decisions are made. Here the neural network applies the final analysis, settles on definitions for the data, and draws the programmed conclusions based on those definitions. For example, “Enough of the data lines up to say that this word is lake, not lane.” Ultimately all data that passes through the network is narrowed down to specific neurons in the output layer. Since this is where the goals are realized, it’s often one of the first parts of the network created.
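To tie the three layers together, here is a hypothetical end-to-end pass through a tiny network in Python. The layer sizes, the random weights, and the “lake” versus “lane” labels are all invented for illustration; a real network would have learned its weights through the training described above.

```python
import numpy as np

def sigmoid(z):
    # Squashes any number into the 0-to-1 range: how strongly a neuron "fires".
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(1)

# Input layer: 25 yes-or-no pixel values, as in the earlier sketch.
x = rng.integers(0, 2, size=25).astype(float)

# Hidden layer: 10 neurons, each weighing all 25 inputs.
W_hidden = rng.normal(size=(10, 25))
hidden = sigmoid(W_hidden @ x)

# Output layer: just 2 neurons, one per possible final answer.
W_output = rng.normal(size=(2, 10))
scores = W_output @ hidden

labels = ["lake", "lane"]
print("The network says:", labels[int(np.argmax(scores))])
```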

Applications


If you use modern technology, chances are good that deep learning algorithms are at work all around you, every day. How do you think Alexa or Google Assistant understands your voice commands? They use neural networks that have been built to understand speech. How does Google know what you’re searching for before you’re done typing? More deep learning at work. How does your security cam ignore pets but recognize human movement? Deep learning once again.

Anytime software recognizes human inputs, from facial recognition to voice assistants, deep learning is probably at work somewhere underneath. However, the field also has many other useful applications. Medicine is a particularly promising area, where advanced deep learning is used to analyze DNA for flaws or molecular compounds for potential health benefits. On a more physical front, deep learning is used in a growing number of machines and vehicles to predict when equipment needs maintenance before something goes seriously wrong.

The future of deep learning


The future of deep learning is particularly bright! The great thing about a neural network is that it excels at dealing with a vast amount of disparate data (think of everything our brains have to deal with, all the time). That’s especially relevant in our era of advanced smart sensors, which can gather an incredible amount of information. Traditional computer solutions are beginning to struggle with sorting, labeling and drawing conclusions from so much data.

Deep learning, on the other hand, can deal with the digital mountains of data we are gathering. In fact, the larger the amount of data, the more efficient deep learning becomes compared to other methods of analysis. This is why organizations like Google invest so much in deep learning algorithms, and why they are likely to become more common in the future.

And, of course, the robots. Let’s never forget about the robots.

Tyler Lacoma
Former Digital Trends Contributor