Deep learning vs. machine learning: What’s the difference between the two?


Deep Learning

Deep learning is basically machine learning on a “deeper” level (pun unavoidable, sorry). It’s inspired by how the human brain works, but it requires high-end machines with discrete add-in graphics cards capable of crunching numbers, along with enormous amounts of “big” data. Small datasets actually yield lower performance.

Unlike standard machine learning algorithms, which break problems down into parts and solve them individually, deep learning solves the problem end to end. Better yet, the more data and time you feed a deep learning algorithm, the better it gets at solving a task.

In our machine learning examples, we used images of boys and girls. The program sorted those images using algorithms based mostly on spoon-fed data. With deep learning, that data isn’t hand-delivered. Instead, the program scans every pixel within an image to discover edges that can be used to distinguish between a boy and a girl. It then puts those edges and shapes into a ranked order of likely importance for telling the two apart.
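To make the pixel-scanning idea concrete, here’s a toy Python sketch of edge detection: it flags spots where brightness jumps sharply between neighboring pixels. This is purely illustrative — a real deep learning system learns filters like this from data rather than having them written by hand, and the image and threshold below are made up for the example.

```python
def find_edges(image, threshold=100):
    """Return (row, col) positions where brightness changes sharply."""
    edges = []
    for r, row in enumerate(image):
        for c in range(len(row) - 1):
            # A large difference between adjacent pixels marks an edge.
            if abs(row[c] - row[c + 1]) >= threshold:
                edges.append((r, c))
    return edges

# A tiny 4x4 grayscale image: a dark region (0) beside a bright region (255).
image = [
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
    [0, 0, 255, 255],
]

print(find_edges(image))  # the edge runs down the boundary after column 1
```

A network’s early layers learn thousands of small filters in this spirit, then later layers rank and combine the detected edges and shapes into higher-level features.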

On an even more simplified level, machine learning will distinguish between a square and a triangle based on information provided by humans: squares have four points, and triangles have three. With deep learning, the program doesn’t start out with pre-fed information. Instead, it uses an algorithm to determine how many lines each shape has, whether those lines are connected, and whether they are perpendicular. Eventually, the algorithm would figure out on its own that an inserted circle doesn’t fit its square-and-triangle sorting.
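The “hand-fed rules” half of that comparison can be sketched in a few lines of Python. Here a human has told the program how many corners each shape has — exactly the kind of spoon-fed knowledge deep learning does without. The corner counts and labels are assumptions made up for this sketch.

```python
# Human-provided rules: squares have four corners, triangles have three.
RULES = {4: "square", 3: "triangle"}

def classify(corner_count):
    """Label a shape by its corner count, or admit it doesn't fit."""
    return RULES.get(corner_count, "doesn't fit")

print(classify(4))  # square
print(classify(3))  # triangle
print(classify(0))  # doesn't fit -- a circle has no corners
```

A deep learning system would instead infer properties like corner counts from raw pixels, which is why it needs far more data and compute than this lookup table does.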

Again, this latter “deep thinking” process requires more hardware to process the big data the algorithm generates. These machines tend to reside in large datacenters, forming artificial neural networks that handle all the big data generated and supplied to artificial intelligence applications. Programs using deep learning algorithms also take longer to train because they’re learning on their own instead of relying on hand-fed shortcuts.

“Deep Learning breaks down tasks in ways that makes all kinds of machine assists seem possible, even likely. Driverless cars, better preventive healthcare, even better movie recommendations, are all here today or on the horizon,” writes Nvidia’s Michael Copeland. “With Deep Learning’s help, A.I. may even get to that science fiction state we’ve so long imagined.”

Is Skynet on the way? Not yet

A great recent example of deep learning is translation. This technology can listen to a presenter speaking in English and translate his words into a different language, through both text and an electronic voice, in real time. The achievement was a slow burn over the years owing to differences between languages, variations in language use and voice pitch, and maturing hardware capabilities.

Deep learning is also responsible for conversation-carrying chatbots, Amazon Alexa, Microsoft Cortana, Facebook, Instagram, and more. On social media, algorithms based on deep learning are what cough up contact and page suggestions. Deep learning even helps companies customize their creepy advertising to your tastes even when you’re not on their site. Yay for technology.

“Looking to the future, the next big step will be for the very concept of the ‘device’ to fade away,” says Google CEO Sundar Pichai. “Over time, the computer itself—whatever its form factor—will be an intelligent assistant helping you through your day. We will move from mobile first to an A.I. first world.”
