Deep learning is basically machine learning on a “deeper” level (pun unavoidable, sorry). It’s inspired by how the human brain works, but it requires high-end machines with discrete add-in graphics cards capable of crunching numbers, and enormous amounts of “big” data. On small datasets, deep learning actually tends to perform worse than simpler methods.
Unlike standard machine learning algorithms, which break problems down into parts and solve them individually, deep learning solves the problem end to end. Better yet, the more data and time you feed a deep learning algorithm, the better it gets at solving a task.
In our examples for machine learning, we used images of boys and girls. The program used algorithms to sort these images mostly based on spoon-fed data. With deep learning, those distinguishing features aren’t handed to the program. Instead, it scans every pixel within an image to discover edges that can be used to tell a boy from a girl, then ranks those edges and shapes by how important each one is in telling the two apart.
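To make that pixel-scanning idea concrete, here’s a minimal sketch in Python. The 5×5 grayscale grid and the brightness threshold are made up for illustration; the detector just flags spots where brightness jumps sharply between neighboring pixels. A real deep learning system learns far more sophisticated edge filters on its own, but the raw operation starts with something like this.

```python
# A tiny grayscale "image": 0 = black, 255 = white.
# The left columns are dark, the right columns are bright,
# so there's a vertical edge between columns 1 and 2.
image = [
    [0, 0, 255, 255, 255],
    [0, 0, 255, 255, 255],
    [0, 0, 255, 255, 255],
    [0, 0, 255, 255, 255],
    [0, 0, 255, 255, 255],
]

def find_vertical_edges(img, threshold=100):
    """Flag pixels where brightness jumps sharply to the right neighbor."""
    edges = []
    for y, row in enumerate(img):
        for x in range(len(row) - 1):
            if abs(row[x + 1] - row[x]) > threshold:
                edges.append((x, y))
    return edges

print(find_vertical_edges(image))
# Every flagged pixel sits at column 1 -- exactly where dark meets bright.
```

Here the “edge” is hand-coded as a brightness jump; in deep learning, the network would discover which kinds of jumps and shapes matter by itself.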
On an even more simplified level, machine learning distinguishes between a square and a triangle based on information provided by humans: squares have four points, and triangles have three. With deep learning, the program doesn’t start out with pre-fed information. Instead, it uses an algorithm to determine how many lines the shapes have, whether those lines are connected, and whether they are perpendicular. Naturally, the algorithm would eventually figure out that a circle thrown into the mix doesn’t fit its square-and-triangle sorting.
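The square-versus-triangle contrast can be sketched in a few lines of Python. This is the traditional machine learning side of the analogy: a classifier running entirely on human-provided rules, with shapes reduced to simple vertex counts (an assumption made purely for illustration). A deep learning system would instead have to discover those counting rules itself from raw pixels.

```python
def classify_shape(num_vertices):
    """Rule-based classifier: the rules are hand-fed by humans,
    not learned from data."""
    if num_vertices == 3:
        return "triangle"
    if num_vertices == 4:
        return "square"
    return "unknown"  # e.g. a circle has no corners to count

print(classify_shape(4))  # square
print(classify_shape(3))  # triangle
print(classify_shape(0))  # unknown -- the circle doesn't fit the rules
```

The limitation is exactly the one the article describes: the classifier can only recognize what its human-written rules anticipate, while a learned system can adjust its own notion of what separates the categories.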
Again, this latter “deep thinking” process requires more hardware to process the big data generated by the algorithm. These machines tend to reside in large datacenters, where they form artificial neural networks that handle all the big data generated by and supplied to AI applications. Programs using deep learning algorithms also take longer to train because they’re learning on their own instead of relying on hand-fed shortcuts.
“Deep Learning breaks down tasks in ways that makes all kinds of machine assists seem possible, even likely. Driverless cars, better preventive healthcare, even better movie recommendations, are all here today or on the horizon,” writes Nvidia’s Michael Copeland. “With Deep Learning’s help, A.I. may even get to that science fiction state we’ve so long imagined.”
Is Skynet on the way? Not yet
A great recent example of deep learning is real-time translation. This technology can listen to a presenter speaking in English and translate the words into a different language, as both text and an electronic voice, in real time. The achievement was a slow burn over the years, owing to differences in vocabulary, usage, and voice pitch, as well as hardware that only recently became capable enough.
Deep learning is also responsible for conversation-carrying chatbots, Amazon Alexa, Microsoft Cortana, Facebook, Instagram, and more. On social media, algorithms based on deep learning are what cough up contact and page suggestions. Deep learning even helps companies customize their creepy advertising to your tastes even when you’re not on their site. Yay for technology.
“Looking to the future, the next big step will be for the very concept of the ‘device’ to fade away,” says Google CEO Sundar Pichai. “Over time, the computer itself—whatever its form factor—will be an intelligent assistant helping you through your day. We will move from mobile first to an A.I. first world.”