
Analog A.I.? It sounds crazy, but it might be the future

This story is part of Tech for Change: an ongoing series in which we shine a spotlight on positive uses of technology, and showcase how they're helping to make the world a better place.

Forget digital. The future of A.I. is … analog? At least, that’s the assertion of Mythic, an A.I. chip company that, in its own words, is taking “a leap forward in performance and power” by going back in time. Sort of.

Before ENIAC, the world’s first room-sized programmable, electronic, general-purpose digital computer, buzzed to life in 1945, virtually every computer was analog, as computers had been since their invention.


Analog computers are a bit like stereo amplifiers, using a continuously variable range to represent desired values. In an analog computer, numbers are represented by currents or voltages, rather than the zeros and ones used in a digital computer. While ENIAC marked the beginning of the end for analog computing, analog machines stuck around in some form until the 1950s and 1960s, when digital transistors won out.

“Digital kind of replaced analog computing,” Tim Vehling, senior vice president of product and business development at Mythic, told Digital Trends. “It was cheaper, faster, more powerful, and so forth. [As a result], analog went away for a while.”

In fact, to alter a famous quotation often attributed to Mark Twain, reports of the death of analog computing may have been greatly exaggerated. If the triumph of the digital transistor represented the beginning of the end for analog computers, it may only have been the beginning of the end of the beginning.

Building the next great A.I. processor


Mythic isn’t building purposely retro tech, though. This isn’t some steampunk startup operating out of a vintage clock tower headquarters filled with Tesla coils; it’s a well-funded tech company, based in Redwood City, California, and Austin, Texas, that’s building Mythic Analog Matrix Processors (Mythic AMP) promising advances in power, performance, and cost through a unique analog compute architecture that diverges significantly from conventional digital architectures.

Products like its announced M1076, a single-chip analog computation device, purport to usher in an age of compute-heavy processing at impressively low power.

“There’s definitely a lot of interest in making the next great A.I. processor,” said Vehling. “There’s a lot of investment and venture capital money going into this space, for sure. There’s no question about that.”

The analog approach isn’t just a marketing gimmick, either. Mythic sees problems in the future for Moore’s Law, the famous observation made by Intel co-founder Gordon Moore in 1965, predicting that the number of transistors able to be squeezed onto an integrated circuit doubles roughly every couple of years. This observation has helped usher in a period of sustained exponential improvement for computers over the past six decades, helping support the amazing advances A.I. research has made during that same period.

But Moore’s Law is running into challenges of the physics variety. Advances have slowed as a result of the physical limitations of constantly attempting to shrink components. Approaches like optical and quantum computing offer possible ways around this. Meanwhile, Mythic’s analog approach seeks to create compute-in-memory elements that function like tunable resistors, supplying inputs as voltages and collecting the outputs as currents. In doing so, the idea is that the company’s chips can capably handle the matrix multiplication needed for artificial neural networks to function.

As the company explains: “We use analog computing for our core neural network matrix operations, where we are multiplying an input vector by a weight matrix. Analog computing provides several key advantages. First, it is amazingly efficient; it eliminates memory movement for the neural network weights since they are used in place as resistors. Second, it is high performance; there are hundreds of thousands of multiply-accumulate operations occurring in parallel when we perform one of these vector operations.”
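The physics behind this description can be sketched numerically. In a resistive compute-in-memory array, each weight is stored as a conductance, the input vector is applied as voltages, and Ohm’s law plus Kirchhoff’s current law turn the summed currents on each output wire into one element of a matrix-vector product. Here is a minimal NumPy sketch of that idea; the array sizes and values are hypothetical, not Mythic’s actual design:

```python
import numpy as np

# Hypothetical crossbar: weights stored in place as conductances.
# Rows correspond to input lines (voltages), columns to output
# lines, where currents from every cell sum together.
conductances = np.array([[1.0, 0.5],
                         [0.25, 2.0],
                         [0.5, 1.0]])   # 3 inputs -> 2 outputs

voltages = np.array([0.2, 0.4, 0.1])    # input vector, applied as voltages

# Ohm's law per cell (I = G * V) plus Kirchhoff's current law per
# column (currents sum on the shared output wire) together compute
# a matrix-vector product in a single physical step.
currents = voltages @ conductances

print(currents)  # each output is one multiply-accumulate over all inputs
```

Every multiply-accumulate in a column happens simultaneously in the analog version, which is where the claimed parallelism and efficiency come from: no weight ever has to be fetched from a separate memory.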

“There’s a lot of ways to tackle the problem of A.I. computation,” Vehling said, referring to the various approaches being explored by different hardware companies. “There’s no wrong way. But we do fundamentally believe that the keep-throwing-more-transistors-at-it, keep-making-the-process-nodes-smaller — basically the Moore’s Law approach — is not viable anymore. It’s starting to prove out already. So whether you do analog computers or not, companies will have to find a different approach to make next-generation products that are high computation, low power, [et cetera].”

The future of A.I.


If this problem is not solved, it’s going to have a big impact on the further advancement of A.I., especially where A.I. runs locally on devices. Right now, some of the A.I. we rely on daily combines on-device processing and the cloud. Think of it like having an employee who’s able to make decisions up to a certain level, but must then call their boss for advice.

This is the model used by, for instance, smart speakers, which carry out tasks like keyword spotting (“OK, Google”) locally, but then outsource the actual spoken word queries to the cloud, thereby letting household devices harness the power of supercomputers stored in massive data centers thousands of miles away.
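That division of labor can be sketched in a few lines. In the toy version below, a cheap always-on check runs on-device, and only full queries make the round trip to a data center; every name here is an illustrative stand-in, not any vendor’s actual API:

```python
# Toy sketch of the smart-speaker split: a low-power wake-word check
# runs locally, and only recognized queries are sent to the cloud.

WAKE_WORD = "ok google"

def heard_wake_word(transcript):
    """Stand-in for an on-device, low-power keyword spotter."""
    return transcript.lower().startswith(WAKE_WORD)

def send_to_cloud(query):
    """Stand-in for the round trip to a data-center model."""
    return f"cloud answer for: {query!r}"

def handle(transcript):
    if not heard_wake_word(transcript):
        return None                      # ignored entirely on-device
    query = transcript[len(WAKE_WORD):].strip()
    return send_to_cloud(query)          # heavy lifting is outsourced

print(handle("OK Google what time is it"))
```

The appeal of more capable edge chips is precisely that the second function, the expensive one, could move out of the data center and onto the device itself.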

That’s all well and good, although some tasks require instant responses. And, as A.I. gets smarter, we’ll expect more and more of it. “We see a lot of what we call Edge A.I., which is not relying on the cloud, when it comes to industrial applications, machine vision applications, drones, in video surveillance,” Vehling said. “[For example], you may want to have a camera trying to identify somebody and take action immediately. There are a lot of applications that do need immediate application on a result.”

A.I. chips need to keep pace with other breakthroughs in hardware. Cameras, for instance, are getting better all the time. Picture resolution has increased dramatically over the past decades, meaning that deep A.I. models for image recognition must parse ever-larger amounts of image data to carry out analytics.
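The scaling here is unforgiving: the multiply-accumulate count of a convolutional layer grows linearly with pixel count, so tripling each image dimension multiplies the arithmetic ninefold. A rough back-of-the-envelope sketch, with hypothetical layer sizes:

```python
# Rough illustration with made-up layer sizes: the multiply-accumulate
# (MAC) count of a convolutional layer scales linearly with pixel count.

def conv_macs(height, width, in_ch, out_ch, kernel=3):
    # one MAC per kernel tap, per input channel, per output channel, per pixel
    return height * width * in_ch * out_ch * kernel * kernel

hd  = conv_macs(720, 1280, 3, 64)    # one 720p frame
uhd = conv_macs(2160, 3840, 3, 64)   # one 4K frame: 9x the pixels

print(uhd / hd)  # -> 9.0: nine times the arithmetic for the same layer
```

Multiply that across dozens of layers and 30 frames per second, and the appetite for low-power matrix math becomes clear.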

Add onto this the growing expectations for what people believe should be extractable from an image — whether that’s mapping objects in real-time, identifying multiple objects at once, figuring out the three-dimensional context of a scene — and you realize the immense challenge that A.I. systems face.

Whether it’s for offering more processing power while keeping devices small, or the privacy demands that require local processing instead of outsourcing, Mythic believes its compact chips have plenty to offer.

The roll-out


“We’re [currently] in the early commercialization stages,” said Vehling. “We’ve announced a couple of products. So far we have a number of customers that are evaluating [our technology] for use in their own products… Hopefully by late this year, early next year, we’ll start seeing companies utilizing our technology in their products.”

Initially, he said, this is likely to be in enterprise and industrial settings, such as video surveillance, high-end drone manufacturing, and automation. Don’t expect consumer applications to lag too far behind, though.

“Beyond 2022 — [2023] going into ’24 — we’ll start seeing consumer tech companies [adopt our technology] as well,” he said.

If analog computing turns out to be the innovation that powers the augmented and virtual reality needed for the metaverse to function … well, isn’t that about the most perfect meeting point of steampunk and cyberpunk you could hope for?

Hopefully, Mythic’s chips prove less imaginary and unreal than the company’s chosen name would have us believe.

Luke Dormehl
Former Digital Trends Contributor