
Multiferroic memory could slash RAM power consumption – a decade from now

Every form of chip-based memory inside modern PCs relies on the application of current to store information. The DRAM you use every day, for example, stores charge in capacitors. Whether a capacitor is or isn’t charged serves to indicate the value of a bit, be it a 0 or 1.

This works, and the value can be changed with blazing speed, but there are downsides. If power is lost, the data is lost as well (solid-state drives get around this to an extent, but can still lose data if left disconnected for years). Maintaining that current also draws power and produces heat. Engineers have long wanted a form of memory that's as quick as DRAM yet doesn't require a constant application of current.


Now researchers at Cornell University have made a discovery that could finally turn that dream into reality. A team led by postdoctoral associate John Heron and professors Darrell Schlom and Dan Ralph has found that data can be stored in memory made of bismuth ferrite without a constant application of current.

Bismuth ferrite, in case you’re wondering, is a chemical compound with an unusual property: it’s multiferroic. This means it has its own permanent magnetic field and is always electrically polarized. Applying an electric field can flip the polarization, and once flipped it remains in its new state until another jolt is applied. The polarization can be read as a bit value, which makes the invention usable as memory.

This means power is required only to change the polarization, not to maintain it, dramatically cutting both power consumption and the heat it produces. And unlike previous similar devices, which worked only at extremely low temperatures, the bismuth ferrite device functions at normal ambient temperatures.
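The contrast with DRAM described above can be sketched as a toy model. This is purely illustrative: the class names, the `tick` abstraction, and the decay numbers are invented for this sketch and don't correspond to any real memory hardware or API.

```python
# Toy model contrasting a DRAM-style cell with a multiferroic cell.
# Illustrative only: names and numbers here are invented for this sketch.

class DramCell:
    """Stores a bit as capacitor charge; charge leaks unless refreshed."""
    def __init__(self):
        self.charge = 0.0

    def write(self, bit):
        self.charge = 1.0 if bit else 0.0

    def tick(self, powered):
        # Charge decays every cycle; only a powered refresh restores it.
        self.charge *= 0.5
        if powered and self.charge > 0.25:
            self.charge = 1.0

    def read(self):
        return 1 if self.charge > 0.5 else 0


class MultiferroicCell:
    """Stores a bit as electric polarization; the state persists unpowered."""
    def __init__(self):
        self.polarization = -1  # -1 or +1, read as bit 0 or 1

    def write(self, bit):
        # Energy is spent only here, to flip the polarization.
        self.polarization = 1 if bit else -1

    def tick(self, powered):
        pass  # no refresh needed; the state is retained either way

    def read(self):
        return 1 if self.polarization > 0 else 0
```

Write a 1 to both cells, then cut power for a few cycles: the DRAM-style cell's charge decays below the read threshold and the bit is gone, while the multiferroic cell still reads back 1.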


The potential of this invention is significant, but it’s still at a very early stage of development. It’s important to note that the researchers have put together just one device, which can hold a single bit; a stick of DRAM has millions of capacitors and transistors. For this invention to be useful, researchers will need to find a way to put huge numbers of the devices together, all built using bismuth ferrite, an entirely synthetic material.

In short, this is not something coming to your computer next year, the year after, or even five years from now. If researchers can find a way to construct a functioning memory chip, however, this invention could drastically cut power draw in computers, smartphones, and tablets, and let engineers achieve even smaller, more compact designs.

Image credit: Mycteria/Shutterstock

Matthew S. Smith
Matthew S. Smith is the former Lead Editor, Reviews at Digital Trends. He previously guided the Products Team, which dives…