IBM Looks to DNA To Self-Assemble Future Chips

Technology giant IBM has always invested tremendous amounts of money and talent into technology research, and has one of the largest patent portfolios in the world (perhaps the largest) to show for its efforts. Today, IBM announced another technology breakthrough that, in a decade or so, may revolutionize the way computer CPUs and other microprocessors are made: the company has come up with a way to use organic DNA molecules as a scaffold on which nanotubes and nanowires can self-assemble into precise patterns compatible with today’s lithographic chip-making technology. The technique may enable chip features smaller than 22nm (IBM is currently pushing traditional chip techniques toward 28nm) and may ultimately help make chips smaller and more power efficient.

IBM says the technique marks the first time biological molecules have been used to help with semiconductor processing and manufacture. The technique was a joint undertaking of IBM Research and the California Institute of Technology.

“The cost involved in shrinking features to improve performance is a limiting factor in keeping pace with Moore’s Law and a concern across the semiconductor industry,” said IBM Research’s manager of science and technology Spike Narayan, in a statement. “The combination of this directed self-assembly with today’s fabrication technology eventually could lead to substantial savings in the most expensive and challenging part of the chip-making process.”

Dubbed “DNA origami,” the technique puts a long strand of viral DNA in a solution with short synthetic oligonucleotide strands. The short strands bond with the long viral segment, functioning as staples that fold it into a particular 2D shape. Here’s the trick: those oligonucleotide strands can be modified to carry attachment points for nanoscale components, such as carbon nanotubes and silicon nanowires, as small as 6 nm. The process can thus create triangles, squares, and other shapes from about 100 to 150 nm on a side, but only as thick as the DNA double helix itself. These templates can then be used in traditional optical or laser lithography to etch out patterns for chips.

Even if the process lives up to its promise, researchers estimate it will be 10 years or more before products using it come to market. Still, the technology holds the promise of chipmaking techniques that create much smaller components than are possible today, with a much simpler (and less expensive) production process.

Geoff Duncan
Former Digital Trends Contributor