It’s easy to forget sometimes that these magical machines we use are the result of some pretty complicated science. Underneath the shiny user interfaces and first-person shooters and productivity apps lie the fundamental laws of physics, mathematics, and chemistry. Accordingly, advances in things like processor speed and storage capacity come from basic advances in our understanding of how the world works.
Recently, some scientists parlayed their work on such fundamental science into a purely theoretical, but potentially massive, increase in the speed of personal computer memory, as Engadget reports. The researchers, hailing from Europe and Russia, are looking into how terahertz radiation, or T-rays, could make memory cells up to 1,000 times faster to reset and ready for reuse.
According to the scientists, the theoretical result of using T-rays for memory switching, instead of the magnetic fields used by today’s technology, would be seriously fast memory that would significantly speed up personal computer performance. Today, memory speeds are a serious limitation relative to modern processor performance, so such a breakthrough would be a tremendous benefit to all kinds of high-performance computing applications.
Today, T-rays are being researched for use in a number of imaging applications, such as scanning for concealed weapons and safer medical imaging, along with a variety of communications and manufacturing purposes. T-rays are of particular interest for such applications because they lie in the spectrum between microwave and infrared radiation, and are non-ionizing and thus not destructive to various materials, including human body tissue.
The research so far on using T-rays for computing applications is still in its initial stages, so we’re not likely to enjoy any benefits anytime soon. Whenever the new technology does make its way to PCs, however, we’ll see significantly faster machines that are far more capable of fully leveraging advances in processor performance for things like advanced data modeling and virtual reality.