Most folks know that data travels at much higher speeds and capacities over optical cables than over copper equivalents. A prime example is the tremendous transfer speed available on fiber optic networks compared with copper POTS (plain old telephone service) lines. We’re talking thousands, even millions of times more data.
If you think about it, there’s no reason that the technology (or something similar) that allows us to move massive data around from Point A to Point B shouldn’t help speed up our computers too.
To that end, an English technology company dubbed Optalysys says that in January 2015, it will demonstrate a prototype optical computer that performs calculations at the speed of light. If all goes well, the company says that we will see exascale supercomputers as early as 2020.
What is an optical computer?
The term optical computing can refer to many different types of technologies. Basically, it refers to computers that use light, rather than electricity, to perform many of their tasks.
While Optalysys’ approach, which employs low-power lasers and a huge liquid crystal grid, is much different from most other competing optical-based models, the company’s results are very promising so far.
Though highly complicated in its details, the Optalysys approach works by projecting low-power lasers onto the liquid crystal grid, which in turn triggers reactions within the grid. Those interactions carry out sophisticated calculations, handling thousands, even millions of operations simultaneously. By using multiple grids, either in sequence or in parallel, you can significantly increase capacity and processing power.
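To get a feel for what "every point on the grid at once" means, optical stages of this general kind are often described in terms of Fourier optics: a lens naturally produces the 2D Fourier transform of the light field passing through it, transforming the entire grid in a single pass. The sketch below is a rough software analogy only — the function names, the FFT framing, and the grid size are illustrative assumptions, not Optalysys's documented design.

```python
import numpy as np

def optical_stage(grid: np.ndarray) -> np.ndarray:
    """Software stand-in for one optical pass over a grid.

    In classic Fourier optics, a lens transforms the whole light
    field at once; here an FFT plays that role on an array.
    """
    return np.fft.fft2(grid)

def cascade(grid: np.ndarray, stages: int) -> np.ndarray:
    """Chain multiple grids in sequence, as the article describes."""
    for _ in range(stages):
        grid = optical_stage(grid)
    return grid

# A 1024x1024 grid encodes over a million values; a single optical
# pass would act on all of them simultaneously.
field = np.random.rand(1024, 1024) + 1j * np.random.rand(1024, 1024)
result = cascade(field, stages=2)
print(result.shape)
```

The point of the analogy is the parallelism: in software the FFT still loops over data, but in the optical version the transform happens physically, in one step, for the whole grid.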
In addition to providing massive computing oomph, the Optalysys system consumes very little power.
The company provided the following statistic to demonstrate the incredible savings in electricity: An optical computer will use roughly $3,500 worth of electricity each year, while today’s most powerful supercomputer, running at its peak performance of 34 petaflops, sucks juice at an annual cost of about $21 million.
What could a company do with those kinds of power savings? The truth is that very few companies (perhaps Google, Microsoft, Amazon, and so on) require supercomputers with that kind of processing oomph. Most of us, on the other hand, would have little use for anywhere close to that kind of computing power.
Still, in addition to reducing the power bill literally by millions, optical computers should also decrease the size of supercomputers themselves drastically, thereby reducing space requirements and a slew of other expenses associated with housing humongous machines. The ability to deliver supercomputer power in a desktop-size machine opens up possibilities in all kinds of areas, including medicine, digital video and other media editing, 3D modeling, CAD—the list goes on and on.
If and when optical computers go mainstream, imagine what such a desktop machine would be capable of.
When will we see optical computers?
According to Optalysys, its optical computing technology has already reached NASA Technology Readiness Level (TRL) 4. This means that it’s ready for full-scale lab testing.
As mentioned, the company says that we’ll see a prototype by January 2015, and that it hopes to have two commercial demo systems up and running by 2017.
Two systems could arrive by 2020: a big data analysis system for augmenting conventional supercomputers, and a standalone “Optical Solver” supercomputer that, according to a company spokesman, should start at 9 petaflops and scale up to 17.1 exaflops.
However, while the technology itself seems sound, Optalysys is just getting started. With this in mind, the 2020 time frame seems ambitious.
If all works out as planned though, the next bunch of years could see some freakishly powerful computers.