As is obvious from the endlessly discovered bugs, flaws, and slow-downs that crop up in websites and software of all kinds, everything needs updating from time to time — preferably on a regular basis. That’s a time-consuming part of software development, which is why Adobe and MIT are working on a project to have code optimize itself.
The big problem is “code rot,” which sets in as standards change and people move to new hardware and software platforms. Compatibility issues arise and everything starts to slow down, because the existing code just isn’t efficient enough to keep up. But having code improve itself seems like something only an AI-driven future could deliver … doesn’t it?
Apparently not, as the joint project between Adobe and MIT, known as Helium, has already delivered a strong proof of concept. Using Adobe’s Photoshop image-editing tool, the Helium project analyzed the commands being issued by image filters and compared them to the end result. From there, the software was able to run variants with certain commands removed, checking whether the same visual effect could be achieved without them.
In that way, the filter code effectively optimized itself, delivering the same result from a leaner codebase. When those commands were then converted to run on GPU hardware as well, Helium was able to make the filters run as much as 75 percent faster than before.
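The idea described above — drop a command, re-run, and keep the removal only if the output is unchanged — can be sketched in a few lines. This is a toy illustration of the general technique, not Helium’s actual implementation; all names here (`apply_pipeline`, `prune_filters`, the toy filters) are invented for the example, and real image comparison would of course operate on pixel data rather than integers.

```python
# Hypothetical sketch of Helium-style dead-command pruning: compute a
# reference output from the full filter pipeline, then test whether each
# step can be dropped without changing that output.

def apply_pipeline(image, filters):
    """Apply each filter function in sequence to the image."""
    for f in filters:
        image = f(image)
    return image

def prune_filters(image, filters):
    """Greedily drop any filter whose removal leaves the output unchanged."""
    reference = apply_pipeline(image, filters)
    kept = list(filters)
    for f in list(kept):
        trial = [g for g in kept if g is not f]
        if apply_pipeline(image, trial) == reference:
            kept = trial  # this filter had no visible effect; remove it
    return kept

# Toy model: "images" are integers and "filters" are arithmetic ops.
blur     = lambda x: x + 0   # contributes nothing in this toy model
sharpen  = lambda x: x * 2
identity = lambda x: x

# prune_filters keeps only the sharpen step, since the other two
# don't change the result.
optimized = prune_filters(10, [blur, sharpen, identity])
```

The greedy removal here is the “best-case scenario” the researchers mention: it only works when dropping a step provably leaves the output identical, which is easy to check for deterministic filters but much harder for code with side effects.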
Although the researchers did admit they were working with a best-case scenario for automated optimization, it shows that certain code can be tested to see whether it can be made to run faster. We imagine Photoshop could use further optimization, but ExtremeTech points out that this is mainly an MIT project; future developments probably won’t improve the old image editor. It will be interesting to see what other software could be improved in this manner.
Do you use any older software regularly that you think could benefit from automatic optimizations?