
Artificial intelligence could help us see farther into space than ever before

Distortions in space-time sound like they’d be more of a concern on an episode of Star Trek than they would in the real world. However, that’s not necessarily true: analyzing images of gravitational lensing could enormously extend both the range and resolution of telescopes like Hubble, allowing us to see farther into the universe than has been possible before.

The good news? Applying an artificial intelligence neural network to this problem accelerates the analysis by a factor of roughly 10 million compared with previous methods. That means analysis which could take human experts weeks or even months to complete can now be carried out by a neural net in a fraction of a second.

Developed by researchers at Stanford University and the SLAC National Accelerator Laboratory, the new neural network analyzes images of so-called “gravitational lensing.” This is an effect first predicted by Albert Einstein, who suggested that massive objects such as stars curve the light that passes around them. The effect is similar to a telescope in that it allows us to examine distant objects with more clarity. Unlike a telescope, however, gravitational lenses distort objects into smeared rings and arcs, so making sense of them requires the calculating power of a computer.

To train their network, the researchers showed it around half a million simulated images of gravitational lenses. Afterward, the neural net was able to spot new lenses and determine their properties, including how their mass was distributed and how strongly they magnified the background galaxy.
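To make the idea concrete, here is a hedged toy sketch in Python. It is not the authors' model (they trained a convolutional network on half a million simulated lenses); a plain least-squares regressor on tiny simulated "ring" images stands in for it, with the ring radius playing the role of a lens parameter to be recovered:

```python
import numpy as np

rng = np.random.default_rng(0)
SIZE = 16  # toy image resolution

def make_lens_image(radius):
    """Render a toy 'Einstein ring': pixels brighten near the given radius."""
    y, x = np.mgrid[:SIZE, :SIZE] - SIZE / 2
    r = np.hypot(x, y)
    ring = np.exp(-((r - radius) ** 2))                   # ring of brightness
    return ring + 0.05 * rng.standard_normal(ring.shape)  # observation noise

# Simulated training set: images labeled with their ring radius, standing in
# for the half a million simulated lenses used to train the real network.
radii = rng.uniform(2.0, 6.0, size=500)
X = np.stack([make_lens_image(r).ravel() for r in radii])
A = np.c_[X, np.ones(len(X))]                  # add a bias column
w, *_ = np.linalg.lstsq(A, radii, rcond=None)  # fit the regressor

# "Analyze" a new, unseen lens image in a fraction of a second.
test_image = make_lens_image(4.0).ravel()
predicted_radius = float(np.append(test_image, 1.0) @ w)
```

A real pipeline would replace the least-squares fit with a deep convolutional network, but the shape of the problem is the same: learn a mapping from simulated images to lens parameters, then apply it to new observations.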

Given that the Large Synoptic Survey Telescope (LSST), a 3.2-gigapixel camera currently under construction at SLAC, is expected to increase the number of known strong gravitational lenses from a few hundred to tens of thousands, this work comes at the perfect time.

“We won’t have enough people to analyze all these data in a timely manner with the traditional methods,” said postdoctoral fellow Laurence Perreault Levasseur, a co-author on the associated Nature research paper. “Neural networks will help us identify interesting objects and analyze them quickly. This will give us more time to ask the right questions about the universe.”

Impressively, the neural network doesn’t even need a supercomputer to run on: one of the tested neural nets was designed to work on an iPhone. Studying the universe in greater detail than ever? Turns out there’s an app for that!

Luke Dormehl
How astronomers used James Webb to detect methane in the atmosphere of an exoplanet
An artist’s rendering of the warm exoplanet WASP-80 b, whose color may appear bluish to human eyes due to the lack of high-altitude clouds and the presence of atmospheric methane identified by NASA’s James Webb Space Telescope, similar to the planets Uranus and Neptune in our own solar system.

One of the amazing abilities of the James Webb Space Telescope is that it can not only detect the presence of far-off planets, but also peer into their atmospheres to see what they are composed of. This was extremely difficult to do with previous telescopes, which lacked the powerful instruments needed for this kind of analysis, but scientists using Webb recently announced they had made a rare detection of methane in an exoplanet atmosphere.

Scientists studied the planet WASP-80 b using Webb's NIRCam instrument, which is best known as a camera but also has a slitless spectroscopy mode which allows it to split incoming light into different wavelengths. By looking at which wavelengths are missing because they have been absorbed by the target, researchers can tell what an object -- in this case, a planetary atmosphere -- is composed of.
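The matching step can be illustrated with a small hedged sketch. The band positions below are made up for illustration (real line lists come from laboratory spectroscopy); the point is the logic: see where the transit appears deeper, and ask which candidate molecule's absorption bands line up with those dips:

```python
import numpy as np

# Illustrative absorption-band centers in micrometers -- NOT real line lists.
BANDS = {
    "methane": [1.7, 2.3, 3.3],
    "water":   [1.4, 1.9, 2.7],
}

wavelengths = np.linspace(1.0, 4.0, 600)

def transit_depth(band_centers):
    """Toy transmission spectrum: the transit looks deeper at wavelengths the
    atmosphere absorbs, because the planet blocks more starlight there."""
    depth = np.full_like(wavelengths, 0.010)  # baseline planet/star area ratio
    for c in band_centers:
        depth += 0.002 * np.exp(-((wavelengths - c) / 0.05) ** 2)
    return depth

observed = transit_depth(BANDS["methane"])  # stand-in for a measured spectrum

def band_score(molecule):
    """Total observed depth at the molecule's band centers."""
    idx = [np.argmin(np.abs(wavelengths - c)) for c in BANDS[molecule]]
    return float(observed[idx].sum())

best_match = max(BANDS, key=band_score)
```

In practice the comparison is done by fitting full atmospheric models to the spectrum, but the underlying question is the same: which molecules' absorption features explain the missing light?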

Juice spacecraft gears up for first ever Earth-moon gravity boost
Artist's impression of ESA's Jupiter Icy Moons Explorer (JUICE) approaching Earth.

The European Space Agency (ESA)'s Juice mission is heading to Jupiter, but it isn't traveling all that way in a straight line. Instead, like most solar system missions, the spacecraft makes use of the gravity of other planets to give it a push on its way.

But Juice will be making an unusual maneuver next year, carrying out the first gravity-assist flyby around both Earth and the moon. This week, the spacecraft made its longest maneuver yet to get into position ahead of the first-of-its-kind flyby in 2024.

Super high energy particle falls to Earth; its source is a mystery
Artist’s illustration of ultra-high-energy cosmic ray astronomy to clarify extremely energetic phenomena.

Researchers have detected one of the highest-energy particles ever falling to Earth. Cosmic rays are high-energy particles that come from sources in space such as the sun, but this recent detection is more powerful than anything that can be explained by known sources in our galaxy or even beyond. The particle had an energy of 2.4 x 10^20 eV, which is millions of times the energy of the particles produced in a particle collider.
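For scale, a quick back-of-the-envelope comparison against the roughly 13 TeV (1.3 x 10^13 eV) collision energy of the Large Hadron Collider:

```python
cosmic_ray_energy_eV = 2.4e20  # energy of the detected particle
lhc_collision_eV = 1.3e13      # approximate LHC proton-proton collision energy

# Ratio works out to tens of millions, consistent with "millions of times".
ratio = cosmic_ray_energy_eV / lhc_collision_eV
```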

