Yesterday, NASA confirmed a vital piece of information for all mankind: the sun is, without a doubt, a sphere. For the first time, the space agency can see both the front and back of the sun at once. Luckily, there were no surprises. NASA now has an uninterrupted feed of the entire sun, allowing it to monitor the star and gather invaluable information about its health and inner workings.
To obtain the 360-degree view, NASA launched the twin STEREO probes into solar orbits that carried them to opposite sides of the sun. Each satellite sends back high-resolution pictures of its half of the star, and computers here on Earth combine the images into a full view. These cameras see far more than the human eye can: the probes capture images in four wavelengths of extreme ultraviolet light, giving scientists a steady stream of data about solar activity.
“With data like these, we can fly around the sun to see what’s happening over the horizon—without ever leaving our desks,” said STEREO program scientist Lika Guhathakurta. “I expect great advances in theoretical solar physics and space weather forecasting.”
Much as with weather forecasting on Earth, seeing the far side of the sun helps scientists spot features like sunspots forming before they rotate into view of our planet, giving us valuable time to prepare for events like flares and plasma clouds that could head our way. “With this nice global model, we can now track solar storms heading toward other planets, too,” points out Guhathakurta. “This is important for NASA missions to Mercury, Mars, asteroids … you name it.”
The STEREO probes left Earth in October 2006 and have been drifting toward alignment ever since. The two probes reached opposite sides of the sun on Feb. 6, 2011, which is when NASA received its first full view of the star. The agency receives new photos every 10 minutes.