Last week, a large solar flare was captured in action by the Solar Dynamics Observatory. The results were nothing short of spectacular.
The SDO is a satellite orbiting the Earth at a great distance, which permits it to record solar variability year-round. Understanding solar variability matters to society for a number of reasons. The larger solar flares can disturb the Earth's magnetic field severely enough to disrupt power grids, GPS navigation, air travel, financial services, and radio communications. (Last week's solar flare was not quite large enough to have these sorts of effects.)
Thanks to its placement, SDO can warn us when a solar flare has occurred so that we are prepared for its effects when it gets here. Scientists also hope to use its data to learn how to predict solar flares.
There are other, less immediate yet equally important reasons to study the Sun more closely. Because the Sun is so powerful, even small changes in its irradiance could affect our climate. And despite all we have learned about the Universe we live in, we still do not know what really goes on deep inside the Sun. Learning more about the interior of our Sun could help us better understand what happens in the depths of other stars.
On board are three experiments: the Atmospheric Imaging Assembly (AIA), the Extreme Ultraviolet Variability Experiment (EVE), and the Helioseismic and Magnetic Imager (HMI). Each experiment has its own specialty and gathers its own data.
And it is a lot of data. SDO continuously beams down 150 million bits of data per second, year-round, or about 1.5 terabytes per day. That's almost 50 times more science data than any other mission in NASA history, according to the SDO media kit.
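Those figures are easy to sanity-check with a quick back-of-the-envelope calculation (using decimal terabytes; the small gap from the quoted 1.5 TB/day presumably reflects transmission overhead):

```python
# Rough sanity check of SDO's quoted downlink figures.
BITS_PER_SECOND = 150e6       # 150 million bits per second
SECONDS_PER_DAY = 86_400

bytes_per_day = BITS_PER_SECOND * SECONDS_PER_DAY / 8
terabytes_per_day = bytes_per_day / 1e12  # decimal TB

print(f"{terabytes_per_day:.2f} TB/day")  # 1.62 TB/day at the full rate
```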
"SDO needs a dedicated antenna to receive the data continuously being sent down from the spacecraft, but to accommodate maintenance/downtime a second antenna is needed periodically," said Don Woodraska, the lead for science data processing for EVE. "There are no other Ka-band antennas that are compatible with the SDO high data rate."
That's why SDO has two dedicated radio antennas at a facility near Las Cruces, New Mexico. Providing backup in the event of hardware failure or data-line outage is challenging, given that SDO has no room on board to store 1.5 terabytes per day. Instead, the ground station in New Mexico provides 30 days' worth of temporary storage.
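At that data rate, a 30-day buffer is a substantial amount of disk. A quick sizing estimate, assuming the quoted 1.5 TB/day:

```python
TB_PER_DAY = 1.5   # quoted daily science data volume
BUFFER_DAYS = 30   # temporary storage window at the ground station

buffer_tb = TB_PER_DAY * BUFFER_DAYS
print(f"{buffer_tb:.0f} TB of temporary storage")  # 45 TB
```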
AIA and HMI
The AIA and HMI data is sent to the Joint Science Operations Center (JSOC) at Stanford University in California, and the EVE data is sent to the Laboratory for Atmospheric and Space Physics in Boulder, Colorado.
At the JSOC, efforts are made to get the AIA and HMI data ready for release to the public, a requirement for all NASA projects. The two data streams are far from identical, however, and they each have their own requirements.
"The AIA data is available in a few minutes," said Phil Scherrer, a Stanford-based physicist and a member of the HMI collaboration. "The HMI data needs more processing and lags real time by about 20 minutes."
That time lag persists in spite of a 512-core Intel Xeon cluster managed by the Sun Grid Engine batch-queuing system.
Preprocessing of the AIA data is also completed on computing clusters.
Using a 16-node cluster, the EVE collaboration calibrates their data and removes non-science content. Their workflow separates the raw binary data by channel and uses it to assemble images.
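EVE's actual telemetry format is not described in this article, but the channel-separation step can be pictured with a deliberately simplified sketch. The record layout below (a one-byte channel ID followed by a four-byte big-endian sample) is invented for illustration and does not reflect the real EVE data format:

```python
import struct
from collections import defaultdict

def demux_by_channel(raw: bytes) -> dict:
    """Split a toy telemetry stream into per-channel sample lists.

    Assumes each record is 5 bytes: a 1-byte channel ID followed by a
    4-byte big-endian unsigned sample. Real EVE telemetry differs.
    """
    channels = defaultdict(list)
    for offset in range(0, len(raw) - len(raw) % 5, 5):
        chan, sample = struct.unpack_from(">BI", raw, offset)
        channels[chan].append(sample)
    return dict(channels)

# Two records on channel 1, one on channel 2.
stream = (struct.pack(">BI", 1, 100)
          + struct.pack(">BI", 2, 7)
          + struct.pack(">BI", 1, 101))
print(demux_by_channel(stream))  # {1: [100, 101], 2: [7]}
```

Once the stream is split this way, each channel's samples can be processed (and, in AIA's case, assembled into images) independently, which is what makes the work easy to spread across a cluster.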
"This processing and associated data management is about four times slower than real-time," Woodraska explained. "So, we depend on distributed computing to keep up with the real-time data."
Even then, the EVE team must wait for NASA's Mission Operations Center to release the latest Flight Dynamics System products needed to generate their highest-level data products.
"This uses another set of 12 nodes to produce all of the highest-quality data products, which allows for fault tolerance, and data production in a reasonable amount of time," Woodraska said. "The remaining 20 nodes are used for maintaining databases and reprocessing."
By the end of all that processing, EVE's data has been cut down from 72 gigabytes per day to about 1.2 gigabytes per day, and it's ready for public release.
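That amounts to a reduction factor of about 60:

```python
RAW_GB_PER_DAY = 72.0      # EVE's share of the raw downlink
RELEASED_GB_PER_DAY = 1.2  # final public data products

factor = RAW_GB_PER_DAY / RELEASED_GB_PER_DAY
print(f"{factor:.0f}x reduction")  # 60x reduction
```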
To the public
So far, however, we've only discussed processing the data and preparing it for release to the public. That processing is necessary because the instruments on-board SDO are so cutting-edge that only the experiment teams have the expertise to generate accurate science products.
"The scientific analysis is almost exclusively done on workstations and laptops by scientists at institutions all over the world," said Petrus Martens, a physicist at Montana State University and a member of the AIA collaboration.
Each experiment offers software and support to help those scientists make use of the SDO data. Martens, for example, is part of the team that maintains the Virtual Solar Observatory, and leads the team behind an automated feature-finding project that consists of 12 separate modules, each of which recognizes different phenomena.
Some of that software is also used to monitor space weather.
"There is this setup we call the 'pipeline,' where all data that are received at the JSOC straight from the spacecraft (only a few minutes delay) are analyzed right away. That is important for several applications," Martens said. "For example, we have three modules that recognize space weather phenomena (flares, coronal dimmings, and flux emergence). The output of these three modules (metadata) is directly sent to the US National Oceanic and Atmospheric Administration, which is tasked with providing space-weather alerts for the US government."