Computers are learning to find solar flares in real time

Machine-learning technique searches satellite data for features significant for space weather.

Computers are learning to find solar flares and other events in vast streams of solar images, helping NOAA forecasters issue timely alerts, according to a new study.

The machine-learning technique, developed by scientists at CIRES and NOAA’s National Centers for Environmental Information (NCEI), searches massive amounts of satellite data to pick out features significant for space weather.

Changing conditions on the Sun and in space can affect various technologies on Earth, blocking radio communications, damaging power grids, and diminishing navigation system accuracy.

Being able to process solar data in real time is important because flares erupting on the Sun can affect Earth within minutes. These techniques provide a rapid, continuously updated overview of solar features and can point forecasters to areas requiring more scrutiny.

To predict incoming space weather, forecasters summarize current conditions on the Sun twice daily. Today, they use hand-drawn maps labeled with various solar features, including active regions, filaments, and coronal hole boundaries. But solar imagers produce a new set of observations every few minutes.

For example, the Solar Ultraviolet Imager (SUVI) on NOAA’s GOES-R Series satellites runs on a 4-minute cycle, collecting data in six different wavelengths every cycle.
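
To get a feel for that data volume, here is a minimal sketch in Python of one SUVI observation cycle represented as a multi-channel image stack. The wavelength list and image size below are illustrative assumptions, not figures from the study.

```python
import numpy as np

# Illustrative sketch only: the channel list and image size are assumptions,
# not values taken from the study.
SUVI_CHANNELS_ANGSTROM = [94, 131, 171, 195, 284, 304]   # approximate EUV channels
IMAGE_SHAPE = (1280, 1280)                               # assumed detector resolution

# One 4-minute cycle -> one image per wavelength channel.
cycle = np.zeros((len(SUVI_CHANNELS_ANGSTROM), *IMAGE_SHAPE), dtype=np.float32)

print(cycle.shape)                       # (6, 1280, 1280)
print(round(cycle.nbytes / 1e6), "MB")   # rough sense of the data volume per cycle
```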

Keeping up with all that data could take up a lot of a forecaster’s time. “We need tools to process solar data into digestible chunks,” said J. Marcus Hughes, the CIRES scientist who developed the algorithm.

Hughes created a computer algorithm that can look at all the SUVI images and spot patterns in the data. With his colleagues, he built a database of expert-labeled maps of the Sun and used those images to teach the computer to identify solar features important for forecasting.
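
A hedged sketch of what that training step could look like, using scikit-learn’s DecisionTreeClassifier as a stand-in for the team’s own code; the feature values, labels, and class names below are invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Hypothetical training set: one row per pixel, one brightness value per
# SUVI wavelength channel; labels come from the expert-drawn maps
# (e.g. 0 = quiet Sun, 1 = active region, 2 = filament, 3 = coronal hole).
rng = np.random.default_rng(0)
pixel_features = rng.random((100_000, 6))         # stand-in for real SUVI pixel values
pixel_labels = rng.integers(0, 4, size=100_000)   # stand-in for expert labels

# Fit a decision tree: a set of simple brightness rules learned from the labels.
classifier = DecisionTreeClassifier(max_depth=8, random_state=0)
classifier.fit(pixel_features, pixel_labels)
```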

The algorithm identifies solar features using a decision-tree approach that follows a set of simple rules to distinguish between different traits. It examines an image one pixel at a time and decides, for example, whether that pixel is brighter or dimmer than a certain threshold before sending it down a branch of the tree.
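
As a toy illustration of that idea, a hand-written tree might look like the sketch below; the thresholds, channels, and labels are made up, and the real model learns its rules from the labeled maps rather than having them written by hand.

```python
def classify_pixel(brightness_171, brightness_304):
    """Toy decision tree: follow simple brighter-or-dimmer rules down
    branches until a label is reached. Thresholds are invented."""
    if brightness_171 > 0.8:          # unusually bright in the 171 Å channel?
        return "active region"
    if brightness_171 < 0.2:          # very dim in 171 Å...
        if brightness_304 < 0.3:      # ...and dim in 304 Å as well
            return "coronal hole"
        return "filament"             # dim in 171 Å but not in 304 Å
    return "quiet Sun"                # everything in between
```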

Once the system is trained, it can classify millions of pixels in seconds, supporting both routine forecasts and urgent alerts or warnings.
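
A rough sketch of that inference step, again with scikit-learn as a stand-in and randomly generated data in place of real SUVI imagery:

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Stand-ins for the trained tree and one six-channel observation cycle.
rng = np.random.default_rng(1)
classifier = DecisionTreeClassifier(max_depth=8, random_state=0).fit(
    rng.random((10_000, 6)), rng.integers(0, 4, size=10_000))
cycle = rng.random((6, 1280, 1280)).astype(np.float32)

# Flatten to one row per pixel, classify ~1.6 million pixels in a single
# vectorized call, then reshape the labels back into a full-disk map.
pixels = cycle.reshape(6, -1).T                            # (1_638_400, 6)
label_map = classifier.predict(pixels).reshape(1280, 1280)
```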

Because the algorithm keeps learning, it can help forecasters understand what’s happening on the Sun in far more detail than they currently can.

The technique also sees patterns humans can’t: it can sometimes find features the researchers had difficulty identifying themselves.

The algorithm’s skill at finding patterns is useful not only for short-term forecasting, but also for helping scientists examine long-term solar data and improve models of the Sun.

NCEI and NOAA’s Space Weather Prediction Center (SWPC) are still testing the tool for tracking changing solar conditions so forecasters can issue more accurate watches, warnings, and alerts. The tool could become operational as early as the end of 2019.
