Forecasting of Global Ionosphere Maps With Multi‐Day Lead Time Using Transformer‐Based Neural Networks

Abstract Ionospheric total electron content (TEC) is a key indicator of the space environment. Geophysical forcing from above and below drives its spatial and temporal variations. A full understanding of physical and chemical principles, available and well‐representable driving inputs, and capable computational power are required for physical models to reproduce simulations that agree with observations, which may be challenging at times. Recently, data‐driven approaches, such as deep learning, have therefore surged as means for TEC prediction. Owing to the fact that the geophysical world possesses a sequential nature in time and space, Transformer architectures are proposed and evaluated for sequence‐to‐sequence TEC predictions in this study. We discuss the impacts of time lengths of choice during the training process and analyze what the neural network has learned regarding the data sets. Our results suggest that 12‐layer, 128‐hidden‐unit Transformer architectures sufficiently provide multi‐step global TEC predictions for 48 hr with an overall root‐mean‐square error (RMSE) of ∼1.8 TECU. The hourly variation of RMSE increases from 0.6 TECU to about 2.0 TECU during the prediction time frame.
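The headline score in the abstract is a root‐mean‐square error expressed in TEC units (TECU). As a minimal illustration of how such a score is computed over an hourly prediction window (the values below are hypothetical, not taken from the paper):

```python
import math

def rmse(predicted, observed):
    """Root-mean-square error between two equal-length series (here, in TECU)."""
    assert len(predicted) == len(observed)
    squared_error = sum((p - o) ** 2 for p, o in zip(predicted, observed))
    return math.sqrt(squared_error / len(predicted))

# Hypothetical hourly TEC values (TECU) over a short window.
pred = [12.0, 14.5, 18.2, 21.0]
obs = [11.2, 15.1, 19.9, 20.4]
print(round(rmse(pred, obs), 3))  # → 1.031
```

The paper's "hourly variation of RMSE" corresponds to evaluating this quantity separately at each forecast lead time (hour 1 through hour 48) across the test set, rather than pooling all hours into one score.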

Bibliographic Details
Main Authors: Chung‐Yu Shih, Cissi Ying‐tsen Lin, Shu‐Yu Lin, Cheng‐Hung Yeh, Yu‐Ming Huang, Feng‐Nan Hwang, Chia‐Hui Chang
Format: Article
Language: English
Published: Wiley, 2024-02-01
Series: Space Weather
Subjects: TEC prediction; neural network; Transformer
Online Access: https://doi.org/10.1029/2023SW003579
collection DOAJ
id doaj-art-595c6acd630040e3a1c9e2f1b57461a0
institution Kabale University
issn 1542-7390
Published in: Space Weather, volume 22, issue 2 (2024-02-01), https://doi.org/10.1029/2023SW003579
Author affiliations (all National Central University, Taoyuan, Taiwan):
Chung‐Yu Shih: Mathematics
Cissi Ying‐tsen Lin: Space Science and Engineering
Shu‐Yu Lin: Computer Science and Information Engineering
Cheng‐Hung Yeh: Computer Science and Information Engineering
Yu‐Ming Huang: Computer Science and Information Engineering
Feng‐Nan Hwang: Mathematics
Chia‐Hui Chang: Computer Science and Information Engineering