Authors: Jingjing Xu, Caesar Wu, Yuan-Fang Li, Grégoire Danoy, Pascal Bouvry
Abstract: Transformer-based models for time series forecasting (TSF) have attracted
significant attention in recent years due to their effectiveness and
versatility. However, these models often require extensive hyperparameter
optimization (HPO) to achieve the best possible performance, and a unified
pipeline for HPO in transformer-based TSF remains lacking. In this paper, we
present such a pipeline and conduct extensive experiments on several
state-of-the-art (SOTA) transformer-based TSF models. The experiments, run on
standard benchmark datasets, evaluate and compare the models' performance and
yield practical insights and examples.
Our pipeline is generalizable beyond transformer-based architectures and can be
applied to other SOTA models, such as Mamba and TimeMixer, as demonstrated in
our experiments. The goal of this work is to provide valuable guidance to both
industry practitioners and academic researchers in efficiently identifying
optimal hyperparameters suited to their specific domain applications. The code
and complete experimental results are available on GitHub.
Source: http://arxiv.org/abs/2501.01394v1
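To make the setting concrete, here is a minimal, hypothetical sketch of what an HPO loop for TSF can look like. It does not reproduce the paper's pipeline: it uses Optuna as a stand-in optimizer, a tiny linear autoregressive forecaster on synthetic data in place of a transformer, and an illustrative two-parameter search space (lookback window and learning rate). All names and choices below are assumptions, not the authors' method.

```python
# Hypothetical HPO sketch for time series forecasting (not the paper's pipeline).
# A linear autoregressive model stands in for a transformer; only the overall
# search structure (objective -> study -> best params) is illustrated.
import numpy as np
import optuna

rng = np.random.default_rng(0)
# Synthetic univariate series: noisy sine wave.
series = np.sin(np.arange(1000) * 0.1) + 0.1 * rng.standard_normal(1000)

def make_windows(x, lookback):
    # Slice the series into (input window, next value) pairs.
    X = np.stack([x[i : i + lookback] for i in range(len(x) - lookback)])
    y = x[lookback:]
    return X, y

def objective(trial):
    # Hyperparameters to tune; stand-ins for a transformer's d_model,
    # number of layers, learning rate, etc.
    lookback = trial.suggest_int("lookback", 8, 64)
    lr = trial.suggest_float("lr", 1e-4, 1e-1, log=True)

    X, y = make_windows(series, lookback)
    split = int(0.8 * len(X))
    X_tr, y_tr, X_va, y_va = X[:split], y[:split], X[split:], y[split:]

    # Train a linear forecaster by gradient descent on the MSE loss.
    w = np.zeros(lookback)
    for _ in range(200):
        grad = 2.0 * X_tr.T @ (X_tr @ w - y_tr) / len(X_tr)
        w -= lr * grad

    # Validation MSE is the quantity the study minimizes.
    mse = float(np.mean((X_va @ w - y_va) ** 2))
    # Guard against divergent runs (a large lr can blow up training).
    return mse if np.isfinite(mse) else 1e6

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=30)
print("best hyperparameters:", study.best_params)
```

In a realistic setup, the objective would instead train one of the SOTA TSF models on a benchmark dataset and return its validation loss, with the search space extended to architectural and training hyperparameters.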