How do I implement a Transformer to forecast time series?

Problem description · Votes: 0 · Answers: 1

I need to implement a Transformer to forecast multivariate time series. I found this page from TensorFlow that explains what a Transformer consists of, but it is not clear to me how it should actually be implemented. I tried implementing it the same way as an LSTM (inspired by YouTube tutorials), but it doesn't work that way, and I couldn't find any inspiration for Transformers on YouTube either.

Here is my code:

Initial steps:

import numpy as np
import pandas as pd
from sklearn.preprocessing import StandardScaler

df = pd.read_csv('myfile.csv')

train_dates = pd.to_datetime(df['Date'])
cols = ['A', 'B', 'C']
df_for_training = df[cols].astype(float)

# Standardize the features before windowing
scaler = StandardScaler()
scaler.fit(df_for_training)
df_for_training_scaled = scaler.transform(df_for_training)

# Sliding windows: n_past time steps in, target n_future steps ahead
trainX = []
trainY = []
n_future = 1
n_past = 14
for i in range(n_past, len(df_for_training_scaled) - n_future + 1):
    trainX.append(df_for_training_scaled[i - n_past:i, 0:df_for_training.shape[1]])
    trainY.append(df_for_training_scaled[i + n_future - 1:i + n_future, 0])
trainX, trainY = np.array(trainX), np.array(trainY)

The important part:

import tensorflow as tf
import tensorflow_addons as tfa
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Dropout
from tensorflow_addons.layers import MultiHeadAttention, LayerNormalization

model = Sequential()
model.add(MultiHeadAttention(64, num_heads=4))
model.add(LayerNormalization())
model.add(MultiHeadAttention(32, num_heads=4))
model.add(LayerNormalization())
model.add(Dropout(0.2))
model.add(Dense(trainY.shape[1]))

model.compile(optimizer='adam', loss='mse')
model.summary()

history = model.fit(trainX, trainY, epochs=2, batch_size=16, validation_split=0.1, verbose=1)

Can you help me?

python tensorflow deep-learning huggingface-transformers transformer-model
1 Answer
0 votes

I'm not sure how much it will help you, but I suggest you follow this link: https://colab.research.google.com/github/charlesollion/dlexperiments/blob/master/7-Transformers-Timeseries/Transformers_for_timeseries.ipynb. It is very useful for a basic understanding of how to implement Transformers for time-series forecasting. My own project involves implementing a Transformer on multivariate time-series data for an anomaly-detection problem. If you find anything useful, please share it with me.
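As a rough starting point (this is a minimal sketch, not the notebook's code): the main problem in the question's snippet is that an attention layer takes explicit query/value inputs, so it cannot be stacked in a `Sequential` model the way LSTM layers can. Keras' built-in `tf.keras.layers.MultiHeadAttention` with the functional API handles this; one self-attention block with residual connections, layer norm, and a feed-forward sublayer might look like the following (the hyperparameters `d_model`, `ff_dim`, etc. are arbitrary choices, not values from the question):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers

def build_model(n_past, n_features, n_outputs=1,
                d_model=64, num_heads=4, ff_dim=128, dropout=0.2):
    inputs = layers.Input(shape=(n_past, n_features))
    # Project the raw features into the model dimension.
    x = layers.Dense(d_model)(inputs)
    # Self-attention: query and value are both x (hence functional API).
    attn = layers.MultiHeadAttention(num_heads=num_heads, key_dim=d_model)(x, x)
    x = layers.LayerNormalization()(x + attn)   # residual + layer norm
    # Position-wise feed-forward sublayer, also with a residual connection.
    ff = layers.Dense(ff_dim, activation='relu')(x)
    ff = layers.Dense(d_model)(ff)
    x = layers.LayerNormalization()(x + ff)
    x = layers.Dropout(dropout)(x)
    # Pool over the time axis, then predict the next value(s).
    x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(n_outputs)(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer='adam', loss='mse')
    return model

# Smoke test on random data shaped like trainX / trainY in the question.
model = build_model(n_past=14, n_features=3)
X = np.random.rand(32, 14, 3).astype('float32')
y = np.random.rand(32, 1).astype('float32')
model.fit(X, y, epochs=1, batch_size=16, verbose=0)
print(model.predict(X, verbose=0).shape)  # (32, 1)
```

Note this sketch omits positional encoding, which a full Transformer would add to the inputs so the model can distinguish time steps; the linked notebook covers that.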

© www.soinside.com 2019 - 2024. All rights reserved.