I am trying to implement a custom model in TensorFlow by extending the tf.keras.Model class.
I need a way to add n stacked LSTM layers to the model.
For example, consider the following implementation:
import tensorflow as tf

class CustomizedLSTM(tf.keras.Model):
    def __init__(self, num_hidden_layers, vocab_size):
        super(CustomizedLSTM, self).__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, 300)
        self.first_lstm = tf.keras.layers.LSTM(256, activation="relu")
        self.first_dense = tf.keras.layers.Dense(64, activation="relu")
        self.classification_layer = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, inputs):
        x = self.embedding(inputs)
        x = self.first_lstm(x)
        x = self.first_dense(x)
        return self.classification_layer(x)
I would like to add the possibility of customizing the number of hidden LSTM layers. In other words, I want to create a model with num_hidden_layers stacked LSTMs.
Is this possible? Could you help me?
class CustomizedLSTM(tf.keras.Model):
    def __init__(self, num_hidden_layers, dim_per_hidden, vocab_size):
        super(CustomizedLSTM, self).__init__()
        self.embedding = tf.keras.layers.Embedding(vocab_size, 300)
        self.lstms = []
        # To stack multiple LSTMs, every LSTM except the last one must have
        # return_sequences=True, because its full output sequence is fed as
        # input to the next LSTM.
        for i in range(num_hidden_layers - 1):
            self.lstms.append(tf.keras.layers.LSTM(dim_per_hidden[i], activation="relu", return_sequences=True))
        # The last LSTM uses return_sequences=False; change this according to your needs.
        self.lstms.append(tf.keras.layers.LSTM(dim_per_hidden[-1], activation="relu", return_sequences=False))
        self.first_dense = tf.keras.layers.Dense(64, activation="relu")
        self.classification_layer = tf.keras.layers.Dense(1, activation="sigmoid")

    def call(self, inputs):
        x = self.embedding(inputs)
        for layer in self.lstms:
            x = layer(x)
        x = self.first_dense(x)
        return self.classification_layer(x)
I added another parameter you may want to consider, dim_per_hidden: it is a list of integers that sets the number of units in each LSTM layer.
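As a quick sanity check, here is a minimal sketch of how the class could be instantiated and called on dummy data. The vocabulary size, layer sizes, batch size, and sequence length below are made-up values for illustration only:

import numpy as np
import tensorflow as tf

# Hypothetical configuration: 3 stacked LSTMs with 256, 128 and 64 units.
model = CustomizedLSTM(num_hidden_layers=3, dim_per_hidden=[256, 128, 64], vocab_size=10000)

# Dummy batch of 8 integer-encoded sequences of length 20.
dummy_inputs = np.random.randint(0, 10000, size=(8, 20))
outputs = model(dummy_inputs)
print(outputs.shape)  # expected: (8, 1)

# Compile for binary classification, matching the sigmoid output layer.
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.summary()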