Reshape data for an LSTM, and feed the output of a Dense layer into the LSTM

For the first question, I am doing the same thing and did not get any error; please share your error.

Note: I will give you examples using the functional API, which (in my opinion) gives a bit more freedom.

```python
from keras.layers import Dense, Flatten, LSTM, Activation
from keras.layers import Dropout, RepeatVector, TimeDistributed
from keras import Input, Model

seq_length = 15
input_dims = 10
output_dims = 8
n_hidden = 10

model1_inputs = Input(shape=(seq_length, input_dims))
net1 = LSTM(n_hidden, return_sequences=True)(model1_inputs)
net1 = LSTM(n_hidden, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1
model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()
```

```
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_11 (InputLayer)        (None, 15, 10)            0
_________________________________________________________________
lstm_8 (LSTM)                (None, 15, 10)            840
_________________________________________________________________
lstm_9 (LSTM)                (None, 10)                840
_________________________________________________________________
dense_9 (Dense)              (None, 8)                 88
_________________________________________________________________
```
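The question title also asks about reshaping the data itself. As a minimal NumPy sketch (my own illustration; the sample count and non-overlapping windowing are hypothetical, not from the answer), flat rows can be grouped into the `(batch, seq_length, input_dims)` tensor the model above expects:

```python
import numpy as np

# 120 flat samples with 10 features each; the LSTM above expects
# input of shape (batch, seq_length, input_dims) = (?, 15, 10).
seq_length = 15
input_dims = 10
flat = np.arange(120 * input_dims, dtype="float32").reshape(120, input_dims)

# Group consecutive rows into non-overlapping windows of seq_length steps.
n_seqs = flat.shape[0] // seq_length
x = flat[: n_seqs * seq_length].reshape(n_seqs, seq_length, input_dims)
print(x.shape)  # (8, 15, 10)
```

An array shaped this way can be passed directly to `model1.fit`.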

For the second question, there are two approaches:

  1. If the data you are sending is not sequential, i.e. of dims
     (batch, input_dims)
     , you can use RepeatVector, which repeats the same vector
     n_steps
     times; these serve as the rolling steps of the LSTM.

```python
seq_length = 15
input_dims = 16
output_dims = 8
n_hidden = 20
lstm_dims = 10

model1_inputs = Input(shape=(input_dims,))
net1 = Dense(n_hidden)(model1_inputs)
net1 = Dense(n_hidden)(net1)
net1 = RepeatVector(3)(net1)
net1 = LSTM(lstm_dims, return_sequences=True)(net1)
net1 = LSTM(lstm_dims, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1
model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()
```

```
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_13 (InputLayer)        (None, 16)                0
_________________________________________________________________
dense_13 (Dense)             (None, 20)                340
_________________________________________________________________
dense_14 (Dense)             (None, 20)                420
_________________________________________________________________
repeat_vector_2 (RepeatVecto (None, 3, 20)             0
_________________________________________________________________
lstm_14 (LSTM)               (None, 3, 10)             1240
_________________________________________________________________
lstm_15 (LSTM)               (None, 10)                840
_________________________________________________________________
dense_15 (Dense)             (None, 8)                 88
=================================================================
```
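The RepeatVector(3) step in this model simply tiles the feature vector along a new time axis. A NumPy equivalent of that one step (my own sketch, with placeholder values):

```python
import numpy as np

v = np.array([[1.0, 2.0, 3.0]])                 # (batch=1, features=3), no time axis
repeated = np.repeat(v[:, None, :], 3, axis=1)  # what RepeatVector(3) produces
print(repeated.shape)  # (1, 3, 3)
print(repeated[0, 0].tolist() == repeated[0, 2].tolist())  # True: same vector at each step
```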
  2. If you are sending a sequence of dims
     (seq_len, input_dims)
     , you can use TimeDistributed, which applies a Dense layer with the same weights over the whole sequence.

```python
seq_length = 15
input_dims = 10
output_dims = 8
n_hidden = 10
lstm_dims = 6

model1_inputs = Input(shape=(seq_length, input_dims))
net1 = TimeDistributed(Dense(n_hidden))(model1_inputs)
net1 = LSTM(output_dims, return_sequences=True)(net1)
net1 = LSTM(output_dims, return_sequences=False)(net1)
net1 = Dense(output_dims, activation='relu')(net1)
model1_outputs = net1
model1 = Model(inputs=model1_inputs, outputs=model1_outputs, name='model1')

## Fit the model
model1.summary()
```

```
_________________________________________________________________
Layer (type)                 Output Shape              Param #
=================================================================
input_17 (InputLayer)        (None, 15, 10)            0
_________________________________________________________________
time_distributed_3 (TimeDist (None, 15, 10)            110
_________________________________________________________________
lstm_18 (LSTM)               (None, 15, 8)             608
_________________________________________________________________
lstm_19 (LSTM)               (None, 8)                 544
_________________________________________________________________
dense_19 (Dense)             (None, 8)                 72
=================================================================
```
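To see what TimeDistributed(Dense) is doing, the same weight matrix can be applied at every timestep with plain NumPy (the weights below are random placeholders, not taken from the model above):

```python
import numpy as np

rng = np.random.default_rng(0)
seq_length, input_dims, n_hidden = 15, 10, 10

x = rng.standard_normal((seq_length, input_dims))  # one input sequence
W = rng.standard_normal((input_dims, n_hidden))    # a single shared Dense kernel
b = np.zeros(n_hidden)                             # a single shared bias

# TimeDistributed(Dense(n_hidden)) applies the SAME W and b at each timestep:
out = x @ W + b

# Equivalent step-by-step form:
per_step = np.stack([x[t] @ W + b for t in range(seq_length)])
print(out.shape)                   # (15, 10)
print(np.allclose(out, per_step))  # True
```

This also explains the 110 parameters in the summary above: 10 × 10 weights plus 10 biases, regardless of sequence length.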

Note: while doing this I stacked two LSTM layers. The first has return_sequences=True, so it returns the output at every time step, and that output is consumed by the second layer, which returns the output only for the last time_step.



Original source: https://outofmemory.cn/zaji/5647357.html
