
Keras: getting the output of a layer at each epoch

慕无忌1623718 2021-07-23 18:06:26
What did I do? I implemented a Keras model as follows:

train_X, test_X, train_Y, test_Y = train_test_split(X, Y, test_size=0.2, random_state=np.random.seed(7), shuffle=True)
train_X = np.reshape(train_X, (train_X.shape[0], 1, train_X.shape[1]))
test_X = np.reshape(test_X, (test_X.shape[0], 1, test_X.shape[1]))

model = Sequential()
model.add(LSTM(100, return_sequences=False, input_shape=(train_X.shape[1], train_X.shape[2])))
model.add(Dense(train_Y.shape[1], activation='softmax'))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])
model.fit(train_X, train_Y, validation_split=.20, epochs=1000, batch_size=50)

What do I want? I want to feed the output of the penultimate layer (the LSTM) to a support vector machine (SVM), and have the SVM trained at every epoch (i.e., all 1000 of them). But I don't know how to do this. Any ideas?

Update: I used ModelCheckpoint as follows:

model = Sequential()
model.add(LSTM(100, return_sequences=False, input_shape=(train_X.shape[1], train_X.shape[2])))
model.add(Dense(train_Y.shape[1], activation='softmax'))
model.compile(loss='mean_squared_error', optimizer='adam', metrics=['accuracy'])

# checkpoint
filepath = "weights-{epoch:02d}-{val_acc:.2f}.hdf5"
checkpoint = ModelCheckpoint(filepath, monitor='val_acc', verbose=1, save_best_only=True, mode='max')
callbacks_list = [checkpoint]
model.fit(train_X, train_Y, validation_split=.20, epochs=1000, batch_size=50, callbacks=callbacks_list, verbose=0)

Output:

Epoch 00991: val_acc did not improve
Epoch 00992: val_acc improved from 0.93465 to 0.93900, saving model to weights-992-0.94.hdf5
Epoch 00993: val_acc did not improve
Epoch 00994: val_acc did not improve
Epoch 00995: val_acc did not improve
Epoch 00996: val_acc did not improve
Epoch 00997: val_acc did not improve
Epoch 00998: val_acc improved from 0.93900 to 0.94543, saving model to weights-998-0.94.hdf5
Epoch 00999: val_acc did not improve

Question: As @IonicSolutions said, how do I load all these models to get the output of the LSTM layer at each epoch?

2 Answers

慕哥9229398


Which approach works best in your case depends on how exactly you set up and train your SVM, but there are at least two options using callbacks:


You can use the ModelCheckpoint callback to save a copy of your model at every epoch, then load those models afterwards to obtain the LSTM layer's output.


You can also create your own callback by subclassing the Callback base class. Inside a callback you have access to the model, so you can use on_epoch_end to extract the LSTM output at the end of every epoch.
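The first option can be sketched roughly as follows (a sketch only, assuming full models were saved by the ModelCheckpoint shown in the question; the file pattern and the variable lstm_features_per_epoch are illustrative):

```python
import glob
from keras.models import Model, load_model

# Load each checkpoint written by ModelCheckpoint and build a feature
# extractor that stops at the LSTM layer (layer index 0 in the
# question's Sequential model).
lstm_features_per_epoch = {}
for path in sorted(glob.glob("weights-*.hdf5")):
    full = load_model(path)
    extractor = Model(inputs=full.input, outputs=full.layers[0].output)
    # Shape: (n_samples, 100) -- the LSTM's final hidden state
    lstm_features_per_epoch[path] = extractor.predict(train_X)
```

Note that with save_best_only=True only the improving epochs are saved; set save_best_only=False if you really want a checkpoint for every epoch.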


Edit: For convenient access to the penultimate layer, you can do the following:


from keras.layers import Input, Dense, LSTM
from keras.models import Model

# Create the model with the functional API
inp = Input((train_X.shape[1], train_X.shape[2],))
lstm = LSTM(100, return_sequences=False)(inp)
dense = Dense(train_Y.shape[1], activation='softmax')(lstm)

# Create the full model
model = Model(inputs=inp, outputs=dense)

# Create the model for access to the LSTM layer
access = Model(inputs=inp, outputs=lstm)

You can then pass access to your callback when you instantiate it. The crucial thing to note here is that model and access share the same LSTM layer, so its weights change as model is trained.
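Putting the pieces together, a minimal sketch of such a callback, which refits an SVM on the LSTM features at the end of every epoch (using scikit-learn's SVC here as an assumption; the class name SVMOnLSTM is illustrative):

```python
import numpy as np
from keras.callbacks import Callback
from sklearn.svm import SVC

class SVMOnLSTM(Callback):
    """Refit an SVM on the LSTM features at the end of every epoch."""
    def __init__(self, access_model, x, labels):
        super().__init__()
        self.access = access_model   # the `access` model defined above
        self.x = x
        self.labels = labels         # integer class labels, not one-hot
        self.svms = []               # one fitted SVM per epoch

    def on_epoch_end(self, epoch, logs=None):
        feats = self.access.predict(self.x)          # (n_samples, 100)
        self.svms.append(SVC().fit(feats, self.labels))

# Usage: SVC expects integer labels, so collapse the one-hot train_Y first.
# cb = SVMOnLSTM(access, train_X, np.argmax(train_Y, axis=1))
# model.fit(train_X, train_Y, epochs=1000, batch_size=50, callbacks=[cb])
```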


Answered 2021-07-28