I am trying to learn the MNIST dataset with a simple dense layer in Keras. I want my images to be 16*16 instead of 28*28. I have tried many approaches, but none of them work. Here is the simple dense network:

import keras
import numpy as np
import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical

train_images = mnist.train_images()
train_labels = mnist.train_labels()
test_images = mnist.test_images()
test_labels = mnist.test_labels()

# Normalize the images.
train_images = (train_images / 255) - 0.5
test_images = (test_images / 255) - 0.5

print(train_images.shape)
print(test_images.shape)

# Flatten the images.
train_images = train_images.reshape((-1, 784))
test_images = test_images.reshape((-1, 784))

print(train_images.shape)
print(test_images.shape)

# Build the model.
model = Sequential([
    Dense(10, activation='softmax', input_shape=(784,)),
])

# Compile the model.
model.compile(
    optimizer='adam',
    loss='categorical_crossentropy',
    metrics=['accuracy'],
)

# Train the model.
model.fit(
    train_images,
    to_categorical(train_labels),
    epochs=5,
    batch_size=32,
)

# Evaluate the model.
model.evaluate(
    test_images,
    to_categorical(test_labels),
)

# Save the model to disk.
model.save_weights('model.h5')
1 Answer
慕田峪4524236
Try this method to resize all the images at once -
#!pip install --upgrade tensorflow
#Assuming you are using tensorflow 2
import numpy as np
import tensorflow as tf
#creating dummy images
imgs = np.stack([np.eye(28), np.eye(28)])
print(imgs.shape)
#Output - (2,28,28) 2 images of 28*28
imgt = imgs.transpose(1,2,0) #Bring the batch channel to the end (28,28,2)
imgs_resize = tf.image.resize(imgt, (16,16)).numpy() #apply resize (16,16,2)
imgs2 = imgs_resize.transpose(2,0,1) #bring the batch channel back to front (2,16,16)
print(imgs2.shape)
#Output - (2,16,16)
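
If the goal is to actually train the dense network on the resized images, a minimal sketch of how the resize could be wired into the original script might look like the following. This is only an illustration, not part of the original answer: it assumes the same mnist helper package and TensorFlow 2, and it adds a channel axis so tf.image.resize receives a standard (batch, height, width, channels) tensor instead of transposing the batch to the channel position.

import numpy as np
import tensorflow as tf
import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense
from tensorflow.keras.utils import to_categorical

# Load and normalize as in the question.
train_images = (mnist.train_images() / 255) - 0.5
test_images = (mnist.test_images() / 255) - 0.5
train_labels = mnist.train_labels()
test_labels = mnist.test_labels()

# Add a channel axis -> (batch, 28, 28, 1), resize to 16*16, convert back to NumPy.
train_small = tf.image.resize(train_images[..., np.newaxis], (16, 16)).numpy()
test_small = tf.image.resize(test_images[..., np.newaxis], (16, 16)).numpy()

# Flatten: each image now has 16*16 = 256 pixels instead of 784.
train_small = train_small.reshape((-1, 256))
test_small = test_small.reshape((-1, 256))

# Same model as before, only input_shape changes from (784,) to (256,).
model = Sequential([
    Dense(10, activation='softmax', input_shape=(256,)),
])
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.fit(train_small, to_categorical(train_labels), epochs=5, batch_size=32)
model.evaluate(test_small, to_categorical(test_labels))

After the resize, the only other change needed is the flatten target and the Dense layer's input_shape, which drop from 784 to 256.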