Python Keras: How to use predict_generator with ImageDataGenerator?
Notice: This page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. You are free to use and share it, but you must follow the same license, cite the original URL, and attribute it to the original authors (not me): StackOverflow.
Original question: http://stackoverflow.com/questions/45806669/
Keras: How to use predict_generator with ImageDataGenerator?
Asked by Mario Kreutzfeldt
I'm very new to Keras. I trained a model and would like to predict some images stored in subfolders (as for training). For testing, I want to predict 2 images from each of 7 classes (subfolders). The test_generator below sees 14 images, but I get 196 predictions. Where is the mistake? Thanks a lot!
test_datagen = ImageDataGenerator(rescale=1./255)

test_generator = test_datagen.flow_from_directory(
        test_dir,
        target_size=(200, 200),
        color_mode="rgb",
        shuffle="false",
        class_mode='categorical')

filenames = test_generator.filenames
nb_samples = len(filenames)

predict = model.predict_generator(test_generator, nb_samples)
Answered by Matin
You can change the value of batch_size in flow_from_directory from its default (batch_size=32) to batch_size=1. Then set the steps of predict_generator to the total number of your test images. Something like this:
test_datagen = ImageDataGenerator(rescale=1./255)

test_generator = test_datagen.flow_from_directory(
        test_dir,
        target_size=(200, 200),
        color_mode="rgb",
        shuffle=False,
        class_mode='categorical',
        batch_size=1)

filenames = test_generator.filenames
nb_samples = len(filenames)

predict = model.predict_generator(test_generator, steps=nb_samples)
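Once each prediction corresponds to exactly one file (and shuffle=False keeps the order aligned with filenames), the predicted class indices can be mapped back to the folder names. A minimal sketch, assuming the model, predict, and test_generator from the answer above (the inverted labels dict is my own illustration, not part of the original answer):

import numpy as np

# One row of probabilities per image; take the index of the largest one.
predicted_class_indices = np.argmax(predict, axis=1)

# class_indices maps folder names to label indices; invert it to look up names.
labels = {index: name for name, index in test_generator.class_indices.items()}
predicted_labels = [labels[i] for i in predicted_class_indices]

for filename, label in zip(filenames, predicted_labels):
    print(filename, label)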
Answered by Ioannis Nasios
The default batch_size in the generator is 32. If you want one prediction for each of your nb_samples test images, you should divide nb_samples by the batch_size. So with a batch_size of 7 you only need 14/7 = 2 steps for your 14 images:
import numpy as np

desired_batch_size = 7

test_datagen = ImageDataGenerator(rescale=1./255)

test_generator = test_datagen.flow_from_directory(
        test_dir,
        target_size=(200, 200),
        color_mode="rgb",
        shuffle=False,
        class_mode='categorical',
        batch_size=desired_batch_size)

filenames = test_generator.filenames
nb_samples = len(filenames)

predict = model.predict_generator(test_generator,
                                  steps=np.ceil(nb_samples / desired_batch_size))
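Depending on the Keras version, steps may be expected to be a whole number, so casting the result of np.ceil is a safe guard; the float() cast also keeps the division from truncating on Python 2 (both are my own additions, not part of the original answer):

predict = model.predict_generator(
    test_generator,
    steps=int(np.ceil(nb_samples / float(desired_batch_size))))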
Answered by DJK
The problem is the inclusion of nb_samples in the predict_generator call, which creates 14 batches of 14 images each:
14*14 = 196
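In other words, predict_generator returns roughly steps * (images per batch) predictions, so the batch size and step count have to be chosen together. A quick sanity check of the numbers, sketched under the assumption that test_generator is the one from the question (next() and reset() are standard methods on Keras directory iterators):

# With only 14 files and the default batch_size=32, every batch the generator
# yields contains all 14 images, so steps=14 gives 14 * 14 = 196 predictions.
images, labels = next(test_generator)
print(images.shape)                   # (14, 200, 200, 3)
print(len(test_generator.filenames))  # 14

# Rewind the generator so a later predict_generator call starts at the first file.
test_generator.reset()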