I am currently trying to train a Keras model with the following model.fit call:
history = model.fit(imgs, ground_truths, batch_size=16, epochs=30, shuffle=True,
                    validation_split=0.2,
                    callbacks=[model_checkpoint])
Both imgs and ground_truths have the shape (2080, 256, 256, 3), which is the correct input shape for the model.
However, for some reason I still get the following error, even though I am passing two arguments:
ValueError: The model expects 2 input arrays, but only received one array. Found: array with shape (2080, 256, 256, 3)
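For reference, here is a minimal, hypothetical sketch (not my actual model) of how Keras counts input arrays: a functional model built from two Input layers only accepts a list of two arrays in fit, and passing a single array raises this same kind of error.
import numpy as np
from keras.layers import Input, Conv2D, Add
from keras.models import Model

# Hypothetical two-input model, only to illustrate the error message above
in_a = Input(shape=(256, 256, 3), name='in_a')
in_b = Input(shape=(256, 256, 3), name='in_b')
merged = Add()([in_a, in_b])
out = Conv2D(1, (1, 1), activation='sigmoid')(merged)
model = Model(inputs=[in_a, in_b], outputs=out)
model.compile(optimizer='adam', loss='binary_crossentropy')

x = np.zeros((4, 256, 256, 3), dtype='float32')
y = np.zeros((4, 256, 256, 1), dtype='float32')
# model.fit(x, y)               # raises a ValueError: the model expects 2 input arrays
model.fit([x, x], y, epochs=1)  # one array per Input layer works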
This is how I preprocess the images:
def preprocess(imgs):
    imgs_p = np.ndarray((imgs.shape[0], img_rows, img_cols, 3), dtype=np.uint8)
    for i in range(imgs.shape[0]):
        arr = imgs[i]
        arr = arr.astype('float')
        arr /= 255.
        imgs_p[i] = resize(arr, (256, 256), preserve_range=True)
    return imgs_p
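The snippet above relies on a few names that are not shown here; roughly the following context (my assumptions, not the original code) is needed for it to run:
import numpy as np
from skimage.transform import resize  # assuming resize comes from scikit-image

img_rows, img_cols = 256, 256  # assumed to match the (256, 256) resize target

# usage sketch:
# imgs_p = preprocess(raw_imgs)  # raw_imgs: array of shape (N, H, W, 3)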
After preprocessing, the images are saved to numpy files:
np.save('imgs_train_preprocess.npy', imgs)
np.save('imgs_gt_train_preprocess.npy', ground_truths)
Before training, I load the numpy files like this:
imgs = np.load('imgs_cup_train_preprocess.npy')
ground_truths = np.load('imgs_orig_train_preprocess.npy')
This is my model.summary():
____________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
====================================================================================================
conv1_1 (InputLayer) (None, 256, 256, 3) 0
____________________________________________________________________________________________________
relu1_1 (Activation) (None, 256, 256, 3) 0 conv1_1[0][0]
____________________________________________________________________________________________________
conv1_2_zeropadding (ZeroPadding (None, 258, 258, 3) 0 relu1_1[0][0]
____________________________________________________________________________________________________
conv1_2 (Conv2D) (None, 256, 256, 64) 1792 conv1_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu1_2 (Activation) (None, 256, 256, 64) 0 conv1_2[0][0]
____________________________________________________________________________________________________
pool1 (MaxPooling2D) (None, 128, 128, 64) 0 relu1_2[0][0]
____________________________________________________________________________________________________
conv2_1_zeropadding (ZeroPadding (None, 130, 130, 64) 0 pool1[0][0]
____________________________________________________________________________________________________
conv2_1 (Conv2D) (None, 128, 128, 128) 73856 conv2_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu2_1 (Activation) (None, 128, 128, 128) 0 conv2_1[0][0]
____________________________________________________________________________________________________
conv2_2_zeropadding (ZeroPadding (None, 130, 130, 128) 0 relu2_1[0][0]
____________________________________________________________________________________________________
conv2_2 (Conv2D) (None, 128, 128, 128) 147584 conv2_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu2_2 (Activation) (None, 128, 128, 128) 0 conv2_2[0][0]
____________________________________________________________________________________________________
pool2 (MaxPooling2D) (None, 64, 64, 128) 0 relu2_2[0][0]
____________________________________________________________________________________________________
conv3_1_zeropadding (ZeroPadding (None, 66, 66, 128) 0 pool2[0][0]
____________________________________________________________________________________________________
conv3_1 (Conv2D) (None, 64, 64, 256) 295168 conv3_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_1 (Activation) (None, 64, 64, 256) 0 conv3_1[0][0]
____________________________________________________________________________________________________
conv3_2_zeropadding (ZeroPadding (None, 66, 66, 256) 0 relu3_1[0][0]
____________________________________________________________________________________________________
conv3_2 (Conv2D) (None, 64, 64, 256) 590080 conv3_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_2 (Activation) (None, 64, 64, 256) 0 conv3_2[0][0]
____________________________________________________________________________________________________
conv3_3_zeropadding (ZeroPadding (None, 66, 66, 256) 0 relu3_2[0][0]
____________________________________________________________________________________________________
conv3_3 (Conv2D) (None, 64, 64, 256) 590080 conv3_3_zeropadding[0][0]
____________________________________________________________________________________________________
relu3_3 (Activation) (None, 64, 64, 256) 0 conv3_3[0][0]
____________________________________________________________________________________________________
pool3 (MaxPooling2D) (None, 32, 32, 256) 0 relu3_3[0][0]
____________________________________________________________________________________________________
conv4_1_zeropadding (ZeroPadding (None, 34, 34, 256) 0 pool3[0][0]
____________________________________________________________________________________________________
conv4_1 (Conv2D) (None, 32, 32, 512) 1180160 conv4_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_1 (Activation) (None, 32, 32, 512) 0 conv4_1[0][0]
____________________________________________________________________________________________________
conv4_2_zeropadding (ZeroPadding (None, 34, 34, 512) 0 relu4_1[0][0]
____________________________________________________________________________________________________
conv4_2 (Conv2D) (None, 32, 32, 512) 2359808 conv4_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_2 (Activation) (None, 32, 32, 512) 0 conv4_2[0][0]
____________________________________________________________________________________________________
conv4_3_zeropadding (ZeroPadding (None, 34, 34, 512) 0 relu4_2[0][0]
____________________________________________________________________________________________________
conv4_3 (Conv2D) (None, 32, 32, 512) 2359808 conv4_3_zeropadding[0][0]
____________________________________________________________________________________________________
relu4_3 (Activation) (None, 32, 32, 512) 0 conv4_3[0][0]
____________________________________________________________________________________________________
pool4 (MaxPooling2D) (None, 16, 16, 512) 0 relu4_3[0][0]
____________________________________________________________________________________________________
conv5_1_zeropadding (ZeroPadding (None, 18, 18, 512) 0 pool4[0][0]
____________________________________________________________________________________________________
conv5_1 (Conv2D) (None, 16, 16, 512) 2359808 conv5_1_zeropadding[0][0]
____________________________________________________________________________________________________
relu5_1 (Activation) (None, 16, 16, 512) 0 conv5_1[0][0]
____________________________________________________________________________________________________
conv5_2_zeropadding (ZeroPadding (None, 18, 18, 512) 0 relu5_1[0][0]
____________________________________________________________________________________________________
conv5_2 (Conv2D) (None, 16, 16, 512) 2359808 conv5_2_zeropadding[0][0]
____________________________________________________________________________________________________
relu5_2 (Activation) (None, 16, 16, 512) 0 conv5_2[0][0]
____________________________________________________________________________________________________
conv5_3_zeropadding (ZeroPadding (None, 18, 18, 512) 0 relu5_2[0][0]
____________________________________________________________________________________________________
conv5_3 (Conv2D) (None, 16, 16, 512) 2359808 conv5_3_zeropadding[0][0]
____________________________________________________________________________________________________
conv2_2_16_zeropadding (ZeroPadd (None, 130, 130, 128) 0 relu2_2[0][0]
____________________________________________________________________________________________________
relu5_3 (Activation) (None, 16, 16, 512) 0 conv5_3[0][0]
____________________________________________________________________________________________________
conv2_2_16 (Conv2D) (None, 128, 128, 16) 18448 conv2_2_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv3_3_16_zeropadding (ZeroPadd (None, 66, 66, 256) 0 relu3_3[0][0]
____________________________________________________________________________________________________
conv4_3_16_zeropadding (ZeroPadd (None, 34, 34, 512) 0 relu4_3[0][0]
____________________________________________________________________________________________________
conv5_3_16_zeropadding (ZeroPadd (None, 18, 18, 512) 0 relu5_3[0][0]
____________________________________________________________________________________________________
concat (InputLayer) (None, 256, 256, 3) 0
____________________________________________________________________________________________________
upsample2__zeropadding (ZeroPadd (None, 130, 130, 16) 0 conv2_2_16[0][0]
____________________________________________________________________________________________________
conv3_3_16 (Conv2D) (None, 64, 64, 16) 36880 conv3_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv4_3_16 (Conv2D) (None, 32, 32, 16) 73744 conv4_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
conv5_3_16 (Conv2D) (None, 16, 16, 16) 73744 conv5_3_16_zeropadding[0][0]
____________________________________________________________________________________________________
new-score-weighting (Conv2D) (None, 256, 256, 1) 4 concat[0][0]
____________________________________________________________________________________________________
upsample2_ (Conv2DTranspose) (None, 262, 262, 16) 4112 upsample2__zeropadding[0][0]
____________________________________________________________________________________________________
upsample4_ (Conv2DTranspose) (None, 260, 260, 16) 16400 conv3_3_16[0][0]
____________________________________________________________________________________________________
upsample8_ (Conv2DTranspose) (None, 264, 264, 16) 65552 conv4_3_16[0][0]
____________________________________________________________________________________________________
upsample16_ (Conv2DTranspose) (None, 272, 272, 16) 262160 conv5_3_16[0][0]
____________________________________________________________________________________________________
sigmoid-fuse (Activation) (None, 256, 256, 1) 0 new-score-weighting[0][0]
====================================================================================================
Total params: 15,228,804
Trainable params: 15,228,804
Non-trainable params: 0
____________________________________________________________________________________________________
The model's JSON architecture is here: https://pastebin.com/TE0Nda1p
Does anyone know how I can fix this problem? Thanks!
I had the same problem when using outputs=[a, b]; changing it to outputs=[a] fixed it.
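To illustrate that comment with a hypothetical sketch (not the model from this question): the lengths of the inputs and outputs lists passed to Model determine how many arrays fit expects, so a two-output model needs two target arrays while a single-output model needs one.
from keras.layers import Input, Conv2D
from keras.models import Model

inp = Input(shape=(256, 256, 3))
a = Conv2D(1, (1, 1), activation='sigmoid', name='a')(inp)
b = Conv2D(1, (1, 1), activation='sigmoid', name='b')(inp)

two_out = Model(inputs=inp, outputs=[a, b])  # fit(x, [y_a, y_b]) - two targets
one_out = Model(inputs=inp, outputs=[a])     # fit(x, y_a) - single target
The same rule applies on the input side: the summary above lists two InputLayer entries (conv1_1 and concat), so fit would expect a list of two input arrays rather than a single array.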