How to build a binary classifier with 3D training data

I have data that must be classified as 0 or 1. The data is loaded from an .npz file, which gives me training, validation, and test sets. This is what they look like:

x_train = [[[  0   0   0 ...   0   1   4]
  [  0   0   0 ...   4  25   2]
  [  6  33  15 ...  33   0   0]
  ...
  [  0  23   4 ...   9  31   0]
  [  4   0   0 ...   0   0  12]
  [  5   0   0 ...   3   0   0]]

 [[ 88  71  59 ...  61  62  62]
  [ 74  88  73 ...  59  70  60]
  [ 69  61  85 ...  60  58  82]
  ...
  [ 68  85  58 ...  55  75  72]
  [ 69  69  70 ...  81  76  83]
  [ 74  68  76 ...  60  74  72]]

 [[ 87 134 146 ... 108 116 157]
  [108 117 144 ... 102  58 122]
  [124 148 106 ...  97 135 146]
  ...
  [ 96 153 111 ... 104 129 154]
  [129 140 100 ...  74 114  97]
  [119 115 160 ... 172  84 148]]

 ...

 [[ 92  96  64 ...  69  83  83]
  [ 85  44  89 ... 115  94  76]
  [ 93 103  91 ...  92  81  75]
  ...
  [ 16 109  81 ...  84  95  20]
  [100  27  89 ...  66 107  48]
  [ 24  67 144 ... 104 115 123]]

 [[ 69  70  74 ...  72  73  75]
  [ 72  72  76 ...  73  75  76]
  [ 74  75  72 ...  72  69  73]
  ...
  [ 72  72  69 ...  72  76  72]
  [ 70  72  73 ...  72  76  67]
  [ 69  72  72 ...  72  71  71]]

 [[ 65 137  26 ... 134  57 174]
  [ 91  76 123 ...  39  63 124]
  [ 81 203 134 ... 192  63 143]
  ...
  [  1 102  96 ...  33  63 169]
  [ 82  32 108 ... 151  75 151]
  [ 12  97 164 ... 101 125  60]]]
y_train:
[0 0 0 ... 0 0 0]

These are my input shapes:

x_train.shape = (5000, 128, 128)
y_train.shape = (5000,)
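For reference, loading from an .npz archive typically looks like the sketch below. The key names (`x_train`, `y_train`) and the file name `data.npz` are assumptions here — an .npz file stores arrays under whatever keys they were saved with:

```python
import numpy as np

# Build a small stand-in archive (in place of the real .npz file)
np.savez('data.npz',
         x_train=np.zeros((5000, 128, 128), dtype=np.uint8),
         y_train=np.zeros(5000, dtype=np.int64))

data = np.load('data.npz')                       # open the archive
x_train, y_train = data['x_train'], data['y_train']
print(x_train.shape, y_train.shape)              # (5000, 128, 128) (5000,)
```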

As you can see, y is just the labels and x is just the 3D data. Since it is a binary classifier, I want to build a simple neural network with 3 dense layers. Here is mine:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense

model = Sequential()
model.add(Dense(12, input_dim=8, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.fit(x_train, y_train, epochs=150, batch_size=8)

But because of my input, I get this error:

ValueError: Input 0 of layer sequential_4 is incompatible with the layer: expected axis -1 of input shape to have value 8 but received input with shape (8, 128, 128)

How can I fix this? Is my neural network too simplistic for this type of problem?

There are multiple issues with your code. I have added separate sections to explain each of them. Please read through all of them and try the code samples shown below.

1. Passing the samples/batch dimension as the input dimension

You are passing the batch dimension as the input shape of the Dense layer. That is incorrect. What you need to pass instead is the shape of each individual sample that the model should expect, in this case (128, 128). The model automatically prepends a batch dimension, so batches flow through the computation graph as (None, 128, 128), as shown by model.summary() below.
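In other words, the first axis of x_train counts samples, and only the remaining axes describe a single sample — so input_shape should correspond to x_train.shape[1:]. A quick sanity check with a dummy array of the same shape:

```python
import numpy as np

x_train = np.zeros((5000, 128, 128))  # dummy stand-in with the question's shapes
print(x_train.shape[0])               # 5000 -> number of samples (the batch axis)
print(x_train.shape[1:])              # (128, 128) -> shape of one sample, i.e. input_shape
```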

2. 2D input to Dense layers

Each of your samples (there are 5000 in total in this case) is a 2D matrix of shape (128, 128). A Dense layer cannot consume it directly without flattening it first. (Alternatively, you can use a different type of layer better suited to 2D/3D input, discussed later.)

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Flatten

model = Sequential()
model.add(Flatten(input_shape=(128,128)))
model.add(Dense(12, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
Model: "sequential_6"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
flatten_3 (Flatten)          (None, 16384)             0         
_________________________________________________________________
dense_5 (Dense)              (None, 12)                196620    
_________________________________________________________________
dense_6 (Dense)              (None, 8)                 104       
_________________________________________________________________
dense_7 (Dense)              (None, 1)                 9         
=================================================================
Total params: 196,733
Trainable params: 196,733
Non-trainable params: 0
_________________________________________________________________
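One more thing worth noting, separate from the shape fix: your arrays hold raw pixel intensities in the 0-255 range, and training is usually more stable if you scale them into [0, 1] before calling fit. A minimal sketch, assuming x_train is an integer pixel array like the one shown in the question:

```python
import numpy as np

# Random stand-in for the question's 0-255 pixel data
x_train = np.random.randint(0, 256, (5000, 128, 128)).astype('float32')
x_train /= 255.0  # scale pixel intensities into [0, 1]
print(x_train.min() >= 0.0, x_train.max() <= 1.0)  # True True
```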

3. Using a different architecture for your problem

"Is my NN too simplistic for this type of problem?"

It is not so much about the complexity of the architecture as it is about using layer types that can handle a certain kind of data. In this case, your images have a single channel, (128, 128), which is a 2D input. Typically, color images have RGB channels and end up as input of shape (128, 128, 3).
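The Reshape layer in the model below adds that missing channel axis inside the model. Equivalently, you can add it to the data itself with NumPy before training (in which case you would drop the Reshape layer and pass input_shape=(128, 128, 1) to the first Conv2D):

```python
import numpy as np

x_train = np.zeros((5000, 128, 128))        # grayscale batch, no channel axis
x_train = np.expand_dims(x_train, axis=-1)  # append a single channel axis
print(x_train.shape)                        # (5000, 128, 128, 1)
```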

The general practice is to use CNN layers for this.

An example is shown below:

from tensorflow.keras import Sequential
from tensorflow.keras.layers import Dense, Flatten, Conv2D, MaxPooling2D, Reshape

model = Sequential()
model.add(Reshape((128,128,1), input_shape=(128,128)))
model.add(Conv2D(5, 5, activation='relu'))
model.add(MaxPooling2D((2,2)))
model.add(Conv2D(10, 5, activation='relu'))
model.add(MaxPooling2D((2,2)))
model.add(Conv2D(20, 5, activation='relu'))
model.add(MaxPooling2D((2,2)))
model.add(Conv2D(30, 5, activation='relu'))
model.add(MaxPooling2D((2,2)))
model.add(Flatten())
model.add(Dense(12, activation='relu'))
model.add(Dense(8, activation='relu'))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
Model: "sequential_13"
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
reshape_5 (Reshape)          (None, 128, 128, 1)       0         
_________________________________________________________________
conv2d_16 (Conv2D)           (None, 124, 124, 5)       130       
_________________________________________________________________
max_pooling2d_16 (MaxPooling (None, 62, 62, 5)         0         
_________________________________________________________________
conv2d_17 (Conv2D)           (None, 58, 58, 10)        1260      
_________________________________________________________________
max_pooling2d_17 (MaxPooling (None, 29, 29, 10)        0         
_________________________________________________________________
conv2d_18 (Conv2D)           (None, 25, 25, 20)        5020      
_________________________________________________________________
max_pooling2d_18 (MaxPooling (None, 12, 12, 20)        0         
_________________________________________________________________
conv2d_19 (Conv2D)           (None, 8, 8, 30)          15030     
_________________________________________________________________
max_pooling2d_19 (MaxPooling (None, 4, 4, 30)          0         
_________________________________________________________________
flatten_9 (Flatten)          (None, 480)               0         
_________________________________________________________________
dense_23 (Dense)             (None, 12)                5772      
_________________________________________________________________
dense_24 (Dense)             (None, 8)                 104       
_________________________________________________________________
dense_25 (Dense)             (None, 1)                 9         
=================================================================
Total params: 27,325
Trainable params: 27,325
Non-trainable params: 0
_________________________________________________________________
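Whichever of the two models you use, the final sigmoid unit outputs one probability per sample, so to turn the output of model.predict into 0/1 class labels you threshold at 0.5. A small sketch with hypothetical probabilities standing in for the model's output:

```python
import numpy as np

probs = np.array([0.10, 0.72, 0.49, 0.93])  # hypothetical sigmoid outputs
labels = (probs > 0.5).astype(int)          # threshold at 0.5
print(labels)                               # [0 1 0 1]
```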

To understand what the Conv2D and MaxPooling layers do, check out my well-accepted answer on Data Science stack exchange, or check out this blog. But the image below is a general way of visualizing in your head what they are doing.