How to extract CNN activations using keras?

I want to extract the CNN activations from the first fully connected layer using Keras. Caffe has a feature for this, but I can't use that framework because I ran into installation problems. I am reading a research paper that uses these CNN activations, but the authors work in Caffe.

Is there a way to extract those CNN activations so that I can use them as the items of transactions for mining association rules with the Apriori algorithm?

Of course, I first have to extract the k activations with the largest magnitude. Each image is then one transaction, and each activation is one item (see the sketch after the code below).

So far I have the following code:

from __future__ import print_function
import keras
from keras.datasets import mnist
from keras.layers import Dense, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.models import Sequential

# MNIST images are 28x28 grayscale, with 10 classes
input_shape = (28, 28, 1)
num_classes = 10

model = Sequential()
model.add(Conv2D(32, kernel_size=(5, 5), strides=(1, 1),
                 activation='relu',
                 input_shape=input_shape))
model.add(MaxPooling2D(pool_size=(2, 2), strides=(2, 2)))
model.add(Conv2D(64, (5, 5), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Flatten())
model.add(Dense(1000, activation='relu'))
model.add(Dense(num_classes, activation='softmax'))

model.compile(loss=keras.losses.categorical_crossentropy,
              optimizer=keras.optimizers.Adam(),
              metrics=['accuracy'])
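As a minimal sketch of the transaction-building step described above (an illustration, not the paper's exact procedure): attach an intermediate model to the Dense(1000) layer, then keep, for each image, only the indices of its k largest-magnitude activations. The value k = 10 and the 100-image slice are assumed here.

import numpy as np
from keras.models import Model

# Bring a few MNIST test images into the model's input format
(_, _), (x_test, _) = mnist.load_data()
x_test = x_test.reshape(-1, 28, 28, 1).astype('float32') / 255.0

# model.layers[-2] is the Dense(1000) layer defined above
fc1_extractor = Model(model.input, model.layers[-2].output)
activations = fc1_extractor.predict(x_test[:100])  # shape: (100, 1000)

# One image = one transaction; one activation index = one item
k = 10  # assumed top-k cutoff
transactions = [set(np.argsort(np.abs(row))[-k:]) for row in activations]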

The following solution uses TensorFlow Keras.

To access the activations, we first have to pass one or more images through the model; the resulting activations then correspond to those images.

The code for loading and preprocessing the input image is shown below:

import os
import tensorflow as tf
from tensorflow.keras.preprocessing import image

Test_Dir = '/Deep_Learning_With_Python_Book/Dogs_Vs_Cats_Small/test/cats'
Image_File = os.path.join(Test_Dir, 'cat.1545.jpg')

# Load the image at the model's expected input size
Image = image.load_img(Image_File, target_size=(150, 150))
Image_Tensor = image.img_to_array(Image)
print(Image_Tensor.shape)

# Add a batch axis and rescale pixel values to [0, 1]
Image_Tensor = tf.expand_dims(Image_Tensor, axis=0)
Image_Tensor = Image_Tensor / 255.0

After defining the model, we can access the activations of any layer with the code shown below (here, a model trained on the Cats vs. Dogs dataset):

from tensorflow.keras.models import Model

# Extract the model outputs for all the layers
Model_Outputs = [layer.output for layer in model.layers]
# Create a model with the original input and every layer's output
Activation_Model = Model(model.input, Model_Outputs)
Activations = Activation_Model.predict(Image_Tensor)
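If only the first fully connected layer is needed, the same idea works with a single output instead of one per layer. A minimal sketch, assuming the second-to-last layer is that fully connected layer (as in the architecture used here):

# Build a model that outputs only the first fully connected layer;
# model.layers[-2] corresponds to Activations[-2] below. Adjust the
# index, or use model.get_layer('<name>'), for other architectures.
FC_Model = Model(model.input, model.layers[-2].output)
FC_Activations = FC_Model.predict(Image_Tensor)
print(FC_Activations.shape)  # e.g. (1, 512)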

The output of the first fully connected layer (on the Cats vs. Dogs data) can be printed as follows:

print('Shape of Activation of First Fully Connected Layer is', Activations[-2].shape)
print('------------------------------------------------------------------------------------------')
print('Activation of First Fully Connected Layer is', Activations[-2])

Its output is shown below:

Shape of Activation of First Fully Connected Layer is (1, 512)
------------------------------------------------------------------------------------------
Activation of First Fully Connected Layer is [[0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.02759874 0.         0.         0.         0.
  0.         0.         0.00079661 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.04887392 0.         0.
  0.04422646 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.01124999
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.00286965 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.00027195 0.
  0.         0.02132209 0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.00511147 0.         0.         0.02347952 0.
  0.         0.         0.         0.         0.         0.
  0.02570331 0.         0.         0.         0.         0.03443285
  0.         0.         0.         0.         0.         0.
  0.         0.0068848  0.         0.         0.         0.
  0.         0.         0.         0.         0.00936454 0.
  0.00389365 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.00152553 0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.09215052 0.         0.         0.0284613  0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.00198757 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.02395868 0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.01150922 0.0119792
  0.         0.         0.         0.         0.         0.
  0.00775307 0.         0.         0.         0.         0.
  0.         0.         0.         0.01026413 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.01522083 0.         0.00377031 0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.02235368 0.         0.         0.         0.
  0.         0.         0.         0.         0.00317057 0.
  0.         0.         0.         0.         0.         0.
  0.03029975 0.         0.         0.         0.         0.
  0.         0.         0.03843511 0.         0.         0.
  0.         0.         0.         0.         0.         0.02327696
  0.00557329 0.         0.02251234 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.01655817 0.         0.
  0.         0.         0.         0.         0.00221658 0.
  0.         0.         0.         0.02087847 0.         0.
  0.         0.         0.02594821 0.         0.         0.
  0.         0.         0.01515464 0.         0.         0.
  0.         0.         0.         0.         0.00019883 0.
  0.         0.         0.         0.         0.         0.00213376
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.00237587
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.02521542 0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.00490679 0.         0.04504126 0.         0.         0.
  0.         0.         0.         0.         0.         0.
  0.         0.        ]]
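To connect this back to the question's goal, below is a hedged sketch of the Apriori step using the third-party mlxtend library (an assumption; the paper may use a different implementation). Random stand-in activations are used so the snippet runs on its own; in practice you would stack real first-fully-connected-layer activations for many images.

import numpy as np
import pandas as pd
from mlxtend.preprocessing import TransactionEncoder
from mlxtend.frequent_patterns import apriori

rng = np.random.default_rng(0)
activations = rng.random((50, 512))   # stand-in for real FC activations

# One transaction per image: the indices of its k largest activations
k = 10                                # assumed top-k cutoff
transactions = [np.argsort(row)[-k:].tolist() for row in activations]

# One-hot encode the transactions and mine frequent itemsets
te = TransactionEncoder()
onehot = te.fit(transactions).transform(transactions)
frequent = apriori(pd.DataFrame(onehot, columns=te.columns_),
                   min_support=0.02, use_colnames=True)
print(frequent.head())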

For more details, refer to Section 5.4.1, "Visualizing intermediate activations", in the book Deep Learning with Python by François Chollet, the father of Keras.

Hope this helps. Happy Learning!