Building an autoencoder network for parameter predictions

I am new to machine learning. I have a 1-D signal stored in the first column of a file, and its corresponding frequency, mean_amplitude, and a time stored in the second column. These are the input-output pairs for supervised training: for a tested 1-D signal, I need the network to output the frequency, the mean_amplitude, and a time.

-0.000000000000000000e+00     5.80000    
-0.000000000000000000e+00     3.11111   
-0.000000000000000000e+00    -1.3666
-0.000000000000000000e+00
-1.366125990000000065e-14
-1.032400010000000034e-13
-6.034000879999999677e-13
-5.719921059999999811e-13
-1.361178959999999947e-12
-9.374413750000000466e-11
-1.666704970000000006e-10
-1.149504050000000062e-09
5.453276159999999863e-10
1.457022949999999906e-09
-5.355599959999999815e-09
-4.683606839999999697e-09
-2.849577019999999957e-09
-1.108899989999999921e-08
-2.849577019999999957e-09
-4.683606839999999697e-09
-5.355599959999999815e-09
1.457022949999999906e-09
5.453276159999999863e-10
-1.149504050000000062e-09
-1.666704970000000006e-10
-9.374413750000000466e-11
-1.361178959999999947e-12
-5.719921059999999811e-13
-6.034000879999999677e-13
-1.032400010000000034e-13
-0.000000000000000000e+00
-0.000000000000000000e+00
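For reference, this is how I read one such file: the first column is the full signal, and only the first few rows carry a second column with the three target values. (`load_pair` is just a helper name I made up, not from any library.)

```python
import numpy as np

def load_pair(path):
    # Column 1 holds the 1-D signal; the leading rows of column 2
    # hold the targets (frequency, mean_amplitude, time).
    signal, targets = [], []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if not parts:          # skip blank lines
                continue
            signal.append(float(parts[0]))
            if len(parts) > 1:     # row also carries a target value
                targets.append(float(parts[1]))
    return np.array(signal), np.array(targets)
```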

In a similar fashion, I have saved 1000 input-output pairs in the attached directory. I want to train an autoencoder network and have it predict the frequency, mean_amplitude, and time for a new test signal.

I would appreciate some advice on how to feed such input-output pairs into an autoencoder.

I found the following code in a Keras tutorial, but I don't know how to adapt it to this kind of data. I hope a machine learning expert can share some ideas.

from tensorflow.keras import layers
from tensorflow.keras.models import Model

input = layers.Input(shape=(28, 28, 1))

# Encoder
x = layers.Conv2D(32, (3, 3), activation="relu", padding="same")(input)
x = layers.MaxPooling2D((2, 2), padding="same")(x)
x = layers.Conv2D(32, (3, 3), activation="relu", padding="same")(x)
x = layers.MaxPooling2D((2, 2), padding="same")(x)

# Decoder
x = layers.Conv2DTranspose(32, (3, 3), strides=2, activation="relu", padding="same")(x)
x = layers.Conv2DTranspose(32, (3, 3), strides=2, activation="relu", padding="same")(x)
x = layers.Conv2D(1, (3, 3), activation="sigmoid", padding="same")(x)

# Autoencoder
autoencoder = Model(input, x)
autoencoder.compile(optimizer="adam", loss="binary_crossentropy")
autoencoder.summary()

autoencoder.fit(
    x=train_data,
    y=train_data,
    epochs=50,
    batch_size=128,
    shuffle=True,
    validation_data=(test_data, test_data),
)

Here is a simple working model with dummy data, as requested:

import tensorflow as tf

signal_input = tf.keras.layers.Input(shape=(1,))
x = tf.keras.layers.Dense(16, activation='relu')(signal_input)
x = tf.keras.layers.Dense(8, activation='relu')(x)
output = tf.keras.layers.Dense(3, activation='linear')(x)

model = tf.keras.models.Model(inputs=signal_input, outputs=output)
model.compile(optimizer='adam',
              loss='MSE')

signals = tf.random.normal((1000,1)) # 1000 signals with 1 value each
labels = tf.random.normal((1000, 3)) # 1000 labels with 3 values for frequency, mean_amplitude, and a time

model.fit(x = signals, y = labels, epochs=5, batch_size=8)

And the output:

Epoch 1/5
32/32 [==============================] - 0s 1ms/step - loss: 1.0087
Epoch 2/5
32/32 [==============================] - 0s 1ms/step - loss: 0.9856
Epoch 3/5
32/32 [==============================] - 0s 1ms/step - loss: 0.9777
Epoch 4/5
32/32 [==============================] - 0s 1ms/step - loss: 0.9747
Epoch 5/5
32/32 [==============================] - 0s 1ms/step - loss: 0.9733
<keras.callbacks.History at 0x7f4d0909f7d0>
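After training, a new test signal goes through `model.predict`, which returns one row of three values per signal. A minimal sketch (the model here is rebuilt with untrained weights, so only the output shape is meaningful, not the numbers):

```python
import tensorflow as tf

# Same toy architecture as above (untrained here, so the predicted
# values are meaningless; this only shows the predict call and shape)
signal_input = tf.keras.layers.Input(shape=(1,))
x = tf.keras.layers.Dense(16, activation='relu')(signal_input)
x = tf.keras.layers.Dense(8, activation='relu')(x)
output = tf.keras.layers.Dense(3, activation='linear')(x)
model = tf.keras.models.Model(inputs=signal_input, outputs=output)

# One new test signal -> one row of [frequency, mean_amplitude, time]
new_signal = tf.random.normal((1, 1))
prediction = model.predict(new_signal, verbose=0)
print(prediction.shape)  # (1, 3)
```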

This should give you an idea of how to implement a model for your data.
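One caveat: the dummy model feeds only a single value per signal, while each of your files holds a signal with many samples (about 30 in the excerpt). In that case the input shape should match the signal length. A sketch under that assumption, with a made-up fixed length `N = 32` (pad or truncate each file's signal to it):

```python
import tensorflow as tf

N = 32  # assumed fixed signal length; pad/truncate each file's signal to this

signal_input = tf.keras.layers.Input(shape=(N,))
x = tf.keras.layers.Dense(64, activation='relu')(signal_input)
x = tf.keras.layers.Dense(16, activation='relu')(x)
# Three regression outputs: frequency, mean_amplitude, time
output = tf.keras.layers.Dense(3, activation='linear')(x)

model = tf.keras.models.Model(signal_input, output)
model.compile(optimizer='adam', loss='mse')

# Dummy stand-ins for the 1000 file pairs
signals = tf.random.normal((100, N))
labels = tf.random.normal((100, 3))
model.fit(signals, labels, epochs=1, batch_size=8, verbose=0)
```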