TensorFlow DataSet `from_generator` with variable batch size

I am trying to read HDF5 files with the TensorFlow dataset API, using the `from_generator` method. Everything works fine unless the batch size does not evenly divide the number of events. I don't quite see how to do flexible batching with the API.

If things don't divide evenly, you get errors like:

2018-08-31 13:47:34.274303: W tensorflow/core/framework/op_kernel.cc:1263] Invalid argument: ValueError: `generator` yielded an element of shape (1, 28, 28, 1) where an element of shape (11, 28, 28, 1) was expected.
Traceback (most recent call last):

  File "/Users/perdue/miniconda3/envs/py3a/lib/python3.6/site-packages/tensorflow/python/ops/script_ops.py", line 206, in __call__
    ret = func(*args)

  File "/Users/perdue/miniconda3/envs/py3a/lib/python3.6/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 452, in generator_py_func
    "of shape %s was expected." % (ret_array.shape, expected_shape))

ValueError: `generator` yielded an element of shape (1, 28, 28, 1) where an element of shape (11, 28, 28, 1) was expected.

I have a script that reproduces the error (and instructions for getting the required data file, which is a few MB - Fashion MNIST):

https://gist.github.com/gnperdue/b905a9c2dd4c08b53e0539d6aa3d3dc6

The most important code is probably:

def make_fashion_dset(file_name, batch_size, shuffle=False):
    dgen = _make_fashion_generator_fn(file_name, batch_size)
    features_shape = [batch_size, 28, 28, 1]
    labels_shape = [batch_size, 10]
    ds = tf.data.Dataset.from_generator(
        dgen, (tf.float32, tf.uint8),
        (tf.TensorShape(features_shape), tf.TensorShape(labels_shape))
    )
    ...

where `dgen` is a generator function that reads from the HDF5 file:

def _make_fashion_generator_fn(file_name, batch_size):
    reader = FashionHDF5Reader(file_name)
    nevents = reader.openf()

    def example_generator_fn():
        start_idx, stop_idx = 0, batch_size
        while True:
            if start_idx >= nevents:
                reader.closef()
                return
            yield reader.get_examples(start_idx, stop_idx)
            start_idx, stop_idx = start_idx + batch_size, stop_idx + batch_size

    return example_generator_fn
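The slicing logic above produces one final, smaller batch whenever `batch_size` does not evenly divide the number of events (h5py slicing clips the stop index at the end of the dataset), which is exactly the shape mismatch in the traceback. A minimal sketch of that arithmetic in plain Python, with no HDF5 dependency (`batch_ranges` is a hypothetical helper mirroring the index bookkeeping above):

```python
def batch_ranges(nevents, batch_size):
    """Yield the (start, stop) index pairs the generator above would use,
    clipping the final stop index at nevents the way HDF5 slicing does."""
    start, stop = 0, batch_size
    while start < nevents:
        yield start, min(stop, nevents)
        start, stop = start + batch_size, stop + batch_size

# With 25 events and a batch size of 11, the last batch has only 3 events,
# so its shape no longer matches the declared (11, 28, 28, 1).
sizes = [stop - start for start, stop in batch_ranges(25, 11)]
print(sizes)  # [11, 11, 3]
```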

The core of the problem is that we have to declare the tensor shapes in `from_generator`, but we need the flexibility to change that shape while iterating.

There are a few workarounds - dropping the last few samples so the division comes out even, or just using a batch size of 1... but the first is no good if you cannot afford to lose any samples, and a batch size of 1 is very slow.

Any ideas or comments? Thanks!

When specifying tensor shapes in `from_generator`, you can use `None` as an element of the shape to mark a variable-sized dimension. This lets you accommodate batches of different sizes, in particular a "leftover" batch that is a bit smaller than your requested batch size. So you would use

def make_fashion_dset(file_name, batch_size, shuffle=False):
    dgen = _make_fashion_generator_fn(file_name, batch_size)
    features_shape = [None, 28, 28, 1]
    labels_shape = [None, 10]
    ds = tf.data.Dataset.from_generator(
        dgen, (tf.float32, tf.uint8),
        (tf.TensorShape(features_shape), tf.TensorShape(labels_shape))
    )
    ...