How to use the Tensorflow Dataset Pipeline for Variable Length Inputs?

I am training a recurrent neural network in TensorFlow on a dataset of variable-length numeric sequences, and I have been trying to use the tf.data API to build an efficient input pipeline. However, I can't seem to get it to work.

My approach

My dataset is a NumPy array of shape [10000, ?, 32, 2], saved on disk as a .npy file. Here, ? indicates that the elements have variable length along the second dimension. 10000 is the number of mini-batches in the dataset, and 32 is the size of a single mini-batch.

I open this dataset with np.load and try to create a tf.data.Dataset object using the from_tensor_slices method, but it seems this only works when all the input tensors have the same shape!

I tried reading the docs, but they only give a very simple example.

My code

The .npy file is generated as follows -

import numpy as np

dataset = []
for i in xrange(num_items):
  # add an element of shape [?, 32, 2] to the list, where `?` takes
  # a random value between [1, 40]
  dataset.append(generate_random_rnn_input())

with open('data.npy', 'wb') as f:
  np.save(f, dataset)
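
(Note: generate_random_rnn_input() is not defined in the question; a minimal, purely illustrative stand-in that matches the description above might look like this.)

def generate_random_rnn_input():
  # illustrative only: return a float array of shape [?, 32, 2], where
  # the leading dimension is a random length between 1 and 40
  length = np.random.randint(1, 41)
  return np.random.rand(length, 32, 2)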

The code given below is my attempt to create a tf.data.Dataset object -

import numpy as np
import tensorflow as tf

# dataset_list is a list containing `num_items` number of items
# and each item has shape [?, 32, 2]
dataset_list = np.load('data.npy')

# error, this doesn't work!
dataset = tf.data.Dataset.from_tensor_slices(dataset_list)

The error I get is "TypeError: Expected binary or unicode string, got array([[[0.0875, 0. ], ..."
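
(A quick inspection - my own addition, not in the original post - suggests why: because the per-item lengths differ, np.save stores the list as a 1-D array of dtype object, which from_tensor_slices cannot convert into a single dense tensor.)

# illustrative check: the loaded array is an object array of ragged elements
print(dataset_list.dtype)      # object
print(dataset_list.shape)      # (10000,) -- one entry per mini-batch
print(dataset_list[0].shape)   # e.g. (17, 32, 2) -- varies per element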

Update: still need help!

So I tried @mrry's answer, and now I am able to create the Dataset object. However, I cannot iterate over this dataset with an iterator as described in the tutorial. This is what my code looks like now -

dataset_list = np.load('data.npy')

dataset = tf.data.Dataset.from_generator(lambda: dataset_list, 
                                         dataset_list[0].dtype,
                                         tf.TensorShape([None, 32, 2]))

dataset = dataset.map(lambda x : tf.cast(x, tf.float32))

iterator = dataset.make_one_shot_iterator()
next_element = iterator.get_next()

with tf.Session() as sess:
  print(sess.run(next_element)) # The code fails on this line

The error I get is AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'. I have no idea what this means.

Here is the full stack trace -

2018-05-15 04:19:25.559922: W tensorflow/core/framework/op_kernel.cc:1261] Unknown: exceptions.AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'
Traceback (most recent call last):

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/ops/script_ops.py", line 147, in __call__
    ret = func(*args)

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 378, in generator_py_func
    nest.flatten_up_to(output_types, values), flattened_types)

AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'


2018-05-15 04:19:25.559989: W tensorflow/core/framework/op_kernel.cc:1273] OP_REQUIRES failed at iterator_ops.cc:891 : Unknown: exceptions.AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'
Traceback (most recent call last):

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/ops/script_ops.py", line 147, in __call__
    ret = func(*args)

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 378, in generator_py_func
    nest.flatten_up_to(output_types, values), flattened_types)

AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'


     [[Node: PyFunc = PyFunc[Tin=[DT_INT64], Tout=[DT_DOUBLE], token="pyfunc_1"](arg0)]]
Traceback (most recent call last):
  File "pipeline_test.py", line 320, in <module>
    tf.app.run()
  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/platform/app.py", line 126, in run
    _sys.exit(main(argv))
  File "pipeline_test.py", line 316, in main
    train(FLAGS.num_training_iterations, FLAGS.report_interval, FLAGS.report_interval_verbose)
  File "pipeline_test.py", line 120, in train
    print(sess.run(next_element))
  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 905, in run
    run_metadata_ptr)
  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1140, in _run
    feed_dict_tensor, options, run_metadata)
  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1321, in _do_run
    run_metadata)
  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/client/session.py", line 1340, in _do_call
    raise type(e)(node_def, op, message)
tensorflow.python.framework.errors_impl.UnknownError: exceptions.AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'
Traceback (most recent call last):

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/ops/script_ops.py", line 147, in __call__
    ret = func(*args)

  File "/home/vastolorde95/virtualenvs/thesis/local/lib/python2.7/site-packages/tensorflow/python/data/ops/dataset_ops.py", line 378, in generator_py_func
    nest.flatten_up_to(output_types, values), flattened_types)

AttributeError: 'numpy.dtype' object has no attribute 'as_numpy_dtype'


     [[Node: PyFunc = PyFunc[Tin=[DT_INT64], Tout=[DT_DOUBLE], token="pyfunc_1"](arg0)]]
     [[Node: IteratorGetNext = IteratorGetNext[output_shapes=[[?,32,2]], output_types=[DT_FLOAT], _device="/job:localhost/replica:0/task:0/device:CPU:0"](OneShotIterator)]]

Answer

As you can see, tf.data.Dataset.from_tensor_slices() only works on objects that can be converted to a (dense) tf.Tensor or a tf.SparseTensor. The easiest way to get variable-length NumPy data into a Dataset is to use tf.data.Dataset.from_generator(), as follows:

dataset = tf.data.Dataset.from_generator(lambda: dataset_list, 
                                         tf.as_dtype(dataset_list[0].dtype),
                                         tf.TensorShape([None, 32, 2]))
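
As a follow-up sketch (not part of the answer above): once the Dataset has been created with from_generator, the ragged elements can either be consumed one at a time, or padded to a common length per batch with padded_batch. The batch size of 4 below is arbitrary.

import numpy as np
import tensorflow as tf

dataset_list = np.load('data.npy')

dataset = tf.data.Dataset.from_generator(lambda: dataset_list,
                                         tf.as_dtype(dataset_list[0].dtype),
                                         tf.TensorShape([None, 32, 2]))

# pad the variable-length dimension so elements can be batched together;
# each batch is padded to the length of its longest element
batched = dataset.padded_batch(4, padded_shapes=tf.TensorShape([None, 32, 2]))

iterator = batched.make_one_shot_iterator()
next_batch = iterator.get_next()

with tf.Session() as sess:
  # each fetched batch has shape [4, max_len_in_batch, 32, 2]
  print(sess.run(next_batch).shape)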