What is this error in PyTables?
I'm using pytables through pandas in Python. I'm trying to load a file with pandas.read_hdf(), but I get this nasty error. I hope I haven't lost 1.1 GB of irreplaceable data. I saw no errors during saving; everything seemed to work fine.
Can anyone explain what this error means?
Also, is there any way to recover the data?
HDF5ExtError: HDF5 error back trace
File "H5Dio.c", line 174, in H5Dread
can't read data
File "H5Dio.c", line 449, in H5D_read
can't read data
File "H5Dchunk.c", line 1729, in H5D_chunk_read
unable to read raw data chunk
File "H5Dchunk.c", line 2755, in H5D_chunk_lock
unable to read raw data chunk
File "H5Fio.c", line 113, in H5F_block_read
read through metadata accumulator failed
File "H5Faccum.c", line 254, in H5F_accum_read
driver read request failed
File "H5FDint.c", line 142, in H5FD_read
driver read request failed
File "H5FDsec2.c", line 720, in H5FD_sec2_read
addr overflow, addr = 1108161578, size=7512, eoa=1108155712
A similar question is here.
Bottom line: your file is borked, and it is not recoverable from this. This is specifically warned about (using multiple threads/processes as writers). See the docs here.
HDF5 is not threadsafe/process-safe for writers.
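To avoid this kind of corruption in the first place, the usual pattern is to funnel all writes through a single writer, e.g. by guarding every write with one lock. The sketch below illustrates that pattern with threads; the `store` list is a stand-in for the real HDF5 store (opening a `pandas.HDFStore` would require the file on disk and PyTables installed), so the lock discipline, not the storage call, is the point.

```python
import threading

# Stand-in for the HDF5 store. In real code this would be a single
# pandas.HDFStore opened once, and write_chunk would call
# store.append("df", chunk) instead of list.append.
store = []
store_lock = threading.Lock()  # exactly one lock guards every write

def write_chunk(chunk):
    # HDF5 is not threadsafe for writers: serialize all writes
    # through one lock (or funnel them into a single writer thread).
    with store_lock:
        store.append(chunk)

threads = [threading.Thread(target=write_chunk, args=(i,)) for i in range(8)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(sorted(store))  # every chunk arrives intact: [0, 1, ..., 7]
```

The same idea applies across processes: have worker processes send their results over a queue to one dedicated writer process that alone touches the HDF5 file.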