pyshark packets queue and pickle errors

I'm trying to capture packets in real time using pyshark.

When I try to get the packets from a multiprocessing.Queue, or to un-pickle them, I get the following error:

python2.7/site-packages/pyshark/packet/layer.py", line 48, in __getattr__
val = self.get_field_value(item, raw=self.raw_mode)
(... multiple times ...)
RuntimeError: maximum recursion depth exceeded while calling a Python object

I suspect the problem occurs when the object is reconstructed, whether it is retrieved from the queue or unpickled.
Surprisingly, there is no error when I do the same thing with a Queue.Queue.
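That difference makes sense if you consider what the two queues actually do with an item. Below is a minimal sketch (Python 3 module names; `Payload` is a hypothetical stand-in class, not part of pyshark) showing that an in-process Queue.Queue hands back the very same object, while pickle (which multiprocessing.Queue relies on to cross the process boundary) rebuilds a copy:

```python
import pickle
import queue  # the Python 3 name of the Python 2 "Queue" module

class Payload(object):
    """Hypothetical stand-in for a captured packet."""
    pass

obj = Payload()

# An in-process Queue.Queue just passes object references around:
# the item that comes out is the very same object that went in,
# so no reconstruction ever happens.
q = queue.Queue()
q.put(obj)
same = q.get()

# multiprocessing.Queue must move items between processes, so it
# pickles each item on put() and unpickles it on get() -- the
# consumer receives a *rebuilt copy*, and it is this rebuilding
# step that the packet objects cannot survive.
copy = pickle.loads(pickle.dumps(obj))

print(same is obj, copy is obj)
```

So Queue.Queue never exercises the pickle machinery at all, which is why only the multiprocessing and explicit-pickle paths crash.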

Here is the code used to reproduce the problem:

import pyshark
import multiprocessing
import Queue
import cPickle as pickle

# Capture on eth0
interface = pyshark.LiveCapture(interface="eth0")

def queue_test(queue):
    """ Puts captured packets in a queue, then un-queue them and display """
    for packet in interface.sniff_continuously(packet_count=5):
        queue.put(packet)
    while not queue.empty():
        packet = queue.get()
        print "Packet {} {}".format(packet.highest_layer, packet._packet_string)

def pickle_test():
    """ Immediately pickle and unpickle the packet to display it"""
    for packet in interface.sniff_continuously(packet_count=5):
        pickled_packet = pickle.loads(pickle.dumps(packet, pickle.HIGHEST_PROTOCOL))
        print "Packet {} {}".format(pickled_packet.highest_layer, pickled_packet._packet_string)


if __name__ == "__main__":
    normal_queue = Queue.Queue()
    process_queue = multiprocessing.Queue()

    # Runs fine
    queue_test(normal_queue)

    # Both crash with a RuntimeError
    queue_test(process_queue)
    pickle_test()

Why am I getting these RuntimeErrors, and what can I do about it?
Am I doing something wrong, or is it a problem with pyshark?

Not having much success here, I posted an issue on pyshark's GitHub, and it turned out that something was missing in the library:

This is caused by the fact that some of the classes packet uses override getattr. Fixed in 541fc52

Link to the issue: https://github.com/KimiNewt/pyshark/issues/63
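The mechanism behind that answer can be reproduced without pyshark. Here is a minimal sketch (Python 3, where RecursionError is a subclass of RuntimeError; `Broken` and its `fields` dict are hypothetical, loosely modeled on pyshark's Layer) of how an unguarded `__getattr__` override blows up during unpickling:

```python
import pickle

class Broken(object):
    """Hypothetical class that overrides __getattr__ without
    accounting for pickle's attribute probes."""

    def __init__(self):
        self.fields = {"highest_layer": "TCP"}

    def __getattr__(self, item):
        # __getattr__ is called for every *missing* attribute,
        # including special names pickle probes for (__setstate__, ...).
        # While unpickling, self.fields has not been restored yet, so
        # the lookup below re-enters __getattr__("fields") and recurses
        # until the interpreter hits its recursion limit.
        try:
            return self.fields[item]
        except KeyError:
            raise AttributeError(item)

data = pickle.dumps(Broken())   # serializing succeeds
error = None
try:
    pickle.loads(data)          # rebuilding the object recurses forever
except RuntimeError as exc:     # "maximum recursion depth exceeded"
    error = exc
print(type(error).__name__)
```

Pickle rebuilds the instance without calling `__init__` and only then restores its `__dict__`; any attribute it probes in between lands in `__getattr__`, which in turn touches attributes that do not exist yet. The fix referenced in the answer amounts to making such classes safe to reconstruct in that half-initialized state.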