
Segmentation fault returning OGR Layer object from function in Python

I have a simple program that processes points from a GeoPackage layer. On my first attempt, I wrapped the file access in a function:

from osgeo import ogr

pointsFile = "points.gpkg"

def getPoints():

    driver = ogr.GetDriverByName("GPKG")
    dataSource = driver.Open(pointsFile, 0)
    layer = dataSource.GetLayer(0)
    print("Returning layer")
    return layer

def main():

    layer = getPoints()
    print("Number of points to process: ", layer.GetFeatureCount())


if __name__ == '__main__': main()

When the layer object is returned from the function, the program fails with a segmentation fault:

$ python3 testReturn.py
Returning layer
Segmentation fault (core dumped)

However, with the file access done directly in main:

from osgeo import ogr

pointsFile = "points.gpkg"

def main():

    driver = ogr.GetDriverByName("GPKG")
    dataSource = driver.Open(pointsFile, 0)
    layer = dataSource.GetLayer(0)
    print("Number of points to process: ", layer.GetFeatureCount())


if __name__ == '__main__': main()

the program runs as expected:

$ python3 testDirect.py
Number of points to process:  21872

What is causing this problem?

I ran the code under GDB; the segmentation fault occurs on the call:

layer.GetFeatureCount()

Some additional debugging information:

Starting program: /usr/bin/python3 testReturn.py
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
Returning layer

Program received signal SIGSEGV, Segmentation fault.
0x00007ffff5c42298 in OGR_L_GetFeatureCount () from /usr/local/lib/libgdal.so.20
(gdb)
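
For reference, below is a minimal sketch of a workaround I am considering, based on the assumption that the returned layer only borrows its parent DataSource and becomes invalid once dataSource goes out of scope at the end of getPoints(); returning the DataSource alongside the layer keeps it referenced in the caller. The tuple return is my own addition and not part of the original program:

from osgeo import ogr

pointsFile = "points.gpkg"

def getPoints():
    # Also return the DataSource so the caller holds a reference to it
    # (assumption: the layer becomes invalid if the DataSource is released).
    driver = ogr.GetDriverByName("GPKG")
    dataSource = driver.Open(pointsFile, 0)
    layer = dataSource.GetLayer(0)
    print("Returning layer")
    return dataSource, layer

def main():
    dataSource, layer = getPoints()
    print("Number of points to process: ", layer.GetFeatureCount())


if __name__ == '__main__': main()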