Python cx_Oracle: INSERT INTO a table with many columns, generating the bind values (:1, :2 ... :100) automatically

I am writing a script that reads roughly 75 columns from an Oracle table in one environment and loads them into a table with the same definition in a different environment. So far I have been using the cx_Oracle cur.execute() method with 'INSERT INTO TABLENAME VALUES(:1, :2, :3 .. :8)' and then loading the data with 'cur.execute(sql, conn)'.

However, the table I need to load has 75+ columns, and writing out (:1, :2 ... :75) is tedious, and I am guessing it is not best practice either.

Is there a way to iterate over the number of columns automatically and fill in the values() part of the SQL statement?

import getpass
import cx_Oracle

user = 'username'
password = getpass.getpass()

dsn_prod = cx_Oracle.makedsn(host, port, service_name='')
connection_prod = cx_Oracle.connect(user, password, dsn_prod)
cursor_prod = connection_prod.cursor()

dsn_dev = cx_Oracle.makedsn(host, port, service_name='')
connection_dev = cx_Oracle.connect(user, password, dsn_dev)
cursor_dev = connection_dev.cursor()

SQL_Read = """SELECT * FROM Table_name_Prod"""
cursor_prod.execute(SQL_Read)
for row in cursor_prod:
    # This part is ugly and tedious.
    SQL_Load = "INSERT INTO TABLE_NAME_DEV VALUES(:1, :2, :3, :4 ... :75)"
    cursor_dev.execute(SQL_Load, row)

This is the part I need help with.

connection_dev.commit()
cursor_prod.close()
connection_prod.close()
cursor_dev.close()
connection_dev.close()

You can do something like the following, which helps not only to reduce the amount of code but also to improve performance:

connection_prod = cx_Oracle.connect(...)
cursor_prod = connection_prod.cursor()

# set array size for source cursor to some reasonable value
# increasing this value reduces round-trips but increases memory usage
cursor_prod.arraysize = 500

connection_dev = cx_Oracle.connect(...)
cursor_dev = connection_dev.cursor()

cursor_prod.execute("select * from table_name_prod")
bind_names = ",".join(":" + str(i + 1) \
        for i in range(len(cursor_prod.description)))
sql_load = "insert into table_name_dev values (" + bind_names + ")"
while True:
    rows = cursor_prod.fetchmany()
    if not rows:
        break
    cursor_dev.executemany(sql_load, rows)
    # can call connection_dev.commit() here if you want to commit each batch
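
For very large copies, a middle ground is to commit every few batches rather than after every batch or only at the end. Below is a minimal sketch of that idea, reusing the same cursors and sql_load as above; batches_between_commits is a made-up tuning value, not anything required by cx_Oracle:

batches_between_commits = 10      # hypothetical value; tune for your workload
batch_count = 0
while True:
    rows = cursor_prod.fetchmany()
    if not rows:
        break
    cursor_dev.executemany(sql_load, rows)
    batch_count += 1
    if batch_count % batches_between_commits == 0:
        connection_dev.commit()
connection_dev.commit()           # commit any rows left after the last full group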

The use of cursor.executemany() will significantly improve performance. Hope this helps!
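
As an aside, if the column order of the two tables could ever differ, you might prefer to list the target columns explicitly instead of relying on a positional insert. A rough sketch built from the same cursor_prod.description used above, assuming the column names match between the two tables:

# column names reported by the source query, in SELECT order
columns = [col[0] for col in cursor_prod.description]
column_list = ", ".join(columns)
bind_names = ", ".join(":" + str(i + 1) for i in range(len(columns)))
sql_load = ("insert into table_name_dev (" + column_list + ") "
            "values (" + bind_names + ")")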