Python and Snowflake error when appending to an existing table on the cloud
I am trying to upload a DataFrame to an existing table in the Snowflake cloud. Here is the DataFrame:
columns_df.head()
Now, when I append the data to the existing table using to_sql() from pandas:
columns_df.to_sql('survey_metadata_column_names', index=False, index_label=None, con=conn, schema='PUBLIC', if_exists='append', chunksize=300)
I get the following error:
DatabaseError: Execution failed on sql 'SELECT name FROM sqlite_master
WHERE type='table' AND name=?;': not all arguments converted during
string formatting
TypeError: not all arguments converted during string formatting
Some of the column names contain dashes and underscores.
To write data from a Pandas DataFrame to a Snowflake database, do one of the following:
- Call the write_pandas() function.
- Call the pandas.DataFrame.to_sql() method, and specify pd_writer as the method to use to insert the data into the database.
Note the second point there, as well as write_pandas. I have still noticed a few issues when using both of these methods, but they are the official solutions.
from snowflake.connector.pandas_tools import pd_writer

columns_df.to_sql('survey_metadata_column_names',
                  index=False,
                  index_label=None,
                  con=Engine,  # Engine should be an SQLAlchemy engine (see the sketch below)
                  schema='PUBLIC',
                  if_exists='append',
                  chunksize=300,
                  method=pd_writer)
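The Engine variable above is not defined in the snippet. A minimal sketch of creating one with the snowflake-sqlalchemy package could look like the following; the account, credentials, database, and warehouse values are placeholders, not details from the original post:

from sqlalchemy import create_engine
from snowflake.sqlalchemy import URL

# Placeholder connection details -- replace with your own.
Engine = create_engine(URL(
    account='my_account',
    user='my_user',
    password='my_password',
    database='my_database',
    schema='PUBLIC',
    warehouse='my_warehouse',
))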
Or:
import snowflake.connector
from snowflake.connector.pandas_tools import write_pandas

con = snowflake.connector.connect(...)
success, nchunks, nrows, _ = write_pandas(con,
                                          columns_df,
                                          'survey_metadata_column_names',
                                          chunk_size=300,
                                          schema='PUBLIC')
Note that the first method requires an SQLAlchemy engine, while the second one works with a regular connection.
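For completeness, a minimal sketch of opening that regular connection for write_pandas, again with placeholder parameters rather than values from the original post:

import snowflake.connector

# Placeholder connection details -- replace with your own.
con = snowflake.connector.connect(
    account='my_account',
    user='my_user',
    password='my_password',
    database='my_database',
    schema='PUBLIC',
    warehouse='my_warehouse',
)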
Take a look at the solution I posted here, which supports both writing (create and replace) and appending: write_pandas snowflake connector function is not able to operate on table