SNOW-219884: Pandas datetime with timezone converts to timestamp_ntz in snowflake #199
Comments
Any idea why I am getting this error?
I tried to inspect the SQLAlchemy log file. The SQL statement is:
Or should I set the timestamp type mapping as specified in this link: https://docs.snowflake.com/en/sql-reference/parameters.html#timestamp-type-mapping?
I think I solved it by setting the timestamp type mapping and timezone.
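A minimal sketch of that session-parameter workaround, using the TIMESTAMP_TYPE_MAPPING and TIMEZONE parameters from the docs linked above; the connection URL, table name, and column name are placeholders, and this reflects what the commenter describes rather than an official fix:

```python
# Sketch of the session-parameter workaround described above; not an official fix.
# Connection URL, table name, and column name are placeholders.
import pandas as pd
from sqlalchemy import create_engine
from snowflake.connector.pandas_tools import pd_writer

engine = create_engine("snowflake://<user>:<password>@<account>/<database>/<schema>")

df = pd.DataFrame({
    "event_time": pd.to_datetime(["2020-11-01 12:00:00"]).tz_localize("UTC"),
})

with engine.connect() as conn:
    # Map the generic TIMESTAMP type to TIMESTAMP_TZ (the default mapping is TIMESTAMP_NTZ).
    conn.execute("ALTER SESSION SET TIMESTAMP_TYPE_MAPPING = 'TIMESTAMP_TZ'")
    # Pin the session time zone so timezone-aware values round-trip predictably.
    conn.execute("ALTER SESSION SET TIMEZONE = 'UTC'")
    df.to_sql("events", conn, if_exists="replace", index=False, method=pd_writer)
```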
Saw the same problem using pandas, whose to_sql docs say: "Timezone aware datetime columns will be written as Timestamp with timezone type with SQLAlchemy if supported by the database. Otherwise, the datetimes will be stored as timezone unaware timestamps local to the original timezone."
@lightupyiqian But it seems Snowflake does support timestamp with time zone? I would expect pd_writer to handle that.
Yeah, it should, but it has other problems, like not quoting identifiers properly.
This bug was introduced in v1.1.18 and also touches …
Hi guys, this issue still persists with snowflake-connector-python==2.7.11. My dataframe column dtype is datetime64[ns, UTC], but when I use df.to_sql to write to Snowflake, the resulting datatype is TIMESTAMP_NTZ. Any ideas how to fix this?
Having the same issue with pandas.
I am also encountering the same problem.
Has anyone found a true fix for this? It still persists using SQLAlchemy and pd_writer.
I'm having the same issue. Meanwhile, I was able to get the desired data types by explicitly creating the table with column definitions before ingesting data.
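A sketch of that pre-create workaround, with a hypothetical table name, column names, and connection URL; the table is created with an explicit TIMESTAMP_TZ column first, and the dataframe is then appended into it so to_sql never infers the type:

```python
# Sketch of the "create the table first" workaround mentioned above.
# Table name, column names, and connection URL are hypothetical.
import pandas as pd
from sqlalchemy import create_engine

engine = create_engine("snowflake://<user>:<password>@<account>/<database>/<schema>")

df = pd.DataFrame({
    "id": [1, 2],
    "event_time": pd.to_datetime(
        ["2020-11-01 12:00:00", "2020-11-02 08:30:00"]
    ).tz_localize("UTC"),
})

with engine.connect() as conn:
    # Define the timestamp column as TIMESTAMP_TZ up front so to_sql cannot
    # fall back to the default TIMESTAMP_NTZ when creating the table.
    conn.execute(
        "CREATE OR REPLACE TABLE events (id NUMBER, event_time TIMESTAMP_TZ)"
    )
    # Append into the pre-created table instead of letting pandas create it.
    df.to_sql("events", conn, if_exists="append", index=False)
```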
Also having this issue.
bump
Hi, and apologies for taking so long to look into this; we're going to change this going forward. Checking this issue, and thank you Nazarii for your reproduction on the linked issue!
Unfortunately it looks like this long-standing regression is still there :( and switching to sqlalchemy.TIMESTAMP does not help either; the query is still generated with TIMESTAMP_NTZ. We're going to take a look and fix this. Until then, the workarounds already suggested in this issue are available to use (setting the session timestamp type mapping, or pre-creating the table with explicit column types).
Thank you for bearing with us!
@sfc-gh-dszmolka Thanks. By the way, I have a PR that fixes this issue which has been open for a long time; it now has merge conflicts that should be fixed.
Thanks @Nazarii for the contribution! I added some reviewers from the connector team in the hope that the review process can be sped up. However, as you mentioned, the conflicts still need to be fixed eventually; do you think that would be possible, or could you even submit a new PR based on the current main? There have probably been tons of changes since the PR was originally submitted.
Please answer these questions before submitting your issue. Thanks!
What version of Python are you using (python --version)?
Python 3.8.2
What operating system and processor architecture are you using (python -c 'import platform; print(platform.platform())')?
macOS-10.15.7-x86_64-i386-64bit
What are the component versions in the environment (pip list)?
Package                     Version
asn1crypto 1.4.0
awswrangler 1.10.0
azure-common 1.1.25
azure-core 1.8.2
azure-storage-blob 12.5.0
boto3 1.15.18
botocore 1.18.18
certifi 2020.6.20
cffi 1.14.3
chardet 3.0.4
cryptography 3.2.1
idna 2.10
isodate 0.6.0
jmespath 0.10.0
msrest 0.6.19
numpy 1.19.4
oauthlib 3.1.0
oscrypto 1.2.1
packaging 20.4
pandas 1.1.4
pip 20.2.1
psycopg2-binary 2.8.6
pyarrow 2.0.0
pycparser 2.20
pycryptodomex 3.9.9
PyJWT 1.7.1
PyMySQL 0.10.1
pyOpenSSL 19.1.0
pyparsing 2.4.7
python-dateutil 2.8.1
pytz 2020.4
requests 2.23.0
requests-oauthlib 1.3.0
s3transfer 0.3.3
setuptools 50.2.0
six 1.15.0
snowflake-connector-python 2.3.5
snowflake-sqlalchemy 1.2.4
SQLAlchemy 1.3.20
sqlalchemy-redshift 0.8.1
urllib3 1.25.11
wheel 0.35.1
If possible, provide a recipe for reproducing the error.
A complete runnable program is good.
I extracted data from MySQL and constructed a pandas dataframe. For example, the column types are:

I used the following code to load the data into Snowflake:
data.to_sql(target_table, conn, if_exists='replace',index=False, method=pd_writer)
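A self-contained version of this reproduction, for reference; the original dataframe came from MySQL, so the connection URL and column name below are placeholders rather than the reporter's originals:

```python
# Hedged, self-contained reproduction sketch; connection URL and column name
# are placeholders, not the reporter's originals.
import pandas as pd
from sqlalchemy import create_engine
from snowflake.connector.pandas_tools import pd_writer

engine = create_engine("snowflake://<user>:<password>@<account>/<database>/<schema>")

# A timezone-aware column: dtype is datetime64[ns, UTC]
data = pd.DataFrame({
    "created_at": pd.to_datetime(
        ["2020-11-01 12:00:00", "2020-11-02 08:30:00"]
    ).tz_localize("UTC"),
})

with engine.connect() as conn:
    data.to_sql("target_table", conn, if_exists="replace", index=False, method=pd_writer)

# Expected: created_at is created as TIMESTAMP_TZ
# Observed (per this issue): the column is created as TIMESTAMP_NTZ(9)
```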
What did you expect to see?
TIMESTAMP_TZ columns

What did you see instead?
TIMESTAMP_NTZ(9)
Can you set logging to DEBUG and collect the logs?
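For reference, one way to capture DEBUG output from the connector and the SQLAlchemy dialect before running the load, using only standard Python logging:

```python
# Standard Python logging setup to capture DEBUG output from the connector,
# the SQLAlchemy dialect, and the SQL statements SQLAlchemy emits.
import logging

logging.basicConfig(
    filename="snowflake_debug.log",
    level=logging.DEBUG,
    format="%(asctime)s %(name)s %(levelname)s %(message)s",
)
logging.getLogger("snowflake.connector").setLevel(logging.DEBUG)
logging.getLogger("snowflake.sqlalchemy").setLevel(logging.DEBUG)
logging.getLogger("sqlalchemy.engine").setLevel(logging.INFO)  # logs emitted SQL
```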