Preserving a MySQL datetime field in Python

02-13-2020 07:47 AM
JoeBorgione
MVP Esteemed Contributor

I have a Python script that extracts data from a MySQL database and writes it to a CSV.  The original table is a mix of all sorts of field types, including datetime and timestamp.  Once the CSV is written and I perform an arcpy Table To Table to get it into a fgdb, the datetime fields automagically become text fields.  Not so good if you need to filter by date....

CSV files are both a boon and a bane: is there a way to go right from MySQL to a file geodatabase, thus preserving the field types?  Unfortunately I do not have admin rights to our e-gdb (SQL Server back-end), so I can't go MySQL to SQL Server....
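To illustrate the problem, a minimal stdlib sketch (nothing Esri-specific here): a datetime round-tripped through CSV comes back as plain text, because CSV carries no type information and the importer has to guess.

```python
import csv
import datetime
import io

# Write a real datetime out to CSV (in memory), then read it back.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["id", "created"])
writer.writerow([1, datetime.datetime(2019, 7, 8, 10, 59, 11)])

buf.seek(0)
row = list(csv.reader(buf))[1]
print(type(row[1]).__name__)  # str, not datetime
print(row[1])                 # 2019-07-08 10:59:11
```

Whether the downstream tool turns that string back into a date is entirely up to its type-scanning, which is why the fgdb fields land as text.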

That should just about do it....
12 Replies
DanPatterson_Retired
MVP Esteemed Contributor
import datetime
import time

s = "2020-01-15 23:15:30"  # --- universal

print(datetime.datetime(*time.strptime(s, "%Y-%m-%d %H:%M:%S")[:6]))

2020-01-15 23:15:30

t = "01-15-2020 23:15:30"  # month, day, year

print(datetime.datetime(*time.strptime(t, "%m-%d-%Y %H:%M:%S")[:6]))

2020-01-15 23:15:30

YYYY-MM-DD is the date format, is it?

JoeBorgione
MVP Esteemed Contributor

In MySQL the values (when not NULL) are:

2019-07-08 10:59:11

Bruce Harold provides https://community.esri.com/community/open-platform-standards-and-interoperability/blog/2019/09/26/us... which discusses ODBC connections; my 64-bit Windows 10 desktop has:

And the MySQL site ( MySQL :: Download Connector/ODBC ) offers a 32-bit and a 64-bit driver.  So I'm a little confused as to which ODBC driver to download and, for that matter, how to use pyodbc to traverse it. (Installing the pyodbc package as I type this...)
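Once pyodbc finishes installing, a minimal connection sketch might look like the following. The driver name is an assumption; it must match whatever the Windows ODBC Data Source Administrator (64-bit) lists, and 64-bit Python needs the 64-bit driver.

```python
def make_conn_str(server, database, user, password,
                  driver="MySQL ODBC 8.0 Unicode Driver"):
    """Build an ODBC connection string.  The default driver name is
    an assumption; substitute the exact name shown in the 64-bit
    ODBC Data Source Administrator."""
    return (f"DRIVER={{{driver}}};SERVER={server};"
            f"DATABASE={database};UID={user};PWD={password}")


def fetch_rows(conn_str, sql):
    """Run a query and return the rows.  With pyodbc, datetime
    columns come back as datetime.datetime objects, not strings,
    so nothing is lost the way it is with a CSV hop."""
    import pyodbc  # imported here; the helper above works without it
    with pyodbc.connect(conn_str) as conn:
        cur = conn.cursor()
        cur.execute(sql)
        return cur.fetchall()
```

From there the rows (with their datetime objects intact) could feed an arcpy `da.InsertCursor` on a fgdb table, skipping CSV entirely.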

DanPatterson_Retired
MVP Esteemed Contributor

I added to my post: YYYY-MM-DD is the universal format, great for sorting. If you need to parse by month, you can split on the string's - separator.
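For example, pulling the month out of the value Joe posted:

```python
s = "2019-07-08 10:59:11"

# Separate the date from the time, then split the date on "-".
date_part, time_part = s.split(" ")
year, month, day = date_part.split("-")

print(month)  # "07"
```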

JoeBorgione
MVP Esteemed Contributor

It's a little more complicated than that:  these tables get geocoded and work their way into a published service, which is then consumed by an Operations Dashboard app.  Ultimately, it's in OD where I need to filter by date.

DanPatterson_Retired
MVP Esteemed Contributor

Ahhh sorry ... don't do the 'consumed' side of things

JoeBorgione
MVP Esteemed Contributor

I was afraid of losing you with that point!

As mentioned, I'm adding the pyodbc package through the ArcGIS Python Package Manager, and it's still running; if you notice my post regarding that, it's at about an hour now.  Something is not quite right there.....

BruceHarold
Esri Regular Contributor

All, this pain is totally avoidable: just obtain the Data Interoperability extension and go directly from MySQL to file geodatabase.  In Data Interop the reader is called MariaDB; in fact there are two, one spatial, one not.  Bouncing data through CSV throws the schema overboard, as even writing a schema.ini file can't handle all data types.
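For anyone who still has to go through CSV, a schema.ini placed beside the file can coerce some columns to dates (the file and column names below are hypothetical), though as noted it won't cover every type:

```ini
[export.csv]
ColNameHeader=True
Format=CSVDelimited
Col1=id Long
Col2=created DateTime
```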

JoeBorgione
MVP Esteemed Contributor

Thanks Bruce Harold!

BruceHarold
Esri Regular Contributor

OK full disclosure I'm the Product Manager for Data Interoperability ;)