It will be great if ArcGIS Desktop/Enterprise could support PostgreSQL 11.
It looks like that was just released in October: PostgreSQL: PostgreSQL 11 Released!
I'm sure we'll get support for it built into upcoming releases and will post back here with a status update when we have an idea about timeframe. I don't believe we'll see it in the Pro 2.3/ArcGIS 10.7 releases which are just around the corner, so hopefully the subsequent one.
Kory, thank you for the answer. I will wait and see!
Although not yet officially supported, I have successfully used PostgreSQL 11.1 and PostGIS 2.5.1 for the past two months to connect to a spatial database in PostgreSQL from ArcGIS. NOTE: this database was NOT Esri Enterprise Geodatabase enabled; it is just a plain PostgreSQL / PostGIS database with spatial columns!
I have seen no real issues connecting to such a database from ArcMap 10.6.1 and ArcGIS Pro 2.2.4, using both a standard Windows PostgreSQL ODBC DSN and the "pyodbc" package with arcpy to connect to the database and send SQL DDL statements that create spatial views. I subsequently access the created spatial views through a standard ArcGIS Database Connection (*.sde) in the Catalog window, adding them to ArcMap or ArcGIS Pro as ArcGIS Query Layers (http://pro.arcgis.com/en/pro-app/help/data/query-layers/what-is-a-query-layer-.htm)
If you have a non-critical (geo)database, and have proper backups, and you are in need of PostgreSQL 11, I think you can give it a try as well to upgrade. Based on my experiences, I think it quite likely you won't run into major issues, but for the official "green light", you'll need to wait for ESRI.
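The pyodbc + DDL workflow described above can be sketched roughly as follows. This is a minimal sketch under stated assumptions: the DSN name "pg11" and the view/table/filter names are hypothetical, not from the thread, and pyodbc must be installed (it is available from the ArcGIS Pro Python Package Manager).

```python
# Sketch: connect to a plain PostGIS database via a Windows PostgreSQL ODBC
# DSN and create a spatial view with a DDL statement. All names below
# ("pg11", "roads_major", "roads") are assumptions for illustration.

def make_view_ddl(view_name, source_table, where_clause):
    """Build a CREATE VIEW statement for a filtered spatial view."""
    return (
        'CREATE OR REPLACE VIEW {v} AS '
        'SELECT * FROM {t} WHERE {w}'
    ).format(v=view_name, t=source_table, w=where_clause)

def create_spatial_view(dsn, ddl):
    """Send the DDL through pyodbc (requires the PostgreSQL ODBC driver)."""
    import pyodbc  # installable from the ArcGIS Pro Python Package Manager
    con = pyodbc.connect("DSN={}".format(dsn), autocommit=True)
    try:
        con.cursor().execute(ddl)
    finally:
        con.close()

# Hypothetical usage:
# create_spatial_view("pg11", make_view_ddl("roads_major", "roads", "class = 'major'"))
```

The view can then be added to ArcMap or Pro as a Query Layer through a normal *.sde database connection, as described above.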
Has anyone heard anything new regarding this topic? I recently checked with some Esri folks, but they were unable to give me a timeline for supporting version 11.
Any new advice/tips/feedback from people currently using the latest PostgreSQL version?
As another update to my previous post 4 months ago: I have now been happily running PostgreSQL 11 for the past half year. I recently upgraded to 11.3 + PostGIS 2.5.2 as well.
Bear in mind again that I am *** NOT *** using an Esri Enterprise Geodatabase, just a plain PostGIS spatial database that has not been enterprise-geodatabase enabled. I access data through Query Layers, after having run the geoprocessing tool that adds a unique ID field and associated sequence to the tables I want to access. This is a vital step: it allows setting the unique-key OBJECTID field when adding a Query Layer, giving full functionality such as being able to make selections. I add the layers through an ArcGIS database connection in my project that uses the PostgreSQL ODBC driver for Windows.
I access tables, views, and materialized views this way.
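For anyone curious what the "unique ID field and associated sequence" step effectively does on the database side, here is a rough sketch of equivalent SQL, built as plain strings so it could be sent through pyodbc. This is my own reconstruction, not the tool's actual implementation; the table name and the "objectid" column/sequence names are assumptions.

```python
# Sketch: DDL that mimics adding a sequence-backed unique integer ID column,
# the step that lets ArcGIS pick an ObjectID field when defining a Query
# Layer. Column and sequence naming here is an assumption for illustration.

def unique_id_ddl(table, id_field="objectid"):
    """Return the list of statements to add a sequence-backed unique ID."""
    seq = "{}_{}_seq".format(table, id_field)
    return [
        "CREATE SEQUENCE IF NOT EXISTS {}".format(seq),
        "ALTER TABLE {} ADD COLUMN IF NOT EXISTS {} integer".format(table, id_field),
        "ALTER TABLE {0} ALTER COLUMN {1} SET DEFAULT nextval('{2}')".format(table, id_field, seq),
        "UPDATE {0} SET {1} = nextval('{2}') WHERE {1} IS NULL".format(table, id_field, seq),
    ]

# Hypothetical usage with a pyodbc connection "con":
# for stmt in unique_id_ddl("roads"):
#     con.cursor().execute(stmt)
```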
I additionally use pyodbc, again in combination with the PostgreSQL ODBC driver, to create views using DDL statements sent from arcpy / Python. Note that you can install pyodbc from the ArcGIS Pro Python Package Manager; it is one of the listed available packages. This works great once you've figured out how to use pyodbc (which may be a bit tricky at first).
I have even recently used this setup to dissolve a PostGIS layer using the ArcGIS Dissolve_management tool. This posed a particular problem for my setup: since the database is not Enterprise enabled, I could not save the result of the Dissolve tool straight into the database, as I could have with an Enterprise Geodatabase. Instead, I saved an intermediate dissolved dataset as a Feature Class in a locally stored File Geodatabase. I subsequently used an arcpy.da.SearchCursor in combination with a pyodbc cursor to transfer all records from the Feature Class to the PostGIS database as OGC WKT, by calling the arcpy.da.SearchCursor with the "SHAPE@WKT" token.
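That transfer step can be sketched as below. This is a minimal illustration under stated assumptions: the feature class path, target table, attribute field "name", and DSN are all hypothetical, and the geometry column is called "way" to match the OpenStreetMap-style naming used later in this thread. The SQL-building helper is pure Python; the transfer function needs the ArcGIS Pro Python environment.

```python
# Sketch: read rows from a File Geodatabase feature class with
# arcpy.da.SearchCursor and the "SHAPE@WKT" token, then insert them into a
# plain PostGIS table via a pyodbc cursor. All names are assumptions.

def insert_sql(table, geom_col="way", attr_cols=("name",)):
    """Parameterized INSERT that converts WKT to geometry server-side."""
    cols = ", ".join(attr_cols + (geom_col,))
    marks = ", ".join(["?"] * len(attr_cols))
    return "INSERT INTO {t} ({c}) VALUES ({m}, ST_GeomFromText(?))".format(
        t=table, c=cols, m=marks)

def transfer_wkt(fc, table, dsn):
    """Copy features from a feature class into PostGIS as WKT."""
    import arcpy, pyodbc  # ArcGIS Pro / ArcMap Python environment only
    con = pyodbc.connect("DSN={}".format(dsn), autocommit=False)
    cur = con.cursor()
    sql = insert_sql(table)
    with arcpy.da.SearchCursor(fc, ["name", "SHAPE@WKT"]) as rows:
        for name, wkt in rows:
            cur.execute(sql, (name, wkt))
    con.commit()
    con.close()
```

Using a parameterized INSERT keeps the (potentially huge) WKT strings out of the SQL text itself.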
This caused one problem, though: for some reason I haven't yet figured out, some of the transferred records were reported as invalid shapes by ArcGIS once I tried to access the resulting PostGIS table. I solved this by running a
pyodbcCursor.execute("UPDATE " + tableName + " SET way=ST_MakeValid(way)")
using the pyodbc connection in arcpy / Python. Note that this was based on OpenStreetMap data, which somewhat curiously uses the name "way" for the geometry column, so substitute your own geometry column's name. After that, I could access the table without issues. To be honest, on this particular issue I still need to scale up my testing to see if it really is fine now, but first indications are good. Of course, the workflow described here is pretty far-fetched for most ArcGIS users, but it is nice to see that this kind of thing is actually possible against a plain spatial database that is not Enterprise enabled.
In a similar way, I also needed to set the spatial reference ID on the layer, as the SRID was not included in the WKT returned by the arcpy.da.SearchCursor. That was solved by this statement:
pyodbcCursor.execute("UPDATE " + tableName + " SET way=ST_SetSRID(way," + str(spatialRefID) + ")")
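The two clean-up statements above can also be combined into a single pass, since ST_SetSRID can wrap ST_MakeValid. A hedged sketch, where the DSN, table, geometry column, and SRID are all parameters you would substitute (4326 here is just an example default, not from the thread):

```python
# Sketch: run ST_MakeValid and ST_SetSRID in one UPDATE through pyodbc.
# The DSN name, table, geometry column and SRID default are assumptions.

def fix_geometries_sql(table, geom_col, srid):
    """One UPDATE that repairs validity and stamps the SRID."""
    return ("UPDATE {t} SET {g} = ST_SetSRID(ST_MakeValid({g}), {s})"
            .format(t=table, g=geom_col, s=srid))

def fix_geometries(dsn, table, geom_col="way", srid=4326):
    import pyodbc  # installable from the ArcGIS Pro Python Package Manager
    con = pyodbc.connect("DSN={}".format(dsn), autocommit=True)
    con.cursor().execute(fix_geometries_sql(table, geom_col, srid))
    con.close()
```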
As an extension to what I wrote about using a local File Geodatabase for geoprocessing and transferring records to the PostgreSQL database using an arcpy.da.SearchCursor and pyodbc cursor:
The arcpy.da.SearchCursor also supports binary transfer of the geometries using the "SHAPE@WKB" token. I have now attempted this, and it appears faster (NOTE: transferring just over 4M records as WKT took about 4 hours on my setup). I therefore recommend using the WKB token over the WKT token. PostgreSQL / PostGIS handled the transferred data without issues; see the image below of DBeaver (similar to pgAdmin) showing 80 polygons that were locally buffered & dissolved in a File Geodatabase using ArcGIS Pro's PairwiseBuffer and PairwiseDissolve tools, then transferred back to the database using OGC WKB:
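The binary variant of the earlier transfer sketch would look roughly like this. Again a hedged illustration: the table, DSN, and SRID 4326 are assumptions. Passing the SRID to ST_GeomFromWKB also avoids the separate ST_SetSRID pass, since the SRID is not carried in the WKB coming out of the cursor.

```python
# Sketch: same transfer as the WKT version, but using the "SHAPE@WKB"
# token and PostGIS ST_GeomFromWKB(). All names/values are assumptions.

def insert_wkb_sql(table, geom_col="way", srid=4326):
    """Parameterized INSERT converting WKB to geometry with a known SRID."""
    return ("INSERT INTO {t} ({g}) "
            "VALUES (ST_GeomFromWKB(?, {s}))").format(t=table, g=geom_col, s=srid)

def transfer_wkb(fc, table, dsn):
    """Copy feature geometries into PostGIS as binary WKB."""
    import arcpy, pyodbc  # ArcGIS Pro / ArcMap Python environment only
    con = pyodbc.connect("DSN={}".format(dsn), autocommit=False)
    cur = con.cursor()
    sql = insert_wkb_sql(table)
    with arcpy.da.SearchCursor(fc, ["SHAPE@WKB"]) as rows:
        for (wkb,) in rows:
            cur.execute(sql, (bytes(wkb),))  # SHAPE@WKB yields a bytearray
    con.commit()
    con.close()
```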
Thank you very much Marco! Really clear and useful information.
Good. I actually needed to correct something, though. I really need to do some real benchmarking; my first statement about the speed difference between WKB and WKT was probably over-optimistic. I still believe WKB is faster, but I would need to do careful timing on larger datasets to find out how much. I have therefore removed the first estimate.
Additionally, I have now found out that the issues with reading the WKB/WKT back into the database were caused by self-intersecting polygons in the File Geodatabase's feature class, probably introduced by the geoprocessing. Running the ArcGIS Repair Geometry tool before inserting the data with the cursors now seems the most efficient solution, and it allowed me to view the data in Pro without issues after transfer to PostgreSQL / PostGIS. The Repair Geometry tool also seems considerably faster than my first approach of running the PostGIS ST_MakeValid function after transfer.
Luis - ArcGIS Pro 2.4 will support PostgreSQL 11. From what I hear, AGP 2.4 should be available before User Conference 2019.
Thanks for the news. Let's wait and see... if Pro supports v11 soon, Enterprise support should not be far behind, I guess.