
Change ObjectID data type to bigint in Create Enterprise Geodatabase for PostgreSQL

08-01-2022 01:04 PM
Status: Open
SarahMyers
Occasional Contributor

We set out to create a new Enterprise Geodatabase, but wanted to put it on the same PostgreSQL server as our internally managed data lake. The creation process failed with a fairly cryptic error: "Error: The Geodatabase system table could not be created. GDB_Items : Error (-4)", which we had a lot of trouble running down. In the end it turned out to be an incompatibility between the Enterprise Geodatabase creation wizard and the volume of data in our data lake. Specifically, here is what I found:

The Enterprise Geodatabase creation process adds a number of new types to the database, which are tracked in the pg_type system catalog. Each type is assigned an OID, a sequential key shared across the PostgreSQL catalogs. In most PostgreSQL installs those OID values are fairly low (tens or hundreds of thousands). In our database, however, which holds a few terabytes of fairly complex data, those OID values are in the billions, specifically about 2.8 billion.

Here is a csv example of the types added in our database:

oid,typname
2825153047,st_state_data_type
2825153222,st_geometry_set
2825153221,_st_geometry_set
2825153050,_st_state_data_type
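
For reference, a listing like the one above can be pulled straight from the catalog with psql (the database name `your_gdb` is a placeholder; `pg_type`, `oid`, and `typname` are standard PostgreSQL):

```shell
# List the st_* type records and their OIDs from the pg_type system catalog.
# Requires a role with read access to the system catalogs.
psql -d your_gdb -At -F ',' \
  -c "SELECT oid, typname FROM pg_type WHERE typname LIKE '%st\_%' ORDER BY oid;"
```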

The issue is that at one point during the creation process ArcMap casts that OID value to an integer: not a bigint, just a plain 32-bit int. When that happens it doesn't return the proper type, because 2.8 billion exceeds the signed 32-bit integer maximum (2,147,483,647), so the cast can't return the right value. This causes the process to error out because it concludes the type creation has failed.
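
To illustrate the failure mode (a minimal sketch of the arithmetic, not Esri's actual code), casting an OID like 2825153047 into a signed 32-bit integer wraps it to a negative number, so any comparison or lookup keyed on the truncated value can never match the real OID:

```python
import ctypes

INT32_MAX = 2**31 - 1  # 2,147,483,647, the largest signed 32-bit value

oid = 2825153047  # st_state_data_type OID from the pg_type listing above

# A PostgreSQL oid is an unsigned 32-bit value, so this fits in the
# catalog, but a signed 32-bit cast (a plain SQL "int") wraps it.
wrapped = ctypes.c_int32(oid).value

print(oid > INT32_MAX)  # True: the OID exceeds the signed int range
print(wrapped)          # -1469814249: what a 32-bit signed cast would see
```

Because the wrapped value is negative, it matches no row in pg_type, which is consistent with the wizard reporting that the type creation failed.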

Ideally Esri could resolve this by casting those values as bigint instead of int, or by not casting them at all. In our case there doesn't seem to be a way to work around it, but if anyone has an idea it would be appreciated!

4 Comments
VinceAngelo

64-bit OID support is an existing enhancement request, as is 64-bit integer column support.

- V

RusselBell

We are still encountering this problem as of ArcGIS Pro 3.2. The OID columns in pg_type have values in excess of 2 billion. We are unable to enable a new enterprise geodatabase on an existing server because of the high-water mark PostgreSQL uses when assigning OID values in the system catalog. We tried dropping the extension and recreating it after a vacuum of the system tables, but that did not work: the 'geometry' typname still gets too large a number and causes the failure. Did you find a workaround for this besides creating a new PG cluster and using pg_dump/pg_restore?

George_Thompson

@RusselBell have you seen the updated documentation on this: https://pro.arcgis.com/en/pro-app/latest/help/data/geodatabases/overview/arcgis-field-data-types.htm...     and the new GP tool: https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/migrateobjectidto64bit.htm

I would review the information in the documentation before running the tool.

RusselBell

@George_Thompson: Thanks for responding. The links are helpful info, but in this case the problem is in the PostgreSQL system catalog tables. When the "CREATE EXTENSION postgis" command is executed to install the PostGIS extension, a "geometry" type record is inserted into the pg_type system catalog table. The inserted row has several OID column values > 2 billion because of PostgreSQL's cluster-wide OID assignment logic. We are on PG 13 and do not use OIDs in our user tables. Nor is it related to 32-bit vs. 64-bit ObjectID values, because it is a system catalog issue unrelated to any given Esri feature class.