BigInt + PostGIS = "Invalid Column Data Type" in Desktop

01-24-2012 06:42 PM
KyleRotte
Occasional Contributor
I have created a table and registered it using the following statements:

CREATE TABLE blocks(objectid bigint NOT NULL, block varchar(4), res smallint);

SELECT AddGeometryColumn('blocks', 'shape', 4326, 'GEOMETRY', 2 );

sdelayer -o register -l blocks,shape -C objectid,SDE -e a -t PG_GEOMETRY -u sde -p sde -i esri_sde

Now, any time I try to access that layer in ArcGIS Desktop 10 tools I get an "Invalid Column Data Type" error. If I switch objectid to integer instead of bigint it works fine. I first thought the problem was only with objectid being bigint, but apparently if a bigint field appears anywhere in the table I get the same error in ArcGIS Desktop 10. I also tried setting the SDE server parameter INT64TYPES to TRUE, but that didn't help, so I set it back. Does ArcGIS Desktop 10 not support bigint fields? If it does, how do I get this to work?
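For reference, the switch that does make it work is just the column type; something like this before registering the layer (a rough sketch, and only viable if the key values actually fit in 32 bits):

-- change the key column from bigint to a 32-bit integer
ALTER TABLE blocks ALTER COLUMN objectid TYPE integer;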
5 Replies
VinceAngelo
Esri Esteemed Contributor
While ArcSDE does, ArcGIS Desktop and ArcGIS Server do not support BIGINT (64-bit integer) datatypes (which is why INT64TYPES is disabled by default).

- V
KyleRotte
Occasional Contributor
Thanks for the reply, Vince. I thought that if INT64TYPES was set to FALSE, SDE converted the data to a double that Desktop could handle. Is that not true?

VinceAngelo
Esri Esteemed Contributor
It's a bit more exotic than that. Oracle doesn't directly support ANSI integer and floating-point types, so INT64TYPES was added to allow the server to map the columns of previously undiscovered tables into SE_INT64_TYPE or SE_FLOAT64_TYPE columns. When it is disabled, it also prevents creation of new tables with SE_INT64_TYPE columns via the ArcSDE API.
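For completeness, since it's a server configuration parameter, toggling it is an admin-tool change; roughly something like the following, though the exact flags are from memory and may vary by release (connection arguments copied from the sdelayer command above):

sdeconfig -o alter -v "INT64TYPES=TRUE" -u sde -p sde -i esri_sde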

I'm not exactly sure how it interacts with the PG svrmgr (since PG uses discrete types, not "NUMBER", there's less flexibility in column bind variable types). I believe the BIGINT type is simply unsupported regardless of the INT64TYPES setting (that is, ArcSDE identifies the correct type, but ArcGIS doesn't support it). You could confirm this by doing a 'sdetable -o describe' on a "new" table; if the describe fails, then ArcSDE is honoring INT64TYPES=FALSE.
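Using the connection arguments from your sdelayer command, that check would look something like this (a sketch; adjust the flags for your environment):

sdetable -o describe -t blocks -u sde -p sde -i esri_sde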

- V
TrevorMillen
Occasional Contributor

This post is from 2012, but is there a newer response or update on whether 64-bit integer types are usable?

I'm extracting data from my legacy system and converting it for use in ArcGIS Pro.

All of our table_ID fields are large (64-bit) integers. Will I have to keep converting them to float or int? If so, which one is suggested, float or int?

I wish discussions from 6 years ago were not first in my search results.

VinceAngelo
Esri Esteemed Contributor

I'm not aware of any changes to type support in the last six releases of ArcGIS.  I haven't tried to see if ArcGIS Pro handles bigint types.

Using either integer or double is problematic in terms of primary keys. Neither can capture the full range of bigint values. If no values exceed 2^31-1, then you should certainly use an integer type. If the values exceed two billion, I'd recommend considering either hi/lo integers as a compound key, or a UUID (128-bit) with the 64-bit integer mapped into the lower half of the range. After those, the fallbacks would be ASCII text, and then double, since floating-point equivalence is fundamentally unreliable.
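A rough PostgreSQL sketch of both ideas, with purely illustrative names (legacy_table, big_id, id_hi, id_lo) and assuming non-negative keys below 2^62:

-- hi/lo compound key: split the 64-bit value into two signed 32-bit halves
ALTER TABLE legacy_table ADD COLUMN id_hi integer, ADD COLUMN id_lo integer;

UPDATE legacy_table
   SET id_hi = (big_id / 2147483648)::integer,   -- upper bits (fits in int4 for big_id < 2^62)
       id_lo = (big_id % 2147483648)::integer;   -- lower 31 bits
-- reconstruct later with: id_hi::bigint * 2147483648 + id_lo

-- UUID alternative: pack the 64-bit value into the low half of a 128-bit UUID
SELECT ('00000000-0000-0000-' ||
        substr(lpad(to_hex(big_id), 16, '0'), 1, 4) || '-' ||
        substr(lpad(to_hex(big_id), 16, '0'), 5, 12))::uuid AS mapped_uuid
  FROM legacy_table;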

- V
