
Environment setting to default Integer to Long not BigInteger

02-22-2024 01:25 AM
Status: Open
KimOllivier
Honored Contributor

When importing data from a GeoPackage or SQLite database, any field defined as INTEGER in the DDL gets cast to BigInteger instead of (Long) Integer. This is a new, incompatible behaviour since BigIntegers were introduced (ArcGIS Pro 3.2?). Several tools, such as relationship classes, do not work with BigIntegers.

It would be helpful if there were a switch to turn off this unwelcome enhancement. There is a switch in Options > Map and Scene, but it doesn't work.

The only way to fix the problem is to define a field mapping on every copy operation, and many of those tools do not have a field mapping parameter.
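For the tools that do take a field mapping, the workaround looks roughly like this (a minimal sketch, assuming ArcGIS Pro 3.2+ arcpy; the paths are placeholders, and the "BigInteger"/"Integer" type strings are my assumption about how arcpy reports and accepts the two integer types):

import arcpy

input_fc = r"C:\data\source.gpkg\main.roads"   # placeholder GeoPackage layer
output_fc = r"C:\data\target.gdb\roads"        # placeholder file geodatabase output

# Build a field mapping from the input, then force any BigInteger output
# field back to (Long) Integer before exporting.
fms = arcpy.FieldMappings()
fms.addTable(input_fc)
for i in range(fms.fieldCount):
    fm = fms.getFieldMap(i)
    out_field = fm.outputField
    if out_field.type == "BigInteger":   # assumed type string for 64-bit integers
        out_field.type = "Integer"       # arcpy's name for the 32-bit Long type
        fm.outputField = out_field
        fms.replaceFieldMap(i, fm)

arcpy.conversion.ExportFeatures(input_fc, output_fc, field_mapping=fms)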

If the field is read-only, such as OBJECTID, then it cannot be changed at all. I see another user has used FME to fix this.

This is like a reverse single precision / double precision incompatibility!

Maybe a switch somewhere in the settings or the environment settings could retrofit a fix?

12 Comments
KimOllivier

You may need to work around a bug(?) that adds a blank field name to the end of the list returned by arcpy.ListFields(fc).

import arcpy

fields = arcpy.ListFields(input_fc)
print([f.name for f in fields])                 # debug: list the field names
for field in fields:
    if field.type not in ('OID', 'Geometry'):   # not allowed in field mappings
        if field.name:                          # skip the strange blank name Esri adds to the list
            field_map = arcpy.FieldMap()        # new FieldMap for each field
            print(f"{input_fc},<{field.name}>") # debug: show the field being mapped
KimOllivier

But there is another problem not covered by my workaround: OBJECTID can be 32-bit or 64-bit as well. This is much harder to change, and even to detect. The type is "OID" for both (not helpful), but you can find out whether the length is 4 or 8.
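A small sketch of the detection part (assuming, as described above, that the OID field reports type 'OID' and a length of 8 when it is 64-bit):

import arcpy

def objectid_is_64bit(table):
    # Returns True if the ObjectID field looks like a 64-bit (big) integer.
    oid_fields = [f for f in arcpy.ListFields(table) if f.type == "OID"]
    return bool(oid_fields) and oid_fields[0].length == 8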

If the table is empty, you can edit the field schema by hand to fix it, but I don't know how to do that in a script.