
Environment setting to default Integer to Long not BigInteger

02-22-2024 01:25 AM
Status: Open
KimOllivier
Honored Contributor

When importing data from a GeoPackage or SQLite database, any field defined as Integer in the DDL gets cast to BigInteger instead of (Long) Integer. This is new, incompatible behaviour introduced when BigInteger support was added (ArcGIS Pro 3.2?). Several tools do not work with BigIntegers, such as relationship classes.

It would be helpful if there were a switch to turn off this unwelcome enhancement. There is a switch in Options/Map and Scene, but it doesn't work.

The only way to fix the problem is to define a FieldMapping on every copy operation, and many of the copy tools do not have a FieldMapping parameter.

If the field is read-only, such as OBJECTID, then it cannot be changed at all. I see another user has used FME to fix this.

This is like a reverse single precision / double precision incompatibility!

Maybe a switch somewhere in the settings or the environment settings could retrofit a fix?

11 Comments
BruceHarold

Thanks Kimo, I'll get this to the right team.

Ranga_Tolapi

Completely agree @KimOllivier 

In ArcGIS Pro 3.2, tools like Copy Features, Feature Class To Geodatabase, etc. should at least honor the version of the pre-created target file geodatabase.

If the pre-created target file geodatabase version is 10.0 or below, then data conversion/import tools should not introduce the new ArcGIS Pro 3.2 field types, e.g. 64-bit OID, big integer, timestamp offset, date only, time only.
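
In the meantime a script can at least check the target's release before importing; a minimal sketch, assuming a hypothetical geodatabase path, using the release properties that arcpy.Describe reports for a workspace:

import arcpy

target_gdb = r"C:\data\target.gdb"  # hypothetical path
desc = arcpy.Describe(target_gdb)
print(desc.workspaceType, desc.release)  # e.g. 'LocalDatabase 3,0,0'
if not desc.currentRelease:
    print("Older-release geodatabase: avoid the new 3.2 field types on import")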

 

@SirishByreddy 

@KoryKramer 
MelanieWawryk

ArcGIS Pro and AGOL also do not work well with big integers. We don't have anything that is a big integer and only use long. If you look at the layer in ArcCatalog it shows as long integer, but the data design view of the same layer shows it as big integer. If we export to AGOL it gives a warning message about big integer and, sure enough, it is big integer in AGOL. I guess the bug is in ArcGIS Pro 3.2.2. Please fix.
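
To see what type arcpy itself reports (independent of what any UI shows), a quick check like this sketch, with a hypothetical path, can settle it:

import arcpy

fc = r"C:\data\example.gdb\parcels"  # hypothetical path
for fld in arcpy.ListFields(fc):
    print(fld.name, fld.type)  # BigInteger fields report as 'BigInteger'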

[Screenshots: Big Integer bug.PNG, MelanieWawryk_0-1715036790341.png]

 

KimOllivier

Not fixed in 3.3. Ah well, back to 3.1 so that I can attempt to port workflows from ArcMap!

There is a setting in Options/Map and Scene, but it doesn't work for me. [Edit: it casts BigInteger to Double, not Long, so it cannot be used as a key.]


[Screenshots: settings.png, query_layer.png]

DannyMac

@KimOllivier 

After delving deeper into the issue of INTEGER vs BIGINT, I discovered the following information. ArcGIS Pro didn't support 64-bit integers until ArcGIS Pro 3.2. It's important to note that GeoPackage uses SQLite under the covers, and INT and INTEGER datatypes in SQLite have always been 64-bit. Consequently, with ArcGIS Pro 3.2 now supporting 64-bit integers using BIGINT and considering that an INTEGER is 64-bit in a GeoPackage, the columns are cast to BIGINT. From a technical standpoint, this behavior is correct, which could argue against it being classified as a bug. However, I believe a more effective approach would involve inspecting the column values before automatically casting them to BIGINT. For instance, ArcGIS could examine the values for all INTEGER columns first to determine if casting to BIGINT is necessary.

Here's an example query:

SELECT * FROM MyRoads
  WHERE MyRoads.LINK_ID < -2147483648
  OR MyRoads.LINK_ID > 2147483647;

If the query returns nothing, then casting to INTEGER would be more appropriate than using BIGINT.
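
Since a GeoPackage is an ordinary SQLite file, that check can also be run from Python's sqlite3 module; a minimal sketch using the same hypothetical table and column:

import sqlite3

conn = sqlite3.connect(r"C:\data\roads.gpkg")  # hypothetical path
out_of_range = conn.execute(
    """SELECT COUNT(*) FROM MyRoads
       WHERE LINK_ID < -2147483648 OR LINK_ID > 2147483647"""
).fetchone()[0]
conn.close()
print("needs BIGINT" if out_of_range else "fits in 32-bit INTEGER")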

KimOllivier

Thanks for that explanation. It does show why it is happening. But the option switch says it will "use field types that are compatible with ArcGIS Pro 3.1", which are long integers. So even if it is technically correct to surface 64-bit SQLite fields as BigInteger, fields cast that way do not work with many functions in Pro. This must be a problem with other databases in Enterprise too. I suspect they had a very narrow view of this problem when implementing the option.

All I am asking for is for the switch to work everywhere. [Edit: it only works inside ArcGIS Pro; when running scripts outside Pro you can set arcpy.env.useCompatibleFieldTypes = True instead. But a cast to Double is not useful for me.]

DannyMac

@KimOllivier  I currently have an enterprise support ticket open for this issue, and I've referenced your post in my ticket to ensure that they are aware that I'm not the only one affected by this unexpected change. For us, it means we either need to update all of our application code to utilize BigInt, or we can modify all the columns after importing features and change them back to Integer. At the moment, we are choosing the latter option because none of the columns in our database would require a BigInt data type.

Marshal

Just to chime in here.  It appears BigInt is not a compatible sort_field in tools such as ExportFeatures and PointsToLine (only tested in arcpy), which is problematic for many of our script tools.  Hopefully this is resolved quickly.

ERROR 003911: The field is not of type SHORT | LONG | FLOAT | DOUBLE | TEXT | DATE | XML | OBJECTID
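
One possible workaround until it is fixed (an untested sketch; dataset and field names are hypothetical) is to copy the BigInteger key into a new Long field and sort on that instead:

import arcpy

pts = r"C:\data\example.gdb\survey_points"  # hypothetical
arcpy.management.AddField(pts, "SEQ32", "LONG")
# SEQ is assumed to be the BigInteger sort field
arcpy.management.CalculateField(pts, "SEQ32", "!SEQ!", "PYTHON3")
arcpy.management.PointsToLine(pts, r"C:\data\example.gdb\lines", "ROUTE_ID", "SEQ32")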

  

KimOllivier

Even if the policy for casting the ambiguous Integer type is correctly set, I still need a solution that works for me. I can see that computing has evolved from 16- to 32- to 64-bit integers, some defined as signed and some unsigned. I have seen LONG, MEDINT, INTEGER and SHORT, as well as the new BigInteger, all trying to extend the meaning of 'LONG', which in 32-bit days meant only 32 bits. To keep backward compatibility the temptation is to reuse 'Long' for 64 bits when using a 64-bit compiler.

I was interested to see that MEDINT was interpreted as Long and INTEGER as BigInteger when going from a gpkg to a file geodatabase. So redefining the gpkg field type might be a good workaround (see the sketch below) that avoids having to explicitly cast the field type with a fieldmappings parameter, using ExportFeatures_conversion() in place of CopyFeatures(). [Note that SQLite does not care what you call the type in the DDL; integers are always stored as up to 64 bits internally.]
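
Because SQLite stores the declared type only as text, the declaration can be rewritten in place without touching the data; a heavily hedged sketch using the writable_schema pragma (back up the .gpkg first; table and column names are hypothetical):

import sqlite3

conn = sqlite3.connect(r"C:\data\roads.gpkg")  # hypothetical path
conn.isolation_level = None  # run statements outside implicit transactions
ddl = conn.execute(
    "SELECT sql FROM sqlite_master WHERE type='table' AND name='MyRoads'"
).fetchone()[0]
# Change only the declared type text; the stored values are unaffected
new_ddl = ddl.replace("LINK_ID INTEGER", "LINK_ID MEDIUMINT")
conn.execute("PRAGMA writable_schema=ON")
conn.execute("UPDATE sqlite_master SET sql=? WHERE type='table' AND name='MyRoads'",
             (new_ddl,))
conn.execute("PRAGMA writable_schema=OFF")
conn.close()  # reopen before further use so the new schema is re-read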

The small print in the help also explains why the option setting in ArcGIS Pro's Options does not work for me: if you run a script outside ArcGIS Pro on the command line, you have to set the option in the arcpy environment (arcpy.env). But Double is useless for me; I want to use the field as an indexed key, so it has to be some sort of integer. The help text is inserted below:

useCompatibleFieldTypes
(Read and Write)

Specifies whether field types that are compatible with ArcGIS Pro 3.1 will be used. This setting relates to the use of the Date Only, Time Only, Timestamp Offset, and Big Integer field types in CSV tables and unregistered database tables with tools that create layers or table views.

When set to True, a layer or table view created from these sources will use field types compatible with ArcGIS Pro 3.1. Date Only, Time Only, and Timestamp Offset fields will be displayed as Date fields, and Big Integer fields will be displayed as Double fields. When set to False, all original data source field types will be used.

 

Note:

This property is applicable when used from stand-alone Python or for a geoprocessing service. When used in ArcGIS Pro, this property will always match the Use field types that are compatible with ArcGIS Pro 3.1 and earlier releases when adding query layer... option.
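
In a stand-alone script that would look something like this minimal sketch (the CSV path is hypothetical), which also demonstrates the Double problem described above:

import arcpy

arcpy.env.useCompatibleFieldTypes = True  # only honored outside ArcGIS Pro
arcpy.management.MakeTableView(r"C:\data\readings.csv", "readings_view")
for fld in arcpy.ListFields("readings_view"):
    print(fld.name, fld.type)  # Big Integer columns surface as Double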

 

A better option I have implemented is to replace CopyFeatures_management() with my own function CopyFeaturesLong() that looks for any BigInteger arcpy interpretations and keeps them as a compatible (Long) Integer, not a Double. Here it is if you want to use it, together with a matching CopyRowsLong() to replace CopyRows().

import arcpy

debug = True  # set False to suppress the field listing after each copy


def CopyFeaturesLong(input_fc, output_fc):
    """CopyFeatures replacement
    to ensure every BigInteger is mapped to Long
    with a fieldmappings parameter using arcpy.conversion.ExportFeatures()
    with a bit of help from copilot!
    full paths to featureclasses assumed """

    # Build a FieldMappings object covering every transferable field
    field_mappings = arcpy.FieldMappings()
    # OBJECTID and Shape are handled by the tool itself and are skipped below
    fields = arcpy.ListFields(input_fc)

    for field in fields:
        if field.type not in ('OID', 'Geometry'):  # not allowed in field mappings
            field_map = arcpy.FieldMap()  # new FieldMap for each field
            field_map.addInputField(input_fc, field.name)
            # If the field type is BigInteger, change the output type to Long
            out_field = field_map.outputField
            if out_field.type == 'BigInteger':
                out_field.type = 'Long'
            field_map.outputField = out_field
            # Add the FieldMap to the FieldMappings object
            field_mappings.addFieldMap(field_map)

    # ExportFeatures applies the field mappings and creates the output feature class
    arcpy.conversion.ExportFeatures(input_fc, output_fc, field_mapping=field_mappings)
    if debug:
        for fld in arcpy.ListFields(output_fc):
            print(fld.name, fld.type)
        print(f"{output_fc} feature class successfully exported with updated field types!")

    return True


def CopyRowsLong(input_tab, output_tab):
    """CopyRows replacement
    to ensure every BigInteger is mapped to Long
    with a fieldmappings parameter using arcpy.conversion.ExportTable()
    with a bit of help from copilot!
    full paths to tables assumed """

    # Build a FieldMappings object covering every transferable field
    field_mappings = arcpy.FieldMappings()
    # The OID field is handled by the tool itself and is skipped below
    fields = arcpy.ListFields(input_tab)

    for field in fields:
        if field.type not in ('OID', 'Geometry'):  # not allowed in field mappings
            field_map = arcpy.FieldMap()  # new FieldMap for each field
            field_map.addInputField(input_tab, field.name)
            # If the field type is BigInteger, change the output type to Long
            out_field = field_map.outputField
            if out_field.type == 'BigInteger':
                out_field.type = 'Long'
            field_map.outputField = out_field
            # Add the FieldMap to the FieldMappings object
            field_mappings.addFieldMap(field_map)

    # ExportTable applies the field mappings and creates the output table
    arcpy.conversion.ExportTable(input_tab, output_tab, field_mapping=field_mappings)
    if debug:
        for fld in arcpy.ListFields(output_tab):
            print(fld.name, fld.type)
        print(f"{output_tab} table successfully exported with updated field types!")

    return True
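
Hypothetical usage, copying from a GeoPackage into a file geodatabase (the paths are examples only):

import arcpy

arcpy.env.overwriteOutput = True
CopyFeaturesLong(r"C:\data\roads.gpkg\main.MyRoads", r"C:\data\work.gdb\MyRoads")
CopyRowsLong(r"C:\data\roads.gpkg\main.road_attrs", r"C:\data\work.gdb\road_attrs")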