ArcGIS 10.3.1 does not recognize Long Integer right

06-03-2015 09:05 AM
FilipKrál
Occasional Contributor III

Hi, I'm going through a puzzling experience with a shapefile that behaved as expected in 10.2. It has a WBID column of type Long Integer storing values from 1 to 50235. In ArcMap 10.2.x on another machine, and in Quantum GIS 2.2.0, this column loads as expected (Long Integer, 1 to 50235).

However, in ArcMap 10.3.1 (on a different machine) I see the WBID column as Short Integer with a range of -32768 to +32767 (it contains numbers from -32767 to -15301 and from 1 to 32767).
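For what it's worth, the pair of ranges above is exactly the 16 bit wrap-around pattern: 50235 does not fit into a signed short, and reinterpreting its low 16 bits as signed gives -15301. A minimal pure-Python sketch (no arcpy needed) of that reinterpretation:

```python
def as_signed_int16(value):
    # Keep only the low 16 bits, then sign-extend, mimicking a 32 bit
    # value being read through a signed 16 bit (Short Integer) field type.
    v = value & 0xFFFF
    return v - 0x10000 if v >= 0x8000 else v

print(as_signed_int16(50235))  # -15301: the corrupted reading
print(as_signed_int16(32767))  # 32767: in-range values pass through unchanged
```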

Have you ever experienced similar behaviour?

I'd be grateful for any hints.

Filip.

Message was edited by: Filip Král: Changed the title to better describe the issue.

18 Replies
JoshuaBixby
MVP Esteemed Contributor

Is there a bug number or something else that will be documented at 10.4 so users know whether the "unintended consequences" were dealt with or not? Similar to SciPy being announced for ArcGIS 10.3.1, with some fanfare, and not quite making it into the release: it would be good if users had something to track, so they can tell whether the changes announced/planned for this issue in 10.4 are actually addressed in 10.4.

by Anonymous User
Not applicable

There is a similar problem with long integer and double and this is definitely a bug:

If I export a feature class (FileGDB) with a long integer field to a Shapefile with ArcGIS 10.3.1 and import this Shapefile with ArcGIS 10.2.x the field type is mapped to double in the feature class of the FileGDB.

A Shapefile generated with 10.3.1 cannot reliably be handled with versions < 10.3.1.

Every export/import has unwanted side effects. 

Why wait for 10.4? This is a serious bug.

JoshuaBixby
MVP Esteemed Contributor

The issue can be isolated down to a single table with a single long integer field.  In ArcGIS 10.3.1, exporting that table to a DBF will show a "Long Integer" field but viewing that same DBF in ArcGIS 10.2.2 or ArcGIS 10.1 shows "Double."  With ArcGIS 10.3.1 and 10.2.2 showing different data types, it is unclear which one is "correct."  Looking to outside tools, MS Access shows the data type as "Double," so it seems ArcGIS 10.3.1 is altering the data type but still sees it as unaltered.
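One way to see why different readers can disagree: a .dbf stores numbers as text, with only a declared field type, width, and decimal count; "Long" vs. "Double" is purely the reader's interpretation of that descriptor. A sketch of what such a descriptor looks like, assuming the dBASE III layout (11-byte name, type byte at offset 11, width and decimal count at offsets 16-17); plain Python `struct`, not an Esri API:

```python
import struct

# dBASE III field descriptor: 11-byte name, 1-byte type, 4 reserved bytes,
# 1-byte field length, 1-byte decimal count, 14 reserved bytes (32 total)
_LAYOUT = '<11sc4xBB14x'

def make_descriptor(name, ftype, length, decimals):
    return struct.pack(_LAYOUT, name, ftype, length, decimals)

def read_descriptor(raw):
    name, ftype, length, decimals = struct.unpack(_LAYOUT, raw)
    return name.rstrip(b'\x00').decode(), ftype.decode(), length, decimals

# An integer column exported as "N", width 10, 0 decimals: nothing in these
# 32 bytes says "Long" or "Double" -- each reader decides the mapping itself.
desc = make_descriptor(b'WBID', b'N', 10, 0)
print(len(desc))              # 32
print(read_descriptor(desc))  # ('WBID', 'N', 10, 0)
```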

Not sure if this issue is related to the ongoing discussion about truncation.

DanPatterson_Retired
MVP Emeritus

The short int type is also ignored as you go from arcpy to numpy and back to arcpy. I don't know if that was by design, but short integer fields created in 10.3.1 ("int16", "<i2") come out as "int32" ("<i4") in numpy, and hence keep that type when they go back into arcpy and ArcMap.
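If the implicit mapping can't be trusted, the widening can at least be made explicit on the numpy side. A small sketch (numpy only; the arcpy round-trip functions are mentioned but not called) of forcing a deterministic dtype before handing the array back:

```python
import numpy

# Simulated attribute table: WBID declared as a 16 bit short ("<i2")
arr = numpy.array([(1,), (32767,)], dtype=[('WBID', '<i2')])
print(arr.dtype)  # [('WBID', '<i2')]

# Widening int16 -> int32 is lossless; doing it explicitly before the
# round trip (e.g. before arcpy.da.NumPyArrayToTable) makes the resulting
# field type deterministic instead of relying on arcpy's implicit mapping.
widened = arr.astype([('WBID', '<i4')])
print(widened['WBID'].dtype)  # int32
```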

VinceAngelo
Esri Esteemed Contributor

I just used 10.2.2 Desktop to import a shapefile with an "N 10" column into an enterprise geodatabase, and that field was mapped to Double (because a "10" width could exceed INT_MAX, e.g., 2222222222), and had no difficulty reassigning the type to Long (and the NULLS_ALLOWED to False), then correctly importing the contents.

This issue is a fundamental problem with shapefile numeric representation, but I don't believe it's a bug in ArcGIS.  The old behavior (assuming Long with a width of 10) was likely to destroy data (which was a serious bug).  The current behavior requires more careful review of column mapping, but that is why the column mapping options exist.
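The arithmetic behind that mapping decision, spelled out:

```python
INT32_MAX = 2**31 - 1    # 2147483647, the largest value a signed Long holds
widest_n10 = 10**10 - 1  # 9999999999, the largest value an "N 10" field holds

# A width-10 numeric field can exceed INT_MAX, so mapping it to Long
# risks destroying data; Double is the only safe default.
print(widest_n10 > INT32_MAX)  # True
```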

- V

NatashaLongpine
New Contributor II

We are having this exact same problem with Shapefiles exported from VISUM.  Everything looks fine in 10.1, QGIS, even Excel, as well as Pro, just not in 10.3.1.  I agree, this is a major issue and should be addressed before 10.4. 

FilipKrál
Occasional Contributor III

Hi all,

I also agree that this is a really serious issue in practical terms so please post back here if you have any news about this.

Today I really had to deal with the shapefile I mentioned in my original post, and this is a workaround I came up with based on responses from Vince Angelo. It is far from generic, but it worked in my situation. You will likely need to adjust it and check it for your cases. I must acknowledge my colleague helped me figure out the ins and outs of the _fix_short function. He's really good with binary stuff.

"""Rewrite a shapefile to a feature class in ArcGIS 10.3.1 and fix SHORT integer.
This example assumes input shapefile in_shp with columns WBID and DBAREA.
DBAREA is a DOUBLE and WBID is the incriminated integer column.
"""
import numpy
in_shp = 'c:/temp/poly.shp'
out_fc = 'c:/temp/db.gdb/poly'


def _fix_short(a):
    """Fix 16 bit SHORT to the right 32 bit LONG in ArcGIS 10.3 context.
    a -- input integer to convert to unsigned 32 bit integer
    For reasoning see https://community.esri.com/thread/159997
    """
    i = 65535 # i.e. int(16*'1', 2), the largest 16 bit integer
    return numpy.int32(numpy.int16(a) & i)


# create the new output feature class
sr = arcpy.Describe(in_fc).spatialReference
out_fc = arcpy.management.CreateFeatureclass(
    os.path.dirname(out_fc),
    os.path.basename(out_fc),
    "POLYGON", spatial_reference = sr
    ).getOutput(0)
arcpy.management.AddField(staging_lakes, "WBID", "LONG")
arcpy.management.AddField(staging_lakes, "DBAREA", "DOUBLE")


# rewrite rows using cursors
with arcpy.da.SearchCursor(in_fc, ["SHAPE@", "WBID", "DBAREA"]) as sc:
    with arcpy.da.InsertCursor(staging_lakes, ["SHAPE@", "WBID", "DBAREA"]) as ic:
        for row in sc:
            wbid = _fix_wbid(row[1])
            ic.insertRow([row[0], wbid, row[2]])
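A quick sanity check of the bitmask trick, using only Python integers (no arcpy needed) and the values from my original post; masking with 0xFFFF on plain ints is equivalent to what _fix_short does with numpy scalars:

```python
def fix_short(a):
    # Reinterpret a sign-wrapped 16 bit reading as its unsigned original:
    # Python ints use two's-complement semantics for &, so the negative
    # value's low 16 bits come back as the original unsigned number.
    return a & 0xFFFF

print(fix_short(-15301))  # 50235: the original WBID maximum is recovered
print(fix_short(12345))   # 12345: values that never wrapped are unchanged
```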

Filip.

RyanKelso
Occasional Contributor III

I just encountered this issue myself.  I can't believe the problem isn't being considered serious enough to get a patch pushed out.  There's potential for people's data to get really messed up if you don't catch the problem.

FarazAhmed1
New Contributor

I'm still on ArcGIS 10.2.2 and I think I have something similar, but I'm not sure. I had a long integer field that was populated using the field calculator, and for some reason it limited everything to 9 characters. So I changed the field to 10 characters, and it automatically changed from long to double; I guess this is supposed to happen? Then I went to load a shapefile's attributes into a database's shapefile, and it loaded all the columns correctly except for the one it had turned into a double (I had to go into the database and change that column to a double first as well, to match). Instead of loading all 3000+ rows, it loaded only one row for that one column, and it gave some huge negative value (-2147483648). No idea what's going on.
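For reference, that "huge negative value" is not random: it is exactly the smallest 32 bit signed integer, which commonly shows up as an overflow or failed-conversion sentinel:

```python
# The smallest value a signed 32 bit integer can hold -- a common
# marker for overflow or a failed numeric conversion.
INT32_MIN = -2**31
print(INT32_MIN)  # -2147483648, the value reported above
```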
