
FeatureLayer edit_features throwing error 1000: No mapping exists from object type System.Object[] to a known managed provider native type.

09-08-2023 11:54 AM
Occasional Contributor


Edit: One of the fields was an integer type in the hosted feature service, and the input value from Knack was a string. I changed the field type, but it's still not working. New errors have arisen: one feature's error code is 1003: Operation rolled back, and the other is 1000: String or binary data would be truncated. The statement has been terminated.
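For the integer/string mismatch, one option is to coerce the incoming Knack values before calling edit_features. A minimal sketch (coerce_int is just a throwaway helper, not part of the arcgis API):

```python
def coerce_int(value):
    """Return the value as an int when possible, else None.

    Returning None makes the bad record explicit instead of letting the
    service reject a string bound to an integer field.
    """
    try:
        return int(value)
    except (TypeError, ValueError):
        return None

# Hypothetical usage against a Knack record:
record = {"ProjectKey": "42", "ProjectName": "Trail Repair"}
record["ProjectKey"] = coerce_int(record["ProjectKey"])
```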

Original post:

I am working on updating a hosted feature service from a spatial data frame extracted from the Knack API. I used the Python API documentation example posted here.

Script snippet:

from copy import deepcopy

from arcgis.features import FeatureLayer
from arcgis import geometry

svc_lyr = svc_url + "/0"
ft_lyr = FeatureLayer(svc_lyr)

ft_set = ft_lyr.query(where="1=1")
if len(ft_set) > 0:
    template_ = deepcopy(ft_set.features[0])

input_features = []

# records is a list of json records extracted from the Knack API with
# attributes, similar to feature set features.
for r in records:
    # deepcopy per record so each feature gets its own attributes
    add_ft = deepcopy(template_)
    add_ft.attributes['ProjectKey'] = r['ProjectKey']
    add_ft.attributes['ProjectName'] = r['ProjectName']
    latitude = r['Latitude']
    longitude = r['Longitude']
    add_ft.geometry = geometry.Geometry({"x": longitude, "y": latitude, "spatialReference": {"wkid": 4326}})
    input_features.append(add_ft)

add_result = ft_lyr.edit_features(adds=input_features)

When I run the script, no data is added and I get the error: {'addResults': [{'objectId': 7, 'uniqueId': 7, 'globalId': None, 'success': False, 'error': {'code': 1000, 'description': 'No mapping exists from object type System.Object[] to a known managed provider native type.'}}], 'updateResults': [], 'deleteResults': []}

I'm not finding much about this error related to this edit_features function... but given that I'm taking a deepcopy() to create the feature template, I'm unsure how the mapping could be off...
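From what I can tell, that .NET message usually means an array (a Python list or tuple) is being bound where the database expects a scalar value, so one thing worth checking is whether any of the Knack attribute values are lists. A quick sketch (find_list_valued_attributes is just a throwaway helper):

```python
def find_list_valued_attributes(records):
    """Return (record_index, field, value) for every list/tuple-valued attribute."""
    problems = []
    for i, rec in enumerate(records):
        for field, value in rec.items():
            if isinstance(value, (list, tuple)):
                problems.append((i, field, value))
    return problems

# Example with hypothetical records:
records = [
    {"ProjectKey": 1, "ProjectName": "Trail Repair"},
    {"ProjectKey": 2, "ProjectName": ["Bridge", "Culvert"]},  # suspect value
]
print(find_list_valued_attributes(records))  # [(1, 'ProjectName', ['Bridge', 'Culvert'])]
```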


Any insights would be much appreciated!

1 Reply
Esri Regular Contributor


Seems like your next step is to figure out which field(s) are causing the 1000: String or binary data would be truncated error.

From there, you can either truncate the string yourself or increase the length of the string field.
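If you go the truncation route, a quick pandas sketch (truncate_strings is a hypothetical helper, and field_lengths is an assumed mapping of string field names to their layer-defined lengths):

```python
import pandas as pd

def truncate_strings(df, field_lengths):
    """Clip each string column to the length the layer's field allows."""
    out = df.copy()
    for name, length in field_lengths.items():
        out[name] = out[name].astype(str).str.slice(0, length)
    return out

# Hypothetical usage:
df = pd.DataFrame({"ProjectName": ["A very long project name"]})
clipped = truncate_strings(df, {"ProjectName": 10})
```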


One way to figure out which fields are causing the issue is to create a DataFrame from your input records, get the max length of each string column, and compare that length to the field length:


# where 'df' is a DataFrame created from your input records and
# 'ft_lyr' is the FeatureLayer being edited
string_fields = [field for field in ft_lyr.properties.fields
                 if field.type == "esriFieldTypeString"]
for field in string_fields:
    max_string_length = df[field.name].apply(lambda x: len(str(x))).max()
    if max_string_length > field.length:
        print(f"{field.alias}: length needs to meet or exceed {max_string_length}")

