Hello,
I have a large feature layer, hosted on AGOL, that is constantly updated. The feature layer contains over 77,000 parcels. My program worked fine when I tested it on a smaller data set, but now I am getting a timeout error. I modified my API calls to handle a 504 error, and while I can successfully delete all the records, I cannot add new records.
My code and the error message are posted below.
req = requests.get(siteurl, timeout=1000)
if req.status_code == 200:
    logging.info("success")
target = GIS(siteurl, uname, pword)

# Search for the item by title, owner, and type
newquery = ("title: " + '"' + filetitle + '"' + " AND owner: " + '"' + fileowner + '"')
parcelsearch = target.content.search(query=newquery, item_type='Feature Layer Collection', max_items=1000)
for layer in parcelsearch:
    print(layer.title)
feature_layer = parcelsearch[0]

epoch_crt = datetime.datetime.fromtimestamp(feature_layer.created/1000.0).strftime('%d/%m/%Y')
crt_epoch = datetime.datetime.strptime(epoch_crt, '%d/%m/%Y').strftime('%m/%d/%Y').replace('X0','X').replace('X','')
epoch_mod = datetime.datetime.fromtimestamp(feature_layer.modified/1000.0).strftime('%d/%m/%Y')
mod_epoch = datetime.datetime.strptime(epoch_mod, '%d/%m/%Y').strftime('%m/%d/%Y').replace('X0','X').replace('X','')
logging.info("Url of feature layer %s Feature Layer id %s Item created on %s Item modified on %s"
             % (feature_layer.url, feature_layer.id, crt_epoch, mod_epoch))

flc = FeatureLayerCollection(feature_layer.url, target)
flc_ly1 = flc.layers[0]
f1_list = feature_layer.layers

# Delete the existing records, then add the new ones
parcel = f1_list[0]
logging.info("Total number of records before delete %s " % parcel.query(return_count_only=True))
if parcel.delete_features(where="objectid > 0"):
    logging.info("Total number of records after delete %s " % parcel.query(return_count_only=True))

newfile = FeatureLayerCollection.fromitem(feature_layer)
uploadfile = ("r" + "'" + filenm + "'")
print(uploadfile)
#spatialDataFrame = fs.SpatialDataFrame(data=df, geometry=listGeometries)
sdf1 = SpatialDataFrame.from_featureclass(r'C:\name of geodatabase file')
fs = sdf1.to_featureset()
parcel.edit_features(adds=fs)

error message:
File "Updateparcels_agol.py", line 119, in <module>
parcel.edit_features(adds=fs)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\arcgis\features\layer.py", line 1121, in edit_features
return self._con.post(path=edit_url, postdata=params, token=self._token)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\arcgis\_impl\connection.py", line 1104, in post
resp = opener.open(url, data=encoded_postdata.encode())
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\request.py", line 526, in open
response = self._open(req, data)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\request.py", line 544, in _open
'_open', req)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\request.py", line 504, in _call_chain
result = func(*args)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\request.py", line 1361, in https_open
context=self._context, check_hostname=self._check_hostname)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\request.py", line 1320, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [WinError 10054] An existing connection was forcibly closed by the remote host>
Anne,
Were you able to get past this issue?
Hi Adrian,
No. I tried everything. I looped through the data and added it in chunks of 1,000 records, but I still get the timeout. So no resolution to date.
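For reference, the chunked approach I tried looks roughly like the sketch below. The helper names (`chunked`, `add_in_chunks`) and the retry/backoff values are illustrative, not part of the ArcGIS API; only `layer.edit_features(adds=...)` is the real call:

```python
import time

def chunked(seq, size):
    """Yield successive slices of `seq` with at most `size` items each."""
    for start in range(0, len(seq), size):
        yield seq[start:start + size]

def add_in_chunks(layer, features, size=1000, retries=3, backoff=30):
    """Add `features` to an arcgis FeatureLayer in batches,
    retrying each batch a few times before giving up."""
    for batch in chunked(features, size):
        for attempt in range(retries):
            try:
                layer.edit_features(adds=batch)
                break
            except Exception:
                if attempt == retries - 1:
                    raise
                time.sleep(backoff * (attempt + 1))  # simple linear backoff
```

Even with batches this small and retries in place, the connection is still closed mid-run.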
Thanks
Anne
Hmmm, I wonder if it would be a good time to get in touch with tech support over this.
How long before you get the error, i.e., how much time expires? On the surface, it looks like AGOL is terminating the connection: "existing connection was forcibly closed by the remote host." You can also get a similar error when an organizational firewall terminates the connection. All the client knows is that the connection was terminated remotely, not exactly who or what terminated it. I would check with your IT department and ask about firewall timeouts.
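To answer the "how much time expires" question precisely, it may help to wrap the failing call with a timer so the log records exactly how long the connection stayed open before it was dropped. This is a minimal sketch using only the standard library; `timed_call` is a hypothetical helper, and you would pass it your own `parcel.edit_features` call:

```python
import logging
import time

def timed_call(fn, *args, **kwargs):
    """Run `fn`, logging how many seconds elapsed before it
    returned or raised. The elapsed time is logged either way."""
    start = time.monotonic()
    try:
        return fn(*args, **kwargs)
    finally:
        elapsed = time.monotonic() - start
        logging.info("call ran for %.1f seconds", elapsed)
```

If the failure consistently lands at a round number (60, 120, 300 seconds), that points to a fixed idle/connection timeout on a firewall or gateway rather than a server-side processing error.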