
Timeout error

09-21-2018 09:01 AM
New Contributor II


I have a large feature layer that is constantly updated and resides on AGOL. The feature layer contains over 77,000 parcels. My program worked fine when I tested it on a smaller data set, but now I am getting a timeout error. I modified the API call to handle a 504 error, and while I can successfully delete all the records, I cannot add new records.

My code and the error message are posted below.

import datetime
import requests
from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection, SpatialDataFrame

req = requests.get(siteurl, timeout=1000)

if req.status_code == 200:
   print("success")
   target = GIS(siteurl, uname, pword)
   # Search for the item by title and owner
   newquery = 'title: "' + filetitle + '" AND owner: "' + fileowner + '"'
   parcelsearch =, item_type='Feature Layer Collection', max_items=1000)
   feature_layer = parcelsearch[0]
   # Convert the item's epoch timestamps (milliseconds) to dates,
   # using the X placeholder trick to strip leading zeros
   epoch_crt = datetime.datetime.fromtimestamp(feature_layer.created / 1000.0).strftime('%d/%m/%Y')
   crt_epoch = datetime.datetime.strptime(epoch_crt, '%d/%m/%Y').strftime('X%m/X%d/%Y').replace('X0', 'X').replace('X', '')
   epoch_mod = datetime.datetime.fromtimestamp(feature_layer.modified / 1000.0).strftime('%d/%m/%Y')
   mod_epoch = datetime.datetime.strptime(epoch_mod, '%d/%m/%Y').strftime('X%m/X%d/%Y').replace('X0', 'X').replace('X', '')
   print("Url of feature layer %s Feature Layer id %s Item created on %s Item modified on %s"
         % (feature_layer.url,, crt_epoch, mod_epoch))
   flc = FeatureLayerCollection(feature_layer.url, target)
   f1_list = feature_layer.layers
   # Delete all existing records, then add the new ones
   parcel = f1_list[0]
   print("Total number of records before delete %s " % parcel.query(return_count_only=True))
   if parcel.delete_features(where="objectid > 0"):
      print("Total number of records after delete %s " % parcel.query(return_count_only=True))
      newfile = FeatureLayerCollection.fromitem(feature_layer)
      uploadfile = filenm  # path of the file to upload
      #spatialDataFrame = fs.SpatialDataFrame(data=df, geometry=listGeometries)
      sdf1 = SpatialDataFrame.from_featureclass(r'C:\name of geodatabase file')
      fs = sdf1.to_featureset()
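The snippet stops just before the add step, but the traceback below shows `edit_features` is what fails. A minimal sketch of batching the adds before pushing them (the names `parcel` and `fs` follow the snippet above and are assumptions; the `edit_features` call is shown only in comments so the helper stands alone):

```python
def batch(items, size):
    """Yield successive slices of `items`, each no longer than `size`."""
    for start in range(0, len(items), size):
        yield items[start:start + size]

# With the real data this would look something like:
#   for chunk in batch(fs.features, 250):
#       result = parcel.edit_features(adds=chunk)
#       print(result)  # inspect addResults for per-feature errors
#
# Demo with dummy "features":
dummy = list(range(10))
print([len(c) for c in batch(dummy, 4)])  # -> [4, 4, 2]
```

Smaller batches mean each individual request finishes well inside whatever timeout the server or an intermediary enforces.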


error message:

File "", line 119, in <module>
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\arcgis\features\", line 1121, in edit_features
return, postdata=params, token=self._token)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\site-packages\arcgis\_impl\", line 1104, in post
resp =, data=encoded_postdata.encode())
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\", line 526, in open
response = self._open(req, data)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\", line 544, in _open
'_open', req)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\", line 504, in _call_chain
result = func(*args)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\", line 1361, in https_open
context=self._context, check_hostname=self._check_hostname)
File "C:\Program Files\ArcGIS\Pro\bin\Python\envs\arcgispro-py3\lib\urllib\", line 1320, in do_open
raise URLError(err)
urllib.error.URLError: <urlopen error [WinError 10054] An existing connection was forcibly closed by the remote host>

4 Replies
MVP Honored Contributor


Were you able to get past this issue?

New Contributor II

Hi Adrian

No. I tried everything. I looped through the data and added it in chunks of 1,000 records, but I still get the timeout. So no resolution to date.
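Chunking alone may not help if the service drops the connection partway through a batch; a common workaround is to retry each batch with exponential backoff. A sketch, with a flaky stand-in callable in place of the real upload call (which would be something like `lambda: parcel.edit_features(adds=chunk)` — an assumption, not the poster's code):

```python
import time

def with_retries(fn, attempts=4, base_delay=1.0):
    """Call fn(); on exception, wait and retry with exponential backoff."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; let the caller see the error
            time.sleep(base_delay * (2 ** attempt))

# Demo: a callable that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("simulated connection drop")
    return "ok"

print(with_retries(flaky, base_delay=0.01))  # -> ok
```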



MVP Honored Contributor

Hmmm, I wonder if it would be a good time to get in touch with tech support over this.

MVP Esteemed Contributor

How long before you get the error, i.e., how much time expires?  On the surface, it looks like AGOL is terminating the connection:  "existing connection was forcibly closed by the remote host."  You can also get a similar error when an organizational firewall terminates the connection.  All the client knows is that the connection was terminated remotely, not exactly who or what terminated it.  I would check with your IT department and ask about firewall timeouts.
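To answer the "how much time expires" question precisely, you can wrap the failing call and record the elapsed time; if it dies at a suspiciously round number (60 s, 120 s, 300 s) that often points at a proxy or firewall idle timeout. A small sketch (pure Python; the lambda is a placeholder for the real upload call):

```python
import time

def timed(fn):
    """Run fn() and return (outcome, elapsed_seconds).

    `outcome` is the return value on success or the exception on failure,
    so the elapsed time is captured either way.
    """
    start = time.monotonic()
    try:
        outcome = fn()
    except Exception as exc:
        outcome = exc
    return outcome, time.monotonic() - start

# Demo with a trivial call; with the real code you would pass the
# function that performs the upload.
result, elapsed = timed(lambda: sum(range(1000)))
print(result)  # -> 499500
```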