
Can someone help with this Automation download and update script? Keep getting a timeout error

11-17-2023 05:00 PM
Nicole32Nicholson32
New Contributor

Hello! I am writing this notebook in ArcGIS Online and keep running into the same timeout error. The code is meant to download the designated zip file to the designated folder, update the content on ArcGIS Online, and then delete the zip file. This code works for downloading the zip file in ArcGIS Pro, but for some reason I cannot get it to work online. Any help would be appreciated!


from arcgis.gis import GIS
from arcgis.features import FeatureLayerCollection
import os
import requests
import zipfile

gis = GIS("home")

# Variables
itemid = 'ba6a79321c504405b6ec0ab1d1423e78'
httpshost = r'https://www.fhwa.dot.gov'
httpspath = r'/planning/national_highway_system/nhs_maps/'
httpsfile = r'nhs_20230906.zip'
dwnpath = r'U:\GEOGRAPHY\geogrpy-classwork\GEOGRPY-477-Fall2023-compase\NicholsoNM23\Road Data'
newname = os.path.join(dwnpath, 'NHS.zip')

user = '*****'
password = '*****'

try:
    # Download https data
    url = httpshost + httpspath + httpsfile
    response = requests.get(url)

    with open(newname, 'wb') as f:
        f.write(response.content)

    # Log into ArcGIS Online
    gis = GIS('https://www.arcgis.com', user, password)
    un = gis.properties.user.username
    print('Logged in as: {}'.format(un))

    # Overwrite hosted feature layer
    dataitem = gis.content.get(itemid)
    flayercol = FeatureLayerCollection.fromitem(dataitem)
    flayercol.manager.overwrite(newname)

    # Remove the downloaded ZIP file
    if os.path.exists(newname):
        os.remove(newname)

    print('Script Complete!')

except Exception as error:
    print(error)


1 Reply
gis_KIWI4
Occasional Contributor II

Hi @Nicole32Nicholson32 

I made a small tweak to your code:
response = requests.get(url, stream=True)

With that change I was able to download the file and save it as a zip.

Some more information here: https://realpython.com/python-download-file-from-url/
Essentially, streaming keeps the connection 'alive' and avoids time-outs 🙂
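For anyone landing here later, the full streaming pattern looks roughly like this. This is a minimal sketch, not the original script: the helper name, chunk size, and timeout value are my own choices, and `stream=True`, `iter_content`, and `raise_for_status` are standard `requests` features.

```python
import requests

def download_zip(url, dest, chunk_size=8192):
    """Stream a remote file to disk in chunks instead of loading it all into memory."""
    # stream=True defers reading the body; timeout guards against a hung connection
    with requests.get(url, stream=True, timeout=60) as response:
        response.raise_for_status()  # fail loudly instead of saving an HTML error page
        with open(dest, 'wb') as f:
            for chunk in response.iter_content(chunk_size=chunk_size):
                f.write(chunk)
    return dest
```

You could then call `download_zip(url, newname)` in place of the original `requests.get(url)` / `f.write(response.content)` pair.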

Hope this helps. 
