Exporting multiple Web Feature Layers as Geodatabase

Question asked by ae.swagoner on Nov 8, 2018

I have been working on this Python script and I cannot figure out why it is not working.  I have about 15 Web Feature Layers that I want to regularly export to individual geodatabases and then download using the script.  If I try to just download them, it returns a 1 KB zip file.  So after looking online for a solution, I ran across this one: Download feature service as file geodatabase

It worked great for one file, but as I said, I have 15 I would like to download regularly, so I spent some time tweaking it and came up with this code:

import time
import os
import zipfile
import arcgis.gis
from zipfile import ZipFile

# *** modify these four lines ***
outputFolder = r"OUTPUT"  # where the GDB will be extracted to
gis = arcgis.GIS("URL", "USER", "PASS")  # replace these with your credentials
item_id = ["ID1", "ID2", ......]
GDBname = ["MULTIPLE_NAMES"]
for i in range(len(item_id)):
    AGOLitem = gis.content.get(item_id[i])

    print("Exporting Hosted Feature Layer...")
    AGOLitem.export(GDBname[i], 'File Geodatabase', parameters=None, wait=True)
    time.sleep(10)  # add 10 seconds delay to allow export to complete

    search_fgb = gis.content.search(query="title:{}*".format(GDBname[i]))  # find the newly created file geodatabase in ArcGIS online
    fgb_item_id = search_fgb[0].id
    fgb = gis.content.get(fgb_item_id)
    fgb.download(save_path=outputFolder)  # download file gdb from ArcGIS Online to your computer
    print("Zipping exported geodatabase for download...")

    '''while statement runs until a valid zipped file is created'''
    # randomly the output is a 1 KB file that is not a valid zipped file.
    # The while statement forces a valid zipped file to be created.
    zipfullpath = os.path.join(outputFolder, GDBname[i] + ".zip")  # full path to the zipped file once it is downloaded to your computer
    while zipfile.is_zipfile(zipfullpath) is False:
        fgb.download(save_path=outputFolder)
    zf = ZipFile(os.path.join(outputFolder, GDBname[i] + ".zip"))

    '''deleting hosted File Geodatabase'''
    # NOTE: This will delete the temporary File Geodatabase in ArcGIS Online
    print("Deleting "+fgb.title+"("+fgb.type+")"+" from ArcGIS Online...")
    fgb.delete()


print("Done!")

When I run this, it gets to "Zipping exported geodatabase for download..." and then hangs.  In the folder it is supposed to save to, I get either a service definition or just a file with the original file name (so if the name online is "Features_for_you" but GDBname should have been "New_Features", I get a file named "Features_for_you").

Things I have tried:

  • I thought maybe I wasn't waiting long enough for the export, so I extended the sleep to 200 seconds just to test. Same result.
  • I have two text files, one with the ids and one with the GDB names.  Instead of a long list I had:
    • item_id = open("id.text").read().splitlines()
    • GDBname = open("id.text").read().splitlines()

I do have another log-in that I use.  I uploaded some smaller test data and ran the exact same script with only 3 Feature Layers, and it worked just fine.

The only difference between the live data and the test data is that the live data has a ton of information: I believe each Feature Layer has 4 to 10 layers, and each individual layer has 40 to 70 fields, with domains all over the place.

Any reason my script isn't working properly?