Automating FEMA Shapefile Downloads for SDE Integration

02-27-2025 06:54 PM
AlmaFrith1
New Contributor

I feel like this has been done by someone before, and I found some GitHub scripts, but they're not exactly what I need. I need to automate the monthly download of FEMA flood hazard shapefiles from the FEMA Map Service Center https://msc.fema.gov/portal/advanceSearch#searchresultsanchor. Currently, our team manually selects a state, picks each county, and downloads the files. I'd like to script this process so that all county shapefiles for a given state (or all states) are downloaded automatically every month and stored in an enterprise SDE geodatabase.

Requirements:
Automate searching for available county datasets
Extract direct download links for each county
Download all available shapefiles
Import into SDE using arcpy

Any guidance, sample scripts, or Esri tools that might help would be appreciated.

1 Reply
BobBooth1
Esri Regular Contributor

Hi Alma,

If you have people manually downloading the files, you could have them copy the link addresses (right-click > Copy Link Address) from the download links and record them.

[Screenshot: FEMA_download_links.png]

It looks as though the download links are of the form:

https://msc.fema.gov/portal/downloadProduct?productTypeID=FINAL_PRODUCT&productSubTypeID=FIRM_PANEL&...

Seems like you could copy the Product IDs from the table.

If you know all the ProductIDs that you want, you can use Python to iterate over that list and concatenate each Product ID onto the base string for the download URL, e.g.

import requests

# All the Product IDs you want; for a whole state (or nationwide) there will be
# lots of IDs, so you may choose to store the list some other way.
prodIDList = ["39035C0087E", "39035C0089E"]
baseURL = "https://msc.fema.gov/portal/downloadProduct?productTypeID=FINAL_PRODUCT&productSubTypeID=FIRM_PANEL&productID="
for prodID in prodIDList:
    downloadURL = baseURL + prodID
    resp = requests.get(downloadURL)  # download, e.g. with requests (urllib3 also works)
    with open(prodID + ".zip", "wb") as f:
        f.write(resp.content)  # save each product as a zipfile

There may be some overall list of ProductIDs you can work from, or you could inventory the ProductIDs of the files you have already downloaded to make your list.

You could store the list in a file or database table and read it from there.
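For example, a minimal sketch that reads the IDs from a plain text file, one per line ("product_ids.txt" is a hypothetical filename):

# Read one Product ID per line from a text file, skipping blank lines.
with open("product_ids.txt") as f:
    prodIDList = [line.strip() for line in f if line.strip()]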

You would end up with a bunch of zipfiles that would need to be extracted; then you could use ArcPy to load the shapefiles into your geodatabase.

This tutorial shows using the zipfile library to zip up a geodatabase; the unzipping process will be similar.

https://learn.arcgis.com/en/projects/create-a-python-script-tool/
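The unzipping step might look something like this (both folder paths are hypothetical):

import os
import zipfile

downloadFolder = r"C:\FEMA\downloads"  # hypothetical folder of downloaded zipfiles
extractFolder = r"C:\FEMA\extracted"   # hypothetical folder to extract into

for name in os.listdir(downloadFolder):
    if name.lower().endswith(".zip"):
        with zipfile.ZipFile(os.path.join(downloadFolder, name)) as zf:
            zf.extractall(extractFolder)  # pull out the shapefile contents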

For the ArcPy part, you can use ListFeatureClasses to get a list of shapefiles from a workspace (such as the folder where you've extracted the shapefiles): https://pro.arcgis.com/en/pro-app/latest/arcpy/functions/listfeatureclasses.htm
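A rough sketch of that loading step, with hypothetical paths for the extracted-shapefile folder and the .sde connection file:

import arcpy

arcpy.env.workspace = r"C:\FEMA\extracted"       # hypothetical folder of extracted shapefiles
sdeConnection = r"C:\connections\floodData.sde"  # hypothetical SDE connection file

# In a folder workspace, ListFeatureClasses returns the shapefiles it finds.
for shp in arcpy.ListFeatureClasses():
    arcpy.conversion.FeatureClassToGeodatabase(shp, sdeConnection)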

This tutorial may help:

https://learn.arcgis.com/en/projects/automate-a-geoprocessing-workflow-with-python/

This tutorial has a part on using Windows Scheduled Tasks to automate a download process:

https://learn.arcgis.com/en/projects/schedule-automated-near-real-time-data-updates/

Things to think about:

Doing it iteratively, one at a time, will be slow. Maybe that's OK for a monthly process; if not, you may need to look into parallel execution with Python for the download part.
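If you do go parallel, here is a sketch using the standard-library concurrent.futures module, reusing baseURL and prodIDList from the snippet above:

from concurrent.futures import ThreadPoolExecutor
import requests

baseURL = "https://msc.fema.gov/portal/downloadProduct?productTypeID=FINAL_PRODUCT&productSubTypeID=FIRM_PANEL&productID="
prodIDList = ["39035C0087E", "39035C0089E"]  # as in the earlier snippet

def download_product(prodID):
    resp = requests.get(baseURL + prodID)
    resp.raise_for_status()
    with open(prodID + ".zip", "wb") as f:
        f.write(resp.content)

# Downloads are I/O-bound, so a small thread pool speeds them up
# without hammering the server.
with ThreadPoolExecutor(max_workers=8) as pool:
    list(pool.map(download_product, prodIDList))  # list() forces any exceptions to surface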

Web downloads and other processes sometimes fail, so it may be worth looking into error handling with try/except blocks.
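For example, a hypothetical wrapper like this would retry a flaky download a few times before giving up:

import requests

def try_download(downloadURL, outPath, attempts=3):
    # Hypothetical helper: retry a failed download a few times before giving up.
    for attempt in range(attempts):
        try:
            resp = requests.get(downloadURL, timeout=60)
            resp.raise_for_status()
            with open(outPath, "wb") as f:
                f.write(resp.content)
            return True
        except requests.RequestException as err:
            print(f"Attempt {attempt + 1} failed for {downloadURL}: {err}")
    return False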
