
Merge & Clip FIRM Layers to Watershed - Table Not Found (Error 999999)

06-21-2017 11:27 AM
ChrisMcCloud
Emerging Contributor

I was trying to combine a number of flood hazard (FIRM) layers from different jurisdictions in multiple states and then limit the result to a watershed within the designated area.  I built a model to this effect, combining the layers with Merge and then running Clip against the watershed of interest.  However, when I run the model, I get the following error on the Clip portion (I won't include the Merge portion of the model code, as it is extremely lengthy given the number of files involved across about five states):

Executing (Clip): Clip "D:\Todd Research\Processing\Chesapeake_Bay.gdb\Chesapeake_States_Watershed_Flood_Hazard_Areas2NL" "D:\Todd Research\Processing\Chesapeake_Bay.gdb\Chesapeake_Bay_Watershed" "D:\Todd Research\Processing\Chesapeake_Bay.gdb\Chesapeake_Watershed_Flood_Hazard_Areas2NL" #

Start Time: Mon Jun 19 21:54:30 2017

Reading Features...

Processing Tiles...

ERROR 999999: Error executing function.

The table was not found.

The table was not found. [Chesapeake_Watershed_Flood_Hazard_Areas2NL]

The table was not found.

The table was not found. [Chesapeake_Watershed_Flood_Hazard_Areas2NL]

General function failure

General function failure

Failed to execute (Clip).

Failed at Tue Jun 20 00:35:54 2017 (Elapsed Time: 2 hours 41 minutes 23 seconds)

Does anyone know what the problem might be here?  I know the error number is just a general error that could be caused by any number of things.  I'm a bit concerned that the problem might be an issue with the table generated by the Merge preceding the Clip, even though the model output says that it executed successfully.  I tried running it twice, once with a preliminary FIRM from an additional jurisdiction and once without, and got the same error both times.  For the run that included it, I made sure to modify the preliminary FIRM attribute table so its format matched the final FIRM attribute tables.
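For context, the model is roughly equivalent to a standalone script like the sketch below. The list of input FIRM layers is just a placeholder for the many state/county feature classes fed into the Merge; the other paths are taken from the log above.

import arcpy

arcpy.env.workspace = r"D:\Todd Research\Processing\Chesapeake_Bay.gdb"

# Placeholder names standing in for the many state/county FIRM layers
firm_layers = ["MD_Flood_Hazard_Areas", "VA_Flood_Hazard_Areas"]

merged = "Chesapeake_States_Watershed_Flood_Hazard_Areas2NL"
watershed = "Chesapeake_Bay_Watershed"
clipped = "Chesapeake_Watershed_Flood_Hazard_Areas2NL"

# Combine all jurisdictions into one feature class
arcpy.Merge_management(firm_layers, merged)

# Limit the merged flood hazard areas to the watershed of interest
arcpy.Clip_analysis(merged, watershed, clipped)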

4 Replies
IanMurray
Honored Contributor

I think your Merge is working just fine: the table the error is thrown on is the output of the Clip (Chesapeake_Watershed_Flood_Hazard_Areas2NL), not its input (Chesapeake_States_Watershed_Flood_Hazard_Areas2NL).

I recently ran into a similar problem that was caused by bad clip geometry that needed to be repaired; it might be the same in your situation.  If not, have a look at the link below for some other fixes for 999999 errors.

http://gisgeography.com/esri-arcgis-999999-error/
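If you want to rule that out first, a quick check/repair pass on the inputs would look something like this (a rough sketch only; substitute your actual feature class paths, and note the output table name is made up):

import arcpy

gdb = r"D:\Todd Research\Processing\Chesapeake_Bay.gdb"

# Report any geometry problems in the clip features and the merged output
arcpy.CheckGeometry_management(
    [gdb + r"\Chesapeake_Bay_Watershed",
     gdb + r"\Chesapeake_States_Watershed_Flood_Hazard_Areas2NL"],
    gdb + r"\Geometry_Check_Results")

# Fix the problems in place (null geometries are removed by default)
arcpy.RepairGeometry_management(gdb + r"\Chesapeake_Bay_Watershed")
arcpy.RepairGeometry_management(gdb + r"\Chesapeake_States_Watershed_Flood_Hazard_Areas2NL")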

JayantaPoddar
MVP Esteemed Contributor

If the Merge tool ran successfully, its output should be available regardless of the failure of the Clip tool. Check whether the output feature class exists and whether you can add it to ArcMap.

If the output of the Merge tool looks fine, it is possible that the resulting feature class contains multipart features. In that case, add the Multipart To Singlepart tool between the Merge and Clip tools.
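Something along these lines, inserted between the two tools (a sketch only; the intermediate output name is made up):

import arcpy

arcpy.env.workspace = r"D:\Todd Research\Processing\Chesapeake_Bay.gdb"

# Explode any multipart features produced by Merge before clipping
arcpy.MultipartToSinglepart_management(
    "Chesapeake_States_Watershed_Flood_Hazard_Areas2NL",
    "Chesapeake_States_Flood_Hazard_Areas_Single")

arcpy.Clip_analysis(
    "Chesapeake_States_Flood_Hazard_Areas_Single",
    "Chesapeake_Bay_Watershed",
    "Chesapeake_Watershed_Flood_Hazard_Areas2NL")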



Think Location
curtvprice
MVP Esteemed Contributor

This sounds like a massive merged dataset; the Clip may be failing simply because the inputs are so huge. You may want to try using an iterator with Select Layer By Location to select the polygons from each input that overlap your clip features, copy each selection to a temporary dataset, and merge those instead of the whole thing. (This would require a sub-model to do the iteration, returning the temp datasets as an output parameter with Collect Values.)
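As a rough sketch of that approach outside of ModelBuilder (the input feature class names are placeholders):

import arcpy

arcpy.env.workspace = r"D:\Todd Research\Processing\Chesapeake_Bay.gdb"
watershed = "Chesapeake_Bay_Watershed"

# Placeholder list standing in for the many state/county FIRM feature classes
firm_fcs = ["MD_FIRM", "PA_FIRM", "VA_FIRM"]

temp_fcs = []
for fc in firm_fcs:
    lyr = arcpy.MakeFeatureLayer_management(fc, fc + "_lyr")
    # Keep only the polygons that actually intersect the watershed
    arcpy.SelectLayerByLocation_management(lyr, "INTERSECT", watershed)
    temp = "in_memory/" + fc + "_sel"
    arcpy.CopyFeatures_management(lyr, temp)
    temp_fcs.append(temp)

# Merge just the selected features, then clip the much smaller result
arcpy.Merge_management(temp_fcs, "Flood_Hazard_Subset_Merged")
arcpy.Clip_analysis("Flood_Hazard_Subset_Merged", watershed,
                    "Chesapeake_Watershed_Flood_Hazard_Areas2NL")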

IanMurray
Honored Contributor

Now that you mention it, I ran into a similar problem when merging and clipping several states' worth of floodplain data.  I really only needed the 100- and 500-year floodplain areas, not the areas of minimal hazard that are also included in those datasets.  I queried my data down prior to merging and it worked a lot better, since that removed a lot of unneeded geometry from the new dataset.
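For example, selecting only the regulatory flood zones from each input before the Merge could look like this (a minimal sketch; the FLD_ZONE field and zone codes are assumptions based on the standard FIRM/NFHL schema, so verify them against your attribute tables, and the input names are placeholders):

import arcpy

arcpy.env.workspace = r"D:\Todd Research\Processing\Chesapeake_Bay.gdb"

# Keep only the high-risk (100-year) zones; adjust the field name and
# zone codes to match your FIRM schema before running
where = "FLD_ZONE IN ('A', 'AE', 'AH', 'AO', 'V', 'VE')"

for fc in ["MD_FIRM", "PA_FIRM", "VA_FIRM"]:
    arcpy.Select_analysis(fc, fc + "_HazardOnly", where)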
