Clipping Large Datasets

02-25-2014 10:00 AM
MikeMacRae
Occasional Contributor III
Hello,

I am writing a script that uses the arcpy.Clip_analysis tool. When I run it against a very large dataset, it errors out like this:

Traceback (most recent call last):
  File "\\spatialfiles.bcgov\Work\em\vic\mtb\Local\scripts\python\overlap_report\mike_mac\Test_Scripts\Test_Clip_Placier_Poly.py", line 11, in <module>
    arcpy.Clip_analysis(r"Database Connections\ABC.sde\SPATIAL.GRID_POLY",r"W:\Shapefiles\Test_Extent.shp",r"W:\Geodatabases\Clipped_FeatureClasses.gdb\OUTPUT")
  File "E:\sw_nt\ArcGIS\Desktop10.1\arcpy\arcpy\analysis.py", line 56, in Clip
    raise e
arcgisscripting.ExecuteError: ERROR 999999: Error executing function.
The table was not found.
The table was not found. [OUTPUT]
The table was not found.
The table was not found. [OUTPUT]
Unable to create logfile system tables. User perhaps lacks permissions or resources to create tables [ORA-01031: insufficient privileges
]
Failed to execute (Clip).


After doing a lot of research and going through the steps here, I have determined that the memory allocation available to me is insufficient. I am using ArcGIS on a government terminal server, so I have very little control over my allocation and would have to go through quite a process with our IT department to change it.

The basis of my script is that I am running about 73 SDE feature classes through the Clip tool against a common clip feature that the user will define (probably through a Python add-in), roughly like the sketch below.
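
For context, here is a rough sketch of the kind of loop I mean. The paths mirror the traceback above, but the output-naming convention is just a placeholder, not my actual script:

import arcpy
import os

# Paths follow the traceback above; the real script loops over ~73
# feature classes in the SDE connection.
sde_conn = r"Database Connections\ABC.sde"
clip_fc = r"W:\Shapefiles\Test_Extent.shp"          # user-defined clip feature
out_gdb = r"W:\Geodatabases\Clipped_FeatureClasses.gdb"

arcpy.env.workspace = sde_conn
for fc in arcpy.ListFeatureClasses():
    # Output naming is illustrative only -- ValidateTableName replaces the
    # dot in the qualified SDE name so it is legal in a file geodatabase.
    out_name = arcpy.ValidateTableName(arcpy.Describe(fc).baseName, out_gdb)
    arcpy.Clip_analysis(fc, clip_fc, os.path.join(out_gdb, out_name))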

My question is: is there any way around my limited memory allocation? Or is there another way to handle the data, like restricting the extent of the tiling process to something closer to the size of the clip feature (see the sketch below for the kind of thing I'm picturing)? Someone in my office mentioned using a spatial view, but I don't see how that would work. Any suggestions would be welcome.
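
To illustrate the extent idea, something along these lines is what I'm picturing (untested; I don't know whether setting arcpy.env.extent actually changes how Clip tiles the data):

import arcpy

clip_fc = r"W:\Shapefiles\Test_Extent.shp"

# Untested idea: restrict the geoprocessing extent to the clip feature's
# bounding box before calling Clip, in the hope that the tool only has to
# process source features near the clip area.
arcpy.env.extent = arcpy.Describe(clip_fc).extent

arcpy.Clip_analysis(
    r"Database Connections\ABC.sde\SPATIAL.GRID_POLY",
    clip_fc,
    r"W:\Geodatabases\Clipped_FeatureClasses.gdb\OUTPUT")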

Thanks,
Mike