Buffer Performance on Large Datasets

Discussion created by Lady_Jane on Oct 21, 2011
Latest reply on Jan 17, 2012 by rutledgecl2
Hi everyone,
I am writing a script that calculates the types of land cover present within a certain distance of each stream in a catchment. The basic steps are as follows:
1. Intersect streams and catchments (output: line)
2. Buffer the streams, dissolving based on the catchment ID
3. Perform a bunch of other dissolves, selects, and intersects to ensure that the buffer doesn't extend beyond the stream's catchment
4. Intersect buffer with land cover
5. Export table.
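To make steps 4 and 5 concrete, here is a toy illustration of the final tally: intersecting a buffer zone with labeled land-cover areas and summing the overlap per cover type. It is a minimal one-dimensional sketch in plain Python (the cover names and extents are made up); the real workflow would of course do this with polygon geometries in arcpy.

```python
# Toy 1-D version of "intersect buffer with land cover, then summarize":
# each feature is an interval along a transect rather than a polygon.

def overlap(a, b):
    """Length of intersection of two intervals (0 if they don't overlap)."""
    return max(0, min(a[1], b[1]) - max(a[0], b[0]))

buffer_zone = (0, 100)  # hypothetical stream buffer extent
land_cover = [          # hypothetical cover-type intervals
    ("forest", (0, 40)),
    ("urban", (40, 70)),
    ("water", (70, 120)),  # extends past the buffer; only 70-100 counts
]

totals = {}
for cover, span in land_cover:
    totals[cover] = totals.get(cover, 0) + overlap(span, buffer_zone)

# totals -> {"forest": 40, "urban": 30, "water": 30}
```

The exported table in step 5 would be the polygon-area analogue of `totals`, one row per catchment and cover type.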

The script is consistently failing with error 999999 on the buffer step, which essentially means that the intersected streams-and-catchments feature class is too large to buffer. I tried to work around this by creating a fishnet, selecting the catchments whose centroids fall within each fishnet cell, performing all the steps separately for each set of catchments, and finally merging all the output tables. I have gotten to the point of dividing my original catchments layer into 9 separate selections, but the intersection of catchments and streams is still too large to buffer.
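For anyone following the fishnet approach, the overall shape of the chunked run is just "split, process each group, merge the tables." A minimal sketch in plain Python, where `process_group` is a stand-in for the whole intersect/buffer/dissolve chain (in the real script each group would be a fishnet-cell selection fed through arcpy tools):

```python
# Chunked-processing pattern: divide the catchments into groups,
# run the workflow per group, and merge the per-group result tables.

def process_group(catchment_ids):
    """Stand-in for the per-group geoprocessing; returns one result
    row per catchment (the made-up 'area' is just for illustration)."""
    return [{"catchment": cid, "area": cid * 10} for cid in catchment_ids]

def chunks(seq, n):
    """Yield successive groups of at most n items."""
    for i in range(0, len(seq), n):
        yield seq[i:i + n]

all_catchments = list(range(1, 28))      # e.g. 27 catchment IDs
merged = []
for group in chunks(all_catchments, 3):  # 9 groups, mirroring 9 selections
    merged.extend(process_group(group))  # merge tables as they come back
```

The pain point is that the per-group cost doesn't shrink enough: even at 9 groups, a single group's stream/catchment intersection is still too big for Buffer.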
If I have to loop through the script 9+ times, the processing time really starts to get out of hand.

I know that the Buffer tool can create very large files because of all the vertices required, but there must be some way around this. Has anyone else ever had to deal with this issue?