1) We have roads, addresses, and other individual point and line features.
2) This update process runs on a weekly basis.
3) On average we see 20k-50k changes total.
a. This is why we process a delta: so little changes each week that we don't want to waste time rebuilding everything.
4) When we rebuild tiles, we do it in two steps:
a. Forced scales - levels 0-15. There is very little data in total, so we just rebuild them all (usually only ~10 minutes).
b. Selective scales - levels 16-19. This is where we apply our area-of-interest filtering based on the delta results.
1) It is within the selective scales that memory usage spikes even faster and the build ultimately fails.
5) The sole purpose of this machine/server is to build tiles; nothing else runs on it. We did not want anything interrupting this process or bogging down the machine and slowing the build.
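The two-step rebuild in point 4 could be sketched roughly like this. Everything here is illustrative, not our actual pipeline: the slippy-map tile math, the function names, and the scale cutoffs passed as parameters are all placeholder assumptions.

```python
import math

def tiles_for_bbox(bbox, z):
    """Slippy-map tile indices (x, y, z) covering a lon/lat bbox (illustrative)."""
    min_lon, min_lat, max_lon, max_lat = bbox
    n = 2 ** z

    def to_tile(lon, lat):
        x = int((lon + 180.0) / 360.0 * n)
        lat_r = math.radians(lat)
        y = int((1.0 - math.asinh(math.tan(lat_r)) / math.pi) / 2.0 * n)
        return min(max(x, 0), n - 1), min(max(y, 0), n - 1)

    x0, y1 = to_tile(min_lon, min_lat)  # south-west corner -> larger y
    x1, y0 = to_tile(max_lon, max_lat)  # north-east corner -> smaller y
    return [(x, y, z) for x in range(x0, x1 + 1) for y in range(y0, y1 + 1)]

def rebuild(delta_bboxes, render_tile,
            forced=range(0, 16), selective=range(16, 20)):
    world = (-180.0, -85.05, 180.0, 85.05)

    # Step 1: forced scales -- so little data that we rebuild every tile.
    for z in forced:
        for tile in tiles_for_bbox(world, z):
            render_tile(tile)

    # Step 2: selective scales -- only tiles intersecting an area of
    # interest derived from the weekly delta's changed-feature bboxes.
    seen = set()
    for z in selective:
        for bbox in delta_bboxes:
            for tile in tiles_for_bbox(bbox, z):
                if tile not in seen:  # don't render a tile twice
                    seen.add(tile)
                    render_tile(tile)
```

The point of the sketch is the asymmetry: step 1 is bounded and cheap no matter what changed, while step 2's tile count grows with the size and spread of the delta's areas of interest, which is where the memory pressure shows up.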
Note: I really wanted to separate the vector data from the raster data; I feel the sheer size of the combined dataset is an issue in itself. The raster data only changes once a year, whereas the vector data changes weekly. The reasoning behind combining them was to avoid making two separate requests for vector and raster data, which would bog down the system's bandwidth.
dave