I am sure most of us do this, but here in Oklahoma we recreate a very large denormalized table every weekend using the Overlay Route Events tool. That table is used by a majority of the agency for their processes. It also makes it easier for the GIS group and my group to compute mileage totals and other analytics.
Our table has about 80 columns, pulls from 70 events, and compiles against the entire Public Roadway System, not just the State Highway System. One thing we have been running into is its monstrous run time. We have it built into a model that runs the Overlay tool and then overwrites a feature class in an SDE geodatabase that the rest of the agency can reach. Those two operations have been taking 10 to 12 hours to complete!
My question is: is that what other people are seeing, or are we doing something way wrong and need to re-evaluate how it runs?
Our process is different, but those run times do not seem out of line to me; to a certain extent, they actually seem fast. For example, one of our spatially derived products, a single full-coverage event on all public roads with 397,692 records, took 8 hours and 41 minutes.