I've been having performance issues between my development and production environments. The problem occurs with a geoprocessing task that uses a Python script; a similar tool built only as a model doesn't exhibit the same behavior.
The development environment is running ArcGIS Server 10 SP 1 and executes the geoprocessing task without hesitation or delay. The same geoprocessing script, when published on the production machine running ArcGIS Server 10 SP 3, shows significant performance degradation. Looking at the server logs, the geoprocessing task appears to complete the request in roughly the same amount of time as in the development environment. The majority of the delay occurs after the request has completed, while waiting for the server context to be released. Any suggestions would be greatly appreciated.