As I type this, I am about 10 minutes into a very simple field calculation on 667 records. This should be virtually instantaneous, but it looks like I have about five more minutes to go. Utterly catastrophic to productivity.
I have very little to offer in the way of details, but could I get some advice on what to look for to begin solving this unbelievably frustrating problem?
The layers are fgdb feature classes. There is tight security that I think may be the issue, but I don't have a firm grasp of the specifics.
DB is a file geodatabase in this instance.
Over a network, and (this is probably the kicker) my computer is a "thick" client. ArcMap is loaded on it and calculating values for data on the server.
I am afraid I do not understand what you mean by "thick client".
I may be using the jargon incorrectly, but what I mean by "thick client" is that my laptop is doing a lot of the work, using software that is loaded on it. What would be preferable would be a "thin client" (as I understand it), where the laptop does little more than connect to a computer that runs the software (ArcGIS, ArcCatalog, etc.) without actually performing the geoprocessing; it just receives the results. I have a very good internet connection, so I think that would work well.
Copying to a local drive and testing is a sensible suggestion. I appreciate your input and I will try it. I've been under the gun and haven't had time to test, but I definitely will as soon as I see daylight.
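Since a file geodatabase is just a folder of files on disk, the local-copy test above can be scripted and timed without any GIS tooling at all. A minimal sketch (the paths in the usage comment are placeholders, not from the thread):

```python
import os
import shutil
import time


def copy_gdb_local(network_gdb, local_dir):
    """Copy a file geodatabase to a local drive and report how long it took.

    A file geodatabase is a plain directory of files, so shutil can copy it
    wholesale. Comparing this copy time against the field-calculation time
    helps separate network latency from application problems.
    """
    dest = os.path.join(local_dir, os.path.basename(network_gdb))
    start = time.time()
    shutil.copytree(network_gdb, dest)
    print("Copied %s -> %s in %.1fs" % (network_gdb, dest, time.time() - start))
    return dest


# Example with hypothetical paths:
# local = copy_gdb_local(r"\\server\gisdata\parcels.gdb", r"C:\temp")
```

If the same calculation is fast against the local copy, the network (or the share's security layer) is the bottleneck rather than ArcMap itself.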
In your map and geoprocessing environments, where are your workspace and default geodatabase? I always set these to local paths. It's possible there is something wrong with them, such as fragmentation.
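A quick way to audit this is to check whether the workspace paths point at a network share. Here is a rough heuristic as a sketch; the arcpy lines in the comment show where you would feed in the real environment settings (`arcpy.env.workspace` and `arcpy.env.scratchWorkspace`), which only resolve inside an ArcGIS session:

```python
def is_local_path(path):
    """Rough check for whether a workspace path is local or on a network share.

    UNC paths (\\\\server\\share\\...) indicate network storage. Note this is
    only a heuristic: a mapped drive letter can still be a network drive.
    """
    normalized = path.replace("/", "\\")
    return not normalized.startswith("\\\\")


# In the ArcMap Python window you might run:
# import arcpy
# for ws in (arcpy.env.workspace, arcpy.env.scratchWorkspace):
#     if ws:
#         print(ws, "local" if is_local_path(ws) else "NETWORK")
```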
I have found that the limitation is doing updates on data that resides on a network, though for that few records it should not take that long.
There must be some underlying issue with your network, your data, or your application itself.
How large is your MXD? If it is large, try running MXD Doctor and/or the ArcGIS Document Defragmenter on it.
Have you compacted your data regularly?
In my experience, a thick client (as you call it) is the better method unless your PC is weak.
Thank you everyone for your helpful input. I think the problem that motivated me to write this is a rather specific one and I think I found it.
We all find our systems slower than we'd like, but for a period of time things were so slow that it was nearly impossible to get work done: select a folder, wait five minutes; select a file, wait five minutes; calculate a few records, wait five minutes... with a deadline looming. Very stressful.
I was working with event themes based on text files. I converted them to fgdb feature classes. After struggling for days, I converted them to SDE layers and got an error message with two of the files. The error message was cryptic (surprise!), but it was my first hint that the data might be the culprit.
It turns out each of these files had a field with a length value in the thousands. The field would not show in the attribute table but was listed in the field list, where I saw the length value. I forget the exact number, but it was ridiculously high. I added new fields, calculated the values over, and dropped the bad fields.
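For anyone hitting the same thing: the check for absurd field lengths can be automated. In arcpy you would inspect `arcpy.ListFields(fc)` (each field object exposes `.name`, `.type`, and `.length`); the sketch below uses plain dicts in place of arcpy field objects so it runs outside ArcGIS, and the 255 threshold is my own assumption, not anything from the thread:

```python
def find_oversized_fields(fields, max_length=255):
    """Flag text fields whose declared length exceeds a sane threshold.

    `fields` is a list of dicts standing in for arcpy field descriptors
    (arcpy Field objects carry .name, .type, and .length attributes);
    dicts are used here only so the sketch runs without ArcGIS installed.
    """
    return [
        f["name"]
        for f in fields
        if f["type"] == "String" and f["length"] > max_length
    ]


# With arcpy available, the equivalent check would look roughly like:
# import arcpy
# bad = [f.name for f in arcpy.ListFields(fc)
#        if f.type == "String" and f.length > 255]
```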
I am now back to the world of slower-than-I'd-like-but-I-can-live-with-it.