Geodatabase extremely slow

02-24-2014 06:26 AM
TracyDash
New Contributor III
Hi,

I have a huge and growing geodatabase (10,000 fields now) and it's connected to a basic entry and search form for my coworkers to enter data with. Both have gotten extremely slow lately and I'm not sure what to do. I've compacted the database and deleted excess data. Right now it takes a good three or four minutes to simply OPEN. Submitting information also takes a very long time. Any ideas as to how to make it faster??

Thanks!
10 Replies
GISDev1
Occasional Contributor III
We need more details. What geometries? Polygons, lines, points? What kind of extents? Where is the File GDB located: a network drive or a local drive? If it's a network drive, what kind of network speed do you have? Are you accessing the File GDB via the API at all, or just in ArcMap/Catalog? Did you notice when the slowness started, or did it just appear randomly one day?
Have you tried copy-pasting all of the feature classes into a brand-new File GDB? What version is the File GDB, and what version are the clients? If you have multiple people editing, the Esri documentation states that should really be done in SDE.
TracyDash
New Contributor III
Sorry, should have been more specific.

-FGDB stored on a fast server and accessed by a dozen people
-two point feature classes and five tables (one point class is the extremely large one)
-accessing the FGDB via the API and ArcGIS Explorer (although very few have Explorer); I'm the only one with ArcMap/Catalog
-slowness comes and goes: the program might be fine for a few weeks and then be hit with sudden slowness. However, it was slow two days last week and again today

I've tried copying the actual GDB and using that one, but it made no difference. Would copying feature classes into a new GDB yield a different result?

The FGDB doesn't need to be upgraded (ArcMap is 10.1).
JacobBoyle
Occasional Contributor III
Given how Arc keeps track of changes (mostly adds and deletes), are you running a regular compact on the database from ArcCatalog?

Compacting the database regularly cleans up the underlying system tables and should vastly improve performance.
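A regular compact can be scheduled rather than left to memory. Below is a minimal sketch, assuming a weekly interval and a hypothetical geodatabase path; the commented-out `arcpy.Compact_management` call is the standard compact tool, but it only runs on a machine with ArcGIS installed, so the schedule logic here is plain Python:

```python
from datetime import datetime, timedelta

def compact_due(last_compact, now, interval_days=7):
    """Return True when the last compact is older than the interval.

    interval_days=7 is an assumption; tune it to how heavily
    the geodatabase is edited.
    """
    return (now - last_compact) >= timedelta(days=interval_days)

# Hypothetical use in a scheduled task (path and helper are invented):
#
# import arcpy
# if compact_due(read_last_compact_date(), datetime.now()):
#     arcpy.Compact_management(r"\\server\share\survey.gdb")
```

Run weekly from Windows Task Scheduler (or similar), this keeps the compact from depending on someone remembering to do it.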
TracyDash
New Contributor III
Jacob,

Yes, I compact when I remember every few weeks. Just did it this morning. How often do you recommend compacting a large gdb?
GISDev1
Occasional Contributor III
So, it's a Network File GDB edited by a dozen people? I'd say the best answer is to switch to SDE. I bet it would make your life much easier as the admin.
TimHayes
Occasional Contributor III
A File GDB used in this way is likely the cause of the slowness. I would switch to a SQL Server enterprise geodatabase (SQL Server Express 2008 R2 and ArcGIS for Server 10.2.1 Enterprise Basic or Standard) with versioning; maybe each user could have their own version.

10,000 fields? Have you considered separating these fields out into different tables and relating them to each other? This will be easier to manage, and from it you can generate views (as long as you are using an enterprise geodatabase in SQL Server).

SQL Server Express 2008 R2 is a free download.
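To illustrate the split Tim describes, here is a minimal sketch in plain Python, with invented field names (`job_id`, `client`, etc.): one wide record per job is broken into a core row plus a related row, both keeping a shared key so they can be joined back together:

```python
def split_record(record, core_fields, key="job_id"):
    """Split one wide record into a core row plus a related row.

    Both rows keep the shared key so they can be joined back together
    (in a geodatabase this would be a relationship class or a view).
    """
    core = {k: record[k] for k in core_fields}
    related = {k: v for k, v in record.items() if k not in core_fields}
    related[key] = record[key]
    return core, related

# Hypothetical wide record from a survey-jobs table:
job = {"job_id": 101, "client": "Acme Surveying", "date": "2013-05-02",
       "crew": "B", "instrument": "TS-15"}

core, related = split_record(job, core_fields=("job_id", "client", "date"))
# core holds the identifying columns; related holds the rest plus the key.
```

The same idea scales to several related tables, each carrying the key column, which is exactly what a database view would stitch back together.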
TracyDash
New Contributor III
I have thought about separating them into different tables...just not sure how to go about that yet.

But GIS Dev and Tim, thanks! All good advice. I'll definitely look into it tomorrow.
MarcoBoeringa
MVP Regular Contributor
Hi Tracy,

I cannot even begin to fathom what enterprise data model would require 10,000 fields to track attributes in just a handful of tables... Are you sure "fields" shouldn't be "records"? How would you even handle, and make sense of, 10,000 attributes if you had to base some sort of decision or calculation on them?

Can you please give some hints as to the kind of data these "tables" and "fields" contain?

Shouldn't you be pivoting the table, making records out of fields, to get a sensible data model out of this?
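The pivot Marco suggests can be sketched in plain Python (field names are invented for the example): each non-key column of a wide row becomes its own (key, attribute, value) record, so new measurements add rows rather than fields:

```python
def pivot_to_records(row, key="job_id"):
    """Turn one wide row into long-format records.

    Each non-key field becomes a (key, attribute, value) tuple,
    so the schema stays fixed no matter how many attributes a job has.
    """
    return [(row[key], field, value)
            for field, value in row.items() if field != key]

# Hypothetical wide row with two attribute columns:
wide = {"job_id": 101, "area_acres": 12.5, "benchmarks": 4}
records = pivot_to_records(wide)
```

A long table like this stores one attribute per row, which is how you avoid a schema that keeps growing sideways.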

Marco
TracyDash
New Contributor III
Marco

I meant records. Sorry.

Digitizing every job a surveying company has done for the last two decades. You have no idea how much data goes into that.