
Database tables and their attributes in ArcGIS Server 10.3+

Question asked by janweststeyn on Feb 8, 2016

Hi everyone,

 

I have a dilemma that I hope folks will be able to advise on.

 

The organisation I work for delivers datasets to clients for loading into their SDE environments. Historically we have relied upon the command line loading tools (sdeimport) to provide a data loading workflow (i.e. we supplied standardised scripts along with our data). We became aware some time ago that the command line SDE binaries are no longer bundled with newer versions of ArcGIS. Here is my first question - we assumed that the 'preferred' mechanism for loading data into SDE is now via ArcCatalog (either via Copy/Paste or using Export/Import to Geodatabase). Is this true, or is our assumption wrong?

 

We were already delivering the data to our clients in file geodatabase format, and it seemed sensible to increasingly move over to this as a delivery platform. That way we would no longer have platform-dependent scripts, because ArcCatalog can load data out of the FGDB into SDE whether it is based on Oracle, SQL Server, etc. This would kill two birds with one stone: we would move away from needing the command line tools, and we would also be able to support a wider array of databases without maintaining individual import scripts.

 

The trouble is that file geodatabases do not store the more nitty-gritty database attributes such as keys, indexes and, in particular, field attributes like precision and scale. We've had clients use this method to import their data into Microsoft SQL Server based SDE, only to find that all their numeric fields were assigned a default precision/scale of 38/8. I understand that FGDBs have no mechanism for storing this, but I wondered whether anyone else had come across this issue and had ideas on how to overcome it. We can't have our numbers appear with 8 decimal places; but we're also talking about a very large number of tables and columns, so maintaining a script to fix this post-import would also be a little painful.
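For what it's worth, one way to make the post-import fix less painful is to keep a single field-spec file alongside the delivery and generate the SQL Server ALTER COLUMN statements from it, rather than hand-maintaining per-table scripts. A minimal sketch in plain Python - the table and column names below are hypothetical placeholders, not from any real delivery:

```python
# Sketch: generate SQL Server "ALTER TABLE ... ALTER COLUMN" statements
# from a simple field-spec mapping, to correct numeric precision/scale
# after an ArcCatalog import from a file geodatabase has defaulted
# everything to 38/8. All names here are hypothetical examples.

FIELD_SPECS = {
    "PARCELS": {"AREA_HA": (10, 2), "FRONTAGE_M": (7, 1)},
    "ROADS":   {"LENGTH_KM": (8, 3)},
}

def alter_statements(specs):
    """Yield one ALTER COLUMN statement per (table, column) pair."""
    for table, columns in sorted(specs.items()):
        for column, (precision, scale) in sorted(columns.items()):
            yield (f"ALTER TABLE {table} "
                   f"ALTER COLUMN {column} numeric({precision},{scale});")

if __name__ == "__main__":
    for stmt in alter_statements(FIELD_SPECS):
        print(stmt)
```

The spec dictionary could just as easily be loaded from a CSV shipped with the data, so one generator script covers every table in the delivery.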

 

Right now it seems that, moving forward, deliveries destined to be loaded back into an enterprise geodatabase will need to be made in a format that does support such features (SQL Server Express? SQLite perhaps?). Unless anyone is aware of tools or features that allow the underlying database to automatically 'choose' field definitions based upon the underlying data?
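On the SQLite idea: a plain SQLite database does at least retain the declared column type string (including precision and scale) in its schema, even though SQLite itself does not enforce it, so a delivery built on it would carry that metadata for the loading tool to honour. A quick sketch with the stdlib sqlite3 module - the table and column names are hypothetical:

```python
import sqlite3

# Sketch: SQLite preserves the declared type string verbatim in its
# schema (it uses type affinity, so precision/scale are not enforced,
# but they remain readable via PRAGMA table_info). Names are examples.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parcels (area_ha NUMERIC(10,2), name TEXT)")

# PRAGMA table_info rows: (cid, name, type, notnull, dflt_value, pk)
declared = {row[1]: row[2] for row in conn.execute("PRAGMA table_info(parcels)")}
print(declared["area_ha"])  # the declared type survives as written
conn.close()
```

Whether ArcCatalog (or whatever import path the client uses) actually reads and applies that declared type is the open question, of course.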

 

Advice, comments and criticism welcome,


Thanks
