Hi,
I'm using a SQL Server database as our primary enterprise data repository and have created a feature class in it. I'm looking for recommendations on the most efficient ways to insert/update data in this table. Specifically, I'm looking to integrate data from the following formats: 1) CSV files, 2) shapefile datasets, and 3) file geodatabases. Additionally, if there are alternative methods to acquire and update this data, I'd appreciate suggestions on those as well.
Thanks & Regards,
Abhishek Kumar Choudhary
This is a really broad question. The "most efficient" method depends on the nature of the data, how the data is managed, the nature of the updates, and your experience and skill with ETL solutions. The source formats are somewhat trivial; it's what you need to do with them that matters.
Some folks have no problems taking 10 minutes to update 100k rows via a Portal API, while others are updating 100 million rows every 8 hours.
Once you add in the different ways to acquire data, there are too many options to offer anything beyond generic advice like "As efficiently as possible."
If you want specific advice, you'll need to provide more details on the process you are trying to optimize.
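That said, for the CSV case there is a common generic pattern: stage the rows into a plain SQL Server table with an ODBC driver, then load/join them into the feature class afterwards. Below is a minimal sketch of the staging step. The table name `dbo.staging_points`, the column names, and the `load_csv` helper are all hypothetical; only the SQL-building part runs standalone, and the actual database call (via pyodbc, shown in comments) is left to you.

```python
import csv
import io

def build_insert(table, columns):
    """Build a parameterized INSERT using ODBC-style '?' placeholders,
    as expected by pyodbc's execute/executemany."""
    col_list = ", ".join(columns)
    placeholders = ", ".join("?" for _ in columns)
    return f"INSERT INTO {table} ({col_list}) VALUES ({placeholders})"

def load_csv(csv_file, table):
    """Read a CSV (first row = header) and return the INSERT statement
    plus the row tuples, ready to hand to cursor.executemany()."""
    reader = csv.reader(csv_file)
    columns = next(reader)          # header row drives the column list
    rows = [tuple(row) for row in reader]
    return build_insert(table, columns), rows

# Example with an in-memory CSV; in practice you'd open the real file
# and then execute against SQL Server, e.g.:
#   cursor.fast_executemany = True   # large speedup for bulk inserts
#   cursor.executemany(sql, rows)
sample = io.StringIO("id,name\n1,alpha\n2,beta\n")
sql, rows = load_csv(sample, "dbo.staging_points")
```

Shapefiles and file geodatabases are a different story, since the geometry has to survive the trip; those are usually loaded with geoprocessing tools rather than raw SQL.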
- V
Question: "I'm looking to integrate data from the following formats: 1) CSV files, 2) shapefile datasets, and 3) file geodatabases. Additionally, if there are alternative methods to acquire and update this data, I'd appreciate suggestions on those as well."
Answer:
If you are a SQL Server administrator and an ArcSDE geodatabase administrator, then read my white paper:
How to Load a Large Featureclass into a SQL Server Geodatabase (community.esri.com)
For more best practices, visit Mapping and Charting Solutions (MCS) Enterprise Databases Best Practices.