SDE to file geodatabase export is missing records in the file geodatabase

02-18-2021 02:21 PM
Ravichandran_M_Kaushika
Occasional Contributor

Readers,

Good afternoon.  I was assigned a task to 'fix' problems with missing record counts from previous years' reports.  The last time a nationwide land use report was run, during the Oct 2019-Jan 2020 time frame, all was good.  Similarly, the Oct 2018-Jan 2019 records were intact.

The only thing that changed that might matter: in August 2020, many servers were upgraded from ArcGIS 10.5.1 to 10.7.1 (dev stack and prod stack, separately).

In Oct/November 2020, when the same programs were run, the record counts were at 40% of the expected values.

Here are the facts:

  1. The SQL Server (with shape geometry) national table has 21.1 million records - as expected.
  2. We split it state-wise using the STIntersects() function and got many tables that added up to 21.1 million records.
  3. When we went to deliver those tables to the customers as a file geodatabase, we did an export from ArcCatalog --> Right Click --> Export --> To Geodatabase (single) and To Geodatabase (multiple), and we lost many records.  It made no difference whether we used the single or multiple option (a count-comparison sketch follows this list).
  4. We ran a shape.STIsValid() = 0 query to see which records were failing - no records appeared to be invalid in SQL Server.
  5. We used both ArcGIS Pro and ArcCatalog to export and found no difference in the missing record counts (we were exploring 32-bit vs. 64-bit issues, but that does not seem to be the cause).
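
For reference, here is a minimal arcpy sketch of the count comparison mentioned in point 3; the connection file, feature class, and gdb paths below are placeholders, not our real names.

import arcpy

# Placeholder paths - substitute the real .sde connection file and exported file gdb
sde_fc  = r"C:\connections\prod.sde\LandUse.DBO.Iowa"
fgdb_fc = r"C:\Temp\missingRecords.gdb\Iowa"

# GetCount returns a Result object; getOutput(0) is the count as a string
src_count = int(arcpy.management.GetCount(sde_fc).getOutput(0))
out_count = int(arcpy.management.GetCount(fgdb_fc).getOutput(0))

print("source:", src_count, "exported:", out_count, "missing:", src_count - out_count)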

Please see the attached spreadsheet, which details the differences.

Thank you for your support.

 

regards

ravi kaushika

 

5 Replies
MichaelVolz
Esteemed Contributor

Were you exporting to a file gdb on the local drive or a network share?

Ravichandran_M_Kaushika
Occasional Contributor

@MichaelVolz ,

Good morning, and thank you for the suggestion.  I found that exporting to the E:\ drive attached to the machine versus exporting to the \\ network share made no difference in the count.  To confirm it further, I exported the features to a C:\Temp\missingRecords.gdb\Iowa feature class.

The record counts were the same...

MichaelVolz
Esteemed Contributor

That did make a difference in my environment for scripts that append features to a file gdb.  Up through ArcMap 10.5.x it always worked on a network share, but when my org upgraded to ArcMap 10.7.x we sometimes lost records when appending to a file share, so the script was modified to run locally, which solved the problem.  I'm sorry this solution did not work for you.

VinceAngelo
Esri Esteemed Contributor

Getting folks to download a Microsoft file format with security challenges to review your data could result in low participation. Providing an ASCII table in a code block would get more viewers.

Note that Esri and Microsoft (and the other RDBMS vendors) have different definitions of valid geometry (at times), so using Esri tools like Check Geometry might be required to confirm data integrity.
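
A minimal arcpy sketch of that Check Geometry step (the feature class and output table paths here are placeholders):

import arcpy

# Placeholder paths - point these at the exported file gdb feature class
fc        = r"C:\Temp\missingRecords.gdb\Iowa"
out_table = r"C:\Temp\missingRecords.gdb\Iowa_geometry_errors"

# Check Geometry writes one row per problem it finds, using Esri's validity
# rules, which are not identical to SQL Server's STIsValid()
arcpy.management.CheckGeometry(fc, out_table)

print(arcpy.management.GetCount(out_table).getOutput(0), "geometry problems reported")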

I transfer tens to hundreds of millions of records between FGDB and PostgreSQL quite frequently without data loss, so I wonder if there's something specific about the data that you are using.  Have you tried to identify common properties in the dropped data? (One way would be to pass the preserved unique key values back into the database, then create a LEFT OUTER JOIN where the returned keys have a NULL value.)
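
If pushing the keys back into SQL Server is awkward, an in-memory equivalent of that LEFT OUTER JOIN idea using arcpy cursors looks roughly like this (paths and the key field name are placeholders; this is fine for a single state table, heavier for the full 21 million rows):

import arcpy

# Placeholder paths and key field - substitute the real unique key column
src_fc  = r"C:\connections\prod.sde\LandUse.DBO.Iowa"
out_fc  = r"C:\Temp\missingRecords.gdb\Iowa"
key_fld = "PARCEL_ID"

# Keys that survived the export...
kept = {row[0] for row in arcpy.da.SearchCursor(out_fc, [key_fld])}

# ...then walk the source and keep anything that did not make it across,
# the cursor equivalent of LEFT OUTER JOIN ... WHERE joined key IS NULL
dropped = [row[0] for row in arcpy.da.SearchCursor(src_fc, [key_fld])
           if row[0] not in kept]

print(len(dropped), "dropped records; inspect these keys for common properties")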

-- V 

 

Ravichandran_M_Kaushika
Occasional Contributor

Thank you for your response.  I will work on your ideas and let you know.

regards.
