Scripting a sync process from fabric to traditional parcels feature class

01-31-2022 09:22 AM
by Anonymous User
Not applicable

I'm working on a script to synchronize between our recently implemented parcel fabric and our legacy parcels feature class currently residing in a versioned feature dataset.

I'm curious if others are doing something similar and whether you'd be willing to share your experience. The first part of my test is to truncate the parcels layer, and I'm running into error 999999: "A class in the topology necessitates validation being run within an edit session."

I created a new instance of da.Editor, specified the connection, and attempted to start an edit operation, but I'm still getting the message "A class in the topology necessitates validation being run within an edit session."
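For what it's worth, here is a minimal sketch of wrapping the deletes in an edit session with arcpy.da.Editor. One thing to watch: TruncateTable doesn't honor edit sessions at all and fails on versioned data or topology participants, so on a class like this you may need row-by-row deletes inside the session instead. The workspace and feature class names are placeholders, and arcpy is imported inside the function because it's only available in an ArcGIS Pro/Server Python environment:

```python
def clear_legacy_parcels(workspace, parcels_fc):
    """Delete all rows from a versioned feature class inside an edit
    session. TruncateTable can't be used here: it bypasses edit
    sessions and fails on versioned/topology feature classes."""
    import arcpy  # only available in an ArcGIS Pro/Server Python env
    edit = arcpy.da.Editor(workspace)
    edit.startEditing(True, True)  # with_undo, multiuser_mode (versioned)
    edit.startOperation()
    try:
        with arcpy.da.UpdateCursor(parcels_fc, ["OID@"]) as rows:
            for _ in rows:
                rows.deleteRow()
        edit.stopOperation()
        edit.stopEditing(True)   # save edits
    except Exception:
        edit.abortOperation()
        edit.stopEditing(False)  # discard edits on failure
        raise
```

As the replies below suggest, moving the legacy copy outside the feature dataset and unversioning it avoids the topology/edit-session problem entirely.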

Any feedback is appreciated.

9 Replies
ThomasKonzel
Occasional Contributor

Good afternoon @Anonymous User,

We're doing something similar by backfilling our traditional parcel layers and annotation layers with the new Parcel Fabric data for legacy ArcMap and other systems. This is just about a mandatory step if you want to continue to access the data from ArcMap via an enterprise geodatabase. However, we first unversioned our data to remove these complexities; because the legacy data is now read-only, it no longer needed to be versioned. Topology and versioning are now handled in the Parcel Fabric dataset.

I hope this helps.

@ThomasKonzel 

by Anonymous User
Not applicable

That's helpful. Did you do the scripting in Python to export from Fabric into Enterprise GDB?

ThomasKonzel
Occasional Contributor

Yes, we have a scheduled task that runs a Python script nightly to backfill a stand-alone feature class with the same name as the original, versioned feature class. This way, all the legacy apps that used the versioned feature class in the past can still connect to the new, stand-alone feature class outside the feature dataset. It's a simple process. After reviewing the script, I see we don't use the last_edited_date for updates as @jcarlson does - we push the entire dataset nightly.
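The nightly full-push pattern described here boils down to a truncate-and-append, which is only legal because the target is an unversioned, stand-alone feature class. A rough sketch (feature class paths are placeholders):

```python
def nightly_backfill(fabric_parcels, legacy_fc):
    """Full refresh: empty the stand-alone legacy feature class, then
    append the current Parcel Fabric parcels. TruncateTable is only
    valid because legacy_fc is unversioned and sits outside any
    feature dataset (so no topology is involved)."""
    import arcpy  # ArcGIS Pro/Server Python environment only
    arcpy.management.TruncateTable(legacy_fc)
    # "TEST" requires matching schemas; use "NO_TEST" plus a field
    # mapping if the fabric and legacy schemas differ.
    arcpy.management.Append(fabric_parcels, legacy_fc, "TEST")
```

Scheduled via Windows Task Scheduler (or cron), this runs unattended against the Pro Python environment.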

jcarlson
MVP Esteemed Contributor

Are the legacy parcels and the new fabric available as published services? We do a similar sort of process between our parcel fabric and a simple polygon layer, but prefer to use the ArcGIS Python API to do so.

Due to the sheer quantity of parcels, we also use the last_edited_date field to identify parcels edited since the previous run of the script, and selectively add / edit / delete features as needed, rather than a full truncate/append process.
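A condensed sketch of that incremental pattern with the ArcGIS API for Python. The layer URLs are placeholders, and pushing the changed features straight in as updates is the simplest case; real code would also match adds and deletes on a shared parcel ID rather than assuming object IDs line up:

```python
from datetime import datetime

def incremental_where(last_run_utc: datetime) -> str:
    """Where clause selecting features edited since the last run,
    using the editor-tracking field last_edited_date."""
    return f"last_edited_date > timestamp '{last_run_utc:%Y-%m-%d %H:%M:%S}'"

def sync_changed_parcels(fabric_layer_url, target_layer_url, last_run_utc):
    from arcgis.features import FeatureLayer  # ArcGIS API for Python
    src = FeatureLayer(fabric_layer_url)
    tgt = FeatureLayer(target_layer_url)
    changed = src.query(where=incremental_where(last_run_utc))
    if changed.features:
        # Simplest case: features already exist in the target; real
        # code would branch into adds / updates / deletes as needed.
        tgt.edit_features(updates=changed.features)
```

Persisting the timestamp of each successful run (in a file or table) is what makes the `last_edited_date` filter pick up only the parcels edited since then.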

- Josh Carlson
Kendall County GIS
by Anonymous User
Not applicable

I figured I'd test the process in ModelBuilder, and here's what I get when I do:

Error 160034: A class in the topology necessitates validation being run within an edit session.

In the database I have a mix of the feature datasets that are part of the parcel fabric and the feature dataset copied and pasted from SDE. Everything in the Land Records Feature Dataset is from the Legacy Parcels Feature Dataset.

I don't have another test instance to test the process in so I figured this is the closest I can get for now.

Is this an OK way to test the process?

ThomasKonzel
Occasional Contributor

Keary,

  1. I believe you should first prepare your legacy target feature class by copying parcel_poly to a new name like parcel_poly_test, which is OUTSIDE of a feature dataset;
  2. Then ensure versioning is disabled and no topology is associated with this new, temp feature class;
  3. Point your backfill script to this new, temp feature class;
  4. Keep in mind that, unless your PF and temp feature class schemas are identical, you may have to map fields from the PF feature class to your new, temp feature class.
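Steps 1 and 2 above can be scripted in a few lines. A sketch, with names assumed; `ExportFeatures` is the Pro 3.x replacement for the older FeatureClassToFeatureClass tool:

```python
def make_test_target(gdb, source_fc, out_name="parcel_poly_test"):
    """Copy the legacy parcels to a stand-alone feature class at the
    geodatabase root, outside any feature dataset, so the copy has no
    topology attached and can be left unversioned."""
    import arcpy  # ArcGIS Pro/Server Python environment only
    out_fc = f"{gdb}\\{out_name}"
    arcpy.conversion.ExportFeatures(source_fc, out_fc)
    return out_fc
```

A copy made this way inherits the source schema, so step 4's field mapping is only needed if the fabric schema differs from the legacy one.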
ThomasKonzel
Occasional Contributor

Good afternoon @Anonymous User 

I didn't feel like my previous list was sufficient so I updated it here.  This may be a better set of steps to follow.  I'd appreciate if @jcarlson could also review and provide additional feedback.  My steps may not be complete.

  1. Use ArcGIS Pro to export your Pro Parcel Fabric Tax_Parcel feature class to an XML Workspace Document (Schema Only)
    • This will create an empty XML document containing only the schema
  2. Use ArcCatalog to import this XML Workspace Document schema into your test geodatabase as a new feature class OUTSIDE a feature dataset
    • Name this new feature class the same as the original but with _test appended to the end, like Tax_Parcel_test
    • Ensure there is no versioning or topology associated with this new, test feature class
    • This should yield an ArcMap-ready feature class to receive the Pro Parcel Fabric features
  3. Point your script at this new, test feature class instead of your actual Parcel Fabric Tax_Parcel feature class within the feature dataset.
    • Keep in mind that you may need to map fields if you’re not using an identical copy of the schema.
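Steps 1 and 2 above could also be driven from a single Pro script rather than switching between Pro and ArcCatalog. A sketch, with paths assumed; renaming the imported class to Tax_Parcel_test would still be a separate step (e.g. arcpy.management.Rename):

```python
def build_test_schema(fabric_fc, test_gdb, xml_path):
    """Export the Parcel Fabric feature class schema to an XML
    Workspace Document, then import that schema into the test
    geodatabase as an empty feature class with no data, versioning,
    or topology attached."""
    import arcpy  # ArcGIS Pro/Server Python environment only
    arcpy.management.ExportXMLWorkspaceDocument(
        fabric_fc, xml_path, "SCHEMA_ONLY")
    arcpy.management.ImportXMLWorkspaceDocument(
        test_gdb, xml_path, "SCHEMA_ONLY")
```

Because the XML document carries the exact fabric schema, a script appending into the imported class can usually run with schema_type "TEST" and no field mapping.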

If I’m not mistaken, my team is using the Pro Python environment (arcpy) because ArcMap/Desktop doesn't have access to the Pro Parcel Fabric.

jcarlson
MVP Esteemed Contributor

It sounds like it'd work, but in my org we don't use ArcMap, and almost never use arcpy in our automated scripts, as everything is in published feature services. I don't know that I have much feedback for this particular workflow, but utilizing the XML schema to ensure the fields match is a great step to include in the process.

- Josh Carlson
Kendall County GIS
KenGalliher1
Esri Contributor

Another suggestion would be to join the Pro and ArcMap polygon tables and create a database view.  This view can be filtered by dates, parcel types, branch versions, etc.  The feature classes remain editable which essentially updates the view. 
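A sketch of that view-based approach using arcpy's CreateDatabaseView tool; the SQL here, including the table names, join key, and columns, is entirely hypothetical and would need to match your own geodatabase:

```python
def create_parcel_sync_view(sde_connection, view_name="vw_parcel_sync"):
    """Create a database view joining the fabric and legacy parcel
    tables. Because it's a view, edits to either feature class show
    up immediately, and the SQL can filter by date, parcel type,
    and so on."""
    import arcpy  # ArcGIS Pro/Server Python environment only
    view_definition = (
        "SELECT f.objectid, f.name AS parcel_name, f.last_edited_date, "
        "l.objectid AS legacy_oid "
        "FROM dbo.tax_parcel f "
        "LEFT JOIN dbo.parcel_poly l ON l.parcel_id = f.name"
    )
    arcpy.management.CreateDatabaseView(
        sde_connection, view_name, view_definition)
```

Querying branch-versioned tables directly in SQL has its own subtleties (you generally query the versioned view, not the base table), which the linked thread below covers.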

Here is some more info on querying branch versioned feature classes and creating views.

Query Branch Versioned Parcels - Esri Community