BLOG
A colleague brought me this problem: a utility customer with large, versioned enterprise geodatabases wished to maintain what amounts to replicas synchronized daily (overnight). Geodatabase replication was not feasible (I took his word for this, something to do with geometric networks, but in any event if the target is a feature service that is definitely the case). The target was to be in Web Mercator, not the source low-distortion coordinate system.

Normally I relish every opportunity to pull a ChangeDetector transformer out of my hat, as it is a very flexible, fast way to derive INSERT, UPDATE and DELETE change sets that are then efficient to write. The problem in this case, though, was data scale: reading the data into my ETL workspace would take hours. (Please don't take this as a general statement about Data Interoperability ETL workspaces; it's just that enterprise geodatabases are busy things, reading very large batches of data can take time, and this database had over 10 million features.) Time to visit a little-known feature of Data Interoperability's enterprise geodatabase reader (short name GEODATABASE_SDE): reading version differences.

Here is how to use it. You're going to need a version that is a direct child of Default that you never edit; it doesn't matter what other versions you have, but only edits posted to Default will propagate to the mirror. I call my child version 'Deltas'. Your initial target system (geodatabase, database or feature service) must be a copy of Default. Your daily workflow is to get your edits into Default, then compare version differences with Deltas. If you think about it, after a daily edit post, Deltas is a view of the database one day before Default. That means it's 'older', or an 'ancestor' of sorts (it made my head spin too at first that a child is a logical ancestor, but bear with me).
When adding the geodatabase reader to your ETL workspace, set the Read Version Differences property and use a connection to Deltas as the transactional common ancestor. I know, it's weird, but it works. When the reader executes, the features returned will be in the context of the edits needed to make Deltas look like Default, and each will have a format attribute fme_db_operation set to INSERT, UPDATE or DELETE. Now all you have to do is apply the differences.

I'll walk you through the sample workspace pictured below. The reader uses the option to merge all selected feature types using a * wildcard. This lets you use a single reader for the entire set of versioned feature classes (did I mention how powerful this option is?). There is always a format attribute fme_feature_type available to let you see which source feature class anything came from. The AttributeExposer lets me access fme_db_operation and an attribute FACILITYID, a unique key within each feature type that lets me support update operations. If you don't have such a key field, you will need to handle updates as delete/add pairs, which I haven't modeled here.

Inserts and updates can go directly to the target database or feature service, but deletes are a bit tricky. The FeatureReader reads one Deltas feature at a time with a SQL WHERE clause that selects by ObjectID, and you must also set the accumulation mode to Merge Initiator and Result to get the attributes from the deleted features onto the feature (they come through as null otherwise - they were deleted!). Then the data goes to the writer, which has FACILITYID set as the match column.

After your synchronization completes, the simplest way to be ready for the next day's run is to drop and recreate the Deltas version. You could automate this with a shutdown script like in this article. In the customer system, about 2000 edits across dozens of feature classes were 'posted' to the target in less than 10 minutes. A sample ETL workspace is in the blog download.
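The apply step can be sketched in plain Python. This is only a minimal illustration of the logic, not the Data Interoperability writer itself: the `mirror` dict stands in for the target system, and representing each change record as a dict carrying `fme_db_operation`, `FACILITYID` and an `attributes` payload is an assumption made for the sketch.

```python
# Minimal sketch: applying an INSERT/UPDATE/DELETE change set keyed by
# FACILITYID to a mirror. The record layout here is hypothetical; in the
# real workflow the writer applies these operations with FACILITYID as
# the match column.

def apply_changes(mirror, changes):
    """Apply change records to a mirror dict keyed by FACILITYID."""
    for feature in changes:
        op = feature["fme_db_operation"]
        key = feature["FACILITYID"]
        if op == "INSERT":
            mirror[key] = dict(feature["attributes"])
        elif op == "UPDATE":
            mirror[key].update(feature["attributes"])  # match on FACILITYID
        elif op == "DELETE":
            mirror.pop(key, None)  # attributes come through null; key is enough
    return mirror

# One day's worth of edits posted to Default, as seen via Deltas:
mirror = {"F1": {"status": "open"}, "F2": {"status": "open"}}
changes = [
    {"fme_db_operation": "UPDATE", "FACILITYID": "F1", "attributes": {"status": "closed"}},
    {"fme_db_operation": "DELETE", "FACILITYID": "F2", "attributes": {}},
    {"fme_db_operation": "INSERT", "FACILITYID": "F3", "attributes": {"status": "open"}},
]
apply_changes(mirror, changes)
# mirror now reflects Default: F1 updated, F2 deleted, F3 inserted
```

Note how DELETE only needs the key, which is why the FeatureReader trick above matters when you want the deleted feature's other attributes for logging or auditing.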
Comment here for any clarifications.
03-03-2021 09:25 AM

POST
Well, you can, but it would mean extracting data from Redshift to do it, not a direct read.
03-01-2021 08:11 AM

POST
The Data Interoperability extension supports Redshift now and will add Redshift Spatial (both read and write) at Pro 2.8.
03-01-2021 07:43 AM

POST
This tool wraps the Data Interoperability function in a Python script tool: https://pm.maps.arcgis.com/home/item.html?id=834e3ba8034e4e7f83d9fc4fcfb5713c
02-23-2021 08:28 AM

BLOG
Good to hear. By the way, your screenshot of 10.8.1 for Desktop above relates to the ArcMap install, but I could see the Pro install was OK because the Analysis ribbon commands are active.
02-10-2021 04:57 AM

BLOG
I replaced the .pth file with one that expects Python 3.7 in Pro 2.7. Please download the zip file again; your process should work.
02-08-2021 11:55 AM

BLOG
Hi, please check that you have the Data Interoperability extension installed and licensed; the import error looks like a missing install or license issue. Since your real goal is to detect changes in a feature service, it may be more efficient to use a Spatial ETL tool directly in a model. If you are able to share some sample data, I can show you how that would work.
02-08-2021 05:09 AM

POST
You should be OK then, but in case you get into any arguments with geodesists, make sure you record that the heights are AGL. If the CSV data is available electronically somehow (say FTP, HTTP, email, cloud storage, etc.), then you can automate download and processing end to end with Data Interoperability.
01-28-2021 09:05 AM

POST
I was thinking of things like visibility analysis. Constructing your 3D geometry with VertexCreator is fine; you just need 'ground' to visualize it. Using Z from the AGL values is OK. There are specialist solutions for radio propagation, which I don't know anything about, but Esri has a specialist if you need one.
01-28-2021 08:42 AM

POST
If you have AGL heights then you'll need a ground surface for any analysis.
01-28-2021 08:20 AM

POST
Adam, WGS84 elevations are defined in meters above/below the ellipsoid, so convert feet to meters; unless your Z values are measured from the center of the earth, in which case you'll additionally need to do a vertical datum transformation with the CsMapReprojector. A wider issue is using WGS84 at all: if you're doing anything involving visibility, density or area, a local projected coordinate system would be better.
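The simple unit-conversion case can be sketched in a few lines of Python. This assumes international feet (1 ft = 0.3048 m exactly); if the source is US survey feet the factor differs slightly, and a datum shift is a separate step entirely.

```python
# Hedged sketch: convert Z values from international feet to meters
# for use as WGS84 ellipsoidal heights. 1 international foot = 0.3048 m.
FT_TO_M = 0.3048

def feet_to_meters(z_ft):
    """Convert a height in international feet to meters."""
    return z_ft * FT_TO_M

heights_ft = [100.0, 250.0, -15.0]
heights_m = [feet_to_meters(z) for z in heights_ft]
```

Remember this only rescales the numbers; it does not change what surface the heights are measured from.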
01-28-2021 08:08 AM

POST
Adam, you can chain two NullAttributeMapper transformers to do this: in one, change 'Yes' to a new value of 1, and in the other, 'No' to 0. Somewhat unhelpfully, BulkAttributeMapper is an alias name for the transformer.
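For what the two chained transformers accomplish, here is the equivalent logic as plain Python. The record and field names are made up for the sketch; values other than 'Yes'/'No' (including nulls) pass through unchanged, which mirrors the mapper's behavior of only touching matched values.

```python
# Sketch of the Yes/No -> 1/0 remapping done by the two chained
# NullAttributeMapper steps (field name 'flag' is hypothetical).
YES_NO = {"Yes": 1, "No": 0}

def map_yes_no(records, field):
    """Replace 'Yes'/'No' in the given field with 1/0, leaving other values alone."""
    for rec in records:
        if rec.get(field) in YES_NO:
            rec[field] = YES_NO[rec[field]]
    return records

rows = [{"flag": "Yes"}, {"flag": "No"}, {"flag": None}]
map_yes_no(rows, "flag")
# rows -> [{"flag": 1}, {"flag": 0}, {"flag": None}]
```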
01-25-2021 05:46 AM

IDEA
Hi, you can go a long way with the Data Interoperability extension, including browsing cloud storage: https://community.esri.com/t5/arcgis-data-interoperability/interoperability-connections-in-arcgis-pro/ba-p/1008051
01-04-2021 11:41 AM

POST
No, that's us! If you create a support call with Esri, it will be officially handled.
12-10-2020 09:52 AM