POST
Another thing: try reducing the batch size being written from the default 1000 to 500 (or less if necessary). If your data is very point-rich, large batches might be the problem.

POST
Hello again, I see you're reading in State Plane and writing out Web Mercator. In this situation the data is supposed to be automatically reprojected, but it's possible something is not working at this step, so you could add an EsriReprojector (or plain Reprojector) to your workspace to see if that helps. I also see you're using OneDrive as a shared file location; we have not tested this and it's possible it could be an issue. If problems remain after importing the write definition and reprojecting your data, please open a support call.
3 hours ago

POST
Hi June, when adding a writer you have the option to import its definition from an existing dataset (outlined in red in the attached screenshot). I'll look at your log shortly.
4 hours ago

POST
Another possibility is that the wrong geometry type is in the data; for example, it is OK if the data has null geometry, but not if it has line or polygon geometry when you're trying to write to a point feature layer. Also, when reading and writing between different storage platforms it is always a good idea to define your writer by importing the schema definition from the source dataset. Both sides of your equation are Esri technology, so this will be reliable.
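If you want to check this before writing, a quick pass with core ArcPy can report both the declared geometry type and what each row actually holds, including null geometry. A minimal sketch, with a placeholder path for your own data:

import arcpy
from collections import Counter

fc = r"C:\data\source.gdb\candidate_points"   # placeholder input feature class

# The geometry type declared on the feature class.
print("Declared shape type:", arcpy.Describe(fc).shapeType)

# Count what each row actually contains, including null geometry.
counts = Counter()
with arcpy.da.SearchCursor(fc, ["SHAPE@"]) as cursor:
    for (shape,) in cursor:
        counts["null geometry" if shape is None else shape.type] += 1
print(dict(counts))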
7 hours ago

BLOG
Hello everyone. If you downloaded the blog attachment with workspace source files before 5:45 am PDT on Friday 3rd October, please download it again; I simplified the LAViewSourceSwap.fmw workspace to remove unnecessary parameters which were artifacts of development testing.
7 hours ago

BLOG
The answer is both for me, but you be the judge for your situation! Read on for the decision criteria...

If you want to maintain a large hosted feature service from external data, it is best practice to avoid complete overwrites at each refresh, for two reasons:

- Large write transactions can be fragile
- Large write transactions can have significant service downtime

To avoid both issues it is preferable to implement a CDC (change data capture) approach and write only deltas to the target feature service. This blog will describe two ways to do this:

- Write deltas directly to the target feature service
- Maintain a hosted feature layer view pointed alternately at two services: write delta transactions to the service that is not currently the source, then swap it in as the view source

In the usual situation where a period's delta is a small fraction of the data, a direct delta write might take several seconds, while for a view source swap the downtime can be milliseconds, but at twice the storage cost. We'll do a worked example so you can choose between the approaches, but either way you're the winner using CDC!

Here is my subject matter data, about a million street address points in Los Angeles, California, maintained daily:

[Image: Los Angeles Address Points]

The job is to calculate and apply the daily delta transaction (typically the low hundreds of features) with low downtime. While our candidate write modes (direct, view source swap) insulate the job's service downtime from the calculation time of the delta, it's always good to build in any optimizations you can. The city's open data site supports CSV download, and CSV is a performant format in spatial ETL tools, so that is half the delta calculation step. The other half is reading the current state of the feature service/view. Here is my optimization for feature service reading, in LAChangeDetection.fmw (in the blog download):

[Image: Direct Write After Change Detection]

While the Esri ArcGIS Connector package supplies a feature service reader, in the quest for speed I implemented reading the target service using multiple concurrent Query calls over HTTP. I found that the default maximum record count per call (2000) across 4 concurrent requests gave optimal performance, roughly double the packaged reader's rate. The ChangeDetector transformer calculates the delta in seconds once it has the data, then writing the delta takes 3-4 seconds for a typical daily changeset (if you inspect the workspace you'll see I instrumented it with Emailer transformers to call home with some timestamp information).

For people not satisfied with a few seconds of service downtime, implementing view source swap is only slightly more challenging; see LAViewSourceSwap.fmw in the blog download:

[Image: View Source Swapping]

You'll see logic in the workspace to toggle between "A" and "B" services for reading, writing and source swapping. For this reason changes are detected a little differently: the same public URL accessing the address data as CSV is read, but the delta is calculated against the hosted feature layer that is not the current source for the hosted feature layer view, and the delta is applied to that feature layer. Then the updated feature layer must be swapped into being the source for the feature layer view. How?
The answer requires some detective work, inspecting how ArcGIS natively handles a view source swap in item settings:

[Image: View Source Swap]

What you're looking at above is me manually doing a source swap with the browser developer tools active, filtered to record POST transactions in big request row view. As I clicked through the view source swap I could see the system uses two calls, deleteFromDefinition and addToDefinition. Even better, inspecting any POST call shows the JSON payload used in it, which is lucky because the REST API documentation is a bit challenging for a no-code person like me 😉. The deleteFromDefinition payload is trivial, but the addToDefinition JSON payload is huge. However, since I created my services with default settings I'm not looking to change, I cut the JSON down to the objects I thought worth keeping, plus of course the required pointer to the desired source. Here is the JSON:

{
"layers": [
{
"currentVersion": 11.5,
"id": 0,
"name": "LosAngelesAddresses",
"type": "Feature Layer",
"cacheMaxAge": 30,
"displayField": "Street_Name",
"description": "",
"copyrightText": "",
"defaultVisibility": true,
"adminLayerInfo": {
"viewLayerDefinition": {
"sourceServiceName": "@Value(_nextSourceName)",
"sourceLayerId": 0,
"sourceLayerFields": "*"
}
},
"geometryType": "esriGeometryPoint",
"objectIdField": "OBJECTID",
"uniqueIdField": {
"name": "OBJECTID",
"isSystemMaintained": true
},
"useStandardizedQueries": true,
"minScale": 0,
"maxScale": 0,
"extent": {
"xmin": -13210040.1828,
"ymin": 3989386.3054,
"xmax": -13153020.1132,
"ymax": 4073637.6182,
"spatialReference": {
"wkid": 102100,
"latestWkid": 3857
}
},
"spatialReference": {
"wkid": 102100,
"latestWkid": 3857
},
"globalIdField": "",
"maxRecordCount": 2000,
"standardMaxRecordCount": 32000,
"standardMaxRecordCountNoGeometry": 32000,
"tileMaxRecordCount": 8000,
"maxRecordCountFactor": 1,
"capabilities": "Query"
}
]
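If you're curious what those two calls look like outside the workspace, here is a minimal Python sketch of the same sequence. The view admin URL, token and service names are placeholders, and the addToDefinition payload here is trimmed further than the one above purely for illustration; the workspace performs these calls with HTTP transformers rather than Python:

import json
import requests

view_admin_url = ("https://services.arcgis.com/<orgId>/arcgis/rest/admin/"
                  "services/<ViewName>/FeatureServer")    # placeholder
token = "<valid ArcGIS Online token>"                     # placeholder
next_source = "LosAngelesAddressesB"                      # toggles with LosAngelesAddressesA

# 1. Remove the current layer from the view definition.
delete_payload = {"layers": [{"id": 0}]}
resp = requests.post(f"{view_admin_url}/deleteFromDefinition",
                     data={"deleteFromDefinition": json.dumps(delete_payload),
                           "f": "json", "token": token})
print(resp.json())   # inspect for an "error" key

# 2. Add the layer back, pointed at the other source service.
add_payload = {"layers": [{
    "id": 0,
    "name": "LosAngelesAddresses",
    "type": "Feature Layer",
    "adminLayerInfo": {
        "viewLayerDefinition": {
            "sourceServiceName": next_source,
            "sourceLayerId": 0,
            "sourceLayerFields": "*"}}}]}
resp = requests.post(f"{view_admin_url}/addToDefinition",
                     data={"addToDefinition": json.dumps(add_payload),
                           "f": "json", "token": token})
print(resp.json())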
In production I could edit the JSON to tweak things if desired, like extent or display field, but it's probably a better investment to get your layer design right up front. One key thing I learned about the payload is at line 15, where I inject a feature attribute into the JSON at runtime: sourceServiceName is the property that identifies the service being swapped in; there is no reference to its item ID or its service URL. In my case the source service name toggles between "LosAngelesAddressesA" and "LosAngelesAddressesB" on consecutive runs. If a delta transaction contains no edits then no source swap occurs.

So now we have squeezed as much downtime out of a feature service update as we can. It's your call whether the average period's delta transaction is big enough (many thousands of features?) to justify the extra storage cost of view source swap and its guaranteed minimum downtime. While I'm focusing here on downtime minimization, not run time for the whole job, if anyone is curious it takes 3-5 minutes to refresh the million points I'm dealing with; I'm guessing the variability comes from server load conditions at both the source and the target.

Acknowledgements: I was inspired to write this post by my Esri colleague @sashal, who first explored this workflow and to whom I'm grateful, and by some prior art in a related workflow where file geodatabases are republished; see the first presentation here.
yesterday

IDEA
Hi Brad, support is partial, depending on what you want to connect to: https://community.safe.com/ideas/sql-database-in-microsoft-fabric-entra-authentication-37811 We recommend you post to that thread, to which I'm subscribed. We can also help with direct communication with Safe Software.
Wednesday

BLOG
If you track the ArcGIS Pro Roadmap you'll see that OneDrive integration is coming. What's planned includes (but is not limited to) support for sharing file geodatabases. If you surf the ArcGIS Pro Ideas space for OneDrive you'll see a bunch more requests for OneDrive support. One simple workflow - read-only data sharing to OneDrive - is already supported by ArcGIS Data Interoperability. Well, technically, if something doesn't work you can't call Esri support, but support is coming, so let's be ready and try it out!

The workflow I'll highlight is sharing a snapshot of an ArcGIS Online hosted feature layer (with table sublayers having relationships with the parent layer) as a file geodatabase. This "backup" sharing scenario is easy because ArcGIS Online is set up to export file geodatabase format. If you add a format conversion step you might share mobile geodatabase, GeoParquet or anything you like to OneDrive - ArcGIS Data Interoperability does it all 😉. Of course, on the input side, the data you're pushing to OneDrive may not be coming from ArcGIS Online at all; it can be anything you can reach. Here is my spatial ETL workspace:

[Image: Share File Geodatabase to OneDrive]

I'll let you browse the workspace (in the blog download), but what it does is submit an Export job to ArcGIS Online, download the resultant zipped file geodatabase item, unzip, rename and then upload the geodatabase to a folder in OneDrive. The workspace cleans up the export item as well. The secret ingredient is the OneDriveConnector transformer, one of a family of web-centric connectors in ArcGIS Data Interoperability.

Note that when naming the geodatabase I include a timestamp down to second resolution in the name. Using a static name and an overwrite approach can result in duplicate file geodatabase component names, which can cause problems. Similarly, while you can delete a target geodatabase before uploading a new version, the synchronization behavior of OneDrive can create race conditions, so to be safe use a dynamic name.

After sharing to your desired audience (OneDrive can add a shortcut to Windows Explorer's "My Files"; this must be dragged into Pro's Folders leaf in the Catalog pane, you cannot browse for it), the experience they will enjoy is like this - geodatabase features and behavior in ArcGIS Pro:

[Image: File Geodatabase in OneDrive]

The OneDriveConnector requires a web service and connection to be made; here is the workflow I used. If you're using a business account for OneDrive it's likely you'll need IT department assistance, and in my case I had some adventures with the authorization URL prompt parameter, which needed to be "select_account". Have fun with OneDrive!
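If you prefer scripting, the same export, download, rename and clean-up sequence can be sketched with the ArcGIS API for Python. This is a simplified alternative, not what the workspace does: it copies the zipped geodatabase into a locally synced OneDrive folder instead of calling the OneDriveConnector, and the item ID and paths are placeholders:

import datetime
import shutil
from arcgis.gis import GIS

gis = GIS("home")                                                  # active Pro / Online sign-in
source_item = gis.content.get("<hosted feature layer item id>")    # placeholder

stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
export_name = f"LayerBackup_{stamp}"                               # dynamic name avoids collisions

# Submit the Export job in ArcGIS Online and download the zipped file geodatabase.
exported = source_item.export(export_name, "File Geodatabase", wait=True)
zip_path = exported.download(save_path=r"C:\Temp")

# Drop the zip into a folder OneDrive synchronizes (placeholder path).
shutil.copy(zip_path, r"C:\Users\me\OneDrive\SharedGDBs")

# Clean up the temporary export item in ArcGIS Online.
exported.delete()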
a week ago

BLOG
There are many ways to read data with the ArcGIS Data Interoperability extension. Convenience matters, and users also appreciate not having to make copies of data to get their work done. On a recent webinar we showcased the Esri ArcGIS Connector package and mentioned using ArcGIS Online as a web filesystem, but there was no time to demonstrate the feature, so here we go in this blog, bringing together the convenience of file storage in ArcGIS Online with simple browsing to those files in ETL tools, including the Quick Import system tool.

In an earlier post I was working with GeoParquet files, which are an example of data you might maintain in ArcGIS Online. The item type in Online is Parquet, but Data Interoperability can be told it's working with GeoParquet. Here is the result of bringing a GeoParquet file into my map in the Memory workspace. The mint green polygons are parcel subdivision edits modeled by the parquet files.

[Image: Browsing to Online for GeoParquet Data]

Over the map are dialogs relating to the Quick Import tool built into my Quick Import To Memory sample. When I navigate to my data I get the opportunity to filter by file type and pick the one I want:

[Image: Browse to GeoParquet items in Online]

Quick Import To Memory does what its name suggests and makes a memory feature class from GeoParquet. It really is that simple: when you go to pick a file for any ETL tool, and the file is stored in ArcGIS Online, you can use the option to browse the web with a web connection to ArcGIS Online, and your data will be downloaded for you in the background. You do need to create a web connection either beforehand or at run time, which will persist for re-use. Follow this workflow.

The Quick Import To Memory sample isn't guaranteed to work for every format, but for files storing one logical object of simple feature type it should work. For the usual case of spatial ETL tools authored with readers, you can handle any format complexities when using ArcGIS Online as a web filesystem. Now you have another way to leverage ArcGIS Online!
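For anyone who wants to see the "web filesystem" idea in plain Python rather than through the browse dialogs, here is a minimal sketch using the ArcGIS API for Python and GeoPandas. The item ID and save path are placeholders; in the ETL tools this download happens for you in the background:

import geopandas as gpd
from arcgis.gis import GIS

gis = GIS("home")
item = gis.content.get("<parquet item id>")        # item type in Online is Parquet
local_path = item.download(save_path=r"C:\Temp")   # returns the downloaded file path

gdf = gpd.read_parquet(local_path)                 # read it as GeoParquet
print(len(gdf), "features,", gdf.geometry.geom_type.unique())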
2 weeks ago

POST
Data integration between external data sources and ArcGIS Online and Enterprise portals is comprehensively supported by ArcGIS Data Interoperability, both natively in the software and on a continuous-release basis via FME Hub packages. Esri and Safe Software collaborated in this webinar to spread the word about new and changed functionality in the Esri ArcGIS Connector hub package. Try it out for yourself and see why it's getting so many downloads!
2 weeks ago

BLOG
Business, facility or event addresses are coordinates that want to be found, but sometimes data with both geometry and well-formed address attributes has not had the benefit of the high quality geocoding available in ArcGIS Online applied to records where the geometry could do with some uplift. This blog shows how to apply geocoding to untrusted geometries within a dataset that changes over time, without incurring the cost of repeatedly geocoding records previously geocoded. The information product to be maintained is a hosted feature layer in ArcGIS Online or your own ArcGIS Enterprise portal. Here is the subject matter, around 600K business locations registered with the City of Los Angeles Office of Finance:

[Image: Businesses in downtown Los Angeles]

You'll appreciate businesses may open, close, relocate or change name, and this dataset is refreshed monthly to track this. Businesses typically care that they can be found, and what we see in the data is high quality address fields. If we want to maintain a useful spatial view of the data, such as in a hosted feature layer, we need to support this refresh cadence. In addition to records that change, we also need to uplift any records that didn't get correct geometry when first created. Records like these:

[Image: Null Island and Equal Archipelago]

The red dots are either on Null Island or in a region I'm going to name Equal Archipelago, meaning points where latitude equals longitude. Together with records with genuinely null geometry (liberties were taken in naming Null Island!) we have an easily identifiable subset of business locations to fix, which we're confident we can do from address fields in the data. Here is the subset of business locations we need to geocode:

[Image: Locations to geocode]

Note the record count - 94086. As we'll be using ArcGIS Online to do the geocoding (which requires credits), we need to be mindful of the credit cost and geocode target records only once unless they change address. This is where geocoding smarter comes in: geocode records with untrusted geometry or where address fields change, but only once, not on each update, while still applying any changes to trusted geometry or non-address fields. Let's get started.

The source data is in an open data portal for which there is a reader available via package download. In the blog attachment is a workspace CreateLosAngelesBusinesses.fmw (requires ArcGIS Pro 3.5+ plus ArcGIS Data Interoperability, or FME 2025+); the reader will install if you edit the workspace, as will the Geocoder package. The workspace creates an initial feature class. Here it is:

[Image: Create business location feature class]

This though is where you cannot avoid the credit cost of an initial geocode, but notice we only geocode records with null geometry, on Null Island or along Equal Archipelago - these have untrusted geometry. Notable processing steps are querying the back end server to generate separate processing streams, creating point geometry from string values like '(34.2817, -118.5361)' and using batch forward geocoding for speed. The output feature class is shared as a web layer, which is the information product we'll be maintaining.

This is where processing gets more interesting, and here it is, the workspace named RefreshLosAngelesBusinesses.fmw in the blog download:

[Image: Refresh Business Listings]

The trick to not re-geocoding previously trodden ground is to merge known geometry onto untrusted records from the target feature layer, merging on the primary key and the address fields. This way, if an address changes the feature will not merge and will be re-geocoded. Here are the FeatureMerger properties used to pick up geometry from known locations:

[Image: Merge Geometry Only Once]

Once any geocoding is complete, all incoming and existing features are sent to a ChangeDetector, which generates an efficient minimal transaction to update the target feature service. For the geocoding, you'll note that I switched from batch to row-based geocoding in the Geocoder transformer. Batch is faster but row-based geocoding can deliver better quality.

Here is a month's worth of business location churn for Los Angeles (September 15th 2025): 596402 locations in total, of which 1688 business locations were updated, 4385 inserted and 1707 deleted. Of the 94086 incoming locations with untrusted location, only 1822 needed to be geocoded. A mere 9 addresses could not be found and need clerical review.

[Image: A month's business addresses update]

Here are the ChangeDetector settings used to calculate the changeset transaction. If any attributes or geometry change for a primary key value, then a write record is generated with INSERT, UPDATE or DELETE transaction type.

[Image: ChangeDetector settings]

The blog attachment contains the spatial ETL tool fmw source files to create an initial file geodatabase feature class of business locations and perform a refresh of the data on demand. There is also one for finding untrusted records to inspect before further processing. Have fun geocoding smarter!
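If you'd like to see the untrusted-geometry test outside the workspace, here is a minimal Python sketch of the same idea. It assumes a pandas-readable CSV with a LOCATION column holding '(lat, lon)' strings; the file and column names are assumptions, and the workspace itself does this with transformers, not Python:

import pandas as pd

df = pd.read_csv("business_listings.csv")   # placeholder export of the listings

# Parse "(lat, lon)" strings into numeric columns; anything unparseable becomes NaN.
parts = df["LOCATION"].str.strip("()").str.split(",", expand=True)
df["lat"] = pd.to_numeric(parts[0], errors="coerce")
df["lon"] = pd.to_numeric(parts[1], errors="coerce")

# Untrusted geometry: genuinely null, on Null Island, or on the Equal Archipelago.
untrusted = (
    df["lat"].isna() | df["lon"].isna()
    | ((df["lat"] == 0) & (df["lon"] == 0))
    | (df["lat"] == df["lon"])
)
print("records to geocode:", int(untrusted.sum()))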
2 weeks ago

BLOG
@AliciaMunchel1 I expect so; it would take some research into the non-human account options for Snowflake. @NicholasGiner1 do you have any insights? Thanks.
3 weeks ago

POST
OK thanks, the reply chain got lost. I'll see if the tool owner has any insights; not being able to log a support call makes it harder.
08-27-2025 06:11 AM

POST
I haven't tried it, but I think you can abstract working with any type of feature layer in core ArcPy and delete any or all features, as you can see in the examples here: https://pro.arcgis.com/en/pro-app/latest/tool-reference/data-management/delete-features.htm
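Something like this untested sketch follows the pattern in those examples; the layer URL and where clause are placeholders:

import arcpy

# Placeholder input: a feature service layer URL or a local feature class path.
source = "https://services.arcgis.com/<org>/arcgis/rest/services/MyLayer/FeatureServer/0"

# Make a layer, optionally limited by a where clause; omit the clause to target everything.
lyr = arcpy.management.MakeFeatureLayer(source, "to_delete", "STATUS = 'RETIRED'")

# Delete the features the layer exposes.
arcpy.management.DeleteFeatures(lyr)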
08-26-2025 11:33 AM