
Esri Developer & Technology Summit 2026 - Exploring Tools and Patterns for Data Migration

03-12-2026 12:52 PM
ShareUser
Esri Community Manager

As I write, it's conference week at the Esri Developer & Technology Summit 2026, and I'm just back from presenting my technical session, Exploring Tools and Patterns for Data Migration.

I like to go demo-heavy at event presentations; it lets me generate content I can share.  Here's a snapshot of some of the live action: ETL via a script tool that brings Overture Maps Foundation Places theme points of interest into ArcGIS Pro's memory workspace as a base point layer (the green points), with related place categories (1:M cardinality) in a table view as the output information product.  That's 64 million points queried from GeoParquet files in AWS S3, with map interaction used to define an area of interest.

ETL with map input

Here I identify some of the points; note the alternate categories accessed via the map relate.

Place Categories

I'm getting ahead of myself, so let's back up a bit.  Ideally I would share a project package with attached documents, but the relevant geoprocessing tool errored for me on a validation issue, so instead I'm sharing the tools, documents, and minimal data in the post attachments.

Here is the relevant content, which I'll walk through in the order shown in the Catalog pane.

Relevant Content

The toolset Using Data Interoperability contains two Spatial ETL tools that embody a two-stage process for maintaining a hosted feature service information product in ArcGIS Online.  First, create geodatabase objects with the desired data model, relationships, and schema; add them to a map; define symbology, pop-up behavior, and metadata; and publish the layer as a hosted feature service.  Second, update this information product on demand.

Not demonstrated on the day, but something I know is in demand, is the ModelBuilder model FeatureSetInput, which shows how to build map interaction into an ArcGIS Data Interoperability Spatial ETL tool.  Here it is:

FeatureSetInput model

The model isn't shown as validated (it validates at run time), but the processing works like this: a model variable of type FeatureSet is an input parameter, filtered to polygon geometry.  This means at run time you can pick a polygon layer or create new area-of-interest features on your map.  The core tool Features To JSON then writes an EsriJSON file to the project scratch folder as intermediate data.  That file's path is an input parameter expected by the Spatial ETL tool DoSomethingWithEsriJSON, which takes it from there to do whatever you want.  This behavior is frequently wanted for web tools.

Here is a view of the tool log as it runs, showing the input map polygon has been read into the workspace.

FeatureSetInput runtime details
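To sketch what the downstream tool sees, here is a minimal standard-library reader for the intermediate file, assuming the usual EsriJSON feature-set layout (a features array with geometry.rings for polygons); the function name is mine, not part of the model:

```python
import json

def read_aoi_rings(esrijson_path):
    """Read an EsriJSON file written by Features To JSON and return
    the polygon rings of every area-of-interest feature."""
    with open(esrijson_path, encoding="utf-8") as f:
        fs = json.load(f)
    # An EsriJSON feature set stores each polygon as a list of rings,
    # each ring being a list of [x, y] coordinate pairs.
    return [feat["geometry"]["rings"] for feat in fs.get("features", [])]
```

A Spatial ETL tool would typically go on to use those rings as a spatial filter against the source data.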

Now for the Using ModelBuilder toolset.  The model EVPopulationGeocoded shows how to read a local CSV file with the Export Table core tool to impose a desired schema, geocode unique ZIP code values to supply geometry for all rows, and join the geometry onto the base table.  The desired schema is tuned to fit the data; the script tool ReportMaximumTextDataWidth helps with deep inspection of field width requirements.  On the day I also showed using Notepad++ and Pro to help with problem discovery, such as nulls encoded as zero, empty strings, and empty geometry values.

EVPopulationGeocoded
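This isn't the actual ReportMaximumTextDataWidth source, but the core idea can be sketched in a few lines of standard-library Python (the function name and details here are my own):

```python
import csv

def max_text_widths(csv_path, encoding="utf-8"):
    """Report the widest text value seen in each column of a CSV file,
    a handy input when tuning text field widths in a target schema."""
    with open(csv_path, newline="", encoding=encoding) as f:
        reader = csv.DictReader(f)
        widths = {name: 0 for name in reader.fieldnames}
        for row in reader:
            for name, value in row.items():
                # Treat missing values as zero-width.
                widths[name] = max(widths[name], len(value or ""))
    return widths
```

Run it before designing the schema and you know exactly how wide each text field needs to be, with a little headroom.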

Next up was URL2EVPopulation, which showed how to give ModelBuilder a boost with a little Python that retrieves the source CSV data from the URL where it lives in an open data catalog.  This was a popular feature.

URL2EVPopulation
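The Python boost can be as small as a urllib call.  A hedged sketch, not the model's actual code:

```python
from urllib.request import urlopen

def download_csv(url, out_path):
    """Retrieve CSV data from a URL (for example, an open data catalog)
    and save it locally so core geoprocessing tools can read it."""
    with urlopen(url) as resp:
        data = resp.read()
    # Write bytes as-is; let downstream tools handle the encoding.
    with open(out_path, "wb") as f:
        f.write(data)
    return out_path
```

In a model, a snippet like this runs in a Calculate Value or script tool step, and the returned path feeds the Export Table step.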

The submodel EVPopulation showed an alternate approach to geometry creation, namely ArcPy's ability to convert OGC well-known text (WKT) into Esri geometry.  The core tool Convert Coordinate Geometry doesn't support WKT.

EVPopulation
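In a script tool the conversion itself is typically a one-liner with arcpy.FromWKT.  To illustrate the notation, here is a minimal standard-library parser for point WKT; this is a sketch of the simplest case, not the full OGC grammar that ArcPy handles:

```python
import re

def parse_wkt_point(wkt):
    """Parse a simple OGC WKT point like 'POINT (-116.5 33.8)' into
    an (x, y) tuple; real tools hand the string to arcpy.FromWKT."""
    m = re.match(
        r"\s*POINT\s*\(\s*(-?\d+(?:\.\d+)?)\s+(-?\d+(?:\.\d+)?)\s*\)\s*$",
        wkt, flags=re.IGNORECASE)
    if not m:
        raise ValueError(f"Not a WKT point: {wkt!r}")
    return float(m.group(1)), float(m.group(2))
```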

The Using Script Tools toolset has the GeneratePlaces script tool that makes the Places layer and Place Categories related table view shown above.  It shows how to add map interaction to your ETL tools, and introduces DuckDB as both a SQL-aware database and a powerful integration client for local or remote data.  You can throw SQL queries at any data source DuckDB supports, right down to CSV.
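To give a flavor of the DuckDB side, here is a hedged sketch of composing a query against remote GeoParquet.  The URL is a placeholder, and the per-feature bbox struct columns follow the Overture GeoParquet convention; verify both against the release you use:

```python
def places_query(parquet_url, bbox):
    """Compose a DuckDB SQL statement that reads Places GeoParquet
    straight from object storage and keeps only points inside a
    map-derived bounding box (xmin, ymin, xmax, ymax).  Per-feature
    bbox columns make the filter a cheap min/max comparison."""
    xmin, ymin, xmax, ymax = bbox
    return f"""
        SELECT *
        FROM read_parquet('{parquet_url}')
        WHERE bbox.xmin >= {xmin} AND bbox.xmax <= {xmax}
          AND bbox.ymin >= {ymin} AND bbox.ymax <= {ymax}
    """
```

In the script tool, a statement like this is handed to a DuckDB connection, and the result set is written to the memory workspace as the base point layer.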

That takes me to notebooks as an integration option.  I demonstrated ImportPlacesByDivisionArea, which uses a SQL WHERE clause to define an area of interest by naming the Overture Divisions features that make it up.  To help discover what the area names need to be, the notebook ImportCurrentDivisionAreas is included in the post downloads; it downloads all Overture Divisions areas worldwide.
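A sketch of building that WHERE clause from a list of division names; the column name names.primary is an assumption about the Overture schema, so check it against your release:

```python
def division_where_clause(division_names):
    """Build a SQL WHERE fragment selecting Overture Divisions by name,
    i.e. the named areas that together make up an area of interest.
    Single quotes are doubled so names like O'Brien stay valid SQL."""
    quoted = ", ".join("'" + n.replace("'", "''") + "'"
                       for n in division_names)
    return f"names.primary IN ({quoted})"
```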

ImportPlacesByDivisionArea goes further than the script tool GeneratePlaces in terms of making an information product.  It adds metadata to the output, but more importantly it maintains a geocoding locator, "Places_Locator", of POI type using the latest data.  A copy of the locator is attached; see how it supports category filtering by entering "hotel" in the Locate pane in Pro, and you'll see all hotels in Palm Springs as candidates!  Coders out there, don't forget that locators at a path can be used programmatically by arcpy.geocoding for things like repairing null geometry in script tools; you don't need to stand up a geocode service for row-based processing.

Not shown on the day but included in the downloads is a toolbox QuickImportToMemory containing the tool QuickImportToMemory, which does what its name suggests using ArcGIS Data Interoperability: it lets you inspect data in Pro from any supported source, so you can learn about your data before committing to an ETL approach.

All tools were built with ArcGIS Pro 3.6.

Do comment on the post with your observations.